Random Walk Graph Neural Networks

Overview

This repository is the official implementation of Random Walk Graph Neural Networks.

Requirements

Code is written in Python 3.6 and requires:

  • PyTorch 1.5
  • scikit-learn 0.21
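
Both packages can be installed with pip; the exact version pins below are an assumption and may differ from the authors' environment:

pip install "torch==1.5.*" "scikit-learn==0.21.*"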

Datasets

Use the following link to download datasets:

https://ls11-www.cs.tu-dortmund.de/staff/morris/graphkerneldatasets

Extract the datasets into the datasets folder.
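
The snippet below is a minimal sketch of this step: it downloads one dataset archive and unpacks it into the datasets folder. It assumes each dataset is published as <name>.zip under the URL above (e.g. MUTAG.zip); if the hosting layout differs, download and extract the archive manually instead.

import io
import zipfile
from pathlib import Path
from urllib.request import urlopen

BASE_URL = "https://ls11-www.cs.tu-dortmund.de/staff/morris/graphkerneldatasets"

def download_dataset(name, target_dir="datasets"):
    """Fetch <name>.zip and extract it into the datasets folder."""
    Path(target_dir).mkdir(exist_ok=True)
    with urlopen(f"{BASE_URL}/{name}.zip") as response:
        archive = zipfile.ZipFile(io.BytesIO(response.read()))
    archive.extractall(target_dir)  # creates datasets/<name>/ with the raw text files

if __name__ == "__main__":
    download_dataset("MUTAG")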

Training and Evaluation

To train and evaluate the model in the paper, run this command:

python main.py --dataset <dataset_name> 

Example

To train and evaluate the model on MUTAG, first specify the hyperparameters in the main.py file and then run:

python main.py --dataset MUTAG --use-node-labels
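
To give an idea of what the k-step variants reported below compute, here is a simplified sketch of a random-walk layer that compares an input graph against a small set of trainable hidden graphs. It ignores node labels, batching, and the normalization and efficiency tricks of the actual implementation, so treat it as an illustration of the idea rather than the layer defined in main.py; all names (RandomWalkLayer, n_hidden_graphs, and so on) are invented for this sketch.

import torch
import torch.nn as nn

class RandomWalkLayer(nn.Module):
    """Illustrative sketch: scores one input graph against trainable hidden
    graphs with a P-step random walk kernel (unlabeled graphs only)."""

    def __init__(self, n_hidden_graphs, hidden_graph_size, p_steps):
        super().__init__()
        self.p_steps = p_steps
        # Trainable (soft) adjacency matrices of the hidden graphs.
        self.hidden_adj = nn.Parameter(
            torch.randn(n_hidden_graphs, hidden_graph_size, hidden_graph_size))

    def forward(self, adj):
        # adj: (n, n) dense adjacency matrix of a single input graph.
        # Symmetrize the hidden adjacencies and squash entries into (0, 1).
        w = torch.sigmoid(0.5 * (self.hidden_adj + self.hidden_adj.transpose(1, 2)))
        features = []
        a_p, w_p = adj, w
        for _ in range(self.p_steps):
            # Common p-step walks between the input graph and each hidden graph:
            # 1^T (A kron W)^p 1 = (1^T A^p 1) * (1^T W^p 1).
            features.append(a_p.sum() * w_p.sum(dim=(1, 2)))
            a_p = a_p @ adj  # A^(p+1)
            w_p = w_p @ w    # W^(p+1)
        # One feature per hidden graph and per number of steps.
        return torch.stack(features, dim=1)

# Example usage on a random symmetric adjacency matrix.
layer = RandomWalkLayer(n_hidden_graphs=16, hidden_graph_size=6, p_steps=3)
adj = torch.triu(torch.randint(0, 2, (10, 10)).float(), diagonal=1)
adj = adj + adj.t()
print(layer(adj).shape)  # torch.Size([16, 3])

Roughly speaking, the per-step kernel values over all hidden graphs form the graph representation, which is then fed to a fully connected classifier; the released code additionally supports node-labeled graphs via the --use-node-labels flag.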

Results

Our model achieves the following classification accuracy (%) on standard graph classification datasets (note that we used the evaluation procedure and the same data splits as in this paper):

| Model name   | MUTAG        | D&D          | NCI1         | PROTEINS     | ENZYMES      |
|--------------|--------------|--------------|--------------|--------------|--------------|
| SP           | 80.2 (± 6.5) | 78.1 (± 4.1) | 72.7 (± 1.4) | 75.3 (± 3.8) | 38.3 (± 8.0) |
| GR           | 80.8 (± 6.4) | 75.4 (± 3.4) | 61.8 (± 1.7) | 71.6 (± 3.1) | 25.1 (± 4.4) |
| WL           | 84.6 (± 8.3) | 78.1 (± 2.4) | 84.8 (± 2.5) | 73.8 (± 4.4) | 50.3 (± 5.7) |
| DGCNN        | 84.0 (± 6.7) | 76.6 (± 4.3) | 76.4 (± 1.7) | 72.9 (± 3.5) | 38.9 (± 5.7) |
| DiffPool     | 79.8 (± 7.1) | 75.0 (± 3.5) | 76.9 (± 1.9) | 73.7 (± 3.5) | 59.5 (± 5.6) |
| ECC          | 75.4 (± 6.2) | 72.6 (± 4.1) | 76.2 (± 1.4) | 72.3 (± 3.4) | 29.5 (± 8.2) |
| GIN          | 84.7 (± 6.7) | 75.3 (± 2.9) | 80.0 (± 1.4) | 73.3 (± 4.0) | 59.6 (± 4.5) |
| GraphSAGE    | 83.6 (± 9.6) | 72.9 (± 2.0) | 76.0 (± 1.8) | 73.0 (± 4.5) | 58.2 (± 6.0) |
| 1-step RWNN  | 89.2 (± 4.3) | 77.6 (± 4.7) | 71.4 (± 1.8) | 74.7 (± 3.3) | 56.7 (± 5.2) |
| 2-step RWNN  | 88.1 (± 4.8) | 76.9 (± 4.6) | 73.0 (± 2.0) | 74.1 (± 2.8) | 57.4 (± 4.9) |
| 3-step RWNN  | 88.6 (± 4.1) | 77.4 (± 4.9) | 73.9 (± 1.3) | 74.3 (± 3.3) | 57.6 (± 6.3) |
| Model name   | IMDB-BINARY  | IMDB-MULTI   | REDDIT-BINARY | REDDIT-MULTI-5K | COLLAB       |
|--------------|--------------|--------------|---------------|-----------------|--------------|
| SP           | 57.7 (± 4.1) | 39.8 (± 3.7) | 89.0 (± 1.0)  | 51.1 (± 2.2)    | 79.9 (± 2.7) |
| GR           | 63.3 (± 2.7) | 39.6 (± 3.0) | 76.6 (± 3.3)  | 38.1 (± 2.3)    | 71.1 (± 1.4) |
| WL           | 72.8 (± 4.5) | 51.2 (± 6.5) | 74.9 (± 1.8)  | 49.6 (± 2.0)    | 78.0 (± 2.0) |
| DGCNN        | 69.2 (± 3.0) | 45.6 (± 3.4) | 87.8 (± 2.5)  | 49.2 (± 1.2)    | 71.2 (± 1.9) |
| DiffPool     | 68.4 (± 3.3) | 45.6 (± 3.4) | 89.1 (± 1.6)  | 53.8 (± 1.4)    | 68.9 (± 2.0) |
| ECC          | 67.7 (± 2.8) | 43.5 (± 3.1) | OOR           | OOR             | OOR          |
| GIN          | 71.2 (± 3.9) | 48.5 (± 3.3) | 89.9 (± 1.9)  | 56.1 (± 1.7)    | 75.6 (± 2.3) |
| GraphSAGE    | 68.8 (± 4.5) | 47.6 (± 3.5) | 84.3 (± 1.9)  | 50.0 (± 1.3)    | 73.9 (± 1.7) |
| 1-step RWNN  | 70.8 (± 4.8) | 47.8 (± 3.8) | 90.4 (± 1.9)  | 51.7 (± 1.5)    | 71.7 (± 2.1) |
| 2-step RWNN  | 70.6 (± 4.4) | 48.8 (± 2.9) | 90.3 (± 1.8)  | 51.7 (± 1.4)    | 71.3 (± 2.1) |
| 3-step RWNN  | 70.7 (± 3.9) | 47.8 (± 3.5) | 89.7 (± 1.2)  | 53.4 (± 1.6)    | 71.9 (± 2.5) |

OOR: out of resources (the experiment exceeded the time or memory budget).
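
As a rough guide to how numbers like these are produced, the sketch below runs a stratified 10-fold cross-validation and reports the mean and standard deviation of the test accuracy. It is an assumption about the evaluation protocol rather than the exact script behind the tables; train_and_eval is a placeholder that should train the model on one split and return its test accuracy, and the resulting splits will not match the ones used above.

import numpy as np
from sklearn.model_selection import StratifiedKFold

def cross_validate(labels, train_and_eval, n_folds=10, seed=0):
    """Stratified k-fold evaluation; train_and_eval(train_idx, test_idx) -> accuracy."""
    labels = np.asarray(labels)
    skf = StratifiedKFold(n_splits=n_folds, shuffle=True, random_state=seed)
    accuracies = [train_and_eval(train_idx, test_idx)
                  for train_idx, test_idx in skf.split(np.zeros(len(labels)), labels)]
    return float(np.mean(accuracies)), float(np.std(accuracies))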

Cite

Please cite our paper if you use this code:

@inproceedings{nikolentzos2020random,
  title={Random Walk Graph Neural Networks},
  author={Nikolentzos, Giannis and Vazirgiannis, Michalis},
  booktitle={Proceedings of the 34th Conference on Neural Information Processing Systems},
  pages={16211--16222},
  year={2020}
}