Official implementation of Rethinking Graph Neural Architecture Search from Message-passing (CVPR 2021)


Rethinking Graph Neural Architecture Search from Message-passing

Intro

GNAS can automatically learn a better architecture with the optimal depth of message passing on the graph. Specifically, we design the Graph Neural Architecture Paradigm (GAP) with a tree-topology computation procedure and two types of fine-grained atomic operations (feature filtering and neighbor aggregation) derived from the message-passing mechanism to construct a powerful graph network search space. Feature filtering performs adaptive feature selection, while neighbor aggregation captures structural information and computes neighbors’ statistics. Experiments show that GNAS can search for better GNNs with multiple message-passing mechanisms and optimal message-passing depth.
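As a concrete picture of the two atomic operation types, here is a minimal, illustrative PyTorch sketch; the class and function names, the sigmoid gating, and the max aggregation are assumptions chosen for illustration and are not the repository's actual operators (such as f_sparse, f_dense, or a_max).

import torch
import torch.nn as nn

class FeatureFilter(nn.Module):
    # Illustrative feature-filtering op: a learned sigmoid gate performs
    # adaptive feature selection on each node's representation.
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())

    def forward(self, h):  # h: [num_nodes, dim]
        return self.gate(h) * h

def neighbor_aggregation_max(h, edge_index):
    # Illustrative neighbor-aggregation op: element-wise max over each node's
    # in-neighbors, i.e. one possible neighbor statistic.
    src, dst = edge_index  # edge_index: [2, num_edges], edges run src -> dst
    out = torch.full_like(h, float("-inf"))
    out = out.index_reduce(0, dst, h[src], reduce="amax", include_self=False)
    # nodes with no in-neighbors keep -inf; reset them to zero
    return torch.where(torch.isinf(out), torch.zeros_like(out), out)

# toy usage: 3 nodes with 8-dim features and edges 0->1, 0->2, 1->2
h = torch.randn(3, 8)
edge_index = torch.tensor([[0, 0, 1], [1, 2, 2]])
h = neighbor_aggregation_max(FeatureFilter(8)(h), edge_index)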

Getting Started

0. Prerequisites

  • Linux
  • NVIDIA GPU + CUDA + cuDNN

1. Setup Python environment for GPU

# clone the GitHub repo
conda install git
git clone https://github.com/phython96/GNAS-MP.git
cd GNAS-MP

# install the Python environment
conda env create -f environment_gpu.yml
conda activate gnas
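
Optionally, once the environment is active, a quick check confirms that PyTorch can see the GPU before launching any search. This is only a sanity check, assuming the environment installs a CUDA-enabled PyTorch build.

# quick sanity check inside the activated "gnas" environment
import torch
print(torch.__version__)           # version installed from environment_gpu.yml
print(torch.cuda.is_available())   # expected to be True on a working GPU setup
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # name of the visible GPU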

2. Download datasets

The datasets are provided by the benchmarking-gnns project; follow its instructions to download all the required datasets.

3. Searching

We provide scripts for conveniently searching graph neural networks on the five datasets.

# searching on ZINC dataset at graph regression task
sh scripts/search_molecules_zinc.sh [gpu_id]

# searching on SBMs_PATTERN dataset at node classification task
sh scripts/search_sbms_pattern.sh [gpu_id]

# searching on SBMs_CLUSTER dataset at node classification task
sh scripts/search_sbms_cluster.sh [gpu_id]

# searching on MNIST dataset at graph classification task
sh scripts/search_superpixels_mnist.sh [gpu_id]

# searching on CIFAR10 dataset at graph classification task
sh scripts/search_superpixels_cifar10.sh [gpu_id]

When the search procedure finishes, copy the searched genotypes from the file "./save/[data_name]_search.txt" into "./configs/genotypes.py".

For example, after searching on the MNIST dataset, we obtain the genotype result file "./save/MNIST_search.txt":

Epoch : 19
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_dense', 9, 7)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 0), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_identity', 9, 4)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 1), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_identity', 8, 7), ('f_sparse', 9, 4)], concat_node=None)]
Epoch : 20
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_dense', 8, 4), ('f_sparse', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_sparse', 9, 8)], concat_node=None)]
Epoch : 21
[Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 0), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_identity', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 4), ('f_identity', 9, 7)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_identity', 9, 4)], concat_node=None)]

Copy the fourth line of the file above (the genotype from epoch 20) and paste it into "./configs/genotypes.py" with the prefix "MNIST_Net = ":

MNIST_Net = [Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_dense', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 6), ('f_identity', 8, 7), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_dense', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_identity', 7, 5), ('f_identity', 8, 6), ('f_sparse', 9, 8)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 0), ('f_sparse', 3, 0), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_dense', 8, 4), ('f_sparse', 9, 6)], concat_node=None), Genotype(alpha_cell=[('f_sparse', 1, 0), ('f_sparse', 2, 1), ('f_sparse', 3, 2), ('a_max', 4, 1), ('a_max', 5, 2), ('a_max', 6, 3), ('f_sparse', 7, 4), ('f_sparse', 8, 6), ('f_sparse', 9, 8)], concat_node=None)]
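
For reference, each Genotype in this file appears to be a namedtuple with an alpha_cell field listing (operation, destination node, source node) triples of the tree-topology cell, plus a concat_node field; this reading of the fields is inferred from the search output above and is an assumption, not documented API. A minimal sketch of inspecting one such cell:

from collections import namedtuple

# assumed structure, mirroring the entries written to MNIST_search.txt
Genotype = namedtuple("Genotype", ["alpha_cell", "concat_node"])

cell = Genotype(
    alpha_cell=[("f_sparse", 1, 0), ("f_dense", 2, 1), ("a_max", 4, 1)],
    concat_node=None,
)

for op, dst, src in cell.alpha_cell:
    # e.g. "node 1 <- f_sparse(node 0)"
    print(f"node {dst} <- {op}(node {src})")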

4. Training

Before training, confirm that the genotype of the searched graph neural network is present in the file "./configs/genotypes.py".

We provide scripts for training the graph neural networks searched by GNAS.

# training on ZINC dataset at graph regression task
sh scripts/train_molecules_zinc.sh [gpu_id]

# training on SBMs_PATTERN dataset at node classification task
sh scripts/train_sbms_pattern.sh [gpu_id]

# training on SBMs_CLUSTER dataset at node classification task
sh scripts/train_sbms_cluster.sh [gpu_id]

# training on MNIST dataset at graph classification task
sh scripts/train_superpixels_mnist.sh [gpu_id]

# training on CIFAR10 dataset at graph classification task
sh scripts/train_superpixels_cifar10.sh [gpu_id]

Results

Visualization

Here we show the 4-layer graph neural networks searched by GNAS on the five datasets across three graph tasks.

Reference

to be updated

Owner
Shaofei Cai
Retired ICPC contestant, classic algorithm enthusiast.