Code for the paper "Multi-level Disentanglement Graph Neural Network"

Overview

Multi-level Disentanglement Graph Neural Network (MD-GNN)

This is a PyTorch implementation of the MD-GNN, and the code includes the following modules:

  • Datasets (Cora, Citeseer, Pubmed, Synthetic, and ZINC)

  • Training paradigm for node classification, graph classification, and graph regression tasks

  • Visualization

  • Evaluation metrics

Main Requirements

  • dgl==0.4.3.post2
  • networkx==2.4
  • numpy==1.18.1
  • ogb==1.1.1
  • scikit-learn==0.22.2.post1
  • scipy==1.4.1
  • torch==1.5.0
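
These pinned versions can be installed with pip in one command, for example:

pip install dgl==0.4.3.post2 networkx==2.4 numpy==1.18.1 ogb==1.1.1 scikit-learn==0.22.2.post1 scipy==1.4.1 torch==1.5.0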

Description

  • train.py

    • main() -- Train a new model for the node classification task on the Cora, Citeseer, and Pubmed datasets
    • evaluate() -- Test the learned model for the node classification task on the Cora, Citeseer, and Pubmed datasets
    • main_synthetic() -- Train a new model for the graph classification task on the Synthetic dataset
    • evaluate_synthetic() -- Test the learned model for the graph classification task on the Synthetic dataset
    • main_zinc() -- Train a new model for the graph regression task on the ZINC dataset
    • evaluate_zinc() -- Test the learned model for the graph regression task on the ZINC dataset
  • dataset.py

    • load_data() -- Load the data of the selected dataset
  • MDGNN.py

    • MDGNN() -- model definition and loss computation (see the usage sketch at the end of this section)
  • utils.py

    • evaluate_att() -- Evaluate attribute-level disentanglement with the visualization of relation-related attributes
    • evaluate_corr() -- Evaluate node-level disentanglement with the correlation analysis of latent features
    • evaluate_graph() -- Evaluate graph-level disentanglement with the visualization of disentangled relation graphs
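
The modules above are wired together by train.py. The snippet below is a minimal, hypothetical usage sketch of how load_data() and MDGNN() might be combined for the node classification task; the argument names, return values, and hyperparameters are illustrative assumptions, not the actual signatures in the repository.

    # Hypothetical usage sketch -- names and signatures are assumptions for illustration only.
    import torch
    from dataset import load_data
    from MDGNN import MDGNN

    # Load a citation dataset: graph, node features, labels, and split masks (assumed return values).
    graph, features, labels, train_mask, test_mask = load_data('cora')

    # Build the model; the hidden size, learning rate, and weight decay below are illustrative defaults.
    model = MDGNN(in_dim=features.shape[1], hidden_dim=64, num_classes=int(labels.max()) + 1)
    optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

    model.train()
    for epoch in range(200):
        logits, loss = model(graph, features, labels, train_mask)  # forward pass returns predictions and loss
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()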

Running the code

  1. Install the required dependency packages and unzip the files in the data folder.

  2. We use DGL to implement all the GNN models on the three citation datasets (Cora, Citeseer, and Pubmed). To evaluate the model with different splitting strategies (fewer and harder label rates), you need to replace the following file with the provided citation_graph.py:

dgl/data/citation_graph.py
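
If you are unsure where DGL is installed, its package directory can be printed with a one-liner (standard Python behavior, not specific to this repository):

python -c "import dgl, os; print(os.path.dirname(dgl.__file__))"

The provided citation_graph.py should then be copied into the data subdirectory of that path.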

  3. To get the results on a specific dataset, run the training script with proper hyperparameters:
python train.py --dataset data_name

where data_name is one of the five datasets (cora, citeseer, pubmed, synthetic, or zinc). The model and the training log will be saved to the corresponding directory in ./log for evaluation.
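
For example, to run on the Cora dataset:

python train.py --dataset cora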

  4. To evaluate the three-level disentanglement performance, run
python utils.py
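
As a rough illustration of the node-level check performed by evaluate_corr(), the sketch below computes the mean absolute Pearson correlation between latent dimensions belonging to different channels; a well-disentangled representation should yield low cross-channel correlation. The array layout (N nodes, K channels of d dimensions each) is an assumption for illustration, not the actual implementation.

    # Illustrative cross-channel correlation analysis (the layout is an assumption, not the repo's API).
    import numpy as np

    def cross_channel_correlation(z, num_channels):
        # z: (N, num_channels * d) latent features produced by a trained model.
        d = z.shape[1] // num_channels
        corr = np.abs(np.corrcoef(z, rowvar=False))  # (K*d, K*d) absolute Pearson correlation
        mask = np.ones_like(corr, dtype=bool)
        for k in range(num_channels):
            mask[k * d:(k + 1) * d, k * d:(k + 1) * d] = False  # ignore within-channel blocks
        return corr[mask].mean()

    # Example with random placeholder features: 2708 Cora nodes, 4 channels of 16 dimensions each.
    z = np.random.randn(2708, 64)
    print(cross_channel_correlation(z, num_channels=4))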

License

MD-GNN is released under the MIT license.

Owner

Lirong Wu, Ph.D. student working on graphs.