Official PyTorch Implementation of "Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs". NeurIPS 2020.

Overview

Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs

This repository is the implementation of SELAR.

Dasol Hwang*, Jinyoung Park*, Sunyoung Kwon, Kyung-min Kim, Jung-Woo Ha, Hyunwoo J. Kim, Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs, In Advances in Neural Information Processing Systems (NeurIPS 2020).

Data Preprocessing

We use the datasets released by KGNN-LS and RippleNet for link prediction. Download the meta-path labels (meta_labels/) from this link. A minimal loading sketch follows the list below.

  • data/music/
    • ratings_final.npy : preprocessed rating file released by KGNN-LS
    • kg_final.npy : knowledge graph file
    • meta_labels/
      • pos_meta{}_{}.pickle : meta-path positive labels for the auxiliary task
      • neg_meta{}_{}.pickle : meta-path negative labels for the auxiliary task
  • data/book/
    • ratings_final.npy : preprocessed rating file released by RippleNet
    • kg_final.npy : knowledge graph file
    • meta_labels/
      • pos_meta{}_{}.pickle : meta-path positive labels for the auxiliary task
      • neg_meta{}_{}.pickle : meta-path negative labels for the auxiliary task
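
For a quick sanity check of the downloaded files, a minimal loading sketch (Python) is shown below. The pickle file name is hypothetical, since the {}_{} placeholders are filled in by the training scripts, and the exact array shapes may differ.

# minimal data-loading sketch (assumes the directory layout above)
import pickle
import numpy as np

ratings = np.load("data/music/ratings_final.npy")  # preprocessed user-item interactions
kg = np.load("data/music/kg_final.npy")            # (head, relation, tail) triples
print("ratings:", ratings.shape, "kg:", kg.shape)

# hypothetical file name; pick any existing pos_meta*_*.pickle under meta_labels/
with open("data/music/meta_labels/pos_meta0_1.pickle", "rb") as f:
    pos_meta = pickle.load(f)
print(type(pos_meta))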

Required packages

The dependencies required to run the code are listed in the provided conda environment file (env.yml). Create the environment with:

$ conda env create -f env.yml
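
After creating the environment, activate it before running the scripts. The environment name below is an assumption; use the name defined in env.yml.

$ conda activate selar   # 'selar' is a placeholder for the name: field in env.yml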

Running the code

# check optional arguments [-h]
$ python main_music.py
$ python main_book.py

Overview of the link prediction results (AUC)

Last-FM (Music)

Base GNNs | Vanilla | w/o MP | w/ MP  | SELAR  | SELAR+Hint
GCN       | 0.7963  | 0.7899 | 0.8235 | 0.8296 | 0.8121
GAT       | 0.8115  | 0.8115 | 0.8263 | 0.8294 | 0.8302
GIN       | 0.8199  | 0.8217 | 0.8242 | 0.8361 | 0.8350
SGC       | 0.7703  | 0.7766 | 0.7718 | 0.7827 | 0.7975
GTN       | 0.7836  | 0.7744 | 0.7865 | 0.7988 | 0.8067

Book-Crossing (Book)

Base GNNs | Vanilla | w/o MP | w/ MP  | SELAR  | SELAR+Hint
GCN       | 0.7039  | 0.7031 | 0.7110 | 0.7182 | 0.7208
GAT       | 0.6891  | 0.6968 | 0.7075 | 0.7345 | 0.7360
GIN       | 0.6979  | 0.7210 | 0.7338 | 0.7526 | 0.7513
SGC       | 0.6860  | 0.6808 | 0.6792 | 0.6902 | 0.6926
GTN       | 0.6732  | 0.6758 | 0.6724 | 0.6858 | 0.6850
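
The reported numbers are AUC scores for link prediction. As an illustration only (not the repository's evaluation code), AUC can be computed from predicted link scores as in the sketch below; the label and score arrays are hypothetical.

# illustrative AUC computation for link prediction (not the repo's evaluation code)
import numpy as np
from sklearn.metrics import roc_auc_score

labels = np.array([1, 0, 1, 1, 0])              # hypothetical ground-truth links
scores = np.array([0.9, 0.2, 0.7, 0.6, 0.4])    # hypothetical predicted link scores
print("AUC:", roc_auc_score(labels, scores))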

Citation

@inproceedings{NEURIPS2020_74de5f91,
 author = {Hwang, Dasol and Park, Jinyoung and Kwon, Sunyoung and Kim, KyungMin and Ha, Jung-Woo and Kim, Hyunwoo J},
 booktitle = {Advances in Neural Information Processing Systems},
 editor = {H. Larochelle and M. Ranzato and R. Hadsell and M. F. Balcan and H. Lin},
 pages = {10294--10305},
 publisher = {Curran Associates, Inc.},
 title = {Self-supervised Auxiliary Learning with Meta-paths for Heterogeneous Graphs},
 url = {https://proceedings.neurips.cc/paper/2020/file/74de5f915765ea59816e770a8e686f38-Paper.pdf},
 volume = {33},
 year = {2020}
}

License

Copyright (c) 2020-present NAVER Corp. and Korea University 