Node-level Graph Regression with Deep Gaussian Process Models

Overview

This repository provides the implementation and experiments for node-level graph regression with deep Gaussian process (DGPG) models.

Prerequisites

Our implementation is mainly based on TensorFlow 1.x and GPflow 1.x:

python 3.x (3.7 tested)
conda install tensorflow-gpu==1.15
pip install keras==2.3.1
pip install gpflow==1.5
pip install gpuinfo

In addition, some basic packages such as NumPy are required. Porting the code to TensorFlow 2.x and GPflow 2 would probably be straightforward, but it has not been tested.
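A quick way to confirm the environment is set up correctly is to print the installed versions from Python. This is only a minimal sanity check, not part of the repository:

```python
# Minimal environment sanity check (not part of the repository).
import tensorflow as tf
import keras
import gpflow

print("TensorFlow:", tf.__version__)               # expected 1.15.x
print("Keras:", keras.__version__)                  # expected 2.3.1
print("GPflow:", gpflow.__version__)                # expected 1.5.x
print("GPU visible:", tf.test.is_gpu_available())   # requires the tensorflow-gpu build
```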

Specification

Source code and experiment results are both provided. Unzip the two archive files before using the experiment notebooks.

Files

  • dgp_graph/: core code of the DGPG model (a minimal GPflow usage sketch follows this file list).
    • impl_parallel.py: a fast implementation with node-level parallelized computation, used by all experiments.
    • my_op.py: custom TensorFlow operations used in the implementation.
    • impl.py: a basic loop-based implementation; easy to follow but not practical, kept only for calibration.
  • data/: datasets.
  • doubly_stochastic_dgp/: code from the DGP repository.
  • compatible/: code that makes the DGP source compatible with GPflow 1.5.
  • gpflow_monitor/: monitoring tools for GPflow models, adapted from an external repository.
  • GRN inference: code and data for the GRN inference experiment.
  • demo_city45.ipynb: Jupyter notebook for the city45 dataset experiment.
  • experiments.zip: Jupyter notebooks for the other experiments.
  • results.zip: original Jupyter notebook results, exported as HTML files for archiving.
  • run_toy.sh: shell script to run the additional experiment.
  • toy_main.py: code for the additional experiment (traditional ML methods and DGPG with a linear kernel).
  • ER-0.1.ipynb: example script for analyzing time-varying graph structures.
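For readers unfamiliar with GPflow 1.x, the sketch below shows the generic build-then-optimize workflow that the DGPG implementation in dgp_graph/ builds on. It is a plain GP regression example on random data, not the DGPG model itself; the data shapes and kernel choice are purely illustrative:

```python
# Generic GPflow 1.x workflow (illustrative only; not the DGPG model).
import numpy as np
import gpflow

# Toy data: 100 points with 3 input features and a scalar target.
X = np.random.rand(100, 3)
Y = np.sin(3 * X[:, :1]) + 0.1 * np.random.randn(100, 1)

# Build a GP regression model with an RBF kernel (GPflow 1.x API).
kernel = gpflow.kernels.RBF(input_dim=3)
model = gpflow.models.GPR(X, Y, kern=kernel)

# Fit the hyperparameters and predict on the training inputs.
gpflow.train.ScipyOptimizer().minimize(model, maxiter=100)
mean, var = model.predict_y(X)
print(mean.shape, var.shape)  # (100, 1) (100, 1)
```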

Experiments

The experiments are implemented in Python source files and demonstrated through Jupyter notebooks. The source of each experiment is in the src/experiments.zip archive, and the corresponding result is exported as a static HTML file stored in the results.zip archive (a sketch of this export workflow follows the experiment list). The experiments are organized by dataset name:

  1. Synthetic Datasets

For theoretical analysis.

  • demo_toy_run1.ipynb

  • demo_toy_run2.ipynb

  • demo_toy_run3.ipynb

  • demo_toy_run4.ipynb

  • demo_toy_run5.ipynb

For graph signal analysis on time-varying graphs.

  • ER-0.05.ipynb

  • ER-0.2.ipynb

  • RWP-0.1.ipynb

  • RWP-0.2.ipynb

  • RWP-0.3.ipynb

  2. Small Datasets
  • demo_city45.ipynb
  • demo_city45_linear.ipynb (linear kernel)
  • demo_city45_baseline.ipynb (traditional regression methods)
  • demo_etex.ipynb
  • demo_etex_linear.ipynb
  • demo_etex_baseline.ipynb
  • demo_fmri.ipynb
  • demo_fmri_linear.ipynb
  • demo_fmri_baseline.ipynb
  3. Large Datasets (traffic flow prediction)
  • LA
    • demo_la_15min.ipynb
    • demo_la_30min.ipynb
    • demo_la_60min.ipynb
  • BAY
    • demo_bay_15min.ipynb
    • demo_bay_30min.ipynb
    • demo_bay_60min.ipynb
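
The HTML files in results.zip were exported from executed notebooks. A comparable run-and-export step can be reproduced with the standard nbformat/nbconvert APIs; the sketch below uses demo_city45.ipynb as an example and is not a script shipped with the repository:

```python
# Execute a demo notebook and export the result to a static HTML file.
# Assumed export workflow; adjust the notebook name and timeout as needed.
import nbformat
from nbconvert import HTMLExporter
from nbconvert.preprocessors import ExecutePreprocessor

nb = nbformat.read("demo_city45.ipynb", as_version=4)
ExecutePreprocessor(timeout=600).preprocess(nb, {"metadata": {"path": "."}})
body, _ = HTMLExporter().from_notebook_node(nb)
with open("demo_city45.html", "w", encoding="utf-8") as f:
    f.write(body)
```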