CurvLearn, a TensorFlow-based non-Euclidean deep learning framework.

Overview

English | 简体中文

Why Non-Euclidean Geometry

Consider the simple graph structures shown below. Nodes with the same color are two hops apart, whereas nodes with different colors are one hop apart. How can we embed these structures in Euclidean space while keeping these distances unchanged?

In fact, a perfect embedding without distortion, which arises naturally in hyperbolic (negative-curvature) or spherical (positive-curvature) space, is infeasible in Euclidean space [1].

As shown above, thanks to their high capacity for modeling complex structured data (e.g., scale-free, hierarchical, or cyclic), deep learning models built on non-Euclidean geometry have attracted growing interest for tasks such as link prediction [2] and recommendation [3].

What's CurvLearn

In this repository, we provide a framework, named CurvLearn, for training deep learning models in non-Euclidean spaces.

The framework implements the non-Euclidean operations in TensorFlow and keeps an interface style similar to vanilla TensorFlow for developing deep learning models.

Currently, CurvLearn is used to train several recommendation models at Alibaba. We implement CurvLearn on top of our distributed (graph/deep learning) training engines, including Euler and x-deeplearning. The figure below shows how the category tree is embedded in hyperbolic space using CurvLearn.

Why CurvLearn

CurvLearn has the following major features.

  1. Easy-to-Use. Converting a TensorFlow model from Euclidean space to non-Euclidean spaces with CurvLearn is graceful and undemanding, because the manifold operations are decoupled from the model architecture and resemble vanilla TensorFlow operations. For researchers, CurvLearn also provides clear interfaces for developing novel manifolds and optimizers.
  2. Comprehensive methods. CurvLearn is the first TensorFlow-based non-Euclidean deep learning framework and supports several typical non-Euclidean spaces, e.g., constant-curvature and mixed-curvature manifolds, together with the necessary manifold operations and optimizers.
  3. Verified by tremendous industrial traffic. CurvLearn is serving on Alibaba's sponsored search platform, handling billions of online requests in several key scenarios, e.g., matching and category prediction. Compared to Euclidean models, CurvLearn brings more revenue; RPM (revenue per mille) increases by more than 1%.

Now we are working on exploring more non-Euclidean methods and integrating operations with TensorFlow. PRs are welcome!

CurvLearn Architecture

Manifolds

We implement several constant-curvature manifolds as well as a mixed-curvature manifold; a short instantiation sketch follows the list.

  • curvlearn.manifolds.Euclidean - Euclidean space with zero curvature.
  • curvlearn.manifolds.Stereographic - Constant curvature stereographic projection model. The curvature can be positive, negative or zero.
  • curvlearn.manifolds.PoincareBall - The stereographic projection of the Lorentz model with negative curvature.
  • curvlearn.manifolds.ProjectedSphere - The stereographic projection of the sphere model with positive curvature.
  • curvlearn.manifolds.Product - Mixed-curvature space formed as the product of multiple manifolds with different curvatures.
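As a hedged illustration of the classes above, the sketch below instantiates three of them. The import path follows the names in the list; the zero-argument constructors are an assumption and may differ from the actual API.

import curvlearn
from curvlearn.manifolds import Euclidean, PoincareBall, ProjectedSphere

flat       = Euclidean()        # zero curvature
hyperbolic = PoincareBall()     # negative curvature
spherical  = ProjectedSphere()  # positive curvature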

Operations

To build a non-Euclidean deep neural network, we implement several basic neural network operations. Complex operations can be decomposed into basic operations explicitly or realized in the tangent space implicitly. A usage sketch follows the list below.

  • variable(t, c) - Defines a Riemannian variable on the manifold or in the tangent space at the origin, according to its name.
  • to_manifold(t, c, base) - Converts a tensor t in the tangent space of base point to the manifold.
  • to_tangent(t, c, base) - Converts a tensor t in the manifold to the tangent space of base point.
  • weight_sum(tensor_list, a, c) - Computes the weighted sum of the tensors in tensor_list with weights a.
  • mean(t, c, axis) - Computes the mean of elements along the axis dimension of tensor t.
  • sum(t, c, axis) - Computes the sum of elements along the axis dimension of tensor t.
  • concat(tensor_list, c, axis) - Concatenates tensor list tensor_list along axis dimension.
  • matmul(t, m, c) - Multiplies tensor t by the Euclidean matrix m.
  • add(x, y, c) - Adds tensor x and tensor y.
  • add_bias(t, b, c) - Adds a Euclidean bias vector b to tensor t.
  • activation(t, c_in, c_out, act) - Computes the value of activation function act for the input tensor t.
  • linear(t, in_dim, out_dim, c_in, c_out, act, scope) - Computes the linear transformation for the input tensor t.
  • distance(src, tar, c) - Computes the squared geodesic distance between src and tar.
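The following is a minimal sketch of how two of the operations above might be composed into a hyperbolic scoring layer. It assumes the operations are exposed as methods of a manifold instance and that TensorFlow 1.x graph mode is used; the function name build_score and the fixed curvature value are our own.

import tensorflow as tf
from curvlearn.manifolds import PoincareBall

manifold = PoincareBall()
c = -1.0  # assumed fixed curvature shared by the layer input and output

def build_score(src, tar, in_dim, out_dim):
    """Maps two batches of points through a hyperbolic linear layer and scores
    each pair by its squared geodesic distance (signatures follow the list above)."""
    h_src = manifold.linear(src, in_dim, out_dim, c_in=c, c_out=c, act=tf.nn.relu, scope="src")
    h_tar = manifold.linear(tar, in_dim, out_dim, c_in=c, c_out=c, act=tf.nn.relu, scope="tar")
    return manifold.distance(h_src, h_tar, c=c)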

Optimizers

We also implement several typical Riemannian optimizers; a usage sketch follows the list. Please refer to [4] for more details.

  • curvlearn.optimizers.rsgd - Riemannian stochastic gradient optimizer.
  • curvlearn.optimizers.radagrad - Riemannian Adagrad optimizer.
  • curvlearn.optimizers.radam - Riemannian Adam optimizer.
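Below is a hedged sketch of plugging a Riemannian optimizer into a standard TensorFlow 1.x training graph. Only the name radam is taken from the list above; whether it is a class or a module, and its constructor arguments, are assumptions, as is the tf.train.Optimizer-style minimize() interface.

import tensorflow as tf
from curvlearn.optimizers import radam  # name taken verbatim from the list above

# A plain TensorFlow variable and quadratic loss keep the sketch self-contained;
# in practice the loss would be built from the manifold operations above.
w = tf.get_variable("w", shape=[8], initializer=tf.zeros_initializer())
loss = tf.reduce_sum(tf.square(w - 1.0))

# Assumed tf.train.Optimizer-style interface with a minimize() method.
train_op = radam(learning_rate=1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(100):
        sess.run(train_op)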

How to use CurvLearn

To get started with CurvLearn quickly, we provide a simple binary classification model as a quick start and three representative examples as application demos. Note that non-Euclidean models are sensitive to hyper-parameters such as the learning rate, loss function, optimizer, and initializer; it is necessary to tune these hyper-parameters when transferring to other datasets.

Installation

CurvLearn requires tensorflow~=1.15 and is compatible with both Python 2 and 3.

The preferred way to install is via pip.

pip install curvlearn

Quick Start

Here we show how to build a binary classification model using CurvLearn. The model uses the Stereographic manifold, the linear operation, the radam optimizer, etc.

Instructions and implementation details are shown in Quick Start.
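As a hedged sketch (not the Quick Start itself), the snippet below only illustrates how the ingredients named above might fit together: a Stereographic manifold, two linear layers, and the radam optimizer. The convention that curvature 0.0 denotes the Euclidean input/output side, the zero-argument constructor, and the minimize() interface are all assumptions.

import tensorflow as tf
from curvlearn.manifolds import Stereographic
from curvlearn.optimizers import radam

manifold = Stereographic()
c = -1.0  # assumed curvature of the hidden representation

features = tf.placeholder(tf.float32, [None, 16])
labels   = tf.placeholder(tf.float32, [None, 1])

# Euclidean input (c_in=0.0) -> curved hidden layer -> Euclidean logits (c_out=0.0).
hidden = manifold.linear(features, 16, 8, c_in=0.0, c_out=c, act=tf.nn.relu, scope="hidden")
logits = manifold.linear(hidden, 8, 1, c_in=c, c_out=0.0, act=None, scope="logits")

loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits))
train_op = radam(learning_rate=1e-2).minimize(loss)  # assumed minimize() interface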

HGCN on Link Prediction [2]

HGCN (Hyperbolic Graph Convolutional Neural Network) is the first inductive hyperbolic GCN that leverages both the expressiveness of GCNs and hyperbolic geometry to learn node representations for hierarchical and scale-free graphs. Run the following command to check the accuracy on the OpenFlight airport dataset. The running environment and performance are listed in hgcn.

python examples/hgcn/train.py

HyperML on Recommendation Ranking [3]

HyperML (Hyperbolic Metric Learning) applies hyperbolic geometry to recommender systems through a metric learning approach and achieves state-of-the-art performance on multiple benchmark datasets. Run the following command to check the accuracy on the Amazon Kindle-Store dataset. The running environment and performance are listed in hyperml.

python examples/hyperml/train.py

Hyper Tree Pre-train Model

In the real world, data is often organized in tree-like structures or can be represented hierarchically. It has been shown that hyperbolic deep neural networks have significant advantages over Euclidean models for representing tree-structured data. Here, we present a hyperbolic graph pre-training model for the category tree in Taobao. Further details, including the dataset description, model architecture, and visualization of results, can be found in CateTreePretrain.

python examples/tree_pretrain/run_model.py

References

[1] Bachmann, Gregor, Gary Bécigneul, and Octavian Ganea. "Constant curvature graph convolutional networks." International Conference on Machine Learning. PMLR, 2020.

[2] Chami, Ines, et al. "Hyperbolic graph convolutional neural networks." Advances in neural information processing systems 32 (2019): 4868-4879.

[3] Vinh Tran, Lucas, et al. "Hyperml: A boosting metric learning approach in hyperbolic space for recommender systems." Proceedings of the 13th International Conference on Web Search and Data Mining. 2020.

[4] Bécigneul, Gary, and Octavian-Eugen Ganea. "Riemannian adaptive optimization methods." arXiv preprint arXiv:1810.00760 (2018).

License

This project is licensed under the Apache License, Version 2.0, unless otherwise explicitly stated.
