TensorGraphicalModels

Julia package for multiway (inverse) covariance estimation.

Overview

TensorGraphicalModels.jl is a suite of Julia tools for estimating high-dimensional multiway (tensor-variate) covariance and inverse covariance matrices.

Installation

] add https://github.com/ywa136/TensorGraphicalModels.jl
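
The ] prefix above enters the Pkg REPL mode. Equivalently, the package can be added from a script or notebook through the Pkg API:

using Pkg
Pkg.add(url = "https://github.com/ywa136/TensorGraphicalModels.jl")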

Examples

Please check out the Julia Colab notebook created to illustrate some of the package's functionality. Some basic examples are also given below:

Example code for fitting a KP (Kronecker product) inverse covariance model:

using TensorGraphicalModels

model_type = "kp"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list) #multi-dimensional array (tensor) of dimension d_1 × … × d_K × N
Ψ_hat_list = kglasso(X)
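
kglasso returns the list of estimated Kronecker factors, one per mode. As a minimal follow-up sketch (assuming the factors are returned in the same order as d_list; the ordering inside kron may need to be reversed depending on the vectorization convention), the full precision matrix implied by the Kronecker product model can be assembled directly:

Ω_hat = reduce(kron, Ψ_hat_list)  # Kronecker product of the estimated factors
size(Ω_hat)                       # (prod(d_list), prod(d_list))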

Example code for fitting a KS (Kronecker sum) inverse covariance model:

using TensorGraphicalModels

model_type = "ks"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) #matrix of dimension d × N

# compute the mode-k Gram matrices (the sufficient statistics for TeraLasso)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list), k))
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0)
    end
end

Ψ_hat_list, _ = teralasso(X_kGram)
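
teralasso estimates the mode-wise factors of a Kronecker-sum precision matrix. The following is a minimal sketch of assembling the full precision matrix from the returned factors; kron_sum is an illustrative helper defined here, not part of the package, and the mode ordering inside kron may need to be reversed depending on the vectorization convention.

using LinearAlgebra, SparseArrays

# Kronecker sum: Ω = Σ_k I ⊗ … ⊗ Ψ_k ⊗ … ⊗ I
function kron_sum(factors)
    d = [size(Ψ, 1) for Ψ in factors]
    Ω = spzeros(prod(d), prod(d))
    for k = 1:length(factors)
        left  = sparse(1.0I, prod(d[1:k-1]), prod(d[1:k-1]))
        right = sparse(1.0I, prod(d[k+1:end]), prod(d[k+1:end]))
        Ω += kron(kron(left, sparse(factors[k])), right)
    end
    return Ω
end

Ω_hat = kron_sum(Ψ_hat_list)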

Example code for fitting a Sylvester inverse covariance model:

using TensorGraphicalModels, LinearAlgebra, SparseArrays

model_type = "sylvester"
sub_model_type = "sb" #this defines the structure of the Kronecker factors, sb = star-block
K = 3
N = 1000
d_list = [5, 10, 15]

X = gen_kronecker_data(model_type, sub_model_type, K, N, d_list, tensorize_out = false) #matrix of dimension d × N

# compute the mode-k Gram matrices (used, together with X, by the Sylvester graphical lasso solver)
X_kGram = [zeros(d_list[k], d_list[k]) for k = 1:K]
Xk = [zeros(d_list[k], Int(prod(d_list) / d_list[k])) for k = 1:K]
for k = 1:K
    for i = 1:N
        copy!(Xk[k], tenmat(reshape(view(X, :, i), d_list), k))
        mul!(X_kGram[k], Xk[k], copy(transpose(Xk[k])), 1.0 / N, 1.0)
    end
end

Psi0 = [sparse(1.0I, d_list[k], d_list[k]) for k = 1:K] # initialize the factors at the identity
fun = (iter, Psi) -> [1, time()] # trivial callback; replace to track per-iteration diagnostics
lambda = [sqrt(d_list[k] * log(prod(d_list)) / N) for k = 1:K] # mode-wise regularization parameters

Ψ_hat_list, _ = syglasso_palm(X, X_kGram, lambda, Psi0, fun = fun)
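
In the Sylvester model the precision matrix is, up to mode-ordering conventions, the square of the Kronecker sum of the factors. A minimal sketch, reusing the illustrative kron_sum helper from the KS example above:

Ω_hat = kron_sum(Ψ_hat_list)^2  # precision = squared Kronecker sum of the estimated factors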

Example code for fitting a KPCA (Kronecker PCA) covariance model:

using TensorGraphicalModels, Distributions, PDMats, Statistics

px = py = 25 # robust Kronecker PCA works for K = 2 modes only
N = 100
X = zeros((px * py, N))

for i=1:N
    X[:, i] .= vec(rand(MatrixNormal(zeros((px, py)), ScalMat(px, 2.0), ScalMat(py, 4.0))))
end

S = cov(copy(X')) #sample covariance matrix
lambdaL = 20 * (px^2 + py^2 + log(max(px, py, N))) / N
lambdaS = 20 * sqrt(log(px * py)/N)

# robust Kronecker PCA methods using singular value thresholding
Sigma_hat = robust_kron_pca(S, px, py, lambdaL, lambdaS, "SVT"; tau = 0.5, r = 5)
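
Because the data above are simulated from a matrix-normal with row covariance 2I and column covariance 4I, the true covariance of the vectorized observations is their Kronecker product, which allows a quick sanity check of the estimate (a sketch assuming the standard convention cov(vec(X)) = V ⊗ U used by MatrixNormal):

using LinearAlgebra

Sigma_true = kron(Matrix(4.0I, py, py), Matrix(2.0I, px, px)) # V ⊗ U
rel_err = norm(Sigma_hat - Sigma_true) / norm(Sigma_true)     # relative Frobenius error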

Owner

Wayne Wang, Ph.D. candidate in statistics