Deep Markov Factor Analysis (NeurIPS 2021)

Overview

This repository contains the code and experiments for the Deep Markov Factor Analysis (DMFA) model, accepted for publication at NeurIPS 2021:

A. Farnoosh and S. Ostadabbas, “Deep Markov Factor Analysis: Towards concurrent temporal and spatial analysis of fMRI data,” in Thirty-fifth Annual Conference on Neural Information Processing Systems (NeurIPS), 2021.

Dependencies:

NumPy, SciPy, PyTorch, NiBabel, tqdm, Matplotlib, scikit-learn, json, pandas
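If these are not already installed, a command along the following lines should cover them (this is a sketch assuming the standard PyPI package names; json ships with the Python standard library and needs no install):

pip install numpy scipy torch nibabel tqdm matplotlib scikit-learn pandas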

Autism Dataset:

Run the following snippet to restore results for the Autism dataset from pre-trained checkpoints into the ./fMRI_results folder. A few instances from each dataset are included so that the code runs without errors. Replace {site} with Caltec, Leuven, MaxMun, NYU_00, SBL_00, Stanfo, Yale_0, USM_00, DSU_0, or UM_1_0, or set -exp autism for the full dataset. Due to storage limitations, only the checkpoint files for Caltec, SBL_00, and Stanfo are included.

python dmfa_fMRI.py -t 75 -exp autism_{site} -dir ./data_autism/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -restore
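For example, to restore results for the Caltec site (one of the three sites whose checkpoints are included), the command above becomes:

python dmfa_fMRI.py -t 75 -exp autism_Caltec -dir ./data_autism/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -restore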

Alternatively, run the following snippet to train with a batch size of 10 (the full dataset must be downloaded and preprocessed/formatted beforehand):

python dmfa_fMRI.py -t 75 -exp autism_{site} -dir ./data_autism/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -bs 10

After downloading the full Autism dataset, run the following snippet to preprocess/format the data:

python generate_fMRI_patches.py -T 75 -dir ./path_to_data/ -ext /*.gz -spath ./data_autism/

Depression Dataset:

Run the following snippet to restore results for the Depression dataset from pre-trained checkpoints into the ./fMRI_results folder. A few instances from the dataset are included so that the code runs without errors. Replace {ID} with 1, 2, 3, or 4. ID 4 corresponds to the first experiment on the Depression dataset in the paper; IDs 2 and 3 correspond to the second experiment.

python dmfa_fMRI.py -exp depression_{ID} -dir ./data_depression/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -restore
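For example, to restore results for ID 4 (the first Depression experiment in the paper), the command above becomes:

python dmfa_fMRI.py -exp depression_4 -dir ./data_depression/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -restore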

Alternatively, run the following snippet to train with a batch size of 10 (the full dataset must be downloaded and preprocessed/formatted beforehand):

python dmfa_fMRI.py -exp depression_{ID} -dir ./data_depression/ -smod ./ckpt_fMRI/ -dpath ./fMRI_results/ -bs 10

After downloading the full Depression dataset, run the following snippet to preprocess/format the data:

python generate_fMRI_patches_depression.py -T 6 -dir ./path_to_data/ -spath ./data_depression/

Synthetic fMRI data:

Run the following snippet to restore results for the synthetic experiment from the pre-trained checkpoint into the ./synthetic_results folder (the synthetic fMRI data itself is not included due to storage limitations).

python dmfa_synthetic.py
