🤖 farabio ❤️

PyTorch toolkit for biomedical imaging

🎉 What's New

August 26, 2021

Publishing farabio==0.0.3 (latest version):
PyPI | Release notes

August 18, 2021

Publishing farabio==0.0.2:
PyPI | Release notes

April 21, 2021

This work was presented at PyTorch Ecosystem Day. The poster is here.

April 2, 2021

Publishing farabio==0.0.1:
PyPI | Release notes

March 3, 2021

This work was selected for PyTorch Ecosystem Day.

💡 Introduction

farabio is a minimal PyTorch toolkit for out-of-the-box deep learning support in biomedical imaging. For further information, see Wikis and Docs.

🔥 Features

  • Biomedical datasets
  • Common DL models
  • Flexible trainers (in progress)

📚 Biodatasets

Datasets shipped in recent releases (see the Docs for the up-to-date list):

  • ChestXrayDataset
  • DSB18Dataset
  • HistocancerDataset
  • RANZCRDataset
  • RetinopathyDataset
  • VinBigDataset

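These dataset classes are meant to drop into a standard PyTorch data pipeline. The snippet below is a minimal sketch, not the documented API: the import path farabio.data.biodatasets and the constructor arguments (root, download) are assumptions, so check the Docs for the exact signatures.

# Sketch only: the module path and constructor arguments below are assumed,
# not taken from the farabio documentation.
import torch
from torch.utils.data import DataLoader

from farabio.data.biodatasets import ChestXrayDataset  # assumed import path

train_set = ChestXrayDataset(root="./data", download=True)  # assumed arguments
train_loader = DataLoader(train_set, batch_size=16, shuffle=True, num_workers=2)

images, labels = next(iter(train_loader))  # assumes (image, label) samples
print(images.shape, labels.shape)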
🚢 Models

Classification: AlexNet, GoogLeNet, MobileNetV2, MobileNetV3, ResNet, ShuffleNetV2, SqueezeNet, VGG

Segmentation: U-Net, Attention U-Net

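Exact constructors vary by release; the sketch below assumes torchvision-style factory functions under a farabio.models namespace (an assumption, not the documented API) and simply checks that a forward pass produces the expected output shape.

# Sketch only: `farabio.models` and the `resnet18(num_classes=...)` factory
# are assumed, modeled on torchvision; consult the Docs for the real API.
import torch

from farabio.models import resnet18  # assumed import path and factory name

model = resnet18(num_classes=2)   # e.g. a binary chest X-ray classifier
x = torch.randn(4, 3, 224, 224)   # dummy batch of RGB images
with torch.no_grad():
    logits = model(x)
print(logits.shape)               # expected: torch.Size([4, 2])
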
🚀 Getting started (Installation)

1. Create and activate conda environment:

conda create -n myenv python=3.8
conda activate myenv

2. Install PyTorch:

pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 torchaudio==0.9.0 -f https://download.pytorch.org/whl/torch_stable.html

3. Install farabio:

A. With pip:

pip install farabio

B. Install from source:

git clone https://github.com/tuttelikz/farabio.git && cd farabio
pip install .
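
To verify the environment, a quick check can be run from Python. This is a minimal sketch; the farabio.__version__ attribute is an assumption, so the snippet falls back to a plain success message if it is absent.

# Post-install sanity check.
import torch
import farabio

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
# `__version__` is assumed; a clean import already confirms the install.
print("farabio:", getattr(farabio, "__version__", "installed"))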

🤿 Tutorials

Tutorial 1: Training a classifier for ChestXrayDataset - Notebook
Tutorial 2: Training a segmentation model for DSB18Dataset - Notebook
Tutorial 3: Training a Faster-RCNN detection model for VinBigDataset - Notebook
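
The notebooks are the authoritative reference. As a rough outline of what Tutorial 1 involves, below is a generic PyTorch training loop for a classifier; model and train_loader are placeholders for whichever farabio model and dataset the notebook actually uses (illustrative assumptions, not the notebook's exact code).

# Generic supervised training loop with cross-entropy loss; `model` and
# `train_loader` stand in for the farabio objects used in Tutorial 1.
import torch
import torch.nn as nn

def train(model, train_loader, epochs=5, lr=1e-3, device="cuda"):
    device = device if torch.cuda.is_available() else "cpu"
    model = model.to(device)
    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)

    for epoch in range(epochs):
        model.train()
        running_loss = 0.0
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
            running_loss += loss.item()
        print(f"epoch {epoch + 1}: loss = {running_loss / len(train_loader):.4f}")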

🔎 Links

Credits

If you like this repository, please give it a star.

How to cite | doi:

@software{sanzhar_askaruly_2021_5746474,
  author       = {Sanzhar Askaruly and
                  Nurbolat Aimakov and
                  Alisher Iskakov and
                  Hyewon Cho and
                  Yujin Ahn and
                  Myeong Hoon Choi and
                  Hyunmo Yang and
                  Woonggyu Jung},
  title        = {Farabio: Deep learning for biomedical imaging},
  month        = dec,
  year         = 2021,
  publisher    = {Zenodo},
  version      = {v0.0.3-doi},
  doi          = {10.5281/zenodo.5746474},
  url          = {https://doi.org/10.5281/zenodo.5746474}
}

📃 Licenses

This work is licensed under the Apache License 2.0.

🤩 Acknowledgements

This work builds upon the efforts of the open-source PyTorch community. I have tried to acknowledge related works (GitHub links, arXiv papers) inside the source material, e.g. the README, documentation, and code docstrings. Please get in touch if I missed anything.

Releases

  • v0.0.3-doi (Dec 1, 2021)

  • v0.0.3 (Aug 25, 2021)

  • v0.0.2 (Aug 17, 2021)

    TL;DR: This is a fresh, restructured release compared to v0.0.1. It ships several classification models and biodatasets in a PyTorch-friendly format.

    Models:

    • AlexNet
    • GoogLeNet
    • MobileNetV2
    • MobileNetV3
    • ResNet
    • ShuffleNetV2
    • SqueezeNet
    • VGG

    Biodatasets:

    • ChestXrayDataset
    • DSB18Dataset
    • HistocancerDataset
    • RANZCRDataset
    • RetinopathyDataset

  • v0.0.1 (Aug 25, 2021)

    TL;DR: This is the very first release. It ships various baseline models for classification, segmentation, detection, super-resolution, and image-translation tasks, and lays out the basis for the model trainers and biodatasets. The architectures are not as polished; please prefer newer releases.

    Biodatasets:

    • ChestXrayDataset
    • DSB18Dataset
    • HistocancerDataset
    • RANZCRDataset
    • RetinopathyDataset

    Trainers:

    • BaseTrainer
    • ConvnetTrainer
    • GanTrainer

    Models:

    • DenseNet
    • GoogLeNet
    • VGG
    • ResNet
    • MobileNetV2
    • ShuffleNetV2
    • ViT
    • U-Net
    • Attention U-Net
    • FasterRCNN
    • YOLOv3
    • CycleGAN
    • SRGAN