Deep learning for spiking neural networks

Overview

A deep learning library for spiking neural networks.


Norse aims to exploit the advantages of bio-inspired neural components, which are sparse and event-driven: a fundamental difference from artificial neural networks. Norse expands PyTorch with primitives for bio-inspired neural components, bringing you two advantages: a modern and proven infrastructure based on PyTorch, and deep learning-compatible spiking neural network components.

Documentation: norse.github.io/norse/

1. Getting started

To try Norse, the best option is to run one of the Jupyter notebooks on Google Colab.

Alternatively, you can install Norse and run one of the included tasks such as MNIST:

python -m norse.task.mnist

2. Using Norse

Norse presents plug-and-play components for deep learning with spiking neural networks. Here, we describe how to install Norse and start to apply it in your own work. Read more in our documentation.

2.1. Installation

We assume you are using Python version 3.7+, are in a terminal-friendly environment, and have installed the necessary requirements. Read more in our documentation.

Method        Instructions                                          Prerequisites
From PyPI     pip install norse                                     Pip
From source   pip install -qU git+https://github.com/norse/norse    Pip, PyTorch
With Docker   docker pull quay.io/norse/norse                       Docker
From Conda    conda install -c norse norse                          Anaconda or Miniconda
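
After installing through any of these methods, you can sanity-check the setup from a Python shell. A minimal check, assuming Norse exposes a __version__ attribute (a common packaging convention, not a documented Norse API):

import torch
import norse

# Print versions to confirm both packages import correctly; __version__ is
# assumed to exist by convention, not guaranteed by the Norse documentation.
print(torch.__version__)
print(norse.__version__)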

2.2. Running examples

Norse is bundled with a number of example tasks, serving as short, self-contained, correct examples (SSCCE). They can be run by invoking the norse module from the base directory. More information and tasks are available in our documentation and in your console by typing: python -m norse.task.<task> --help, where <task> is one of the task names.

  • To train an MNIST classification network, invoke
    python -m norse.task.mnist
  • To train a CIFAR classification network, invoke
    python -m norse.task.cifar10
  • To train the cartpole balancing task with policy gradients, invoke
    python -m norse.task.cartpole

Norse is compatible with PyTorch Lightning, as demonstrated in the PyTorch Lightning MNIST task variant (requires PyTorch Lightning):

python -m norse.task.mnist_pl --gpus=4

2.3. Example: Spiking convolutional classifier


This classifier is taken from our tutorial on training a spiking MNIST classifier and achieves >99% accuracy.

import torch, torch.nn as nn
from norse.torch import LICell             # Leaky integrator
from norse.torch import LIFCell            # Leaky integrate-and-fire
from norse.torch import SequentialState    # Stateful sequential layers

model = SequentialState(
    nn.Conv2d(1, 20, 5, 1),      # Convolve from 1 -> 20 channels
    LIFCell(),                   # Spiking activation layer
    nn.MaxPool2d(2, 2),
    nn.Conv2d(20, 50, 5, 1),     # Convolve from 20 -> 50 channels
    LIFCell(),
    nn.MaxPool2d(2, 2),
    nn.Flatten(),                # Flatten to 800 units
    nn.Linear(800, 10),
    LICell(),                    # Non-spiking integrator layer
)

data = torch.randn(8, 1, 28, 28) # 8 batches, 1 channel, 28x28 pixels
output, state = model(data)      # Provides a tuple (tensor (8, 10), neuron state)
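
To turn this forward pass into training, one can unroll the stateful model over several timesteps and backpropagate through time. The following is a minimal sketch, not the tutorial's exact code: it assumes SequentialState accepts an optional state argument and returns an updated one, and seq_length, the random images, and the labels are illustrative placeholders.

import torch

seq_length = 32                             # Simulation timesteps (illustrative)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = torch.nn.CrossEntropyLoss()

images = torch.randn(8, 1, 28, 28)          # Stand-in for a normalized MNIST batch
labels = torch.randint(0, 10, (8,))         # Stand-in class labels

optimizer.zero_grad()
state = None
readouts = []
for _ in range(seq_length):                 # Unroll the network in time
    out, state = model(images, state)       # Carry the neuron state forward
    readouts.append(out)
logits = torch.stack(readouts).max(dim=0).values  # Peak readout voltage over time

loss = loss_fn(logits, labels)
loss.backward()                             # Backpropagation through time
optimizer.step()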

2.4. Example: Long short-term spiking neural networks

The long short-term spiking neural network from the paper by G. Bellec, D. Salaj, A. Subramoney, R. Legenstein, and W. Maass (2018) is another interesting way to apply Norse:

import torch
from norse.torch import LSNNRecurrent
# Recurrent LSNN network with 2 input neurons and 10 output neurons
layer = LSNNRecurrent(2, 10)
# Generate data: 20 timesteps with 8 datapoints per batch for 2 neurons
data  = torch.zeros(20, 8, 2)
# Tuple of (output spikes of shape (20, 8, 2), layer state)
output, new_state = layer(data)
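
Because the layer returns its state explicitly, longer spike streams can be processed in chunks while carrying the state between calls. A small sketch, assuming the forward method accepts an optional state argument (mirroring the tuple convention above); the 100-timestep zero stream is a placeholder:

stream = torch.zeros(100, 8, 2)          # Placeholder spike stream
state = None
for chunk in stream.split(20):           # Five chunks of 20 timesteps each
    output, state = layer(chunk, state)  # Reuse the recurrent state across chunks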

3. Why Norse?

Norse was created for two reasons: 1) to apply findings from decades of research in practical settings, and 2) to accelerate our own research within bio-inspired learning.

We are passionate about Norse: we strive to follow best practices and promise to maintain this library for the simple reason that we depend on it ourselves. We have implemented a number of neuron models, synapse dynamics, encoding and decoding algorithms, dataset integrations, tasks, and examples. Combined with the PyTorch infrastructure and our high coding standards, we have found Norse to be an excellent tool for modelling scalable experiments, and Norse is actively being used in research.
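
As one example of the encoding algorithms mentioned above, a Poisson (rate) encoder turns analog values into spike trains. A brief sketch, assuming PoissonEncoder is re-exported from norse.torch (import paths may differ between versions):

import torch
from norse.torch import PoissonEncoder

# Rate-code a batch of analog values in [0, 1] into Poisson spike trains.
encoder = PoissonEncoder(seq_length=32)   # 32 timesteps per sample
analog = torch.rand(8, 1, 28, 28)         # e.g. normalized MNIST images
spikes = encoder(analog)                  # Shape: (32, 8, 1, 28, 28)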

Finally, we are working to keep Norse as performant as possible. Preliminary benchmarks suggest that Norse achieves excellent performance on small networks of up to ~5000 neurons per layer. Aided by the preexisting investment in scalable training and inference with PyTorch, Norse scales from a single laptop to several nodes on an HPC cluster with little effort, as illustrated by our PyTorch Lightning example task.

Read more about Norse in our documentation.

4. Similar work

The list of projects below serves to illustrate the state of the art, while explaining our own incentives to create and use Norse.

  • BindsNET also builds on PyTorch and is explicitly targeted at machine learning tasks. It implements a Network abstraction with the typical 'node' and 'connection' notions common in spiking neural network simulators like NEST.
  • cuSNN is a C++ GPU-accelerated simulator for large-scale networks. The library focuses on CUDA and includes spike-time dependent plasticity (STDP) learning rules.
  • decolle implements an online learning algorithm described in the paper "Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE)" by J. Kaiser, M. Mostafa and E. Neftci.
  • GeNN compiles SNN network models to NVIDIA CUDA to achieve high-performing SNN model simulations.
  • Long short-term memory Spiking Neural Networks (LSNN) is a tool from the University of Graz for modelling LSNN cells in TensorFlow. The library focuses on a single neuron and gradient model.
  • Nengo is a neuron simulator, and Nengo-DL is a deep learning network simulator that optimises spike-based neural networks based on an approximation method suggested by Hunsberger and Eliasmith (2016). This approach maps to, but does not build on, the deep learning framework TensorFlow, which is fundamentally different from incorporating the spiking constructs into the framework itself. In turn, this requires manual translations into each individual backend, which limits portability.
  • Neuron Simulation Toolkit (NEST) constructs and evaluates highly detailed simulations of spiking neural networks. This is useful in a medical/biological sense but maps poorly to large datasets and deep learning.
  • PyNN is a Python interface that allows you to define and simulate spiking neural network models on different backends (both software simulators and neuromorphic hardware). It does not currently provide mechanisms for optimisation or arbitrary synaptic plasticity.
  • PySNN is a PyTorch extension similar to Norse. Its approach to model building is slightly different from Norse's in that its neurons are stateful.
  • Rockpool is a Python package developed by SynSense for training, simulating and deploying spiking neural networks. It offers both JAX and PyTorch primitives.
  • Sinabs is a PyTorch extension by SynSense. It mainly focuses on convolutions and translation to neuromorphic hardware.
  • SlayerPyTorch is a Spike LAYer Error Reassignment (SLAYER) library that focuses on solutions for the temporal credit assignment problem of spiking neurons and a probabilistic approach to backpropagation errors. It includes support for the Loihi chip.
  • SNN toolbox automates the conversion of pre-trained analog neural networks to spiking neural networks. The tool is solely for already-trained networks and omits the (possibly platform-specific) training.
  • snnTorch is a simulator built on PyTorch, featuring several introduction tutorials on deep learning with SNNs.
  • SpikingJelly is another PyTorch-based spiking neural network simulator. SpikingJelly uses stateful neurons and includes an example of training a network on MNIST.
  • SpyTorch presents a set of tutorials for training SNNs with the surrogate gradient approach SuperSpike by F. Zenke and S. Ganguli (2017). Norse implements SuperSpike, but allows for other surrogate gradients and training approaches (see the sketch after this list).
  • s2net is based on the implementation presented in SpyTorch, but implements convolutional layers as well. It also contains a demonstration of how to use those primitives to train a model on the Google Speech Commands dataset.
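
To make the SuperSpike point above concrete: in Norse, the surrogate gradient is selected through the neuron parameter set. A hedged sketch, assuming LIFParameters exposes method and alpha fields and supports a "tanh" surrogate (as in recent Norse versions):

import torch
from norse.torch import LIFCell, LIFParameters

# Swap the default SuperSpike surrogate ("super") for a tanh surrogate via
# the parameter set; field names are assumed from recent Norse versions.
cell = LIFCell(p=LIFParameters(method="tanh", alpha=torch.tensor(100.0)))
spikes, state = cell(torch.randn(8, 2))   # One timestep, batch of 8, 2 features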

5. Contributing

Contributions are warmly encouraged and always welcome. However, we also have high expectations around the code base, so if you wish to contribute, please refer to our contribution guidelines.

6. Credits

Norse is created by Christian Pehle and Jens Egholm Pedersen.

More information about Norse can be found in our documentation. The research has received funding from the EC Horizon 2020 Framework Programme under Grant Agreements 785907 and 945539 (HBP) and by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany's Excellence Strategy EXC 2181/1 - 390900948 (the Heidelberg STRUCTURES Excellence Cluster).

7. Citation

If you use Norse in your work, please cite it as follows:

@software{norse2021,
  author       = {Pehle, Christian and
                  Pedersen, Jens Egholm},
  title        = {{Norse - A deep learning library for spiking
                   neural networks}},
  month        = jan,
  year         = 2021,
  note         = {Documentation: https://norse.ai/docs/},
  publisher    = {Zenodo},
  version      = {0.0.6},
  doi          = {10.5281/zenodo.4422025},
  url          = {https://doi.org/10.5281/zenodo.4422025}
}

Norse is actively applied and cited in the literature. We are keeping track of the papers citing Norse in our documentation.

8. License

LGPLv3. See LICENSE for license details.
