============================================================================================================
`MILA will stop developing Theano <https://groups.google.com/d/msg/theano-users/7Poq8BZutbY/rNCIfvAEAwAJ>`_.
The PyMC developers are continuing Theano development in a `fork <https://github.com/pymc-devs/theano-pymc>`_.
============================================================================================================

Theano is a Python library that allows you to define, optimize, and evaluate
mathematical expressions involving multi-dimensional arrays efficiently. It can
use GPUs and perform efficient symbolic differentiation.

To install the package, see this page:
   http://deeplearning.net/software/theano/install.html

For the documentation, see the project website:
   http://deeplearning.net/software/theano/

Related Projects:
   https://github.com/Theano/Theano/wiki/Related-projects

It is recommended that you look at the documentation on the website, as it
will be more current than the documentation included with the package.

In order to build the documentation yourself, you will need sphinx. Issue
the following command:

::

    python ./doc/scripts/docgen.py

Documentation is built into ``html/``.

The PDF of the documentation can be found at ``html/theano.pdf``.

================
DIRECTORY LAYOUT
================

``Theano`` (current directory) is the distribution directory.

* ``Theano/theano`` contains the package
* ``Theano/theano`` has several submodules:

  * ``gof`` + ``compile`` are the core
  * ``scalar`` depends upon the core
  * ``tensor`` depends upon ``scalar``
  * ``sparse`` depends upon ``tensor``
  * ``sandbox`` can depend on everything else

* ``Theano/examples`` contains copies of the examples found on the wiki
* ``Theano/benchmark`` and ``Theano/examples`` are in the distribution, but not
  in the Python package
* ``Theano/bin`` contains executable scripts that are copied to the bin folder
  when the Python package is installed
* Tests are distributed and are part of the package, i.e. they fall in the
  appropriate submodules
* ``Theano/doc`` contains files and scripts used to generate the documentation
* ``Theano/html`` is where the documentation will be generated
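
=============
QUICK EXAMPLE
=============

As a minimal sketch of the symbolic workflow described above, assuming a
working Theano installation and using only the public ``theano.tensor`` API,
the following defines an expression, differentiates it symbolically, and
compiles it into a callable function:

::

    import theano
    import theano.tensor as T

    # Declare a symbolic scalar and build an expression graph.
    x = T.dscalar('x')
    y = x ** 2

    # Symbolic differentiation: dy/dx = 2 * x.
    gy = T.grad(y, x)

    # Compile the graph into a callable function (Theano can target a GPU
    # here if one is configured).
    f = theano.function([x], [y, gy])

    print(f(3.0))  # -> [array(9.0), array(6.0)]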