Warning: This project does not have any current developer. See below.

Overview

Pylearn2: A machine learning research library

Warning:

This project does not have any current developer. We will continue to review pull requests and merge them when appropriate, but do not expect new development unless someone decides to work on it.

There are other machine learning frameworks built on top of Theano that may interest you, such as Blocks, Keras, and Lasagne.

Pylearn2 is a library designed to make machine learning research easy.

Pylearn2 has online documentation. If you want to build a local copy of the documentation, run

python ./doc/scripts/docgen.py

More documentation is available in the form of commented example scripts and IPython notebooks in the pylearn2/scripts/tutorials directory.

Pylearn2 was initially developed by David Warde-Farley, Pascal Lamblin, Ian Goodfellow and others during the winter 2011 offering of IFT6266, and is now developed by the LISA lab.

Quick start and basic design rules

  • Installation instructions are available in the online documentation.
  • Subscribe to the pylearn-users Google group for important updates. Please write to this list for general inquiries and support questions.
  • Subscribe to the pylearn-dev Google group for important development updates. Please write to this list if you find a bug or want to contribute to the project.
  • Read through the documentation and examples mentioned above.
  • Pylearn2 should not force users to commit to the whole library. If someone just wants to implement a Model, they should be able to do that and not need to implement a TrainingAlgorithm. Try not to write library features that force users to buy into the whole library.
  • When writing reference implementations to go in the library, maximize code re-usability by decomposing your algorithm into a TrainingAlgorithm that trains a Model on a Dataset, usually by minimizing a Cost. In fact, you can probably use an existing TrainingAlgorithm; see the sketch after this list.
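The following is a minimal, hedged sketch of that decomposition using the standard pylearn2 classes (Train, SGD, MLP, DenseDesignMatrix); the toy data, layer sizes, and hyperparameters are illustrative assumptions, not recommended settings.

    # Sketch: a TrainingAlgorithm (SGD) trains a Model (MLP) on a Dataset
    # (DenseDesignMatrix). Hyperparameters are illustrative assumptions only.
    import numpy as np
    from pylearn2.datasets.dense_design_matrix import DenseDesignMatrix
    from pylearn2.models.mlp import MLP, Sigmoid, Softmax
    from pylearn2.training_algorithms.sgd import SGD
    from pylearn2.termination_criteria import EpochCounter
    from pylearn2.train import Train

    # Dataset: a toy design matrix with one-hot targets.
    X = np.random.randn(100, 20).astype('float32')
    y = np.eye(2, dtype='float32')[np.random.randint(0, 2, size=100)]
    dataset = DenseDesignMatrix(X=X, y=y)

    # Model: a small MLP ending in a softmax layer.
    model = MLP(nvis=20,
                layers=[Sigmoid(layer_name='h0', dim=50, irange=0.1),
                        Softmax(layer_name='y', n_classes=2, irange=0.1)])

    # TrainingAlgorithm: SGD minimizing the model's default cost, stopping
    # after a fixed number of epochs.
    algorithm = SGD(learning_rate=0.01, batch_size=20,
                    termination_criterion=EpochCounter(10))

    # Train ties the three components together.
    Train(dataset=dataset, model=model, algorithm=algorithm).main_loop()

Nothing above commits a user to the rest of the library: a custom Model or Dataset that implements the same interfaces can be dropped into the same loop.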

Highlights

  • Pylearn2 was used to set the state of the art on MNIST, CIFAR-10, CIFAR-100, and SVHN. See pylearn2.models.maxout or pylearn2/scripts/papers/maxout.
  • Pylearn2 provides a wrapper around Alex Krizhevsky's extremely efficient GPU convolutional network library. This wrapper lets you use Theano's symbolic differentiation and other capabilities with minimal overhead. See pylearn2.sandbox.cuda_convnet and the sketch below.
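Below is a hedged sketch of how the cuda_convnet wrapper is typically invoked; it assumes an older, CUDA-enabled Theano installation, and the shapes are illustrative (cuda-convnet expects the c01b layout, square filters, and a number of filters that is a multiple of 16).

    # Sketch: calling cuda-convnet's FilterActs through pylearn2; assumes a
    # CUDA-enabled Theano and uses the c01b layout (channels, rows, cols, batch).
    import numpy as np
    import theano
    import theano.tensor as T
    from theano.sandbox.cuda.basic_ops import gpu_contiguous
    from pylearn2.sandbox.cuda_convnet.filter_acts import FilterActs

    images = T.tensor4('images')                           # c01b input
    filters = theano.shared(                               # (channels, rows,
        np.random.randn(3, 5, 5, 32).astype('float32'))    #  cols, num_filters)

    conv = FilterActs(stride=1, partial_sum=1)
    output = conv(gpu_contiguous(images), gpu_contiguous(filters))

    # Theano's symbolic differentiation works through the wrapped op.
    grad = T.grad(output.sum(), filters)
    f = theano.function([images], [output, grad])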

License and Citations

Pylearn2 is released under the 3-clause BSD license, so it may be used for commercial purposes. The license does not require anyone to cite Pylearn2, but if you use Pylearn2 in published research work we encourage you to cite this article:

Ian J. Goodfellow, David Warde-Farley, Pascal Lamblin, Vincent Dumoulin, Mehdi Mirza, Razvan Pascanu, James Bergstra, Frédéric Bastien, and Yoshua Bengio. "Pylearn2: a machine learning research library". arXiv preprint arXiv:1308.4214, 2013.
