Implementations of CNNs, RNNs, GANs, etc

Overview

Tensorflow Programs and Tutorials

This repository will contain Tensorflow tutorials on a lot of the most popular deep learning concepts. It'll also contain some experiments on cool papers that I read. Hopefully, the notebooks will be helpful to anyone reading!

  • CNNs with Noisy Labels - This notebook looks at a recent paper that discusses how convolutional neural networks trained on random labels (with some probability) are still able to achieve good accuracy on MNIST. I thought the paper showed some eyebrow-raising results, so I went ahead and tried it out for myself. It was pretty amazing to see that even when training a CNN with random labels 50% of the time, and the correct labels the other 50% of the time, the network was still able to get 90+% accuracy.
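
    To make the setup concrete, here is a minimal sketch (not the notebook's exact code) of the label-corruption step: with probability p, a true MNIST label is swapped for a uniformly random class. The names p and corrupt_labels are just illustrative.

    import numpy as np

    def corrupt_labels(labels, p=0.5, num_classes=10):
        # labels: 1-D array of integer class ids.
        # With probability p, replace each label with a uniformly random class.
        labels = labels.copy()
        flip_mask = np.random.rand(len(labels)) < p
        labels[flip_mask] = np.random.randint(0, num_classes, size=flip_mask.sum())
        return labels

    # Example: roughly half of these MNIST-style labels get randomized.
    noisy = corrupt_labels(np.array([3, 7, 1, 0, 9, 4]), p=0.5)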

  • Character Level RNN (Work in Progress) - This notebook shows you how to train a character level RNN in Tensorflow. The idea was inspired by Andrej Karpathy's famous blog post and was based on this Keras implementation. In this notebook, you'll learn more about what the model is doing, how you can input your own dataset, and how to train a model to generate similar looking text.
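
    As a rough sketch of the kind of TensorFlow 1.x graph involved (the vocabulary size, sequence length, and hidden size below are illustrative, not the notebook's actual settings):

    import tensorflow as tf

    vocab_size, seq_len, hidden_size = 65, 50, 128
    # One-hot encoded character sequences and next-character targets
    inputs = tf.placeholder(tf.float32, [None, seq_len, vocab_size])
    targets = tf.placeholder(tf.int32, [None, seq_len])

    cell = tf.contrib.rnn.BasicLSTMCell(hidden_size)
    outputs, _ = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

    # Project every timestep's hidden state back onto the vocabulary
    W = tf.Variable(tf.truncated_normal([hidden_size, vocab_size], stddev=0.1))
    b = tf.Variable(tf.zeros([vocab_size]))
    outputs_flat = tf.reshape(outputs, [-1, hidden_size])
    logits = tf.matmul(outputs_flat, W) + b              # [batch*seq_len, vocab_size]
    targets_flat = tf.reshape(targets, [-1])

    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=targets_flat, logits=logits))
    train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)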

  • Convolutional Neural Networks - This notebook goes through a simple convolutional neural network implementation in Tensorflow. The model is very similar to the one described in the Tensorflow docs. Hopefully this notebook can give you a better understanding of what is necessary to create and train your own CNNs. For a more conceptual view of CNNs, check out my introductory blog post on them.
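
    A condensed sketch of a graph along these lines, with one conv/pool layer and a softmax classifier (the layer sizes are illustrative and may not match the notebook):

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 28, 28, 1])    # MNIST images
    y_ = tf.placeholder(tf.float32, [None, 10])           # one-hot labels

    # 5x5 convolution with 32 filters, followed by 2x2 max pooling
    W_conv = tf.Variable(tf.truncated_normal([5, 5, 1, 32], stddev=0.1))
    b_conv = tf.Variable(tf.constant(0.1, shape=[32]))
    h_conv = tf.nn.relu(tf.nn.conv2d(x, W_conv, strides=[1, 1, 1, 1], padding='SAME') + b_conv)
    h_pool = tf.nn.max_pool(h_conv, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

    # Flatten and classify
    h_flat = tf.reshape(h_pool, [-1, 14 * 14 * 32])
    W_fc = tf.Variable(tf.truncated_normal([14 * 14 * 32, 10], stddev=0.1))
    b_fc = tf.Variable(tf.constant(0.1, shape=[10]))
    logits = tf.matmul(h_flat, W_fc) + b_fc

    cross_entropy = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
    train_step = tf.train.AdamOptimizer(1e-4).minimize(cross_entropy)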

  • Generative Adversarial Networks - This notebook goes through the creation of a generative adversarial network. GANs are one of the hottest topics in deep learning. From a high level, GANs are composed of two components, a generator and a discriminator. The discriminator has the task of determining whether a given image looks natural (ie, is an image from the dataset) or looks like it has been artificially created. The task of the generator is to create natural looking images that are similar to the original data distribution, images that look natural enough to fool the discriminator network. For more of a conceptual view of GANs, check out my blog post.
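
    The two-network structure can be sketched as follows. This is a minimal, fully connected illustration for flattened MNIST-sized images; the notebook's actual architectures may differ, and z_dim and the layer sizes here are assumptions.

    import tensorflow as tf

    z_dim = 100  # dimensionality of the random noise fed to the generator

    def generator(z):
        with tf.variable_scope('generator'):
            h = tf.layers.dense(z, 128, activation=tf.nn.relu)
            return tf.layers.dense(h, 784, activation=tf.nn.sigmoid)  # a "fake" flattened image

    def discriminator(img, reuse=False):
        with tf.variable_scope('discriminator', reuse=reuse):
            h = tf.layers.dense(img, 128, activation=tf.nn.relu)
            return tf.layers.dense(h, 1)  # logit: real vs. generated

    real_images = tf.placeholder(tf.float32, [None, 784])
    z = tf.placeholder(tf.float32, [None, z_dim])
    fake_images = generator(z)
    d_real = discriminator(real_images)
    d_fake = discriminator(fake_images, reuse=True)

    # The discriminator is pushed toward real->1 and fake->0; the generator toward fake->1
    d_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_real, labels=tf.ones_like(d_real)) +
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_fake, labels=tf.zeros_like(d_fake)))
    g_loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(logits=d_fake, labels=tf.ones_like(d_fake)))

    # Each optimizer only updates its own network's variables
    d_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, 'discriminator')
    g_vars = tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, 'generator')
    d_train = tf.train.AdamOptimizer(2e-4).minimize(d_loss, var_list=d_vars)
    g_train = tf.train.AdamOptimizer(2e-4).minimize(g_loss, var_list=g_vars)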

  • Linear and Logistic Regression - This notebook shows you how Tensorflow is not just a deep learning library, but a library centered on numerical computation, which allows you to create classic machine learning models relatively easily. Linear regression and logistic regression are two of the simplest, yet most useful models in all of machine learning.
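
    For instance, plain linear regression comes down to a handful of TensorFlow ops. A minimal sketch on made-up data:

    import numpy as np
    import tensorflow as tf

    # Toy data: y is roughly 3x + 2 plus noise
    x_data = np.random.rand(100, 1).astype(np.float32)
    y_data = 3 * x_data + 2 + 0.05 * np.random.randn(100, 1).astype(np.float32)

    x = tf.placeholder(tf.float32, [None, 1])
    y = tf.placeholder(tf.float32, [None, 1])
    W = tf.Variable(tf.zeros([1, 1]))
    b = tf.Variable(tf.zeros([1]))

    y_pred = tf.matmul(x, W) + b
    loss = tf.reduce_mean(tf.square(y_pred - y))           # mean squared error
    train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(200):
            sess.run(train_step, feed_dict={x: x_data, y: y_data})
        print(sess.run([W, b]))   # should end up close to 3 and 2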

  • Simple Neural Networks - This notebook shows you how to create simple 1 and 2 layer neural networks. We'll then see how these networks perform on MNIST, and look at the types of hyperparameters that affect a model's accuracy (network architecture, weight initialization, learning rate, etc.).
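
    A sketch of a 2-layer network (one hidden layer) for MNIST; the hidden size, initialization, and learning rate below are placeholders for exactly the hyperparameters being explored:

    import tensorflow as tf

    x = tf.placeholder(tf.float32, [None, 784])
    y_ = tf.placeholder(tf.float32, [None, 10])

    W1 = tf.Variable(tf.truncated_normal([784, 100], stddev=0.1))
    b1 = tf.Variable(tf.zeros([100]))
    h1 = tf.nn.relu(tf.matmul(x, W1) + b1)

    W2 = tf.Variable(tf.truncated_normal([100, 10], stddev=0.1))
    b2 = tf.Variable(tf.zeros([10]))
    logits = tf.matmul(h1, W2) + b2

    loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=logits))
    train_step = tf.train.GradientDescentOptimizer(0.5).minimize(loss)
    accuracy = tf.reduce_mean(
        tf.cast(tf.equal(tf.argmax(logits, 1), tf.argmax(y_, 1)), tf.float32))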

  • Math in Tensorflow - This notebook introduces you to variables, constants, and placeholders in Tensorflow. It'll go into describing sessions, and showing you how to perform typical mathematical operations and deal with large matrices.
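
    The basic building blocks look like this (a small sketch of constants, placeholders, variables, and a session):

    import tensorflow as tf

    a = tf.constant(2.0)                        # fixed value baked into the graph
    b = tf.placeholder(tf.float32)              # value supplied at run time
    w = tf.Variable(5.0)                        # mutable state, must be initialized

    product = a * b * w                         # ops only describe the computation...
    matmul = tf.matmul(tf.ones([2, 3]), tf.ones([3, 4]))

    with tf.Session() as sess:                  # ...a session actually executes it
        sess.run(tf.global_variables_initializer())
        print(sess.run(product, feed_dict={b: 3.0}))    # 30.0
        print(sess.run(matmul).shape)                   # (2, 4)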

  • Question Pair Classification with RNNs (Work in Progress) - This notebook looks at the question pair dataset that Quora released a little earlier this year. It looks at the ways in which you can build a machine learning model to predict whether two sentences are duplicates of one another. Before running this notebook, it's very important to extract all the data. We'll run the following command to get our word vectors and training/testing matrices.

    tar -xvzf Data/Quora/QuoraData.tar.gz
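
    One common way to frame such a model, shown purely as a sketch (not necessarily the approach the notebook ends up using): encode both questions with a shared LSTM and classify the pair from the two final states. The shapes and names here are illustrative.

    import tensorflow as tf

    max_len, embed_dim, hidden = 30, 50, 64
    q1 = tf.placeholder(tf.float32, [None, max_len, embed_dim])   # word vectors, question 1
    q2 = tf.placeholder(tf.float32, [None, max_len, embed_dim])   # word vectors, question 2
    is_duplicate = tf.placeholder(tf.float32, [None, 1])

    def encode(q, reuse):
        # Shared encoder: both questions run through the same LSTM weights
        with tf.variable_scope('encoder', reuse=reuse):
            cell = tf.contrib.rnn.BasicLSTMCell(hidden)
            _, state = tf.nn.dynamic_rnn(cell, q, dtype=tf.float32)
            return state.h   # final hidden state

    enc1, enc2 = encode(q1, False), encode(q2, True)
    features = tf.concat([enc1, enc2, tf.abs(enc1 - enc2)], axis=1)
    logits = tf.layers.dense(features, 1)
    loss = tf.reduce_mean(
        tf.nn.sigmoid_cross_entropy_with_logits(labels=is_duplicate, logits=logits))
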
  • SELU Nonlinearity - A recent paper titled "Self Normalizing Neural Networks" started getting a lot of buzz in June 2017. The main contribution of the paper was a new nonlinear activation function called the SELU (scaled exponential linear unit). We'll be looking at how this function performs in practice with simple neural nets and CNNs.
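
    The activation itself is just a scaled ELU with two fixed constants from the paper. A sketch of a hand-rolled version (newer TensorFlow releases also ship tf.nn.selu):

    import tensorflow as tf

    def selu(x):
        # Constants from "Self-Normalizing Neural Networks" (Klambauer et al., 2017)
        alpha = 1.6732632423543772
        scale = 1.0507009873554805
        return scale * tf.where(x > 0.0, x, alpha * (tf.exp(x) - 1.0))

    # e.g. a drop-in replacement for tf.nn.relu in a hidden layer:
    # h = selu(tf.matmul(x, W) + b)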

  • Sentiment Analysis with LSTMs - In this notebook, we'll be looking at how to apply deep learning techniques to the task of sentiment analysis. Sentiment analysis can be thought of as the exercise of taking a sentence, paragraph, document, or any piece of natural language, and determining whether that text's emotional tone is positive, negative or neutral. We'll look at why RNNs and LSTMs are the most popular choices for handling natural language processing tasks. Be sure to run the following commands to get our word vectors and training data.

    tar -xvzf Data/Sentiment/models.tar.gz
    tar -xvzf Data/Sentiment/training_data.tar.gz
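
    The core of the model looks roughly like this: look up pretrained word vectors for each token id and run an LSTM over them. This is a sketch only; the file name, dimensions, and layer sizes are illustrative rather than the notebook's exact values.

    import numpy as np
    import tensorflow as tf

    max_seq_len, lstm_units, num_classes = 250, 64, 2
    word_vectors = np.load('wordVectors.npy').astype(np.float32)   # pretrained embeddings (assumed file name)

    input_ids = tf.placeholder(tf.int32, [None, max_seq_len])
    labels = tf.placeholder(tf.float32, [None, num_classes])

    embedded = tf.nn.embedding_lookup(word_vectors, input_ids)
    cell = tf.contrib.rnn.BasicLSTMCell(lstm_units)
    outputs, _ = tf.nn.dynamic_rnn(cell, embedded, dtype=tf.float32)

    # Classify from the last timestep's output
    last = outputs[:, -1, :]
    logits = tf.layers.dense(last, num_classes)
    loss = tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(labels=labels, logits=logits))
    train_step = tf.train.AdamOptimizer(1e-3).minimize(loss)
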
  • Universal Approximation Theorem (Work in Progress) - The Universal Approximation Theorem states that a feed forward neural network with a single hidden layer, given enough hidden units, can approximate any continuous function to arbitrary accuracy. In this notebook, I'll go through a practical example illustrating why this theorem works, and talk about what the implications are for when you're training your own neural networks. cough Overfitting cough
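
    A tiny experiment in that spirit, sketched below (not the notebook's code): fit y = sin(x) with a single hidden layer and watch the fit improve as hidden_units grows.

    import numpy as np
    import tensorflow as tf

    # Target function to approximate
    x_data = np.linspace(-3, 3, 200).reshape(-1, 1).astype(np.float32)
    y_data = np.sin(x_data)

    hidden_units = 20   # increase this and the approximation gets better
    x = tf.placeholder(tf.float32, [None, 1])
    y = tf.placeholder(tf.float32, [None, 1])

    h = tf.layers.dense(x, hidden_units, activation=tf.nn.tanh)
    y_pred = tf.layers.dense(h, 1)
    loss = tf.reduce_mean(tf.square(y_pred - y))
    train_step = tf.train.AdamOptimizer(1e-2).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(2000):
            sess.run(train_step, feed_dict={x: x_data, y: y_data})
        print(sess.run(loss, feed_dict={x: x_data, y: y_data}))   # should be small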

  • Learning to Model the XOR Function (Work in Progress) - XOR is one of the classic functions we see in machine learning theory textbooks. The significance is that we cannot fit a linear model to this function no matter how hard we try. In this notebook, you'll see proof of that, and you'll see how adding a simple hidden layer to the neural net can solve the problem.
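
    A minimal sketch of that experiment: the four XOR points, one small hidden layer, and a sigmoid output. Remove the hidden layer (compute the logit directly from the inputs) and the same setup can no longer get all four points right.

    import numpy as np
    import tensorflow as tf

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
    Y = np.array([[0], [1], [1], [0]], dtype=np.float32)

    x = tf.placeholder(tf.float32, [None, 2])
    y = tf.placeholder(tf.float32, [None, 1])

    h = tf.layers.dense(x, 4, activation=tf.nn.tanh)   # drop this layer and XOR becomes unlearnable
    logits = tf.layers.dense(h, 1)
    loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y, logits=logits))
    train_step = tf.train.AdamOptimizer(0.1).minimize(loss)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for _ in range(2000):
            sess.run(train_step, feed_dict={x: X, y: Y})
        print(sess.run(tf.nn.sigmoid(logits), feed_dict={x: X}))   # should approach [0, 1, 1, 0]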

Owner
Adit Deshpande
Engineering at Forward | UCLA CS '19