ARA

In this project, we create and implement a deep learning library from scratch.

Table of Contents

  • About The Project
  • Aim
  • Tech Stack
  • File Structure
  • Approach
  • Theory
  • Getting Started
  • Results
  • Future Work
  • Troubleshooting
  • Contributors
  • Acknowledgements
  • Resources
  • License

About The Project

Deep learning is a subset of machine learning based on artificial neural networks with many layers, which learn and improve from data on their own. This project, a deep learning library built from scratch, can be further used in a variety of deep learning applications, including but not limited to image, natural language, and speech processing.

Aim

To implement a deep learning library from scratch.

Tech Stack

Technologies used in the project:

  • Python, with the NumPy, Pandas and Matplotlib libraries
  • Google Colab

File Structure

.
├── code
|   └── main.py                                   #contains the main code for the library
├── resources                                     #Notes 
|   ├── ImprovingDeepNeuralNetworks
|   |   ├── images
|   |   |   ├── BatchvsMiniBatch.png
|   |   |   ├── Bias.png
|   |   |   └── EWG.png
|   |   └── notes.md
|   ├── Course1.md                               
|   ├── accuracy.jpg
|   ├── error.jpg
|   └── grad_des_graph.jpg
├── LICENSE.txt
├── ProjectReport.pdf                            #Project Report
└── README.md                                    #Readme

Approach

The approach, as stated before, is to build a deep learning library from scratch. The project implements the core deep learning algorithms needed to drive a deep neural network, organised as a modular library driven by user input so that it can be applied to a variety of deep learning tasks, and then trains and tests the resulting network against a model.
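
As a rough illustration of this modular design, a from-scratch library of this kind is typically organised around layer objects with forward and backward passes plus a small training loop. The class names below (Dense, Sigmoid, Network) are hypothetical and do not necessarily match the API in code/main.py; this is a sketch, not the project's implementation.

import numpy as np

# Minimal sketch of a modular, NumPy-only neural network library.
# Class names are illustrative only.

class Dense:
    # Fully connected layer; caches its input for the backward pass.
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * np.sqrt(1.0 / n_in)
        self.b = np.zeros((1, n_out))

    def forward(self, x):
        self.x = x
        return x @ self.W + self.b

    def backward(self, grad, lr):
        # Parameter gradients, then a plain gradient-descent update.
        dW = self.x.T @ grad
        db = grad.sum(axis=0, keepdims=True)
        grad_in = grad @ self.W.T
        self.W -= lr * dW
        self.b -= lr * db
        return grad_in

class Sigmoid:
    def forward(self, x):
        self.out = 1.0 / (1.0 + np.exp(-x))
        return self.out

    def backward(self, grad, lr):
        return grad * self.out * (1.0 - self.out)

class Network:
    def __init__(self, layers):
        self.layers = layers

    def predict(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def train(self, x, y, lr=0.5, epochs=2000):
        for _ in range(epochs):
            y_hat = self.predict(x)
            grad = (y_hat - y) / len(x)   # gradient of the (half) mean-squared-error cost
            for layer in reversed(self.layers):
                grad = layer.backward(grad, lr)

# Toy usage on a linearly separable problem.
np.random.seed(0)
X = np.random.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

net = Network([Dense(2, 4), Sigmoid(), Dense(4, 1), Sigmoid()])
net.train(X, y)
print("training accuracy:", ((net.predict(X) > 0.5) == y).mean())  # expected to be well above chance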

Theory

A neural network is a network or circuit of neurons, or in a modern sense, an artificial neural network, composed of artificial neurons or nodes.

There are different types of neural networks:

  • Standard Neural Networks
  • Convolutional Neural Networks
  • Recurrent Neural Networks

Loss Function:

The loss function measures how good the prediction ŷ is compared to the true output label y for a single training example.

Cost Function:

The cost function quantifies the error between predicted values and expected values over the whole training set, typically as the average of the per-example losses.
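
For instance, with the binary cross-entropy loss, the per-example loss and the cost over m training examples can be written as below. This is a minimal NumPy sketch, not necessarily the exact functions used in code/main.py.

import numpy as np

def loss(y_hat, y, eps=1e-12):
    # Binary cross-entropy for individual examples; eps avoids log(0).
    y_hat = np.clip(y_hat, eps, 1.0 - eps)
    return -(y * np.log(y_hat) + (1.0 - y) * np.log(1.0 - y_hat))

def cost(y_hat, y):
    # Cost = average of the per-example losses over all m training examples.
    return np.mean(loss(y_hat, y))

# Example: predictions for 3 training examples against their labels.
y_hat = np.array([0.9, 0.2, 0.6])
y = np.array([1.0, 0.0, 1.0])
print(cost(y_hat, y))   # a single scalar summarising the error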

Gradient Descent:

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function.
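
As a sketch, each gradient descent step moves the parameters a small distance against the gradient of the cost. The learning rate and the toy function below are illustrative assumptions, not values from the project.

import numpy as np

def gradient_descent(grad_fn, w0, learning_rate=0.1, num_iters=100):
    # Repeatedly apply w := w - alpha * dJ/dw.
    w = np.array(w0, dtype=float)
    for _ in range(num_iters):
        w = w - learning_rate * grad_fn(w)
    return w

# Toy example: minimise J(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w_min = gradient_descent(lambda w: 2.0 * (w - 3.0), w0=[0.0])
print(w_min)   # approaches [3.]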

Getting Started

Prerequisites

  • Object oriented programming in Python

  • Linear Algebra

  • Basic knowledge of Neural Networks

  • Python 3.6 and above

    You can visit the Python Download Guide for the installation steps.

  • Install numpy next

pip install numpy

Installation

  1. Clone the repo
git clone git@github.com:aayushmehta123/sra_eklavya_deeplearning_library.git

Results

Results obtained during training:

  • error: the Y-axis represents the value of the cost function and the X-axis represents the number of iterations (see resources/error.jpg).
  • accuracy: the Y-axis represents the accuracy of the prediction with respect to the labels and the X-axis represents the number of iterations (see resources/accuracy.jpg).

Future Work

  • Short term
    • Adding class for normalization and regularization
  • Near Future
    • Addition of support for linear regression
    • Addition of classes for LSTM and GRU blocks
  • Future goal
    • Addition of algorithms to support CNN models.
    • Addition of more Machine Learning algorithms
    • Include algorithms to facilitate Image Recognition, Machine Translation and Natural Language Processing

Troubleshooting

  • The NumPy library was not working in the local setup, so we shifted the workspace to Google Colab.

Contributors

Acknowledgements

Resources

License

The project is licensed under the terms given in LICENSE.txt.
