Deep Reinforcement Learning based Trading Agent for Bitcoin

Overview

Deep Trading Agent

Deep Reinforcement Learning based Trading Agent for Bitcoin using DeepSense Network for Q function approximation.

(Model architecture diagram)
For complete details of the dataset, preprocessing, network architecture and implementation, refer to the Wiki of this repository.

Requirements

  • Python 2.7
  • TensorFlow
  • Pandas (for preprocessing the Bitcoin price series)
  • tqdm (for displaying progress of training)

To set up an Ubuntu virtual machine with all the dependencies needed to run the code, refer to assets/vm.

Run with Docker

Pull the prebuilt Docker image directly from Docker Hub and run it:

docker pull samre12/deep-trading-agent:latest
docker run -p 6006:6006 -it samre12/deep-trading-agent:latest

OR

Build the Docker image locally and then run it:

docker build -t deep-trading-agent .
docker run -p 6006:6006 -it deep-trading-agent

This will set up the repository for training the agent:

  • mount the current directory into /deep-trading-agent in the container

  • during the image build, the latest transaction history from the exchange is pulled and sampled to create a per-minute dataset of Bitcoin prices, which is placed at /deep-trading-agent/data/btc.csv

  • to initiate training of the agent, specify suitable parameters in a config file (an example config file is provided at /deep-trading-agent/code/config/config.cfg) and run the code using /deep-trading-agent/code/main.py

  • training supports logging and monitoring through TensorBoard

  • vim and screen are installed in the container for editing configuration files and running TensorBoard

  • port 6006 of the container is bound to port 6006 of the host machine so that training can be monitored with TensorBoard

Support

Please give a star to this repository to support the project 😄.

ToDo

Docker Support

  • Add Docker support for a fast and easy start with the project

Improve Model performance

  • Extract highest and lowest prices and the volume of Bitcoin traded within a given time interval in the Preprocessor
  • Use closing, highest, lowest prices and the volume traded as input channels to the model (remove features calculated just using closing prices)
  • Normalize the price tensors using the price of the previous time step
  • For the complete state representation, input the remaining number of trades to the model
  • Use separate diff price blocks to calculate the unrealized PnL
  • Use an exponentially decayed, weighted unrealized PnL as the reward function to incorporate the current state of the investment and stabilize the agent's learning (see the sketch after this list)
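
As a rough illustration of two of these ideas, previous-step normalization and the exponentially decayed unrealized PnL, the sketch below uses hypothetical helper names and is not part of the repository:

import numpy as np

def normalize_by_previous_step(prices):
    """Express each price relative to the previous time step's price."""
    prices = np.asarray(prices, dtype=float)
    return prices[1:] / prices[:-1] - 1.0

def decayed_unrealized_pnl(entry_price, prices_since_entry, position, decay=0.9):
    """Exponentially decayed, weighted unrealized PnL for a single position.

    position: +1 for long, -1 for short, 0 for neutral.
    More recent price moves receive larger weights (decay < 1).
    Illustrative sketch of the idea above, not the repository's code.
    """
    pnl = position * (np.asarray(prices_since_entry, dtype=float) - entry_price)
    weights = decay ** np.arange(len(pnl) - 1, -1, -1)  # newest step gets weight 1
    return float(np.sum(weights * pnl) / np.sum(weights))

print(normalize_by_previous_step([100.0, 101.0, 100.5]))              # [0.01, -0.0049...]
print(decayed_unrealized_pnl(100.0, [100.5, 101.0, 100.8, 101.5], 1)) # ~0.99 for this drift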

Trading Model

The trading model is inspired by Deep Q-Trading, where a simplified trading problem for a single asset is solved.
For each trading unit, only one of three actions is allowed: neutral (1), long (2), or short (3), and a reward is obtained depending upon the current position of the agent. A Deep Q-Learning agent is trained to maximize the total accumulated reward.
The Deep Q-Trading model is modified here by using the DeepSense architecture for Q function approximation.
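
To make the position-dependent reward concrete, here is a minimal sketch of how a per-step reward could be assigned for the three actions on a single asset; the function and constant names are illustrative, not the repository's implementation:

NEUTRAL, LONG, SHORT = 1, 2, 3  # action encoding used in the description above

def step_reward(action, price_now, price_next):
    """Reward for one trading unit under a simplified single-asset model.

    A long position earns the price change, a short position earns the
    negative price change, and a neutral position earns nothing.
    Illustrative sketch only, not the repository's exact reward.
    """
    price_change = price_next - price_now
    if action == LONG:
        return price_change
    if action == SHORT:
        return -price_change
    return 0.0  # NEUTRAL

# The Deep Q-Learning agent is trained to maximize the sum of these rewards over an episode.
print(step_reward(LONG, 100.0, 101.5))   #  1.5
print(step_reward(SHORT, 100.0, 101.5))  # -1.5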

Dataset

The per-minute Bitcoin price series is obtained by modifying the procedure described in this repository. Transactions on the Coinbase exchange are sampled to generate the price series.
Refer to assets/dataset to download the dataset.
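
As a rough idea of how this sampling step might look (a sketch only: the file name transactions.csv and the timestamp/price/volume column layout are assumptions, not the repository's actual format):

import pandas as pd

# Trade-level export with Unix timestamps and trade prices (assumed columns).
trades = pd.read_csv("transactions.csv", names=["timestamp", "price", "volume"])
trades["timestamp"] = pd.to_datetime(trades["timestamp"], unit="s")

# Sample the raw transactions down to a per-minute closing-price series.
per_minute = (
    trades.set_index("timestamp")["price"]
    .resample("1min")
    .last()  # closing price of each minute; minutes with no trades become NaN
)
per_minute.to_csv("btc.csv", header=["price"])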

Preprocessing

Basic Preprocessing
Missing values are removed from the dataset entirely, and blocks of continuous values are accumulated using the timestamps of the prices.
Accumulated blocks with fewer timestamps than the combined history length of the state and the horizon of the agent are then filtered out, since they cannot be used for training.
In the current implementation, the past 3 hours (180 minutes) of per-minute Bitcoin prices are used to generate the representation of the agent's current state.
With the existing dataset (at the time of writing), the following logs are generated while preprocessing it:

INFO:root:Number of blocks of continuous prices found are 58863
INFO:root:Number of usable blocks obtained from the dataset are 887
INFO:root:Number of distinct episodes for the current configuration are 558471
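
The block accumulation and filtering described above can be pictured roughly as follows; this is a simplified sketch, and the function name and HORIZON value are illustrative assumptions rather than the repository's code:

import pandas as pd

HISTORY = 180  # past 3 hours of per-minute prices form the state
HORIZON = 5    # assumed trading horizon; the actual value comes from the config

def continuous_blocks(prices: pd.Series, min_length: int):
    """Split a timestamp-indexed price series into blocks of consecutive minutes.

    Rows with missing prices are dropped first; a new block starts whenever
    the gap to the previous timestamp exceeds one minute. Blocks shorter than
    `min_length` cannot produce a full state plus horizon and are discarded.
    """
    prices = prices.dropna().sort_index()
    gaps = prices.index.to_series().diff() > pd.Timedelta(minutes=1)
    block_ids = gaps.cumsum()
    blocks = [block for _, block in prices.groupby(block_ids)]
    return [b for b in blocks if len(b) >= min_length]

# usable_blocks = continuous_blocks(per_minute_prices, HISTORY + HORIZON)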

Advanced Preprocessing
Missing values are processed and smaller blocks are concatenated to increase the size of the continuous price blocks.
A standard technique in the literature for filling missing values without significantly affecting model performance is exponential filling with no decay.
(To be implemented)
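
One possible reading of this filling step is sketched below, under the assumption that "exponential filling with no decay" amounts to repeating the last observed price across short gaps; the helper name and the MAX_GAP limit are illustrative, not the final design:

import pandas as pd

MAX_GAP = 10  # assumed upper bound (in minutes) on gaps worth filling

def fill_small_gaps(per_minute_prices: pd.Series) -> pd.Series:
    """Concatenate neighbouring blocks by filling short gaps in the series.

    Reindexing to a full per-minute range exposes the missing minutes, and a
    limited forward fill repeats the last observed price across gaps of at
    most MAX_GAP minutes; longer gaps stay missing and still split blocks.
    """
    full_index = pd.date_range(per_minute_prices.index.min(),
                               per_minute_prices.index.max(), freq="1min")
    return per_minute_prices.reindex(full_index).ffill(limit=MAX_GAP)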

Implementation

Tensorflow "1.1.0" version is used for the implementation of the Deep Sense network.

DeepSense

The implementation is adapted from this GitHub repository, with a few simplifications in the network architecture to support learning over a single time series of Bitcoin data.

Deep Q Trading

The implementation and the preprocessing are inspired by this Medium post. The actual implementation of the Deep Q Network is adapted from DQN-tensorflow.

Owner
Kartikay Garg
Major in Mathematics and Computing