Statistical and Algorithmic Investing Strategies for Everyone

Overview

Eiten - Algorithmic Investing Strategies for Everyone

Eiten is an open source toolkit by Tradytics that implements various statistical and algorithmic investing strategies such as Eigen Portfolios, Minimum Variance Portfolios, Maximum Sharpe Ratio Portfolios, and Genetic Algorithm-based Portfolios. It allows you to build your own portfolios, with your own set of stocks, that can beat the market. The rigorous testing framework included in Eiten enables you to have confidence in your portfolios.

If you are looking to discuss these tools in depth and talk about more tools that we are working on, please feel free to join our Discord channel, where we share a number of other tools as well.

Files Description

Path Description
eiten Main folder.
└  figures Figures for this GitHub repository.
└  stocks Folder to keep the stock lists that you want to use to build your portfolios.
└  strategies The portfolio strategies implemented in Python.
backtester.py Backtesting module that both backtests and forward tests all portfolios.
data_loader.py Module for loading data from Yahoo Finance.
portfolio_manager.py Main file that takes in a bunch of arguments and generates several portfolios for you.
simulator.py Simulator that uses historical returns and Monte Carlo to simulate future prices for the portfolios.
strategy_manager.py Manages the strategies implemented in the 'strategies' folder.
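
As background for what simulator.py does, here is a minimal, hedged sketch of how Monte Carlo price simulation from historical returns typically works; the function and variable names below are illustrative, not the repository's actual API.

```python
import numpy as np

def simulate_future_prices(historical_returns, last_price, n_days=90, n_paths=1000, seed=42):
    """Simulate future price paths by resampling historical daily returns.

    historical_returns: 1-D array of past daily returns (e.g. percent changes).
    Returns an array of shape (n_paths, n_days) with simulated prices.
    """
    rng = np.random.default_rng(seed)
    # Draw daily returns with replacement from the observed history.
    sampled = rng.choice(historical_returns, size=(n_paths, n_days), replace=True)
    # Compound the sampled returns forward from the last observed price.
    return last_price * np.cumprod(1.0 + sampled, axis=1)

# Example: simulate 90 days forward from a price of 100 using stand-in history.
returns = np.random.default_rng(0).normal(0.0005, 0.02, size=1250)
paths = simulate_future_prices(returns, last_price=100.0)
print(paths.shape)  # (1000, 90)
```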

Required Packages

You will need to install a handful of Python packages to build and test the portfolios.

You can install them all using the following command. Please note that the toolkit was written for Python 3.

pip install -r requirements.txt

Build your portfolios

Let us see how we can use all the strategies in the toolkit to build our portfolios. The first thing you need to do is modify the stocks.txt file in the stocks folder and add the stocks of your choice. It is recommended to keep the list small, i.e., anywhere between 5 and 50 stocks should be fine. We have already put a small stock list containing a bunch of tech stocks such as AAPL, MSFT, and TSLA.
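
The file is a plain-text list of ticker symbols, one per line; for illustration (these particular symbols are just an example, not necessarily the exact shipped list):

```
AAPL
MSFT
TSLA
SQQQ
```

Let us build our portfolios now. This is the main command that you need to run.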

python portfolio_manager.py --is_test 1 --future_bars 90 --data_granularity_minutes 3600 --history_to_use all --apply_noise_filtering 1 --market_index QQQ --only_long 1 --eigen_portfolio_number 3 --stocks_file_path stocks/stocks.txt

This command will use the last 5 years of daily data, excluding the final 90 days, to build several portfolios for you. It will then test those portfolios on the out-of-sample data of 90 days and show you the performance of each portfolio. Finally, it will also compare the performance with your choice of market index, which is QQQ here.
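
Conceptually, the is_test / future_bars split amounts to holding out the most recent bars, as in this minimal sketch (illustrative variable names, not the tool's internal code):

```python
import numpy as np

# Suppose `closes` holds daily closing prices, oldest to newest (dummy data here).
closes = np.linspace(100, 150, num=1250)
future_bars = 90

train_closes = closes[:-future_bars]   # data the portfolio builder is allowed to see
test_closes = closes[-future_bars:]    # held-out bars used only for the forward test

print(len(train_closes), len(test_closes))  # 1160 90
```

Let's dive into each of the parameters in detail.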

  • is_test: This value determines whether the program keeps some data aside for future (forward) testing. When this is enabled, the value of future_bars should be larger than 5.
  • future_bars: The number of most recent bars that the tool will exclude during portfolio building; the portfolios are then forward tested on this excluded set. This is also called out-of-sample data.
  • data_granularity_minutes: The granularity of the data used to build your portfolios. For long-term portfolios you should use daily data, but for short-term portfolios you can use hourly or minute-level data. The possible values are 3600, 60, 30, 15, 5, and 1, where 3600 means daily bars.
  • history_to_use: Whether to use a specific number of historical bars or everything we receive from Yahoo Finance. For minute-level data, we only receive up to one month of history; for daily data, we receive about 5 years. If you want to use all available data, the value should be all; if you want a shorter history, set it to an integer value, e.g. 100, which will use only the last 100 bars to build the portfolios.
  • apply_noise_filtering: This uses random matrix theory to filter randomness out of the covariance matrix, which typically yields better portfolios (see the sketch after this list). A value of 1 enables it and 0 disables it.
  • market_index: The index you want to compare your portfolios against. This should usually be SPY, but since we analyzed tech stocks, we used QQQ.
  • only_long: Whether to build a long-only portfolio or to enable short selling as well. Long-only portfolios have been shown to perform better with these algorithmic techniques.
  • eigen_portfolio_number: Which eigen portfolio to use. Any value between 1 and 5 should work. The first eigen portfolio (1) represents the market portfolio and should behave much like the underlying index, such as SPY or QQQ. The second one is orthogonal and uncorrelated to the market and poses the greatest risk and reward; the ones that follow have progressively lower risk and reward. Read more on eigen-portfolios.
  • stocks_file_path: File that contains the list of stocks that you want to use to build your portfolio.
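
For the curious, the noise filtering referenced by apply_noise_filtering follows the standard random matrix theory recipe of clipping eigenvalues of the correlation matrix that fall below the Marchenko-Pastur bound. Here is a minimal sketch of that idea in plain numpy; it illustrates the technique, it is not the repository's exact code (which is adapted from Scott Rome's posts):

```python
import numpy as np

def filter_correlation_matrix(returns):
    """Clip eigenvalues below the Marchenko-Pastur bound to remove estimation noise.

    returns: array of shape (T, N) with T observations of N assets' returns.
    """
    T, N = returns.shape
    corr = np.corrcoef(returns, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)

    # Marchenko-Pastur upper edge for a pure-noise correlation matrix.
    q = T / N
    lambda_max = (1 + 1 / np.sqrt(q)) ** 2

    # Replace the "noisy" eigenvalues by their average, keep the signal ones.
    noisy = eigvals < lambda_max
    if noisy.any():
        eigvals[noisy] = eigvals[noisy].mean()

    filtered = eigvecs @ np.diag(eigvals) @ eigvecs.T
    # Rescale the diagonal back to exactly 1, as required for a correlation matrix.
    d = np.sqrt(np.diag(filtered))
    return filtered / np.outer(d, d)

# Example with stand-in return data for 10 assets over 500 days.
demo = np.random.default_rng(0).normal(0.0, 0.02, size=(500, 10))
print(filter_correlation_matrix(demo).shape)  # (10, 10)
```

In practice, the filtered correlation matrix is converted back to a covariance matrix using the original volatilities before being handed to the portfolio optimizers.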

Some Portfolio Building Examples

Here are a few examples for building different types of portfolios.

  • Both long and short portfolios, built by analyzing the last 90 days of data and keeping the last 30 days as testing data. This gives us 60 days of portfolio construction data and 30 days of testing.
python portfolio_manager.py --is_test 1 --future_bars 30 --data_granularity_minutes 3600 --history_to_use 90 --apply_noise_filtering 1 --market_index QQQ --only_long 0 --eigen_portfolio_number 3 --stocks_file_path stocks/stocks.txt
  • A long-only portfolio on 60-minute bars of the last 30 days. No future testing. Compare the results with the SPY index instead of QQQ.
python portfolio_manager.py --is_test 0 --future_bars 0 --data_granularity_minutes 60 --history_to_use all --apply_noise_filtering 1 --market_index SPY --only_long 1 --eigen_portfolio_number 3 --stocks_file_path stocks/stocks.txt
  • Do not apply noise filtering to the covariance matrix. Use the first eigen portfolio (the market portfolio) and compare with SQQQ.
python portfolio_manager.py --is_test 1 --future_bars 90 --data_granularity_minutes 3600 --history_to_use all --apply_noise_filtering 0 --market_index SQQQ --only_long 1 --eigen_portfolio_number 1 --stocks_file_path stocks/stocks.txt

Portfolio Strategies

Four different portfolio strategies are currently supported by the toolkit.

  1. Eigen Portfolios
    1. These portfolios are orthogonal to and uncorrelated with the market in general, thus yielding high reward and alpha. However, since they are uncorrelated with the market, they can also carry significant risk. The first eigen portfolio is considered to be the market portfolio and is often ignored. The second one is uncorrelated with the others and provides the highest risk and reward. As we go down the numbering, both the risk and the reward are reduced.
  2. Minimum Variance Portfolio (MVP)
    1. MVP tries to minimize the variance of the portfolio. These portfolios have the lowest risk and reward.
  3. Maximum Sharpe Ratio Portfolio (MSR)
    1. MSR solves an optimization problem that tries to maximize the Sharpe ratio of the portfolio. It uses past returns during the optimization process, which means that if future returns differ from past returns, the results can vary.
  4. Genetic Algorithm (GA) based Portfolio
    1. This is our own implementation of a GA-based portfolio that again tries to maximize the Sharpe ratio, but in a slightly more robust way. It usually produces more robust portfolios than the other strategies.
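
To make the first three strategies concrete, here is a compact, hedged sketch of how such weights are commonly computed from a return history. It uses standard textbook formulas and is not the code in the strategies folder; the GA portfolio is omitted because it is the project's own implementation.

```python
import numpy as np

def eigen_portfolio(returns, which=1):
    """Weights of the `which`-th eigen portfolio (1 = market-like), one common normalization."""
    corr = np.corrcoef(returns, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    # eigh sorts eigenvalues ascending, so -which picks the which-th largest.
    v = eigvecs[:, -which]
    return v / v.sum()

def minimum_variance_portfolio(returns):
    """Weights that minimize portfolio variance (unconstrained, weights sum to 1)."""
    cov = np.cov(returns, rowvar=False)
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

def max_sharpe_portfolio(returns, risk_free=0.0):
    """Closed-form tangency (maximum Sharpe ratio) weights."""
    mu = returns.mean(axis=0) - risk_free
    cov = np.cov(returns, rowvar=False)
    w = np.linalg.inv(cov) @ mu
    return w / w.sum()

# Tiny usage example with random stand-in returns for 5 assets.
demo_returns = np.random.default_rng(0).normal(0.0005, 0.02, size=(1000, 5))
print(minimum_variance_portfolio(demo_returns).round(3))
```

Note that these closed-form weights can go negative, which implies short positions; a long-only constraint such as --only_long 1 requires a constrained optimizer rather than the closed-form solutions above.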

When you run the command above, our tool will generate portfolios from all these strategies and give them to you. Let us look at some resulting portfolios.

Resulting Portfolios

For the purpose of these results, we will use the 9 stocks in the stocks/stocks.txt file. When we run the above command, we first get the portfolio weights for all four strategies. For testing purposes, the above command used the last five years of daily data up to April 29th, 2020. The remaining data for the year was used for forward testing, i.e., the portfolio strategies had no access to it when building the portfolios.

What if my portfolio needs different stocks? All you need to do is change the stocks in the stocks.txt file and run the tool again. For reference, here is the command we run to get our portfolios:

python portfolio_manager.py --is_test 1 --future_bars 90 --data_granularity_minutes 3600 --history_to_use all --apply_noise_filtering 1 --market_index QQQ --only_long 1 --eigen_portfolio_number 3 --stocks_file_path stocks/stocks.txt

Portfolio Weights

We can see that the eigen portfolio is giving a large weight to TSLA while the others are dividing their weights more uniformly. An interesting phenomenon here is the hedging with SQQQ that all the strategies have learned automatically. Every strategy assigns some positive weight to SQQQ while also assigning positive weights to the other stocks, which indicates that the strategies are automatically trying to hedge the portfolios against risk. Obviously this is not perfect, but the fact that it happens at all is fascinating. Let us look at the backtest results on the five years prior to April 29th, 2020.

Backtest Results

The backtests look pretty encouraging. The black dotted line is the market index, i.e. QQQ; the other lines are the strategies. Our custom genetic algorithm implementation seems to have the best backtest results because it is an advanced version of the other strategies. The eigen portfolio that weighed TSLA the most has the most volatility, but its profits are also very high. Finally, as expected, the MVP has the minimum variance and ultimately the least profit. However, since the variance is extremely low, it is a good portfolio for those who want to stay safe. The most interesting part comes next: let us look at the forward or future test results for these portfolios.

Forward Test Results

These results are from April 29th, 2020 to September 4th, 2020. The eigen portfolio performed the best, but it also had a lot of volatility. Moreover, most of those returns are due to TSLA rocketing upward in the last few months. After that, our GA algorithm worked quite effectively, as it beat the market index. Again, as expected, the MVP had the lowest risk and reward and slowly went up over the 4-5 months. This shows the effectiveness and power of these algorithmic portfolio optimization strategies, where we have developed different portfolios for different kinds of risk and reward profiles.

Conclusion and Discussion

We are happy to share this toolkit with the trading community and hope that people will like it and contribute to it. As with everything in trading, these strategies are not perfect, but they are based on rigorous theory and some great empirical results. Please take care when trading with these strategies and always manage your risk. The above results were not cherry-picked, but the market has been highly bullish in the last few months, which has contributed to the strong results shown above. We would love for the community to try out different strategies and share them with us.

Special Thanks

Special thanks to Scott Rome's blog. The eigen portfolio and minimum variance portfolio concepts came from his blog posts. The code for filtering the eigenvalues of the covariance matrix was also mostly obtained from one of his posts.

License

License: GPL v3

A product by Tradytics

Copyright (c) 2020-present, Tradytics.com
