No-Transaction Band Network:
A Neural Network Architecture for Efficient Deep Hedging

Open In Colab

Minimal implementation and experiments of "No-Transaction Band Network: A Neural Network Architecture for Efficient Deep Hedging".

Hedging and pricing financial derivatives while taking transaction costs into account is a tough task. Since the hedging optimization is computationally expensive or even intractable, the risk premiums of derivatives are often overpriced. This problem prevents financial derivatives from being offered liquidly.

Our proposal, the "No-Transaction Band Network", enables precise hedging with far fewer simulations. This improvement allows cheaper risk premiums to be offered and thus makes the derivative market more liquid. We believe that our proposal brings the data-driven derivative business enabled by "Deep Hedging" much closer to practical application.

Summary

  • Deep Hedging is a deep learning-based framework to hedge financial derivatives.
  • However, a hedging strategy is hard to train because of action dependence: the appropriate hedging action at the next step depends on the current action.
  • We propose a "No-Transaction Band Network" to overcome this issue.
  • This network circumvents the action dependence and facilitates quick and precise hedging.

Motivation and Result

Hedging financial derivatives (exotic options in particular) in the presence of transaction costs is a hard task.

In the absence of transaction costs, a perfect hedge is attainable under the Black-Scholes model. The real market, in contrast, always involves transaction costs, which make the hedging optimization much more challenging. Since analytic formulas (such as the Black-Scholes formula for a European option) are no longer available in such a market, human traders may hedge, and then price, derivatives based on their experience.
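
For reference, the frictionless perfect hedge of a European call continuously holds the Black-Scholes delta. Here is a minimal sketch (the function name bs_delta and the zero-rate, no-dividend assumptions are ours):

```python
import torch

def bs_delta(spot, strike, sigma, time_to_maturity):
    # Black-Scholes delta of a European call, assuming zero interest
    # rate and no dividends; all arguments are broadcastable tensors.
    v = sigma * time_to_maturity.sqrt()
    d1 = (torch.log(spot / strike) + 0.5 * v ** 2) / v
    return torch.distributions.Normal(0.0, 1.0).cdf(d1)
```

Continuously rebalancing to hold this delta replicates the call exactly; transaction costs break this replication argument.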

Deep Hedging is a ground-breaking framework to automate and optimize such operations. In this framework, a neural network is trained to hedge derivatives so that it minimizes a proper risk measure. However, training in deep hedging suffers from the difficulty of action dependence: an appropriate action at the next step depends on the current action.
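
To make the objective concrete, here is a minimal sketch of such a training loss (our own toy formulation, not the paper's code: the proportional cost model and the helper names hedging_pnl and entropic_risk are assumptions; the entropic risk measure is one standard convex risk measure used in deep hedging):

```python
import torch

def hedging_pnl(spot, hedge, payoff, cost=1e-3):
    # spot:   (n_paths, n_steps) simulated prices of the underlier
    # hedge:  (n_paths, n_steps - 1) hedge ratio held over each interval
    # payoff: (n_paths,) derivative payoff delivered at maturity
    gain = (hedge * spot.diff(dim=-1)).sum(dim=-1)
    # Proportional cost on every change of position, including the first trade.
    trades = torch.cat([hedge[:, :1], hedge.diff(dim=-1)], dim=-1).abs()
    fees = cost * (spot[:, :-1] * trades).sum(dim=-1)
    return gain - fees - payoff

def entropic_risk(pnl, a=1.0):
    # rho(X) = log E[exp(-a X)] / a, the loss the network is trained to minimize.
    return torch.log(torch.mean(torch.exp(-a * pnl))) / a
```

The action dependence shows up in the fees term: the cost of the next rebalance depends on the change of position, and hence on the hedge ratio chosen now.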

We therefore propose the "No-Transaction Band Network" for efficient deep hedging. This architecture circumvents the complication and thereby facilitates quick training and better hedging.

[Figure: learning histories of the hedging loss for a lookback option.]

The learning histories above demonstrate that the no-transaction band network can be trained much faster than the ordinary feed-forward network (see our paper for details).

[Figure: derivative price spread as a function of the transaction cost for a lookback option.]

The figure above plots the derivative price (technically, the price spread: the price minus the price in the absence of transaction costs) as a function of the transaction cost. The no-transaction band network attains cheaper prices than both the ordinary network and an approximate analytic formula.
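
As an aside on where such prices come from: the entropic risk measure is cash-invariant, rho(X + p) = rho(X) - p, so the premium making the hedger indifferent to selling the derivative is simply the risk of the hedged, unfunded P&L. A one-line sketch reusing entropic_risk from above (our own formulation, not necessarily the paper's pricing convention):

```python
def indifference_price(pnl, a=1.0):
    # The premium p solving rho(pnl + p) = rho(0) = 0, i.e. p = rho(pnl).
    return entropic_risk(pnl, a=a)
```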

Proposed Architecture: No-Transaction Band Network

The following figures show schematic diagrams of the neural network originally proposed in Deep Hedging (left) and of the no-transaction band network (right).

[Figure: schematic diagrams of the ordinary feed-forward network (left) and the no-transaction band network (right).]

  • The original network:
    • The neural network takes the current hedge ratio (δ_ti) as an input, along with other market information (I_ti).
    • Since the input includes the current action δ_ti, this network suffers from the complication of action dependence.
  • The no-transaction band network:
    • A neural network computes a "no-transaction band" [b_l, b_u] from the market information alone, and the next hedge ratio is obtained by clamping the current hedge ratio into this band (see the sketch below).
    • Since the input of the neural network does not include the current action, this architecture circumvents the action dependence and facilitates training.
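
The following is a minimal PyTorch sketch of this clamping layer (our own illustration, not the authors' exact code: the softplus parameterization that orders the band edges and the names NoTransactionBandNet, info, and prev_hedge are assumptions made here):

```python
import torch
import torch.nn.functional as F
from torch import nn

class NoTransactionBandNet(nn.Module):
    """Wraps any network mapping market information I_t to two band parameters."""

    def __init__(self, net: nn.Module):
        super().__init__()
        self.net = net

    def forward(self, info: torch.Tensor, prev_hedge: torch.Tensor) -> torch.Tensor:
        out = self.net(info)                     # (n_paths, 2)
        lower = out[..., 0]                      # b_l
        upper = lower + F.softplus(out[..., 1])  # b_u >= b_l by construction
        # Next hedge ratio: the current one clamped into [b_l, b_u].
        return torch.maximum(torch.minimum(prev_hedge, upper), lower)

# Usage over time: the wrapped network never sees the hedge ratio.
mlp = nn.Sequential(nn.Linear(3, 32), nn.ReLU(), nn.Linear(32, 2))
model = NoTransactionBandNet(mlp)
info = torch.randn(1000, 30, 3)  # (n_paths, n_steps, n_features), placeholder
hedge = torch.zeros(1000)
for t in range(30):
    hedge = model(info[:, t], hedge)
```

Note that only the clamp, not the wrapped mlp, touches the current hedge ratio; this is the "one special layer" that removes the action dependence from the learned function.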

Give it a Try!

Open In Colab

You can try out the efficacy of the No-Transaction Band Network in a Jupyter notebook: main.ipynb.

As you can see there, the no-transaction band can be implemented simply by adding one special layer to an arbitrary neural network.

A comprehensive library for Deep Hedging, pfhedge, is available on PyPI.
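
A typical workflow with pfhedge looks roughly like the following (a sketch based on pfhedge's documented quickstart; the exact API may differ between versions, and the hyperparameters here are arbitrary):

```python
from pfhedge.instruments import BrownianStock, EuropeanOption
from pfhedge.nn import Hedger, MultiLayerPerceptron

# A European option on a stock with proportional transaction cost.
derivative = EuropeanOption(BrownianStock(cost=1e-3))

# Train a feed-forward hedger to minimize a convex risk measure.
model = MultiLayerPerceptron()
hedger = Hedger(model, inputs=["log_moneyness", "time_to_maturity", "volatility"])
hedger.fit(derivative, n_paths=10000, n_epochs=200)

# The premium implied by the learned hedging strategy.
price = hedger.price(derivative, n_paths=10000)
```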

References

  • Shota Imaki, Kentaro Imajo, Katsuya Ito, Kentaro Minami and Kei Nakagawa, "No-Transaction Band Network: A Neural Network Architecture for Efficient Deep Hedging". arXiv:2103.01775 [q-fin.CP].
  • Shota Imaki, Kentaro Imajo, Katsuya Ito, Kentaro Minami and Kei Nakagawa, "A Neural Network Architecture for Efficient Deep Hedging" (in Japanese). JSAI Special Interest Group on Financial Informatics (SIG-FIN), 26th workshop.
  • Hans Bühler, Lukas Gonon, Josef Teichmann and Ben Wood, "Deep hedging". Quantitative Finance, 2019, 19, 1271–1291. arXiv:1609.05213 [q-fin.CP].