LONG-TERM SERIES FORECASTING WITH QUERY SELECTOR – EFFICIENT MODEL OF SPARSE ATTENTION

Overview

Query Selector

This repository contains the code and data loaders for the paper https://arxiv.org/pdf/2107.08687v1.pdf. Query Selector is a novel sparse-attention variant of the Transformer that is especially well suited to long-term time series forecasting.
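To give a rough idea of what query-level sparse attention looks like in PyTorch, the sketch below lets only a fraction of the queries attend over the keys and uses a cheap fallback for the rest. It is illustrative only: the selection rule, the fallback, and the function name `sparse_query_attention` are assumptions made for the example, not the exact algorithm from the paper or this repository's API.

```python
import torch
import torch.nn.functional as F

def sparse_query_attention(q, k, v, frac=0.25):
    """Hedged sketch of query-sparse attention: only a top-scoring fraction
    of queries attends over all keys; the remaining positions fall back to
    the mean of the values. The selection criterion used here (query L2 norm)
    is a placeholder, not the paper's exact rule.

    q, k, v: (batch, seq_len, d_model)
    """
    b, n, d = q.shape
    n_sel = max(1, int(n * frac))

    # Rank queries by a cheap importance score (placeholder criterion).
    top_idx = q.norm(dim=-1).topk(n_sel, dim=-1).indices          # (b, n_sel)
    q_sel = torch.gather(q, 1, top_idx.unsqueeze(-1).expand(-1, -1, d))

    # Full softmax attention, but only for the selected queries.
    attn = F.softmax(torch.einsum('bqd,bkd->bqk', q_sel, k) / d ** 0.5, dim=-1)
    out_sel = torch.einsum('bqk,bkd->bqd', attn, v)                # (b, n_sel, d)

    # Unselected queries fall back to the mean of the values.
    out = v.mean(dim=1, keepdim=True).expand(b, n, d).clone()
    out.scatter_(1, top_idx.unsqueeze(-1).expand(-1, -1, d), out_sel)
    return out

# Usage example with random tensors (encoder-style self-attention shapes).
q = k = v = torch.randn(2, 96, 64)
print(sparse_query_attention(q, k, v).shape)   # torch.Size([2, 96, 64])
```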

Dependencies

Python            3.7.9
deepspeed         0.4.0
numpy             1.20.3
pandas            1.2.4
scipy             1.6.3
tensorboardX      1.8
torch             1.7.1
torchaudio        0.7.2
torchvision       0.8.2
tqdm              4.61.0
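A quick way to confirm that the main pinned packages are importable (illustrative snippet; the expected versions are simply the ones listed above, which the repository was tested with):

```python
# Illustrative sanity check that the main pinned packages are importable.
import numpy, pandas, scipy, torch, tqdm

print("numpy ", numpy.__version__)   # expected 1.20.3
print("pandas", pandas.__version__)  # expected 1.2.4
print("scipy ", scipy.__version__)   # expected 1.6.3
print("torch ", torch.__version__)   # expected 1.7.1
print("tqdm  ", tqdm.__version__)    # expected 4.61.0
```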

Results on the ETT dataset

Univariate

| Data | Prediction len | Informer MSE | Informer MAE | Transformer MSE | Transformer MAE | Query Selector MSE | Query Selector MAE | MSE ratio |
|-------|-----|--------|--------|--------|--------|--------|--------|-------|
| ETTh1 | 24  | 0.0980 | 0.2470 | 0.0548 | 0.1830 | 0.0436 | 0.1616 | 0.445 |
| ETTh1 | 48  | 0.1580 | 0.3190 | 0.0740 | 0.2144 | 0.0721 | 0.2118 | 0.456 |
| ETTh1 | 168 | 0.1830 | 0.3460 | 0.1049 | 0.2539 | 0.0935 | 0.2371 | 0.511 |
| ETTh1 | 336 | 0.2220 | 0.3870 | 0.1541 | 0.3201 | 0.1267 | 0.2844 | 0.571 |
| ETTh1 | 720 | 0.2690 | 0.4350 | 0.2501 | 0.4213 | 0.2136 | 0.3730 | 0.794 |
| ETTh2 | 24  | 0.0930 | 0.2400 | 0.0999 | 0.2479 | 0.0843 | 0.2239 | 0.906 |
| ETTh2 | 48  | 0.1550 | 0.3140 | 0.1218 | 0.2763 | 0.1117 | 0.2622 | 0.721 |
| ETTh2 | 168 | 0.2320 | 0.3890 | 0.1974 | 0.3547 | 0.1753 | 0.3322 | 0.756 |
| ETTh2 | 336 | 0.2630 | 0.4170 | 0.2191 | 0.3805 | 0.2088 | 0.3710 | 0.794 |
| ETTh2 | 720 | 0.2770 | 0.4310 | 0.2853 | 0.4340 | 0.2585 | 0.4130 | 0.933 |
| ETTm1 | 24  | 0.0300 | 0.1370 | 0.0143 | 0.0894 | 0.0139 | 0.0870 | 0.463 |
| ETTm1 | 48  | 0.0690 | 0.2030 | 0.0328 | 0.1388 | 0.0342 | 0.1408 | 0.475 |
| ETTm1 | 96  | 0.1940 | 0.2030 | 0.0695 | 0.2085 | 0.0702 | 0.2100 | 0.358 |
| ETTm1 | 288 | 0.4010 | 0.5540 | 0.1316 | 0.2948 | 0.1548 | 0.3240 | 0.328 |
| ETTm1 | 672 | 0.5120 | 0.6440 | 0.1728 | 0.3437 | 0.1735 | 0.3427 | 0.338 |
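The MSE ratio column in these tables is consistent with taking the better of the Transformer and Query Selector MSE and dividing it by the Informer MSE; this is a reading of the table values, not a definition stated explicitly here. A quick check against the first univariate ETTh1 row:

```python
# Reproducing the MSE ratio for the first univariate ETTh1 row (prediction length 24).
informer_mse = 0.0980
transformer_mse = 0.0548
query_selector_mse = 0.0436

mse_ratio = min(transformer_mse, query_selector_mse) / informer_mse
print(round(mse_ratio, 3))   # 0.445, matching the table
```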

Multivariate

| Data | Prediction len | Informer MSE | Informer MAE | Transformer MSE | Transformer MAE | Query Selector MSE | Query Selector MAE | MSE ratio |
|-------|-----|--------|--------|--------|--------|--------|--------|-------|
| ETTh1 | 24  | 0.5770 | 0.5490 | 0.4496 | 0.4788 | 0.4226 | 0.4627 | 0.732 |
| ETTh1 | 48  | 0.6850 | 0.6250 | 0.4668 | 0.4968 | 0.4581 | 0.4878 | 0.669 |
| ETTh1 | 168 | 0.9310 | 0.7520 | 0.7146 | 0.6325 | 0.6835 | 0.6088 | 0.734 |
| ETTh1 | 336 | 1.1280 | 0.8730 | 0.8321 | 0.7041 | 0.8503 | 0.7039 | 0.738 |
| ETTh1 | 720 | 1.2150 | 0.8960 | 1.1080 | 0.8399 | 1.1150 | 0.8428 | 0.912 |
| ETTh2 | 24  | 0.7200 | 0.6650 | 0.4237 | 0.5013 | 0.4124 | 0.4864 | 0.573 |
| ETTh2 | 48  | 1.4570 | 1.0010 | 1.5220 | 0.9488 | 1.4074 | 0.9317 | 0.966 |
| ETTh2 | 168 | 3.4890 | 1.5150 | 1.6225 | 0.9726 | 1.7385 | 1.0125 | 0.465 |
| ETTh2 | 336 | 2.7230 | 1.3400 | 2.6617 | 1.2189 | 2.3168 | 1.1859 | 0.851 |
| ETTh2 | 720 | 3.4670 | 1.4730 | 3.1805 | 1.3668 | 3.0664 | 1.3084 | 0.884 |
| ETTm1 | 24  | 0.3230 | 0.3690 | 0.3150 | 0.3886 | 0.3351 | 0.3875 | 0.975 |
| ETTm1 | 48  | 0.4940 | 0.5030 | 0.4454 | 0.4620 | 0.4726 | 0.4702 | 0.902 |
| ETTm1 | 96  | 0.6780 | 0.6140 | 0.4641 | 0.4823 | 0.4543 | 0.4831 | 0.670 |
| ETTm1 | 288 | 1.0560 | 0.7860 | 0.6814 | 0.6312 | 0.6185 | 0.5991 | 0.586 |
| ETTm1 | 672 | 1.1920 | 0.9260 | 1.1365 | 0.8572 | 1.1273 | 0.8412 | 0.946 |

State of the Art

[Papers with Code state-of-the-art badges for this model's benchmark entries]

Citation

@misc{klimek2021longterm,
      title={Long-term series forecasting with Query Selector -- efficient model of sparse attention}, 
      author={Jacek Klimek and Jakub Klimek and Witold Kraskiewicz and Mateusz Topolewski},
      year={2021},
      eprint={2107.08687},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}

Contact

If you have any questions, please contact us by email: [email protected]
