"Learning and Analyzing Generation Order for Undirected Sequence Models" in Findings of EMNLP, 2021

Overview

undirected-generation-dev

This repo contains the source code for the models described in the following paper:

  • "Learning and Analyzing Generation Order for Undirected Sequence Models" in Findings of EMNLP, 2021. (paper).

The basic code structure was adapted from NYU's dl4mt-seqgen. We also use pybleu from fairseq to calculate BLEU scores during reinforcement learning.

0. Preparation

0.1 Dependencies

  • PyTorch 1.4.0/1.6.0/1.8.0
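If a compatible PyTorch build is not already installed, it can be installed with pip. A minimal sketch (choose the version and CUDA build that match your environment):

pip install torch==1.8.0  # or torch==1.4.0 / torch==1.6.0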

0.2 Data

The WMT'14 De-En data and the pretrained De-En MLM model are provided by the dl4mt-seqgen repository.

  • Download WMT'14 De-En valid/test data.
  • Then organize the data in data/ and make sure it follows this structure:
------ data
--------- de-en
------------ train.de-en.de.pth
------------ train.de-en.en.pth
------------ valid.de-en.de.pth
------------ valid.de-en.en.pth
------------ test.de-en.de.pth
------------ test.de-en.en.pth
  • Download pretrained models.
  • Then organize the pretrained masked language models in models/ and make sure they follow this structure:
------ models
--------- best-valid_en-de_mt_bleu.pth
--------- best-valid_de-en_mt_bleu.pth
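As a minimal sketch, assuming the downloaded .pth files are sitting in the current working directory, the layout above can be created with:

mkdir -p data/de-en models
mv {train,valid,test}.de-en.{de,en}.pth data/de-en/   # the six WMT'14 De-En data files
mv best-valid_en-de_mt_bleu.pth best-valid_de-en_mt_bleu.pth models/   # the pretrained MLMs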

2. Training the order policy network with reinforcement learning

Train a policy network to predict the generation order for a pretrained De-En masked language model:

./train_scripts/train_order_rl_deen.sh
  • By default, the model checkpoints will be saved in models/learned_order_deen_uniform_4gpu/00_maxlen30_minlen5_bsz32 (see the example below).
  • This script only trains the model on De-En sentence pairs in which both the German and English sentences are between 5 and 30 tokens long. You can change the training parameters max_len and min_len to adjust these length limits.
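For reference, a typical run simply invokes the script and then inspects the checkpoint directory afterwards (a sketch; the run-specific subdirectory, e.g. wj8oc8kab4 below, is generated automatically):

./train_scripts/train_order_rl_deen.sh
ls models/learned_order_deen_uniform_4gpu/00_maxlen30_minlen5_bsz32/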

3. Decode the undirected generation model with learned orders

  • Set the MODEL_CKPT parameter to the corresponding checkpoint path found under models/learned_order_deen_uniform_4gpu/00_maxlen30_minlen5_bsz32. For example:
export MODEL_CKPT=wj8oc8kab4/checkpoint_epoch30+iter96875.pth
  • Evaluate the model on the WMT'14 De-En test set by running:
export MODEL_CKPT=...
./eval_scripts/generate-order-deen.sh $MODEL_CKPT

4. Decode the undirected generation model with heuristic orders

  • Left2Right
./eval_scripts/generate-deen.sh left_right_greedy_1iter
  • Least2Most
./eval_scripts/generate-deen.sh least_most_greedy_1iter
  • EasyFirst
./eval_scripts/generate-deen.sh easy_first_greedy_1iter
  • Uniform
./eval_scripts/generate-deen.sh uniform_greedy_1iter
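To compare all four heuristic orders in one pass, the evaluation script can be run in a simple loop (a minimal sketch using the same arguments listed above):

for order in left_right_greedy_1iter least_most_greedy_1iter easy_first_greedy_1iter uniform_greedy_1iter; do
    ./eval_scripts/generate-deen.sh $order
done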

Citation

@inproceedings{jiang-bansal-2021-learning-analyzing,
    title = "Learning and Analyzing Generation Order for Undirected Sequence Models",
    author = "Jiang, Yichen  and
      Bansal, Mohit",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2021",
    month = nov,
    year = "2021",
    address = "Punta Cana, Dominican Republic",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.findings-emnlp.298",
    pages = "3513--3523",
}