# InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective
This is the official code base for our ICLR 2021 paper:

"InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective"

Boxin Wang, Shuohang Wang, Yu Cheng, Zhe Gan, Ruoxi Jia, Bo Li, Jingjing Liu
## Usage
### Prepare your environment

Install the required packages:
```bash
pip install -r requirements.txt
```
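If you prefer to keep these dependencies isolated from your system Python, you can install them into a fresh virtual environment first. A minimal sketch, assuming a Python 3 interpreter with the built-in `venv` module (the environment name `infobert-env` is just an illustrative choice):

```bash
# Optional: create and activate an isolated environment
# ("infobert-env" is an arbitrary name, not required by the repo)
python3 -m venv infobert-env
source infobert-env/bin/activate

# Install the pinned dependencies from the repository root
pip install -r requirements.txt
```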
### ANLI and TextFooler
To run the ANLI and TextFooler experiments, refer to the README in the `ANLI` directory.
### SQuAD
We will upload the code for the SQuAD experiments soon.
## Citation
```bibtex
@inproceedings{wang2021infobert,
    title={InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective},
    author={Wang, Boxin and Wang, Shuohang and Cheng, Yu and Gan, Zhe and Jia, Ruoxi and Li, Bo and Liu, Jingjing},
    booktitle={International Conference on Learning Representations},
    year={2021}
}
```