Source code for our paper "Empathetic Response Generation with State Management"

Overview

Source code for our paper "Empathetic Response Generation with State Management"

This repository is maintained by both Jun Gao and Yuhan Liu.

Model Overview

(figure: model architecture overview)

Environment Requirement

  • pytorch >= 1.4
  • sklearn
  • nltk
  • numpy
  • bert-score
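
All dependencies are Python packages; a minimal installation sketch (assuming the usual PyPI names, with sklearn installed via scikit-learn) is:

    pip install "torch>=1.4" scikit-learn nltk numpy bert-score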

Dataset

You can directly use the processed dataset located in data/empathetic:

├── data
│   ├── empathetic
│   │   ├── parsed_emotion_Ekman_intent_test.json
│   │   ├── parsed_emotion_Ekman_intent_train.json
│   │   ├── parsed_emotion_Ekman_intent_valid.json
│   │   ├── emotion_intent_trans.mat
│   │   ├── goEmotion_emotion_trans.mat
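
As a quick sanity check that the processed files are in place and are well-formed JSON, you can run something like the following (a minimal sketch, using one of the splits listed above):

    ls data/empathetic
    python -m json.tool data/empathetic/parsed_emotion_Ekman_intent_valid.json > /dev/null && echo "valid JSON"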

Or, if you want to reproduce the data annotated with the goEmotion emotion classifier and the empathetic intent classifier, you can run the following commands (a combined sketch is shown after this list):

  • Convert the raw CSV empathetic dialogue data into JSON format (original dataset link: EmpatheticDialogues).

    bash preprocess_raw.sh
  • Train the emotion classifier on the goEmotion dataset and annotate the dialogues (original dataset link: goEmotion). Here $BERT_DIR is your pretrained BERT model directory, which includes vocab.txt, config.json, and pytorch_model.bin; we simply use bert-base-en from Hugging Face.

    bash ./bash/emotion_annotate.sh  $BERT_DIR 32 0.00005 16 3 1024 2 0.1
  • Train the intent classifier on the empathetic intent dataset and annotate the dialogues (original dataset link: Empathetic_Intent).

    bash ./bash/intent_annotate.sh  $BERT_DIR 32 0.00005 16 3 1024 2 0.1
  • Build the prior emotion-emotion and emotion-intent transition matrices.

    bash ./bash/build_transition_mat.sh
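
Putting these steps together, a full preprocessing run might look like the sketch below; the BERT directory path is only an illustrative assumption, and the numeric arguments are the ones shown above:

    BERT_DIR=/path/to/bert-base-en   # assumption: wherever your pretrained BERT checkpoint lives
    bash preprocess_raw.sh
    bash ./bash/emotion_annotate.sh  $BERT_DIR 32 0.00005 16 3 1024 2 0.1
    bash ./bash/intent_annotate.sh  $BERT_DIR 32 0.00005 16 3 1024 2 0.1
    bash ./bash/build_transition_mat.sh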

Train

To train the LM-based model, first download bert-base-en and gpt2-small from Hugging Face, then run the following command. Here $GPT_DIR and $BERT_DIR are the downloaded model directories:

bash ./bash/train_LM.sh --gpt_path $GPT_DIR --bert_path $BERT_DIR --gpu_id 2 --epoch 5 --lr_NLU 0.00003 --lr_NLG 0.00008 --bsz_NLU 16 --bsz_NLG 16

For example:

bash ./bash/train_LM.sh --gpt_path /home/liuyuhan/datasets/gpt2-small --bert_path /home/liuyuhan/datasets/bert-base-en --gpu_id 2 --epoch 5 --lr_NLU 0.00003 --lr_NLG 0.00008 --bsz_NLU 16 --bsz_NLG 16
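
If you have not downloaded the checkpoints yet, one possible download sketch using git-lfs is shown below; the Hugging Face repository names bert-base-uncased and gpt2 are assumptions standing in for bert-base-en and gpt2-small, and the target paths simply mirror the example above:

    # assumed Hugging Face repositories; replace with the exact checkpoints you use
    git lfs install
    git clone https://huggingface.co/bert-base-uncased /home/liuyuhan/datasets/bert-base-en
    git clone https://huggingface.co/gpt2 /home/liuyuhan/datasets/gpt2-small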

To train the Trs-based model, we use glove.6B.300d as the pretrained word embeddings. Run the following command, where $GLOVE is the GloVe embedding txt file:

bash ./bash/train_Trs.sh --gpu_id 2 --epoch 15 --lr_NLU 0.00007 --lr_NLG 0.0015 --bsz_NLU 16 --bsz_NLG 16 --glove $GLOVE

For example:

bash ./bash/train_Trs.sh --gpu_id 2 --epoch 15 --lr_NLU 0.00007 --lr_NLG 0.0015 --bsz_NLU 16 --bsz_NLG 16 --glove /home/liuyuhan/datasets/glove/glove.6B.300d.txt
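
If you do not already have the GloVe vectors locally, one way to fetch them is the sketch below (the download URL is the standard Stanford NLP mirror, which is an assumption and not part of this repository):

    wget http://nlp.stanford.edu/data/glove.6B.zip
    unzip glove.6B.zip glove.6B.300d.txt -d /home/liuyuhan/datasets/glove/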

Evaluate

To generate the automatic metric results, first make sure that bert-score is successfully installed. In our paper, we use roberta-large-en rescaled with the baseline to calculate BERTScore. You can download roberta-large-en from Hugging Face. For the rescaled_baseline file, you can download it from here and put it under the roberta-large-en model directory.
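
A quick way to confirm that bert-score is installed before running the evaluation script (a minimal check, nothing repository-specific):

    pip show bert-score
    python -c "import bert_score; print(bert_score.__version__)"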

Then run the following command to get the results. Here $hypothesis is the generated response file, the reference is the ground-truth response file, $result is the output result file, and $ROBERTA_DIR is the downloaded roberta-large-en model directory.

To evaluate LM-based model, the command is:

bash ./bash/eval.sh --hyp $hypothesis --ref ./data/empathetic/ref.txt --out $result --bert $ROBERTA_DIR --gpu_id 0 --mode LM

To evaluate Trs-based model, the command is:

bash ./bash/eval.sh --hyp $hypothesis --ref ./data/empathetic/ref_tokenize.txt --out $result --bert $ROBERTA_DIR --gpu_id 0 --mode Trs