# Machine-in-the-Loop Rewriting for Creative Image Captioning

## Data
Annotated data sources used in the paper:
| Data Source | URL |
|---|---|
| Mohammed et al. | Link |
| Gordon et al. | Link |
| Bostan et al. | Link |
| Niculae et al. | Link |
| Steen et al. | Link |
TODO: Individual data cleaning scripts
## Model Training
Follow the README in the `model_training` directory to train a Fairseq BART model. Reach out to us for access to our trained model.
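As a rough sketch of how a fine-tuned checkpoint can then be loaded for generation through fairseq's Python API (the checkpoint and binarized data paths below are placeholders; substitute the outputs of your own training run):

```python
from fairseq.models.bart import BARTModel

# Load the fine-tuned BART checkpoint produced by the model_training recipe.
# The checkpoint directory, file name, and data-bin path are placeholders.
bart = BARTModel.from_pretrained(
    "checkpoints/",
    checkpoint_file="checkpoint_best.pt",
    data_name_or_path="data-bin/",
)
bart.cuda()  # move to GPU if one is available
bart.eval()  # disable dropout for inference

# Rewrite a draft caption with beam search.
drafts = ["a man standing on a beach at sunset"]
rewrites = bart.sample(drafts, beam=5, lenpen=1.0, max_len_b=60)
print(rewrites[0])
```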
## Interface
Code to run the UI we used for the interactive experiments. The UI hosts a server and requires a backend GPU to run model inference during interaction. Each interaction is saved with a unique ID, which we use to match interactions to our crowdworkers for experimental analysis.
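For orientation, a minimal sketch of the kind of inference endpoint the interface relies on is shown below. Flask, the route name, and the logging format are illustrative assumptions, not the actual server code in this directory:

```python
import json
import uuid

from flask import Flask, request, jsonify
from fairseq.models.bart import BARTModel

app = Flask(__name__)

# Load the rewriting model once at startup; paths are placeholders.
bart = BARTModel.from_pretrained(
    "checkpoints/", checkpoint_file="checkpoint_best.pt", data_name_or_path="data-bin/"
)
bart.cuda()
bart.eval()


@app.route("/rewrite", methods=["POST"])
def rewrite():
    """Generate a rewrite for the submitted caption and log the interaction."""
    payload = request.get_json()
    draft = payload["caption"]
    suggestion = bart.sample([draft], beam=5)[0]

    # Tag the interaction with a unique ID so it can later be matched
    # to a crowdworker for experimental analysis.
    interaction_id = str(uuid.uuid4())
    with open("interactions.jsonl", "a") as log:
        log.write(json.dumps(
            {"id": interaction_id, "draft": draft, "suggestion": suggestion}
        ) + "\n")

    return jsonify({"id": interaction_id, "suggestion": suggestion})


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```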