Code for "Adversarial Training for a Hybrid Approach to Aspect-Based Sentiment Analysis

Overview

HAABSAStar

Code for "Adversarial Training for a Hybrid Approach to Aspect-Based Sentiment Analysis". This project builds on the code from https://github.com/ofwallaart/HAABSA and https://github.com/mtrusca/HAABSA_PLUS_PLUS.

All software is written in PYTHON3 (https://www.python.org/) and makes use of the TensorFlow framework (https://www.tensorflow.org/).

Installation Instructions (Windows):

Download the required files and add them to the data/externalData folder (a sanity-check sketch follows this list):

  1. Download ontology: https://github.com/KSchouten/Heracles/tree/master/src/main/resources/externalData
  2. Download SemEval2015 Datasets: http://alt.qcri.org/semeval2015/task12/index.php?id=data-and-tools
  3. Download SemEval2016 Dataset: http://alt.qcri.org/semeval2016/task5/index.php?id=data-and-tools
  4. Download Glove Embeddings: http://nlp.stanford.edu/data/glove.42B.300d.zip
  5. Download Stanford CoreNLP parser: https://nlp.stanford.edu/software/stanford-parser-full-2018-02-27.zip
  6. Download Stanford CoreNLP Language models: https://nlp.stanford.edu/software/stanford-english-corenlp-2018-02-27-models.jar
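After downloading and unzipping, a quick check can save debugging time later. The snippet below is a minimal sketch that verifies the expected files are present in data/externalData; the exact filenames after unzipping depend on your downloads and are placeholders here.

```python
import os

# Hypothetical filenames: adjust to match what you actually downloaded/unzipped.
EXPECTED = [
    "ontology.owl",                                    # Heracles ontology
    "glove.42B.300d.txt",                              # unzipped GloVe embeddings
    "stanford-parser-full-2018-02-27",                 # unzipped CoreNLP parser folder
    "stanford-english-corenlp-2018-02-27-models.jar",  # CoreNLP language models
]

external_dir = os.path.join("data", "externalData")
missing = [name for name in EXPECTED
           if not os.path.exists(os.path.join(external_dir, name))]
if missing:
    print("Missing from data/externalData:", ", ".join(missing))
else:
    print("All expected external data files found.")
```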

Setup Environment

  1. Install chocolatey (a package manager for Windows): https://chocolatey.org/install
  2. Open a command prompt.
  3. Install python3 by running the following command: `choco install python` (http://docs.python-guide.org/en/latest/starting/install3/win/).
  4. Make sure that pip is installed and use pip to install the following packages: setuptools and virtualenv (http://docs.python-guide.org/en/latest/dev/virtualenvs/#virtualenvironments-ref).
  5. Create a virtual environment in a desired location by running the following command: `virtualenv ENV_NAME`
  6. Navigate to the virtual environment source directory.
  7. Unzip the zip file of this GitHub repository in the virtual environment directory.
  8. Activate the virtual environment with the following command: `Scripts\activate.bat`.
  9. Install the required packages from the requirements.txt file by running the following command: `pip install -r requirements.txt`.
  10. Install the required spaCy English language model by running the following command: `python -m spacy download en` (a quick import check is sketched below).
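To confirm the environment is set up correctly, a short sanity check can be run from inside the activated virtual environment. This is only an illustrative sketch, not part of the repository:

```python
# Run inside the activated virtual environment to confirm the installation.
import tensorflow as tf
import spacy

print("TensorFlow version:", tf.__version__)

# Loading the English model verifies that `python -m spacy download en` worked.
nlp = spacy.load("en")
print("spaCy tokens:", [token.text for token in nlp("A quick test sentence.")])
```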

Note: the files BERT768embedding2015.txt and BERT768embedding2016.txt are too large for GitHub. These can be generated using getBERTusingColab.py.

Configure paths

The following scripts contain file paths that must be adapted to your computer (this is done by adding the path to your virtual environment before the filename, for example "/path/to/venv"+"data/programGeneratedData/GloVetraindata"): main_cross.py, main_hyper.py, config.py, HyperDataMaker.py, adversarial.py.
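In practice this amounts to prepending an absolute path to every generated-data filename. A minimal sketch of the idea is shown below; the variable and function names are illustrative, not the ones used in config.py:

```python
import os

# Illustrative only: the actual variable names in config.py may differ.
VENV_PATH = "/path/to/venv"  # absolute path to your virtual environment directory

def venv_file(relative_path):
    """Prefix a repository-relative filename with the virtual environment path."""
    return os.path.join(VENV_PATH, relative_path)

train_data = venv_file("data/programGeneratedData/GloVetraindata")
print(train_data)  # /path/to/venv/data/programGeneratedData/GloVetraindata
```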

Run Software

  1. Configure one of the three main files to the required configuration (main.py, main_cross.py, main_hyper.py); a minimal example of the flag configuration in main.py is sketched below.
  2. Run the program from the command line with the following command: `python PROGRAM_TO_RUN.py` (where PROGRAM_TO_RUN is main/main_cross/main_hyper)
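For example, running the adversarial method comes down to flipping one boolean near the top of main.py. The sketch below only illustrates this flag pattern; runAdversarial is the flag named under Software explanation below, while the other names are placeholders:

```python
# Illustrative excerpt of the run flags in main.py.
# Only runAdversarial is named in this README; the other flags are placeholders.
runLCRROTALT   = False   # LCR-Rot-hop baseline
runOntology    = False   # ontology reasoner
runSVM         = False   # BoW + SVM baseline
runAdversarial = True    # adversarial training for LCR-Rot-hop
```

With the desired flag set to True, the run is started with `python main.py` as described in step 2.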

Software explanation:

The environment contains the following main files that can be run: main.py, main_cross.py, main_hyper.py

  • main.py: program to run single in-sample and out-of-sample validation runs. Each method can be activated by setting its corresponding boolean to True, e.g. to run the adversarial method set runAdversarial = True (as sketched under Run Software above).

  • main_cross.py: similar to main.py but runs a 10-fold cross validation procedure for each method.

  • main_hyper.py: program that performs hyperparameter optimization over a given space of hyperparameters for each method. To change the method, change the objective and space parameters in the run_a_trial() function (a generic hyperopt sketch follows this list).

  • config.py: contains parameter configurations that can be changed such as: dataset_year, batch_size, iterations.

  • dataReader2016.py, loadData.py: files used to read in the raw data and transform them to the required formats to be used by one of the algorithms

  • lcrModel.py: Tensorflow implementation for the LCR-Rot algorithm

  • lcrModelAlt.py: Tensorflow implementation for the LCR-Rot-hop algorithm

  • lcrModelInverse.py: Tensorflow implementation for the LCR-Rot-inv algorithm

  • cabascModel.py: Tensorflow implementation for the CABASC algorithm

  • OntologyReasoner.py: PYTHON implementation for the ontology reasoner

  • svmModel.py: PYTHON implementation for a BoW model using a SVM.

  • adversarial.py: Tensorflow implementation of adversarial training for LCR-Rot-hop

  • att_layer.py, nn_layer.py, utils.py: programs that declare additional functions used by the machine learning algorithms.
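As noted in the main_hyper.py bullet above, hyperparameter optimization follows the usual hyperopt pattern of an objective function minimized over a search space. The sketch below is a generic illustration of that pattern with placeholder hyperparameters, not the repository's actual run_a_trial() code:

```python
from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

# Placeholder search space: main_hyper.py defines the real hyperparameters
# of the selected method (e.g. learning rate, keep probability).
space = {
    "learning_rate": hp.loguniform("learning_rate", -10, -1),
    "keep_prob": hp.uniform("keep_prob", 0.25, 0.75),
}

def objective(params):
    # In main_hyper.py the objective trains the chosen model with `params`
    # and returns a loss such as 1 - accuracy; here a dummy value is returned.
    return {"loss": 1.0 - params["keep_prob"], "status": STATUS_OK}

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=10, trials=trials)
print("Best hyperparameters found:", best)
```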

Directory explanation:

The following directories are necessary for the virtual environment setup: __pycache__, \Include, \Lib, \Scripts, \tcl, \venv

  • cross_results_2015: Results for a k-fold cross validation process for the SemEval-2015 dataset
  • cross_results_2016: Results for a k-fold cross validation process for the SemEval-2016 dataset
  • Results_Run_Adversarial: If WriteFile = True, a csv with accuracies per iteration is saved here
  • data:
    • externalData: Location for the external data required by the methods
    • programGeneratedData: Location for preprocessed data that is generated by the programs
  • hyper_results: Contains the stored results of hyperparameter optimization for each method
  • results: temporary store location for the hyperopt package

Changed files with respect to https://github.com/mtrusca/HAABSA_PLUS_PLUS:

  • main.py
  • main_hyper.py
  • main_cross.py
  • config.py
  • adversarial.py (added)