
KGEval

A framework for evaluating Knowledge Graph Embedding Models in a fine-grained manner.

The framework and experimental results are described in Ben Rim et al. 2021 (Outstanding Paper Award, AKBC 2021).

Instructions

Create a virtual environment

virtualenv -p python3.6 eval_env
source eval_env/bin/activate
pip install -r requirements.txt

Download data

In the main folder, run:

source data/download.sh

Download models

If you want to test the framework immediately, you can download pre-trained PyKEEN models by running:

source download_models.sh
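
If you want to inspect one of the downloaded models directly in Python, a PyKEEN checkpoint is usually a pickled model object that torch.load can restore. This is only a sketch; the path below is an assumption about where download_models.sh places the files:

import torch

# Hypothetical path: adjust to wherever download_models.sh stores the checkpoints.
model = torch.load("models/FB15K237/rotate/trained_model.pkl", map_location="cpu")
print(type(model))  # expected to be a pykeen.models.* instance with its learned embeddings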

Generate behavioral tests

Symmetry Tests

You can choose --dataset from FB15K237, WN18RR, or YAGO310:

python tests/run.py --dataset FB15K237 --mode generate --capability symmetry

This should result in the following output, and the files for each test set will be added under behavioral_tests/dataset/symmetry:

2021-10-03 23:37:35,060 - [INFO] - Preparing test sets for the dataset FB15K237
2021-10-03 23:37:37,621 - [INFO] - ########################## <----TRAIN---> ############################
2021-10-03 23:37:37,621 - [INFO] - 0 repetitions removed
2021-10-03 23:37:37,621 - [INFO] - 272115 triples remaining in train set
2021-10-03 23:37:37,621 - [INFO] - 6778 symmetric triples found in train set
2021-10-03 23:37:37,786 - [INFO] - ########################## <----TEST---> ############################
2021-10-03 23:37:37,786 - [INFO] - 0 repetitions removed
2021-10-03 23:37:37,786 - [INFO] - 20466 triples remaining in test set
2021-10-03 23:37:37,786 - [INFO] - 113 symmetric triples found in test set
2021-10-03 23:37:37,806 - [INFO] - ########################## <----VALID---> ############################
2021-10-03 23:37:37,806 - [INFO] - 0 repetitions removed
2021-10-03 23:37:37,806 - [INFO] - 17535 triples remaining in valid set
2021-10-03 23:37:37,806 - [INFO] - 113 symmetric triples found in valid set
2021-10-03 23:37:39,106 - [INFO] - #################### <---TEST SET 1: MEMORIZATION ---> ##########################
2021-10-03 23:37:39,106 - [INFO] - There are 5470 entries in the memorization set (occur in both directions)
2021-10-03 23:37:39,106 - [INFO] - #################### <---TEST SET 2: ONE DIRECTION SEEN ---> ##########################
2021-10-03 23:37:39,106 - [INFO] - There are 1308 entries not shown in both directions (to be reversed for testing)
2021-10-03 23:37:39,836 - [INFO] - #################### <--- SYMMETRIC RELATIONS ---> ##########################
2021-10-03 23:37:39,836 - [INFO] - TRAIN SET contains 6778 symmetric entries
2021-10-03 23:37:39,836 - [INFO] - TEST SET contains  113 symmetric entries with 113 not in training
2021-10-03 23:37:39,836 - [INFO] - VALID SET contains 113 symmetric entries with 113 not in training
2021-10-03 23:37:39,839 - [INFO] - #################### <---TEST SET 3: UNSEEN INSTANCES ---> ##########################
2021-10-03 23:37:39,840 - [INFO] - There are 226 entries that are not seen in any direction in training
2021-10-03 23:37:40,267 - [INFO] - #################### <---TEST SET 4: ASYMMETRY ---> ##########################
2021-10-03 23:37:40,267 - [INFO] - There are 3000 asymmetric entries in test set added to test 4

Hierarchy Tests

Only available for the FB15K237 dataset:

python tests/run.py --dataset FB15K237 --mode generate --capability hierarchy

The output is shown below, and the generated files will be available under behavioral_tests/dataset/hierarchy/. Each file is named after the level of the tail entity's type; for example, 1.txt contains triples whose tail has a type at level 1 of the entity type hierarchy:

2021-10-04 01:38:13,517 - [INFO] - Results of Hierarchy Behavioral Tests for FB15K237
2021-10-04 01:38:20,367 - [INFO] - <--------------- Entity Hiararchy statistics ----------------->
2021-10-04 01:38:20,568 - [INFO] - Level 0 contains 1 types and 3415 triples
2021-10-04 01:38:20,887 - [INFO] - Level 1 contains 66 types and 2006 triples
2021-10-04 01:38:20,900 - [INFO] - Level 2 contains 136 types and 4273 triples
2021-10-04 01:38:20,913 - [INFO] - Level 3 contains 213 types and 3560 triples
2021-10-04 01:38:20,923 - [INFO] - Level 4 contains 262 types and 3369 triples
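
Each level's triples are written to a separate file (for example 1.txt for level 1), so the per-level counts above can be re-derived from disk. A small sketch, assuming the files end up under behavioral_tests/FB15K237/hierarchy/:

import glob, os

# Hypothetical location, following the behavioral_tests/<dataset>/hierarchy/ layout described above.
for path in sorted(glob.glob("behavioral_tests/FB15K237/hierarchy/*.txt")):
    with open(path) as f:
        count = sum(1 for line in f if line.strip())
    print(os.path.basename(path), count, "triples")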

Run Tests (pykeen models)

Symmetry behavioral tests on distmult or rotate:

python tests/run.py --dataset FB15K237 --mode test --model_name rotate

The output will be printed as shown below, and will also be available in the results folder under dataset/symmetry:

2021-10-04 14:00:57,100 - [INFO] - Starting test1 with rotate model
2021-10-04 14:03:23,249 - [INFO] - On test1, MR: 1.2407678244972578, MRR: 0.9400152688974949, Hits@1: 0.9014624953269958, Hits@3: 0.988482654094696, Hits@10: 0.9965264797210693
2021-10-04 14:03:23,249 - [INFO] - Starting test2 with rotate model
2021-10-04 14:04:15,614 - [INFO] - On test2, MR: 23.446483180428135, MRR: 0.4409348919640765, Hits@1: 0.30351680517196655, Hits@3: 0.5894495248794556, Hits@10: 0.7025994062423706
2021-10-04 14:04:15,614 - [INFO] - Starting test3 with rotate model
2021-10-04 14:04:25,364 - [INFO] - On test3, MR: 1018.9469026548672, MRR: 0.04786047740344238, Hits@1: 0.008849557489156723, Hits@3: 0.06194690242409706, Hits@10: 0.12389380484819412
2021-10-04 14:04:25,365 - [INFO] - Starting test4 with rotate model
2021-10-04 14:05:38,900 - [INFO] - On test4, MR: 4901.459, MRR: 0.07606098649786266, Hits@1: 0.9496666789054871, Hits@3: 0.893666684627533, Hits@10: 0.8823333382606506
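
For reference, these numbers follow the standard rank-based definitions: MR is the mean rank of the correct entity, MRR the mean reciprocal rank, and Hits@k the fraction of test triples whose correct entity is ranked within the top k. A minimal sketch of those formulas (not the framework's own implementation):

def rank_metrics(ranks, ks=(1, 3, 10)):
    # ranks: 1-based rank of the correct entity for each test triple
    n = len(ranks)
    mr = sum(ranks) / n
    mrr = sum(1.0 / r for r in ranks) / n
    hits = {k: sum(r <= k for r in ranks) / n for k in ks}
    return mr, mrr, hits

print(rank_metrics([1, 2, 5, 12]))  # MR=5.0, MRR~0.446, Hits@1=0.25, Hits@3=0.5, Hits@10=0.75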

Hierarchy behavioral tests on distmult or rotate:

python tests/run.py --dataset FB15K237 --mode test --capability hierarchy --model_name rotate

Run Tests on other models and other frameworks

(To be added)

Owner

NEC Laboratories Europe. Research software developed at NEC Laboratories Europe.