"Inductive Entity Representations from Text via Link Prediction" @ The Web Conference 2021

Overview

This repository contains the code used for the experiments in the paper "Inductive entity representations from text via link prediction", presented at The Web Conference 2021. To cite our work, please use the following:

@inproceedings{daza2021inductive,
    title = {Inductive Entity Representations from Text via Link Prediction},
    author = {Daniel Daza and Michael Cochez and Paul Groth},
    booktitle = {Proceedings of The Web Conference 2021},
    year = {2021},
    doi = {10.1145/3442381.3450141},
}

In this work, we show how a BERT-based text encoder can be fine-tuned with a link prediction objective, on a graph where entities have an associated textual description. We call the resulting model BLP. A trained BLP model has three useful properties (a minimal sketch of the idea follows the list):

  • It can predict links between entities, even if one or both were not seen during training.
  • It produces entity representations that are useful for training a classifier, without retraining the encoder.
  • It improves an information retrieval system by better matching entities with questions about them.
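
At a high level, BLP embeds each entity from its textual description with BERT and scores triples with a relational model such as TransE. The following is a minimal sketch of this idea only, not the repository's implementation; the model name, projection dimension, and relation count below are placeholders:

import torch
import torch.nn as nn
from transformers import AutoModel

class BLPSketch(nn.Module):
    # Sketch: BERT text encoder + TransE triple scoring.
    def __init__(self, dim=128, num_relations=11):
        super().__init__()
        self.encoder = AutoModel.from_pretrained('bert-base-cased')
        self.proj = nn.Linear(self.encoder.config.hidden_size, dim)
        self.rel_emb = nn.Embedding(num_relations, dim)

    def embed_entity(self, input_ids, attention_mask):
        # Embed an entity from its tokenized description ([CLS] output).
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        return self.proj(out.last_hidden_state[:, 0])

    def score(self, head_emb, rel_ids, tail_emb):
        # TransE: a triple is plausible if head + relation is close to tail.
        return -torch.norm(head_emb + self.rel_emb(rel_ids) - tail_emb, p=1, dim=-1)

Because entity embeddings come from text rather than a lookup table, unseen entities can still be embedded at test time, which is what makes the model inductive.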

Usage

Follow the instructions below to reproduce our experiments, or to train a model on your own data.

1. Install the requirements

Creating a new environment (e.g. with conda) is recommended. Use requirements.txt to install the dependencies:

conda create -n blp python=3.7
conda activate blp
pip install -r requirements.txt

2. Download the data

Download the required compressed datasets into the data folder:

Download link                    Size (compressed)
UMLS (small graph for tests)     121 KB
WN18RR                           6.6 MB
FB15k-237                        21 MB
Wikidata5M                       1.4 GB
GloVe embeddings                 423 MB
DBpedia-Entity                   1.3 GB

Then use tar to extract the files, e.g.

tar -xzvf WN18RR.tar.gz
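
Alternatively, if you downloaded all archives into the data folder, the following Python loop extracts them in one go (a convenience sketch, equivalent to running tar on each file):

import glob
import tarfile

# Extract every downloaded archive into the data folder.
for path in glob.glob('data/*.tar.gz'):
    with tarfile.open(path, 'r:gz') as archive:
        archive.extractall('data')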

Note that the KG-related files above contain both transductive and inductive splits. Transductive splits are commonly used to evaluate lookup-table methods like ComplEx, while inductive splits contain entities in the test set that are not present in the training set. Files with triples for the inductive case have the ind prefix, e.g. ind-train.txt.
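
As an illustrative sanity check of the inductive split (assuming the dataset extracted to data/WN18RR and contains one tab-separated triple per line), you can count how many test entities never appear in training:

def entities_in(path):
    # Collect all head and tail entities mentioned in a triples file.
    ents = set()
    with open(path) as f:
        for line in f:
            head, _, tail = line.strip().split('\t')
            ents.update((head, tail))
    return ents

train_ents = entities_in('data/WN18RR/ind-train.txt')
test_ents = entities_in('data/WN18RR/ind-test.txt')
unseen = test_ents - train_ents
print(f'{len(unseen)} of {len(test_ents)} test entities are unseen in training')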

3. Reproduce the experiments

Link prediction

To check that all dependencies are correctly installed, run a quick test on a small graph (this should take less than a minute on a GPU):

./scripts/test-umls.sh

The following table is adapted from our paper. The "Script" column lists the script that reproduces the experiment for the corresponding model and dataset. For example, to reproduce the results of BLP-TransE on FB15k-237, run

./scripts/blp-transe-fb15k237.sh

                WN18RR                          FB15k-237                         Wikidata5M
Model           MRR     Script                  MRR     Script                    MRR     Script
GloVe-BOW       0.170   glove-bow-wn18rr.sh     0.172   glove-bow-fb15k237.sh     0.343   glove-bow-wikidata5m.sh
BE-BOW          0.180   bert-bow-wn18rr.sh      0.173   bert-bow-fb15k237.sh      0.362   bert-bow-wikidata5m.sh
GloVe-DKRL      0.115   glove-dkrl-wn18rr.sh    0.112   glove-dkrl-fb15k237.sh    0.282   glove-dkrl-wikidata5m.sh
BE-DKRL         0.139   bert-dkrl-wn18rr.sh     0.144   bert-dkrl-fb15k237.sh     0.322   bert-dkrl-wikidata5m.sh
BLP-TransE      0.285   blp-transe-wn18rr.sh    0.195   blp-transe-fb15k237.sh    0.478   blp-transe-wikidata5m.sh
BLP-DistMult    0.248   blp-distmult-wn18rr.sh  0.146   blp-distmult-fb15k237.sh  0.472   blp-distmult-wikidata5m.sh
BLP-ComplEx     0.261   blp-complex-wn18rr.sh   0.148   blp-complex-fb15k237.sh   0.489   blp-complex-wikidata5m.sh
BLP-SimplE      0.239   blp-simple-wn18rr.sh    0.144   blp-simple-fb15k237.sh    0.493   blp-simple-wikidata5m.sh
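
The MRR columns report mean reciprocal rank over the test triples: the average of 1/rank, where rank is the position of the correct entity among all candidates. A minimal reference computation:

def mrr(ranks):
    # ranks[i] is the rank of the correct entity for the i-th test triple.
    return sum(1.0 / r for r in ranks) / len(ranks)

print(mrr([1, 2, 10]))  # (1 + 0.5 + 0.1) / 3 = 0.533...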

Entity classification

After training for link prediction, a tensor with embeddings for all entities is computed and saved in a file named ent_emb-[ID].pt, where [ID] is the ID of the experiment in the database (we use Sacred to manage experiments). Another file, ents-[ID].pt, contains the entity identifier for each row of the embedding tensor.
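
These files can be inspected directly with PyTorch; for example, for a run with ID 199 (the shapes in the comments are indicative):

import torch

ent_emb = torch.load('output/ent_emb-199.pt', map_location='cpu')  # tensor of shape [num_entities, dim]
entities = torch.load('output/ents-199.pt', map_location='cpu')    # one entity identifier per row of ent_emb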

To ease reproducibility, we provide these tensors, which are required for the entity classification task. Click on an ID in the table below, download the file into the output folder, and decompress it. An experiment can then be reproduced using the following command:

python train.py node_classification with checkpoint=ID dataset=DATASET

where DATASET is either WN18RR or FB15k-237. For example:

python train.py node_classification with checkpoint=199 dataset=WN18RR

                WN18RR           FB15k-237
Model           Acc.    ID       Bal. Acc.  ID
GloVe-BOW       55.3    219      34.4       293
BE-BOW          60.7    218      28.3       296
GloVe-DKRL      55.5    206      26.6       295
BE-DKRL         48.8    207      30.9       294
BLP-TransE      81.5    199      42.5       297
BLP-DistMult    78.5    200      41.0       298
BLP-ComplEx     78.1    201      38.1       300
BLP-SimplE      83.0    202      45.7       299
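
Independently of train.py, the saved embeddings can be probed with any off-the-shelf classifier. Below is a sketch using scikit-learn with random placeholder labels; in practice you would substitute one real class label per entity:

import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X = torch.load('output/ent_emb-199.pt', map_location='cpu').numpy()
y = np.random.randint(0, 4, size=len(X))  # placeholder labels; replace with real ones

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print('accuracy:', clf.score(X_te, y_te))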

Information retrieval

This task requires a pre-trained model saved from the link prediction task. For example, if a BLP model was trained with the TransE relational model and saved as model.pt, run the information retrieval task with:

python retrieval.py with model=blp rel_model=transe \
checkpoint='output/model.pt'

Using your own data

If you have a knowledge graph where entities have textual descriptions, you can train a BLP model for inductive link prediction, as well as for entity classification (if you also have entity labels).

To do this, add a new folder inside the data folder (let's call it my-kg). Store in it a file containing the triples in your KG. This should be a text file with one tab-separated triple per line (let's call it all-triples.tsv).
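
Each line of this file holds one head, relation, and tail separated by tabs; for example (the entity and relation names below are placeholders):

anfield	located_in	liverpool
liverpool	part_of	england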

To generate inductive splits, you can use data/utils.py. From inside the data folder, run

python utils.py drop_entities --file=my-kg/all-triples.tsv

This will generate ind-train.tsv, ind-dev.tsv, and ind-test.tsv inside my-kg (see Appendix A in our paper for details on how these splits are generated). You can then train BLP-TransE with

python train.py with dataset='my-kg'
