

HyTE

HyTE: Hyperplane-based Temporally aware Knowledge Graph Embedding

Source code and dataset for EMNLP 2018 paper: HyTE: Hyperplane-based Temporally aware Knowledge Graph Embedding.

Overview of HyTE (proposed method): a temporally aware KG embedding method that explicitly incorporates time in the entity-relation space by associating each timestamp with a corresponding hyperplane. HyTE not only performs KG inference using temporal guidance, but also predicts temporal scopes for relational facts with missing time annotations. Please refer to the paper for more details.
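The core idea above can be sketched in NumPy: each timestamp gets a hyperplane with unit normal w_tau, embeddings are projected onto that hyperplane, and a TransE-style translational score is computed there. This is an illustrative sketch of the scoring function from the paper, not the repository's TensorFlow implementation.

```python
import numpy as np

def project(v, w):
    """Project embedding v onto the hyperplane with unit normal w:
    P_tau(v) = v - (w . v) w."""
    return v - np.dot(w, v) * w

def hyte_score(h, r, t, w_tau):
    """Translational score on the timestamp-specific hyperplane:
    || P_tau(h) + P_tau(r) - P_tau(t) ||_1 (lower means more plausible)."""
    return np.linalg.norm(
        project(h, w_tau) + project(r, w_tau) - project(t, w_tau), ord=1
    )

rng = np.random.default_rng(0)
dim = 128                                   # matches the default -inp_dim
h, r, t = (rng.normal(size=dim) for _ in range(3))
w_tau = rng.normal(size=dim)
w_tau /= np.linalg.norm(w_tau)              # hyperplane normals are unit vectors

print(hyte_score(h, r, t, w_tau))
```

During training, this score is pushed below a margin for observed (h, r, t, tau) quadruples and above it for negative samples, which is what the -margin and -neg_sample options below control.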

Dependencies

  • Compatible with TensorFlow 1.x and Python 3.x.
  • Dependencies can be installed using requirements.txt.

Dataset:

  • Download the processed version of the WikiData and YAGO datasets.
  • Unzip the .zip file in the data directory.
  • Documents are originally taken from YAGO and Wikidata.

Usage:

  • Install the Python dependencies listed in requirements.txt.
  • time_proj.py contains the TensorFlow (1.x) implementation of HyTE (the proposed method).
  • To start training:
    python time_proj.py -name yago_data_neg_sample_5_mar_10_l2_0.00 -margin 10 -l2 0.00 -neg_sample 5 -gpu 5 -epoch 2000 -data_type yago -version large -test_freq 5
  • Some of the important available options:
    '-data_type',  default='yago',    choices=['yago','wiki_data'],  help='dataset to choose'
    '-version',    default='large',   choices=['large','small'],     help='data version to choose'
    '-test_freq',  default=25,        type=int,    help='testing frequency'
    '-neg_sample', default=5,         type=int,    help='negative samples for training'
    '-gpu',        dest='gpu',        default='1', help='GPU to use'
    '-name',       dest='name',       help='name of the run'
    '-lr',         dest='lr',         default=0.0001, type=float, help='learning rate'
    '-margin',     dest='margin',     default=1,      type=float, help='margin'
    '-batch',      dest='batch_size', default=50000,  type=int,   help='batch size'
    '-epoch',      dest='max_epochs', default=5000,   type=int,   help='max epochs'
    '-l2',         dest='l2',         default=0.0,    type=float, help='L2 regularization'
    '-seed',       dest='seed',       default=1234,   type=int,   help='seed for randomization'
    '-inp_dim',    dest='inp_dim',    default=128,    type=int,   help='embedding dimension'
    '-L1_flag',    dest='L1_flag',    action='store_false',       help='use L1 distance'

Evaluation:

  • Validate after training.
  • Use the same model name and test frequency used during training as arguments for the following evaluation:
  • For getting best validation MR and hit@10 for head and tail prediction:
   python result_eval.py -eval_mode valid -model yago_data_neg_sample_5_mar_10_l2_0.00 -test_freq 5
  • For getting best validation MR and hit@10 for relation prediction:
   python result_eval_relation.py -eval_mode valid -model yago_data_neg_sample_5_mar_10_l2_0.00  -test_freq 5

The evaluation run will output the best validation rank and the epoch at which it was achieved. Note these down for obtaining results on the test set.
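For reference, the two reported metrics can be computed from a list of ranks of the correct entity as follows. This is a minimal illustration of what MR and hits@10 mean, not the repository's evaluation script, and the ranks shown are hypothetical.

```python
def mean_rank(ranks):
    # Mean Rank (MR): average rank of the correct entity; lower is better.
    return sum(ranks) / len(ranks)

def hits_at_k(ranks, k=10):
    # hits@k: fraction of test triples whose correct entity ranks in the top k.
    return sum(1 for r in ranks if r <= k) / len(ranks)

ranks = [1, 4, 12, 2, 37]          # hypothetical ranks of the correct entity
print(mean_rank(ranks))            # 11.2
print(hits_at_k(ranks, k=10))      # 0.6
```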

Testing:

  • Test after validation using the best validation weights.
  • First run the time_proj.py script once to restore the parameters and dump the predictions corresponding to the test set:
 python time_proj.py -res_epoch `Best Validation Epoch` -onlyTest -restore -name yago_data_neg_sample_5_mar_10_l2_0.00 -margin 10 -l2 0.00 -neg_sample 5 -gpu 0 -data_type yago -version large
  • Now evaluate the test predictions to obtain MR and hits@10 using:
python result_eval.py -eval_mode test -test_freq `Best Validation Epoch` -model yago_data_neg_sample_5_mar_10_l2_0.00

Citing:

@InProceedings{D18-1225,
  author = 	"Dasgupta, Shib Sankar
		and Ray, Swayambhu Nath
		and Talukdar, Partha",
  title = 	"HyTE: Hyperplane-based Temporally aware Knowledge Graph Embedding",
  booktitle = 	"Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing",
  year = 	"2018",
  publisher = 	"Association for Computational Linguistics",
  pages = 	"2001--2011",
  location = 	"Brussels, Belgium",
  url = 	"http://aclweb.org/anthology/D18-1225"
}
