Graph Neural Networks for Natural Language Processing
This repository contains code examples for the GNN-for-NLP tutorial at EMNLP 2019 and CODS-COMAD 2020.
Slides can be downloaded from here.
Dependencies
- Compatible with PyTorch 1.x, TensorFlow 1.x and Python 3.x.
- Dependencies can be installed using `requirements.txt`, e.g. with `pip install -r requirements.txt`.
TensorFlow Examples:
- `tf_gcn.py` contains a simplified implementation of the first-order approximation of the GCN model proposed by Kipf et al. (2016); a minimal layer sketch is given after this list.
- Extensions of the same implementation for different problems:
  - Relation Extraction: RESIDE
  - GCNs for Word Embeddings: WordGCN
  - Document Time-stamping: NeuralDater
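As a rough illustration (not the repository's actual code), a single first-order GCN layer in TensorFlow 1.x might look like the sketch below; the function name `gcn_layer` and the dense normalized-adjacency input are assumptions made for brevity.

```python
import tensorflow as tf  # TensorFlow 1.x


def gcn_layer(norm_adj, features, out_dim, act=tf.nn.relu, name="gcn"):
    """One first-order GCN layer: act(A_hat @ H @ W).

    norm_adj : [N, N] symmetrically normalized adjacency D^-1/2 (A + I) D^-1/2
    features : [N, F_in] node feature matrix H
    """
    in_dim = features.get_shape().as_list()[-1]
    with tf.variable_scope(name):
        w = tf.get_variable("w", shape=[in_dim, out_dim],
                            initializer=tf.glorot_uniform_initializer())
        support = tf.matmul(features, w)       # H W
        output = tf.matmul(norm_adj, support)  # A_hat H W
        return act(output)
```

In practice the adjacency matrix is usually stored as a sparse tensor and multiplied with `tf.sparse.sparse_dense_matmul`, but the dense version above keeps the propagation rule easy to read.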
PyTorch Examples:
- `pytorch_gcn.py` is the PyTorch equivalent of `tf_gcn.py`, implemented using pytorch-geometric; a minimal sketch is given below.
- Several other examples are available here.
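For reference, a minimal two-layer GCN built with pytorch-geometric's `GCNConv` could look like the following; the class name `TwoLayerGCN` and the layer dimensions are illustrative assumptions, not the contents of `pytorch_gcn.py`.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv  # pytorch-geometric GCN layer


class TwoLayerGCN(torch.nn.Module):
    """Two stacked GCNConv layers for node classification."""

    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        # x: [num_nodes, in_dim] node features
        # edge_index: [2, num_edges] graph connectivity in COO format
        x = F.relu(self.conv1(x, edge_index))
        x = F.dropout(x, p=0.5, training=self.training)
        return self.conv2(x, edge_index)
```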
Additional Resources:
- Short write-up on the theory behind Graph Convolutional Networks [Pdf] (refer to Chapter 2).
- Recent GNN papers.
Citation:
@inproceedings{vashishth-etal-2019-graph,
title = "Graph-based Deep Learning in Natural Language Processing",
author = "Vashishth, Shikhar and
Yadati, Naganand and
Talukdar, Partha",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP): Tutorial Abstracts",
month = nov,
year = "2019",
address = "Hong Kong, China",
publisher = "Association for Computational Linguistics",
abstract = "This tutorial aims to introduce recent advances in graph-based deep learning techniques such as Graph Convolutional Networks (GCNs) for Natural Language Processing (NLP). It provides a brief introduction to deep learning methods on non-Euclidean domains such as graphs and justifies their relevance in NLP. It then covers recent advances in applying graph-based deep learning methods for various NLP tasks, such as semantic role labeling, machine translation, relationship extraction, and many more.",
}