
Repository Details

Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)

How Attentive are Graph Attention Networks?

This repository is the official implementation of "How Attentive are Graph Attention Networks?".

January 2022: the paper was accepted to ICLR'2022!

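The paper's key observation is that the original GAT layer computes only a static form of attention: the ranking of its attention coefficients over keys is the same for every query node. GATv2 makes the attention dynamic simply by reordering the operations in the scoring function. Below is a minimal sketch of the two scoring functions from the paper; the dimensions are illustrative, and this is not the repository's implementation:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    d_in, d_hid = 16, 8  # illustrative sizes

    # GAT: e(h_i, h_j) = LeakyReLU(a^T [W h_i || W h_j]).
    # Since LeakyReLU is monotonic, the ranking over keys j is determined
    # by a^T's second half applied to W h_j alone, i.e. it is the same for
    # every query i ("static" attention).
    W_gat = nn.Linear(d_in, d_hid, bias=False)
    a_gat = nn.Parameter(torch.randn(2 * d_hid))

    def gat_score(h_i, h_j):
        return F.leaky_relu(torch.dot(a_gat, torch.cat([W_gat(h_i), W_gat(h_j)])))

    # GATv2: e(h_i, h_j) = a^T LeakyReLU(W [h_i || h_j]).
    # Applying 'a' after the nonlinearity lets every query attend differently.
    W_v2 = nn.Linear(2 * d_in, d_hid, bias=False)
    a_v2 = nn.Parameter(torch.randn(d_hid))

    def gatv2_score(h_i, h_j):
        return torch.dot(a_v2, F.leaky_relu(W_v2(torch.cat([h_i, h_j]))))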

Using GATv2

GATv2 is now available as part of the PyTorch Geometric library!

from torch_geometric.nn.conv.gatv2_conv import GATv2Conv

https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.GATv2Conv

and is also included in the main directory of this repository.
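A minimal usage sketch of the PyTorch Geometric layer (the toy graph and layer sizes here are illustrative):

    import torch
    from torch_geometric.nn import GATv2Conv

    conv = GATv2Conv(in_channels=16, out_channels=8, heads=4)

    x = torch.randn(5, 16)                    # 5 nodes, 16 features each
    edge_index = torch.tensor([[0, 1, 2, 3],  # source nodes
                               [1, 2, 3, 4]]) # target nodes
    out = conv(x, edge_index)                 # [5, 32]: heads are concatenated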

GATv2 is now available as part of the DGL library!

from dgl.nn.pytorch import GATv2Conv

https://docs.dgl.ai/en/latest/api/python/nn.pytorch.html#gatv2conv

and is also included in this repository.
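A corresponding sketch for the DGL layer (DGL's forward pass takes the graph explicitly; sizes are again illustrative):

    import dgl
    import torch
    from dgl.nn.pytorch import GATv2Conv

    # Self-loops avoid DGL's zero-in-degree error on the toy graph.
    g = dgl.add_self_loop(dgl.graph(([0, 1, 2, 3], [1, 2, 3, 4])))
    feat = torch.randn(5, 16)

    conv = GATv2Conv(in_feats=16, out_feats=8, num_heads=4)
    out = conv(g, feat)                       # [5, 4, 8]: one slice per head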

GATv2 is now available as part of Google's TensorFlow GNN library!

from tensorflow_gnn.graph.keras.layers.gat_v2 import GATv2Convolution

https://github.com/tensorflow/gnn/blob/main/tensorflow_gnn/docs/api_docs/python/gnn/keras/layers/GATv2.md

Code Structure

Since our experiments (Section 4) are based on different frameworks, this repository is divided into several sub-projects:

  1. The subdirectory arxiv_mag_products_collab_citation2_noise contains the needed files to reproduce the results of Node-Prediction, Link-Prediction, and Robustness to Noise (Tables 2a and 3, and Figure 4).
  2. The subdirectory proteins contains the needed files to reproduce the results of ogbn-proteins in Node-Prediction (Table 2b).
  3. The subdirectory dictionary_lookup contains the needed files to reproduce the results of the DictionaryLookup benchmark (Figure 3).
  4. The subdirectory tf-gnn-samples contains the needed files to reproduce the results of the VarMisuse and QM9 datasets (Table 1 and Table 4).

Requirements

Each subdirectory contains its own requirements and dependencies.

Generally, all subdirectories depend on PyTorch 1.7.1 and PyTorch Geometric version 1.7.0 (proteins depends on DGL version 0.6.0). The subdirectory tf-gnn-samples (VarMisuse and QM9) depends on TensorFlow 1.13.

Hardware

In general, all experiments can run on either a GPU or a CPU.

Citation

How Attentive are Graph Attention Networks?

@inproceedings{brody2022how,
  title={How Attentive are Graph Attention Networks?},
  author={Shaked Brody and Uri Alon and Eran Yahav},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=F72ximsx7C1}
}

More Repositories

  1. code2vec (Python, 1,067 stars): TensorFlow code for the neural network presented in the paper "code2vec: Learning Distributed Representations of Code".
  2. code2seq (Python, 533 stars): Code for the model presented in the paper "code2seq: Generating Sequences from Structured Representations of Code".
  3. RASP (Python, 256 stars): An interpreter for RASP, as described in the ICML 2021 paper "Thinking Like Transformers".
  4. Nero (Python, 187 stars): Code and resources for the paper "Neural Reverse Engineering of Stripped Binaries using Augmented Control Flow Graphs".
  5. bottleneck (Python, 90 stars): Code for the paper "On the Bottleneck of Graph Neural Networks and Its Practical Implications".
  6. slm-code-generation (Java, 81 stars): TensorFlow code for the neural network presented in the paper "Structural Language Models of Code" (ICML'2020).
  7. esh (C#, 73 stars): Statistical similarity of binaries (Esh).
  8. lstar_extraction (Jupyter Notebook, 69 stars): Implementation of the ICML 2018 paper "Extracting Automata from Recurrent Neural Networks Using Queries and Counterexamples".
  9. layer_norm_expressivity_role (Python, 34 stars): Code for the paper "On the Expressivity Role of LayerNorm in Transformers' Attention" (Findings of ACL'2023).
  10. c3po (Python, 25 stars): Code for the paper "A Structural Model for Contextual Code Changes".
  11. weighted_lstar (Python, 16 stars): Implementation for the NeurIPS 2019 paper "Learning Weighted Deterministic Automata from Queries and Counterexamples".
  12. adversarial-examples (Python, 15 stars): Code for the paper "Adversarial Examples for Models of Code".
  13. prime (Java, 14 stars).
  14. RASP-exps (Python, 14 stars): Code for running the transformers in the ICML 2021 paper "Thinking Like Transformers".
  15. safe (Java, 12 stars): SAFE static analysis tools.
  16. differential (C, 12 stars).
  17. counting_dimensions (Jupyter Notebook, 10 stars): Demonstration for our ACL 2018 paper "On the Practical Computational Power of Finite Precision RNNs for Language Recognition".
  18. id2vec (Python, 9 stars).
  19. RNN_to_PRS_CFG (Python, 9 stars): Implementation of the TACAS 2021 paper "Extrapolating CFGs from RNNs".
  20. atam (C, 3 stars): Example programs for ATAM.