  • Stars: 408
  • Rank: 105,946 (Top 3%)
  • Language: Python
  • License: MIT License
  • Created: over 6 years ago
  • Updated: over 5 years ago


Repository Details

GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Model

This repository is the official PyTorch implementation of GraphRNN, an auto-regressive generative model for graphs.

Jiaxuan You*, Rex Ying*, Xiang Ren, William L. Hamilton, Jure Leskovec, GraphRNN: Generating Realistic Graphs with Deep Auto-regressive Models (ICML 2018)
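For intuition, GraphRNN represents a graph as a sequence: nodes are ordered by a BFS traversal, and each node is described by the adjacency vector that connects it to the nodes generated before it; the graph distribution is then factorized auto-regressively over this sequence. The sketch below illustrates that sequence representation using plain networkx; it is independent of the repository's own code, and the helper name is illustrative.

import networkx as nx
import numpy as np

def graph_to_bfs_sequence(G, start=0):
    # Encode a graph as the sequence of adjacency vectors S_1, ..., S_{n-1}
    # under a BFS node ordering (the representation GraphRNN generates step by step).
    order = [start] + [v for _, v in nx.bfs_edges(G, start)]
    index = {node: i for i, node in enumerate(order)}
    sequence = []
    for i, node in enumerate(order[1:], start=1):
        s_i = np.zeros(i, dtype=np.int8)   # S_i marks edges to previously placed nodes
        for nbr in G.neighbors(node):
            if index[nbr] < i:
                s_i[index[nbr]] = 1
        sequence.append(s_i)
    return sequence

# Example: a 3x3 grid graph
G = nx.convert_node_labels_to_integers(nx.grid_2d_graph(3, 3))
for i, s in enumerate(graph_to_bfs_sequence(G), start=1):
    print(f"S_{i} = {s}")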

Installation

Install PyTorch following the instructions on the official website. The code has been tested with PyTorch 0.2.0 and 0.4.0.

conda install pytorch torchvision cuda90 -c pytorch

Then install the other dependencies.

pip install -r requirements.txt
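A quick way to confirm the environment is set up correctly (a minimal check, not part of the repository):

import torch
import networkx as nx

print(torch.__version__)          # expect 0.2.0 or 0.4.0 per the note above
print(torch.cuda.is_available())  # True if the CUDA build was installed correctly
print(nx.__version__)             # networkx is one of the dependencies used throughout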

Test run

python main.py

Code description

For the GraphRNN model: main.py is the main executable file, and specific arguments are set in args.py. train.py includes the training iterations and calls model.py and data.py. create_graphs.py is where the target graph datasets are prepared.

For baseline models:

  • The B-A (Barabási-Albert) and E-R (Erdős-Rényi) models are implemented in baselines/baseline_simple.py.
  • The Kronecker graph model is implemented in the SNAP software, which can be found at https://github.com/snap-stanford/snap/tree/master/examples/krongen (for generating Kronecker graphs) and https://github.com/snap-stanford/snap/tree/master/examples/kronfit (for learning the model parameters).
  • MMSB is implemented using the Edward library (http://edwardlib.org/) and is located in baselines.
  • The DeepGMG model is implemented in main_DeepGMG.py, following the description in its paper.
  • The GraphVAE model is implemented in baselines/graphvae, following the description in its paper.

Parameter setting: To adjust the hyper-parameters and input arguments of the model, modify the fields of args.py accordingly. For example, args.cuda controls which GPU is used to train the model, and args.graph_type specifies which dataset is used to train the generative model. See the documentation in args.py for more detailed descriptions of all fields.
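For instance, a run on a particular GPU and dataset might be configured as in the sketch below. Only the two fields mentioned above are shown, and 'grid' is an assumed example value for the dataset name; consult args.py for the actual field list and valid values.

# Sketch of the relevant fields inside args.py
class Args:
    def __init__(self):
        self.cuda = 0              # index of the GPU used for training
        self.graph_type = 'grid'   # dataset used to train the generative model (assumed value)
        # ... other fields documented in args.py ...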

Outputs

There are several different types of outputs, each saved into a different directory under a path prefix. The path prefix is set at args.dir_input. Suppose that this field is set to ./:

  • ./graphs contains the pickle files of the training, test, and generated graphs. Each pickle file contains a list of networkx objects (a loading sketch follows this list).
  • ./eval_results contains the MMD evaluation scores in txt format.
  • ./model_save stores the model checkpoints.
  • ./nll saves the log-likelihood for generated graphs as sequences.
  • ./figures is used to save visualizations (see Visualization of graphs section).
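A minimal sketch for inspecting one of these outputs, assuming the default ./ prefix; the pickle filename below is hypothetical, so list the ./graphs directory for the actual names:

import pickle

# Hypothetical filename; the real filenames encode the run configuration.
with open('./graphs/some_generated_graphs.dat', 'rb') as f:
    graph_list = pickle.load(f)   # a list of networkx graphs

print(len(graph_list), 'graphs loaded')
print(graph_list[0].number_of_nodes(), 'nodes in the first graph')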

Evaluation

The evaluation is done in evaluate.py, where the user can choose which settings to evaluate. To evaluate how close the generated graphs are to the ground-truth set, we use MMD (maximum mean discrepancy) to measure the divergence between distributions computed on the ground-truth and generated graphs. Three types of distributions are used: the degree distribution, the clustering coefficient distribution, and the orbit count distribution (described below). The first two are implemented in eval/stats.py, using the multiprocessing Python module. One can easily extend the evaluation to compute MMD over other graph statistics.
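For reference, a simplified MMD computation between two sets of degree histograms with a Gaussian kernel could look like the following. This is only an illustration of the metric; the actual implementation in eval/stats.py differs (for example, in the choice of kernel and the parallelization).

import numpy as np
import networkx as nx

def degree_histogram(G, max_degree=20):
    # Normalized degree histogram, used as the per-graph sample
    hist = np.bincount([d for _, d in G.degree()], minlength=max_degree + 1)[:max_degree + 1]
    return hist / max(hist.sum(), 1)

def gaussian_mmd2(X, Y, sigma=1.0):
    # Squared MMD with a Gaussian kernel: E[k(x,x')] + E[k(y,y')] - 2 E[k(x,y)]
    def kernel(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    return kernel(X, X).mean() + kernel(Y, Y).mean() - 2 * kernel(X, Y).mean()

real = np.stack([degree_histogram(nx.gnp_random_graph(30, 0.2)) for _ in range(16)])
fake = np.stack([degree_histogram(nx.barabasi_albert_graph(30, 3)) for _ in range(16)])
print('MMD^2 between degree distributions:', gaussian_mmd2(real, fake))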

We also compute the orbit counts for each graph, represented as a high-dimensional data point. We then compute the MMD between the two sets of sampled points using ORCA (see http://www.biolab.si/supp/orca/orca.html) at eval/orca. One first needs to compile ORCA by

g++ -O2 -std=c++11 -o orca orca.cpp

in the directory eval/orca (the binary file already included in the repository works on Ubuntu).

To evaluate, run

python evaluate.py

Arguments specific to evaluation are specified in the class evaluate.Args_evaluate. Note that the field Args_evaluate.dataset_name_all must only contain datasets that have already been trained, by setting args.graph_type to each of those datasets and running python main.py.
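A hedged example of restricting the evaluation to a single trained dataset; the class and field names come from the note above, while the value 'grid' is only an assumed dataset name:

from evaluate import Args_evaluate

eval_args = Args_evaluate()
eval_args.dataset_name_all = ['grid']  # must be a dataset already trained via main.py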

Visualization of graphs

The training, test, and generated graphs are saved at 'graphs/'. One can visualize a generated graph using the function utils.load_graph_list, which loads the list of graphs from a pickle file, and util.draw_graph_list, which plots the graphs using networkx.
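A hedged sketch of this workflow; the pickle filename and the draw_graph_list arguments are assumptions, and whether draw_graph_list lives in utils or util should be checked against the repository:

import utils

# Hypothetical filename; pick an actual pickle file from graphs/.
graph_list = utils.load_graph_list('graphs/some_generated_graphs.dat')

# Plot a handful of the loaded graphs; argument names are illustrative.
utils.draw_graph_list(graph_list[:16], row=4, col=4, fname='figures/preview')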

Misc

Jesse Bettencourt and Harris Chan made a great set of slides introducing GraphRNN in Prof. David Duvenaud's seminar course, Learning Discrete Latent Structure.

More Repositories

  • snap (C++, 2,167 stars): Stanford Network Analysis Platform (SNAP) is a general purpose network analysis and graph mining library.
  • ogb (Python, 1,906 stars): Benchmark datasets, data loaders, and evaluators for graph machine learning.
  • GraphGym (Python, 1,669 stars): Platform for designing and evaluating Graph Neural Networks (GNN).
  • pretrain-gnns (Python, 955 stars): Strategies for Pre-training Graph Neural Networks.
  • deepsnap (Python, 546 stars): Python library that assists deep learning on graphs.
  • med-flamingo (Python, 375 stars)
  • neural-subgraph-learning-GNN (Jupyter Notebook, 327 stars)
  • stark (Python, 297 stars): STaRK: Benchmarking LLM Retrieval on Textual and Relational Knowledge Bases (NeurIPS D&B 2024).
  • snap-python (C++, 294 stars): SNAP Python code, SWIG related files.
  • cs224w-notes (CSS, 292 stars): CS224W Course Notes.
  • KGReasoning (Python, 274 stars): Multi-Hop Logical Reasoning in Knowledge Graphs.
  • GreaseLM (Python, 229 stars): [ICLR 2022 spotlight] GreaseLM: Graph REASoning Enhanced Language Models for Question Answering.
  • MLAgentBench (Python, 224 stars)
  • relbench (Python, 193 stars): RelBench: Relational Deep Learning Benchmark.
  • GEARS (Python, 189 stars): A geometric deep learning model that predicts outcomes of novel multi-gene perturbations.
  • distance-encoding (Jupyter Notebook, 181 stars): Distance Encoding for GNN Design.
  • graphwave (Jupyter Notebook, 169 stars)
  • UCE (Python, 158 stars): A zero-shot foundation model for single-cell gene expression data.
  • covid-mobility (Jupyter Notebook, 148 stars)
  • roland (Jupyter Notebook, 125 stars)
  • GIB (Jupyter Notebook, 123 stars): Graph Information Bottleneck (GIB) for learning minimal sufficient structural and feature information using GNNs.
  • mars (Jupyter Notebook, 119 stars): Discovering novel cell types across heterogeneous single-cell experiments.
  • comet (Python, 111 stars): [ICLR 2021] Concept Learners for Few-Shot Learning.
  • SATURN (Jupyter Notebook, 103 stars)
  • orca (Python, 85 stars): [ICLR 2022] Open-World Semi-Supervised Learning.
  • prodigy (Python, 75 stars)
  • CAW (Python, 72 stars)
  • snapvx (Python, 65 stars)
  • conformalized-gnn (Python, 64 stars): Uncertainty Quantification over Graph with Conformalized Graph Neural Networks (NeurIPS 2023).
  • multiscale-interactome (Python, 62 stars)
  • plato (Python, 61 stars)
  • miner-data (Python, 60 stars)
  • stellar (Jupyter Notebook, 58 stars)
  • mambo (Jupyter Notebook, 37 stars)
  • lamp (Python, 36 stars): [ICLR 2023] First deep learning-based surrogate model that jointly learns the evolution model and optimizes computational cost via remeshing.
  • crust (Python, 33 stars): [NeurIPS 2020] Coresets for Robust Training of Neural Networks against Noisy Labels.
  • bc-emb (Python, 32 stars)
  • csr (Python, 30 stars)
  • zeroc (Jupyter Notebook, 28 stars): ZeroC is a neuro-symbolic method that, trained with elementary visual concepts and relations, can zero-shot recognize and acquire more complex, hierarchical concepts, even across domains.
  • masa (Python, 24 stars): Motif-Aware State Assignment in Noisy Time Series Data.
  • le_pde (Jupyter Notebook, 21 stars): LE-PDE accelerates the forward simulation and inverse optimization of PDEs via latent global evolution, achieving significant speedup with SOTA accuracy.
  • ConE (Python, 20 stars)
  • BioDiscoveryAgent (Python, 19 stars): An LLM-based AI agent for closed-loop design of genetic perturbation experiments.
  • F-FADE (Python, 17 stars)
  • MetroMaps (Python, 16 stars): MetroMaps Release.
  • MAG (Python, 16 stars): Programs for Microsoft Academic Graph.
  • snap-dev (C++, 14 stars): SNAP repository for Ringo.
  • exposure-segregation (Python, 13 stars)
  • ringo (Python, 12 stars): Next generation graph processing platform.
  • planet (Python, 12 stars): PlaNet: Predicting population response to drugs via clinical knowledge graph.
  • covid-mobility-tool (Jupyter Notebook, 10 stars)
  • llm-social-network (Jupyter Notebook, 10 stars)
  • reddit-processing (Python, 7 stars): Preprocessing of Reddit data.
  • ViRel (Python, 7 stars): ViRel: Unsupervised Visual Relations Discovery with Graph-level Analogy.
  • news-search (Java, 7 stars): Search Internet news archive.
  • snap-python-64 (C++, 6 stars)
  • snap-dev-64 (C++, 6 stars): 64-bit SNAP (in development, not intended for general use).
  • snapworld (Python, 6 stars)
  • lego (5 stars)
  • yperf (Python, 4 stars): Simple performance monitor for Linux.
  • pebble-fit (C, 4 stars): Become less sedentary with Pebble.
  • dec2vec (Python, 3 stars)
  • caml (Python, 3 stars)
  • SnapTimeTF (Python, 2 stars)
  • covid-spillovers (Jupyter Notebook, 2 stars)
  • curis-2012 (JavaScript, 2 stars): Summer 2012 Curis Project.
  • snaptime (Python, 2 stars)
  • GNN-reading-group (1 star)
  • supply-chains (Jupyter Notebook, 1 star)
  • relbench-user-study (Python, 1 star)
  • AutoTransfer (Python, 1 star)
  • hash (C++, 1 star)