• Stars: 291
  • Rank: 142,563 (Top 3%)
  • Language: HTML
  • License: MIT License
  • Created: about 7 years ago
  • Updated: about 7 years ago


Repository Details

PyTorch implementation of spectral graph ConvNets, NeurIPS’16

Graph ConvNets in PyTorch

October 15, 2017

Xavier Bresson

http://www.ntu.edu.sg/home/xbresson
https://github.com/xbresson
https://twitter.com/xbresson

Description

Prototype implementation in PyTorch of the NIPS'16 paper:
Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering
M Defferrard, X Bresson, P Vandergheynst
Advances in Neural Information Processing Systems, 3844-3852, 2016
ArXiv preprint: arXiv:1606.09375

Code objective

The code provides a simple example of graph ConvNets for the MNIST classification task.
The graph is an 8-nearest-neighbor graph of a 2D grid.
The signals on the graph are the MNIST images, vectorized as $28^2 \times 1$ vectors.
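The graph construction above can be sketched in a few lines of numpy. This is an illustrative sketch, not the repository's own grid-graph code; the function name `knn_grid_graph` and the Gaussian weighting scheme are my assumptions.

```python
import numpy as np

# Hypothetical sketch: build the k-nearest-neighbor graph of a 2D pixel
# grid (k = 8 for the 28x28 MNIST grid described above).
def knn_grid_graph(side=28, k=8):
    # (i, j) coordinates of every pixel in the side x side grid
    coords = np.array([(i, j) for i in range(side) for j in range(side)], dtype=float)
    # pairwise Euclidean distances between pixels
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    n = side * side
    idx = np.argsort(d, axis=1)[:, 1:k + 1]  # k nearest neighbors, self excluded
    # Gaussian edge weights with a data-dependent width (an assumed choice)
    sigma2 = np.mean(np.take_along_axis(d, idx, axis=1)[:, -1]) ** 2
    rows = np.repeat(np.arange(n), k)
    W = np.zeros((n, n))
    W[rows, idx.ravel()] = np.exp(-d[rows, idx.ravel()] ** 2 / sigma2)
    return np.maximum(W, W.T)                # symmetrize the adjacency

W = knn_grid_graph(side=28, k=8)
# One MNIST image becomes one signal on this graph: a 28^2 x 1 vector.
x = np.random.rand(28, 28).reshape(-1, 1)    # stand-in for one image
```

Symmetrizing with `max(W, W.T)` keeps an edge whenever either endpoint counts the other among its k nearest neighbors, which is one common convention for k-NN graphs.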

Installation

git clone https://github.com/xbresson/graph_convnets_pytorch.git
cd graph_convnets_pytorch
pip install -r requirements.txt # installation for python 3.6.2
python check_install.py
jupyter notebook # run the 2 notebooks

Results

GPU Quadro M4000

  • Standard ConvNets: 01_standard_convnet_lenet5_mnist_pytorch.ipynb, accuracy = 99.31%, speed = 6.9 sec/epoch.
  • Graph ConvNets: 02_graph_convnet_lenet5_mnist_pytorch.ipynb, accuracy = 99.19%, speed = 100.8 sec/epoch.

Note

PyTorch has not yet implemented torch.mm(sparse, dense) for variables (pytorch/pytorch#2389). It will certainly be implemented, but in the meantime I defined a new autograd function for sparse variables, called "my_sparse_mm", by subclassing torch.autograd.Function and implementing the forward and backward passes.

class my_sparse_mm(torch.autograd.Function):
    """
    Autograd function computing torch.mm(W, x) where W is a sparse
    variable, with hand-written forward and backward passes.
    """

    def forward(self, W, x):  # W is SPARSE
        self.save_for_backward(W, x)
        y = torch.mm(W, x)
        return y

    def backward(self, grad_output):
        W, x = self.saved_tensors
        grad_input = grad_output.clone()
        grad_input_dL_dW = torch.mm(grad_input, x.t())  # dL/dW = g x^T
        grad_input_dL_dx = torch.mm(W.t(), grad_input)  # dL/dx = W^T g
        return grad_input_dL_dW, grad_input_dL_dx
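The class above uses the legacy autograd interface of early PyTorch releases. For reference, the same operation can be sketched with the modern API (PyTorch >= 0.4), where forward and backward are static methods receiving a context object; the class name `SparseMM` and the skip-unneeded-gradients pattern are my additions, not code from the repository.

```python
import torch

# Assumed modernization of my_sparse_mm for the current autograd API.
class SparseMM(torch.autograd.Function):
    @staticmethod
    def forward(ctx, W, x):          # W is a sparse tensor, x is dense
        ctx.save_for_backward(W, x)
        return torch.mm(W, x)

    @staticmethod
    def backward(ctx, grad_output):
        W, x = ctx.saved_tensors
        grad_W = grad_x = None
        if ctx.needs_input_grad[0]:
            grad_W = torch.mm(grad_output, x.t())   # dL/dW = g x^T
        if ctx.needs_input_grad[1]:
            grad_x = torch.mm(W.t(), grad_output)   # dL/dx = W^T g
        return grad_W, grad_x

# Usage: apply() replaces direct instantiation in the modern API.
W = torch.eye(3).to_sparse()
x = torch.randn(3, 2, requires_grad=True)
y = SparseMM.apply(W, x)
```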

When to use this algorithm?

Any problem that can be cast as analyzing a set of signals on a fixed graph, when you want to use ConvNets for the analysis.
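Concretely, the localized spectral filtering of the NIPS'16 paper evaluates a Chebyshev polynomial of the rescaled graph Laplacian on each signal, $g_\theta(L)\,x \approx \sum_k \theta_k T_k(\tilde{L})\,x$ with $\tilde{L} = 2L/\lambda_{\max} - I$. A minimal numpy sketch of that recurrence, on an illustrative toy graph (the function name, graph, and coefficients are my assumptions, not the repository's code):

```python
import numpy as np

# Sketch of Chebyshev spectral filtering: T_0(x) = x, T_1(x) = L_tilde x,
# T_k(x) = 2 L_tilde T_{k-1}(x) - T_{k-2}(x); output is sum_k theta_k T_k(x).
def chebyshev_filter(L, x, theta):
    n = L.shape[0]
    lam_max = np.linalg.eigvalsh(L).max()
    L_tilde = 2.0 * L / lam_max - np.eye(n)   # rescale spectrum into [-1, 1]
    Tx_prev, Tx = x, L_tilde @ x              # T_0 x and T_1 x
    y = theta[0] * Tx_prev + theta[1] * Tx
    for k in range(2, len(theta)):
        Tx_prev, Tx = Tx, 2.0 * L_tilde @ Tx - Tx_prev   # Chebyshev recurrence
        y = y + theta[k] * Tx
    return y

# Toy example: path graph on 4 nodes with its combinatorial Laplacian.
W = np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
L = np.diag(W.sum(1)) - W
x = np.arange(4, dtype=float).reshape(-1, 1)  # one signal on the graph
y = chebyshev_filter(L, x, theta=[0.5, 0.3, 0.2])
```

Because $T_k(\tilde{L})$ is a degree-$k$ polynomial in the Laplacian, the filter only mixes values within $k$ hops of each node, which is what makes the filtering localized on the graph.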




More Repositories

 1. GML2023: Graph Machine Learning course, Xavier Bresson, 2023 (Jupyter Notebook, 558 stars)
 2. CS6208_2023: Advanced Topics in Artificial Intelligence, NUS CS6208, 2023 (Jupyter Notebook, 307 stars)
 3. CE7454_2019: Deep learning course CE7454, 2019 (Jupyter Notebook, 189 stars)
 4. CS5242_2021: Neural Networks and Deep Learning, NUS CS5242, 2021 (Jupyter Notebook, 180 stars)
 5. TSP_Transformer: Code for TSP Transformer (Jupyter Notebook, 162 stars)
 6. CS4243_2022: Computer Vision and Pattern Recognition, NUS CS4243, 2022 (Jupyter Notebook, 161 stars)
 7. spatial_graph_convnets: PyTorch implementation of residual gated graph ConvNets, ICLR'18 (Jupyter Notebook, 121 stars)
 8. CE7454_2018: Deep learning course CE7454, 2018 (Jupyter Notebook, 76 stars)
 9. CE9010_2018: Python notebooks and slides for CE9010: Introduction to Data Science, Semester 2 2017/18 (Jupyter Notebook, 52 stars)
10. CS5284_2024: NUS CS5284 Graph Machine Learning course, Xavier Bresson, 2024 (Jupyter Notebook, 51 stars)
11. CE7454_2020: Deep learning course CE7454, 2020 (Jupyter Notebook, 28 stars)
12. IPAM_Tutorial_2019: Notebooks for IPAM Tutorial, March 15 2019 (Jupyter Notebook, 24 stars)
13. AI6103_2020: Master of AI, Deep learning course AI6103, 2020 (Jupyter Notebook, 22 stars)
14. old_codes (HTML, 15 stars)
15. CE9010_2019: CE9010 Introduction to Data Analysis, 2019 (Jupyter Notebook, 11 stars)
16. Long_Tailed_Learning_Requires_Feature_Learning: Repository for ICLR'23 Long-tailed Learning Requires Feature Learning (Python, 10 stars)
17. CE9010_2020: CE9010 Introduction to Data Analysis, 2020 (Jupyter Notebook, 7 stars)
18. feature_collapse: Code for "Feature Collapse" (Jupyter Notebook, 6 stars)
19. CE9010_2021: CE9010 Introduction to Data Analysis, 2021 (Jupyter Notebook, 5 stars)
20. pcut: Code for Product Cut clustering technique, NIPS'16 (Jupyter Notebook, 4 stars)
21. Teaching_Resources: Xavier Bresson's Teaching Resources (HTML, 2 stars)
22. demo_pytorch_dqn__with_4_observations (Jupyter Notebook, 2 stars)