• Stars: 194
• Rank: 200,219 (Top 4%)
• Language: Python
• License: Apache License 2.0
• Created: over 6 years ago
• Updated: about 4 years ago


Repository Details

Deep neural network kernel for Gaussian process

NNGP: Deep Neural Network Kernel for Gaussian Process

TensorFlow open source implementation of

Deep Neural Networks as Gaussian Processes

by Jaehoon Lee*, Yasaman Bahri*, Roman Novak, Sam Schoenholz, Jeffrey Pennington, and Jascha Sohl-Dickstein.

Presented at the International Conference on Learning Representations (ICLR) 2018.

UPDATE (September 2020):

See also Neural Tangents: Fast and Easy Infinite Neural Networks in Python (ICLR 2020), available at github.com/google/neural-tangents, for more up-to-date progress on computing the NNGP kernel, as well as NT kernels, with support for a wide variety of architectural components.
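For reference, that library computes the same fully connected ReLU NNGP kernel in a few lines. The following is a minimal sketch based on the Neural Tangents README (the exact API may differ between versions); the architecture and input shapes are chosen purely for illustration:

from jax import random
from neural_tangents import stax

# Infinite-width, fully connected ReLU network (architecture is illustrative).
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

x1 = random.normal(random.PRNGKey(1), (10, 784))
x2 = random.normal(random.PRNGKey(2), (20, 784))

# 'nngp' selects the NNGP kernel; 'ntk' would give the Neural Tangent Kernel instead.
k = kernel_fn(x1, x2, 'nngp')  # (10, 20) covariance matrix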

Overview

A deep neural network with i.i.d. priors over its parameters is equivalent to a Gaussian process in the limit of infinite network width. The Neural Network Gaussian Process (NNGP) is fully described by a covariance kernel determined by the corresponding architecture.

This code constructs the covariance kernel for the Gaussian process that is equivalent to an infinitely wide, fully connected, deep neural network.
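As a concrete illustration (not the repository's implementation), the kernel for a fully connected ReLU network can be built layer by layer using the standard arc-cosine closed form. The NumPy sketch below works on a pair of 1-D input vectors and reuses the weight_var and bias_var values from the example command in the Usage section:

import numpy as np

def nngp_kernel_relu(x1, x2, depth=3, weight_var=1.79, bias_var=0.83):
    """Minimal sketch of the NNGP kernel recursion for a fully connected ReLU network.

    x1, x2: 1-D input vectors of the same dimension.
    Returns the scalar covariance K^depth(x1, x2).
    """
    d = x1.shape[-1]
    # Input-layer covariances K^0.
    k12 = bias_var + weight_var * np.dot(x1, x2) / d
    k11 = bias_var + weight_var * np.dot(x1, x1) / d
    k22 = bias_var + weight_var * np.dot(x2, x2) / d
    for _ in range(depth):
        norm = np.sqrt(k11 * k22)
        cos_theta = np.clip(k12 / norm, -1.0, 1.0)
        theta = np.arccos(cos_theta)
        # Arc-cosine closed form for E[relu(u) relu(v)], (u, v) ~ N(0, K^{l-1}).
        k12 = bias_var + weight_var * norm * (
            np.sin(theta) + (np.pi - theta) * cos_theta) / (2.0 * np.pi)
        # E[relu(u)^2] = K^{l-1}(x, x) / 2 for a centered Gaussian u.
        k11 = bias_var + weight_var * k11 / 2.0
        k22 = bias_var + weight_var * k22 / 2.0
    return k12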

Usage

To use the code, run run_experiments.py, which uses the NNGP kernel to make fully Bayesian predictions on the MNIST dataset.

python run_experiments.py \
       --num_train=100 \
       --num_eval=10000 \
       --hparams='nonlinearity=relu,depth=100,weight_var=1.79,bias_var=0.83'
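The Bayesian prediction itself is ordinary Gaussian process regression with the NNGP kernel matrices. A minimal sketch of that step, assuming the kernel blocks have already been computed (function and argument names here are illustrative, not the script's API):

import numpy as np

def gp_posterior_mean(k_train_train, k_test_train, y_train, noise_var=1e-4):
    """Posterior mean of GP regression given precomputed NNGP kernel blocks.

    k_train_train: (n, n) kernel between training inputs.
    k_test_train:  (m, n) kernel between test and training inputs.
    y_train:       (n, c) regression targets, e.g. zero-mean one-hot labels.
    """
    n = k_train_train.shape[0]
    # Add observation noise / jitter for numerical stability before solving.
    k_reg = k_train_train + noise_var * np.eye(n)
    alpha = np.linalg.solve(k_reg, y_train)
    return k_test_train @ alpha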

Contact

Code authors: Jaehoon Lee, Yasaman Bahri, Roman Novak

Pull requests and issues: @jaehlee

Citation

If you use this code, please cite our paper:

  @article{lee2018deep,
    title={Deep Neural Networks as Gaussian Processes},
    author={Jaehoon Lee and Yasaman Bahri and Roman Novak and Sam Schoenholz and Jeffrey Pennington and Jascha Sohl-Dickstein},
    journal={International Conference on Learning Representations},
    year={2018},
    url={https://openreview.net/forum?id=B1EA-M-0Z},
  }

Note

This is not an official Google product.

More Repositories

1. self-attention-gan (Python, 976 stars)
2. realistic-ssl-evaluation (Python, 452 stars): Open source release of the evaluation benchmark suite described in "Realistic Evaluation of Deep Semi-Supervised Learning Algorithms"
3. guided-evolutionary-strategies (Jupyter Notebook, 263 stars): Guided Evolutionary Strategies
4. acai (Python, 240 stars): Code for "Understanding and Improving Interpolation in Autoencoders via an Adversarial Regularizer"
5. mpnn (Python, 220 stars): Open source implementation of "Neural Message Passing for Quantum Chemistry"
6. tensorfuzz (Python, 204 stars): A library for performing coverage guided fuzzing of neural networks
7. l2hmc (Jupyter Notebook, 180 stars): TensorFlow implementation for training MCMC samplers from the paper "Generalizing Hamiltonian Monte Carlo with Neural Network"
8. deep-molecular-massspec (Python, 110 stars): Mass Spectrometry for Small Molecules using Deep Learning
9. long-term-video-prediction-without-supervision (Python, 91 stars): Implementation of Hierarchical Long-term Video Prediction without Supervision
10. data-linter (Python, 84 stars): The Data Linter identifies potential issues (lints) in your ML training data.
11. conv-sv (Python, 71 stars): The Singular Values of Convolutional Layers
12. ncp (Python, 63 stars): Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors
13. mean-field-cnns (Jupyter Notebook, 35 stars)
14. mirage-rl (Python, 17 stars): Code to reproduce the experiments in "The Mirage of Action-Dependent Baselines in Reinforcement Learning"
15. LeaveNoTrace (Python, 15 stars): Leave No Trace is an algorithm for safe reinforcement learning.
16. fisher-rao-regularization (Python, 10 stars)
17. wip-lambada-lm (Python, 9 stars): LSTM language model on LAMBADA dataset
18. hyperbolictext (Python, 8 stars): TensorFlow source code for learning embeddings of text sequences in an unsupervised manner.
19. wip-constrained-extractor (Python, 6 stars): Work in progress inference, learning, and evaluation code for extractive summarization.
20. flying-shapes (Python, 4 stars): A potentially infinite dataset of coloured shapes which bounce around on a black background.
21. metaq (Python, 3 stars)