
Self-Normalizing Networks

Tutorials and implementations for "Self-Normalizing Networks" (SNNs) as suggested by Klambauer et al. (arXiv preprint).

Versions

  • See the environment file for the full list of prerequisites. The tutorial implementations use TensorFlow >= 2.0 (Keras) or PyTorch, but versions for TensorFlow 1.x users, based on the deprecated tf.contrib module (with a separate environment file), are also available.

Note for TensorFlow >= 1.4 users

TensorFlow >= 1.4 already ships the functions tf.nn.selu and tf.contrib.nn.alpha_dropout, which implement the SELU activation function and the suggested dropout variant.
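For example, a minimal TF 1.x sketch of one SELU layer with alpha dropout (the layer sizes, variable names, and keep probability are illustrative, not from the repository):

```python
# Minimal TensorFlow 1.x sketch (assumes tensorflow>=1.4,<2.0).
import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784])
keep_prob = tf.placeholder(tf.float32)

# LeCun-normal initialization (fan_in scaling with gain 1), as required for SNNs.
w = tf.get_variable("w", [784, 256],
                    initializer=tf.variance_scaling_initializer(scale=1.0, mode="fan_in"))
b = tf.get_variable("b", [256], initializer=tf.zeros_initializer())

h = tf.nn.selu(tf.matmul(x, w) + b)            # SELU activation, built in since TF 1.4
h = tf.contrib.nn.alpha_dropout(h, keep_prob)  # dropout variant that preserves self-normalization
```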

Note for TensorFlow >= 2.0 users

TensorFlow 2.3 already provides the SELU activation in the high-level Keras API as tf.keras.activations.selu. It must be combined with the initializer tf.keras.initializers.LecunNormal; the corresponding dropout variant is tf.keras.layers.AlphaDropout.
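For example, a minimal Keras sketch (layer sizes and dropout rate are illustrative):

```python
# Minimal TensorFlow 2.x / Keras sketch: SELU + LeCun-normal init + alpha dropout.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="selu",
                          kernel_initializer=tf.keras.initializers.LecunNormal(),
                          input_shape=(784,)),
    tf.keras.layers.AlphaDropout(0.05),  # preserves zero mean / unit variance
    tf.keras.layers.Dense(256, activation="selu",
                          kernel_initializer=tf.keras.initializers.LecunNormal()),
    tf.keras.layers.AlphaDropout(0.05),
    tf.keras.layers.Dense(10, activation="softmax"),
])
```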

Note for PyTorch users

PyTorch versions >= 0.2 feature torch.nn.SELU and torch.nn.AlphaDropout. They must be combined with the correct initializer, namely torch.nn.init.kaiming_normal_(parameter, mode='fan_in', nonlinearity='linear'), as this is identical to LeCun initialisation (mode='fan_in') with a gain of 1 (nonlinearity='linear').
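A minimal PyTorch sketch (layer sizes and dropout rate are illustrative):

```python
# Minimal PyTorch sketch: SELU + AlphaDropout, with weights initialized via
# kaiming_normal_(mode='fan_in', nonlinearity='linear'), i.e. LeCun normal.
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256), nn.SELU(), nn.AlphaDropout(p=0.05),
    nn.Linear(256, 256), nn.SELU(), nn.AlphaDropout(p=0.05),
    nn.Linear(256, 10),
)

for m in model.modules():
    if isinstance(m, nn.Linear):
        nn.init.kaiming_normal_(m.weight, mode="fan_in", nonlinearity="linear")
        nn.init.zeros_(m.bias)
```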

Tutorials

TensorFlow 1.x

  • Multilayer Perceptron on MNIST (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

TensorFlow 2.x (Keras)

  • Multilayer Perceptron on MNIST (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

PyTorch

  • Multilayer Perceptron on MNIST (notebook)
  • Convolutional Neural Network on MNIST (notebook)
  • Convolutional Neural Network on CIFAR10 (notebook)

Further material

Design novel SELU functions (TensorFlow 1.x)

  • How to obtain the SELU parameters alpha and lambda for arbitrary fixed points (notebook)
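As a rough numeric illustration of the idea (this is a sketch, not the notebook's code): requiring that zero mean and unit variance be a fixed point of the activation's mean/variance map for unit-Gaussian inputs pins down alpha and lambda, which can be solved for directly:

```python
# Numeric sketch (not from the repository): recover the standard SELU
# parameters by demanding that a unit-Gaussian input keeps zero mean and
# unit variance after the activation.
import numpy as np
from scipy.integrate import quad
from scipy.optimize import fsolve

def selu(z, alpha, lam):
    return lam * z if z > 0 else lam * alpha * np.expm1(z)

def gauss(z):
    return np.exp(-0.5 * z * z) / np.sqrt(2.0 * np.pi)

def residual(params):
    alpha, lam = params
    mean = quad(lambda z: selu(z, alpha, lam) * gauss(z), -np.inf, np.inf)[0]
    second = quad(lambda z: selu(z, alpha, lam) ** 2 * gauss(z), -np.inf, np.inf)[0]
    return [mean, second - mean ** 2 - 1.0]  # want mean = 0, variance = 1

alpha, lam = fsolve(residual, x0=[1.5, 1.0])
print(alpha, lam)  # approx. 1.6733 and 1.0507
```

Other fixed points (the subject of the notebook) follow by integrating against a Gaussian with the desired mean and variance instead.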

Basic Python functions to implement SNNs (TensorFlow 1.x)

are provided as code chunks here: selu.py
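For orientation, here is the core activation in plain NumPy with the constants from the paper (a sketch; the repository's file targets TF 1.x tensors):

```python
# Plain-NumPy sketch of the SELU activation with the paper's constants.
import numpy as np

ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    # scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise
    return SCALE * np.where(x > 0.0, x, ALPHA * np.expm1(x))
```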

Notebooks and code to produce Figure 1 (TensorFlow 1.x)

are provided here: Figure1; these build on top of the biutils package.

Calculations and numeric checks of the theorems (Mathematica)

are provided as Mathematica notebooks here:

UCI, Tox21 and HTRU2 data sets