
keras-resnet


Residual networks implementation using the Keras-1.0 functional API. It works with both the Theano and TensorFlow backends and with 'th'/'tf' image dim ordering.
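The image dim ordering can be set in ~/.keras/keras.json or switched at runtime. A minimal sketch, assuming Keras 1.x where keras.backend exposes set_image_dim_ordering:

```python
from keras import backend as K

# 'th' expects (channels, rows, cols); 'tf' expects (rows, cols, channels).
K.set_image_dim_ordering('th')
print(K.image_dim_ordering())  # -> 'th'
```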

The original articles

  • Deep Residual Learning for Image Recognition (the original ResNet paper)
  • Identity Mappings in Deep Residual Networks (the improved, pre-activation scheme)

Residual blocks

The residual blocks are based on the improved scheme proposed in Identity Mappings in Deep Residual Networks, as shown in figure (b).

Figure: residual block scheme

Both bottleneck and basic residual blocks are supported. To switch between them, simply provide the desired block function to the builder.
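As a rough illustration (not the repo's exact code), a pre-activation basic block in the Keras 1.x functional API could look like the sketch below; the helper name and init choice are assumptions:

```python
from keras.layers import Activation, BatchNormalization, Convolution2D, merge

def basic_block(nb_filter):
    """Pre-activation basic block (scheme (b)): BN -> ReLU -> 3x3 conv, twice."""
    def f(input):
        x = BatchNormalization(axis=1)(input)   # axis=1 assumes 'th' dim ordering
        x = Activation('relu')(x)
        x = Convolution2D(nb_filter, 3, 3, border_mode='same',
                          init='he_normal')(x)
        x = BatchNormalization(axis=1)(x)
        x = Activation('relu')(x)
        residual = Convolution2D(nb_filter, 3, 3, border_mode='same',
                                 init='he_normal')(x)
        # Identity shortcut: assumes the input already has nb_filter channels
        # and the same spatial size; otherwise a 1x1 conv shortcut is needed
        # (see the _shortcut sketch in the walkthrough below).
        return merge([input, residual], mode='sum')
    return f
```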

Code Walkthrough

The architecture is based on the 50-layer sample (snippet from the paper).

Figure: architecture reference table (snippet from the paper)

There are two key aspects to note here:

  1. conv2_1 has a stride of (1, 1), while the remaining conv layers have a stride of (2, 2) at the beginning of the block. This is expressed in the corresponding lines of the builder code (see the stride selection in the sketch after this list).
  2. At the end of the first skip connection of a block, there is a mismatch in num_filters, width, and height at the merge layer. This is addressed in _shortcut by using a 1x1 conv with an appropriate stride. For the remaining cases, the input is merged directly with the residual block as an identity.
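A rough sketch of both points, under the same Keras 1.x assumptions as the block sketch above (the shape arithmetic via _keras_shape and the helper names are illustrative, not the repo's exact code):

```python
from keras.layers import Convolution2D, merge

def _shortcut(input, residual):
    """Merge input and residual, inserting a 1x1 conv when shapes differ."""
    # Assumes 'th' dim ordering: _keras_shape is (samples, channels, rows, cols).
    stride_row = input._keras_shape[2] // residual._keras_shape[2]
    stride_col = input._keras_shape[3] // residual._keras_shape[3]
    equal_channels = input._keras_shape[1] == residual._keras_shape[1]

    shortcut = input
    if stride_row > 1 or stride_col > 1 or not equal_channels:
        # A 1x1 conv with a matching stride fixes both the filter count
        # and the spatial size before the merge.
        shortcut = Convolution2D(residual._keras_shape[1], 1, 1,
                                 subsample=(stride_row, stride_col),
                                 border_mode='valid', init='he_normal')(input)
    return merge([shortcut, residual], mode='sum')

# Point 1: the first stage (conv2_x) keeps stride (1, 1); later stages
# downsample with stride (2, 2) in their first block.
def initial_strides(stage_index):
    return (1, 1) if stage_index == 0 else (2, 2)
```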

ResNetBuilder factory

  • Use the ResNetBuilder build methods to build standard ResNet architectures with your own input shape. They automatically calculate the padding and the final pooling layer size for you.
  • Use the generic build method to set up your own architecture (see the usage sketch below).
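For example, a minimal usage sketch; the import path, class spelling, and method signatures here are assumptions based on the bullets above, not a documented API:

```python
from resnet import ResNetBuilder  # import path and class spelling assumed

# Standard ResNet-18 for a 10-class problem; the input shape is given as
# (channels, rows, cols) under 'th' dim ordering.
model = ResNetBuilder.build_resnet_18((3, 224, 224), 10)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
model.summary()

# The generic build method would take the block function and the per-stage
# repetition counts, e.g. (assumed signature):
# model = ResNetBuilder.build((3, 224, 224), 10, basic_block, [2, 2, 2, 2])
```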

Cifar10 Example

Includes a cifar10 training example, which achieves ~86% accuracy with the ResNet18 model.

Figure: cifar10 training convergence plot

Note that ResNet18 as implemented doesn't really seem appropriate for CIFAR-10: with the stride-based downsampling, the feature maps in the last two residual stages shrink to the point where their convolutions effectively degenerate to 1x1. This gets worse for deeper versions. A smaller, modified ResNet-like architecture achieves ~92% accuracy (see the gist).
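For orientation, a minimal sketch of the kind of training run the example performs, assuming the builder API from the sketch above and channels-first ('th') data from keras.datasets.cifar10; the hyperparameters here are placeholders, not the example's actual settings:

```python
from keras.datasets import cifar10
from keras.utils import np_utils
from resnet import ResNetBuilder  # import path and class spelling assumed

# CIFAR-10: 50k train / 10k test images, assumed here as (3, 32, 32) with 'th' ordering.
(X_train, y_train), (X_test, y_test) = cifar10.load_data()
X_train = X_train.astype('float32') / 255.0
X_test = X_test.astype('float32') / 255.0
Y_train = np_utils.to_categorical(y_train, 10)
Y_test = np_utils.to_categorical(y_test, 10)

model = ResNetBuilder.build_resnet_18((3, 32, 32), 10)
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

# Keras 1.x uses nb_epoch rather than epochs.
model.fit(X_train, Y_train, batch_size=32, nb_epoch=50,
          validation_data=(X_test, Y_test), shuffle=True)
```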