
BinaryNet.tf

A TensorFlow implementation of "Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1" (https://papers.nips.cc/paper/6573-binarized-neural-networks).

This is an incomplete training example for BinaryNets using the binary backpropagation algorithm, as explained in "Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1", on the following datasets: CIFAR-10/100.

Note that in this folder I haven't (yet...) implemented shift-based batch normalization or shift-based AdaMax; instead I just use vanilla batch normalization and Adam. Likewise, I use deterministic binarization, and I don't apply the initialization coefficients from Glorot & Bengio (2010). Finally, the sparse_softmax_cross_entropy_with_logits loss is used instead of the square hinge loss.
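For concreteness, here is a minimal TF 1.x-style sketch of deterministic binarization with the usual straight-through estimator; it illustrates the technique, not this repo's exact code:

    import tensorflow as tf

    def binarize(x):
        # Deterministic binarization: the forward pass uses sign(x) in {-1, +1},
        # the backward pass uses the straight-through estimator, i.e. the
        # gradient of hard-tanh (identity on [-1, 1], zero outside).
        clipped = tf.clip_by_value(x, -1.0, 1.0)
        # Forward value is sign(clipped); tf.stop_gradient ensures the gradient
        # flows only through `clipped`. Edge case: tf.sign(0) == 0.
        return clipped + tf.stop_gradient(tf.sign(clipped) - clipped)

    # The loss here is plain softmax cross-entropy rather than the paper's
    # square hinge loss, e.g.:
    # loss = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
    #     labels=labels, logits=logits))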

The implementation is based on https://github.com/eladhoffer/convNet.tf, but the main idea can easily be transferred to any TensorFlow wrapper (e.g., slim, Keras). I'll probably port it to Keras soon.

Data

This implementation supports CIFAR-10 and CIFAR-100 (passed as cifar10 or cifar100 to the --dataset flag).
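As a rough illustration of dispatching on the dataset name (this is not the repo's data pipeline, which follows convNet.tf, and it assumes a TensorFlow build that ships tf.keras, newer than the 1.2.1 pinned below):

    import tensorflow as tf

    def load_dataset(name):
        # Hypothetical helper: pick CIFAR-10 or CIFAR-100 by name.
        if name == 'cifar10':
            return tf.keras.datasets.cifar10.load_data()
        if name == 'cifar100':
            return tf.keras.datasets.cifar100.load_data()
        raise ValueError('unsupported dataset: %s' % name)

    (x_train, y_train), (x_test, y_test) = load_dataset('cifar10')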

Dependencies

tensorflow version 1.2.1

Training

  • Train a CIFAR-10 model on the GPU (see the flag sketch after this list): python main.py --model BNN_cifar10 --save BNN_cifar10 --dataset cifar10 --gpu True
  • Train a CIFAR-10 model on the CPU: python main.py --model BNN_cifar10 --save BNN_cifar10 --dataset cifar10
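The flag names above are taken directly from the commands; as a hypothetical sketch (not the repo's actual main.py), they could be defined with TF 1.x's tf.app.flags like so:

    import tensorflow as tf

    # Hypothetical flag definitions matching the commands above;
    # the repo's real main.py may define them differently.
    FLAGS = tf.app.flags.FLAGS
    tf.app.flags.DEFINE_string('model', 'BNN_cifar10', 'model to train')
    tf.app.flags.DEFINE_string('save', 'BNN_cifar10', 'name of the save directory')
    tf.app.flags.DEFINE_string('dataset', 'cifar10', 'dataset: cifar10 or cifar100')
    tf.app.flags.DEFINE_boolean('gpu', False, 'whether to train on the GPU')

    def main(_):
        # Placeholder for the training loop driven by the flags above.
        print('training %s on %s (gpu=%s)' % (FLAGS.model, FLAGS.dataset, FLAGS.gpu))

    if __name__ == '__main__':
        tf.app.run()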

Results

CIFAR-10 should reach at least 88% top-1 accuracy.