• Stars: 1
• Language: Python
• Created: about 3 years ago
• Updated: about 3 years ago

Repository Details

This is a very small repository demonstrating a problem with DistributedDataParallel. A three-layer feed-forward neural network is trained on MNIST both with and without DistributedDataParallel, using the same hyperparameters. If you configure DistributedDataParallel to use only one node, the resulting model's accuracy is noticeably worse. If you have any suggestions for making the two runs match, other than tuning the learning rate, please comment or send a PR! A minimal sketch of the setup is shown below.
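
The sketch below is not the repository's actual training script; it is a minimal, hedged reconstruction of the setup described above, assuming a standard PyTorch/torchvision workflow. The model sizes, hyperparameters, and the single-process (world_size=1) DDP initialization are illustrative choices, not values taken from this repo.

```python
# Minimal sketch: a three-layer MLP on MNIST wrapped in DistributedDataParallel
# with a single process, to compare against plain (non-DDP) training.
# All names and hyperparameters here are assumptions for illustration.
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler
from torchvision import datasets, transforms


def main():
    # Single-process "distributed" setup: one rank, world_size = 1.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group(backend="gloo", rank=0, world_size=1)

    # Three-layer feed-forward network for 28x28 MNIST digits.
    model = nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 256), nn.ReLU(),
        nn.Linear(256, 128), nn.ReLU(),
        nn.Linear(128, 10),
    )
    ddp_model = DDP(model)  # drop this wrapper for the non-DDP baseline run

    dataset = datasets.MNIST(
        "data", train=True, download=True, transform=transforms.ToTensor()
    )
    # Note: DistributedSampler changes how the data is shuffled/partitioned
    # even with a single replica, which is one plausible source of the
    # accuracy gap described above.
    sampler = DistributedSampler(dataset, num_replicas=1, rank=0, shuffle=True)
    loader = DataLoader(dataset, batch_size=64, sampler=sampler)

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):
        sampler.set_epoch(epoch)  # reseed the sampler's shuffling each epoch
        for images, labels in loader:
            optimizer.zero_grad()
            loss = loss_fn(ddp_model(images), labels)
            loss.backward()  # DDP all-reduces gradients (trivial with one rank)
            optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Running this once with the `DDP(...)` wrapper and once without it (keeping the same seed, sampler, and hyperparameters) is the kind of side-by-side comparison the repository describes.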