
Deep Learning Study Group


We are a diverse group of software engineering and data science professionals from industry and academia.

Inspired by the collaborative culture of artificial neural network research, we hold regular, chilled beverage-enhanced study sessions in midtown Manhattan. At these meetings, we summarise prescribed preparatory material and leverage our individual strengths in computer science, mathematics, statistics, neuroscience, and venture capital to cement our comprehension of concepts and to implement effective deep learning models.


Over the course of our sessions, we follow three parallel paths:

  1. Theory: We study academic textbooks, exercises, and coursework so that we command strong theoretical foundations for neural networks and deep learning. Broadly, we cover calculus, algebra, probability, and computer science, with a focus on their intersection in machine learning.
  2. Application: We practice deep learning in the real world. We typically commence by collectively following tutorials, then move on to solving novel and illustrative data problems involving a broad range of techniques. In addition to incorporating deep learning into our respective academic and commercial applications, we commit code to the present, public repository where possible.
  3. Presentations: Study group members regularly share their progress on Deep Learning projects and their areas of expertise. This elicits novel discourse outside of the relatively formal paths 1 and 2, playfully encouraging serendipity.


Theory

Theory coverage has been led by members of the study group.

If you're looking to get a handle on the fundamentals of Deep Learning, check out Jon Krohn's materials, including his book Deep Learning Illustrated (see session XVII below for its launch).

Application

Our applications have involved a broad range of neural network architectures built largely in Python with NumPy, TensorFlow, and PyTorch.
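For a flavour of the kind of code we commit to this repository, here is a minimal, self-contained sketch of a single-hidden-layer network trained with hand-written backpropagation in plain NumPy. The toy sine-regression task, layer sizes, and hyperparameters are illustrative assumptions rather than code lifted from any particular session notebook.

```python
# Minimal sketch: a one-hidden-layer tanh network fit to a toy regression
# task with manual backpropagation. All sizes and hyperparameters are
# arbitrary, illustrative choices.
import numpy as np

rng = np.random.default_rng(42)

# Toy data: learn y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(X)

# Parameters of a 1-16-1 network
W1 = rng.normal(scale=0.5, size=(1, 16))
b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(2001):
    # Forward pass
    h = np.tanh(X @ W1 + b1)          # hidden activations, shape (256, 16)
    y_hat = h @ W2 + b2               # predictions, shape (256, 1)
    loss = np.mean((y_hat - y) ** 2)  # mean squared error

    # Backward pass (manual backpropagation)
    grad_y_hat = 2 * (y_hat - y) / len(X)
    grad_W2 = h.T @ grad_y_hat
    grad_b2 = grad_y_hat.sum(axis=0)
    grad_h = grad_y_hat @ W2.T
    grad_pre = grad_h * (1 - h ** 2)  # tanh derivative
    grad_W1 = X.T @ grad_pre
    grad_b1 = grad_pre.sum(axis=0)

    # Vanilla gradient-descent update
    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

    if step % 500 == 0:
        print(f"step {step:4d}  MSE {loss:.4f}")
```

The same model takes only a few lines in TensorFlow or PyTorch; writing it out by hand in NumPy is the kind of exercise that connects the Theory path to the Application path.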


Presentations

In chronological order, we have experienced the joy of being enlightened by:

  1. Katya Vasilaky on the mathematics of deep learning (session II) and on regularization (session VIII)
  2. Thomas Balestri on countless machine-learning underpinnings (sessions III and IV), LSTM gates (session XII), and Reinforcement Learning (session XIV)
  3. Gabe Rives-Corbett on Keras implementations of deep learning deployed at untapt (session III)
  4. Dmitri Nesterenko on his NumPy implementation of k-nearest neighbours (session VI) and capsule networks (session XVII)
  5. Raphaela Sapire on the deep learning start-up investment atmosphere (session VIII)
  6. Grant Beyleveld on his U-Net convolutional network (session IX), GANs (session XII), and transformer architectures (session XVII)
  7. Jessica Graves on applications of deep learning to the fashion industry (session IX)
  8. V.T. Rajan on deriving the word2vec algorithm (session X)
  9. Karl Habermas on his NumPy implementation of the word2vec algorithm (session X)
  10. David Epstein on generative adversarial networks (session X)
  11. Claudia Perlich on predictability and how it creates biases when your target is created by mixtures (session XI)
  12. Brian Dalessandro on generating text with Keras LSTM models (session XI)
  13. Marianne Monteiro on TensorFlow Recurrent Neural Network implementations (session XIII)
  14. Druce Vertes on predicting which financial stories go viral on social media (session XIII)



Session Notes

Click through for detailed summary notes from each session:

  1. August 17th, 2016: Perceptrons and Sigmoid Neurons
  2. September 6th, 2016: The Backpropagation Algorithm
  3. September 28th, 2016: Improving Neural Networks
  4. October 20th, 2016: Proofs of Key Properties
  5. November 10th, 2016: Deep (Conv)Nets
  6. November 30th, 2016: Convolutional Neural Networks for Visual Recognition
  7. January 12th, 2017: Implementing Convolutional Nets
  8. February 7th, 2017: Unsupervised Learning, Regularisation, and Venture Capital
  9. March 6th, 2017: Word Vectors, AI x Fashion, and U-Net
  10. March 27th, 2017: word2vec Mania + GANs
  11. April 19th, 2017: Recurrent Neural Networks, including GRUs and LSTMs
  12. July 1st, 2017: Translation, Attention, more LSTMs, Speech-to-Text, and TreeRNNs
  13. August 5th, 2017: Model Architectures for Answering Questions and Overcoming NLP Limits
  14. October 17th, 2017: Reinforcement Learning
  15. December 9th, 2017: Deep Reinforcement Learning (Deep Q-Learning and OpenAI Lab)
  16. February 17th, 2018: Deep Reinforcement Learning (Policy Gradients and SLM-Lab)
  17. October 16th, 2019: Deep Learning Illustrated Book Launch, Transformer Architectures, and Capsule Networks

Acknowledgements


Thank you to untapt and its visionary, neural net-loving founder Ed Donner for hosting and subsidising all meetings of the Deep Learning Study Group.


With a desire to remain intimately sized, our study group has reached its capacity. If you'd like to be added to our waiting list, please contact the organiser, Jon Krohn, describing your relevant experience as well as your interest in deep learning. We don't necessarily expect you to be a deep learning expert already :)

Jon Krohn