  • Stars: 123
  • Rank: 288,511 (Top 6%)
  • Language: Jupyter Notebook
  • Created: about 8 years ago
  • Updated: about 7 years ago


Repository Details

a collection of my notes on deep learning

Notes on Deep Learning

New Repo of Deep Learning Papers! 🌟 💥

I moved my collection of deep learning and machine learning papers from Dropbox to this git repository! The first blog post being planned is on "Boltzmann Machines, Statistical Mechanics, and Maximum Likelihood Estimation".

LINK: github.com/episodeyang/deep_learning_papers_TLDR

From the Author

These are the notes I took while working through Nielsen's Neural Networks and Deep Learning book. You can find a table of contents for this repo below.

Table of Contents

Chapter 1: Intro to Deep Learning

Chapter 2: Intro to Tensorflow

Chapter 3: Advanced Tensorflow with GPU AWS Instance and PyCharm Remote Interpreter.

Chapter 4: Recurrent Networks.

Here I implemented a vanilla RNN from scratch. I didn't want to write the partial derivatives by hand, but TensorFlow feels a bit too opaque. The edf framework by TTIC is a poor man's TensorFlow that provides auto-differentiation via each component's backward() method, so I decided to go with it.
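
For reference, here is a minimal numpy sketch of a single vanilla RNN step with a hand-written backward pass. This is an illustration of the idea only, not the edf API; the names Wxh, Whh, and bh are my own.

```python
import numpy as np

def rnn_step_forward(x, h_prev, Wxh, Whh, bh):
    """Single vanilla RNN step: h = tanh(Wxh @ x + Whh @ h_prev + bh)."""
    h = np.tanh(Wxh @ x + Whh @ h_prev + bh)
    cache = (x, h_prev, h, Wxh, Whh)
    return h, cache

def rnn_step_backward(dh, cache):
    """Backprop through one step, given the upstream gradient dL/dh."""
    x, h_prev, h, Wxh, Whh = cache
    dz = dh * (1.0 - h ** 2)        # tanh'(z) = 1 - tanh(z)^2
    dWxh = np.outer(dz, x)
    dWhh = np.outer(dz, h_prev)
    dbh = dz
    dh_prev = Whh.T @ dz            # gradient flowing to the previous time step
    dx = Wxh.T @ dz
    return dx, dh_prev, dWxh, dWhh, dbh
```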

I also implemented RMSProp and Adam by hand and experimented with hyper-parameter search. It was extremely informative.
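
For context, these are the standard RMSProp and Adam update rules written out in numpy; the hyper-parameter defaults below are the commonly cited ones, not necessarily the values used in this repo.

```python
import numpy as np

def rmsprop_update(w, grad, cache, lr=1e-3, decay=0.9, eps=1e-8):
    """RMSProp: scale the step by a running average of squared gradients."""
    cache = decay * cache + (1 - decay) * grad ** 2
    w = w - lr * grad / (np.sqrt(cache) + eps)
    return w, cache

def adam_update(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: RMSProp-style scaling plus momentum, with bias correction."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)    # t is the step count, starting at 1
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```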

Project: Doing Particle Simulation with Tensorflow

Project: LeNet with Novel Loss Function

Fun Highlights (Reverse Chronological Order)

Some of the figures can be found scattered throughout the folder (I believe in a flat folder structure).

Particle Simulation with TensorFlow! (classical many-body simulation for my quantum computing research)

It turns out that not needing to write the Jacobian of your equations of motion by hand is a huge time saver for particle simulations.

Here is a 2D classical many-body simulator I wrote for my quantum computing research. In my lab, I am building a new type of qubit by trapping single electrons on the surface of superfluid helium. You can read more about our progress in this paper from PRX.

In this new experiment, we want to construct a very small electrostatic trap so that we can couple a microwave mirror to the dipole of a single electron. To understand where the electrons are likely to go, I needed to build a simple electrostatic simulation.
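
The "no hand-written Jacobian" point comes from automatic differentiation: you write down the potential energy and let the framework differentiate it to get the forces. Below is a minimal TensorFlow 2 sketch of that pattern; the original project used an earlier TensorFlow API, and the pairwise 1/r potential here is only an illustrative stand-in.

```python
import tensorflow as tf

def coulomb_potential(positions):
    """Total pairwise 1/r potential for N charges in 2D (positions: [N, 2])."""
    diff = positions[:, None, :] - positions[None, :, :]       # [N, N, 2]
    dist = tf.sqrt(tf.reduce_sum(diff ** 2, axis=-1) + 1e-9)   # keep the diagonal finite
    mask = 1.0 - tf.eye(tf.shape(positions)[0])                # drop self-interaction
    return 0.5 * tf.reduce_sum(mask / dist)                    # count each pair once

positions = tf.Variable(tf.random.normal([8, 2]))
with tf.GradientTape() as tape:
    energy = coulomb_potential(positions)
forces = -tape.gradient(energy, positions)   # no hand-written Jacobian needed
```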

link to repo

Electron Configuration During Simulation

Projecting MNIST into a 2-Dimensional Deep Feature Space

It turns out that you can constrain the feature space of a convolutional neural network and project the MNIST dataset onto a 2-dimensional plane!

This is my attempt at reproducing the work from Yandong Wei's paper (for the link, see the project readme; WIP).

This makes for very nice visualizations. Curious about how the embedding evolves during training, I made a few movies; you can find them inside the project folder.
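
The core trick is to make the penultimate layer 2-dimensional so that every image maps to a point in the plane. Here is a minimal Keras-style sketch of that idea (the original code predates tf.keras, so treat the layer sizes and names as assumptions):

```python
import tensorflow as tf
from tensorflow.keras import layers

# ConvNet whose penultimate layer has 2 units, so each digit image
# becomes an (x, y) point that can be scatter-plotted directly.
inputs = tf.keras.Input(shape=(28, 28, 1))
x = layers.Conv2D(32, 3, activation="relu")(inputs)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.MaxPooling2D()(x)
x = layers.Flatten()(x)
embedding = layers.Dense(2, name="feature_2d")(x)          # the 2D feature space
outputs = layers.Dense(10, activation="softmax")(embedding)

model = tf.keras.Model(inputs, outputs)
embedder = tf.keras.Model(inputs, embedding)  # use this to plot the 2D features
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```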

network learning

MNIST ConvNet with TensorFlow

My first attempt at building a convolutional neural network with TensorFlow.

This example does the following (a sketch follows the list):

  • uses different GPUs for training and evaluation (manual device placement)
  • persists network parameters in checkpoint files (session saving and restore)
  • pushes loss and accuracy to summaries, which can be visualized with TensorBoard (summary and tensorboard)
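
Here is a compressed TensorFlow 1.x-style sketch of those three pieces; the tiny graph, device names, and paths (./logs, ./ckpt/model.ckpt) are illustrative stand-ins rather than the repo's actual code.

```python
import tensorflow as tf  # TensorFlow 1.x API (tf.compat.v1 in TF2)

# Tiny stand-in graph; the real example uses a ConvNet.
with tf.device('/gpu:0'):                      # manual device placement: training
    x = tf.placeholder(tf.float32, [None, 784])
    y = tf.placeholder(tf.int64, [None])
    logits = tf.layers.dense(x, 10)
    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits))
    train_op = tf.train.AdamOptimizer(1e-3).minimize(loss)

with tf.device('/gpu:1'):                      # evaluation on a second GPU
    accuracy = tf.reduce_mean(tf.cast(tf.equal(tf.argmax(logits, 1), y), tf.float32))

tf.summary.scalar('loss', loss)                # scalars show up in TensorBoard
tf.summary.scalar('accuracy', accuracy)
merged = tf.summary.merge_all()
saver = tf.train.Saver()                       # checkpoint save / restore

# allow_soft_placement falls back to CPU if a GPU is missing
with tf.Session(config=tf.ConfigProto(allow_soft_placement=True)) as sess:
    writer = tf.summary.FileWriter('./logs', sess.graph)
    sess.run(tf.global_variables_initializer())
    # inside the training loop:
    #   summary, _ = sess.run([merged, train_op], feed_dict={x: xb, y: yb})
    #   writer.add_summary(summary, step)
    #   saver.save(sess, './ckpt/model.ckpt', global_step=step)
```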

MNIST ConvNet Tensorflow

A simple toy example

The example below shows how a simple network can be trained to emulate a given target function. It is implemented in numpy, without the help of TensorFlow.

![network trained to emulate function](Ch1%20Intro%20to%20Deep%20Learning/trained%20neural%20net%20emulate%20a%20step%20function.png)
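
A rough numpy-only sketch of this kind of toy experiment, assuming a one-hidden-layer sigmoid network fit to a step function with plain gradient descent (the layer size and learning rate are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: a step function on [-1, 1].
X = np.linspace(-1, 1, 200).reshape(-1, 1)
Y = (X > 0).astype(float)

# One hidden layer with sigmoid activations.
W1, b1 = rng.normal(size=(1, 16)), np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y_hat = sigmoid(h @ W2 + b2)
    # backward pass for mean squared error
    d_out = (y_hat - Y) * y_hat * (1 - y_hat) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # plain gradient descent updates
    W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

print("final MSE:", float(np.mean((y_hat - Y) ** 2)))
```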

Todos (02/07/2017):

  • Wormhole RNN [pdf]
  • Experiment with PyTorch
  • Proj RNN: Teach RNN how to do math
  • Proj NLP: syntax highlighter for natural language
  • Restricted Boltzmann Machine, and how it is used in deep belief networks to initialize auto-encoders [Hinton, 2006]
  • Binary weight networks (XNOR-Net)
  • Attention Networks: link: Augmented RNN
  • Image Captioning
  • Adversarial Hardened LeNet++ [1.0]
  • Adversarial Test of Hardened LeNet++ [1.0]
  • L2 Regularization with Logistic Regression [1.0]

Bucket List and Things I Ran Into During Reading (not in order)

  • Denoising Autoencoder
  • Word2vec

Done:

  • optimize batch training (numpy neural net)
  • add summary MNIST example with Tensorflow
  • Convolutional Neural Network
  • multi-GPU setup tensorflow doc [0.5 - 1.0]
  • CIFAR Example [4.0]
  • Save and restore net
  • MNIST Perceptron logging and visualization with tensorboard
  • Feedforward Neural Network (Multilayer Perceptron) tensorboard doc [2.0]
  • TensorBoard
  • LeNet training ConvNet doc [1.0]
  • LeNet++ training [1.0]
  • Deep Feedforward Neural Network (Multilayer Perceptron with 2 Hidden Layers O.o)
  • Vanilla Recurrent Neural Network
  • regularization and batch normalization
  • LSTM with edf
  • GRU with edf

More Useful Links:

More Repositories

1. ml_logger: A logger, server and visualization dashboard for ML projects (Python, 192 stars)
2. deep-auto-punctuation: a pytorch implementation of auto-punctuation learned character by character (Jupyter Notebook, 142 stars)
3. grammar_variational_autoencoder: pytorch implementation of grammar variational autoencoder (Python, 60 stars)
4. plan2vec: Public release of the Plan2vec implementation in pyTorch (Python, 56 stars)
5. char2wav_pytorch: pytorch implementation of lyre.ai's char2wav model (Python, 32 stars)
6. deep_learning_papers_TLDR: repository for my TLDRs of deep learning papers (and SML papers!) (Mathematica, 18 stars)
7. gym-fetch: A collection of manipulation tasks with the Fetch robot (Python, 18 stars)
8. jaynes-starter-kit: a starter kit for jaynes, the cloud-agnostic launch library (Python, 16 stars)
9. react-prosemirror (JavaScript, 16 stars)
10. e-maml: E-MAML and an RL-MAML baseline implemented in Tensorflow v1 (Python, 15 stars)
11. deep_learning_plotting_example: Some of the plotting code used in our paper, as an example of good-looking plots (Python, 13 stars)
12. variational_autoencoder_pytorch: pyTorch variational autoencoder, with explanations (Jupyter Notebook, 10 stars)
13. params-proto: params_proto, a collection of decorators that makes shell argument passing declarative (Python, 10 stars)
14. unitree-go1-setup-guide: Setup guide for the UniTree Go1 robot (Python, 10 stars)
15. ffn: Public repo for the paper "Overcoming The Spectral-Bias of Neural Value Approximation" (Python, 7 stars)
16. react-docgen-loader: a small webpack loader that generates react component metadata using react-docgen (JavaScript, 7 stars)
17. nanoGPT: Adapted version of nanoGPT for teaching (Jupyter Notebook, 7 stars)
18. torch-ppo: PyTorch implementation of PPO, A2C, ACKTR, and GAIL (Python, 6 stars)
19. gym-distracting-control: a packaged version of the distracting control suite from Stone et al. (Python, 6 stars)
20. jaynes (Python, 6 stars)
21. gym-dmc: DeepMind Control Suite plugin for gym (Jupyter Notebook, 5 stars)
22. gym-sawyer: Sawyer robot adapter for OpenAI gym (Jupyter Notebook, 4 stars)
23. yatta: a CLI tool that manages a local *.bib index for your PDFs (JavaScript, 4 stars)
24. zaku: Machine learning job queue and procedural calls (Python, 4 stars)
25. visdom_helper: helpers for pyTorch and visdom to better impedance match (Python, 3 stars)
26. deep_machine_translation: A sequence-to-sequence model implemented in pyTorch (Python, 3 stars)
27. practical-rl-at-scale: example for running RL code bases at scale (Python, 3 stars)
28. ml-research-containers: A list of community-maintained docker images for deep reinforcement learning research (Dockerfile, 3 stars)
29. tensorflow_data_loading: A list of data loading patterns for TensorFlow (Python, 3 stars)
30. reinforcement_learning_learning_notes (Python, 3 stars)
31. legged-control-suite: A collection of legged robot environments built off DeepMind Control (Python, 3 stars)
32. moleskin: A debugging and console print utility for Python (Python, 2 stars)
33. savio-starter-kit: starter kit for running jobs on the Berkeley Savio research cluster, batteries included! 🔋 (Python, 2 stars)
34. react-markdownit (JavaScript, 2 stars)
35. KL_divergence: small example demonstrating the difference between KL(P||Q) and KL(Q||P) (Jupyter Notebook, 2 stars)
36. waterbear: a simple utility that makes your Python dictionary accessible via dot notation (Python, 2 stars)
37. computer_science_basics: some of the basic topics of computer science (Jupyter Notebook, 2 stars)
38. ml-logger_examples: usage examples for ML-Logger (Python, 1 star)
39. vuer (Python, 1 star)
40. ICLR_2018_analysis: Analyzing the submission metadata from ICLR 2018 (Python, 1 star)
41. graph-search: collection of graph-search algorithms (Python, 1 star)
42. react-component-props-table: a table component that takes the metadata of a react component and renders it in a table (JavaScript, 1 star)
43. MyFirstApp (Java, 1 star)
44. tf_logger: logging utility for TensorBoard (Python, 1 star)
45. rl-playground-old: playground repo for RL algorithms (Jupyter Notebook, 1 star)
46. dave: A command line utility that runs your script using arguments from a YAML config file (Python, 1 star)
47. many-world: a many-world task suite (Python, 1 star)
48. react-es6-template: a minimal react ES6 template with automatic document generation (JavaScript, 1 star)
49. react-vis-graph-components: a link-graph component built on top of react-vis (JavaScript, 1 star)
50. react-codemirror (JavaScript, 1 star)
51. react-bristol: a react canvas paint component (JavaScript, 1 star)
52. torch_helpers: A collection of helpers for pyTorch (Python, 1 star)
53. megadraft-demo (JavaScript, 1 star)
54. memory: Collection of replay buffers for reinforcement learning research (Makefile, 1 star)