Home

Awesome

Notes on Deep Learning

New Repo of Deep Learning Papers! :star2: :boom:

I moved my collection of deep learning and machine learning papers from Dropbox to this git repository! The first blog post being planned is on "Boltzmann Machines, Statistical Mechanics and Maximum Likelihood Estimation".

LINK: GitHub/episodeyang/deep_learning_papers_TLDR

From the Author

These are the notes I took while working through Nielsen's Neural Networks and Deep Learning book. You can find a table of contents for this repo below.

Table of Contents

Chapter 1: Intro to Deep Learning

Chapter 2: Intro to TensorFlow

Chapter 3: Advanced TensorFlow with a GPU AWS Instance and the PyCharm Remote Interpreter

Chapter 4: Recurrent Networks.

Here I implemented a vanilla RNN from scratch. I didn't want to write the partial derivatives by hand, but TensorFlow feels a bit too opaque. The edf framework by TTIC is a poor man's TensorFlow that provides auto-differentiation via a component.backward() method, so I decided to go with it.

I also implemented RMSProp and Adam by hand and experimented with hyper-parameter search. It was extremely informative.
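As a sketch of what implementing RMSProp and Adam by hand involves, here is a minimal NumPy version of both update rules. The names (`lr`, `b1`, `b2`, `eps`) follow the usual conventions from the optimizer papers, not this repo's actual code:

```python
import numpy as np

def rmsprop_update(w, grad, cache, lr=1e-3, decay=0.9, eps=1e-8):
    """One RMSProp step: scale the gradient by a running RMS of past gradients."""
    cache = decay * cache + (1 - decay) * grad ** 2
    return w - lr * grad / (np.sqrt(cache) + eps), cache

def adam_update(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step: bias-corrected first and second moment estimates (t starts at 1)."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# sanity check: both optimizers should drive f(w) = w^2 toward its minimum at 0
w_a, m, v = 5.0, 0.0, 0.0
w_r, cache = 5.0, 0.0
for t in range(1, 2001):
    w_a, m, v = adam_update(w_a, 2 * w_a, m, v, t, lr=0.05)
    w_r, cache = rmsprop_update(w_r, 2 * w_r, cache, lr=0.05)
```

Running the hyper-parameter search then amounts to sweeping `lr`, `decay`, `b1`, and `b2` over a grid and comparing training curves.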

Project: Doing Particle Simulation with TensorFlow

Project: LeNet with Novel Loss Function

Fun Highlights (Reverse Chronological Order)

Some of the figures can be found scattered in the project folders (I believe in a flat folder structure).

Particle Simulation with TensorFlow! (a classical many-body simulation for my quantum computing research)

It turns out that not having to write the Jacobian of your equations of motion by hand is a huge time saver when doing particle simulations.

Here is a 2D classical many-body simulator I wrote for my quantum computing research. In my lab, we are building a new type of qubit by trapping single electrons on the surface of superfluid helium. You can read more about our progress in this paper from PRX.

In this new experiment, we want to construct a very small electrostatic trap so that we can couple a microwave mirror to the dipole of a single electron. To understand where the electrons are likely to go, I needed to build a simple electrostatic simulation.

link to repo

<p align="center"> <img width="300px" height="300px" alt="Electron Configuration During Simulation" src="Proj_Molecular_Simulation/figures/Electron%20Configuration%20Animated%20(WIP)%20small.gif"/> </p>
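The simulation idea can be sketched in a few lines: treat the electron configuration as a state that relaxes down the gradient of the total electrostatic energy (a harmonic trap plus pairwise Coulomb repulsion). The NumPy sketch below writes that gradient out analytically; the point of using TensorFlow in the actual project is that autodiff produces this force term for you. The constants `k` and `q2` are illustrative, not the experiment's values:

```python
import numpy as np

def energy_and_forces(pos, k=1.0, q2=1.0):
    """Total energy of N charges in 2D: a harmonic trap plus pairwise Coulomb
    repulsion. The forces are the hand-derived negative gradient of the energy,
    i.e. exactly the term autodiff would compute for us."""
    n = len(pos)
    diff = pos[:, None, :] - pos[None, :, :]       # (N, N, 2) pairwise separations
    r = np.sqrt((diff ** 2).sum(-1) + np.eye(n))   # eye avoids divide-by-zero on diagonal
    inv_r = (1.0 / r) * (1 - np.eye(n))            # zero out self-interaction
    energy = 0.5 * k * (pos ** 2).sum() + 0.5 * q2 * inv_r.sum()
    forces = -k * pos + q2 * (diff * (inv_r ** 3)[..., None]).sum(axis=1)
    return energy, forces

rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 2))                      # 5 electrons, random start
e_start, _ = energy_and_forces(pos)
for _ in range(500):                               # damped relaxation toward equilibrium
    _, f = energy_and_forces(pos)
    pos += 0.01 * np.clip(f, -5, 5)                # clip keeps early steps stable
e_end, _ = energy_and_forces(pos)
```

After relaxation the charges settle into the familiar ring-like Wigner configurations, and the total energy is lower than at the random start.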

Projecting MNIST into a 2-Dimensional Deep Feature Space

It turns out that you can constrain the feature space of a convolutional neural network and project the MNIST dataset onto a 2-dimensional plane!

This is my attempt at reproducing the work from Yandong Wen's paper (for the link, see the project readme (WIP)).

<p align="center"> <img width="348.8px" height="280.4px" src="Proj_Centroid_Loss_LeNet/LeNet_plus/figures/MNIST%20LeNet++%20with%202%20Deep%20Features%20(PReLU).png"/> </p>

This makes for very nice visualizations. Curious about how this embedding evolves during training, I made a few movies; you can find them inside the project folder.

<p align="center"> <img alt="network learning" src="Proj_Centroid_Loss_LeNet/LeNet_plus_centerloss/figures/animation/MNIST_LeNet_centroid_loss_lambda_0.001.gif"/> </p>
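The centroid (center) loss that shapes these embeddings pulls each sample's 2-D deep feature toward its class center, weighted by a small lambda (the animation above uses lambda = 0.001). Here is a minimal NumPy sketch of the loss and the running center update; the function and variable names are mine, not the project's:

```python
import numpy as np

def center_loss(features, labels, centers, lam=0.001):
    """Centroid loss: (lam / 2) * sum_i ||f_i - c_{y_i}||^2, added to the usual
    softmax loss so that deep features cluster around their class centers."""
    diff = features - centers[labels]
    return 0.5 * lam * (diff ** 2).sum()

def update_centers(features, labels, centers, alpha=0.5):
    """Move each class center part-way toward the mean of its assigned features."""
    new = centers.copy()
    for c in range(len(centers)):
        mask = labels == c
        if mask.any():
            new[c] = (1 - alpha) * centers[c] + alpha * features[mask].mean(axis=0)
    return new

# toy check with 2-D "deep features" for two classes
feats = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0]])
labels = np.array([0, 0, 1])
centers = np.zeros((2, 2))
loss_before = center_loss(feats, labels, centers)
centers = update_centers(feats, labels, centers)
loss_after = center_loss(feats, labels, centers)   # moving centers in lowers the loss
```

During real training the feature extractor and the centers are updated together, which is what makes the clusters in the animation tighten over time.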

MNIST ConvNet with TensorFlow

My first attempt at building a convolutional neural network with TensorFlow.

This example builds a simple ConvNet for MNIST classification with TensorFlow.
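Independent of the TensorFlow plumbing, the core operation of a ConvNet is a small kernel slid across the image. A minimal NumPy sketch of a "valid" convolution plus ReLU, applied to an illustrative vertical-edge kernel on a toy image (not the actual MNIST pipeline):

```python
import numpy as np

def conv2d(image, kernel):
    """'Valid' 2-D cross-correlation: slide the kernel over the image and take
    a dot product at every position. This is the core op of a ConvNet layer."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

def relu(x):
    return np.maximum(x, 0)

# a vertical-edge kernel on a toy 4x4 "image" that steps from dark to bright
img = np.array([[0, 0, 1, 1]] * 4, dtype=float)
edge = np.array([[-1.0, 1.0]] * 2)       # responds where intensity rises left to right
feature_map = relu(conv2d(img, edge))    # peaks at the edge column
```

In the TensorFlow version, this loop is replaced by `tf.nn.conv2d`, and the kernels are learned rather than hand-picked.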

A simple toy example

The example below shows how a simple network can be trained to emulate a given target function, implemented in NumPy without the help of TensorFlow.

![network trained to emulate function](Ch1%20Intro%20to%20Deep%20Learning/trained%20neural%20net%20emulate%20a%20step%20function.png)
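In the same spirit, here is a minimal NumPy sketch of such a network: one sigmoid hidden layer trained with hand-written backprop to fit a step function. The layer sizes and learning rate are arbitrary choices, not the ones used in the chapter:

```python
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# target: a step function on [0, 1]
x = np.linspace(0, 1, 64)[:, None]
y = (x > 0.5).astype(float)

# one sigmoid hidden layer with 8 units
W1, b1 = rng.normal(size=(1, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def forward(x):
    h = sigmoid(x @ W1 + b1)
    return h, sigmoid(h @ W2 + b2)

_, out = forward(x)
loss_start = ((out - y) ** 2).mean()
lr = 1.0
for _ in range(2000):
    h, out = forward(x)
    d2 = 2 * (out - y) / x.size * out * (1 - out)   # dLoss/d(pre-activation), output layer
    d1 = d2 @ W2.T * h * (1 - h)                    # backprop through the hidden layer
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * x.T @ d1; b1 -= lr * d1.sum(0)
_, out = forward(x)
loss_end = ((out - y) ** 2).mean()                  # lower than loss_start
```

Plotting `out` against `y` reproduces the kind of smooth approximation to the step shown in the figure above.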

Todos (02/07/2017):

Bucket List and Things I Ran Into During Readings (not in order)

Done:

More Useful Links: