Grammar Variational Autoencoder (implementation in PyTorch)

So far, this repo implements the following pieces of the grammar variational autoencoder:

encoder: grammar_variational_encoder (architecture figure)

decoder: grammar_variational_decoder (architecture figure)

training performance: training_loss (plot)
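For orientation, here is a minimal sketch of the two pieces, loosely following the architecture from the grammar VAE paper (1-D convolutions for the encoder, a stacked GRU for the decoder). All sizes below (NUM_RULES, SEQ_LEN, Z_DIM, HIDDEN, kernel widths) are illustrative assumptions, not the exact values this repo uses:

```python
import torch
import torch.nn as nn

# Illustrative sizes only; the real model ties these to the dataset's grammar.
NUM_RULES, SEQ_LEN, Z_DIM, HIDDEN = 12, 15, 56, 100

class Encoder(nn.Module):
    """1-D convolutions over the one-hot rule sequence -> mu, log-variance."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(NUM_RULES, 9, kernel_size=2), nn.ReLU(),
            nn.Conv1d(9, 9, kernel_size=3), nn.ReLU(),
            nn.Conv1d(9, 10, kernel_size=4), nn.ReLU(),
        )
        n_flat = 10 * (SEQ_LEN - (2 - 1) - (3 - 1) - (4 - 1))  # conv output size
        self.mu = nn.Linear(n_flat, Z_DIM)
        self.log_var = nn.Linear(n_flat, Z_DIM)

    def forward(self, x):                  # x: (batch, NUM_RULES, SEQ_LEN)
        h = self.conv(x).flatten(1)
        return self.mu(h), self.log_var(h)

class Decoder(nn.Module):
    """z -> repeated GRU inputs -> per-timestep logits over production rules."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(Z_DIM, HIDDEN)
        self.gru = nn.GRU(HIDDEN, HIDDEN, num_layers=3, batch_first=True)
        self.out = nn.Linear(HIDDEN, NUM_RULES)

    def forward(self, z):                  # z: (batch, Z_DIM)
        h = self.fc(z).unsqueeze(1).repeat(1, SEQ_LEN, 1)
        y, _ = self.gru(h)
        return self.out(y)                 # logits: (batch, SEQ_LEN, NUM_RULES)

# Shape check, including the reparameterization step used during training.
mu, log_var = Encoder()(torch.zeros(4, NUM_RULES, SEQ_LEN))
z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
logits = Decoder()(z)                      # (4, SEQ_LEN, NUM_RULES)
```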

Todo

Done

Usage (To Run)

All of the scripts below are included in the ./Makefile. To install dependencies and run training, you can simply run make. For more details, take a look at the ./Makefile.

  1. Install the dependencies via
    pip install -r requirement.txt
    
  2. Fire up a visdom server instance to show the visualizations. Run it in a dedicated prompt to keep it alive. A minimal logging sketch follows these steps.
    python -m visdom.server
    
  3. In a new prompt, run
    python grammar_vae.py
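To verify that the server from step 2 is reachable, and to show the kind of call a training loop makes to plot its loss curve, here is a small sketch. The window name 'training_loss' and the loss values are illustrative, not necessarily what grammar_vae.py uses:

```python
import numpy as np
import visdom

# Connects to the server from step 2 (default http://localhost:8097).
vis = visdom.Visdom()
assert vis.check_connection(), "start `python -m visdom.server` first"

# Append one point per training step to a line plot.
for step, loss in enumerate([1.2, 0.9, 0.7]):
    vis.line(X=np.array([step]), Y=np.array([loss]),
             win='training_loss',
             update='append' if step > 0 else None,
             opts=dict(title='training loss'))
```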
    

Program Induction Project Proposal

  1. Specify typical program induction problems.
  2. Build a model for each specific problem.
  3. Establish baseline performance for each problem.

Todo

A list of the problems each paper tackles with its algorithms:

Grammar Variational Autoencoder https://arxiv.org/abs/1703.01925

| method   | frac. valid   | avg. score                |
|----------|---------------|---------------------------|
| GVAE     | 0.990 ± 0.001 | 3.47 ± 0.24               |
| My score | 0.16 ± 0.001  | todo: need to measure MSE |
| CVAE     | 0.31 ± 0.001  | 4.75 ± 0.25               |
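The high fraction of valid decodes is the point of the grammar VAE: at every timestep, the decoder's logits are masked so that only productions whose left-hand side matches the non-terminal on top of a parse stack can be sampled, so every decoded sequence is a valid derivation. Below is a toy sketch of that masking idea on a made-up two-non-terminal grammar; it is not this repo's implementation, which works on the dataset's full grammar and batches the masks:

```python
import numpy as np

# Made-up toy grammar: index -> (LHS, non-terminals pushed onto the stack).
RULES = [('S', ['S', 'T']),   # 0: S -> S T
         ('S', ['T']),        # 1: S -> T
         ('T', [])]           # 2: T -> terminal
MASKS = {'S': np.array([1., 1., 0.]),   # rules each non-terminal allows
         'T': np.array([0., 0., 1.])}

def masked_decode(logits_seq, rng=np.random.default_rng(0)):
    """Sample one production per step, restricted to rules whose LHS
    matches the non-terminal on top of the parse stack."""
    stack, derivation = ['S'], []
    for logits in logits_seq:
        if not stack:                 # derivation complete
            break
        lhs = stack.pop()
        p = np.exp(logits) * MASKS[lhs]
        p = p / p.sum()               # renormalize over allowed rules
        rule = int(rng.choice(len(RULES), p=p))
        derivation.append(rule)
        stack.extend(reversed(RULES[rule][1]))
    return derivation                 # always a valid derivation prefix

# Example: decode from random "logits" standing in for the decoder's output.
print(masked_decode(np.random.default_rng(1).normal(size=(6, 3))))
```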

Automatic Chemical Design https://arxiv.org/abs/1610.02415

The architecture above in fact came from this paper. There are a few concerns about how the network was implemented in this paper.

Synthesizing Program Input Grammars https://arxiv.org/abs/1608.01723

From Percy Liang's group; it learns a CFG from a small number of examples.

A Syntactic Neural Model for General-Purpose Code Generation https://arxiv.org/abs/1704.01696

Needs a close reading of the model and its performance.

A Hybrid Convolutional Variational Autoencoder for Text Generation https://arxiv.org/abs/1702.02390

There is extensive characterization in the paper; a very worthwhile read for understanding the methodologies.

Reed, Scott and de Freitas, Nando. Neural programmer-interpreters. ICLR, 2016.

See the note in another repo.

Mou, Lili, Men, Rui, Li, Ge, Zhang, Lu, and Jin, Zhi. On end-to-end program generation from user intention by deep neural networks. arXiv preprint arXiv:1510.07211, 2015.

Jojic, Vladimir, Gulwani, Sumit, and Jojic, Nebojsa. Probabilistic inference of programs from input/output examples. 2006.

Gaunt, Alexander L, Brockschmidt, Marc, Singh, Rishabh, Kushman, Nate, Kohli, Pushmeet, Taylor, Jonathan, and Tarlow, Daniel. Terpret: A probabilistic programming language for program induction. arXiv preprint arXiv:1608.04428, 2016.

Ellis, Kevin, Solar-Lezama, Armando, and Tenenbaum, Josh. Unsupervised learning by program synthesis. In Advances in Neural Information Processing Systems, pp. 973–981, 2015.

Bunel, Rudy, Desmaison, Alban, Kohli, Pushmeet, Torr, Philip HS, and Kumar, M Pawan. Adaptive neural compilation. arXiv preprint arXiv:1605.07969, 2016.

Riedel, Sebastian, Bošnjak, Matko, and Rocktäschel, Tim. Programming with a differentiable forth interpreter. arXiv preprint arXiv:1605.06640, 2016.