Deep Learning v2 PyTorch
Documentation: https://deep-learning-v2-pytorch.readthedocs.io
Source Code: https://github.com/TeoZosa/deep-learning-v2-pytorch
Overview
Fork of udacity/deep-learning-v2-pytorch
Table of Contents
<!-- toc -->
<!-- tocstop -->
Tutorials
Introduction to Neural Networks
- Introduction to Neural Networks: Learn how to implement gradient descent and apply it to predicting patterns in student admissions data (a minimal gradient descent sketch follows this list).
- Sentiment Analysis with NumPy: Andrew Trask leads you through building a sentiment analysis model, predicting if some text is positive or negative.
- Introduction to PyTorch: Learn how to build neural networks in PyTorch and use pre-trained networks for state-of-the-art image classifiers.
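For a flavor of the gradient descent material, here is a minimal sketch of a full-batch update for a single sigmoid unit in NumPy; the data and variable names are illustrative, not taken from the notebook.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def gradient_descent_step(features, targets, weights, bias, learnrate=0.1):
    """One full-batch gradient descent update for a single sigmoid unit."""
    predictions = sigmoid(features @ weights + bias)        # forward pass
    error = targets - predictions                           # prediction error
    weights += learnrate * features.T @ error / len(targets)
    bias += learnrate * error.mean()
    return weights, bias

# Toy usage: 100 samples with 3 input features (e.g. GRE, GPA, rank -- placeholders)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = (X.sum(axis=1) > 0).astype(float)
w, b = np.zeros(3), 0.0
for _ in range(1000):
    w, b = gradient_descent_step(X, y, w, b)
```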
Convolutional Neural Networks
- Convolutional Neural Networks: Visualize the output of layers that make up a CNN. Learn how to define and train a CNN for classifying MNIST data, a handwritten digit database that is notorious in the fields of machine and deep learning. Also, define and train a CNN for classifying images in the CIFAR10 dataset (see the CNN sketch after this list).
- Transfer Learning: In practice, most people don't train their own networks on huge datasets; they use pre-trained networks such as VGGnet. Here you'll use VGGnet to help classify images of flowers without training an end-to-end network from scratch (see the transfer-learning sketch after this list).
- Weight Initialization: Explore how initializing network weights affects performance.
- Autoencoders: Build models for image compression and de-noising, using feedforward and convolutional networks in PyTorch.
- Style Transfer: Extract style and content features from images, using a pre-trained network. Implement style transfer according to the paper, Image Style Transfer Using Convolutional Neural Networks by Gatys et al. Define appropriate losses for iteratively creating a target, style-transferred image of your own design (a Gram matrix sketch follows this list)!
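As a taste of the CNN lesson, the sketch below defines a small PyTorch classifier for 28x28 grayscale MNIST digits; the layer sizes are illustrative rather than the notebook's exact architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleCNN(nn.Module):
    """Two conv blocks plus a classifier head for 28x28 grayscale digits."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, padding=1)   # 1 x 28 x 28 -> 16 x 28 x 28
        self.conv2 = nn.Conv2d(16, 32, kernel_size=3, padding=1)  # 16 x 14 x 14 -> 32 x 14 x 14
        self.pool = nn.MaxPool2d(2, 2)                            # halves the spatial dimensions
        self.fc = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))   # -> 16 x 14 x 14
        x = self.pool(F.relu(self.conv2(x)))   # -> 32 x 7 x 7
        return self.fc(x.flatten(start_dim=1))

model = SimpleCNN()
logits = model(torch.randn(8, 1, 28, 28))  # batch of 8 fake digits -> (8, 10) class scores
```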
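And a minimal transfer-learning sketch, assuming torchvision's VGG16 with the post-0.13 weights API; the number of flower classes is a placeholder, not necessarily the dataset used in the notebook.

```python
import torch.nn as nn
import torch.optim as optim
from torchvision import models

# Load VGG16 pre-trained on ImageNet and freeze its convolutional feature extractor
vgg = models.vgg16(weights="IMAGENET1K_V1")
for param in vgg.features.parameters():
    param.requires_grad = False

# Replace the final classifier layer to match the number of flower classes
num_flower_classes = 102  # placeholder; depends on the dataset used
vgg.classifier[6] = nn.Linear(vgg.classifier[6].in_features, num_flower_classes)

# Only the (unfrozen) classifier parameters are handed to the optimizer
optimizer = optim.Adam(vgg.classifier.parameters(), lr=1e-3)
```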
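The style representation in Gatys-style transfer is built on Gram matrices of convolutional feature maps; a sketch of that computation, assuming the usual (batch, channels, height, width) layout, looks like this:

```python
import torch

def gram_matrix(feature_map):
    """Channel-by-channel correlations of a conv feature map: the 'style' of a layer."""
    b, c, h, w = feature_map.size()
    features = feature_map.view(b, c, h * w)
    return features @ features.transpose(1, 2)  # shape (b, c, c)

gram = gram_matrix(torch.randn(1, 64, 112, 112))  # e.g. an early VGG layer -> (1, 64, 64)
```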
Recurrent Neural Networks
- Intro to Recurrent Networks (Time series & Character-level RNN): Recurrent neural networks are able to use information about the sequence of data, such as the sequence of characters in text; learn how to implement these in PyTorch for a variety of tasks (a minimal character-level RNN sketch follows this list).
- Embeddings (Word2Vec): Implement the Word2Vec model to find semantic representations of words for use in natural language processing (see the skip-gram sketch after this list).
- Sentiment Analysis RNN: Implement a recurrent neural network that can predict if the text of a movie review is positive or negative.
- Attention: Implement attention and apply it to annotation vectors.
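A bare-bones sketch of the character-level RNN pattern, assuming one-hot encoded characters; the vocabulary size and hyperparameters are placeholders.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """LSTM over one-hot characters, predicting the next character at each step."""
    def __init__(self, vocab_size, hidden_size=256, num_layers=2):
        super().__init__()
        self.lstm = nn.LSTM(vocab_size, hidden_size, num_layers,
                            batch_first=True, dropout=0.5)
        self.fc = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, hidden=None):
        out, hidden = self.lstm(x, hidden)   # out: (batch, seq_len, hidden_size)
        return self.fc(out), hidden          # scores over the character vocabulary

vocab_size = 83  # placeholder: number of unique characters in the corpus
model = CharRNN(vocab_size)
x = torch.zeros(4, 50, vocab_size)   # batch of 4 one-hot sequences of length 50
logits, hidden = model(x)            # logits: (4, 50, 83)
```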
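And a minimal skip-gram Word2Vec sketch, embedding a center word and scoring every vocabulary word as its context; negative sampling and the training loop are omitted.

```python
import torch
import torch.nn as nn

class SkipGram(nn.Module):
    """Skip-gram Word2Vec: embed a center word, score every word as a possible context."""
    def __init__(self, vocab_size, embed_dim=300):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.output = nn.Linear(embed_dim, vocab_size)

    def forward(self, center_word_ids):
        return self.output(self.embed(center_word_ids))  # logits over the vocabulary

model = SkipGram(vocab_size=20000)                 # placeholder vocabulary size
logits = model(torch.tensor([42, 7]))              # two center words -> (2, 20000) scores
```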
Generative Adversarial Networks
- Generative Adversarial Network on MNIST: Train a simple generative adversarial network on the MNIST dataset (a minimal generator/discriminator sketch follows this list).
- Batch Normalization: Learn how to improve training rates and network stability with batch normalization.
- Deep Convolutional GAN (DCGAN): Implement a DCGAN to generate new images based on the Street View House Numbers (SVHN) dataset.
- CycleGAN: Implement a CycleGAN that is designed to learn from unpaired and unlabeled data; use trained generators to transform images from summer to winter and vice versa.
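A compact sketch of a simple MNIST GAN's two players, with batch normalization in the generator in the spirit of the batch-norm lesson; the layer widths are illustrative.

```python
import torch
import torch.nn as nn

latent_dim = 100  # illustrative size of the input noise vector

# Generator: noise vector -> flattened 28x28 image in [-1, 1]
generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.BatchNorm1d(256), nn.LeakyReLU(0.2),
    nn.Linear(256, 512), nn.BatchNorm1d(512), nn.LeakyReLU(0.2),
    nn.Linear(512, 28 * 28), nn.Tanh(),
)

# Discriminator: flattened image -> single real/fake logit (pair with nn.BCEWithLogitsLoss)
discriminator = nn.Sequential(
    nn.Linear(28 * 28, 512), nn.LeakyReLU(0.2),
    nn.Linear(512, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

z = torch.randn(16, latent_dim)            # a batch of noise vectors
fake_scores = discriminator(generator(z))  # (16, 1) logits for the generated images
```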
Deploying a Model (with AWS SageMaker)
- All exercise and project notebooks for the lessons on model deployment can be found in the linked GitHub repo. Learn to deploy pre-trained models using AWS SageMaker.
Projects
- Predicting Bike-Sharing Patterns: Implement a neural network in NumPy to predict bike rentals.
- Dog Breed Classifier: Build a convolutional neural network with PyTorch to classify any image (even an image of a face) as a specific dog breed.
- TV Script Generation: Train a recurrent neural network to generate scripts in the style of dialogue from Seinfeld.
- Face Generation: Use a DCGAN on the CelebA dataset to generate images of new and realistic human faces.
Elective Material
- Intro to TensorFlow: Start building neural networks with TensorFlow.
- Keras: Learn to build neural networks and convolutional neural networks with Keras.
Development
📝 Note
For convenience, many of the below processes are abstracted away and encapsulated in single Make targets.
🔥 Tip
Invoking make without any arguments will display auto-generated documentation on available commands.
Package and Dependencies Installation
Make sure you have Python 3.6+ and poetry installed and configured.
To install the package and all dev dependencies, run:
make provision_environment
🔥 Tip
Invoking the above without poetry installed will emit a helpful error message letting you know how you can install poetry.
Testing
We use tox as our test automation framework and pytest as our testing framework.
To invoke the tests, run:
make test
Run mutation tests to validate test suite robustness (Optional):
make test-mutations
📝 Note
Test time scales with the complexity of the codebase. Results are cached in .mutmut-cache, so once you get past the initial cold start problem, subsequent mutation test runs will be much faster; new mutations will only be applied to modified code paths.
Code Quality
We use pre-commit as our code quality static analysis automation and management framework.
To invoke the analyses and auto-formatting over all version-controlled files, run:
make lint
🚨 Danger
CI will fail if either the tests or the code quality checks fail, so it is recommended to run the above locally before every pushed commit.
Automate via Git Pre-Commit Hooks
To automatically run code quality validation on every commit (over to-be-committed files), run:
make install-pre-commit-hooks
⚠️ Warning
This will prevent commits if any single pre-commit hook fails (unless it is allowed to fail) or a file is modified by an auto-formatting job; in the latter case, you may simply repeat the commit and it should pass.
Documentation
To build the documentation, run:
make docs-clean docs-html
📝 Note
For faster feedback loops, this will attempt to automatically open the newly built documentation static HTML in your browser.
Legal
License
Deep Learning v2 PyTorch is licensed under the Apache License, Version 2.0. See LICENSE for the full license text.
Credits
This project was generated from @TeoZosa's cookiecutter-cruft-poetry-tox-pre-commit-ci-cd template.