
nn_builder

nn_builder lets you build neural networks with less boilerplate code. You specify the type of network you want and it builds it.

Install

pip install nn_builder

Support

| Network Type   | NN | CNN | RNN |
| -------------- | -- | --- | --- |
| PyTorch        | ✓  | ✓   | ✓   |
| TensorFlow 2.0 | ✓  | ✓   | ✓   |

Examples

On the left of the screenshot below is how you can create the PyTorch neural network shown on the right in only one line of code using nn_builder:

[Screenshot: nn_builder one-liner alongside the equivalent hand-written PyTorch network]
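
Since the screenshot is an image, here is a minimal text sketch of the same idea, reusing the NN example from the Usage section below; the layer sizes are illustrative rather than the exact ones in the screenshot:

import torch
import torch.nn as nn
from nn_builder.pytorch.NN import NN

# With nn_builder: one call describes the whole network
# (two hidden layers of 10 units and a 1-unit output).
model = NN(input_dim=5, layers_info=[10, 10, 1], hidden_activations="relu",
           dropout=0.0, initialiser="xavier", batch_norm=False)

# Roughly the same network written by hand in plain PyTorch.
class HandWrittenNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(5, 10)
        self.fc2 = nn.Linear(10, 10)
        self.out = nn.Linear(10, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        return self.out(x)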

Similarly for TensorFlow, on the left of the screenshot below is how you can create the CNN shown on the right in only one line of code using nn_builder:

[Screenshot: nn_builder one-liner alongside the equivalent hand-written TensorFlow CNN]
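
As a text stand-in for the screenshot, here is a minimal sketch of the kind of one-line call it shows, assuming the TensorFlow CNN mirrors the import path and layers_info format of the PyTorch CNN in the Usage section; the Keras-style "valid"/"same" padding strings are an assumption, so check the Colab notebook for the exact format:

from nn_builder.tensorflow.CNN import CNN

# One call builds the whole CNN; no input_dim is needed because TensorFlow
# infers the input shape the first time the model is called (see the
# arguments table below). The padding strings here are an assumption.
model = CNN(layers_info=[["conv", 32, 3, 1, "valid"], ["maxpool", 2, 2, "valid"],
                         ["linear", 10]],
            hidden_activations="relu", output_activation="softmax",
            dropout=0.0, initialiser="xavier", batch_norm=True)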

Usage

See this Colab notebook for many examples of how to use the module. Three network types are currently supported for both PyTorch and TensorFlow: NN, CNN, and RNN. Each network takes the following arguments:

| Field | Description | Default |
| ----- | ----------- | ------- |
| input_dim | Dimension of the input into the network. See below for more detail. Not needed for TensorFlow. | N/A |
| layers_info | List indicating the layers of the network you want. Exact requirements depend on the network type; see below for more detail. | N/A |
| output_activation | String indicating the activation function you want applied to the output. Provide a list of strings if you want multiple output heads. | No activation |
| hidden_activations | String or list of strings indicating the activations applied to the output of the hidden layers (not including the output layer); for example, "tanh" would apply tanh after every hidden layer. | ReLU after every hidden layer |
| dropout | Float indicating the dropout probability applied after each hidden layer. | 0 |
| initialiser | String indicating which initialiser to use for all the parameters. | PyTorch & TF default |
| batch_norm | Boolean indicating whether batch norm is applied to the output of every hidden layer. | False |
| columns_of_data_to_be_embedded | List indicating the column numbers of the data that should be put through an embedding layer before being fed through the hidden layers of the network. | No embeddings |
| embedding_dimensions | If you have categorical variables you want embedded before flowing through the network, specify the embedding dimensions here as a list of the form [[embedding_input_dim_1, embedding_output_dim_1], [embedding_input_dim_2, embedding_output_dim_2], ...]. | No embeddings |
| y_range | Tuple of floats or integers of the form (y_lower, y_upper) indicating the range you want the output values restricted to in regression tasks. | No range |
| random_seed | Integer indicating the random seed to use. | 0 |
| return_final_seq_only | Only needed for RNNs. Boolean indicating whether to return the output for the final timestep only (True) or for all timesteps (False). | True |
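
As an illustration of the less common arguments above, here is a hedged sketch that embeds a categorical column and restricts the output range on a PyTorch NN; exactly how input_dim interacts with embedded columns is an assumption here, so treat the numbers as illustrative and see the Colab notebook for the precise semantics:

from nn_builder.pytorch.NN import NN

# Illustrative only: column 0 of the input is assumed to be a categorical
# variable with 4 possible values, embedded into 3 dimensions before the
# hidden layers; outputs are restricted to the range (0.0, 1.0) and the
# random seed is fixed for reproducibility.
model = NN(input_dim=6, layers_info=[32, 32, 1], hidden_activations="relu",
           columns_of_data_to_be_embedded=[0],
           embedding_dimensions=[[4, 3]],
           y_range=(0.0, 1.0),
           dropout=0.1, initialiser="xavier", random_seed=1)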

Each network type has slightly different requirements for input_dim and layers_info as explained below:


1. NN

from nn_builder.pytorch.NN import NN   
model = NN(input_dim=5, layers_info=[10, 10, 1], output_activation=None, hidden_activations="relu", 
           dropout=0.0, initialiser="xavier", batch_norm=False)            
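
Assuming the final entry of layers_info is the output layer, a quick forward pass looks like this (the batch size of 8 is arbitrary):

import torch

x = torch.randn(8, 5)   # batch of 8 examples with 5 features, matching input_dim=5
y = model(x)            # expected shape: (8, 1), the final entry in layers_info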

2. CNN

from nn_builder.pytorch.CNN import CNN   
model = CNN(input_dim=(3, 64, 64), 
            layers_info=[["conv", 32, 3, 1, 0], ["maxpool", 2, 2, 0], 
                         ["conv", 64, 3, 1, 2], ["avgpool", 2, 2, 0], 
                         ["linear", 10]],
            hidden_activations="relu", output_activation="softmax", dropout=0.0,
            initialiser="xavier", batch_norm=True)

3. RNN

from nn_builder.pytorch.RNN import RNN
model = RNN(input_dim=5, layers_info=[["gru", 50], ["lstm", 10], ["linear", 2]],
            hidden_activations="relu", output_activation="softmax", 
            batch_norm=False, dropout=0.0, initialiser="xavier")
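
Assuming batch-first input of shape (batch, timesteps, features), a quick forward pass looks like this; with the default return_final_seq_only=True only the final timestep's output is returned:

import torch

x = torch.randn(4, 12, 5)   # batch of 4 sequences, 12 timesteps, 5 features (input_dim=5)
y = model(x)                # expected shape: (4, 2) with return_final_seq_only=True (the default)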

Contributing

Anyone is very welcome to contribute via a pull request. Please see the issues page for ideas on the best areas to contribute to and try to:

  1. Add tests to the tests folder that cover any code you write
  2. Write comments for every function
  3. Create a Colab notebook demonstrating how any extra functionality you created works

To help you remember the things you learn about machine learning in general, check out Gizmo.