#NNBlocks

NNBlocks is a framework to build and train any kind of exotic neural network architecture. It is made to be:

Please refer to the User Guide for more information on installation and usage.

###Example:
Let's say you have 3 input vectors as the rows of a matrix. This will be the NN's input. You want to pass it through a multilayer perceptron, then through a recurrent neural network, and finally take the last vector of the result.

####Code:

import nnb
import numpy as np

#Three input vectors of size 3, arranged as the rows of a 3x3 matrix
input_vecs = np.arange(9)
input_vecs.resize(3, 3)

input_layer = nnb.InputLayer(ndim=2)
p_layer1 = nnb.PerceptronLayer(insize=3, outsize=5)
p_layer2 = nnb.PerceptronLayer(insize=5, outsize=3)
rnn = nnb.RecurrentNeuralNetwork(insize=3, outsize=3)

#Join everything. rnn[-1] selects the last vector of the RNN's output sequence.
final_model = input_layer | p_layer1 | p_layer2 | rnn[-1]

#Compiling the feedforward function
feedforward = final_model.compile()
feedforward(input_vecs)
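
To make it clearer what this pipeline computes, here is a rough, self-contained NumPy sketch of the same data flow. The weights are random stand-ins for the parameters NNBlocks would learn, tanh is only an assumed activation, and the simple recurrent loop illustrates the idea rather than NNBlocks' actual implementation:

import numpy as np

#Random stand-in weights (in NNBlocks these are learned parameters)
W1, b1 = np.random.random((5, 3)), np.zeros(5)               #perceptron 3 -> 5
W2, b2 = np.random.random((3, 5)), np.zeros(3)               #perceptron 5 -> 3
Wx, Wh = np.random.random((3, 3)), np.random.random((3, 3))  #recurrent weights

def feedforward_sketch(vecs):
    h = np.tanh(vecs.dot(W1.T) + b1)      #first perceptron layer
    h = np.tanh(h.dot(W2.T) + b2)         #second perceptron layer
    state = np.zeros(3)
    for v in h:                           #simple recurrent pass over the rows
        state = np.tanh(v.dot(Wx.T) + state.dot(Wh.T))
    return state                          #last vector of the recurrent output

feedforward_sketch(np.arange(9).reshape(3, 3))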

###Another (more advanced) example:
Now let's implement a Matrix-Vector Recursive Neural Network (MV-RNN), as described in Socher et al.'s work (http://www.aclweb.org/anthology/D12-1110). This example is considerably more involved than the last one, but it is a great demonstration of what NNBlocks can do. First of all, let's understand the architecture:

Now let's go through the code step by step:

import nnb
import numpy as np

#15 word vectors of size 5
word_vecs = np.random.random(size=(15,5))
#15 word matrices of size 5x5
word_mats = np.random.random(size=(15,5,5))

Here we just initialize some of our network's parameters: word_vecs holds one size-5 vector for each of the 15 words in our toy vocabulary, and word_mats holds the corresponding 5x5 matrix for each word.

The rest of the parameters will be handled by NNBlocks' models. We need to declare these parameters ourselves because they don't fit into any of NNBlocks' already-implemented models. This may sound like a drawback, but it's actually a big plus: you can easily plug your own parameters into the network.
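For instance, since these parameters are plain NumPy arrays, nothing stops us from initializing them from pre-trained values before wiring them into the network. A purely hypothetical sketch (the file name is made up for illustration):

import numpy as np

#Hypothetical: initialize the word vectors from pre-trained embeddings
#instead of random values. 'my_embeddings.npy' is a made-up file name.
pretrained = np.load('my_embeddings.npy')   #expected shape: (15, 5)
word_vecs = pretrained.astype('float64')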

comp_tree = nnb.InputLayer(ndim=2, dtype='int32')
word_index = nnb.InputLayer(ndim=1, dtype='int32')

Now we declare our network's inputs: comp_tree will receive the composition tree and word_index will receive the indices of the words in the sentence.

Now we can start building the actual model. We will first focus on the model that performs the composition at each node of the tree.

#Vector composition: each word vector is transformed by the other word's
#matrix, the two results are concatenated and passed through a perceptron layer
words_comp = \
    nnb.CustomModel(fn=lambda x, X, y, Y: [x.dot(Y), y.dot(X)]) | \
    nnb.ConcatenationModel(axis=0) | \
    nnb.PerceptronLayer(insize=10, outsize=5)

#Matrix composition: slice out the two word matrices, concatenate them and
#pass them through a linear layer (identity activation)
matrix_comp = \
    nnb.SliceModel(slice=[1, 3]) | \
    nnb.ConcatenationModel(axis=1) | \
    nnb.PerceptronLayer(insize=10, outsize=5, activation_func=lambda x: x)

#The full composition model computes both outputs in parallel
comp_model = words_comp & matrix_comp

Now comp_model, given two word vectors and their word matrices, will output a composed word vector and a composed word matrix. The RecursiveNeuralNetwork will take care of feeding these inputs to our model at each node of the tree. Again, if you want to understand how the composition works, read the original paper.
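To make the composition concrete, here is a plain NumPy sketch of what comp_model computes for a single pair of nodes. W, b, W_M and b_M stand in for the parameters the two PerceptronLayers learn, and tanh is only an assumed activation; this illustrates the math, not NNBlocks' internals:

import numpy as np

def mv_rnn_compose(x, X, y, Y, W, b, W_M, b_M):
    #Vector composition: transform each word vector by the other word's
    #matrix, concatenate the results (size 10) and apply a perceptron layer
    vec = np.tanh(W.dot(np.concatenate([x.dot(Y), y.dot(X)])) + b)
    #Matrix composition: concatenate the two 5x5 matrices side by side
    #(5x10) and apply a linear layer row-wise
    mat = np.concatenate([X, Y], axis=1).dot(W_M.T) + b_M
    return vec, mat

#Example call with random stand-in values
x, y = np.random.random(5), np.random.random(5)
X, Y = np.random.random((5, 5)), np.random.random((5, 5))
W, b = np.random.random((5, 10)), np.zeros(5)
W_M, b_M = np.random.random((5, 10)), np.zeros(5)
new_vec, new_mat = mv_rnn_compose(x, X, y, Y, W, b, W_M, b_M)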

Now the rest of the code:

rnn = nnb.RecursiveNeuralNetwork(comp_model=comp_model)
#Pickers map a word index to that word's vector/matrix parameters
vec_picker = nnb.Picker(choices=word_vecs)
matrix_picker = nnb.Picker(choices=word_mats)

#Feed the composition tree, the word vectors and the word matrices
#to the recursive network
rnn_inputs = comp_tree & (word_index | vec_picker) & (word_index | matrix_picker)
all = rnn_inputs | rnn
#We are interested only in the composed vectors
composed_vectors = all[0]

#If we wanted to put a softmax layer on top of each node:
#classifications = composed_vectors | nnb.SoftmaxLayer(insize=5, outsize=NUM_CLASSES)

At last, our model is finished. Now we can use NNBlocks' tools to train the network!