Neural Arithmetic Logic Units

[WIP]

Overview

This is the code for this video on YouTube by Siraj Raval on Neural Arithmetic Logic Units. Credit for this code goes to kevinzakka.

This is a PyTorch implementation of *Neural Arithmetic Logic Units* by Andrew Trask, Felix Hill, Scott Reed, Jack Rae, Chris Dyer, and Phil Blunsom.

<p align="center"> <img src="./imgs/arch.png" alt="Drawing", width=60%> </p>

API

```python
from models import *

# single-layer modules
NeuralAccumulatorCell(in_dim, out_dim)
NeuralArithmeticLogicUnitCell(in_dim, out_dim)

# stacked layers
NAC(num_layers, in_dim, hidden_dim, out_dim)
NALU(num_layers, in_dim, hidden_dim, out_dim)
```
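As a quick sanity check, here is a usage sketch built from the constructor signatures above. The toy task, shapes, and training loop are illustrative assumptions, not part of the repo; it assumes the stacked `NALU` is a standard `nn.Module` called as `model(x)`:

```python
import torch
from models import NALU

# toy batch: learn y = a + b from 2-D inputs (illustrative setup)
x = torch.randn(64, 2)
y = x.sum(dim=1, keepdim=True)

model = NALU(num_layers=2, in_dim=2, hidden_dim=16, out_dim=1)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    optim.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    optim.step()
```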

Experiments

To reproduce "Numerical Extrapolation Failures in Neural Networks" (Section 1.1), run:

```
python failures.py
```

This should generate the following plot:

<p align="center"> <img src="./imgs/extrapolation.png" alt="Drawing", width=60%> </p>

To reproduce "Simple Function Learning Tasks" (Section 4.1), run:

```
python function_learning.py
```

This should generate a text file called `interpolation.txt` with the results below (currently only interpolation is supported; extrapolation is still in progress).

|         | Relu6  | None   | NAC   | NALU   |
|---------|--------|--------|-------|--------|
| a + b   | 4.472  | 0.132  | 0.154 | 0.157  |
| a - b   | 85.727 | 2.224  | 2.403 | 34.610 |
| a * b   | 89.257 | 4.573  | 5.382 | 1.236  |
| a / b   | 97.070 | 60.594 | 5.730 | 3.042  |
| a ^ 2   | 89.987 | 2.977  | 4.718 | 1.117  |
| sqrt(a) | 5.939  | 40.243 | 7.263 | 1.119  |
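For context, "Relu6" and "None" refer to the activation of a baseline MLP, compared against the NAC and NALU models. In the task setup of Section 4.1 of the paper, each input is a random vector, `a` and `b` are sums over two fixed (possibly overlapping) slices of it, and the target is `a op b`. A minimal sketch of that data generation follows; the dimensions and slice boundaries here are arbitrary assumptions, not the repo's settings:

```python
import torch

def make_batch(op, batch_size: int = 64, dim: int = 100):
    # each input is a random vector; a and b are sums over fixed sub-slices
    x = torch.rand(batch_size, dim) * 10
    a = x[:, 0:40].sum(dim=1, keepdim=True)
    b = x[:, 30:70].sum(dim=1, keepdim=True)  # slices may overlap, as in the paper
    return x, op(a, b)

x, y = make_batch(torch.add)  # targets for the "a + b" row
x, y = make_batch(torch.mul)  # targets for the "a * b" row
```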