arc-pytorch

PyTorch implementation of Attentive Recurrent Comparators by Shyam et al.

A blog explaining Attentive Recurrent Comparators

Visualizing Attention

On Same Characters

<img src="visualization/16_4_4_256/sim1.gif" width="400"> <img src="visualization/16_4_4_256/sim2.gif" width="400">

On Different Characters

<img src="visualization/16_4_4_256/dis1.gif" width="400"> <img src="visualization/16_4_4_256/dis2.gif" width="400">

How to run?

Download data

python download_data.py

A one-time 52 MB download; it shouldn't take more than a few minutes.

Train

python train.py --cuda

Let it train until the accuracy rises to at least 80%. Early stopping is not implemented yet, so you will have to stop the process manually.
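
Until early stopping is implemented, one option is to wrap whatever periodic validation check you run in a small helper like the sketch below and kill the run when it fires. This helper is not part of the repository; the class name, the 80% target, and the patience value are assumptions for illustration.

```python
# A generic early-stopping helper one could wire into train.py's validation loop.
# Not part of this repository; the target and patience values are illustrative.
class EarlyStopper:
    """Stop when accuracy reaches a target or stops improving for `patience` checks."""

    def __init__(self, target=0.80, patience=5):
        self.target = target
        self.patience = patience
        self.best = 0.0
        self.stale = 0

    def should_stop(self, accuracy):
        if accuracy >= self.target:
            return True  # reached the target accuracy
        if accuracy > self.best:
            self.best, self.stale = accuracy, 0
        else:
            self.stale += 1
        return self.stale >= self.patience  # no improvement for too long


if __name__ == "__main__":
    stopper = EarlyStopper()
    for acc in [0.55, 0.61, 0.59, 0.72, 0.81]:  # e.g. accuracies from periodic validation
        if stopper.should_stop(acc):
            print(f"stopping at accuracy {acc:.2f}")
            break
```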

Visualize

python viz.py --cuda --load 0.13591022789478302 --same

Run viz.py with exactly the same parameters as train.py and specify the model to load. Pass --same if you want to generate a sample with the same character in both images. The script dumps images to a directory inside visualization/; the directory name is taken from the --name parameter if specified, otherwise it is derived from the network's parameters.
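
Because viz.py must be launched with the same parameters as train.py, one way to keep the two scripts in sync is a shared argument parser, sketched below. This is a hypothetical layout rather than the repository's current code; only the --cuda, --load, --same, and --name flags are taken from the commands shown above, and any other arguments would mirror whatever train.py defines.

```python
# A hypothetical shared parser that train.py and viz.py could both import,
# so the two scripts never drift apart. Only the flags shown in this README
# are listed; the real scripts may define more (and different) arguments.
import argparse


def get_parser():
    parser = argparse.ArgumentParser(description="Options shared by train.py and viz.py")
    parser.add_argument("--cuda", action="store_true", help="run on the GPU")
    parser.add_argument("--load", type=str, default=None, help="identifier of the saved model to load")
    parser.add_argument("--same", action="store_true", help="sample a pair with the same character in both images")
    parser.add_argument("--name", type=str, default=None, help="output directory name under visualization/")
    return parser


if __name__ == "__main__":
    args = get_parser().parse_args()
    print(args)
```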