
A neural machine translation model written in PyTorch.

For an up-to-date PyTorch implementation of basic vanilla attentional NMT, please refer to this repo

With a 256-dimensional LSTM hidden size, it achieves a training speed of 14,000 words/sec and a 26.9 BLEU score on the IWSLT 2014 German-English dataset (Ranzato et al., 2015).
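The core of a vanilla attentional NMT decoder is the attention step: at each target position, the decoder state scores every encoder state, the scores are softmax-normalized, and the encoder states are averaged into a context vector. The sketch below illustrates Luong-style dot-product attention in plain Python; it is not this repo's implementation, and all function and variable names here are hypothetical.

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of raw scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot_product_attention(decoder_state, encoder_states):
    """Illustrative Luong-style global attention.

    Scores each source position by a dot product with the current decoder
    state, normalizes with softmax, and returns the attention-weighted sum
    of encoder states (the context vector) plus the weights themselves.
    """
    scores = [sum(d * e for d, e in zip(decoder_state, enc))
              for enc in encoder_states]
    weights = softmax(scores)
    dim = len(decoder_state)
    context = [sum(w * enc[i] for w, enc in zip(weights, encoder_states))
               for i in range(dim)]
    return context, weights
```

In a real model the context vector is concatenated with the decoder state and fed through a projection layer before predicting the next target word; here the arithmetic is spelled out with plain lists purely for clarity.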

File Structure

Usage

First build the vocabulary files, then launch training with either the maximum-likelihood (MLE) or the reward-augmented maximum likelihood (RAML) objective:

```bash
python vocab.py
. scripts/run_mle.sh
. scripts/run_raml.sh
```
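RAML training (Norouzi et al., 2016) replaces the ground-truth target with samples drawn from a reward-based distribution centered on it; a common approximation perturbs the reference by random token substitutions, so the expected Hamming distance is controlled by the substitution rate. The sketch below is illustrative only and is not the repo's sampling code; `perturb_reference` and its parameters are hypothetical names.

```python
import random

def perturb_reference(tokens, vocab, sub_prob=0.1, rng=None):
    """Crude RAML-style sampler (illustrative, not the repo's code).

    Independently replaces each target token with a random vocabulary word
    with probability `sub_prob`, yielding a sample whose expected Hamming
    distance from the reference is sub_prob * len(tokens).
    """
    rng = rng or random.Random()
    return [rng.choice(vocab) if rng.random() < sub_prob else t
            for t in tokens]
```

Training then proceeds as ordinary MLE on the perturbed samples, optionally weighting each sample by its reward.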

TODO:

License

This work is licensed under a Creative Commons Attribution 4.0 International License.