Deprecated!

Maintenance of this project has moved to the AllenNLP framework, where you can use the model and an online demo. This thin wrapper may still be useful if you want to run the pretrained model.
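
If you just want extractions from the pretrained model, the AllenNLP route is the simplest. A minimal sketch (the model archive path is a placeholder, and the exact package layout depends on your AllenNLP version):

```
from allennlp.predictors.predictor import Predictor

# Placeholder path: use the Open IE model archive linked from the AllenNLP
# demo/docs. On newer AllenNLP versions the model code lives in the separate
# allennlp-models package, which must be installed for the archive to load.
predictor = Predictor.from_path("path/or/url/to/openie-model.tar.gz")

# The Open IE predictor takes a raw sentence; each detected predicate comes
# back with a BIO tagging of its arguments (field names here follow the demo's
# JSON output and may vary across versions).
output = predictor.predict_json({"sentence": "John gave Mary a book in the park."})
for extraction in output["verbs"]:
    print(extraction["description"])
```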

supervised-oie

Code for training a supervised Neural Open IE model, as described in our NAACL 2018 paper.<br> :construction: Still under construction :construction:

Citing :bookmark:

If you use this software, please cite:

```
@InProceedings{Stanovsky2018NAACL,
  author    = {Gabriel Stanovsky and Julian Michael and Luke Zettlemoyer and Ido Dagan},
  title     = {Supervised Open Information Extraction},
  booktitle = {Proceedings of The 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL HLT)},
  month     = {June},
  year      = {2018},
  address   = {New Orleans, Louisiana},
  publisher = {Association for Computational Linguistics},
  pages     = {(to appear)},
}
```

Quickstart :hatching_chick:

  1. Install requirements :bow:
```
pip install -r requirements.txt
```
  1. Download embeddings :walking:
```
cd ./pretrained_word_embeddings/
./download_external.sh
```
  1. Train model :running:
```
cd ./src
python ./rnn/confidence_model.py --train=../data/train.conll --dev=../data/dev.conll --test=../data/test.conll --load_hyperparams=../hyperparams/confidence.json
```

NOTE: Models are saved by default to the `models` dir, unless a `--saveto` command line argument is passed. See `confidence_model.py` for more details.

  1. Predict with a trained model :clap:
```
python ./trained_oie_extractor.py \
    --model=path/to/model \
    --in=path/to/raw/sentences \
    --out=path/to/output/file \
    --conll
```
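
If you prefer to drive prediction from Python, a small driver might look like the sketch below. The file names are placeholders and the one-raw-sentence-per-line input format is an assumption; see `trained_oie_extractor.py` for the authoritative interface.

```
import subprocess
from pathlib import Path

# Assumed input format: one raw sentence per line (file names are placeholders).
sentences = [
    "John gave Mary a book in the park.",
    "Barack Obama was born in Hawaii.",
]
Path("my_sentences.txt").write_text("\n".join(sentences) + "\n")

# Invoke the extractor from src/; --model should point at a trained model
# (see the training step above), and --conll requests CoNLL-style output.
subprocess.run(
    [
        "python", "./trained_oie_extractor.py",
        "--model=path/to/model",
        "--in=my_sentences.txt",
        "--out=my_extractions.conll",
        "--conll",
    ],
    check=True,
)

print(Path("my_extractions.conll").read_text())
```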

More scripts :bicyclist:

See src/scripts for more handy scripts. Additional documentation coming soon!