# Transformers-For-Negation-and-Speculation
- Transformers_for_Negation_and_Speculation.ipynb: the code for the following papers (an illustrative sketch of the token-classification setup follows this list):
  - NegBERT: A Transfer Learning Approach for Negation Detection and Scope Resolution (published at LREC 2020)
  - Resolving the Scope of Speculation and Negation using Transformer-Based Architectures
- Multitask_Learning_of_Negation_and_Speculation.ipynb: the code for the following paper:
  - Multitask Learning of Negation and Speculation using Transformers (published at LOUHI 2020)
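These papers frame negation/speculation cue detection and scope resolution as token classification over a transformer encoder. The sketch below shows that general setup with the Hugging Face Transformers API; it is a minimal illustration only, and the model name, label set, and example sentence are assumptions rather than the exact configuration used in the notebooks.

```python
# Minimal sketch (not the authors' exact code): scope resolution framed as
# token classification with a pretrained transformer encoder.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL_NAME = "bert-base-uncased"   # assumed; RoBERTa/XLNet variants can be swapped in
LABELS = ["O", "CUE", "SCOPE"]     # hypothetical label set for cue and scope tokens

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=len(LABELS))

sentence = "I do not think this will work."
inputs = tokenizer(sentence, return_tensors="pt")

# One label prediction per (sub)token.
with torch.no_grad():
    logits = model(**inputs).logits            # shape: (1, seq_len, num_labels)
predictions = logits.argmax(dim=-1).squeeze(0)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"].squeeze(0))
for token, label_id in zip(tokens, predictions.tolist()):
    print(f"{token}\t{LABELS[label_id]}")
```

Note that the model above is untrained for this task; in practice it would be fine-tuned on a negation/speculation corpus before its predictions are meaningful.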
Code from the following sources was used:
- The starter code was taken from this article on Named Entity Recognition with BERT.
- To implement XLNetForTokenClassification and RobertaForTokenClassification for the Transformers library by Hugging Face, some code was copied from that code base.
- The Early Stopping implementation for PyTorch was adapted from Bjarten's implementation (GitHub); a minimal sketch follows this list.
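The sketch below illustrates the early-stopping pattern referred to above. It is an assumption-based adaptation in the spirit of Bjarten's early-stopping implementation, not a verbatim copy of the code used in the notebooks; the `patience`, `delta`, and checkpoint path values are illustrative.

```python
# Minimal early-stopping sketch (illustrative, not the exact code used here).
import numpy as np
import torch

class EarlyStopping:
    """Stop training when validation loss has not improved for `patience` epochs."""

    def __init__(self, patience=5, delta=0.0, path="checkpoint.pt"):
        self.patience = patience
        self.delta = delta
        self.path = path
        self.counter = 0
        self.best_loss = np.inf
        self.early_stop = False

    def __call__(self, val_loss, model):
        if val_loss < self.best_loss - self.delta:
            # Improvement: checkpoint the model and reset the counter.
            self.best_loss = val_loss
            torch.save(model.state_dict(), self.path)
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.early_stop = True

# Usage inside a training loop (hypothetical `evaluate` helper):
# early_stopping = EarlyStopping(patience=3)
# for epoch in range(num_epochs):
#     val_loss = evaluate(model, val_loader)
#     early_stopping(val_loss, model)
#     if early_stopping.early_stop:
#         break
```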
The Colab notebooks can be executed directly from Google Colaboratory.
Datasets:
Contributors: Aditya Khandelwal, Benita Kathleen Britto