KoBERT-NER

Dependencies

Dataset

How to use KoBERT with the Huggingface Transformers library

from transformers import BertModel
from tokenization_kobert import KoBertTokenizer  # tokenization_kobert.py is included in this repo

model = BertModel.from_pretrained('monologg/kobert')
tokenizer = KoBertTokenizer.from_pretrained('monologg/kobert')
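
As a quick check that the model and tokenizer load correctly, a minimal sketch along the following lines should work (the Korean example sentence is our own; KoBertTokenizer follows the standard Huggingface tokenizer interface, so tokenize and encode behave as usual):

import torch
from transformers import BertModel
from tokenization_kobert import KoBertTokenizer

model = BertModel.from_pretrained('monologg/kobert')
tokenizer = KoBertTokenizer.from_pretrained('monologg/kobert')

text = "한국어 개체명 인식을 해봅시다."  # example sentence (ours, not from the repo)
tokens = tokenizer.tokenize(text)  # SentencePiece subword tokens
input_ids = torch.tensor([tokenizer.encode(text, add_special_tokens=True)])  # adds [CLS]/[SEP]

with torch.no_grad():
    outputs = model(input_ids)

print(tokens)
print(outputs[0].shape)  # last hidden states: (1, sequence_length, hidden_size)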

Usage

$ python3 main.py --model_type kobert --do_train --do_eval

Prediction

$ python3 predict.py --input_file {INPUT_FILE_PATH} --output_file {OUTPUT_FILE_PATH} --model_dir {SAVED_CKPT_PATH}
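
For example, with hypothetical file names and a hypothetical checkpoint directory (replace them with your own paths):

$ python3 predict.py --input_file sample_pred_in.txt --output_file sample_pred_out.txt --model_dir ./model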

Results

Model               Slot F1 (%)
------------------  -----------
KoBERT              86.11
DistilKoBERT        84.13
Bert-Multilingual   84.20
CNN-BiLSTM-CRF      74.57

References