thbert

Yet another pre-trained model for Thai BERT

BERT is a pre-trained unsupervised natural language processing model that can be fine-tuned to perform a wide range of downstream NLP tasks with strong results.

To enable research opportunities in Thai computational linguistics, where resources remain scarce, we introduce Thai BERT, a fundamental language resource built from scratch for researchers and enthusiasts.
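To illustrate the kind of downstream fine-tuning described above, here is a minimal sketch using the Hugging Face transformers library. The model path, label count, and example sentence are hypothetical, and the sketch assumes the thbert checkpoint has already been converted to a transformers-compatible format (see the loading sketch in the next section).

```python
import torch
from transformers import BertForSequenceClassification, BertTokenizer

# Hypothetical local path to a converted thbert checkpoint.
MODEL_PATH = "./thbert"

tokenizer = BertTokenizer.from_pretrained(MODEL_PATH)
model = BertForSequenceClassification.from_pretrained(MODEL_PATH, num_labels=2)

# Encode one (hypothetical) Thai sentence for a binary classification task.
inputs = tokenizer("สวัสดีครับ", return_tensors="pt")
labels = torch.tensor([1])

# One fine-tuning step: forward pass, loss, backward pass, parameter update.
model.train()
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
```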

Pre-trained models

Each .zip file contains three items:

- A TensorFlow checkpoint (bert_model.ckpt) containing the pre-trained weights
- A vocab file (vocab.txt) mapping WordPiece tokens to word IDs
- A config file (bert_config.json) specifying the hyperparameters of the model
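As a sketch of how these three files fit together, the snippet below loads them with the transformers library. The directory name is hypothetical (it assumes the .zip has been extracted to ./thbert), and converting the TensorFlow checkpoint via `from_tf=True` additionally requires TensorFlow to be installed.

```python
from transformers import BertConfig, BertModel, BertTokenizer

# Hypothetical paths; assumes the .zip has been extracted to ./thbert.
config = BertConfig.from_json_file("./thbert/bert_config.json")
tokenizer = BertTokenizer(vocab_file="./thbert/vocab.txt", do_lower_case=False)

# Load the TensorFlow checkpoint weights into a PyTorch model.
model = BertModel.from_pretrained(
    "./thbert/bert_model.ckpt.index", from_tf=True, config=config
)
model.eval()
```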

Pre-training data

Source

Tokenization