# Happy Transformer

Documentation and news: happytransformer.com

Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.
## 3.0.0

Version 3.0.0 introduces:
- DeepSpeed for training
- Apple's MPS for training and inference
- WandB to track training runs
- Automatic splitting of the supplied training data into training and evaluation portions
- Pushing models directly to Hugging Face's Model Hub

Read about the full 3.0.0 update, including breaking changes, here. A sketch of the new features in use is shown below.
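
The following is a minimal sketch of how these features could fit together when fine-tuning a text generation model. The argument names `eval_ratio`, `deepspeed`, and `report_to`, the `push()` method, and the file `train.txt` are assumptions for illustration; check the 3.0.0 documentation for the exact API.

```python
from happytransformer import HappyGeneration, GENTrainArgs

# Load a GPT-2 model for text generation.
happy_gen = HappyGeneration("GPT-2", "gpt2")

# Argument names below are assumptions based on the feature list above.
args = GENTrainArgs(
    num_train_epochs=1,
    eval_ratio=0.1,        # portion of train.txt held out for evaluation (assumption)
    deepspeed="ZERO-2",    # enable DeepSpeed during training (assumption)
    report_to=("wandb",),  # log the run to Weights & Biases (assumption)
)
happy_gen.train("train.txt", args=args)

# Push the fine-tuned model to Hugging Face's Model Hub (assumed method name).
happy_gen.push("your-username/your-model")
```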
## Tasks

| Task                     | Inference | Training |
|--------------------------|-----------|----------|
| Text Generation          | ✔         | ✔        |
| Text Classification      | ✔         | ✔        |
| Word Prediction          | ✔         | ✔        |
| Question Answering       | ✔         | ✔        |
| Text-to-Text             | ✔         | ✔        |
| Next Sentence Prediction | ✔         |          |
| Token Classification     | ✔         |          |
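
Each task in the table maps to a dedicated class: HappyGeneration, HappyTextClassification, HappyWordPrediction, HappyQuestionAnswering, HappyTextToText, HappyNextSentencePrediction, and HappyTokenClassification. As a quick illustration, here is question answering with the library's default model; the printed output is illustrative.

```python
from happytransformer import HappyQuestionAnswering

# With no arguments, the library's default question answering model is loaded.
happy_qa = HappyQuestionAnswering()

result = happy_qa.answer_question(
    "Happy Transformer is maintained by Eric Fillion and Ted Brownlow.",  # context
    "Who maintains Happy Transformer?",                                   # question
)
print(result[0].answer)  # e.g. Eric Fillion and Ted Brownlow
```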
## Quick Start

```sh
pip install happytransformer
```

```python
from happytransformer import HappyWordPrediction

# The default model is distilbert-base-uncased.
happy_wp = HappyWordPrediction()
result = happy_wp.predict_mask("I think therefore I [MASK]")
print(result)  # [WordPredictionResult(token='am', score=0.10172799974679947)]
print(result[0].token)  # am
```
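
`predict_mask()` also accepts `targets` and `top_k` parameters. The snippet below extends the Quick Start example; the scores it prints will vary by model.

```python
# Restrict predictions to candidate tokens and return more than one result.
results = happy_wp.predict_mask(
    "I think therefore I [MASK]",
    targets=["am", "exist"],  # only score these candidate tokens
    top_k=2,                  # return the top two results
)
for result in results:
    print(result.token, result.score)
```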
## Maintainers

- Eric Fillion, Lead Maintainer
- Ted Brownlow, Maintainer
## Tutorials

- Text generation with training (GPT-Neo)
- Text classification (training)
- Text classification (hate speech detection)
- Text classification (sentiment analysis)