<h1 >llm.go</h1> <div > </div>A GPT-2 implementation written in Go, using only the standard library.
## 🪞 Quick start

Install the Python dependencies and generate the tokenized dataset:

```sh
make setup
```
Run the training script:

```sh
make train
```

This runs `go run ./cmd/traingpt2/main.go`.
Run the testing script:

```sh
make test
```

This runs `go run ./cmd/testgpt2/main.go`.
## TODO

- Fix input tokenization: the current implementation is incorrect. It needs to do BPE pair matching, not trie lookups.
- Improve performance; training is currently very slow.
- It runs in WASM, but adding WebGPU bindings might be fun.
- More refactoring.
- Run as a CLI.
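The pair-matching fix mentioned in the TODO list could look something like the sketch below. This is a minimal, hypothetical illustration of BPE pair merging with a toy merge table; the real GPT-2 tokenizer loads roughly 50,000 ranked merges and also handles byte-level pre-tokenization, which is omitted here.

```go
package main

import "fmt"

// bpeMerge repeatedly merges the adjacent token pair with the lowest
// merge rank until no rankable pair remains. ranks maps "left right"
// to a priority, where a lower value merges first, mirroring the
// ordering of GPT-2's merges file.
func bpeMerge(tokens []string, ranks map[string]int) []string {
	for {
		best, bestRank := -1, int(^uint(0)>>1) // sentinel: max int
		for i := 0; i+1 < len(tokens); i++ {
			if r, ok := ranks[tokens[i]+" "+tokens[i+1]]; ok && r < bestRank {
				best, bestRank = i, r
			}
		}
		if best < 0 {
			return tokens // no mergeable pair left
		}
		merged := tokens[best] + tokens[best+1]
		tokens = append(tokens[:best+1], tokens[best+2:]...)
		tokens[best] = merged
	}
}

func main() {
	// Toy merge table (hypothetical ranks, not the real GPT-2 vocabulary).
	ranks := map[string]int{"l l": 0, "ll o": 1, "h e": 2, "he llo": 3}
	fmt.Println(bpeMerge([]string{"h", "e", "l", "l", "o"}, ranks))
	// → [hello]
}
```

Applying merges in rank order (rather than walking a trie) is what makes the result match GPT-2's tokenization, since later merges depend on earlier ones having already collapsed their pairs.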
## 🖋️ License <a name = "license"></a>

See LICENSE for more details.
## 🎉 Acknowledgements <a name = "acknowledgement"></a>

- This is a fork of Andrej Karpathy's llm.c, rewritten in pure Go.