llm.go


A GPT-2 implementation written in Go, using only the standard library.

🪞 Quick start

Install the Python dependencies and produce the tokenized dataset:

make setup

Run the training script:

make train

This runs go run ./cmd/traingpt2/main.go

Run the testing script:

make test

This runs go run ./cmd/testgpt2/main.go

TODO

  • Fix input-text tokenization: the current implementation is incorrect; it should use byte-pair merging rather than tries.
  • Improve performance; it is currently very slow.
  • It runs in WASM, but adding WebGPU bindings might be fun.
  • More refactoring.
  • Run as a CLI.

🖋️ License

See LICENSE for more details.

🎉 Acknowledgements

  • This is a fork of Andrej Karpathy's llm.c, rewritten in pure Go.