nanoGPT: The simplest repository for training medium-sized GPTs (github.com/karpathy)
3 points by isoprophlex on Jan 3, 2023 | 1 comment


"The code itself aims by design to be plain and readable: train.py is a ~300-line boilerplate training loop and model.py a ~300-line GPT model definition, which can optionally load the GPT-2 weights from OpenAI."

Highly educational codebase if you're interested in learning the practical details of training and finetuning your own GPTs.
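The "boilerplate training loop" the README describes follows the standard forward pass → loss → backward pass → parameter update pattern. As a minimal sketch of that pattern, here is a toy linear model trained with plain-Python gradient descent (not nanoGPT's actual PyTorch code, which trains a real GPT):

```python
def train(steps=200, lr=0.05):
    # Toy dataset: y = 2x + 1 (a stand-in for the real token batches)
    data = [(x, 2 * x + 1) for x in range(-5, 6)]
    w, b = 0.0, 0.0  # model parameters
    for _ in range(steps):
        grad_w = grad_b = 0.0
        for x, y in data:
            # forward pass
            pred = w * x + b
            err = pred - y
            # backward pass: accumulate gradients of squared error
            grad_w += 2 * err * x
            grad_b += 2 * err
        # optimizer step (plain SGD on the mean gradient)
        n = len(data)
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b

w, b = train()  # converges to roughly w=2, b=1
```

The real train.py wraps this same loop with batching, learning-rate scheduling, checkpointing, and mixed-precision details, which is what makes it worth reading.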




