Hacker News

There are smaller models that you can freely play around with that work in roughly the same way. If you're working on a fairly regular computer, some reasonable options are GPT-2 or GPT-Neo. These can both perform inference on your local CPU if you have 8GB or more of RAM.

They are much less powerful than GPT-3, but they can still be fun for simple text generation or NLP tasks. You can play around with one of the smaller GPT-Neo models that should fit in RAM if run locally here:

https://huggingface.co/EleutherAI/gpt-neo-1.3B

That page includes instructions to run this locally in Python.
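For a rough idea of what that looks like, here is a minimal sketch using the Hugging Face `transformers` library (the prompt string and generation parameters are just illustrative; it assumes you've run `pip install transformers torch`, and the first call downloads several GB of model weights):

```python
# Sketch: local text generation with GPT-Neo 1.3B via transformers
from transformers import pipeline

# Loads EleutherAI/gpt-neo-1.3B; runs on CPU if no GPU is available
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

result = generator(
    "The main advantage of running a model locally is",
    max_length=50,    # total tokens, prompt included
    do_sample=True,   # sample rather than greedy decoding
    temperature=0.9,
)
print(result[0]["generated_text"])
```

Generation on CPU is slow (tens of seconds for a short completion), so it's better suited to experimentation than anything interactive.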

As others have mentioned, there are larger models available, but they tend to be expensive to set up and use as an individual.
