
Odd analogy to use with Microsoft Windows, since GPT-3's source is available, along with a series of papers that enable anyone with the money and expertise to implement it themselves.


The reasons why MS Windows and GPT-3 cannot easily be modified by anyone are different, but the result is the same: you're stuck with what you're sold.

To clarify: MS Windows is closed-source, but you can't very well train a large GPT model unless you have the resources of OpenAI. So you're stuck with whatever they choose to train and make available to you.
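A back-of-envelope calculation makes the resource barrier concrete. Using the standard approximation of 6 FLOPs per parameter per training token, and the figures from the GPT-3 paper (175B parameters, ~300B tokens), a rough sketch of the training cost looks like this (the GPU throughput and hourly price are illustrative assumptions, not quotes):

```python
# Back-of-envelope estimate of GPT-3 training compute, using the common
# approximation FLOPs ~= 6 * N * D (N = parameters, D = training tokens).
# N and D are from the GPT-3 paper; hardware numbers below are assumptions.
N = 175e9                     # 175B parameters
D = 300e9                     # ~300B training tokens
flops = 6 * N * D             # ~3.15e23 FLOPs total

# Hypothetical accelerator sustaining 100 TFLOP/s at $2/hour rental --
# purely illustrative figures for scale, not real pricing.
gpu_flops_per_s = 1e14
gpu_hours = flops / gpu_flops_per_s / 3600
cost_usd = gpu_hours * 2.0

print(f"{flops:.2e} FLOPs, ~{gpu_hours:,.0f} GPU-hours, ~${cost_usd:,.0f}")
```

Even with generous assumptions about utilization, that lands in the high hundreds of thousands of GPU-hours, i.e. millions of dollars once you account for real-world efficiency, which is well outside hobbyist reach.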


The API allows you to fine-tune existing models on your own dataset. [1]

[1] Cf second paragraph of https://openai.com/blog/openai-api/


"fine tuning", i.e. transfer learning is still limited by the training of the original model.

For instance, if the original model is trained on English text exclusively and you want to fine-tune it on Greek text you are S.O.L.
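A toy sketch of why this fails (this is deliberately simplified word-level lookup, not GPT's actual byte-level BPE tokenizer): a vocabulary built only from English text maps every Greek word to an unknown placeholder, so fine-tuning on Greek gives the model nothing meaningful to learn from.

```python
# Toy illustration: a word-level vocabulary built from English text only.
# Every Greek word falls outside the vocabulary and collapses to <unk>,
# so the "fine-tuning" data carries no usable signal.
english_corpus = "the cat sat on the mat and the dog slept"
vocab = {word: i for i, word in enumerate(sorted(set(english_corpus.split())), start=1)}
UNK = 0  # id reserved for out-of-vocabulary tokens

def encode(text):
    return [vocab.get(word, UNK) for word in text.split()]

print(encode("the cat slept"))     # known English words -> real ids
print(encode("η γάτα κοιμήθηκε"))  # Greek -> [0, 0, 0], all unknown
```

Real GPT models use byte-level BPE, so Greek text does encode (as long runs of byte fragments), but the point stands: the pretrained weights carry essentially no Greek language knowledge to transfer.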




