In one of their examples, they note “They saw ratings hover around 60% with their original, in-house tech — this improved by 7-8% with GPT-2 — and is now in the 80-90% range with the API.”
Bloomberg reports the API is based on GPT-3 and “other language models”.
If that’s true, this is a big deal, and it lives up to the "open" in OpenAI's name. The largest NLP models require vast corporate resources to train, let alone put into production. Offering the largest model ever trained (with near-Turing results on some tasks) is a democratization of technology that would otherwise be restricted to well-funded organizations.
Although the devil will be in the details of pricing and performance, this is a step worthy of respect. And it bodes well for the future.
How is this "democratization"? OpenAI trains a model, then makes it available through an API. You have no say in what that model is trained on or how (other than whether they can use your data, not how), nor can you modify the model to suit your needs. And with no ability to modify the product you're buying, you have no opportunity to innovate. You can wrap it in a different kind of application, sure, but the nature and number of applications it can be wrapped in are restricted by the abilities of the model, and are therefore entirely dependent on the choices made by OpenAI.
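For concreteness, "wrapping the API" looks roughly like the sketch below (assuming the early openai Python SDK's Completion endpoint; the engine name, prompt, and parameters are illustrative, not details from the announcement). Note that everything beyond the prompt string is decided by OpenAI's model, not by the application author:

    # A thin application wrapper; all the actual capability lives
    # on OpenAI's side of the API call.
    import openai

    openai.api_key = "sk-..."  # placeholder key

    def summarize(text):
        response = openai.Completion.create(
            engine="davinci",  # illustrative engine name
            prompt="Summarize the following text:\n\n"
                   + text + "\n\nSummary:",
            max_tokens=100,
            temperature=0.3,
        )
        return response.choices[0].text.strip()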
Imagine MS saying they had "democratised" operating systems because, hey, you can buy their binaries, so everyone can use their operating system. Compare that kind of "democratisation" with open-source OSes.
No, the truth is that as more and more resources are needed to wring the last few drops of performance out of the current generation of deep neural net models, only large, well-funded companies can afford to innovate, and everyone else is forced to follow in their wake. Any expectation that progress would lead to the "democratisation" of deep neural network research has gone out the window.
Odd analogy to use with Microsoft Windows, since the GPT-2 source is public and GPT-3's architecture is described in a published paper, along with a series of earlier papers, which together enable anyone with the money and knowledge to implement it themselves.
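To that point: the mechanism is described fully enough in the papers (Vaswani et al. 2017; the GPT papers) that a toy version is a short exercise. Here is a minimal sketch of the causal self-attention layer at the heart of GPT, assuming PyTorch and toy dimensions; the hard part is scale, not the architecture:

    import math
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CausalSelfAttention(nn.Module):
        # One multi-head attention layer with the causal mask that
        # makes GPT a left-to-right language model; sizes are toy.
        def __init__(self, d_model=64, n_heads=4, max_len=128):
            super().__init__()
            self.n_heads = n_heads
            self.d_head = d_model // n_heads
            self.qkv = nn.Linear(d_model, 3 * d_model)
            self.proj = nn.Linear(d_model, d_model)
            # Lower-triangular mask: each position sees only the past.
            self.register_buffer(
                "mask", torch.tril(torch.ones(max_len, max_len)).bool())

        def forward(self, x):
            B, T, C = x.shape
            q, k, v = self.qkv(x).chunk(3, dim=-1)
            # Reshape to (B, n_heads, T, d_head) for per-head attention.
            q = q.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
            k = k.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
            v = v.view(B, T, self.n_heads, self.d_head).transpose(1, 2)
            att = (q @ k.transpose(-2, -1)) / math.sqrt(self.d_head)
            att = att.masked_fill(~self.mask[:T, :T], float("-inf"))
            att = F.softmax(att, dim=-1)
            out = (att @ v).transpose(1, 2).reshape(B, T, C)
            return self.proj(out)

    x = torch.randn(2, 16, 64)             # (batch, sequence, d_model)
    print(CausalSelfAttention()(x).shape)  # torch.Size([2, 16, 64])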
The reasons why MS Windows and GPT-3 cannot easily be modified by anyone are different, but the result is the same: you're stuck with what you're sold.
To clarify: MS Windows is closed source, whereas with GPT you can't very well train a large model yourself unless you have the resources of OpenAI. So you're stuck with whatever they choose to train and make available to you.
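To put rough numbers on that, here is a back-of-envelope estimate using the ~6 × parameters × tokens FLOP approximation from Kaplan et al. (2020) and the GPT-3 paper's figures (175B parameters, roughly 300B training tokens); the sustained per-GPU throughput is an assumption for illustration:

    # Rough training-cost estimate; the parameter and token counts
    # come from the GPT-3 paper, the GPU throughput is assumed.
    params = 175e9               # GPT-3 parameter count
    tokens = 300e9               # approx. training tokens
    flops = 6 * params * tokens  # ~3.15e23 FLOPs total
    gpu_flops = 100e12           # assume 100 TFLOP/s sustained per GPU
    gpu_years = flops / gpu_flops / (3600 * 24 * 365)
    print(f"{flops:.2e} FLOPs ~= {gpu_years:.0f} GPU-years at 100 TFLOP/s")
    # -> 3.15e+23 FLOPs ~= 100 GPU-years at 100 TFLOP/s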