> Microsoft gained exclusive licensing to OpenAI's GPT-3 language model in 2020. Microsoft continues to assert rights to GPT-4, which it claims has not reached the level of AGI, a designation that would end its licensing privileges.
Not sure this is common knowledge - the MSFT licence vis-a-vis AGI.
> Fifth, the board determines when we've attained AGI. Again, by AGI we mean a highly autonomous system that outperforms humans at most economically valuable work. Such a system is excluded from IP licenses and other commercial terms with Microsoft, which only apply to pre-AGI technology.
> "Musk claims Microsoft's hold on Altman and the OpenAI board will keep them from declaring GPT-4 as a AGI in order to keep the technology private and profitable."
If he thinks GPT-4 is AGI, Elon should ask a team of GPT-4 bots to design, build and launch his rockets and see how it goes. If “economically valuable work” means creating terrible, wordy blog posts then yeah I guess it’s a risk.
I don’t think GPT-4 is AGI, but that seems like a foolish idea. An AGI doesn’t need to be hyperproficient at everything, or even anything. Ask a team of any non-aeronautical engineers to build a rocket and it will go poorly. Do those people not qualify as intelligent beings?
Have you used GPT-4? I'd criticize it in the opposite direction. It routinely defers to experts on even the simplest questions. If you ask it to tell you how to launch a satellite into orbit, it leads with:
> Launching a satellite into orbit is a complex and challenging process that requires extensive knowledge in aerospace engineering, physics, and regulatory compliance. It's a task typically undertaken by governments or large corporations due to the technical and financial resources required. However, I can give you a high-level overview of the steps involved:
You're just highlighting the issue. Nobody can agree on the definition of AGI. Most people would agree that being able to design, build, and launch rockets is definitely _not_ the definition. The fact that M$ has such a strong hold on OpenAI means that they won't declare anything AGI even if most people would say it is.
I'm surprised such an important legal issue hinges on the definition of "AGI", which seems really hard to pin down (I really think the concept is flawed). Does this take into account that "most economically valuable work" is physical? And more importantly, with so much money on the line, no one will agree on when AGI has been attained.
Just a side note: we don't actually even know. The glimpse we get of the stated GPT-4 model is heavily censored and constrained so that it can be scaled to millions of users. What if OpenAI can use the uncensored version, with the computing power of those millions of devices, without restrictions? Is that GPT-4 the same one we get by spending $25 a month?