Hopefully the patent office will recognize that tacking "...but with AI" onto an existing idea isn't novel or non-obvious, and a lot of the AI-fever patents will be denied quickly.
Right now, having access to the inside info on what people are trying to use GPT for is itself possibly worth billions, if it can help you choose what to tune for and which startups to invest in…
Not sure what you mean, but for example, two separate competitors to DALL-E were released within months (Stable Diffusion (SD) and Midjourney (MJ)). It's arguable that both have since surpassed DALL-E's capabilities and ecosystem.
LLMs take vastly more resources to train and run than image generators. You can do quite a bit with SD on a few-year-old 4 GB laptop GPU (that's what I mostly use, though I've also set up an instance with a better GPU on Compute Engine that I can fire up when needed).
GPT-NeoX-20B – an open (as in open source, not OpenAI) LLM intended as a start toward competing with GPT-3 (but still well behind it, and smaller) – requires a minimum of 42 GB of VRAM and 40 GB of system RAM just to run inference. The resources-times-time cost of training LLMs is…immense. The hardware cost alone of trying to catch up to ChatGPT is enormous, and unless someone finds a radically new approach that delivers good results at insanely lower resource requirements, you aren't going to have an SD-like community pushing things forward.
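As a rough sanity check on where a figure like 42 GB comes from: the model weights alone dominate, and a back-of-the-envelope estimate (params × bytes per param, fp16 assumed; activation and framework overhead ignored) lands in the same ballpark.

```python
# Back-of-the-envelope estimate of inference weight memory.
# Illustrative only: assumes fp16 (2 bytes/param) and ignores
# activations, KV cache, and framework overhead.
def weight_memory_gb(n_params: float, bytes_per_param: int) -> float:
    """Memory (in GB) needed just to hold the model weights."""
    return n_params * bytes_per_param / 1e9

# GPT-NeoX-20B: 20e9 params * 2 bytes = 40 GB of weights,
# close to the ~42 GB VRAM minimum quoted above once overhead is added.
print(weight_memory_gb(20e9, 2))  # 40.0
```

The same arithmetic shows why SD fits on a 4 GB laptop GPU: at roughly 1e9 parameters, fp16 weights are about 2 GB.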
Will there be competition for ChatGPT? Yes, probably, but don't expect it to look like the competition for DALL-E.
Basically just compute $ for training.
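To put a rough number on that compute cost, a common rule of thumb is that training takes about 6·N·D FLOPs (N = parameters, D = training tokens). The figures below (400B tokens, A100 peak throughput, 40% utilization) are illustrative assumptions, not measurements of any actual training run.

```python
# Rough training-compute sketch using the common C ≈ 6·N·D approximation.
# All concrete numbers here are assumptions for illustration.
def train_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training FLOPs via the 6·N·D rule of thumb."""
    return 6 * n_params * n_tokens

flops = train_flops(20e9, 400e9)  # a 20B model on an assumed 400B tokens
# Assuming ~40% utilization of an A100's ~312 TFLOPS fp16 peak:
gpu_seconds = flops / (312e12 * 0.4)
print(f"{flops:.1e} FLOPs, ~{gpu_seconds / 3600:,.0f} A100-hours")
```

At cloud GPU prices, six-figure GPU-hour counts translate directly into the "compute $" this comment is talking about, and larger GPT-3-class models multiply that further.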