
This. Despite how impressive the results are, there isn't a particularly large moat to prevent competitors from entering the space.

Basically just compute $ for training.



> Despite how impressive the results are, there isn't a particularly large moat to prevent competitors from entering the space.

I have to assume that the only place busier than an AI lab is the patent office.


Hopefully the patent office will recognize that tacking on "...but with AI" is neither novel nor non-obvious, and a lot of these fevered patent filings will be denied quickly.


They likely use lots of tricks and internal data collection that make the quality better.


Right now, having access to the inside info on what people are trying to use GPT for is itself possibly worth billions, if it can help you choose what to tune for and which startups to invest in…


100% - just IMO it's not particularly impenetrable as far as moats go.


Exactly. This isn't a leetcode problem where all you have to do is re-run the function, or do it iteratively vs. recursively.


Not sure what you mean, but for example, two separate competitors to DALL-E were released within months (SD and MJ). Arguably, both of these have since surpassed DALL-E's capabilities/ecosystem.

Not sure why ChatGPT will be any different.


> Not sure why ChatGPT will be any different.

LLMs take vastly more resources to train and run than image generators. You can do quite a bit with SD on a few-year-old 4GB laptop GPU (that's what I use mostly, though I've set up an instance with a better GPU on Compute Engine that I can fire up, too).
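
For concreteness, here's a minimal sketch of what local SD inference looks like with the diffusers library; half-precision weights and attention slicing are the usual knobs for squeezing into ~4GB of VRAM (the model ID and options here are illustrative, not my exact setup):

    import torch
    from diffusers import StableDiffusionPipeline

    # Load the weights in half precision to roughly halve the VRAM footprint.
    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",  # illustrative model ID
        torch_dtype=torch.float16,
    )
    pipe = pipe.to("cuda")
    # Trade some speed for memory so the UNet fits on a small laptop GPU.
    pipe.enable_attention_slicing()

    image = pipe("a watercolor painting of a lighthouse").images[0]
    image.save("lighthouse.png")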

GPT-NeoX-20B – an open (as in Open Source, not OpenAI) LLM intended as a start toward competing with GPT-3 (but still well behind, and smaller) – requires a minimum of 42GB of VRAM and 40GB of system RAM just to run inference. The resource-times-time cost of training LLMs is…immense. The hardware cost alone of trying to catch up to ChatGPT is enormous, and unless a radical new approach that provides good results with insanely lower resource requirements is found, you aren't going to have an SD-like community pushing things forward.
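
A rough sketch of why that VRAM floor exists: 20B parameters at 2 bytes each (fp16) is ~40GB of weights before you've allocated anything for activations or the KV cache – more than an order of magnitude larger than a typical SD checkpoint. Loading it with the standard transformers API looks something like this (the HuggingFace model ID and the multi-device placement option are assumptions for illustration):

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # 20B params * 2 bytes (fp16) ~= 40GB of weights alone, hence the ~42GB VRAM minimum.
    tok = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
    model = AutoModelForCausalLM.from_pretrained(
        "EleutherAI/gpt-neox-20b",
        torch_dtype=torch.float16,
        device_map="auto",  # needs the accelerate package; spreads layers across available GPUs/CPU
    )

    prompt = "The main barrier to competing with large language models is"
    inputs = tok(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=50)
    print(tok.decode(out[0], skip_special_tokens=True))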

Will there be competition for ChatGPT? Yes, probably, but don’t expect it to look like the competition for Dall-E.



