
> This was the whole message behind the release of GPT-4.5: capacities are growing linearly while compute costs are on a geometric curve

I don't see where that is coming from. Capacities aren't really measurable in that way. Computers either can do something like PhD-level mathematics research more or less under their own power or they cannot, with only a small period of ambiguity as subhuman becomes superhuman. So far the process has looked mostly binary, with relatively clear tipping points separating models that can't do something from models that can. That isn't easily mapped back onto any sort of growth curve.

Regardless, we're at the stage of the boom where people are patenting clicking a button to purchase goods and services, thinking that might be a tricky idea. It isn't clear yet which parts of the product are easy and standard and which parts are difficult differentiators. People who talk in vague terms will turn out to be correct, and specific predictions will be right or wrong at random. It is hard to overstate how young all these practical models are: Stable Diffusion was released in 2022, and ChatGPT is younger than that - almost yesterday years old; this stuff is early-days magic.

Models could easily turn out to be a commodity.



