Hacker News

> Bard is overtly a reduced-resources model compared to the best version of the same technology

There's a scaling problem. ChatGPT-style LLM systems cost far more to run per query than the Google search engine. Google can't afford to make them the first-line reply to every query.

A big business-model question is whether Google will require you to be logged in to access the large language model.

At Google scale, these things are going to have to be a hierarchy. Not everything needs to go to a full LLM system. Most Google queries by volume can be answered from a cache.
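A tiered setup like that could be sketched roughly as follows. This is purely illustrative; the cache contents and the `call_llm` placeholder are invented for the example and don't correspond to any real Google or OpenAI API:

```python
# Sketch of a tiered query pipeline: cheap cache first, expensive LLM last.
# All names here (answer_cache, call_llm) are hypothetical.

answer_cache = {
    "capital of france": "Paris",
    "boiling point of water": "100 C at sea level",
}

def call_llm(query: str) -> str:
    # Stand-in for an expensive large-language-model call.
    return f"[LLM answer for: {query}]"

def answer(query: str) -> str:
    key = query.strip().lower()
    # Tier 1: serve high-volume queries from cache at near-zero cost.
    if key in answer_cache:
        return answer_cache[key]
    # Tier 2: fall back to the costly LLM only on a cache miss.
    result = call_llm(query)
    answer_cache[key] = result  # memoize so repeat queries stay cheap
    return result
```

The point of the hierarchy is that the expensive path only fires on the long tail; the head of the query distribution never touches the LLM.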



> Most Google queries by volume can be answered from a cache.

And given how aggressively they limit the number of search results actually served (despite the ridiculously large result counts shown on page #1), that percentage may well be very large.



