Hacker News

This scam is true of all AI technologies: they only "work" insofar as we interpret them as working. LLMs generate text. If it answers our question, we say the LLM works. If it doesn't, we say it is "hallucinating".




I'm sort of beginning to think some LLM/AI stuff is the Wizard of Oz (a fake-it-before-you-make-it facade).

Like, why can an LLM create a nicely designed website for me, but asking it to make edits and changes to that design is a complete joke? A lot of the time it creates a brand new design (not what I asked for at all), and its attempts at editing the existing one are laughable. It makes me think it does no design at all; rather, it just grabbed one from the ethers of the Internet and acted like it created it.


> it does no design at all; rather, it just grabbed one from the ethers of the Internet

Bingo. It just "remembers" one of the many designs it has seen before.


It's not a scam, because it does make you code faster, even if you must review everything and possibly correct some things (either manually or via instruction).

As far as hallucinations go, it is useful as long as its reliability is above a certain (high) percentage.

I actually tried to come up with a "perceived utility" function of reliability: U(r) = Umax · e^(−k(100−r)^n), with k = 0.025 and n = 1.5, is the best I came up with, plotted here: https://imgur.com/gallery/reliability-utility-function-u-r-u...
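For concreteness, the function above is easy to sketch in a few lines. This is just an illustration of the commenter's proposed formula; Umax is a free scaling constant (assumed 1.0 here), and r is reliability as a percentage:

```python
import math

def perceived_utility(r, u_max=1.0, k=0.025, n=1.5):
    """Perceived utility U(r) = Umax * exp(-k * (100 - r)^n) for reliability r in [0, 100]."""
    return u_max * math.exp(-k * (100.0 - r) ** n)

# Utility decays sharply as reliability falls below ~100%.
for r in (80, 90, 95, 99, 100):
    print(f"r={r}%  U={perceived_utility(r):.3f}")
```

With these constants, U(100) = Umax exactly, and utility drops off super-linearly as reliability falls, which matches the intuition that a tool you must double-check constantly loses most of its value well before reliability reaches zero.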


How is that any different from Googling something and believing any of the highly-SEO optimized results that pollute the front page?

That's the point: nobody really believes there is an intelligence generating Google results. It is a best-effort engine. However, people have this belief that ChatGPT somehow has an intelligent engine generating results, which is incorrect. It is only generating statistically plausible results; whether they are true or false depends on what the person using them does with them. If it is poetry, for example, it is always "true". If it is how to find the cure for cancer, it will with very high probability be false. But if you're writing a novel about a scientist finding a cure for cancer, then that same response will be great.

So "statistics" are enough to take gold at IMO?

Search engines didn't need $500B and growing in CAPEX.

"Trust me, bro, it works" has become kind of a theme song to these guys.


