There are times it generates complete nonsense that has nothing to do with what I said, but that's certainly not most of the time. I don't know exactly how often, but I'd say it's definitely under 10% and almost certainly under 5% of the time.

Sure, LLMs are incredibly impressive from a technical standpoint. But they're so fucking stupid I hate using them.

> This is not necessarily useful but it's quite impressive.

I think we mostly agree on this. Cheers.


