
You clearly don't understand short-term and long-term memory, or how problem solving draws on information from multiple disciplines.

LLMs can only provide answers that already exist in their training data.

They cannot invent new answers.

And since complex problems demand solutions that don't already exist, they cannot handle complexity.

That's so wrong, I don't even know where to begin. You should really look up the fundamentals of how these models work instead of listening to the "stochastic parrot" nonsense that is constantly spewed around HN.
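
To make "the fundamentals" concrete, here is a minimal sketch of the autoregressive sampling loop behind LLM text generation. Everything in it is a toy assumption: the vocabulary, the next_token_probs stand-in for a real model's forward pass, and all the names are hypothetical, not any real model's API. What it illustrates is that output is sampled token by token from a learned probability distribution, not looked up whole from a store of existing answers, which is why sequences absent from the training data can come out.

    import numpy as np

    vocab = ["the", "cat", "sat", "on", "mat", "<eos>"]

    def next_token_probs(context):
        # Hypothetical stand-in for a trained model's forward pass:
        # returns a softmax distribution over the vocabulary given the
        # tokens generated so far. A real LLM computes this with a
        # neural network; here it is seeded noise, just to show the shape.
        rng = np.random.default_rng(abs(hash(tuple(context))) % (2**32))
        logits = rng.normal(size=len(vocab))
        exp = np.exp(logits - logits.max())
        return exp / exp.sum()

    def generate(max_len=10):
        # Autoregressive loop: sample one token from the current
        # distribution, append it to the context, repeat. Nothing is
        # retrieved from a database of existing answers.
        context = []
        rng = np.random.default_rng()
        for _ in range(max_len):
            probs = next_token_probs(context)
            tok = rng.choice(vocab, p=probs)  # sample, don't look up
            if tok == "<eos>":
                break
            context.append(str(tok))
        return " ".join(context)

    print(generate())

Whether the novel sequences are any good is a separate question, but "can only provide answers to what already exists" misdescribes the mechanism: it is sampling, not retrieval.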
