
> LLM's cannot "assume". There is no thinking involved. It sees that the prompt looks like the monty hall problem and it just goes full steam ahead.

I think the poster's point was that many humans would do the same thing.

Try a completely different problem, one you invented yourself, and see what you get. I'd be very interested to hear back here what response you got.
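
As a minimal sketch of that experiment, here is one way to check the correct answer for a self-invented variant before asking an LLM about it. The four-door variant below (host opens two goat doors) is my own illustrative assumption, not something from the comment above; any similar twist would do.

    import random

    def simulate(trials=100_000, doors=4, switch=True):
        """Monte Carlo estimate of the win rate for a hypothetical
        four-door Monty Hall variant: the host opens two goat doors,
        then the player either sticks or switches to the last closed door."""
        wins = 0
        for _ in range(trials):
            car = random.randrange(doors)
            pick = random.randrange(doors)
            # Host opens doors-2 goat doors that are neither the pick nor the car.
            closed = [d for d in range(doors) if d != pick and d != car]
            random.shuffle(closed)
            opened = closed[: doors - 2]
            if switch:
                # Switch to the single remaining unopened, unpicked door.
                remaining = [d for d in range(doors)
                             if d != pick and d not in opened]
                pick = remaining[0]
            wins += (pick == car)
        return wins / trials

    if __name__ == "__main__":
        print("stick :", simulate(switch=False))   # ~1/4
        print("switch:", simulate(switch=True))    # ~3/4

Knowing the ground truth (switching wins about 3/4 of the time in this variant), you can then see whether the model reasons it out or just pattern-matches to the classic three-door answer of 2/3.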


