
What LLMs are really good at is guessing from fuzziness, which is something formal languages are usually bad at. I can often ask about things I don't know much, phrase the question the wrong way, and still get a response that shows where I was wrong and what may have misled me.
