
Take a look at your life and the signals you use to operate. If you are anything like me, summarizing them in a somewhat reasonable fashion feels basically impossible.

For example, my mother calls and asks if I want to come over.

How is an AI ever going to have the context to decide that for me? Given the right amount and quality of sensors starting from birth or soon after – sure, it's not theoretically impossible.

But I'm a grown-up person who knows the things we share and don't share, the conflicts in our present and past, the things I never talked about to anyone, the things I would find hard to verbalize even if I wanted to, or to admit to myself at all.

It can check my calendar. But it can't understand that I have been thinking about doing something for a while, and that I just heard someone randomly talking about something else that resurfaced the idea, and now I would really rather do that. How would the AI know? (Again, not theoretically impossible given the right sensors, but it seems fairly far away.)

I could try to explain, of course. But where to start? And how would I explain how to explain this to mum? It's really fucking complicated. I am not saying that LLMs would not be helpful here; they are generalization monsters, and it's actually both insane and sobering how helpful they can be given the amount of context they do not have about us.



Exactly, even AGI would not be able to answer that question on my behalf.

Which means it cannot architect a software solution just by itself, unless it can read people's minds and know what they might want.



