Hacker News

Yeah, I feel like the issue is that people think an LLM will someday "wake up." No, LLMs will just become multimodal and developed to use tools, and a software ecosystem around them will end up using the LLM to reason about how to execute. Basically, the LLM will be the internal monologue of whatever the AGI looks like.
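The "internal monologue plus ecosystem" idea can be sketched as a simple tool-use loop: the model only proposes an action, and the surrounding software actually executes it. This is a minimal toy, not a real agent framework; `fake_llm`, `TOOLS`, and `agent_step` are all hypothetical names, and the model is stubbed with a keyword check.

```python
# Hypothetical sketch: the LLM as "internal monologue" of a tool-using system.
# The model decides *what* to do; the ecosystem around it does the doing.

TOOLS = {
    "add": lambda a, b: a + b,        # the "ecosystem": real executable tools
    "upper": lambda s: s.upper(),
}

def fake_llm(prompt):
    # Stand-in for a real model: returns a (tool_name, args) "decision".
    if "sum" in prompt:
        return ("add", (2, 3))
    return ("upper", ("done",))

def agent_step(prompt):
    tool_name, args = fake_llm(prompt)   # monologue: reason about the action
    result = TOOLS[tool_name](*args)     # ecosystem: execute the action
    return f"{tool_name} -> {result}"

print(agent_step("sum two numbers"))     # add -> 5
```

Real systems wrap this loop with structured tool schemas and feed each tool's result back into the model's context, but the division of labor is the same: the model reasons, the ecosystem executes.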



Agreed. I think it's more likely that we'll reach a point where their complexity is so great that no single person can usefully reason about their outputs in relation to their structure.

Not so much a them waking up as an us falling asleep.




