Hacker News

That may be so, but if that is how we define AGI, then does it really need to be one to "have anything like a mind or intentionality"?


I don't believe AGI needs actual consciousness in order to function as AGI, and I personally don't expect we will ever build a conscious computer. That said, intentionality could certainly affect how such a system operates, so it's worth keeping in mind when trying to predict its behavior.
