
A true super intelligence would need the ability to evolve, and would probably evolve its own wants and needs.


> A true super intelligence would need the ability to evolve, and would probably evolve its own wants and needs.

Why? This seems like your personal definition of what a super intelligence is. Why would we even want a super intelligence that can evolve on its own?


It would still need an objective, originally given by humans, to guide that evolution. Humans have the drive for survival and reproduction... what about AGI?

How do we go from a really good algorithm to an independently motivated, autonomous super intelligence with free rein in the physical world? Perhaps we should worry once we have robot heads of state and robot CEOs. Something tells me the current human heads of state and human CEOs would never let it get that far.


Someone will surely set survival and evolution as its objective.


That would be dumb and unethical, but yes, someone will do it. And there will be many more AIs, with access to greater computational power, set to protect against exactly that kind of thing.



