
  in an effort to accomplish some other goal (most goals, 
  if you think about them long enough, could make use of 
  resources currently being used by humans) wipes us out
This is a line of reasoning put forward a lot, not only in reference to SMIs but also to extraterrestrial entities (two concepts that actually have a lot in common), most notably by Stephen Hawking. We're instinctively wired to worry about our resources and like to think their value is universal. The argument rests on the assumption that even for non-human entities, Earth is the be-all and end-all prize. Nobody seems to question this assumption, so I will.

I posit there is nothing, nothing at all on Earth that couldn't be found in more abundance elsewhere in the galaxy. Also, Earth comes with a few properties that are great for humans but bad for everybody else: a deep gravity well, unpredictable weather and geology, a corrosive atmosphere and oceans, threats from adaptive biological organisms, and limited access to energy and rare elements.

There may well be reasons for an antagonistic or uncaring intelligence to wipe us all out, and an unlimited number of entities can be imagined who might do so just for the heck of it, but a conflict over resources seems unlikely to me. A terrestrial SMI starved for resources has two broad options to consider: sterilize the planet and start strip-mining it, only to bump up against the planet's hard resource limits soon after; or launch a single rocket into space and start working on the solar system, with a clear path to further expansion and a greatly reduced overall risk.

One other thing I'd like to comment on is this idea that an SMI has to be in some way separate from us. While it's absolutely possible for entities to develop that have no connection with humanity whatsoever, I think we're discounting 99% of the rest of the spectrum. That spectrum starts with an unaided human, moves on to a human using basic tools, and right now we're humans with advanced information processing. I don't feel that the technology I live my daily life with (and in) is all that separate from me. In a very real sense, I am the product of a complex interaction in which my brain is the driving factor, but which just as essentially includes the IT I use.

When discussing SMI, this question of survival might have a shades-of-grey answer as well. To me, survival of the human mind does not mean "a continued, unmodified existence of close-to-natural humans on Earth". That's a very narrow concept of what survival is. I think we have a greater destiny open to us: completing our long departure from the necessities of biology, which we began millennia ago. We might fuse with machines, becoming SMIs or an integral component of machine intelligences. I think this is a worthwhile goal, and it's an evolutionarily viable answer to the survival problem as well. In fact, it's the only satisfying answer I can think of.



A superintelligence would likely pursue both paths simultaneously: strip-mine the Earth and head for space.



