> I also found them bizarrely, inexplicably obsessed with the question of whether AI would soon become superhumanly powerful and change the basic conditions of life on earth, and with how to make the AI transition go well. Why that, as opposed to all the other sci-fi scenarios one could worry about, not to mention all the nearer-term risks to humanity?
The reason they landed on a not-so-rational risk to humanity is that it fulfilled the psychosocial need for a "terrible burden" that binds the group together.

It's one of the reasons religious groups get caught up in The Rapture or whatever, instead of eradicating poverty.