
This is also related to that earlier bit:

> I also found them bizarrely, inexplicably obsessed with the question of whether AI would soon become superhumanly powerful and change the basic conditions of life on earth, and with how to make the AI transition go well. Why that, as opposed to all the other sci-fi scenarios one could worry about, not to mention all the nearer-term risks to humanity?

The reason they landed on a not-especially-rational risk to humanity is that it fulfilled the psycho-social need for a "terrible burden" that binds the group together.

It's one of the reasons religious groups get caught up in The Rapture or the like, instead of eradicating poverty.
