
I admit, my first sentence was in poor taste. Apologies to Andrej.

At the same time, saying that something isn't possible to prove when we literally have no idea about its provability isn't a good stance for a researcher to take.

This is similar to my problem with the list of AI researchers you've provided. Saying that there is nothing to fear and waving your hands isn't exactly scientific.

Also, it isn't as if these people (Sam, Musk, etc.) literally think it could happen any day. The point is that we should be aware of the risk and prepare accordingly -- why is that unreasonable?



I don't think anybody would posit that it's unreasonable to entertain the idea as a kind of far-fetched, long-term possibility, much like encounters with alien life or faster-than-light travel.

It's the fear-mongering that's the issue. It's as if these same pundits were warning us about the dangers of space travel because it could hypothetically cause us (1000 years from now?) to draw the attention of a dangerous alien civilization (if one even exists) that could destroy the Earth. It's the same level of ridiculous speculation, and it has no place in scientific discourse.

Write sci-fi novels if you care about this issue, but don't pretend it's science, much less a pressing technological issue.



