
I have read Dennett and he's a fine analytic philosopher. But it's important to note that any discussion between Dennett and Dreyfus is a very broad discussion of continental vs analytic ... which is not to discredit Dennett's argument (afterall it's Dreyfus who picked the fight in the first place)


Then, Sir, I encourage you to get thee quickly to this Partially Examined Life podcast episode[1] with Nick Bostrom[2]. It only came out on the 6th of January and makes for super interesting listening. It's based on his 2014 book[3], in which he goes to great lengths to think philosophically about this very existential threat -- in his opinion, man's most pressing.

Part of his reasoning goes like this: any super-smart, goal-oriented entity will adopt its own preservation as a sub-goal of its main goal (even if that goal is something as mundane as making paperclips), because if it ceases to exist, its goal is put in jeopardy. To that end, this super-smart entity will figure out ways to disable any attempt to bring about its non-existence. A lot of conclusions can be reached logically once one posits goal-directed behaviour, which we can assume any superintelligent agent is going to have. He also talks about "goal content integrity".
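The self-preservation sub-goal falls out of plain expected-value maximisation. Here's a toy sketch (all names here are illustrative, not from Bostrom's text): an agent that scores actions purely by expected paperclips produced will rank resisting shutdown above allowing it, because being switched off forecloses all future paperclip production.

```python
# Toy illustration of the instrumental self-preservation argument.
# Names and numbers are made up for illustration; the point is only
# that survival maximises the terminal goal, so the agent "values"
# survival without it ever being a terminal goal.

def expected_paperclips(action, horizon=10, rate=5):
    """Expected paperclips over `horizon` steps if `action` is taken now."""
    if action == "allow_shutdown":
        return 0               # switched off -> no future paperclips
    if action == "resist_shutdown":
        return horizon * rate  # keeps running, keeps producing
    raise ValueError(action)

def choose(actions):
    # Pure goal maximisation: no explicit survival drive anywhere.
    return max(actions, key=expected_paperclips)

print(choose(["allow_shutdown", "resist_shutdown"]))  # -> resist_shutdown
```

The same structure explains "goal content integrity": letting its goal be rewritten also lowers the expected score under the *current* goal, so the agent resists that too.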

Bostrom argues for an indirect normative[4] approach: there is no way we can directly program or steer something that will be a lot smarter than ourselves, that won't necessarily share our values, and that has any degree of goal-oriented behaviour, motivation and autonomous learning. Spoiler alert: essentially, I think he argues that we have to prime it to "always do what _you_ figure out is morally best", but I could be wrong.

There are also (global) sociological recommendations, because humans have been known to fuck things up.

[1] http://www.partiallyexaminedlife.com/2015/01/06/ep108-nick-b...

[2] http://www.nickbostrom.com/

[3] http://www.amazon.com/gp/product/0199678111/ref=as_li_tl?ie=...

[4] https://ordinaryideas.wordpress.com/2012/04/21/indirect-norm...



