You're right that I've been unfairly dismissive of him, and that I stated my objections somewhat too bluntly. At least it's fostered a discussion.
However, let me be clear: how he did it is the only thing I care about. I am not convinced that the threat of superintelligence merits our resources compared to other, more concrete problems. To me the experiment is not meaningfully different from stories of the temptation of Christ in the desert. Except more fun than that story, because Yudkowsky is a more interesting character than Satan.
EDIT: If rationality is about winning, what could be simpler than a game where you just keep repeating the same word in order to win? It seems like almost the base case for rationality, if one accepts that definition.
I would submit that an unstated definition of rationality is "dealing with difficult, complex situations in one's life algorithmically", i.e. most of HPMOR and the large amount of self-help material on LW. Someone who had internalized this stuff would be more vulnerable than the average population to "Spock-style bullshit", to reuse that unfortunate phrase.
Well, then let's see what we can agree on. I hope you can agree that if one were to consider superintelligence a serious threat which needs dealing with, then AI boxing isn't the way to deal with it?
That's what he was trying to show in all this, and I think the point is made. How seriously to take superintelligent AIs is a different issue that he talks about elsewhere, and it should be dealt with separately. But if you or someone else were to try to deal with it seriously, I'm pretty sure you'd agree with me that the way to go about it isn't just boxing the AI and thinking that solves everything, right?
Oh yes, I agree with that premise. It's hard to disagree with. Milgram, the art of sales, plus the aforementioned Derren Brown and his many layers of deception are enough to make the point.
I suppose it's unfortunate that he came up with such an amazingly provocative way of demonstrating his argument; it's somewhat eclipsed the argument itself. I am definitely a victim of nerd sniping here. It must be the open-ended secrecy that does it.