This complaint seems to amount to "They believe something is very important, just like religious people do, therefore they're basically a religion". Which feels to me like rather too broad a notion of "religion".
That's a fairly reductive take on my point. In my experience with the Rationalist movement (whom I have the misfortune of being 1-2 people away from), the millenarian threat of AGI remains the primary concern.
Whenever I try to get an answer as to HOW (as in the attack path), I keep getting a deus ex machina. Reverting to a deus ex machina in a self-purported Rationalist movement is inherently irrational. And that's where I feel the crux of the issue is - it's called a "Rationalist" movement, but rationalism (as in the process of synthesizing information using a heuristic) is secondary to the overarching theme of techno-millenarianism.
This is why I feel rationalism is for all intents and purposes a "secular religion" - it's used by people to scratch an itch that religion was often used to scratch as well, and the same Judeo-Christian tropes are basically adopted in an obfuscated manner. Unsurprisingly, Eliezer Yudkowsky is an ex-talmid.
There's nothing wrong with that, but hiding behind the guise of being "rational" is dumb when the core belief is inherently irrational.
My understanding of the Yudkowskian argument for AI x-risk is that a key step is along the lines of "an AI much smarter than us will find ways to get what it wants even if we want something else -- even though we can't predict now what those ways will be, just as chimpanzees could not have predicted how humans would outcompete them and just as you could not predict exactly how Magnus Carlsen will crush you if you play chess against him".
I take it this is what you have in mind when you say that whenever you ask for an "attack path" you keep getting a deus ex machina. But it seems to me like a pretty weak basis for calling Yudkowsky's position on this a religion.
(Not all people who consider themselves rationalists agree with Yudkowsky about how big a risk prospective superintelligent AI is. Are you taking "the Rationalist movement" to mean only the ones who agree with Yudkowsky about that?)
> Unsurprisingly, Eliezer Yudkowsky is an ex-talmid
So far as I can tell this is completely untrue unless it just means "Yudkowsky is from a Jewish family". (I hope you would not endorse taking "X is from a Jewish family" as good evidence that X is irrationally prone to religious thinking.)
> I take it this is what you have in mind when you say that whenever you ask for an "attack path" you keep getting a deus ex machina. But it seems to me like a pretty weak basis for calling Yudkowsky's position on this a religion.
Agree to disagree.
> So far as I can tell this is completely untrue
I was under the impression EY attended Yeshivat Sha'alvim (the USC of Yeshivas - rigorous and well regarded, but a "warmer" student body), but that was his brother. That said, EY is absolutely from a Daatim or Chabad household given that his brother attended Yeshivat Sha'alvim - and they are not mainstream in the Orthodox Jewish community.
And the feel and zeitgeist around the rationalist community, with its veneration of a couple core people like EY or Scott Alexander, does feel similar to the veneration a subset of people in those communities would have for Baba Sali or the Alter Rebbe.
Happy to agree-to-disagree but I will push on it just one bit more.
Let's take the chess analogy. I take it you agree that I would very reliably lose if I played Magnus Carlsen at chess; he's got more than 1000 Elo points on me. But I couldn't tell you the "attack path" he would use. I mean, I could say vague things like "probably he will spot tactical errors I make and win material, and in the unlikely event that I don't make any he will just make better moves than me and gradually improve his position until mine collapses", but that's the equivalent of things like "the AI will get some of the things it wants by being superhumanly persuasive" or "the AI will be able to figure out scientific/engineering things much better than us and that will give it an advantage" which Yudkowsky can also say. I won't be able to tell you in advance what mistakes I will make or where my pawn structure will be weak or whatever.
Does this mean that, for you, if I cared enough about my inevitable defeat at Carlsen's hands, that expectation would be religious?
To me it seems obvious that it wouldn't, and that if Yudkowsky's (or other rationalists') position on AI is religious then it can't be just because one important argument they make has a step in it where they can't fill out all the details. I am pretty sure you have other things in mind too that you haven't made so explicit.
(The specific other things I've heard people cite as reasons why rationalism is really a religion also, individually and collectively, seem very unconvincing to me. But if you throw 'em all in then we are in what seems to me like more reasonable agree-to-disagree territory.)
> That said, EY is absolutely from a Daatim or Chabad household
I think holding that against him, as you seem to be doing, is contemptible. If his ideas are wrong, they're fair game, but insinuating that we should be suspicious of his ideas because of the religion of his family, which he has rejected? Please, no. That goes nowhere good.