This is vibe-based, but I think the Rationalists get more vitriol than they deserve. Upon reflection, my hypothesis for this is threefold:
1. They are a community—they have an in-group, and if you are not one of them you are by definition in the out-group. People tend not to like being in other people's out-groups.
2. They have unusual opinions and are open about them. People tend not to like people who express opinions different than their own.
3. They're nerds. Whatever has historically caused nerds to be bullied/ostracized, they probably have.
> They are a community—they have an in-group, and if you are not one of them you are by definition in the out-group.
The rationalist community is most definitely not exclusive. You can join it simply by declaring yourself a rationalist and posting blog entries with "epistemic status" taglines.
The criticisms are not because it's a cool club that won't let people in.
> They have unusual opinions and are open about them. People tend not to like people who express opinions different than their own.
Herein lies one of the problems with the rationalist community: for all of their talk about heterodox ideas and entertaining different viewpoints, they are remarkably in lockstep in many of their opinions.
From the outside, it's easy to see how one rationalist blogger plants the seed of some topic and then it gets adopted by the others as fact. A few years ago a rationalist blogger wrote a long series postulating that trace lithium in water was causing obesity. It even got an Astral Codex Ten monetary grant. For years it got shared through the rationalist community as proof of something, even though actual experts picked it apart from the beginning and showed how the author was misinterpreting studies, abusing statistics, and ignoring more prominent factors.
The problem isn't differing opinions; the problem is that they disregard actual expertise and make ham-fisted attempts at "first principles" evaluations of a subject while ignoring contradictory evidence, and they do this very frequently.
> The rationalist community is most definitely not exclusive.
I agree, and didn't intend to express otherwise. It's not an exclusive community, but it is a community, and if you aren't in it you are in the out-group.
> The problem isn't differing opinions; the problem is that they disregard actual expertise and make ham-fisted attempts at "first principles" evaluations of a subject while ignoring contradictory evidence
I don't know if this is true or not, but if it is I don't think it's why people scorn them. Maybe I don't give people enough credit and you do, but I don't think most people care how you arrived at an opinion; they merely care about whether you're in their opinion-tribe or not.
> Maybe I don't give people enough credit and you do, but I don't think most people care how you arrived at an opinion; they merely care about whether you're in their opinion-tribe or not.
Yes, most people don't care how you arrived at an opinion; rather, they care about the practical impact of said opinion. IMO this is largely a good thing.
You can logically push yourself to just about any opinion, even absolutely horrific ones. Everyone has implicit biases, and everyone is going to start from a different starting point. The problem with a string of logic about real-world phenomena is that you HAVE to make assumptions. Like, thousands of them. Because real-world phenomena are complex and your model is simple. Which assumptions you choose to make, and in which directions, are completely unknown, even to you, the one making said assumptions.
Ultimately most people aren't going to sit here and try to psychoanalyze why you made the assumptions you made and if you were abused in childhood or deduce which country you grew up in or whatever. It's too much work and it's pointless - you yourself don't know, so how would we know?
So, instead, we just look at the end opinion. If it's crazy, people are just going to call you crazy. Which I think is fair.
Lockstep like this? https://www.lesswrong.com/posts/7iAABhWpcGeP5e6SB/it-s-proba... (a post on Less Wrong, karma score currently +442, versus +102 and +230 for the two posts it cites as earlier favourable LW coverage of the lithium claim -- the comments on both of which, by the way, don't look to me any more positive than "skeptical but interested")
Or maybe this https://substack.com/home/post/p-39247037 (I admit I don't know for sure whether the author considers himself a rationalist, but I found the link via a search for whether Scott Alexander had written anything about the lithium theory; it looks like he hasn't, but the search turned this up in the subreddit dedicated to his writing).
Speaking of which, I can't find any sign that they got an ACX grant. I can find https://www.astralcodexten.com/p/acx-grants-the-first-half which is basically "hey, here are some interesting projects we didn't give any money to, with a one-paragraph pitch from each" and one of the things there is "Slime Mold Time Mold" talking about lithium; incidentally, the comments there are also pretty skeptical.
So I'm not really seeing this "gets adopted by the others as fact" thing in this case; it looks to me as if some people proposed this hypothesis, some other people said "eh, doesn't look right to me", and rationalists' attitude was mostly "interesting idea but probably wrong". What am I missing here?
> Lockstep like this? https://www.lesswrong.com/posts/7iAABhWpcGeP5e6SB/it-s-proba... (a post on Less Wrong, karma score currently +442, versus +102 and +230 for the two posts it cites as earlier favourable LW coverage of the lithium claim -- the comments on both of which, by the way, don't look to me any more positive than "skeptical but interested")
That post came out a year later, in response to the absurdity of the situation. The very introduction of that post has multiple links showing how much the SMTM post was spreading through the rationalist community with little question.
Pretending that this theory didn't grip the rationalist community all the way to top bloggers like Yudkowsky and Scott Alexander is revisionist history.
> That post came out a year later [...] multiple links showing how much the SMTM post was spreading through the rationalist community with little question.
The SMTM series started in July 2021 and finished in November 2021; there was also a paper, similar enough that I assume it's by the same people, from July 2021. The first of those "multiple links" is from July 2021, but the second is from January 2022 and the third from May 2022. The critical post is from June 2022. I agree it's a year later than something but I'm not seeing that the SMTM theory was "spreading ... with little question" a year before it.
The "multiple links" you mention -- the actual number is three -- are the two I mentioned before and a third that (my apologies!) I had somehow not noticed. That third one is at +74 karma, again much lower than the later critical post, and it doesn't endorse the lithium theory.
The one written by E.Y. is the second. Quite aside from the later disclaimer, it's hardly an uncritical endorsement: "you are still probably saying "Wait, lithium?" This is still mostly my own reaction, honestly." and "low-probability massive-high-value gamble".
What about the first post? That one's pretty positive, but to me it reads as "here's an interesting theory; it sounds plausible to me but I am not an expert" rather than "here's a theory that is probably right", still less "here's a theory that is definitely right".
The comments, likewise, don't look to me like lockstep uncritical acceptance. I see "here are some interesting experiments one could do to check this" and "something like this seems plausible but I bet the actual culprit is vegetable oils" and "something like this seems plausible but I bet the actual culprit is rising CO2 levels" and "I bet it's corn somehow" and "quite convincing but didn't really rule out the obvious rival hypothesis" and so forth; I don't think a single one of the comments is straightforwardly agreeing with the theory.
If you've found something Scott Alexander wrote about this then I'd be interested to see it. All I found was that (contrary to what you claimed above) it looks like ACX Grants declined to fund exploration of the lithium theory but included that proposal in a list of "interesting things we didn't fund".
So I'm just not seeing this lockstep thing you claim. Maybe I'm looking in the wrong places. The specific things you've cited don't seem like they support it: you said there was an ACX grant but there wasn't; you say the links in the intro to that critical post show the theory spreading with little question, but what they actually show is one person saying "here's an interesting theory but I'm not an expert", E.Y. saying "here's a theory that's probably wrong but worth looking into" (and later changing his mind), and another person saying "I put together some data that might be relevant"; in every case the comments are full of people not agreeing with the lithium theory.
> The very introduction of that post has multiple links showing how much the SMTM post was spreading through the rationalist community with little question.
By "multiple links" you're referring to the same "two posts". Again, they weren't as popular, nor were they as uncritical as you describe. From Yudkowsky's post, for example:
> If you know about the actual epidemiology of obesity and how ridiculous it makes the gluttony theory look, you are still probably saying "Wait, lithium?" This is still mostly my own reaction, honestly.... If some weird person wants to go investigate, I think money should be thrown at them, both to check the low-probability massive-high-value gamble
Yudkowsky's argument is emphatically not that the lithium claim is true. He was merely advocating for someone to fund a study. He explicitly describes the claim as "low-probability", and advocates on the basis of an (admittedly subjective) expected-value calculation.
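To illustrate the shape of that argument (with entirely made-up numbers, not Yudkowsky's): if a study would cost $1 million and you assign even a 1% chance that it identifies a driver of an epidemic whose costs run to hundreds of billions of dollars a year, the expected value of funding it is on the order of 0.01 × $100B = $1B, which dwarfs the cost. That's how a "low-probability massive-high-value gamble" can be worth taking even if you think the claim is probably wrong.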
> One of the links is a Eliezer Yudkowsky blog praising the work
That does not constitute "praise" of the work. Yudkowsky only praised the fact that someone was bucking the trend he described:
> almost nobody is investigating it in a way that takes the epidemiological facts seriously and elevates those above moralistic gluttony theories
> Pretending that this theory didn't grip the rationalist community all the way to top bloggers like Yudkowsky and Scott Alexander is revisionist history.
Aside from the remark given in the other reply to your comment, I wonder what the standard is: how quickly must a community appear to correct its incorrect beliefs for its members not to count as sheep?
> A few years ago a rationalist blogger wrote a long series postulating that trace lithium in water was causing obesity. It even got an Astral Codex Ten monetary grant. For years it got shared through the rationalist community as proof of something
As proof of what, exactly? And where is your evidence that such a thing happened?
> while ignoring contradictory evidence, and they do this very frequently.
The evidence available to me suggests that the rationalist community was not at all "lockstep" as regards the evaluation of SMTM's hypothesis.
Agree, but I think there is another, more important factor. They are a highly visible part of the internet, and their existence is mainly internet-based. This means that the people assessing them are mainly on the internet, and as we all know, internet discourse tends toward the blandly negative (ironically, my own comment is a mild example of this).
HN judges rationality quite severely. I mean, look at this thread about Mr. Beast[1], who it's safe to say is a controversial figure, and notice how the top comments are all pretty charitable. It's pretty funny to take the conversation there and then compare the comments to this article.
Scott Aaronson - in theory someone HN should be a huge fan of, from all reports a super nice and extremely intelligent guy who knows a staggering amount about quantum mechanics - says he likes rationality, and gets less charity than Mr. Beast. Huh?
The people commenting under the Mr. Beast post are probably different to the people commenting under this post.
Anyway, Mr. Beast doesn't really pretend to be more than what he is afaik. In contrast, the Rationalist tendency to use mathematics (especially Bayes's theorem) as window dressing is really, really annoying.
Most people are trying to be rational (to be sure, with varying degrees of success), and people who aren't even trying aren't really worth having abstract intellectual discussions with. I'm reminded of CS Lewis's quip in a different context that "you might just as well expect to be congratulated because, whenever you do a sum, you try to get it quite right."
Being rational and being a Rationalist are not the same thing. Funnily enough, this sort of false equivalence that relies on being "technically correct" is at the core of what makes them... difficult.
There's more to the group identity than just applying logic and rationality, yes. But surely 'rationalists' of all people wouldn't want to take the position that some of their key ideas don't result primarily from applying rational thought processes. Any part of their worldview that doesn't follow immediately from foundational principles of logic and reason is presumably subject to revision based on evidence and argument.
Fittingly enough, the Rationalist community talks about this a lot. The canonical article is here ("I can tolerate anything except the outgroup").*
The gist is that if people are really different from us then we tend to be cool with them. But if they're close to us - but not quite the same - then they tend to annoy us. Hacker News people are close enough to Rationalists that HN people find them annoying.
It's the same reason why e.g. Hitler-style Neo Nazis can have a beer with Black Nationalists, but they tend to despise Klan-style Neo Nazis. Or why Sunni and Shia Muslims have issues with each other but neither group really cares about Indigenous American religions or whatever.
Three examples of feelings-based conclusions were presented. There is what is so, and there is how you feel about it. By all means be empirical about what you felt, and maybe look into that. "How this made me feel" describes how we got the USA we have today.
I will throw in an additional factor: any group, community, or segmentation of the general population wherein the participants both tend to have a higher than average intelligence (whatever that means) and whose preoccupation revolves around almost any form of cogitation, consideration, products of human thought ... will invariably get hit with some form of snobbery/envy, even if no explicitly stated intelligence threshold or gatekeeping is present.
Bluntly put, you are not allowed to be even a little smart and not all "aww shucks" about it. It has to be in service of something else like medicine or being a CPA. (Fun fact I found in a statistics course: the average CPA has about five points of IQ on the average doctor.) And it is almost justified, because you are in constant peril of falling down into your own butt until you disappear, but at the same time it keeps a lot of people under the thumb (or heel, pick your oppressive body part) of dumbass managers and idiots who blithely plow forward without a trace of doubt.
I think these are all true and relevant, but the main problem is that their thesis that "ASI alignment will be extremely difficult" can only really be proven in hindsight.
It's like they're crying wolf but can't prove there's actually a wolf, only vague signs of one, but if the wolf ever becomes visible it will be way too late to do anything. Obviously no one is going to respect a group like that and many people will despise them.
Nope. Has nothing to do with them being nerds. They are actively dangerous; their views almost always lead to extremely reactionary politics. EA and RA are deeply anti-human. In some cases that manifests as a desire to subjugate humanity under iron-fisted technocratic rule, in other cases as a desire to kill off humanity.
Either way, as an ideology it must be stopped. It should not be treated with kid gloves; it is an ideology that is actively influencing the ruling elites right now (JD Vance, Musk, and Thiel are part of this cult, and also simultaneously believe in German-style Nazism, which is broadly compatible with RA). The only silver lining is that some of their ideas about power-seeking tactics are so ineffective they will never work -- in other words, humanity will prevail over these ghouls, because they came in with so many bad assumptions that they've lost touch with reality.
Huh? They give millions of dollars to global humanitarian development funds, saving at least 50,000 lives per year. Maybe you're taking a few kooks who gave a controversial lecture somewhere as representing everyone else.
It is true that there were EAs, before the conception of "longtermism", who were relatively mundane and actually helping humanity, and not part of a death cult. But they have been shunned from the EA movement for a while now.
The support for eugenics and ethnic cleansing, the absolute obsession with strictly utilitarian ethics and ignorance of other ethics, the "kill a bunch of humans now so that trillions can live in the future" longtermist death cult, and the whole Roko's basilisk worship that usually goes like "one AI system can take over the entirety of humanity and start eating the galaxy, therefore we must forcefully jump in the driver seat of that dangerous AI right now so that our elite ideology is locked in for a trillion years of galactic evolution".