
> it overlooks the fact that what triumphs in the long run is simply reproductive fitness.

Why can't that observation be taken into account? Isn't the entire point of the approach to account for all inputs to the extent possible?

I think you are making invalid assumptions about the motivations, goals, internal state, etc. of the actors, which you are then conflating with the approach itself. That there are certain conditions under which the approach is not an optimal strategy does not imply that it is never competitive under any.

The observation is then that rationalism requires certain prerequisites before it can reliably outcompete other approaches. That seems reasonable enough when you consider that a fruit fly is unlikely to be able to successfully employ higher-level reasoning as a survival strategy.






> Why can't that observation be taken into account?

Of course it can be. I'm saying that AFAICT it generally isn't.

> rationalism requires certain prerequisites before it can reliably outcompete other approaches

Yes. And one of those, IMHO, is explicitly recognizing that rationalism does not triumph simply because it is rational, and coming up with strategies to compensate. But the rationalist community seems too hung up on things like malicious AI and Roko's basilisk to put much effort into that.


This argument proves too much. If rationalism can't "triumph" (presumably over other modes of thought) because evolution makes moral realism unobservable, then no epistemic framework will help you. Does empirically observing the brutality of evolution lead to better results? Or perhaps we should hypothesise that it's brutal and then test that prediction against what we observe?

I'm sympathetic to the idea that we know nothing because of the reproductive impulse to avoid doing or thinking about things that led our ancestors to avoid procreation, but such a conclusion can't be total; otherwise it is self-defeating, since it is itself contingent on rationalist assumptions about the mind's capacity to model knowledge.


The point being made is that rationalism is a framework. Having a framework does not imply competent execution. At lower levels of competence other strategies win out. At higher levels of competence we expect rationalism to win out.

Even then, that might not always be the case. Sometimes there are severe time, bandwidth, energy, or other constraints that preclude carefully collecting data and thinking things through. In those cases a heuristic that is very obviously not derived from any sort of critical thought process might well be the winning strategy.

There will also be cases where the answer provided by the rational approach will be to conform to some other framework, for example where cult-type ingroup dynamics are involved across a large portion of the population.


> Having a framework does not imply competent execution.

Exactly right. It is not rationalism per se that is the problem; it is the way that The Rationalists are implementing it, the things they are choosing to focus their attention on. They are worried about things like hostile AI and Roko's Basilisk when what they should be worried about is MAGA, because MAGA is driven not by rationalism but by Christian nationalism. MAGA is busily (and openly!) undermining every last hint of rationalism in the U.S. government, but the Rationalist community seems oddly unconcerned with this. Many self-styled Rationalists are even Trump supporters.



