Sure! Rationality is what Eliezer called his project of teaching people to reason better (more empirically, more probabilistically) in the events I described over here: https://news.ycombinator.com/item?id=44320919 .

I don't know rationalism too well, but I think it was a historical philosophical movement asserting that you could derive knowledge by reasoning from axioms rather than from observation.

The primary difference is that rationality mostly teaches "use your reason to guide what to observe and how to react to observations" rather than doing away with observation altogether; it's essentially an action loop that alternates between observation and belief propagation.
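That observe-then-update loop can be sketched concretely. This is my own toy illustration (the coin hypotheses and observation string are invented for the example, not anything from the Sequences): maintain a belief distribution over hypotheses, and after each observation propagate it forward with Bayes' rule.

```python
# Toy observe/update loop: Bayesian belief propagation over a possibly
# biased coin. Hypotheses and data are made up for illustration.

def bayes_update(prior, likelihoods):
    """Bayes' rule: posterior is proportional to prior * likelihood."""
    unnormalized = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnormalized)
    return [u / total for u in unnormalized]

# Two hypotheses: the coin is fair (P(heads) = 0.5) or biased (P(heads) = 0.9).
hypotheses = [0.5, 0.9]
belief = [0.5, 0.5]  # uniform prior over the two hypotheses

observations = "HHTHHHHH"  # H = heads, T = tails
for obs in observations:
    # Likelihood of this single observation under each hypothesis
    likelihoods = [h if obs == "H" else 1 - h for h in hypotheses]
    belief = bayes_update(belief, likelihoods)

print(belief)  # after mostly heads, belief shifts toward the biased coin
```

The point of the loop is that no single observation settles anything; belief just keeps moving with the evidence.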

A prototypical/mathematical example of a pure LessWrong-type "rational" reasoner is Hutter's AIXI (a definition of the "optimal" next step given an input tape and a goal), although it has certain known problems with self-reference. Of course, reasoning this way does not work for humans; a large part of the Sequences is an attempt to port mathematically correct reasoning to human cognition.
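For concreteness, the AIXI action rule (written from memory of Hutter's formulation, so treat the notation as a sketch) is an expectimax over all future action/percept sequences up to a horizon m, weighted by a Solomonoff-style prior over environment programs q, where U is a universal Turing machine and ℓ(q) is the length of q:

\[
a_k \;=\; \arg\max_{a_k} \sum_{o_k r_k} \;\cdots\; \max_{a_m} \sum_{o_m r_m}
\bigl( r_k + \cdots + r_m \bigr)
\sum_{q \,:\, U(q,\, a_1 \dots a_m) \,=\, o_1 r_1 \dots o_m r_m} 2^{-\ell(q)}
\]

The 2^{-ℓ(q)} weighting is what makes it "pure" reasoning: shorter world-programs get exponentially more prior weight, and everything else is just expected-reward maximization. It is also why AIXI is incomputable in general.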

You can kind of read it as a continuation of early-2000s internet atheism: instead of defining correct reasoning negatively by enumerating incorrect logic, i.e. "fallacies", it attempts to construct it positively, by describing what to do rather than just what not to do.





