A lot of people know what this is and it bothers them?
I’m still lost, so I looked at that older blog post, and it’s also not explained there, and the linked subreddit is the “non-political one”
So, reading the room here, there is a political context, those terms above are political, and I should google something about “lesswrong politics”
I’ll maybe check out that particular rabbit hole in synthesizing, but can you enlighten me further, because I still have no idea what you’re talking about
They're explicitly not political. LessWrong is a website/community, and rationality is about trying to think better by being aware of common cognitive biases and correcting for them. It's also about trying to make better predictions and understand things better by applying Bayes' theorem where possible to account for new evidence (https://en.wikipedia.org/wiki/Bayes%27_theorem), and being willing to change your mind when the evidence changes.
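To make the Bayes bit concrete, here's a toy sketch of what that kind of updating looks like. The scenario and the numbers are made up purely for illustration, not taken from anything on the site:

```python
# Toy Bayesian update: how much should one piece of evidence shift a belief?
# All numbers below are made up for illustration.

prior = 0.01                # P(hypothesis): initial belief the hypothesis is true
p_evidence_if_true = 0.90   # P(evidence | hypothesis)
p_evidence_if_false = 0.05  # P(evidence | not hypothesis)

# Total probability of seeing the evidence at all (law of total probability)
p_evidence = (p_evidence_if_true * prior
              + p_evidence_if_false * (1 - prior))

# Bayes' theorem: P(hypothesis | evidence)
posterior = p_evidence_if_true * prior / p_evidence

print(f"belief before the evidence: {prior:.1%}")    # 1.0%
print(f"belief after the evidence:  {posterior:.1%}")  # ~15.4%
```

The point is just that the evidence moves the belief by a specific, computable amount, rather than flipping it straight to certainty.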
It's about trying to understand and accept what's true no matter what political tribe it could potentially align with. See: https://www.lesswrong.com/rationality
The reason the groups overlap a lot with AGI is that Eliezer Yudkowsky started LessWrong and founded MIRI (the Machine Intelligence Research Institute). He's also formalized a lot of the thinking around the goal alignment problem and the existential risk of discovering how to create an AGI that can improve itself without first figuring out how to align it to human goals.
Great, yeah, that sounds like something I wish I knew existed.
It's been very hard to find people able to separate their emotions from an accurate description of reality, even when that description sounds like it comes from a different political tribe. Or, more so, people are quick to assume you are part of a political tribe if some of your words don't match their own tribe's description of reality, even when what was said was the most accurate.
I found the community around 2012 and I remember wishing I had known it existed too.
In that list, the LessWrong posts are probably what I'd read first since they're generally short (Scott Alexander's are usually long) and you'll get a feel for the writing.
As an aside about the emotions bit, it’s not so much separating them but recognizing when they’re aligned with the truth and when they’re not: https://www.lesswrong.com/tag/emotions