Psychology research is starting to suggest that the way the human brain is organized makes it especially bad at predicting the future. That is, we make more mistakes when building timelines like this than a simple lack of information can explain. You could call it the Eloi-Morlock bias:
Regarding distant futures, however, we’ll be too confident, focus too much on unlikely global events, rely too much on trends, theories, and loose abstractions, while neglecting details and variation. We’ll assume the main events take place far away (e.g., space), and uniformly across large regions. We’ll focus on untrustworthy consistently-behaving globally-organized social-others. And we’ll neglect feasibility, taking chances to achieve core grand symbolic values, rather than ordinary muddled values. Sound familiar?
More bluntly, we seem primed to confidently see history as an inevitable march toward a theory-predicted global conflict with an alien united them determined to oppose our core symbolic values, making infeasible overly-risky overconfident plans to oppose them. We seem primed to neglect the value and prospect of trillions of quirky future creatures not fundamentally that different from us, focused on their simple day-to-day pleasures, mostly getting along peacefully in vastly-varied uncoordinated and hard-to-predict local cultures and life-styles.
The original paper in Science: http://www.sciencemag.org/content/322/5905/1201.full.pdf
The quote is from here, with more explanation: http://www.overcomingbias.com/2008/11/abstractdistant.html
And here's more on the implications for evolutionary game theory: http://www.overcomingbias.com/2009/01/a-tale-of-two-tradeoff...
The last two are great reads, with one caveat. The author specializes in asking uncomfortable questions, so it's best to take them as a challenge to your sense of open-mindedness. If you find yourself getting angry or thinking "this can't possibly be true," you might want to think about it some more.