
The other problem is that all the details matter all the time.

Like if you want to mathematically model what happens in a pool hall when somebody strikes a ball with a great deal of force... by the time you get to the sixth or seventh bounce you have to start taking into account the position and movement of the people standing around the table watching. The airflow, the vibration of them moving, their relative gravitational pull, etc. It all matters all the time.
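To put rough numbers on that (a toy sketch; the ball radius, spacing, and starting error below are illustrative assumptions, not measurements): each ball-to-ball collision amplifies any angular error in the trajectory by roughly the distance travelled between balls divided by the ball radius, so even an absurdly small initial perturbation swamps the prediction within about ten collisions.

    # Toy chaos amplification estimate. All numbers are assumed for
    # illustration; the point is the exponential growth, not the values.
    BALL_RADIUS = 0.028        # metres, about a pool ball
    GAP_BETWEEN_BALLS = 0.30   # metres travelled between collisions (assumed)
    AMPLIFICATION = GAP_BETWEEN_BALLS / BALL_RADIUS   # ~10x per collision

    error = 1e-10              # initial angular error in radians (assumed tiny)
    for bounce in range(1, 13):
        error *= AMPLIFICATION
        print(f"bounce {bounce:2d}: angular error ~ {error:.1e} rad")
        if error > 1.0:        # an error of order one radian: no predictive power left
            print("the model has lost the ball entirely")
            break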

And the problem only gets worse the larger the scale and the longer the timeline.

Like if you want to manage an economy.

It is tempting to look at "things from a high level" and imagine that all the details just kind of average themselves out. So it shouldn't be necessary to figure out the behavior of each individual in a national economy; it should be possible to simply plot the results of their decision making over time and extrapolate into the future to come up with meaningful policy decisions and five-year plans.

The problem is that this doesn't work, because all the details matter all the time.

Also, the very act of making policy changes the behavior of the economy in wildly unpredictable ways. Every individual actor will change their behavior and decision making based on your decision making, which then changes the behavior and decision making of every other individual, and so on: an endless fractal involving billions of actors, since your national economy is not isolated from the forces of every other economy, and vice versa.

Also, trying to make targets out of measurements and indicators tends to destroy their value as measurements and indicators (Goodhart's law). Meaning that by setting policies you are destroying the very information you based your decisions on.
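A minimal sketch of that effect, with made-up numbers rather than real economic data: a proxy metric starts out tightly coupled to the thing you actually care about, and stops carrying information the moment actors optimize the metric directly.

    # Toy illustration of Goodhart's law. All numbers are assumptions.
    import random
    random.seed(0)

    def correlation(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Before the metric becomes a target: it tracks real output plus noise.
    output = [random.gauss(0, 1) for _ in range(10_000)]
    metric = [o + random.gauss(0, 0.3) for o in output]
    print("correlation before targeting:", round(correlation(output, metric), 2))

    # After it becomes a target: actors inflate the metric independently
    # of real output, and it stops measuring anything.
    metric = [o + random.gauss(0, 3) for o in output]
    print("correlation after targeting: ", round(correlation(output, metric), 2))

The metric's definition never changed; the behavior around it did, which is why the information is already gone by the time the policy is in force.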

So you can't collect enough information to make good decisions. The information you receive is already obsolete by the time you get it. And the act of trying to create rules and policies based on the information you do have tends to destroy what little value it has left.




You sound like you would really enjoy the book Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed.

I wonder if a lot of this comes back to the enlightenment/science-y way of looking at the world that imagines the way to understand stuff is to break it into subproblems, solve those, and build back up from there. It relies on a fundamental assumption that there are separate things, instead of one big continuous process of happening. I recently read about a study where participants were asked to pick the best car for a set of needs, and were given 4 variables per car in one case and 16 variables per car in another. Each group was then either distracted while pondering or allowed to think it through directly/consciously. The conscious-thought group did better than the distracted group when there were 4 variables, but worse when there were more. Intuition is better at catching the details you'd otherwise miss.


…and that’s all before you get into all the other logical fallacies that tend to compromise one’s perspective. Anything that requires anticipating and/or interpreting the behavior of other people, or that involves accounting for risks or probabilities, is especially fraught, as our own instincts and nature actively work to warp objective reality.

In the context of policy making (or presidential fiat, as the case may be), there is always the risk of mistaking what people should do for what they will do. A pragmatic strategy for success will include systems that help thwart the worst impulses of our flawed reasoning: dispassionate peer-reviewed analysis (oops) untethered from the ambitions or ideologies of individual people or groups (oops), a diverse array of advisory opinions (oops), functional checks on monolithic authority (oops), and mechanisms for correcting prior mistakes (fingers crossed).

I think this all contributes to the phenomenon that folks have (a bit erroneously) associated with the Dunning-Kruger effect: essentially, the idea that people who haven’t learned enough to know how much they don’t know are dangerously overconfident and naive. That said, I think there is a tendency to assume this about others that is probably fallacious in and of itself. In the case of current events, I don’t believe the individuals involved actually care enough to have even mounted the left peak of the Dunning-Kruger chart; rather, they are fully uninformed and unconcerned with the implications of much of anything outside their own very narrow ideological ends. (It’s probably more accurate to apply Dunning-Kruger to the ideology itself, or to the broader coalition of partisan cohorts who share it, than to the people wielding it.)



