Maybe it's just me, but I find Kahneman frustrating to read and this article really demonstrates the problem: he defines a concept and then gives examples which don't meet the definition.
He defines intuition as thinking you know without knowing how you know. OK, great. Now the example: guessing someone's GPA based on the fact that they started reading early. Like most people, I guessed it would be on the high side. Kahneman calls this "intuition that is generated automatically with high confidence." And that's… just wrong.
I have very little confidence in my guess, and it certainly isn't generated automatically. It's based on a belief—which turns out to be wrong—that early reading has some relationship with college GPA.
So maybe in this example, he is using the dictionary definition of "intuition", which is basically non-conscious reasoning. He thinks this shows the problem with intuition, but again, that's just wrong.
I didn't get the answer wrong because I failed to reason consciously. It's because my belief about the relationship between early reading and GPA was wrong. If I take his advice and don't trust my intuition, and carefully and logically reason my way to the answer, guess what? I'm still wrong because my information is wrong.
His explanation of when to trust your intuition is similarly flawed. He basically says you can trust your intuition when you have good information. Well yeah. That's a generic statement that goes for almost any method of drawing conclusions about the world, and doesn't help us understand when intuition is useful or not useful.
I think at least some of it comes from the fact that Kahneman didn't write the article. It's assembled from both the referenced speech and the book he wrote, and marries them together in a weird way. I find articles like this extremely thin because they synthesize points through very tenuous connections. The original authors have hundreds of examples and data sets behind their arguments, and Kahneman is one of those authors. He certainly didn't win the Nobel Prize in Economic Sciences for nothing -- he has serious data to back up his research.
If I had to sum Thinking, Fast and Slow up in one sentence: human intuition shouldn't be trusted when it comes to statistics problems, because humans aren't naturally statistical thinkers.
Research across behavioral economics shows this pattern very well. GPA distributions among populations are statistically quantifiable. Due to human cognitive biases, however, the question appears to have an obvious intuitive answer, and that answer is wrong.
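If I remember the book right, the statistics behind the GPA question is regression to the mean: a sound prediction shrinks toward the average in proportion to the correlation, which intuition skips. A minimal sketch in Python with made-up numbers (the figures are mine, not Kahneman's):

    # Made-up numbers, purely illustrative; only the structure is Kahneman's point.
    mean_gpa, sd_gpa = 3.0, 0.5
    z_reading = 1.5   # "read early" puts the child ~1.5 SD above average
    corr = 0.3        # assumed weak correlation between early reading and GPA

    # Intuition "matches intensity": a GPA as extreme as the reading age.
    intuitive_guess = mean_gpa + z_reading * sd_gpa            # 3.75

    # A defensible prediction regresses toward the mean by the correlation.
    regressed_guess = mean_gpa + corr * z_reading * sd_gpa     # 3.225

    print(intuitive_guess, regressed_guess)

The gap between the two numbers is exactly the bias he's describing: intuition matches the extremity of the evidence instead of discounting it by the strength of the correlation.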
The three specific qualities outlined as enabling trustworthy intuition boil down to having a large body of evidence and a very tight feedback loop, which statistics lacks but things like reading emotions have. If you get your significant other's emotions wrong, you'll see the repercussions immediately. Casually misjudging someone on the street based on statistics will never get you corrected.
But really, I think you should just read the book. This article doesn't even come close to doing behavioral economics justice.
I have read the book, and found it frustrating for the same reasons I mentioned. For example, the core concept of System 1 vs System 2. He says early on in the book that they aren't actually "systems", i.e. physical brain structures. But then he uses language like "System 1 does X and System 2 does Y," implying there's some grounding in neuroscience, when there is in fact none. They are just a grab bag of functions that don't have much in common, but we're led to believe he's describing something fundamental about the brain.
The other thing that annoys me is the pessimism and nihilism of his view of human beings as fatally irrational. One response I found interesting is the book The Enigma of Reason, by two evolutionary psychologists who argue that reason evolved in humans specifically because we are social animals, and that the purpose of reason is to persuade others. It's why we are so much better at catching other people's poor reasoning than our own. The implication is that reason is designed for group settings. Showing that isolated individuals reason badly is like riding a bike on an ice skating rink and concluding that the bike sucks. We test human reasoning in conditions it's not designed for and then come away with an overly pessimistic view of humanity.
I've not read "Thinking Fast and Slow" but long ago I read "Heuristics and Biases"[0] and attended one of Kahneman's talks during his tour promoting his newest book.
I couldn't help but think that TFaS was the product not so much of a desire to share new insights he'd recently discovered as of a realization (his own, or some publisher's) that authors such as Pinker, Greene, Tyson, et al. were garnering considerable attention and revenue by translating sophisticated scientific topics developed by others for a lay audience. As if someone had said to him: "Hey, you were involved in discovering some sophisticated and important topics, why don't you translate those for a lay audience?!"
I have great respect for Kahneman and H&B is a wonderful collection. However, I find it peculiar that the entirety of TFaS seems, on the surface (again, I've not read it), strikingly similar to essay #22 in H&B, by Steven A. Sloman, titled "Two Systems of Reasoning". In it he uses Kahneman's and Tversky's work to argue for two competing strategies for arriving at the truth: associative vs. rule-based.
It's as if Kahneman is borrowing from Sloman who borrowed from Kahneman.
> But then he uses language like "System 1 does X and System 2 does Y," implying there's some grounding in neuroscience, when there is in fact none.
I think saying "System 1 does X and System 2 does Y" is fine even without grounding in neuroscience. It seems equivalent to saying, "The foreach loop will iterate over every item in the collection," when in fact there is nothing underneath but registers being decremented (or, going further, merely the presence or absence of electrical signals).
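To make the analogy concrete, here's a minimal sketch in Python (my choice of language, purely illustrative): the same loop described at two levels of abstraction, both equally true.

    items = ["anchoring", "availability", "substitution"]

    # High-level description: "the foreach loop iterates over every item."
    for item in items:
        print(item)

    # Lower-level description of the exact same behavior: Python's for
    # statement is sugar over the iterator protocol (iter/next/StopIteration).
    it = iter(items)
    while True:
        try:
            item = next(it)
        except StopIteration:
            break
        print(item)

Below that sits machine code: registers, branches, voltages. Each layer is a legitimate way to talk about what's happening, and "System 1 / System 2" can be read the same way: a useful vocabulary, not an anatomical claim.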
The view of people as irrational has huge implications for how we think about public policy, and even for the concept of democracy. It's a huge threat to the Enlightenment-era form of classical liberalism. Unfortunately, given the success of the new model in fields as disparate as economics and hostage negotiation (Never Split the Difference, Chris Voss), it seems this is closer to reality.
1) First, the underlying study is likely asking both "what do you think?" and "how confident are you in your answer?" The conditions of the study are very different from the conditions of reading an article about Kahneman's work: when you read the article, you know you should have low confidence in your answers, because he's a master of finding places where our brains do screwy things.
2) He's defining "good information" a bit more precisely, though: how do you know your information is "good"? Everyone will say they've got good information if you ask them, right?
Didn't read the article, just commenting on Kahneman. I also found his books very annoying. I was already familiar with most of the research he references and thought that on nearly every other page he slightly misrepresented someone's work. He doesn't just cherry-pick, but writes like a Procrustes. If you are interested in pursuing the topic of heuristic thinking, Gerd Gigerenzer is equally interesting but more careful in his conclusions.