I'm really tired of people who claim the result of a psychological study is "obvious".
No, it isn't obvious at all and your example doesn't even correspond to the paper in any way. [1]
With GPS, it's easy to simply not pay attention. And if you're not paying equal attention, then you're not even trying to learn. But that's not even what the study's about.
The study is about people actively trying to learn, first of all. It's about the learning rates of people trying to solve something through trial and error, not people trying to memorize a route they researched, looked up, or were told.
But the study is specifically about discovering that choice confirmation bias appears to lead to higher learning rates, which is entirely non-obvious. In layman's terms, why would "assuming I'm right" lead to better learning outcomes? Traditionally, common sense tells us that prejudice or bias leads to worse learning outcomes.
So this is actually a quite interesting, non-obvious result.

[1] https://www.nature.com/articles/s41562-020-0919-5
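For anyone curious what "learning rates" means mechanically here: studies in this area typically fit a reinforcement-learning model where confirming and disconfirming outcomes get different learning rates. Here's a minimal Python sketch of that idea (a toy two-armed bandit of my own, not the paper's actual task or parameters):

    import random

    # Toy two-armed bandit: arm 0 pays off 25% of the time, arm 1 pays off 75%.
    PROBS = [0.25, 0.75]

    def run(alpha_confirm, alpha_disconfirm, trials=1000, seed=0):
        """Q-learning where prediction errors that confirm the choice
        (delta >= 0 on the chosen arm) update beliefs at alpha_confirm,
        and disconfirming errors (delta < 0) at alpha_disconfirm."""
        rng = random.Random(seed)
        q = [0.5, 0.5]                      # initial value estimates
        picks_best = 0
        for _ in range(trials):
            # epsilon-greedy: mostly exploit, explore 10% of the time
            if rng.random() < 0.1:
                choice = rng.randrange(2)
            else:
                choice = 0 if q[0] >= q[1] else 1
            reward = 1.0 if rng.random() < PROBS[choice] else 0.0
            delta = reward - q[choice]      # prediction error
            # "confirmation bias": outcomes that agree with your choice
            # move your beliefs more than outcomes that contradict them
            alpha = alpha_confirm if delta >= 0 else alpha_disconfirm
            q[choice] += alpha * delta
            picks_best += (choice == 1)
        return picks_best / trials

    print("unbiased (0.30/0.30):", run(0.3, 0.3))
    print("biased   (0.40/0.20):", run(0.4, 0.2))

Whether the biased learner ends up picking the better arm more often depends on the payoff structure and the noise, which is exactly why "is this bias good or bad for learning?" isn't something you can settle by eyeballing it.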
At the same time, in a big enough forum, there will always be many people who have already intuited what most psychological scientific studies are about, and had it confirmed by their own experience many times over.
I’m equally irritated when people label an idea invalid until it’s been scientifically confirmed and peer reviewed. Science is often about confirmation, not about creation. Many people are likely to have realized something long before a team of scientists got funded, figured out how to test the idea, analyzed results, wrote papers, published, and officially confirmed it.
You're right, for some things people will have already intuited it, and in a large enough group someone will have intuited pretty much everything. But many people will also have the wrong intuition.
The danger here is that if we find data that goes against our common sense, we have to really consider that data, not reject it and rely only on our common sense. Sometimes it's right, sometimes it's wrong. Having a study prove something by no means makes it definitive, but it does provide better evidence and lets us better know whether our intuition aligns with reality.
Often people will learn something and then that thing becomes intuitive even though it wasn't before. As an example, we've all had that calculus (or programming) teacher who thought everything was obvious and students were dumb. Most of us struggle and THEN it becomes obvious. In fact, on a forum like this you might see people respond that they didn't struggle and it was intuitive, and we won't know if that's real or if they struggled and just forgot (which our brains do a lot).
Sure. But it's often that people "know" something that is false. The expression "like ripping off a band-aid" is a great example. It's what doctors and nurses believe, to the degree that they willingly torture their patients because they honestly believe they are reducing total suffering. But they are wrong.
Source: the work of Dan Ariely. His TED talk is a good introduction.
Sure. But you won’t see anyone commenting about that. I’m only saying some people will in fact intuit a thing and be right, and we should hardly admonish them for doing so.
The higher rated their comment is, the more other people likely intuited the same.
It’s almost a decent measure of how obvious the thing being proven actually was — assuming you could baseline it and compare it to others in a meaningful way.
>Many people are likely to have realized something long before a team of scientists got funded, figured out how to test the idea, analyzed results, wrote papers, published, and officially confirmed it.
Can you expand on that? Who are these many people? Any examples? Lay people who have nothing to do with the field, simply intuiting results without doing the work to carefully rule out external factors? Or people without scientific training performing science without realizing that what they're doing is science?
"I'm really tired of people who claim the result of a psychological study is "obvious".
Thank you crazygringo2 for saying this.
"Obvious" is my most disliked word.
Its reserved for 'expert' people (gatekeepers) who forgot what its like to be a beginner.
"People aren't dumb, its just that the world is a complicated place."
-Professor Richard Thaler, thought leader of Behavioral Economics, 2017 Nobel Prize winner
Colloquially, "this will be obvious to anyone who X" means "you will already have strong belief values for this based on evidence gained through X". I wouldn't sweat it.
As you can imagine, sometimes this ends up declaring the obviousness of things that are untrue.
For instance, it's obvious to anyone who has walked in a stiff breeze that it is impossible to sail faster than the wind :)
Haven't finished reading the paper yet, but it is not entirely obvious that when participants are being directed on how to choose, they are paying as much attention as when they have to choose for themselves. That seems like a possible confound.
And the GPS analogy also suggested itself to me.
In fact, when the GPS gives me bad advice and I turn the wrong way, I seem to learn more from that too...
>I'm really tired of people who claim the result of a psychological study is "obvious".
Personally, I don't get too wound up on people's opinions on psychological studies. They're fun little conversation filler, but psychology isn't a science.[1]
Plus, you make assumptions about what I'm saying is obvious and about what the study actually showed. When I first got a GPS, I noticed the effect of not knowing where I was, so I tried to pay attention in case I didn't want to use the GPS later. I noticed that even when paying attention, trying to learn, you just don't engage the full set of neurons that you do when you're struggling to do something yourself. I still felt lost in situations where, prior to a GPS, I would have felt confident in my ability to navigate.
How could the study show that people were trying to learn when they were being told what to do? You can't control for that or see into their heads to see if they're making the same efforts in the same ways.
This study makes the same mistake as tons of psychological studies: it attempts to simplify or abstract mental processes and then pretends that the simplification or abstraction stands in for some other mental process's result.
It's weirdly ironic that you cited an article that doesn't support what you're asserting in response to a comment complaining about, among other things, your giving an irrelevant example that does not address the original paper.
There is also a group of people who take a simple study model and apply it to anything and everything in their lives; that's about as anti-science as you can get. Maybe I'm more sensitive to this behavior since I spend a lot of time evaluating research papers for work, whereas the average person might just read a headline or two in a journal.
I'm sure folks mean well, but it stems from this thinking that science is some kind of truth generator. It simply isn't; it's a method of investigating the natural world. Only when enough people independently verify your result will you get _closer_ to what could be defined as objectively true. And even after verification, there could still be huge gaps in our understanding of a natural phenomenon. There is no guarantee that one group of researchers is going to present a complete picture. It may take many researchers studying the same topic over many decades to reach a scientific consensus. Until then, it's simply an idea (or set of ideas), which could be interesting to think about by itself, but it's not incontrovertible truth.
If you're right, you already have mental systems in place to produce the right answer. If you're wrong, the feedback will help your systems self-correct.
If you ask for help, you're using someone else's systems, and whether they're right or wrong says very little about how you should be approaching things. You're not producing a system; you're adding an item to a lookup table that you might someday use to build a coherent approach.
'Fire is hot' is a good entry for a lookup table, but you also need to develop a general model of where potential dangers lie and that you should approach them with caution. No number of entries equals that system.
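The lookup-table vs. system distinction can be made concrete with a toy sketch (the names and the "danger" rule here are invented for illustration, not from any source):

    # A lookup table only answers questions it has literally seen before.
    known_dangers = {"fire": True, "pillow": False}

    def lookup_is_dangerous(thing):
        # Returns None for anything not in the table: no entry, no opinion.
        return known_dangers.get(thing)

    # A "system" generalizes from features, so it produces an answer,
    # possibly wrong but correctable by feedback, for things it never saw.
    def model_is_dangerous(temperature_c):
        # Toy general model: temperature extremes are dangerous.
        return temperature_c > 60 or temperature_c < -20

    print(lookup_is_dangerous("lava"))   # None: the table has nothing to say
    print(model_is_dangerous(800))       # True: the model extrapolates to lava

No number of known_dangers entries would ever answer the lava question; only the generalizing function does, and that's the kind of thing you build by trying things and being corrected.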
Thank you for this comment. I see "it's obvious" here and on Reddit all the time with scientific papers, and you eloquently addressed why it's so wrong to say.
If you have a strong prior belief about the outcome of your action and it doesn't go as you planned, you have a much better signal with which to update your internal model.
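In delta-rule terms (my framing, not necessarily the parent's), the strength of the prior sets the size of the prediction error, and the prediction error is the learning signal:

    def update(belief, outcome, lr=0.5):
        """One step of the classic delta rule: move the belief toward
        the outcome in proportion to the prediction error."""
        error = outcome - belief   # the "signal" in question
        return belief + lr * error

    # Two learners are proven wrong (outcome = 0); the confident one
    # receives the larger error, hence the larger correction.
    for prior in (0.55, 0.95):
        print(f"prior={prior:.2f}  error={0.0 - prior:+.2f}  "
              f"new belief={update(prior, 0.0):.3f}")

The 0.95 prior that turns out wrong produces nearly twice the correction of the hedged 0.55 one.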