What I found interesting is that the period for reappearance of “failed ideas” is roughly the length of a career. Whether that's coincidence or some real effect, I don't know. I wrote a little about it here:
“Don’t be discouraged by the failure of technology or approaches of the past. If the problem is still important after all this time then it’s a problem worth solving. Don’t avoid unsuccessful solutions. Avoid insignificant problems. Success this time around could be unlocked by advances in any number of unrelated disciplines.”
I went through grad school and I saw how change comes about one funeral at a time. A big-shot professor who has made their name with their pet theory/work will fiercely defend it against competing ideas, and people will defer to their expertise. It's not scientific, it's egotistic. They are a wall against progress that gets removed only upon death. These professors can also create a dogma through their students, who all rely on the continued success of their theory for their employment. The incentives are misaligned against accepting new theories if it means someone will lose prestige.
We tend to romanticise it, but in the end it's just another job. It shouldn't be, because selling the image of doing science is not the same as doing science; but on the other hand, social expectations are that you must have a job, so what kind of exemption were we hoping for exactly? Once you realise it's ordinary people looking for whatever ordinary people look for anywhere else, everything falls into place. Some science gets done regardless; we're talking about professionals here.
Academia can be likened more to a clergy or priesthood than a job. Although academics are expected to publish, the results or output aren't as tangible or quantifiable as in most jobs.
Well, it's not that. Output is quantifiable and expected; try surviving pre-tenure while just doing your own thing. Alas, the Republic of Letters is long gone. Kind of an ought-is problem.
I don't know; I did mention tenure, though. I know that Peter Higgs published, IIRC, 11(?) papers, and that he has talked about his former status at his department and about not being productive enough for the modern academic system.
That was about 15 books, several of which I've heard of, but at least on that web page it's unclear how many academic papers there are. He's clearly making an impact with his ideas.
Why does a scholarly work need to be labeled as an ‘academic paper’ when evaluating someone’s work? Did work from the 17th century not qualify because it wasn’t published in a journal?
But that's the system we built. Starting in college you're going to spend a decade and a half studying to be a world-class expert researching a topic of your own choosing.
And as soon as you become a professor you stop doing research and start begging for research funding so others can do what you did. And no one sees the problem with this so it continues.
"If an elderly but distinguished scientist says that something is possible, he is almost certainly right; but if he says that it is impossible, he is very probably wrong."
It is unfortunate -- sometimes very smart people try good ideas, but the world isn't quite ready yet so they fail, move on to something different, and write off the old idea.
That's more than a little ageist. Age has little to do with it. I've worked for--and been managed by--thick-headed, egotistical 20-somethings and stodgy older folks alike.
Youth who think they're forging new territory usually aren't, but there are plenty who think they are and have an arrogance about it that is insufferable.
> Youth who think they're forging new territory usually aren't
But that's part of the point of the article: The territory actively changes, so even if it's the same location in some sense, it's still new territory, ripe for exploration. Old people can see that, but it requires looking with fresh eyes, as opposed to only being willing to see what was there decades ago.
Not disputing your claims or msla's, but how is this sentence ageist -
> Old people can see that, but it requires looking with fresh eyes, as opposed to only being willing to see what was there decades ago.
while this one is not -
> Youth who think they're forging new territory usually aren't, but there are plenty who think they are and have an arrogance about it that is insufferable.
At least in science there's Planck's principle, which is essentially "Science progresses one funeral at a time": the adoption of major new mental models is driven mostly not by people changing their minds, but by them "dying out" and being replaced with new people who have adopted the new model from the beginning.
>>Old people can see that, but it requires looking with fresh eyes, as opposed to only being willing to see what was there decades ago.
The controversy arises from the structure of the sentence. It implies that 'looking with fresh eyes' is the rare thing, and the common thing is that older people are "only being willing to see what was there decades ago."
Sure, you technically acknowledged that it was possible, but the sentence structure guides the reader to conclude that your view is that old people only see the old stuff, thereby treating them as a monolithic unit, and thus ageist.
I've often had this sort of issue myself. I've found the only thing that helps is merciless self-editing, and keeping in mind two things.
The first is a passing comment from a professional writer friend. While we were writing a short piece together, they were weighing how allocating more words or fewer gave different weight to each concept we were trying to get across, and said something like "using that many words here gives it too much weight".
The second thing I've often found useful is to put the key concept and a key word at the end of the sentence and paragraph, where it is actually most punchy and emphatic. The beginning is second best, and the middle just buries it.
I notice that your sentence buried the 'fresh eyes' concept in the middle, and used a much longer closing phrase ending in 'decades ago', so compared to your stated intent here, the emphasis was kind of backwards.
Perhaps better would have been: "Old people can see that, and despite everyone's tendency to see just what they already know, they'll see the new opportunities just as well as anyone else when they make the effort to look with fresh eyes."
This doesn't make any sense.
Aging is certainly a problem that needs to be solved, but the idea that you expressed here seems to be that younger people are better at their jobs - which is very wrong.
This line of thinking is akin to "If fewer of the postmen had diabetes, I would get more of my mail faster!"
Don't fall for these ageist logical inconsistencies.
> Aging is certainly a problem that needs to be solved
by that logic, childhood is also a problem that needs to be solved.
I have to get metaphysical to argue against the mindset, but IMO the origin of seeing aging as a problem stems from seeing death as the opposite of life.
> but the idea that you expressed here seems to be that younger people are better at their jobs - which is very wrong
That is not the idea I took from the GP's comment about government workers being younger.
Younger people have a different perspective than older people. That might lead to a better direction.
If you think about progress/improvement as a vector, there's speed and direction. Young and old people might have the same speed but different directions, and younger people might have a direction which is more beneficial long term.
It's less that I think they'd be individually better at their jobs and more that I wonder (without evidence fwiw) if a slightly younger organization would have more diversity of thought because they're coming to things with objectively fresher eyes.
"If my chess team of 45-year-olds were 10 years younger, I would win more games." Is this also an ageist logical inconsistency? At some point claims like this are testable.
On a separate note, of course 20-year-olds are better at their jobs than 100-year-olds. So the phenomenon of younger people being better at their jobs is true at an extreme. You can argue that it doesn't extend much further than this extreme, but saying it doesn't exist at all is silly.
Head to Capitol Hill and you'll see that the legislative side of that is much, much younger. Though perhaps the age of the legislators skews the average.
It would work perfectly if they weren't bothering and spoiling citizens and turning them into dependents. We would be far more useful ourselves, and less restricted, in making real contributions to society.
That generational thing sounds a lot like IT's eternal flip-flop between thick clients and thin clients. (I'm fully expecting "web browsers are too thick, we need a thin alternative" to appear any day now). On the back-end, serverless is time-sharing.
Your attempt-fail-stigmatise curve has a lot in common with the Gartner hype cycle.
I read a while back that scientific ideas usually take about 50 years to be accepted, which essentially means that they are never really accepted; the people who disagree with them just die, and the next generation embraces them. Einstein pretty much hated quantum mechanics and ended up making a bunch of discoveries while making every effort to disprove it.
So if that's how scientists do it, when they should be the profession most amenable to new information and ideas, what chance do the rest of us have?
I mean... Einstein did prove himself wrong a bunch; he just hated it. It's not like the work was flawed, or not performed. It's hard to find the sort of fault in that which you seem to be implying.
“A new scientific truth does not generally triumph by persuading its opponents and getting them to admit their errors, but rather by its opponents gradually dying out and giving way to a new generation that is raised on it. […] An important scientific innovation rarely makes its way by gradually winning over and converting its opponents: it rarely happens that Saul becomes Paul. What does happen is that its opponents gradually die out, and that the growing generation is familiarized with the ideas from the beginning: another instance of the fact that the future lies with the youth.”
I don't think this is accurate. Sometimes the facts are so plain that the new truth wins (or, at least, the old truth dies) quickly without much fuss.
Perhaps the earliest example of this was the destruction of the Ptolemaic view of the solar system by Galileo's observation of the phases of Venus. This was a "killer fact" that rapidly (after 1611) led to the abandonment of that venerable theory, even by its supporters (although Clavius would die soon after). Sales of the standard texts on Ptolemaic astronomy, Sacrobosco's Sphere and Peuerbach's Theoricae novae planetarum, went into immediate terminal decline.
A more recent example is the Big Bang's success over the Steady State model. The cosmic microwave background radiation was such an overwhelming piece of evidence in favor of the former that only cranks persisted with the latter (unfortunately including Hoyle, admittedly).
https://blog.eutopian.io/the-next-big-thing-go-back-to-the-f...