
The whys are obvious. What is not obvious is why we allow bodies like the NSF to continue advocating nonspecifically for careers in STEM when there is no domestic market for most PhDs. It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

It's worse than that: not only do they encourage young people to pursue a brutal career, they require what little money is allotted to research to be partially spent on making this problem worse.

But most readers of Nature wouldn't like to hear those things.


What makes you think 'most' faculty appreciate the pyramid scheme? Many of us spend a lot of time worrying about our postdocs' futures, in no small part because we recognize that the struggle for funding never actually ends. Tenure is not the end of the rainbow; it's reassurance that your salary can't go all the way to zero - but your lab, your team, even your office space absolutely can go to zero. At least in CS there's a career path to success that skips indentured servitude. In biology, unless you want to be a lifelong lab tech, good luck trying to get somewhere in biotech without toiling in the academic trenches. I promise, even those of us a bit closer to the top of the pyramid wish it were otherwise.


Unless you're amazingly talented at nurturing your postdocs, something like half of them will not have lasting careers in academia. I applaud your attention to their futures. It would help others in your position to understand what you envision for these postdocs without academic futures and how you prepare them. What are you doing for them?


This is the right question to ask - not what NSF or NIH can do. I'm afraid I don't have a very satisfying answer; I'm still trying to figure it out myself. The first thing I tell my postdocs is that they can't put their life on hold and wait for adulthood to start after they finish this training (or more training). We need to be discussing career paths now, not 18 months from now. The second thing I tell them is that in my field (medicine) it's likely they can make a greater contribution on the industry side than in academia, because as important as fundamental (basic) research can be, industry is where most new treatments come from. For some I encourage entrepreneurship - harder in bio than CS but doable for the right person. And some /can/ stay in academia as staff scientists - not responsible for running their own labs or generating their own funding, but acting mostly autonomously.


It's great to think about what you can do personally, but actually the NSF/Congress did create this problem by choosing the financial structure for science in the US in the 40s-50s. That was done centrally and was not an organic extension of how science was previously funded. Many possible alternatives, such as direct student funding as in Canada (advisors who are advisors, not feudal lords), are imaginable. The best fit depends a lot on the field. For example, I'd say medical research is some of the most marketable, and most medicine faculty choose academia for the right reasons.

Unfortunately in my benighted corner of science things are going the wrong way. The NSF actually just created a master's program for managing a sort of scientific software development for which there is no market. It did this because saying scientists care about education and commercialization "sounds good" on a grant proposal. This will do nothing but waste $50 million and 40 highly skilled person-years of time. It's laughable to anyone in the field, but such are the incentives at NSF.


Very satisfying, I think. By all means, keep working at it! And this is great already.


It's also worth noting how the educational industrial complex plays into this. In my college, the university takes a whopping 50% of grant money in administrative fees, leaving so much less for actual hiring of postdocs and graduate students.


> In my college, the university takes a whopping 50% of grant money in administrative fees, leaving so much less for actual hiring of postdocs and graduate students.

OMG


This is effectively your research team's rent and taxes -- it goes to pay for things like the accountants, the heating bills, building repairs, instructional support, upgrading the building's electrical grid to support your new equipment, all of the library (books, journals, computers, librarians), campus security, student health, interest on debt taken out to build the building research takes place in, depreciation of buildings and equipment, etc. 50 percent is maybe on the high end, but the low end would be maybe 30 percent.

Many grants come with a stipulation that no more than X percent be allocated to administrative fees. Which basically means the non-stipulated grants have to pay an even higher rate. I believe I've seen some people argue that the Ivy League's rather high indirect cost rate with the NSF, which was negotiated / grandfathered in, is an unfair advantage.
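As a small arithmetic aside (mine, not the parent's): "50%" can mean two quite different things, depending on whether it's a share of the total award or an indirect rate applied on top of direct costs. A quick sketch in Python, with a purely illustrative award size:

    # Two readings of "the university takes 50%" -- illustrative numbers only.
    total_award = 150_000.0  # hypothetical grant size, USD

    # Reading 1: overhead is 50% of the total award.
    indirect_a = 0.50 * total_award        # $75,000 overhead
    direct_a = total_award - indirect_a    # $75,000 left for research

    # Reading 2: a 50% indirect-cost rate applied on top of direct costs,
    # the way negotiated rates are usually quoted.
    direct_b = total_award / 1.50          # $100,000 direct
    indirect_b = total_award - direct_b    # $50,000 overhead (33% of total)

    print(f"Share-of-total reading:  ${direct_a:,.0f} for research")
    print(f"Rate-on-directs reading: ${direct_b:,.0f} for research")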


a) I'm a faculty member :P b) I meant you no personal affront, there are good and bad actors, but indisputably the career incentives are firmly directed at irresponsible, ruinous growth in student/postdoc populations. At the very least professors should be removed from papers to which they contributed less than 10% of the effort.


Look at what someone does. Not what they say.

You say you worry. But you cash your cheques and continue acting in your capacity as if university degrees mean something.


University is where all the shiny tech is created that people like Peter Thiel use to make enough money to be able to tell people not to go to university.


I feel that could be said of higher edu in general. That is, it's being promoted and recommended to too many who just aren't the right fit. God forbid a parent with a degree have a kid without a degree. As if 25+ yrs ago is the same as now.

It's not much different than the housing crisis / crash. The unaware are fueling the fire, only to find out they're stuck with a bill beyond their means.

It's only a matter of time before higher edu implodes. It's not sustainable. It's not practical.


"If you think education is expensive, try ignorance".

Not saying that people without higher degrees are ignorant; of course there are many ways to learn. But I think many people wouldn't find the motivation to learn without the social pressure to get a degree. So that pressure effectively achieves a society with less ignorance.

In the US you should fix the problem of the crazy tuition fees that send people into bankruptcy, though. In countries like Germany, tuition is free, i.e., the whole society pays for your degree. I find this reasonable: as a member of society, it benefits me that e.g. we have competent doctors for when I get sick, and that they are the best possible (i.e., selected by knowledge and skill, and not by being rich enough to pay for a degree).


> "But I think many people wouldn't find the motivation to learn without the social pressure to get a degree."

On one hand, I agree.

On the other, you hit the nail on the head. This is a socialization issue. As much as the USA gives a nod to individualism, the fact is we are taught not to take that too far.

The status quo feeds the higher edu beast. But there are too many blindly chasing the promise, only piling up tons of debt. The wrath of the Higher Edu Industrial Complex continues. No questions asked.

Yes. We can fix the tuition issue. But that also means saying "Sorry Charlie, you don't qualify." This latter bit, no one wants to discuss.

In addition, the truth is, cost is a function of demand. Edu prices - like housing prices - were driven up by demand; demand that was increased by cheap loans and easy availability.

Edu's charge more? Because. They. Can.

But everyone gets to go. Some use the opportunity wisely. Some do not.

But if the pay model changes so must the who gets to go where model.

All that said, edu is not a gateway to success. And the edu model is broken. It's going to implode. When AI replaces the vast majority of jobs (including white collar) then there will be an excess of edu'ed people with too much free time on their hands (and too much debt with no viable way to pay).

Perhaps higher edu isn't the only thing headed for a reckoning?


>In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

The missing part of the equation is that Germany also has high-quality, well-regarded vocational education. In the US, university is widely regarded as the only reasonable route to a secure middle-class lifestyle. In Germany, about half of school-leavers go into the Duale Ausbildung system, spending part of their time working as a paid apprentice and part of their time studying at college. The role of universities in society is completely different when there's a good alternative.


The difference is that getting a degree in Germany is much harder.

I’d be okay with publicly funded education (all levels) if the admissions requirements were a lot stricter.


That sort of defeats the point of education as socialization (that is, developing the common language that holds society together). The only fallback we have remaining is pop culture, and if we let that be our main means of socialization, people will start letting actors and reality TV stars make all their decisions.


To the extent society needs education as civic socialization, it seems like it should be done at the pre-college level.

In the US today, it is happening neither in high school nor in college (at least for most people at most high schools and colleges). The former is mostly about keeping youth supervised while parents work and enabling sufficient standardized test scores to get into college; the latter about getting the piece of paper that allows you to get a livable-wage job.


As someone who graduated high school with a whopping 1.8 GPA I totally disagree :). So far the industry seems to like my work.


>> In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

The same is true for most of the EU, except, notably, the UK, which is the other developed country, besides the US, where higher education is a proper industry and where students often get some serious debt to go to university.

I did all of my studies in the UK, though I'm from Greece. From the start I was struck by how much the public discussion around education is focused on money. It's not so much that university studies are aimed at getting you a (better) job; it's more that UK society has lost its ability to measure the need for education with any metric other than how much more money you're making afterwards.

With the debate focusing on money like that, I think it will be pretty hard for the US or the UK to do anything about their "crazy fees" like you say. They now see those fees as a kind of investment: you pay some money up front to get a return on your investment later.

That is certainly a broken way to do education, higher or otherwise, but the solution is not to tear down higher education altogether. The obvious (to me) solution is to make it publicly funded again, as it used to be in both countries in the past, so that younger generations can benefit as their parents and grandparents did, without taking on extravagant debt.


> In the US you should fix the problem of the crazy tuition fees that send people into bankruptcy, though. In countries like Germany, tuition is free, i.e., the whole society pays for your degree.

It is not always free, but at least at public universities it is really cheap compared to US universities.


>I find this reasonable: as a member of society, it benefits me that e.g. we have competent doctors for when I get sick

Problem with this is, in the list of degrees, useless degrees far outnumber useful degrees.


Education is unnecessarily expensive, but if it were otherwise, I'd say send everyone, pass or fail, for the benefit of what they glean trying. However, I'd also say: see the participation as a benefit in itself, and credit those who do.


Totally agree. And I would like to add that I don't even think a degree is the best way to learn even for many who excel at them.


i agree with you. and most of the teaching in universities is absolutely pathetic. students primarily learn hopelessly alone through poor but overpriced books. the curriculum and professors are uninspiring, save for a few. it’s amazing what a good professor can do, but those are unfortunately rare.

as someone who keeps thinking about whether i should go back to finish my ph.d., i can’t help but think of how dogmatic most graduate programs are, in addition to being logistical nightmares. there’s basically no time or space for one to explore one’s own thoughts and program unless it specifically matches up with what someone else (namely an advisor) wants to do. there’s so much hand holding through required coursework (and homework) that one can hardly think deeply about anything actually original.

the book disciplined minds touches upon this. that those born in institutions will carry that institutional conservatism with them. i have felt this. i went to talk about some of my ideas to a well known and respected researcher. before i could even finish my sentence that began with “i’m interested in...” he cut me off shaking his head and, literally, said “i don’t believe in it...”. goddamnit.


> there’s so much hand holding through required coursework (and homework) that one can hardly think deeply about anything actually original.

Take an actual graduate course in pure mathematics, very little hand holding.

>there’s basically no time or space for one to explore one’s own thoughts and program unless it specifically matches up with what someone else (namely an advisor) wants to do.

Not necessarily true. Find an adviser that will support YOUR interest (they exist). Sounds like you have some biases against PhD programs without actual experience of being in a good one.


sounds like you like making assumptions.

> Take an actual graduate course in pure mathematics, very little hand holding.

i have taken many. and you missed my point. i didn’t mean having one’s hand held through the actual material. that isn’t true at all of course, in part due to poor teaching and in part due to the difficulty of the material. i meant that one is forced to take x amount of courses. i believe, past a certain level, that this becomes extremely limiting. you spend all this time and effort doing coursework and going to class rather than researching. learning within a context is much better than learning without context, aka most courses. and many schools hardly accept any coursework transfers. there aren’t many good reasons for this.

and of course i have a bias because that’s what personal experience yields. but there’s no reason for you to insult me. i would wager my bias is shared by many who have been through ph.d. programs. and of course it’s not necessarily true. there are excellent advisors. but they are rare.


To offer another point of contrast - while it's true that you often need to spend a year taking courses, in most graduate programs they're very very flexible on what courses to take. And those courses themselves are very flexible on how much work you want to do.

I don't find this too troubling - it's important to have a shared fundamental knowledge so you can communicate effectively and not reinvent the wheel, and I don't think most programs are giving you busy work (that's really against the whole ethos).


>> you spend all this time and time effort doing coursework and going to class rather than researching

Is it possible the problem is particular to the US postgraduate system? In UK PhD programmes you are not required to follow any courses. That said, at least at my university as a PhD student you have the right to follow any courses you want for the duration of your PhD (obviously at no extra fee), which I'm absolutely taking advantage of.

I should also point out that a PhD is not just a time to do research- it's an opportunity to become an expert in your chosen field. And you don't do that just by inventing new knowledge. You need to also become familiar with the work others have contributed before you- and by "familiar" I mean "learn it very, very well". Perhaps the US universities are just trying to make sure you don't spend all your time with your nose in your own research, while ignoring everything others are doing around you?


just as a point of optimistic contrast, i've taken only three required courses over my PhD. some of my peers (who came in with better undergrad prereqs) have taken fewer, and in a few cases, none


You know I’d actually pay the crazy fees if only I could find a place that’d reteach me advanced maths “in context”. I often find that adding context means subtracting the “advanced” part :)


Study Physics. I'd take the math prerequisites and not understand what this is useful for, and then the first week of physics class we'd review the math and suddenly it would become quite practical.


Solid advice. I think finance massively recruits maths aptitude from physics precisely for this ability to connect hard-core maths to models of the world.

I wouldn’t expect a pure mathematician to be good at this nor to care about it either.

Did you ever read “How to Solve it?” Polya says he became a mathematician because he was too clever to be a philosopher and not clever enough to be a physicist :)


Not all math has applications; some of it is just interesting for its own sake.


Come to Europe. Or at least to some European countries. In mine, Spain, Ph.D. programs now have no courses at all (let alone homework). They are exclusively research-oriented, your goal is to produce good research (papers) and then your thesis. Also, the duration is capped at 4 years, and the tuition fee at my university is around €250/year. Other European countries have similar systems.


That is not the whole picture. Most Ph.D. programs require you to successfully complete 2 years in a research master's program, which is based on classes and homework assignments. It's not easy.

After that you enter the Ph.D. for 3 years (max 4) and you are right, you can do almost whatever you want. Just bear in mind that either you publish in top journals or your academic career ends there.


This is true, but in practice admission in most Ph.D. programs in Spain is not very selective as there are more slots than applicants, so any master's degree will do, even if it's not research-oriented. And for foreign students, many Ph.D. programs have a rule that "if you can prove that your degree gives access to Ph.D. programs in your own country, then you can enter" so it's even possible to be admitted with e.g. a US bachelor's degree (YMMV per university though).


You don’t need a masters in the UK. Three years bachelors and then straight into a PhD which is just research - no classes or teaching. And we produce more than our share of top science so it must work.


You'll have to have a first in order to be able to skip the masters, and since entry to funded programs is competitive, you have to be an exceptional candidate for this to actually happen – it's quite rare.

Then there's the program itself. DTCs (doctoral training centres) now mostly run 3 + 1 programmes, where your first year is a master's, involving taught classes and a research project. This is to get everyone up to speed on how to actually do independent research, and to build core skills. While it's true that teaching doesn't always form a mandatory part of a doctoral degree, working as a teaching assistant for at least one semester is often mandatory.

In short, you've mischaracterised doctoral education in England and Wales (Scotland may be different) to quite an extraordinary degree. I see no reason to comment upon the "more than our share of top science" claim, for reasons which are, I hope, obvious.


> You'll have to have a first in order to be able to skip the masters

Well I'm not sure that many people without a first would seriously consider doing a PhD, so this isn't relevant in most cases.

> DTCs (doctoral training centres) now mostly run 3 + 1 programmes

Right, which is a big disadvantage of them. DTCs are just one way to do a PhD and they're not the traditional approach in the UK.

> In short, you've mischaracterised doctoral education in England and Wales

I don't agree! This matches what I see of students going into and coming out of the system today.


Worse. A degree is no proxy for drive, ambition, creative thinking, problem solving, etc.

Too many have come to believe a degree is the magic. In some cases it is. In plenty of others it's a curse. The time and money could have been better spent elsewhere.


It's certainly no proxy, but it is a standard metric to measure said qualities against. I hope we see a rise in autodidact accreditation and acceptance of non traditional paths, along with more decentralized and public research taking place.


"it's not sustainable"

Unfortunately it is, in a grim sort of way, with grads' wages being siphoned off for decades to pay for debt and interest. Over a 15-year time scale even the largest educational debt loads are reducible, but they may be very uncomfortable to reduce for someone making, say, $60,000 a year.
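To make that concrete with the standard amortization formula (the loan size and rate below are hypothetical, not taken from the comment above):

    # Standard loan amortization: payment = P*r / (1 - (1+r)**-n)
    # for monthly rate r and n monthly payments. Figures are hypothetical.
    principal = 100_000.0   # hypothetical student debt, USD
    annual_rate = 0.06      # hypothetical interest rate
    years = 15

    r = annual_rate / 12
    n = years * 12
    payment = principal * r / (1 - (1 + r) ** -n)   # ~ $843.86/month

    print(f"Monthly payment: ${payment:,.2f}")
    print(f"Total paid over {years} years: ${payment * n:,.2f}")
    print(f"Share of a $60,000 gross salary: {payment * 12 / 60_000:.0%}")

On those assumptions the payment eats roughly a sixth of gross income for fifteen years: reducible, as the comment says, but very uncomfortable.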


To solve future problems and avoid war we need more educated people, not fewer.


The 'education' in this context is a system of indoctrination in a pseudo-cult-like atmosphere ("there is nothing more glorious than doing pure science in an accredited faculty and if you disagree you are an idiot"). The atmosphere is not due to malfeasance, but to a combination of professional management mixed with borderline Asperger's short-sightedness and deafness to human needs as psychological beings of flesh.

This specific system has very little to do with war or peace - its only causal link with society at large is its funding: how much, by whom, and on which terms.

This is not to say all academia is problematic - but, as this article claims, too much of it clearly is.


I'm not saying all education systems are perfect; I'm sure they can be improved. And problems are not solved by education alone: you need an engineering/scientific mindset, which you only get from higher education or acquire through self-study and inquisitiveness. The opposite is ignorance, which can be bliss, but it will not help solve conflicts and problems.


There are no formal prerequisites to maintaining peace and avoiding war, and I don't think a formal education or a "scientific mindset" is of much help here. Further, the opposite of a scientific mindset is not ignorance, for the opposite of ignorance is awareness, and awareness is not synonymous with the ambiguous 'scientific mindset'.

I believe one could argue rather successfully that we need more poets, painters, cinematographers, musicians and artists of all shades and stripes over scientists if peace is the ultimate goal.


>> you need a engineering/scientific mindset

Another citation required. Liberal arts and philosophy majors aren't useful in spreading world peace? Ancient historians would strenuously disagree.


Ancient liberal arts and philosophy majors have generally considered war one of the highest, noblest human callings.


Communist dictators throughout history have been highly educated engineers. I don't think you can say engineering degree = peace.


Not true. Not if they are educated to be ignorant.

The US was / is involved in 5 to 7 "conflicts." Edu isn't helping to stop that. In fact, anecdotally, I see educated people as part of the problem. They have become less open-minded, less willing to change position when their info is incomplete.

We need thought, and people willing and able to use it. Not education.


From James Loewen's "Lies my Teacher Told Me": http://loewen.homestead.com/

"Over ten years I have asked more than a thousand undergraduates and several hundred non-students their beliefs about what kind of adults, by educational level, supported the war in Vietnam. ... By an overwhelming margin - almost 10 to 1 - my audiences believe that college-educated persons were more dovish. ... However, the truth is quite different. Educated people disproportionately supported the Vietnam War. ... These results surprise even some professional social scientists. If you look at other polls taken throughout the course of the Vietnam War, you'll see that the grade-school educated were ALWAYS the most dovish, the college-educated ALWAYS the most hawkish. ...

My audiences are keen to learn why educated Americans were more hawkish. Two social processes, each tied to schooling, can account for educated Americans' support of the Vietnam War. The first can be summarized by the term allegiance. Educated adults tend to be successful and earn high incomes -- partly because schooling leads to better jobs and higher incomes, but mainly because high parental incomes lead to more education for their offspring. Also, parents transmit affluence and education directly to their children. ...

The other process causing educated adults to be more likely to support the Vietnam War can be summarized under the rubric socialization. ... Education as socialization influences students simply to accept the rightness of our society. American history textbooks overtly tell us to be proud of America. The more schooling, the more socialization, and the more likely the individual will conclude that America is good. ...

Both the allegiance and socialization processes cause the educated to believe that what America does is right. Public opinion polls show the non-thinking results. In late spring 1966, just before we began bombing Hanoi and Haiphong in North Vietnam, Americans split 50/50 as to whether we should bomb these targets. After the bombing began, 85 percent favored the bombing while only 15 percent opposed. The sudden shift was the result, not the cause, of the government's decision to bomb. ...

We like to think of education as a mix of thoughtful learning processes. Allegiance and socialization, however, are intrinsic to the role of schooling in our society or any hierarchical society. Socialist leaders such as Fidel Castro and Mao Tse-tung vastly extended schooling in Cuba and China in part because they knew that an educated people is a socialized populace and a bulwark of allegiance. Education works the same way here: it encourages students not to think about society but merely to trust that it is good. To the degree that American history in particular is celebratory, it offers no way to understand any problem -- such as the Vietnam War, poverty, inequality, international haves and have-nots, environmental degradation, or changing sex roles -- that has historical roots. Therefore we might expect that the more traditional schooling in history that Americans have, the less they will understand Vietnam or any other historically based problem. This is why educated people were more hawkish on the Vietnam War. ... Students who have taken more mathematics courses are more proficient at math than other students. The same is true in English, foreign language and almost every other subject. Only in history is stupidity the result of more, not less, schooling.
Why do students buy into the mindless "analysis" they encounter in American history courses? For some students, it is in their ideological interest. Upper-middle-class students are comforted by a view of society that emphasizes schooling as the solution to intolerance, poverty, even perhaps war. Such a rosy view of education and its effects lets them avoid considering the need to make major changes in other institutions. To the degree that this view permeates our society, students automatically think well of education and expect the educated to have seen through the Vietnam War. ..."

See also John Taylor Gatto's "The Underground History of American Education" and Howard Zinn's "A People's History of the United States".


Having a paper is not the only means to get educated these days.

Oh, and your assumption that education somehow prevents wars is completely false. Societies like pre-WW2 Germany and Japan were full of well-educated folks.


A quick search gave plenty of articles saying higher education prevents war. For example "Among independents, education is highly related to support for the war. Those with only a high school education give clear majority support, but support declines as educational level increases."

http://news.gallup.com/poll/7768/war-support-education-gap.a...

"The results provide evidence for both the grievance and stability arguments, providing strong support for the pacifying effects of education on civil war"

http://www.uky.edu/~clthyn2/thyne-ISQ-06.pdf


Very poor data to support your point.

> There is a modest decline in war support as educational levels increase from high school or less, to some college, to college graduate (with no postgraduate education). However, the largest gap occurs at the highest level, between those with a postgraduate education and those without. Only among the postgraduate group is the majority opposed to war, 56% to 40%, while the other three categories all show majority support.

Postgraduates almost don't count, since there are so few of them in society at large. So there is virtually no difference between high school level and college graduates, which does not support your point.


Do citizens vote for war? All of our leaders are very well educated, by American standards, yet war and police action persists.


> Societies like pre ww2 Germany and Japan were full of well educated folks

Yes, and these educated folks opposed the Nazis. The parts of [Weimar culture](https://en.wikipedia.org/wiki/Weimar_culture) that were the centre of art, music, literature, poetry, mathematics, and science were the parts that did not become Nazi.

Many of their Nobel Prize winners were Jewish. Among the places that participated most in that culture was Berlin (famous for the [Berlin Circle](https://en.wikipedia.org/wiki/Berlin_Circle), and also the area that voted least for the Nazis).

The people who were cultured were less likely to be Nazis. The gay clubs in Berlin were later shut down by Nazis. Nazi conservatism was to some extent a backlash against the permissive and flourishing culture - and they associated Jews, gays and Marxists with that culture.

I'm not saying cultured people can't also be barbaric - they can. But it's also not true that somehow the most cultured place in the world was hiding a dark secret - the parts that were cultured were not the parts that instigated the backlash.


You seem to conveniently ignore the mass of German intellectuals who supported the National Socialist regime, as well as industrialists and artists. Even outside Germany the Third Reich had its share of supporters among elites. Let's not falsify history by pretending only stupid people were Nazi supporters.


There were intellectuals who supported the Nazis, there are intellectuals who currently support the alt-right and believe in establishing ethno-states. But it's negatively correlated.


> But it's negatively correlated.

Negatively correlated? We'd need hard and reliable data to compare and make a proper judgment here, and it is going to be difficult since almost everyone involved is not neutral.

The problem is that as humans, we tend to believe that the ones who support different parties/ideologies have low IQ compared to our own parties/ideologies.

It is far from obvious, because intelligence and education are usually widespread among various groups and currents of thought.

As individuals it is actually very dangerous to think in this way, because it leads us to underestimate our adversaries wherever they are.


What is an educated person? How do you measure education? Why do you think universities should have a monopoly of higher education?


Citation needed that we need more "educated" people (for what definition of educated?) to do those things, and that the American system is even remotely decent at doing those things. Given the trajectory of our country's political efforts combined with ever-higher enrollment in college, the correlation at least doesn't look so good.


The identity politics shoved down the throats of students at universities across the country are going to create wars, not prevent them.


> It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

Maybe it's a selection thing -- the ones who can't sleep at night generally leave before they reach that point, as the realization dawns on them about what's expected of them. The process selects for the people who can maintain it.


An anecdote is not a datapoint, but I left my PhD because I saw the pyramid scheme and knew that I didn't want to (and probably couldn't) maintain the effort required to climb up the pyramid. Now I'm a research assistant; I still do science but I no longer face the insane pressure of academia.


More generously, the people giving you advice are self-selected from the group of people who succeeded by taking that advice. I expect that the faculty are well-meaning, but perhaps not able to accurately gauge what will work for others.


In my world I have not met people who can accurately gauge what will work for others.

Even your peers in your studies might create pressure, as if having a PhD is something cool. I dropped my master's because I had a job opportunity; some of my friends assume that I finished my master's and that is why I have a good job. Then I clarify that I did not finish it, and they are somewhat disappointed.


I recall during undergrad that the stated goal of the professors was to 'unteach' us everything we learned in high school, and twenty years later I was thankfully finally able to unlearn everything I learned in undergrad. People that stay in a cult are unlikely to be fully honest with themselves and others about its shortcomings.

Graduate school was a non-issue as I went through that as a means to an end and virtually all of the professors weren't wrapped up in academia.


What were you studying? How do you know that your professors were wrong (that what they taught should be unlearned)?


Undergraduate in Art. I think they were right to unteach us what we learned in high school, but wrong to teach Art only as it relates to the 'Art world' - and that bias is a result of how colleges/universities hire Art professors.


> It's also befuddling why most faculty are able to sleep at night knowing they are at the top of a pyramid scheme.

I just tell prospective students the truth. That the probability of being able to stay in academia is very low. That they are going to learn and do cool things, but if they end up in industry, this is not going to be valued much (generally true in my country, it's better in others). Those that stay know what they should expect.


Here is an explanation from 1994 by Dr. David Goodstein of Caltech, who testified to Congress on this back then, whose "The Big Crunch" essay concludes: https://www.its.caltech.edu/~dg/crunch_art.html "Let me finish by summarizing what I've been trying to tell you. We stand at an historic juncture in the history of science. The long era of exponential expansion ended decades ago, but we have not yet reconciled ourselves to that fact. The present social structure of science, by which I mean institutions, education, funding, publications and so on all evolved during the period of exponential expansion, before The Big Crunch. They are not suited to the unknown future we face. Today's scientific leaders, in the universities, government, industry and the scientific societies are mostly people who came of age during the golden era, 1950 - 1970. I am myself part of that generation. We think those were normal times and expect them to return. But we are wrong. Nothing like it will ever happen again. It is by no means certain that science will even survive, much less flourish, in the difficult times we face. Before it can survive, those of us who have gained so much from the era of scientific elites and scientific illiterates must learn to face reality, and admit that those days are gone forever."

And see also "Disciplined Minds" from 2000 about some other consequences: http://disciplinedminds.tripod.com/ "In this riveting book about the world of professional work, Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline." The hidden root of much career dissatisfaction, argues Schmidt, is the professional's lack of control over the political component of his or her creative work. Many professionals set out to make a contribution to society and add meaning to their lives. Yet our system of professional education and employment abusively inculcates an acceptance of politically subordinate roles in which professionals typically do not make a significant difference, undermining the creative potential of individuals, organizations and even democracy. Schmidt details the battle one must fight to be an independent thinker and to pursue one's own social vision in today's corporate society."

Or Philip Greenspun from 2006: http://philip.greenspun.com/careers/women-in-science "This is how things are likely to go for the smartest kid you sat next to in college. He got into Stanford for graduate school. He got a postdoc at MIT. His experiment worked out and he was therefore fortunate to land a job at University of California, Irvine. But at the end of the day, his research wasn't quite interesting or topical enough that the university wanted to commit to paying him a salary for the rest of his life. He is now 44 years old, with a family to feed, and looking for a job with a "second rate has-been" label on his forehead. Why then, does anyone think that science is a sufficiently good career that people should debate who is privileged enough to work at it? Sample bias."

Or the Village Voice from 2004 about how it is even worse in the humanities than sci/tech grad school: https://web.archive.org/web/20130115173649/http://www.villag... "Here's an exciting career opportunity you won't see in the classified ads. For the first six to 10 years, it pays less than $20,000 and demands superhuman levels of commitment in a Dickensian environment. Forget about marriage, a mortgage, or even Thanksgiving dinners, as the focus of your entire life narrows to the production, to exacting specifications, of a 300-page document less than a dozen people will read. Then it's time for advancement: Apply to 50 far-flung, undesirable locations, with a 30 to 40 percent chance of being offered any position at all. You may end up living 100 miles from your spouse and commuting to three different work locations a week. You may end up $50,000 in debt, with no health insurance, feeding your kids with food stamps. If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off. Welcome to the world of the humanities Ph.D. student, 2004, where promises mean little and revolt is in the air."

The odds of success are probably even lower now with the expanding use of adjuncts to replace tenured faculty.

Of course, the irony is that US society now has more than enough wealth so that anyone who wanted to could live like a graduate student researching whatever they wanted on a basic income.


> Of course, the irony is that US society now has more than enough wealth so that anyone who wanted to could live like a graduate student researching whatever they wanted on a basic income.

Which is more or less my plan. Some details here:

https://news.ycombinator.com/item?id=16610362

> My plan right now is to save money, retire early, and then do whatever research I want that fits my budget. This avoids many of the problems with the current system, but is not possible for many.

> This would allow me to pursue more risky research (in the sense that the research may fail to produce useful results) than an assistant professor trying to get tenure could. I also wouldn't have to raise funds, so I could focus on projects I believe are important, not just what can get funded.

I wish this option were more widely known and accepted. Some people seem to think I'm insane to intentionally pursue this, but as far as I can tell they don't see that the ship they are on (academia/government research/etc.) is sinking. It would be nice to talk with other independent researchers of this variety, exchange best practices, etc.


I wish you the best, but there's no one to talk to because no one has succeeded at doing it. No science is done by independent researchers. You'd be limited to a very tiny sliver of research that doesn't require staff or expensive equipment. You'd be doing research without the support network of peers in a department, or colleagues at a conference to talk to. We should not advertise this as an option, since it might make the gullible think it is feasible.


That depends on what you mean by "independent." There is a space between standard faculty and hermit researcher.

Depending upon the field, you can do research, publish, go to conferences, have a network of peers, without being standard faculty. I know research professors who have very few of the standard faculty obligations, for example. I also know people who do research entirely on private funding, but this almost always requires significantly more than just savings from retiring early, and they still stay part of the academic community.

I agree that it is not something that should be advertised as an option, because it is very rare, but it does exist.


> I wish you the best, but there's no one to talk to because no one has succeeded doing it.

There are many examples. Charles Darwin is the most famous. In my field (fluid dynamics) Robert Kraichnan is also well known and influential.

https://en.wikipedia.org/wiki/Independent_scientist

Of course, plenty of cranks go this route too. It's an option, not a panacea.

> You'd be limited to a very tiny sliver of research that doesn't require staff or expensive equipment.

Not a problem for me as a theorist. And I believe that expensive equipment is overused in my field anyway.

I also disagree that this is a "tiny sliver" of the research. Computation and theory is roughly half the research in my field, and I suspect this is true for many fields.

If this option doesn't work for you, don't do it.

> You'd be doing research without the support network of peers in a department, or colleagues at a conference to talk to.

I disagree. Collaboration does not require "official" status, and neither does attending a conference. In my field, no one cares what your affiliation is. At worst I could start a consulting company, which I'd probably do anyway. Plenty of consultants attend conferences in my field and collaborate with researchers in government, industry, and academia.


> If this option doesn't work for you, don't do it.

No need to get overly defensive. I'm responding to your argument that this should be more widely known and accepted. Science gets harder to do every year as the lowest-hanging fruit keeps getting plucked. Your examples aren't convincing -- Charles Darwin was born in 1809 and things were different back then. Your other example, according to Wikipedia, is someone who got a PhD at MIT, applied for grants, and held faculty positions at a number of universities.

But hey, if you end up doing this successfully, come back and tell us how it went. But in the meantime, I'm going to continue disagreeing with you that this is a reasonable route to productively conduct research.


Low hanging fruit is one great example of where independent research can shine. The incentives are different, so what is considered "low hanging fruit" is also different. An independent researcher can focus on projects where the timescale is longer or the project is unlikely to be funded. With independent research being rare, I can see a lot of low hanging fruit for independent researchers which traditional academics would not touch.

I could provide an example of research which is disincentivized in traditional academia but incentivized in independent research from my own PhD research if you're interested.

> Your other example according to Wikipedia is someone who got a PhD at MIT, and applied for grants and held faculty positions at a number of universities.

Kraichnan was an independent researcher from 1962 to 2003, the majority of his career. Yes, he was affiliated with a university at multiple points of his career, but he spent 4 decades as an independent researcher and produced some of his most important work during that time.


> I could provide an example of research which is disincentivized in traditional academia but incentivized in independent research from my own PhD research if you're interested.

I'd be interested, if possible.


Sorry for the length.

Traditional academics have the "publish or perish" incentive. In practice this means that they prioritize "quick wins" over "slow wins", e.g., given a choice between publishing 1 paper after 1 year (quick win) or publishing 5 papers after 5 years (with no publications before then) (slow win), they'll choose 1 paper after 1 year. If an academic goes too long without a publication, that will be counted against them. The low hanging fruit for quick wins has been taken due to this incentive, but I see no shortage of slow wins. (The scenario I describe is an extreme case, but the same incentive still exists in less extreme cases.)

There's also the problem that what can get funded is not necessarily what's most important. Norbert Wiener discusses this at length in his 1950s book "Invention". Wiener notes that despite the obvious political differences between the USSR and US, research funding is allocated similarly: by people too far removed from the actual research, who are often not in a good position to evaluate its merit. It doesn't matter if these people are managers, bureaucrats, or fellow scientists. There's generally an asymmetry of information between the scientists requesting funding and those able to provide it. (Having more time to learn about each proposal could help, but my impression is that the time available to review each proposal has decreased over the years.) This ignores the lottery-like nature of the entire funding process.

To get more specific, both of these problems would disincentivize a traditional academic from publishing this paper I recently submitted to a conference:

https://engrxiv.org/35u7g

In principle, a traditional academic could have written this paper. It's possible, but I think less likely because of "publish or perish" incentives. (To be clear, I am a PhD student right now, and for most of the time I was doing the research in the paper, I was a TA. I don't have the "publish or perish" incentives that make this research less likely. If I stayed in academia longer I would.)

My advisor and I tried to get funding for this project, but our grant proposal (I wrote the vast majority of it) was rejected for reasons beyond our control (which I have no problem with). We received positive comments on the proposal, and it served as a draft of my PhD proposal.

Without going into detail, the paper develops a simple mathematical model of a certain physical process. The theory and its validation would not have been possible unless I did two things that traditional academics seem to think are a waste of time:

1. Very comprehensive literature review.

2. Very comprehensive data compilation.

Now, I think most people would believe these two are just what academics do. But apparently not. Traditional academics are incentivized to do the bare minimum to get another publication. There's an epidemic of copying citations and merely paraphrasing the review sections of papers without reading the original papers, and I think this is caused partly by these incentives.

The literature review I did (not all of which made it into the short conference paper) was considerably more comprehensive than any I've seen published in the field before, and I was able to synthesize past theories and improve upon them by recognizing some of their flaws.

How do I know I was more comprehensive? One way is by the excellent papers I found which few seem to be aware of. In the paper I mention, papers 3 through 8 have very few citations. Some of them have not been cited at all in the past 40 years to the best of my knowledge. Someone could say that these papers are just unimportant, but they're not. In my view they're "sleeping beauties":

https://www.nature.com/news/sleeping-beauty-papers-slumber-f...

Further, I spent a year or two alone digging deeper and deeper into the literature in this problem. There were several times when I thought I probably had at least touched everything, but a few weeks later I found yet another area that I had missed. Being comprehensive is difficult and time consuming. If you just want the minimum to publish, you won't bother.

I also benefited from certain heuristics which allowed me to identify important neglected research. For example, I spent a lot of time tracking down foreign language papers and books because I recognized that this research was avoided because it was written in a different language, not because it was bad. The entry costs to foreign language literature have dropped greatly over the past decade with options like Google Translate. I've translated around 10 full papers into English right now, and produced many more partial translations. These papers have provided critical insights that were necessary towards writing this paper. At this point some people I know use the fact that I like reading foreign language papers as a joke. Traditional academics think this is absurd, but I see that there's value, just that it takes time to be realized.

It was through my comprehensive literature review that I got the idea behind my data compilation. By taking advantage of the properties of a special case, I was able to get information that most researchers in this field seem to believe is extremely difficult and expensive to obtain. I did not come up with the idea myself. I was translating a 1960s Russian language paper into English when I realized based on what was written in one paragraph that I could use the properties of a special case to get some hard to obtain information. The author was actually leading into this. The next paragraph explicitly said the author was taking advantage of the properties of a special case. So it wasn't very original on my part. The 1960s Russian researcher didn't have a lot of data to use, but there's a lot now 50 years later.

So I started compiling data. I get the impression that few academics would have compiled even half as much as I did, or have been even half as careful as I have about it. I was very careful to select only the least ambiguous data sources. Out of over 100 candidate data sources, there were only around 20 which were acceptable. I then took the time to carefully transcribe all of the relevant data from these sources, and develop a computational framework to handle this data (based on Python and Pandas). It was probably at least 6 months of work, but I can produce several papers based on it, so it's worthwhile in my view. My advisor was not initially enthusiastic about compiling this data, by the way. He's a successful traditional academic, however, and his intuitions are calibrated differently than mine are.
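Not the actual framework described above, but a minimal sketch of what a careful Pandas-based compilation pipeline like that might look like. File names, column names, and acceptance criteria here are all hypothetical:

    # Sketch of a data-compilation pipeline: transcribe each candidate
    # source into a common schema, filter on explicit quality criteria,
    # and combine into a single analysis-ready table.
    import pandas as pd

    SOURCES = ["smith_1965.csv", "ivanov_1968.csv", "lee_1979.csv"]  # hypothetical

    def load_source(path: str) -> pd.DataFrame:
        """Load one transcribed source, tagging rows with their origin."""
        df = pd.read_csv(path)
        df["source"] = path
        return df

    def acceptable(df: pd.DataFrame) -> bool:
        """Keep only the least ambiguous sources (criteria illustrative)."""
        has_required = {"x", "y", "uncertainty"}.issubset(df.columns)
        return has_required and df["uncertainty"].notna().all()

    frames = [df for df in map(load_source, SOURCES) if acceptable(df)]
    data = pd.concat(frames, ignore_index=True)
    print(f"{len(frames)} of {len(SOURCES)} sources accepted; {len(data)} rows")

The point of the explicit acceptable() gate is that the selection criteria become documented, repeatable decisions rather than ad hoc judgments, which matters when only ~20 of 100+ candidate sources make the cut.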


At the very least, you need someone to bounce ideas off of. I used to think I was an introvert, until I went to grad school and tried living in my own head for months at a time.

I clicked on that link and I noticed that Darwin and Kraichnan both had institutional affiliations.

Maybe possible if you're paying your own way. Lots of newly-minted professors need those small grants to get their research going. Find one who can see you as a colleague and not as an ATM.


Almost all good independent researchers had some sort of institutional support at some point during their careers. What makes them mainly independent researchers is that they spent a large fraction of their careers without institutional support, not that they never had any.


Increasingly, you can simply contract the physical end of experimentation out to dedicated commercial labs (at least in the biological sciences). It takes a bit of money (well, a lot, even) but it’s much friendlier to small-scale enterprises, among the other advantages.


> tiny sliver of research that doesn't require staff or expensive equipment

Like anyone who presents at tech conferences.


I wasn't being dismissive of tech conferences, I genuinely meant there's a lot of valuable research done by small teams. Sure, it's not work with the large hadron collider, but assuming the only thing that advances human knowledge is big budget research is just plainly false.

You're on Hacker News for Knuth's sake. The entire premise baked into the word hacker is that enterprising smart people can change the world with just a little determination and hustle.

Even the hackers out there not making the next breakthrough technology but emulating and testing some arcane ICS, it's crazy to warn people away from that as risky and possibly not useful.

Every career is possibly not groundbreaking, every path has that risk. People should be realistic, sure, but if someone wants to support themselves while they do interesting research, I have no objections to that.

Do what you love, this isn't someone whose hopes and dreams are contingent on starring in the next Hollywood blockbuster, it's someone who wants to geek out on their own dime. Great! Amazing! Tell HN how it goes and most of us will love it, world changing or even just something fascinating to you, we have your back.


I have a similar plan!

I think the opportunities in tech are promising towards this end, and would be interested in getting in contact with other people with similar plans.

I'm in the "build skills, save money" phase currently, and probably will be for > 5 years to come. So that is my main focus right now.


Sounds like we're at a similar place. Feel free to contact me. My email address is in my profile here.


I feel like there's an opportunity for a web platform for the collaboration of (independent) researchers. Some place where I could propose an idea and others could tell me why that idea doesn't work or if it's already been done. And if the idea doesn't have any flaws and hasn't already been explored, then other researchers could help me perform the research or "fork" the project to go down different paths.

I think some platforms come close. There are places to ask questions (e.g. StackExchange) but they don't seem to like "what if" questions, and if the idea is good there's no good way to follow up over months or years. GitHub works for some fields where the idea is tightly coupled with the implementation (e.g. some computer systems fields) but not for others.

I guess I would just like a place to discuss ideas. In academia it's common to discuss nascent ideas within your lab, but larger collaboration happens at things like conferences where you're presenting only the ideas that worked out. I think such a platform could make independent research like you're describing much more effective and attractive.


When I mentioned your comment to my wife, she suggested that Reddit has some subreddits a bit like that. A starting point: http://www.refinethemind.com/reddit-mind-44-smart-subreddits...

That said, you have a great insight here with the idea that some places on the web could replace (or at least supplement) the traditional advisor/advisee relationship with a more peer-to-peer approach -- especially for independent researchers.

That issue of how to follow up over months or years seems key as a difference from more casual one-off interactions. Well-run mailing lists or forums may provide some of that continuity -- but maybe there is a social or technical way to go further in that direction?

I'm not sure if there ever would be just one place to discuss ideas, but you might want to bring up this general idea of some new platform with Michel Bauwens of the P2P Foundation: https://en.wikipedia.org/wiki/Michel_Bauwens


My whole life, I've wanted to be a "gentleman scientist". Just went back to university to study Physics, with Bio on the side. I'm going to run out of money in 1-2 years unless the trading system works out, or I get another real jorb. Good luck. Keep the dream alive!

P.S. 42, with ADHD. Started University at 15, started high school at 11. Great at getting jobs. Not so great at keeping them. :P


What do you think of having your research funded through platforms like https://experiment.com/?


Browsing the projects on the site, I'm amazed by the fraction which have full backing. Much higher than I would have expected. But the total amounts are mostly too small for someone intending to live off them, unless they have a large number of projects at once.

From a diversification standpoint I think this should only be one source of one's funding. The bulk of my planned funding is going to come from savings from a job. I have the most control over that, and it's a much larger source. I've also considered working as a lecturer from time to time, as engineering lecturers seem to be reasonably well paid (at my university, ~$10K per class). Might as well take advantage of the rise of adjuncts. You can also travel to different universities regularly for collaboration this way. Some more permanent lecturer positions are fairly well paid from what I understand, and may be a decent way to save money while also having opportunities for research. The research is the goal, not the title of "independent researcher".

Also, having to solicit funding regularly is something I'd rather not do. It takes away time from research, and I'd like to focus on research which is not so easily funded. To go back to the "low hanging fruit" point mentioned elsewhere in this comment tree, I think there are many research topics which don't sound good to a third party but are actually good. It can be hard to convince people of this. The easiest way to move forward on these research topics is to risk your own money. And with the most easily funded ideas taking priority, I can see many examples of "low hanging fruit" in my own field.

The Patreon model could get around the "research not sounding good enough to fund" problem. Pay an individual to do work in general, not specific work. But aside from someone working on topics of popular interest (e.g., gwern), I don't think this would work.


This is basically what I do and recommend. After saving up around €100K and continuing to freelance 2 months per year, I spend most of my time on research projects.

It's too bad there isn't more of an independent academic community, but it sure beats pressure to publish and writing grant applications.


This is something I used to think about too, but how do you do research without learning about prior work? And how do you learn about prior work without an advisor to guide you?


Follow the paper trail. Find a research topic you are interested in and find an interesting paper on Google Scholar.

Then look up some of the references, the papers that cite it, or other work from the same author. Keep doing this one tactic and you'll easily find years' worth of reading to do.

Beyond the abstracts, you'll need access to the papers themselves. This used to be an issue, but luckily now we have sci-hub.tw and paperdownloader.cf
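If you wanted to automate that tactic, it's just a breadth-first walk over the citation graph. A minimal sketch in Python, assuming a hypothetical get_references(paper_id) lookup (services like Semantic Scholar expose real citation APIs; the placeholder here stands in for whichever you use):

    from collections import deque

    def get_references(paper_id):
        # Hypothetical lookup -- swap in a real citation API
        # (e.g. Semantic Scholar) for actual use.
        raise NotImplementedError

    def paper_trail(seed_id, max_papers=100):
        """Breadth-first walk of the citation graph from one seed paper."""
        seen = {seed_id}
        queue = deque([seed_id])
        reading_list = []
        while queue and len(reading_list) < max_papers:
            paper = queue.popleft()
            reading_list.append(paper)
            # Enqueue every reference we haven't already seen.
            for ref in get_references(paper):
                if ref not in seen:
                    seen.add(ref)
                    queue.append(ref)
        return reading_list

In practice you'd probably rank the frontier by citation count or relevance instead of reading in strict breadth-first order, but that one tactic is the whole algorithm.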


All disciplines have journals, and many are either public access or available on sci-hub. If all else fails you can get a limited JSTOR subscription for only a moderately outrageous sum.

Mostly you do research by being interested in something. If you need someone to tell you what to be interested in, a PhD may not be for you.

A good supervisor will tell you to be interested in things you may not have considered. But empirically, most supervisors will steer you in the direction of their own interests.

These may or may not match your own interests. The mismatch is at least as likely to be a bad thing as a good one.

In my own experience, I decided to work independently instead of starting a PhD. There are only a couple of directly relevant journals, and I literally skimmed every issue, reading and taking notes on the papers that counted as prior art.

Those papers often quoted other papers outside the immediate domain, so I followed them up - and that’s how you start.

I have a pretty good idea of the directions I’d be steered in if I was being supervised, and near certainty that those are not the directions I want to explore.


> Mostly you do research by being interested in something. If you need someone to tell you what to be interested in, a PhD may not be for you.

Maybe I'm misinterpreting but this comment seems a bit condescending.

The thing is, it's not really research unless it's something new, and you can't know if something is new unless you know what already exists.

> All disciplines have journals, and many are either public access or available on sci-hub. If all else fails you can get a limited JSTOR subscription for only a moderately outrageous sum.

Sure you can have access to journals but how do you even know what you should search for? I suppose this is sufficient if what you want to do research in is something that is currently mainstream?


> Sure you can have access to journals but how do you even know what you should search for? I suppose this is sufficient if what you want to do research in is something that is currently mainstream?

I "research" stuff all the time, do a google search for some term and find an interesting paper then go through the references that seem interesting. Works better than you'd imagine, I was looking up CESK machines and down through the rabbit hole I eventually found the original paper with the non-obvious name of The Calculi of λ-v-CS Conversion: A Syntactic Theory of Control and State in Imperative Higher-Order Programming Languages. Honestly, those math-heavy CS papers tend to make my head hurt though.

I don't think many people would claim lambda calculus is mainstream, but I can spend hours upon hours finding stuff to read about, and most (or all) of the papers are easily downloaded from the authors' websites.


> If you are the luckiest out of every five entrants, you may win the profession's ultimate prize: A comfortable middle-class job, for the rest of your life, with summers off.

That's pretty good, actually. Your chances are far less than 10% in physics, and that's for postdocs, not students, and assuming you're from a top university with a good publication history in the top of the top journal(s).

Edit: For all the replies that focus on "summers off": I'm not saying you have summers off; that's part of something I quoted. I'm well aware from experience that you don't get summers off. Heck, you can't take time off even after a baby unless you want to jeopardize your chances at tenure. If you read the portion that I wrote, you can see that my reply is about job security (i.e., tenure).


As a former professor, I can tell you summers are the busiest work times as that is when you are trying to catch up on all the research activities you put off during the semester.

It is nearly impossible to find time to actually take a vacation. I had more than 6 months of vacation banked when I quit.

Edit: To add to your edit, tenure is far less secure than it appears from the outside. It allows you to get away with being a moderate pain, but if you get too bad, what they do is make your position redundant in the next departmental reshuffle. Since these occur about every 2 to 3 years, tenure is more illusory than real.

You not only have to worry about your position being made redundant, but your whole department. It is easier for admin to get rid of a few troublemakers by killing the whole department than just going after the troublemakers directly. A fair amount of my admin/politics time was spent defending the entire department rather than worrying about my own position directly.


As a current one, I can also tell you the same thing. Where did you get the idea that I am talking about summers being off?


I know you were quoting the OP, I was just adding to your post that professors don't get the summer off based on my personal experience.


Interesting, do tell more, please.

How much of a professor's life is spent showing that they are doing work vs. actually doing it?

What aspects are better and what worse? And which are popularly misunderstood (like summer vacation)?


Professors have three full-time jobs: administrator, researcher (well, supervisor of the people actually doing the research, but this is very time consuming if done right), and teacher. Each job is more than 35 hours a week (the time I was nominally paid for).

I was deliberately terrible at the administration side and outright refused to do anything unless it meant more money for my department (the one good thing about tenure is you can be a pain in the backside and not get fired), but even spending less than 10 hours a week on admin, my work week was rarely less than 80 to 90 hours.

One of the major reasons why I quit academia and went back to my startup was to have more time to spend with my family.


> One of the major reasons why I quit academia and went back to my startup was to have more time to spend with my family.

That alone seems like a rather damning indictment of academia.


Yep. I only work around 60 hours a week now.


Can you explain why you made a point to say "supervise students"? From your statement it looks like there is no expectation to do original/independent research. Is that the standard practise too?


I didn't say "supervise students", so that was not a point I made.

I did, of course, supervise students. This does vary by field, but done right, each student needs about 5 hours a week of one-on-one direct engagement (more at the beginning of their degree and less at the end). With less time than this, the student will struggle.

You can of course do what some of the big empire builders do and just let the student sink or swim on their own, but this is not good for the student. Good supervision is very time consuming, but probably the most rewarding aspect of being an academic.


I was referring to this:

> researcher (well supervision of people actually doing the research

You explicitly distinguished the two.

And thank you for your answers.


I am not danieltillett, but I am familiar with the difference between doing research and supervising research. A tenure-track professor is expected to advise students who do research. The professor's involvement in the day-to-day of the research itself will vary depending on the student's experience, how large the professor's group is, and what the professor wants. While I do know of some professors who did almost an equal amount of the day-to-day research work as their student on a project, that is rare, and such professors can typically only have one or two students at a time.

Instead, a professor usually spends more time supervising research. This is typically because a professor has multiple students and, as danieltillett explained, many other time commitments. But it is also because the point of the entire process is for the students to become independent researchers; that is, the research supervision is teaching the students how to be researchers. In my experience, professors in science and engineering tend toward being managers rather than being in the lab or at the keyboard.

I presume danieltillett made this distinction because there is a difference between supervising and doing the research.


Also your summers aren't off, necessarily. If you're at a research institute, it's not like the group of PhD students and postdocs you manage vanishes.


Seems like the [VC funded] startup grind. Working college grads to death with the hope of being Mark Zuckerberg (or maybe not anymore). But mostly to make billionaires even richer.

The irony is that US society now has more than enough wealth so that anyone who wanted to could live like a startup "founder" making whatever app they wanted on a basic income.


I think your use of the term "startup" would be more accurate if you said "VC funded startup".


Agreed. I made the edit.


I mean, it sounds like people have been confidently predicting the demise of academia for 25 years, and it has yet to happen. And I don’t know about you, but it certainly seems like we have still been growing exponentially since '94.


Insightful. Congress probably saw that as a blueprint for how to oppress nerds.


You give Congress far too much credit. They’re just pushing policies that sound good on the campaign trail, same as ever. And nerds probably cheered them on. These kinds of effects are hard to understand even after they have happened, let alone to see coming.


> You may end up $50,000 in debt

Remember when $50,000 sounded like some kind of scary worst case scenario for student debt?


Thank you for this: the most telling aspect for me is the history of it all. This is not a new problem. I would like to look at the follow-ups / aftermaths of each of these sources. I wonder what excuse(s) academia decided on.


You're welcome. I agree -- it would be great to see followups on each.

I don't think there was much response to any of them though.

As Upton Sinclair wrote about a century ago: "It is difficult to get a man to understand something, when his salary depends on his not understanding it."

How does that apply to the exploitation of starry-eyed graduate students? There may be vast amounts of self-serving denial of the pyramid-scheme aspect of much of academia.

Like George Orwell wrote in 1946: "The point is that we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield. ..."

I'd guess the outcome will continue to be mainly gradually increasing pain for all involved. Human systems seem to be able to tolerate a large amount of needless suffering when there is no obvious credible alternative and there are still some positive aspects of the current system. Related: https://www2.ucsc.edu/whorulesamerica/change/science_freshst...

Likely things will keep going on a downward trend until some significant shock causes a massive reorientation of resources. Alternatively, the shock may just be the crossing of various trend lines like increasing student debt versus decreasing graduate opportunities to the point where no one could justify the cost as an investment.

See also a few different theories of social collapse which could be applied to understanding possibilities for academia: https://en.wikipedia.org/wiki/Societal_collapse#Theories

And: https://en.wikipedia.org/wiki/Dark_Age_Ahead "Dark Age Ahead is a 2004 book by Jane Jacobs describing what she sees as the decay of five key "pillars" in "North America": community and family, higher education, science and technology, taxes and government responsiveness to citizen's needs, and self-regulation by the learned professions. She argues that this decay threatens to create a dark age unless the trends are reversed. Jacobs characterizes a dark age as a "mass amnesia" where even the memory of what was lost is lost."

And we are seeing that sort of amnesia in the USA in academia and other places -- where fewer people remember what academia used to be like decades ago.

Just like there is a growing amnesia where fewer people remember what it was like to go to school in the USA back in the 1960s when kids were taught how awful the USSR was because it kept its citizens under constant surveillance...

But we still might hope for a gradual transition to other ways of organizing research and discussion like via the internet (such as Hacker News) -- but people still need to somehow get enough available time to participate in productive ways.

And the original Nature article is an example of an attempt at self-correction.

Here are a couple of recent satires on academia, both from 2013:

From Amazon: "Option Three: A Novel about the University by Joel Shatzky -- When Acting Visiting Assistant Professor L. Circassian is fired and rehired in the same week (with a 35 percent pay cut), he is only at the beginning of a cycle of abuse and professional debasement at the university. Joel Shatzky has created an hilarious novel about the corporatization of higher education - a book filled with blowhard deans, corrupt politicians, grasping CEOs, inept union officials, inappropriately dressed students, and scholars in donkey ears."

https://www.lawrenceswittner.com/books/whats-going-uaardvark "What’s Going On at UAardvark? by Lawrence S. Wittner -- What’s Going On at UAardvark? is a fast-paced political satire about how an increasingly corporatized, modern American university becomes the site of a rambunctious rebellion that turns the nation’s campus life upside down."

Both relate to this Atlantic essay from 2000: https://www.theatlantic.com/magazine/archive/2000/03/the-kep... "The Kept University: Commercially sponsored research is putting at risk the paramount value of higher education—disinterested inquiry. Even more alarming, the authors argue, universities themselves are behaving more and more like for-profit companies."

Here is an essay I wrote mostly around 2001 on one way to fix one negative aspect of much of modern academia and other not-for-profits supposedly dedicated to working in the public interest: http://pdfernhout.net/open-letter-to-grantmakers-and-donors-... "Foundations, other grantmaking agencies handling public tax-exempt dollars, and charitable donors need to consider the implications for their grantmaking or donation policies if they use a now obsolete charitable model of subsidizing proprietary publishing and proprietary research. In order to improve the effectiveness and collaborativeness of the non-profit sector overall, it is suggested these grantmaking organizations and donors move to requiring grantees to make any resulting copyrighted digital materials freely available on the internet, including free licenses granting the right for others to make and redistribute new derivative works without further permission. It is also suggested patents resulting from charitably subsidized research also be made freely available for general use. The alternative of allowing charitable dollars to result in proprietary copyrights and proprietary patents is corrupting the non-profit sector as it results in a conflict of interest between a non-profit's primary mission of helping humanity through freely sharing knowledge (made possible at little cost by the internet) and a desire to maximize short term revenues through charging licensing fees for access to patents and copyrights. In essence, with the change of publishing and communication economics made possible by the wide spread use of the internet, tax-exempt non-profits have become, perhaps unwittingly, caught up in a new form of "self-dealing", and it is up to donors and grantmakers (and eventually lawmakers) to prevent this by requiring free licensing of results as a condition of their grants and donations."

And here is a book-length essay by me from 2008 on how to rethink Princeton University as a mental-health-promoting post-scarcity institution: "Post-Scarcity Princeton, or, Reading between the lines of PAW for prospective Princeton students, or, the Health Risks of Heart Disease" http://pdfernhout.net/reading-between-the-lines.html "The fundamental issue considered in this essay is how an emerging post-scarcity society affects the mythology by which Princeton University defines its "brand", both as an educational institution and as an alumni community. ... We can, and should, ask how we can create institutions that help everyone in them become healthier, more loving, more charitable, more hopeful, more caring..."

So essentially, if we want the better parts of old academia from the US 1950s-1970s back, there will need to be some radical changes. As G.K. Chesterton wrote in 1908: http://www.ccel.org/ccel/chesterton/orthodoxy.x.html "We have remarked that one reason offered for being a progressive is that things naturally tend to grow better. But the only real reason for being a progressive is that things naturally tend to grow worse. The corruption in things is not only the best argument for being progressive; it is also the only argument against being conservative. The conservative theory would really be quite sweeping and unanswerable if it were not for this one fact. But all conservatism is based upon the idea that if you leave things alone you leave them as they are. But you do not. If you leave a thing alone you leave it to a torrent of change. If you leave a white post alone it will soon be a black post. If you particularly want it to be white you must be always painting it again; that is, you must be always having a revolution. Briefly, if you want the old white post you must have a new white post. But this which is true even of inanimate things is in a quite special and terrible sense true of all human things. An almost unnatural vigilance is really required of the citizen because of the horrible rapidity with which human institutions grow old. It is the custom in passing romance and journalism to talk of men suffering under old tyrannies. But, as a fact, men have almost always suffered under new tyrannies; under tyrannies that had been public liberties hardly twenty years before. Thus England went mad with joy over the patriotic monarchy of Elizabeth; and then (almost immediately afterwards) went mad with rage in the trap of the tyranny of Charles the First. So, again, in France the monarchy became intolerable, not just after it had been tolerated, but just after it had been adored. The son of Louis the well-beloved was Louis the guillotined. So in the same way in England in the nineteenth century the Radical manufacturer was entirely trusted as a mere tribune of the people, until suddenly we heard the cry of the Socialist that he was a tyrant eating the people like bread. So again, we have almost up to the last instant trusted the newspapers as organs of public opinion. Just recently some of us have seen (not slowly, but with a start) that they are obviously nothing of the kind. They are, by the nature of the case, the hobbies of a few rich men. We have not any need to rebel against antiquity; we have to rebel against novelty. ..."

Or for a more modern take on that, from 1963, as John W. Gardner said in "Self-Renewal: The Individual and the Innovative Society", every generation needs to relearn for itself what the words carved into the stone monuments mean. He says essentially that fundamental values are not some long-ago-filled-but-now-running-out reservoir from previous generations but a reservoir that must be refilled anew by each generation in its own way.

Without necessarily approving of their specific actions, value re-creation is something that people like the late Aaron Swartz (taking on JSTOR and MIT with his efforts) and Alexandra Elbakyan (taking on Elsevier with Sci-Hub) were and are trying to do. Richard Stallman with the GPL and GNU Manifesto from 1985 as a response to proprietary software agreements in academia is another less-controversial example because he worked within the existing copyright laws. So are -- also less controversially -- Wikipedia, Hacker News, Reddit (Swartz again), Slashdot, Archive.org, GitHub, and many other internet-mediated venues -- which are creating ways to have discussions and learn about sci/tech/humanities topics outside of formal academia. They are all essentially treating formal academic systems as-we-know-them-in-practice as damage and routing around them.


> Jeff Schmidt demonstrates that the workplace is a battleground for the very identity of the individual, as is graduate school, where professionals are trained. He shows that professional work is inherently political, and that professionals are hired to subordinate their own vision and maintain strict "ideological discipline."

I'll need to read that book, thanks. In the meantime, I was also reminded of things Erich Fromm wrote, or said, in this case:

> Our main way of relating ourselves to others is like things relate themselves to things on the market. We want to exchange our own personality, or as one says sometimes, our "personality package", for something. Now, this is not so true for the manual workers. The manual worker does not have to sell his personality. He doesn't have to sell his smile. But what you might call the "symbolpushers", that is to say, all the people who deal with figures, with paper, with men, who manipulate - to use a better, or nicer, word - manipulate men and signs and words, all those today have not only to sell their service but in the bargain they're to sell their personality, more or less. There are exceptions.

-- Erich Fromm in an interview, https://www.youtube.com/watch?v=Cu-7UDT0Xe4&t=1m34s


Thanks for the interesting link to the Erich Fromm interview and the quote from the first half. The second half, about pursuing happiness through engagement in society (rather than leaving it to "experts"), is also quite interesting.


You're most welcome. I found this too late to edit the comment, but here's the full interview and a transcript:

https://www.youtube.com/watch?v=OTu0qJG0NfU

http://www.hrc.utexas.edu/multimedia/video/2008/wallace/from...


Does this remind others of other enterprises, like beginning video game programmers? When you're ready to leave because of the abuse, there is always another new person happy to take over.

The supply of new labor in the academic market seems endless. I have spoken to ~200 incoming grad students about careers, and never once did it sink in. My opinion is that it must be corrected further up the chain, in high school or early college.


That's like asking why there was an aristocratic court in Heian Japan when their skills in prosody and haiku were clearly not viable as jobs. I don't think every degree has to land a job, and education in itself must be a strong goal for our nation, not just economic feasibility. Don't be ruled by the Almighty Dollar.


> education in itself must be a strong goal for our nation

Too bad it isn't. Education is being used as a yardstick to isolate the poor from the economy. Don't have a BS? Guess you aren't qualified to work at McDonald's.

And for those who do get those degrees, like you said, haikus aren't a viable job. A lot of the history and literature majors of the world were spoon-fed what their parents thought: "get a degree, any degree, and you are ahead in the world", only to find out that "ahead" didn't mean a comfortable living. It just meant you had higher priority for the limited pie of work than those who didn't have that generic piece of paper.

If you want people not to be ruled by the dollar, change society so that your ability to be safe and secure in your day-to-day survival isn't bound so tightly to the pursuit and amassment of said dollar.


Decoupling Safety and Security from the Almighty Dollar is a noble pursuit.


I agree, but if we are going to agree to not be ruled by the almighty dollar, we have to stop giving so many almighty dollars to universities for degrees that will produce no almighty dollars after graduation.

Let people study fine arts with a minor in Russian literature. But we shouldn't subsidize huge student loans so they can get those degrees.


> “The arts are essential to any complete national life. The State owes it to itself to sustain and encourage them…. Ill fares the race which fails to salute the arts with the reverence and delight which are their due.” -- Churchill, 1938

We could debate how much subsidy should be allocated for arts vs. science, but I'd prefer that the wealthiest countries didn't restrict arts. They/we can afford it.


The arts won't go away if the US stops subsidizing them. Universities will just charge less for them.


Sure, but staying in school until your 30s is a bit ridiculous. Only so many people can do it. Think about it: you're living off of somebody else's dime until almost 40.


If you are actually aiming for academia (and are considered to have a good chance of making it), rather than being in academic wrappings around a professional education, you should have consistent TA/RA/fellowship support. And while it can be argued that, in living off research funding, you're living off someone else's dime, aren't you likely going to be living off that same dime for the rest of your life?


Strongly agree. I got my MSc in Biological Oceanography, and it was mostly a waste of time and money. I’d imagine ~90% of my graduating class feel the same way.

I transitioned into data analytics, then data engineering, and now customer facing software development, and love my career, but the MSc has contributed very little to it.


I wonder if things are just as brutal when you take industry into account as a place to wind up post-PhD. There are research labs in industry that are interested in hiring PhDs. But maybe this is just the rose coloured glasses we have in CS.


> when there is no domestic market for most PHDs.

Isn't this common knowledge, though? It's not like the NSF is stopping people from googling their career prospects before embarking on the journey.


I have a feeling most people don't really accept that it applies to them when the general atmosphere differs from the stats.


Just to clarify, I am faculty. I just don't take students ;P


Isn't teaching part of the definition of faculty? In other words, isn't it impossible to be 'faculty' if you don't take 'students' (i.e., 'teach')? At least anywhere I've ever heard of, this is true; someone who doesn't have students is considered 'staff'.


> Isn't teaching part of the definition of faculty?

No - I'm not sure where you got that definition from. You can be a researcher and be on the faculty of a department. "Faculty" just distinguishes academic staff from support staff.


I'm guessing you're either tenured or emeritus, or you're a research professor. You don't have that luxury on tenure track almost anywhere.


TT faculty at SLACs don't take graduate students (by and large) and still get tenure...


So, as tenured faculty, I don't know about the NSF advocating for careers in STEM. Maybe that's a good thing, maybe it's a bad thing.

What I do see ultimately driving this is something broader and more systemic: universities being underfunded, administrator-heavy, and run as for-profit businesses.

In many ways, this all reduces to how grants are awarded and funded, and the history of grant funding.

Through the 80s and 90s, and to some extent the early 2000s, federal grant funding increased. There was a sort of heyday in that time, depending on the field, when there were good funding opportunities, in general, for faculty.

At the same time, state funding to universities started decreasing, and costs started increasing due to the higher-ed bubble. Because of indirect costs on grants, which allow universities to charge the federal government a blanket percentage of a grant, universities were able to skim off a profit almost automatically. It seemed like a sort of win-win: faculty got to fund their research, and universities got a cut to offset decreased funding.

The problems are many, though. First, federal funding went down after a peak, so this money pool dried up. Second, this system sort of led to the pyramid scheme you mention, where grad students get recruited to maintain grants and research programs with nowhere to go, lest universities and faculty lose indirect funding. Third, it prioritizes research based on grant receipts rather than utility, which distorts research priorities in a perverse way (especially given nepotism in grant culture, which has been written about extensively elsewhere). Finally, because of the history of all of this, you kind of end up with a system in which those running the system are either beneficiaries of the 90s and 2000s, and don't realize what it is like now, or came up through that system and have a kind of survivorship bias.

Agencies like NSF and NIH are basically just maintaining any semblance of research in the US at the moment, making up for all the deficiencies that are occurring at the state level.

There are problems with the NSF and NIH, but they're not with advocating for STEM grads; they're with how grants are funded. Basically, review panels need to rotate randomly, almost like a jury pool; grants need to be awarded through some kind of lottery system (as proposed by former heads of the NIH); indirect cost charges need to be eliminated or made line-item justified; some grants need to be awarded on the basis of publication record without regard to a specific proposal (à la Hungary); universities need to be adequately funded outside of the grant system; and tenure needs to be protected (I say this as someone who is probably going to step down from my tenured position soon).

The problems facing grad students and postdocs are just the tip of the iceberg. They don't end even when you get a tenure-track position, or even when you get tenure. The system is broken; it's a mess that favors hype and popularity over rigor. What you end up with is the current reproducibility crisis, academic fraud, people manipulating the system to overclaim credit, and many more problems that society is unaware of because they have a stereotyped idea of what happens in science--the sort of "great man in the lab with a eureka moment" that is a false model of the scientific process.


I think you're not giving enough weight to the massive increase (as a ratio) in people seeking postgraduate education. Today about 12% of people have at least a master's degree, and that number continues to increase as education, even in fields without meaningful career prospects, is continually evangelized. By contrast, from 1975 to 1995 the percentage of people aged 25-29 who had even a bachelor's degree was only in the range of 21-25%. That number today is 36%, and again, growing. I doubt the ratio of gross 'revenue' (including grants) to total faculty is less today than it was in the 'heyday.' Given the massive increases in tuition, I would be quite surprised if it's not vastly higher.

I completely and absolutely agree with you about the perversion of research directions, but I think it can be pinpointed even more precisely to a single point you made: universities are increasingly being run as for-profit businesses. That is really the root of all of these problems.


Yes, and the massive increase in people seeking postgrad education was preceded by a massive increase in people offering postgrad education. AND at the same time, you remind me of working in a college where a project was started to launch a grad program, and the justification made in committee meetings was, "we need to get some grad students around here to teach the classes and free up our time. Otherwise we'll never get ahead."


You are correct that the current system is the result of a drawdown of past heavy funding. The timing of the drawdown was different in different fields. An NSF official explained to me in 1990 that, in the 1960s, 90% of grant applications were funded. By 1990, 90% were rejected. The same is true for hiring: in 1945-1970, with the G.I. Bill and the explosion of new universities (e.g., in California), almost everyone with a Ph.D. was hired for a tenure-track position, and many tenure-track positions were held by professors without Ph.D.'s. As you say, faculty only very slowly learn that the systems they came up in are gone, and so they over-prepare young people for a world that is gone. The future may be a combination of the older tradition that professors start with inherited wealth and a new pattern in which an academic career begins with 10 years of building wealth in business.


I guess I sleep fine at night and I am faculty. It's only a pyramid scheme if you think everyone wants to become faculty.


Talk about being in the bubble: Exhibit A.

People at the top of pyramid schemes get there by harming the people at the bottom and others outside the structure. One of the hardest lessons to learn in grad school was that those at the top can be incredibly nice, compassionate, affable people, but fantastically destructive.

If this topic doesn't concern you, I feel incredibly bad for your students and post-docs. It isn't a problem because it pops up on occasion; it's a problem because it's everywhere, and people with the power to change it soothe their consciences with straw men and pedantic aphorisms of tenuous relevance. The bias that helps you sleep at night allows the problems to fester.



