1. Working on too many projects at once.
I joke with my team that "we're never short on problems". It's tempting to hear about something interesting and want to be involved—or worse, feel like you should be involved. It's very hard to say no.
2. Taking on high-risk projects with factors out of my control.
If there's one thing I'm good at, it's setting expectations. But combined with #1, handling too many projects means you never have time to properly think through (and mitigate) the risks.
3. Taking on too many students.
You could flip this to "growing too fast," though that's not a problem I've personally had. There was a culture of mentorship at my previous company, which created pressure to take on mentees. Similar...
4. Wasted too much time courting companies for money.
Probably has an analog in the startup scene, but again, not a problem I've had.
5. Way, way, way too much travel. / 6. All the boondoggles.
Forget travel, the core message here is opportunities. The author says, "I found it hard to avoid this trap because you think, well, saying no means you're not going to get asked to do the more important thing next time."
Oh man, do I agree. And I agree that most of the time it's a moot point. But there's something to be said for "visibility" and there's a 1-10% chance that the one "boondoggle" you sign up for ends up yielding an opportunity you wouldn't have had otherwise. This is my major personal and professional issue, and I haven't solved it yet, either.
Anecdotally, I'd say these aren't just mistakes he made; these are systemic problems I see with academia (I can only speak from my experience in the US). Every professor I know under age 40-45 did or does these things. Among their many causes are peer pressure, fear (of competition, of missing out), tenure anxiety, hubris, and "tradition".
It depends. I'm still under 40, and many of the things he considers mistakes I don't (yet). That said, three of the things on the list I've been able to avoid, and I think it's made the rest easier. 1. I was lucky to get a bunch of money early, and stopped spending a lot of time writing grants after the first year. 2. I made a conscious decision not to travel excessively, to spend time at home, and to take vacations, for all sorts of family reasons. That said, I've used Twitter to supplement the backroom chatter I'm missing, and that's worked well enough. 3. I initially took on a lot of committee work but have backed off a ton in the last few years.
I do run a pretty big group, though I've downsized a bit (now 8 PhD students, 3 postdocs, many undergrads). We work on too many things and most are pretty ambitious, but that makes it fun. I wouldn't change much, honestly. Maybe I need a few more years before I regret everything, but things are pretty good IMO. Then again, I think you need a few chips to fall your way before things become manageable (e.g., early grants).
Yes, it really depends on the field. That said, in the last 5 or so years there have been a lot of early-investigator pools of money in biology. This is not true for everyone, but the hard years everyone talks about now are mid-career. I'm not there yet, but I'm starting to see how it gets a lot more difficult.
Getting a research grant is not "hitting the lottery", it's being recognized for your previous work and having people believe that your new proposals make sense and that you have what it takes to carry on your new ambitions.
Honestly, this sounds like a "reverse any advice you hear" sort of situation. Welsh did too much. However, from many people's perspective, Too Much is exactly what you have to do to succeed in academia. You would really need a retrospective survey from those granted tenure to actually calibrate how much Too Much is a healthy margin to move up in your job, versus how much is actually way too much and harms your prospects.
(I'm assuming that to some degree it will harm your health. This is unfortunate but true in most competitive fields.)
As an N of 1, a professor with tenure, I think he's exactly right about being wrong in what he did, from the perspective of what is rational in the overall scheme of things. However, what he did is exactly what a professor is expected to do, from the perspective of promotion.
I've read his essay before, and it touches a nerve, only because it's so revealing of how broken academia is.
Everything Welsh says was wrong is wrong for all the reasons he says it is. The irony is that he seems to blame himself, when he should be blaming the standards of academia. Everything he says about grant writing, conferences, etc. being irrational is true.
> Too Much is exactly what you have to do to succeed in academia
I won't dispute that statement with respect to his situation. Academia is very broad, and the expectations at Harvard are rather different from most of academia. At many academic institutions you can have a life and get tenure.
I think that statement is getting less true every year.
Increasingly, for instance, we seem to only hire people who would be tenurable today. So people are thinking they need to have a number of papers, etc., before they get a PhD.
It is an acceleration, and a person wonders if it doesn't result in people churning out stuff because it is publishable, rather than because it is worthwhile. And when you reward that, that's what you get.
At what point of this does he comment on the quality of his ability to teach students and lead them to become incredibly successful? This sounds more like a professional researcher than a professor.
1. Advising is teaching. In fact, advising is the majority component of an R1 teaching load! So, he DID discuss the teaching component of his job. A lot.
2. In terms of classroom teaching, he was probably teaching 1-2 courses per year at Harvard. So classroom teaching really wasn't the majority component of his job. It's unsurprising that he didn't spend much time talking about this minority component of his job, in the same way that it's unsurprising when a developer talks primarily about writing/reading/testing code rather than the ~20% of their workday activities that are something other than writing/reading/testing code.
Of course, one hopes that developers introspect on their effectiveness in various meetings, when talking to clients or internal stakeholders, and so on. But you'd be unsurprised if a dev, discussing their biggest mistakes during their first few years, focused on code-related mistakes.
3. Maybe he felt good about his non-advising, classroom teaching? The title of the post is "everything I did wrong", not "a summary of everything I did".
I think monsky's comment and the exchange below hit on an important point: different understandings of what a professor does.
I would venture that most people equate "professor" with teaching, while many of the people who responded to monksy understand that the position is much more than that. Seeing this dichotomy is important to having a productive discussion.
This happens a lot in life. Another example from higher education—University President. Most people equate University President with something along the lines of U.S. President and hold the requisite expectations for them. Reality = fundraising figurehead.
My point of contention is that professors who do not teach, or do so badly, should be treated and considered differently.
There should be a title of Graduate Advisor, and Teaching Professor (those who teach others to become a professor / teach in a classroom). There should be specialization in being an effective teacher. There should be pathways to success. Pushing all students into research and half-heartedly forcing them to learn how to teach only reinforces this idea.
For those "graduate professors" who just consider advising and mentoring grad students to be teaching, you have to apply that same logic: by mentoring anyone in the field, you're now a teacher. (That's a dualism)
My big issue here: we're completely ignoring the fact that a student comes to the university to learn. (Even if it's a research institution.) By assigning "professors" who don't actually teach, you're doing a huge disservice to everyone. In those cases students are left to their own devices, and they're not going to do well for research, community, or industry. Really, what's happening is people are paying for an apprenticeship to become a researcher.
> There should be a title of... Teaching Professor
This is already a job title! An alternative, often equivalent title is "Instructor". And these people are primarily responsible for classroom teaching! And they are primarily evaluated upon that teaching! What you are asking for already exists.
There are also colleges and universities where all people with the title "professor" are dedicated to classroom teaching. Harvard isn't one of them.
> There should be a title of Graduate Advisor
There is. For historical reasons, at Harvard, the "Graduate Advisor" is called an "assistant/associate/full professor". But don't get confused by the name. These professors don't do much classroom teaching, just like most software engineers don't build engines. It's just a title, not a job description, and getting angry about it is just as silly as getting angry that most software engineers don't write software for engine control units!
> we're completely ignoring the issue that a student comes into the university to learn
And assisting in that learning is not the majority component of the jobs of tenure-track professors at research institutions.
> Really what's happening is people are paying for an apprenticeship to become a researcher.
Research grants, not undergraduate tuition, pay for tenure-track professors' research at places like Harvard.
You seem to have a deeply flawed understanding of what a modern research university is, how it operates, and what undergraduate tuition money is spent on. That's perfectly fine, except that you're continually arguing with people who try to explain your misconceptions to you.
When academia wants to educate their own, they don't use lectures - they use apprenticeships. Lectures are a cash cow they sell to the rest of society.
Most students come to university to be credentialed, and the higher the prestige of the institution, the better - student demand seems to show they would rather listen to the ramble of a famous professor who can't wait to get back to the lab than an effective, dedicated teacher.
I've yet to see someone complain that software engineers don't build engines and are therefore bad at their jobs. Indeed, that'd be a silly argument.
Also, you've completely ignored my first point above. Even if we accept that teaching is the primary responsibility of a "Professor", it does not follow that this teaching manifests primarily as classroom teaching. Advising is teaching, and is the primary form of teaching undertaken by professors at certain research-intensive universities.
Incidentally, there are job titles that describe classroom teachers at research universities -- "instructor" or "teaching professor". The author of this post was not an "instructor" or a "teaching professor". He was an assistant and then associate professor. Classroom teaching was not a majority component of his job.
Luckily for every software engineer out there, historical cruft in the names of job titles is a bad criterion for evaluating job performance.
I don't think anyone would consider a software engineer to be someone who works on engines. They engineer software solutions. (Designing, developing, gathering requirements, and ensuring correctness)
From the dictionary: "Engineer: a person who designs, builds, or maintains engines"
Hopefully you see the point I'm making? Just because you think a job title means something, and just because that's an accurate historical meaning, doesn't mean that the job title retains the same meaning today.
I did say software engineer. It would be equally absurd to claim that a civil engineer or chemical engineer would be responsible for building an engine.
Yes, that was my point. Your argument -- that a professor should be expected to teach in classrooms a lot because they're called a professor -- is absurd :-)
Teaching is the reason why professors are there. If you can't/don't want to teach, you should be a professional researcher and admit that. That is what the taxpayers support when supporting universities. (Via tuition, etc.. [grants are not considered in this])
The current setup, where a professor can delegate teaching to TAs and overwork/underpay assistants/juniors, is ethically wrong, often illegal in work practices (payment below minimum wage), and fraudulent to the students paying tuition.
Repeating this line over and over does not make it so. (Tenure-track CS) professors at Harvard are not there to (primarily) teach (undergraduate) courses. This is just a fact.
> If you can't/don't want to teach you should be a professional researcher and admit that
Professional researchers do not typically advise doctoral students. So professors at research universities are doing a lot of teaching that someone at an industry lab probably would not be doing.
Classroom teaching is not the only form of teaching, and professional researchers do not grow on trees.
> Via tuition, etc... fraudulent to the students paying tuition
Tenure-track professors at places like Harvard fund themselves through grants, not undergraduate tuition dollars.
This is a big misconception regarding what professors do at research-oriented institutions.
Teaching is why people get professorships at teaching universities, where the main part of your job is to teach (typically 6-8 courses per year).
If you are getting a job at a research-oriented university, you typically teach 1-3 courses per year, and the focus of your activity is on running a world-class research program (doing research, training PhD students, and raising money to pay those students / buy equipment, etc.). Teaching is still important and takes time, but you have a lot of other responsibilities. There is an enormous number of pulls on a professor's time, though, and most work very long hours (my colleagues and I probably work 60-80 hours per week, and we aren't at Harvard).
I don't think it is appropriate to delegate teaching to TAs (I delegate much of the grading and help with homework assignments). Regarding pay for research assistants, there simply isn't a way to change this in the US. There just isn't enough government funding. Paying a single PhD student a stipend of around $30K per year for 20 hours of work per week* costs around $60-80K per year with overhead and other factors. Many US agencies (e.g., NSF) only fund around $100-150K per year if you win a grant (7% success rate for many of them), so you can only fund 1-2 PhD students with that, and you fought really hard to get that award to pay your students (lots of rejections). Most universities also won't let you pay them higher salaries, even if you want to as a professor.
I used to complain about these things as a graduate student, but once you get the job (and have made a ton of budgets) you see the world differently. I'd love to change things, but don't blame the professors for things way out of their control.
* 20 hours of work per week that's not related to their thesis, etc. Most of the time we are trying to get students funded on projects where that 20 hours of work does apply to their thesis, so that they get more out of it. They of course must work much more than 20 hours per week on their thesis project(s) to do cutting-edge work and graduate on time.
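The budget arithmetic in the comment above can be sketched as a toy calculation. The function name and all figures are just the commenter's rough numbers (stipend, loaded cost, award sizes), not official NSF or university rates:

```python
# Toy sketch of the grant-budget arithmetic described above. All numbers
# are the commenter's rough estimates, not official rates.

def students_fundable(annual_award, cost_low=60_000, cost_high=80_000):
    """How many PhD students one award can support per year.

    cost_low/cost_high: estimated fully loaded annual cost per student
    (a ~$30K stipend plus overhead and other factors, per the comment).
    Returns a (pessimistic, optimistic) pair using integer division.
    """
    return annual_award // cost_high, annual_award // cost_low

# The comment's typical NSF-sized awards of $100K-$150K per year:
print(students_fundable(100_000))  # lower end of the award range
print(students_fundable(150_000))  # upper end of the award range
```

At the commenter's numbers, even the larger award supports at most two students, which is the "you can only fund 1-2 PhD students" point above.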
There are also a lot of non-university research positions. The US has a large number of national laboratories where researchers work full time (mostly in the physical and mathematical sciences), plus government agencies like the NIH, FDA, and CDC where the focus is on biology/medicine. And other nations have other non-university jobs, like Germany's Max Planck institutes for practically every type of science.
There are a lot of commonly held misconceptions about the job of a professor in your comment, so I will break it down line by line. Note that this is simply for laying down the reality of today's professor job market rather than advocating that it should be one way or another.
> Teaching is the reason why professors are there.
This is a commonly held belief by non-academics. It is not one held by academics at R1 schools.
Note that there are many lower level universities (not R1) at which the teaching is sometimes quite good. There are also very real things called "teaching colleges". If someone wants to go to a place with great teachers, they largely should not go to an R1 school.
> If you can't/don't want to teach you should be a professional researcher and admit that.
Being a professor at an R1 school is basically making the claim that you are a researcher -- one who happens to teach from time to time.
> That is what the tax payers support when supporting universities. (Via tuition, etc.. [grants are not considered in this])
Are you sure about that?
This is a potentially huge discussion, but "teaching" is not high on the list of added value if you attend an R1 school. In fact, the instruction is often terrible. The idea is that you will be given access to resources, and smart, motivated people know how to avail themselves of useful resources.
Taxpayers get research labs that spawn new industries, develop mature industries, and contribute to high value-add labor markets (great for local economy).
Students get access to bleeding edge research, researchers, and tools.
Much tuition these days goes towards facilities that the people who can pay want (e.g., nicer dorms, gyms, etc.). Note that many of the top student scholars at R1 schools pay little or no tuition.
One could also argue that people pay tuition for access to certain social circles -- social capital is a real thing with real value at a wide range of R1 schools.
I could go on, but the value derived from how tuition and taxpayer money is used is a deep and complex topic.
> The current setup where a professor can delegate teaching to TAs, and overwork/underpay assistants/juniors is ethically wrong, often illegal in work practices (payment below min wage), and fraudulent to the students paying tuition.
The delegation part is fine -- the content of the classes that have TAs is not terribly challenging or interesting to a professional in the field, so delegating it (especially for large classes) is an intelligent allocation of human resources.
That said, the illegal parts you mentioned are terrible and should not happen. To be fair, the only cases I've directly seen of overly worked TA/RA labor was when the grad student was doing something wrong (e.g., grading with heuristics that were overly granular). That said, I have read stories, and I believe that they are true at least in part if not in whole.
Well, he does talk about having too many PhD students to really mentor them well. For better or worse, TT professors at modern R1 universities aren't really there to contribute to undergraduate education. They're there to do research and to mentor graduate students.
When I interview and they ask me "Tell us about your biggest flaw" or something along those lines, I too always answer that it's that I'm too ambitious and that I work too hard.
The "what are your weaknesses?" question, when devoid of context, is always a stupid question that is guaranteed to evoke either a disingenuous response or an honest one that will count against the candidate.
Much better to use a behavioral interviewing approach where the question is asked in the context of a specific past project which has just been discussed at length.
If I get asked this question, I always stop intentionally as if I’ve never been asked this before, look out of the window (if there is one), then look back at the interviewer and say - with lots of pathos: well, I guess I’m too much of a perfectionist
Yes, that's the classic standard response to this question, because it makes a kind-of positive outcome.
But, really, isn't this the only kind-of positive response to this question? Is there any other response that works this way? And hence, why are interviewers asking this question in the first place? I'm pretty sure they know they will always get some variant of "I'm a perfectionist" as a response, because that's the textbook answer to this textbook question.
I had one guy who was serious about this and I learned a lot through that: he sincerely asked this, but differently: “What are the things you are working on to be better at?”
At first, I was kinda startled and couldn't really answer; I tried to bullshit him like the answer I gave above. He called it, though, and said that he was disappointed by such an answer and that he'd expect everybody to be working on 2-3 things. Otherwise, he concluded, a lack of introspection is to be assumed. Fair point, I thought
The interviewer may be looking for self awareness, in which case any honest answer will be positive.
I am good at lots of things and bad at others. I will be most valuable on a team with others who can mitigate my weaknesses. I don’t want to be hired for a role where I will be expected to excel at my weaknesses.
If someone asks me this question, I dig deep and give an honest answer. There are lots of jobs. I don’t think of my interviewer as an adversary. When I’m interviewing, I’m looking for someone who wants to hire me with eyes open.
> But, really, isn't this the only kind-of positive response to this question? Is there any other response that works this way?
It's not a good question. People answer it in many different ways. Some people give a clearly rehearsed stock answer that they've read online. Some people give an example of something they're not great at, and how they think they have started to rectify it. A small number of people give an open and honest answer.
Some people will flip it back at the end of the interview when they're asked if they have any questions: "Earlier you asked me for my biggest flaw, so what's the biggest flaw with this company?"
> And hence, why are interviewers asking this question in the first place?
Ideally they don't, but if they are asking it they should be looking for either honesty, or for interview preparation (because some variant of this question is routinely asked).
Also if the interviewer is a naive, cultureless soon-to-be corporate tech drone, fresh from the saccharine bleatings of their politically correct and sex-less, drug-less university days, with no concept of sarcasm, or human complexity, or of what a hideously self-important and embarrassing answer it is.
Completely agree. I also think, "disingenuous" since it's the most obvious fake answer, and also "uncreative" since they couldn't come up with anything better.
The author mentioned not taking on too much risk and avoiding straying too far from the lab.
When I studied CS I was never that excited by the pure, abstract CS bits. I was a lot more interested in how we could apply those bits in the real world to effect change in people's lives. Volcano sensors may be high-risk, but they may also have a lot more impact than more theoretical/abstract work.
That being said, I'm grateful we have researchers interested in the theoretical stuff, otherwise practical guys like me wouldn't have anything to apply. So it's important to realise that while "minimise risk" might be good advice for the author, for another researcher "maximise real-world impact" might be sounder advice.
The funny thing about this type of thought exercise is wondering whether you would be a better person.
I am who I am because of all the mistakes I have made, because of all the crises I went through during my projects, because of the laughter and satisfaction in my life so far.
Unless you have, like, a beachfront university, no one should be spending more than ~20-21 hours a week actually working (commute included). Slight caveat 1: once decent VR gets here, and you can make it to a reasonably nice VR beach from home, sort of like in that Valerian movie... but more MMO style. Slight caveat 2: if your work is really fun or (reasonably) progresses the state of humanity to where no one really needs to work / everyone is supermodels / environmental and social problems are all solved etc.
(I say this without considering it that deeply, but it's probably right.)
As others have said in the US if you're teaching a class in a university or college you're generally called a professor. I definitely used that title for all my teachers throughout my college career.
Yes - at the time Matt graduated, this was still the norm in CS. Things are changing and it's becoming more common to do a post-doc, but it's still not unusual to go directly (in the US).
I was going to write that I disagree with you, but according to the Taulbee survey this is true:

> The percentage of Ph.D. graduates who took North American academic jobs rose in 2015-16 for the second straight year, to 30.7 from 29.0 last year. However, the percentage of graduates taking tenure-track positions in North American doctoral-granting computing departments fell from 10.0 in 2014-15 to 9.0 in 2015-16. The percentage taking positions in North American non-Ph.D.-granting computing departments fell from 2.3 percent to 1.6 percent, while the percentage taking North American academic postdoctoral positions jumped from 9.7 percent to 14.3 percent.[1]
In the US the important form of "promotion" is in the form of salary. Titles are nice and probably provide some motivation, but if you can become a chaired professor at a top university, you will bring in a salary several times that of a new hire.
To further clarify: The UK has the distinct titles "reader" and "lecturer" before "professor." In the US, if you have a PhD and you're teaching a college class, you're called a professor, even if you're an adjunct (not tenure track). The tenure track titles are "assistant professor", "associate professor" and then just "professor." Informally, we tend to use "professor" to refer to all of them, and it's common for people to say "Professor Doe" even if the person is an assistant professor.
I have never seen the title "lecturer" used in the US. Can you recall where you saw it? The title "instructor" is quite common when the person does not have a PhD.
Lecturer is used at the University of California for instructors with a PhD and no research responsibilities. Some of them even have a form of tenure, "security of employment."
Lecturers are part- or full-time teaching faculty (usually not tenure-track, but some schools, like the UCs, have tenure-track Lecturer positions) -- e.g., https://profiles.stanford.edu/48960
Interesting; it was not a title used in the schools I attended (Virginia Tech, William and Mary), nor have I seen it used elsewhere. I just checked Harvard, and they use it too, so it's not just a west coast thing.
Here professor is quite a distinguished title, and while it isn't legally protected, I've never seen it used for anyone except a very senior academic who has been formally appointed as a professor.
Looks like he's about 45, which is extremely young to be a professor, but that still leaves a 10-year gap between PhD and tenure. I think he just looks very young.
45 is definitely not extremely young to be a professor. Not in the US, where you start as an assistant professor. I know plenty of PhDs who graduate and start in a university as assistant professors, with the age range starting at around 26. My advisor started at 28.
In the US, tenure-track university faculty typically have three ranks: Assistant Professor (upon hire and prior to tenure), Associate Professor (typically awarded at tenure), and Full Professor. All of these ranks are referred to as "professor"; in fact, non-tenure-track faculty are also frequently referred to as professors.