Why don't schools teach debugging, or, more fundamentally, fundamentals? (jakeseliger.com)
107 points by jseliger on Jan 17, 2024 | 154 comments


In my first CS class at university the professor started with "We are not going to teach you programming. You can do it on your own time. This program is called computer science, and that's what it will cover."

It was a bit harsh at the time, but over the next semester or two plenty of students switched majors when they couldn't keep up with graph theory, combinatorics, linear algebra, advanced algorithms, computer architecture, compilers. Too many people are enrolling in CS programs these days because they want a quick ticket to a cushy tech job, and I'm happy that's not what reputable programs are catering to.

Funnily enough, there were a few classes on debugging, but they weren't about the process of finding bugs in code; they were more "here's the GNU debugger, crack it open and figure out what's going on inside". I also don't understand how one can even "teach" debugging the way the author envisions. It's like asking why schools don't teach problem solving or why schools don't teach intelligence. If students are getting regular non-trivial programming assignments, they have no choice but to develop the skill themselves.


> In my first CS class at university the professor started with "We are not going to teach you programming. You can do it on your own time. This program is called computer science, and that's what it will cover."

That was my school. Because of the education system in Slovenia, we even looked down on software engineering a little. Engineering was the vocational college. We were a university program thankyouverymuch.

In reality, our program took 7.5 years to complete on average because people got jobs in industry and forgot to graduate.

BUT we did have 2 programming classes in the first year. It was considered necessary to give everyone an even playing field so they could do the tutorials. Those were heavy on “You learned all this theory, now put it to work”. We implemented programs in ARM assembler, various graph algorithms, numerical computation methods, stuff like that.

Just writing lots of code also taught you debugging. Many classes evaluated homework and even exams by running your code through a test suite. Your grade was the percentage of tests passed.

And then there was the compilers class. Holy shit do you learn a lot about debugging when you’re fixing your code and your compiler at the same time. That was fun.


You teach it like you teach any other skill: with guided practice. People get a lot better at debugging if you help them through some of their problems and then let them come to you when they're stuck. The thing with debugging is that you can struggle a lot and do it very inefficiently, so I see plenty of students who power through these non-trivial assignments while taking far longer than they should, because they never learned the fundamentals of how to validate their code. They have no idea what tools are available to them, how to design checks that efficiently test their hypotheses about why things aren't working, or how to design code so that it works the first time as much as possible.
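To make "checks that test a hypothesis" concrete, here is a minimal, hypothetical sketch (the function and the suspected bug are invented for illustration, not taken from the comment above). Instead of staring at a wrong total, you state the hypothesis "the parser already emits a bad row before aggregation" and place a check exactly at that boundary, so a single run either confirms or refutes it:

    # Hypothetical example: the report total comes out wrong, and the hypothesis
    # is that parse_row() already produces a bad row before aggregation happens.
    def parse_row(line):
        name, qty, price = line.split(",")
        return {"name": name, "qty": int(qty), "price": float(price)}

    def total(lines):
        result = 0.0
        for line in lines:
            row = parse_row(line)
            # The hypothesis check: if this fires, the bug is upstream of the
            # aggregation; if it never fires, the hypothesis is refuted and you
            # look elsewhere.
            assert row["qty"] >= 0, f"bad row out of the parser: {row!r}"
            result += row["qty"] * row["price"]
        return result

The point is not the assert itself but where it sits: one observation placed at the boundary between two suspects answers one hypothesis per run.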


You and your professor rightly call out the distinction between computer science and programming. I wish more new grads understood that programming and software engineering are also not the same thing.

I've seen great young software engineers feel like they aren't succeeding because they aren't belting out hundreds of lines of code on every little task like some of their peers do. The reason the good ones aren't doing that is because they see a more practical way forward (i.e., find a better set of tradeoffs). For example, finding a good package instead of writing it. Or collaborating with stakeholders to simplify requirements. By taking more practical paths, they avoid the cost of developing and maintaining all that extra code. They are considerably more valuable because they are doing actual software engineering and not just programming. But so many new grads, especially the ones belting out so much code, think programming and software engineering are the same thing. They are not.


Programmers should also make the pragmatic choice of using a library vs coding it themselves...


There's still a mix of "CS courses aren't up-front about not being about programming" and "programming courses are not advertised enough". I totally agree with the approach and the split, but... often it looks like people are set up for failure. I've definitely seen that happen during my time at uni. There's a lot left to fix at the system level.


I wish my university had been up-front about that. I did, in fact, teach myself programming - and had been since high school, more or less - but I didn't realize how poorly prepared I was going to be for a career in software engineering after a computer science track.

I loved all the "academic" classes, though, and almost minored in math. (Turns out, while I loved them, I wasn't particularly good at them.)


> Too many people are enrolling in CS programs these days because they want a quick ticket to a cushy tech job

Or maybe, people are enrolling in CS programs these days because they want to learn how to program. Which is a perfectly appropriate skill to teach. The fact that schools fail to teach it is their defect.


If people were enrolling in physics programs because they wanted to learn to cut hair, would it be prudent for the program to shift focus to hairdressing? Or, perhaps, it is the students that would be better off recognizing that if hair cutting is what they want to learn, they would be better off in a hairdressing program? It is not like there aren't countless schools that do offer programming programs. In fact, I suspect you will find that there are more schools offering programming programs than there are schools offering CS programs.


> The fact that schools fail to teach it is their defect.

Of course it is not a defect to teach the subject matter of the degree.

Teaching programming is also a perfectly nice skill to teach, but it's not computer science.

You wouldn't enroll in mechanical engineering and complain they don't teach how to change oil in your car.

However, by the same token, if you do study mechanical engineering and aerodynamics, you can then go into a career in car mechanics and get much farther (e.g. Formula 1 car development) than if you had only studied fixing cars at a trade school.

The problem with this field is that software engineering and computer science and trade-school level programming all get conflated when they are distinct but related fields. Maybe it will take many more decades of maturing before the field sorts these out properly, such as has already happened in more mature fields that have been around for centuries.


In Germany there are two distinct kinds of higher education institutions addressing that. Vocational universities that teach skills in a very practical manner, and generally more prestigious "plain" universities that spend more time on fundamentals (as in scientific foundations) and teach under the assumption that alumni will contribute to academia themselves - even if this assumption turns out to be false in most cases. In addition to that there are also vocational programs in software engineering that don't count as higher education at all.


The same is true in America. It is just that there was that wildly successful marketing campaign a few decades ago that went something along the lines of "Go to university or you will be forever left to flip burgers", so most people won't even stop to consider vocational training from traditional vocational schools, and instead are always pushing for universities to become the vocational schools instead.


Being able to program in the context of computer science is like being able to build an airplane in the context of aeronautics.

If one is not interested in things that are foundational to the context they operate in, why are they there?


Do car mechanics and car hobbyists take a semester to study Carnot cycles, thermodynamics, and entropy?


Car mechanics aren't enrolling in mechanical engineering programs and then complaining that they aren't being taught how to fix a car.


Only if those subjects interest them. Otherwise, like the previous commenter asks, why would they bother?


Universities offer mechanical engineering degrees, not just physics degrees. In the same way, they need to offer software engineering degrees, not just CS degrees. Some do, but too few.


Must be a regional thing? It appears that all of the major universities in this country offer software engineering programs.

That said, someone wanting to learn how to program would be best served by a community college/vocational school.


As far as I can tell, most of the "major" universities offer one CS course (sometimes two) on software engineering but no degrees (e.g. Berkeley, Stanford, MIT, Princeton, UIUC, Cornell).

CMU, UT, and UW are among the few prestigious schools I could find that explicitly offer a Master's in Software Engineering (but no undergrad degrees, certainly).

Other than that, Georgia Tech offers an undergrad certificate (3 classes). Berkeley apparently has an online program for a Master of Molecular Science and Software Engineering, wherein you mostly take chemistry classes along with one data science course and one CS course.


Funny how schools say "We are not going to teach you programming"

yet then you have 5-10 strongly programming-related subjects that would benefit from two semesters of a solid introduction to software engineering.

But you'd need competent software engineers to teach it; instead you get people who teach you SOLID and other buzzwords they barely understand.


> It's like saying why don't schools teach problem solving

Don't they do that in the lower grades (around 7-10 years old)?

Or in general, I would see our math classes as lessons on how to approach complex problems ("prove that X"), and on what strategies are available to match what we know with what we don't know.

> why don't schools teach intelligence.

Define intelligence first.

Otherwise, I went to a university that taught both science and engineering, and indeed we had sessions sitting in front of an editor, looking at core dumps and asking the assistant professor for advice. Like any other skill, there are ways to teach debugging.


The vast majority of CS grads won't even need to use most of what they're taught when they finally land a job. The fact is that most of the jobs available are the ones you call "cushy tech jobs", where you can get by with 2-3 years of programming-focused schooling.

Only quite niche tech jobs today require exactly what CS teaches.


> The vast majority of CS grads won't even need to use most of what they're taught when they finally land a job.

Should they? A materials science graduate isn't expected to use much of what they learned when they get a job as a welder either. CS is for those interested in the academic pursuit of the sciences. It is not for vocational training.


Funny enough to the topic title, actual CS topics tend to become useful when debugging.


> It is not for vocational training.

And yet most companies require a CS degree when hiring.


You still need some way to keep out the poor and underprivileged who are not good for business. Overtly stating they are not welcome doesn't fly anymore, but rejecting anyone who isn't in a position to be able to casually engross themselves in the sciences for several years for no reason other than personal enjoyment is a decent proxy.


There are good books about debugging, simple but effective.

https://www.amazon.ca/Debugging-Indispensable-Software-Hardw...


Ex-teacher here. This was a common complaint from other university departments. We did not include debugging in the curriculum; instead we taught debugging when a student got lost in their own code. Experience showed that was the moment they grokked it most easily: when they had their own code, their own baffling test case, and suddenly a new tool made it transparently easy to fix. They loved it immediately.

When we tried it differently, the times we taught debugging on the whiteboard, it was a mess. Debugging requires carrying a lot of context line by line. Having 80 eyeballs watching the same debugging exercise made the bug shallow and stole the aha experience from individual students.


This is very true. The whiteboard portions of the practical programming classes were among the most boring when I was going through school, but actually trying to solve a problem myself was quite enjoyable and much better for learning. If I remember right, we had a brief introduction to debuggers as a concept for better debugging, but we were largely left to our own devices to figure out the features, because everyone would need different things from them at different times.


That comment right here is spot on.

It is the same as driving a car: you cannot teach someone to drive a car by giving a lecture, and you also cannot teach someone to drive a car when all they need is a train line and they are happy with it.

Most of the discussion about "people not fit for the job" is really about whether a person has enough genuine interest in the topic to grind through and find their way around it.

I find that people who are not interested in computers suck at using them, but a lot of the time it is only that they are not interested while having to use them for some things. Everyone who puts in effort and is interested gets by quite well.


The comment actually demonstrates desirable difficulty, not that you can't teach someone. The way they taught debugging was flawed.


For me "demonstrate desirable difficulty" seems like "can't teach in a class setting" which in context of "why don't schools teach debugging" seems like "can't teach someone".

You most likely can teach someone in a 1-on-1 setting, and it would be easy to set up something to work on together, but that won't work at school. Even when we had laboratories at university, there was not enough time to get everyone through a laboratory exercise in a group smaller than a lecture.


Universities are supposed to teach academic skills - i.e., things that are believed to be true by the academy. "How to program" isn't something the academy understands, so they generally can't teach it [0]. By extension, they can't teach debugging either. People will need some sort of trade skill system to pick that stuff up.

I'm tempted by both sides of the argument, but I think the professors have a point about leaving students to sink or swim. At the end of the day a professor generally has a lot of teaching experience, and most of the younger ones I've seen seem to go through a phase where they think they can teach every student. Very few (none?) of the older ones do. I think that is an important clue.

[0] Observe that teaching language semantics and syntax is different from learning how to program using that language. As far as I know, among professors there is a strong consensus on what C is, and no consensus on how to use it to solve practical problems.

EDIT Dear me, this sounds awful. Poor man. https://jakeseliger.com/2023/07/22/i-am-dying-of-squamous-ce...


In the case of debugging, I would say the academic side would benefit from teaching the fundamentals of debugging: debugging at its core is just science, the process of measuring and perturbing a system to understand something about how it works. You can teach someone gdb, but if they don't get the fundamentals then they will struggle to actually debug anything.
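As a hypothetical sketch of that "perturb and measure" loop, in the spirit of delta debugging (the function names and the still_fails predicate are assumptions, not anything from this thread): repeatedly remove pieces of a failing input and keep only the perturbations that preserve the failure, until a small reproduction remains.

    # Hypothetical sketch: shrink a failing input while the failure persists.
    # still_fails(data) is assumed to run the system under test and report
    # whether the bug still reproduces on that (smaller) input.
    def shrink(failing_input, still_fails):
        data = list(failing_input)
        chunk = max(1, len(data) // 2)
        while chunk >= 1:
            i = 0
            while i < len(data):
                candidate = data[:i] + data[i + chunk:]
                if candidate and still_fails(candidate):
                    data = candidate   # the perturbation kept the failure: keep it
                else:
                    i += chunk         # the failure vanished: undo and move on
            chunk //= 2
        return data

Each iteration is an experiment: perturb (drop a chunk), measure (does it still fail?), and update your beliefs. The much smaller input it leaves you with is usually where the real reasoning about the defect starts.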


I think you are right to frame debugging as a scientific process. It really is about observing and measuring while perturbing a system in some way to provoke it into giving you observations.

But as roenxi points out, academia doesn't see programming as something they are supposed to teach. Academia sees programming as a vocational skill. Something to be learned "on the job" - or done by "other people" (whoever they might be). They see it as their job to, mostly, teach you the math that is needed to model behaviors of software and systems.

I think the closest I ever got to something practical was system design/modeling. And as if to demonstrate how terrible academia is at keeping itself updated, the methods we learned were, without fail, known to be impractical, outdated and wasteful. Nobody who took that class would ever apply anything they had learned in it later in life.

Computer science is a bit odd. If you become an architect (the kind that draws houses) it is understood that you are not going to be actually building houses. You will probably never even touch a hammer in any professional capacity. You will work with someone who understands the mechanical bits, building code etc and together you produce a package of instructions that someone else has to realize.

However, if you are a CS grad, it is fairly likely that you will be writing software. Which makes the whole thing a bit absurd. It is as if you expected architects (the house-designing kind) to build houses, but never even give them any training in how to nail two planks together.


This strikes me as so odd. My CS degree had tons of practical classes, but they were all applications of things you learned: since the best way to learn to program is by doing, the quasi-vocational, learning-to-program portion of the degree was the assignments for the classes. We had loads: your standard welcome-to-Java course; the second-level course where we had some of the early Google Nexus phones and experimented with building little apps [0]; then a whole class on development methodologies with a lab/group-project portion, where we added to a little digital health document system that was passed from class to class, taking the best version(s) from each semester and building on it the next; and at the end a senior capstone project. Each semester generally had at least one largely practical class.

> It is as if you expected architects (the house-designing kind) to build houses, but never even give them any training in how to nail two planks together.

Well, architects don't build houses; they design them. Programming is best learned by doing, after an introduction to the basics of some language. I have a hard time believing programs just fail to mention the existence of debuggers over the entirety of the program. A lot of programs expect you to figure out the precise tools you want to use for yourself.

[0] The core assignment of this actually kind of failed because it was flawed from the beginning, but it was still quite useful. We were supposed to build a lock screen that "locked" the phone when we went above a configurable speed. The main issue was that the GPS data available to apps in that early version of Android was very spotty.


Academia does teach practical programming. The CS department usually doesn't, as it's not their job.

You can see the same situation with the mathematics department. They teach mathematics, not doing calculations. If another department has a need for more practical mathematical skills (physics is the usual example), they often teach their own classes.

Like many other biases in academia, this is mostly a matter of self-selection. If you want a career in software, industry is clearly the better option. If you want a career specifically in CS, there are not that many good alternatives to academia. Which means that CS departments are populated by people with no particular interest in software.


> Which means that CS departments are populated by people with no particular interest in software.

This is bunk, from my experience at UC Berkeley (PhD) - the professors I knew/worked with loved to program. One explained to me that teaching+research+academic duties left no time for programming on campus, so he spent another 40 hours a week programming at home.


Your experience supports the statement. If the professors you knew had a particular interest in software, they would turn their work focus towards that, not focusing on teaching, research, and other academic duties that are at odds with software.

They prioritize the latter as those are in line with their particular interest. Software is just a sideline, just like I'm sure most of us here have some kind of other interest that we partake in at home.


They were programming language professors. They most definitely had an interest in both programming and academia.


Nobody is questioning their dual interest, but it is clear what their particular interest is (hint: not software).


I see. So anyone flipping burgers at McDonald's must have a passionate interest in and life goal to be flipping burgers. Otherwise, why would they spend so much time on it?

If I understand your argument correctly, you're saying that if someone in practice spends their time doing X and not Y, then they cannot actually be more interested in Y. I believe this flies in the face of human experience.

If you love software, there are a limited number of careers that involve writing software. If you do not have the privilege of a large trust fund, you're looking at industry or academia. Neither tends to involve just sitting in a room and coding all day. (Those situations that do probably won't satisfy a post-teenaged love of software.) Especially if you have a more academic love of software, academia is not a stupid option, even though you realize that your time is going to be filled with a lot of non-programming duties. Yes, you would probably spend more of your time programming if you got a corporate programming job, but loving software does not mean automatically loving any situation where you are typing code into an editor.


This doesn't fit with my experience. Most (90+%) of the developers I've worked with have a CS degree. Those who didn't mostly had math, physics or chemistry degrees. A small minority have no degree at all.


Maybe universities would benefit from a formal programming lab discipline, like they have for physics and chemistry. One where you document what you are going to do, go into the lab and do it, and document what you concluded from it.

And yeah, finding bugs in code that was already there when you arrived can be an exercise.


Or just require CS students to take at least one lab science course. I started in the engineering program and switched to CS, but by then had already taken lab-based chemistry and physics. Those courses taught me more about “debugging” than any dedicated engineering or computer science course I took (and I did take the software engineering courses that were offered). In fact, at a reception dinner during graduation week I was seated at the same table as one of the deans, who asked if I had any recommendations for improving the program. I said that they should require students to take physics and he was shocked - what would that do for students??? I explained my reasoning and he understood what I was getting at. But I don’t believe they ever acted on my suggestion.


> "How to program" isn't something the academy understands, so they generally can't teach it

This was my (limited) experience with (introductory) formal CS.

Which is sort of amazing to me as art schools seem in broad agreement that “how to draw” can be taught, albeit in many different ways.


There isn't a single fine arts program in the world that will teach you how to draw lines on a piece of paper. If that isn't something you have figured out how to do in the first 18 years of your life, you probably shouldn't be pursuing it at university.


> [Younger profs] think they can teach every student. Very few (none?) of the older ones.

I buy that argument, but I would ask: if they can only teach some of the students, then are they teaching the right ones? Bess's example in the article suggests the answer is "no".

It is easy to "teach" only the students who don't need to be taught, and pat yourself on the back for your success. The students who sink are a mixture of people who are going to sink no matter what, and people who are only sinking because the teacher isn't giving them what they need. The students who swim are a mixture of people who have the right background to take advantage of what is being taught, and people who already know enough to not need to be taught (either by knowing the material already, or by being far enough along to pick it up along the way when working on the next step).

It sounds like professors today only care about swimmers, not sinkers. They would produce more value by ignoring some of the swimmers and paying attention to some of the sinkers. (Not saying this is possible within the current system, just that it would be better.) If your goal is to fill up a bucket, you will fail if either the bucket has a hole in it, or if it's already full.


> Universities are supposed to teach academic skills

The passive voice does a lot of rhetorical heavy lifting in this sentence. Who supposes that?


The people who control the syllabus? The research staff? Administrators?

Take your pick. Universities teaching academic skills is not some novel innovation, these are traditions that have been ongoing for centuries. If they were interested in practical skills, entire departments would be removed immediately.


I didn't read through the entire article, but I disagree with the overall premise.

I remember being in college courses where the entire class would come to a standstill because 1 or 2 individuals just didn't get it. I can't count how many hours were wasted not getting into new material because of this issue; I would have much preferred the professor ask them to self-study and/or take advantage of the professor's office hours.

This is a case of "who do you harm?". The ones who aren't getting it or the ones who are but are forced to wait?

If you can't self-study then perhaps you should fail out. I myself had hearing problems while in college and couldn't hear the professor talking in class. I still graduated with a double major in CS and Math because I passed all the coursework despite not being able to hear the professor speak (and this did cause problems for me on occasion when professors would call on me).

Having said all of that, I do have some sympathy. I once (late 90's) asked a professor why they didn't teach us many of the skills I was picking up on my own outside of class. His response was that they only have so much time, so they must be focused in what they teach, and therefore they strive to give a base of knowledge so that you can learn on your own.

OTOH, I once walked into a math class I had been basically skipping for a while only to find the professor out and a group assignment left for everyone to work on. I ended up having to argue with _everyone_ in that class (this despite my barely attending) that they were wrong. At one point one of the students claimed that what they were doing was how the professor had taught them to do it; I told them there was no way the professor taught them to do it that way, since it was wrong. I ended up being right.

At some point people have to be there to learn, not to be spoon-fed knowledge.


> I didn't read through the entire article, but I disagree with the overall premise.

Actually reading through the article would probably have helped, considering that the author lays out that [part of] the problem is with professors refusing to help and/or denigrating students for needing help during office hours.


I did read through the article, I didn't read through _all_ of the article.

There's a reason I said "overall premise". Perhaps "overarching premise" would have been better, but the point still stands.

At some point it's the student's responsibility to learn; it's the professor's responsibility to offer the opportunity, not to offer guarantees.

If a professor is not keeping office hours or denigrating students in any manner, that's something that needs to be taken up with the administration.


Self-learning and self-study are skills that are poorly taught or not taught at all. There should be classes for that. This would reduce students' reliance on classes to learn material while at the same time allowing universities to focus on teaching those who need help the most.

That said, there should be (separate) remedial classes for students to acquire that knowledge.


I think there's a fundamental philosophical disagreement between people who have experience in advanced science and math subjects and those who are part of society at large. Society at large operates under the philosophy that /anyone/ can learn /anything/ with enough support systems, resources, and instruction in place. Those who have experience in advanced topics realize that there is a very real intelligence and personality barrier to entry. If you don't have the intelligence and personality for these topics, the only way you will succeed is via pure grit, because the strongest motivations are intrinsic, not extrinsic.

There is a reason there are so many people who claim to be programmers but can't do a FizzBuzz, because colleges manage to churn out thousands of graduates every year who don't have the intelligence and personality to be successful in a technical subject, but are doing it for the money only. Competence is not /only/ trained, it is training on top of talent (intelligence and personality). It requires an above average intelligence to be successful in advanced technical fields, and statistically speaking /most/ people do not have above average intelligence. Trying to solve this problem is likely incompatible with reality.


I am not going to discount intelligence as somehow unimportant. After all, I basically skated through grade school without having study skills. This bit me in the ass when I actually did need to use study skills.

I believe most people are sufficiently intelligent to learn basically any skills and knowledge that they want. The question isn't whether they can do it; it's why it's so hard for them to do it.

Self-learning can be taught. So can emotional self-regulation. Those go hand in hand in finishing any project that you may desire.

Now why do colleges turn out graduates who can't program? It's probably that college isn't churning out programmers, but people who are taught computer science. That's why they can't do FizzBuzz: they lack skills in the fundamentals.


> Now why do colleges turn out graduates who can't program? It's probably that college isn't churning out programmers, but people who are taught computer science. That's why they can't do FizzBuzz: they lack skills in the fundamentals.

The answer is that people study for the test and passing enough tests is all you need to graduate.

There is a correlation between graduating and being able to program, but it's not a perfect correlation.

> I believe most people are sufficiently intelligent to learn basically any skills and knowledge that they want.

It's easy to see this is untrue. Replace intelligence with physical ability and it quickly becomes obvious that not everyone is going to be a great boxer or a great basketball player.

If you take away the requirement for success, then sure, anyone can do anything. But let's not take that requirement away, as that makes your statement not useful.


> If you take away the requirement for success, then sure, anyone can do anything.

Part of the disconnect between our positions and the position of society at large is that society at large sees receiving a college degree as success, or at least a strong sign of success. Whereas when people fluent in advanced technical topics describe someone as successful, they're referring to the ability to work in those topics fluently, i.e., to be their intellectual/professional peer. As you pointed out, there is a weak correlation between graduating college and actually being fluent in the topic, as it is commonplace for people to cram for a test and forget everything by the following week.

The core disconnect is that society at large interprets "anyone can get through a degree program in any topic with sufficient support systems and resources in place" as being the same thing as "anyone can be successful in any field or topic with sufficient support systems and resources in place". This conflation does a disservice to everyone, but perhaps most of all to the people who work in those fields who get stuck on teams where some colleagues are a net drag on productivity and actively drain their mental and emotional energy rather than contributing, because as a rule the folks actually doing the work aren't empowered to control who they work with, other than by choosing to quit.


It's the difference between HS football and college football (I was a college athlete).

The sheer speed difference between the two isn't something you can fully communicate to even a spectator, much less a complete layman, but it's there. Professional football takes it up another notch.

I once had a HS player tell me the state champions could beat many of the college teams in the state and I laughed my ass off. I'm sure you could find a college team that would lose to them, but in general? The speed difference is too great, the HS team would flat out not be prepared for it.

And it's the same with intellectual pursuits. There's a world of difference between being the best chess player in your HS and being the best chess player in your state (and even more of a difference if you consider the US).

But you know, those who are amazing at one thing sacrificed being decent or good at almost everything else.

I can't find the video, but someone once asked Feynman why he started painting when he got older. His response was that he went very deep into physics and math and became great at it, but he did so by sacrificing almost everything else. Painting was his attempt to learn something outside of his wheelhouse.


I don't think this is the main contention. It's about what people are fundamentally capable of achieving absent EQ and other factors and taking into account only IQ.

I am simply more optimistic.


> It's easy to see this is untrue. Replace intelligence with physical ability and it quickly becomes obvious that not everyone is going to be a great boxer or a great basketball player.

The standard is fuzzy in terms of what you mean by a 'great boxer' or a 'great basketball player'. I'll simply reply to that with a fuzzy answer: 'at least highly skilled'.

I believe most people can potentially be trained to run a marathon. By my standard, that's a high-level feat. Now, most people aren't actually capable of running a marathon right now, but my point is that it can be achieved.

Now you might see that as wildly optimistic, but as I explained, there are factors that prevent people from learning these skills, which can seem pessimistic. We are clearly not teaching in the most optimal fashion with the most optimal mindsets.


If it were fuzzy, we wouldn't call Mike Tyson one of the all-time greats, nor would Muhammad Ali be considered one of the greatest boxers of all time.

What you mean by fuzzy is not perfectly predictable, which is true. But the results of certain individuals speak for themselves and very few people will ever be able to do what Michael Jordan has done in basketball.

Indeed, the chances of making it into professional sports is so small most people are told to always have a backup plan.

So if you want a definition for "great" here, it's that people will look back and consider them great after evaluating them _after_ the fact.

Not everyone has the _potential_ to do what Michael Jordan did, not everyone has the _potential_ to do what Albert Einstein, Isaac Newton, et al, did.

This is obvious at the extremes; therefore, it's obvious that it also happens at the lesser extremes, with the difference being that the set of people able to achieve that level of success is larger than at the extremes.

Paradoxically, understanding this isn't rocket science, but it does require you to admit that people aren't merely blank canvases (of exactly the same material and size) to be written upon.


I would define my standard as something achievable by mere mortals rather than 'great', if we want to say that most people are capable of learning a skill or field of knowledge, say to the level of a PhD.

> Paradoxically, understanding this isn't rocket science, but it does require you to admit that people aren't merely blank canvases (of exactly the same material and size) to be written upon.

I wouldn't say they're blank canvases, but I would say that brains, however different they are, probably have the potential to reach a certain level of competency in any given field.

It's not like I am not acknowledging talent. On the contrary, I have experienced what 'high' talent looks like in at least one field of endeavor due to how my brain is configured.


This is a long-winded way of saying that if you lower the bar for what you consider success, more people can qualify for it.

While technically true, the observation is not useful for one very obvious reason. If you lower the bar too much, the skill itself is no longer useful.

The question is, can everyone learn anything to a useful degree, and the answer is no.


I believe they can. Competency/highly skilled is useful enough.


People believe lots of things that are untrue.

In this case it's born of the hubris of believing you yourself can do whatever you apply yourself to.


> Self-learning and self-study are skills that are poorly taught or not taught at all.

Stands to reason. Most jurisdictions restrict teaching positions to only those who have proven that they haven't figured out how to self-learn. If a self-learner has some desire to teach self-learning, there is little opportunity for them to do so without jumping through an array of pointless hoops that quickly diminish the desire.

Which places the onus on parents. Which is problematic, as they may not understand self-learning themselves, and even those who do are apt not to consider that it is a skill to teach, seeing self-learning itself as something to self-learn.


I don't disagree, but how do you teach something that requires curiosity to the incurious?

I don't think it's as simple as teaching techniques, there has to be motivation there.


Curiosity is secondary to learning objectives such as being able to get a job, or simply getting a good grade. Changing that would require restructuring our educational system and, by extension, our whole society. That's a tall order.


I don't think such escalation is necessary here.

The incurious don't learn at the same rate as the curious; we can acknowledge that without making wild claims about how the whole of society needs to be torn down and rebuilt.


Unfortunately, this doesn’t stop at school. Half of my work meetings feel like this. :(


Yep, as the article notes, the reason for this is simple: Universities don't employ teachers, they employ academics. Professors are pretty much universally researchers, not teachers.

Teachers know all these things: if most people in the class don't know how to use commas, you teach them that, and if they don't know how to debug, you teach them that. Teachers don't tell students to quit a subject because they don't already know all its fundamentals; they are there to teach. But academics aren't there to teach; they are there to find new academics to work with. And certainly it's a quicker and easier path to that goal to focus their time on students who already know everything.

But the article is spot on that this is a bad thing and a major hindrance to our society's ability to educate people. And it's also a hindrance to our ability to do research! To a person, all my professor friends hate teaching undergraduate courses, both because they know they are bad at it (they aren't teachers!) and because it is a huge distraction from their research (which is their actual profession!).

Most undergraduate courses, certainly for the first couple years, should be taught by people trained as educators, who also have advanced knowledge of the fundamentals of the subject they are teaching (which is generally not true of high school teachers, for instance). By the graduate level, most time should be spent on learning how to contribute to real academic research in the field, and professors are the right partner for that, for sure. The last couple years of undergrad should be a mix of these two, maybe by flipping the professor / teaching assistant relationship to instead have a primary teacher with a professor assistant.

Teaching material more advanced than the high school level is an important niche that we do an awful job of.


I think in part it's because the idea that programming is text and math-based is too ingrained in society.

For example, we talk about programming languages. But IMO there are also programming systems, such as Smalltalk [1]. I've programmed in it professionally for 2 years, and I'm currently looking for an engagement in a different language (a curiosity thing, also a resume thing).

I think Smalltalk has a lot to offer by switching the programmer's view toward thinking about programming systems rather than programming languages.

Moreover, programming systems are also not quite where it's at. One downside that Pharo in particular has is that the community is small. A lot of plugins/libraries that are a given in other languages aren't there! For some, however, this is a strength, because one gets to learn much better how to build stuff from the ground up and tinker on it by yourself. Given that there is still a lot of low-hanging fruit, it is easy to become a contributor.

But this part, whether a community is big or small, is why I think it's smarter to think about programming ecosystems, where a programming language or programming system is the central hub connecting the programming community together.

Why don't schools teach about programming communities? See my first sentence ;-)

[1] https://pharo.org - a modern Smalltalk


Programming is logic, language, and practical skill (like welding or riding a bike).

The logic is basic enough that you can get most of it in a single class. You can get a good bit of the language idea (since the syntaxes change) in class(es), but you can't be fluent without immersion. The practical skill (which is what people get paid for) can't be directly taught and requires significant personal effort practicing over a long period of time.

In my experience, most students fall on the practical hurdle.


Smalltalk is image-based, though, which means the number of things you can break is much bigger than in traditional systems, and you need to apply real effort to keep things consistent. Which would be especially hard for beginner students.

Take a classical C or Python programming assignment, for example: it all fits on a single screen or a few screens, and that's all there is (most assignments usually do not include third-party packages or persistence mechanisms like files/databases). When you run it, the state resets from scratch every time, and the run is normally deterministic. If you revert your file to a previous version, the program will function as before. You can show this file to someone and they'll have the full picture of what you have done.

Compare it to Smalltalk, where your program is spread throughout the system inside lots of tiny functions. You might have good code which does not work because there are objects created by previous versions. Or maybe you changed a system-provided function during debugging and accidentally broke it in the process.

("notebook" environments like Jupyter notebook have the similar problems. But at least there, you can tell a student: "please restart the kernel and reproduce the problem". Nothing so simple exists in smalltalk)


> especially hard for beginner students

Perhaps you don't know that 50 years ago Smalltalk was being taught to middle school and high school students?

> Nothing so simple exists in smalltalk

Have you actually used Smalltalk?

    ----

I will always remember a debug emergency call I received during one of our student classes. A girl was experimenting with numbers in Smalltalk-80, and suddenly her system froze when she tried to reframe a window. Although the UI was unresponsive, the "emergency evaluator" window was still operating so I was able to take a look at what had happened. Her window was trying to display itself, but was encountering an error because BitBlt, responsible for painting the border, had a width that was... wait a minute... a Fraction? We did not even have fractions in the student system, but she said, "Oh yes, I added a Fraction class".

Interestingly, BitBlt was written to be resilient, in that if it received a non-integer argument, it would call itself again after sending the message asInteger to that argument. However, her fractions did not have such a method, so the debugger had stopped at that point and was not even able to show the problem because of another such infraction (er, sorry). I asked whether she had a conversion to integer and she replied no, but that her fractions did have a makeFloat message. We talked about the problem, and I got her to suggest defining

    Fraction understands: 'asInteger' as: '↑ self makeFloat asInteger'
We were able to type this into the emergency evaluator, at which point the debugger miraculously displayed itself, the window she had tried to reframe reappeared, and everything in the system seemed to work again. Probably hundreds of methods in the system were now operating just fine with rectangles whose coordinates were instances of a student’s newly defined Fraction class. This illustrates the astounding ability of message-sending systems to absorb unanticipated constructions.

Daniel Ingalls "The Evolution of Smalltalk" 85:63

Proc. ACM Program. Lang., Vol. 4, No. HOPL, Article 85. Publication date: June 2020

https://dl.acm.org/doi/pdf/10.1145/3386335


I don't think "smalltalk was taught 50 years ago" is a good argument. When I was a kid we used MSX machines with no storage.. your program would be just gone at the end of the lesson (or if you messed your POKE's). If you wanted to save, you copied code to your notebook. Were we able to get things done as middle-school students? Yes. Would I recommend this environment to anyone else today? No.

And yes, I get it: if you have an on-call expert for student classes (preferably the principal architect of the system), then you can use Smalltalk. But I've never seen "on-call experts" at any time during university... the closest thing is that sometimes there are TAs during labs, but those TAs are just slightly older students.

What would that girl do if Daniel were not around? Would she have to reset the environment and lose all of her work?

(For comparison, later on we got PCs with MS-DOS. I've crashed those machines so many times with all sorts of crazy or stupid code.. and I never needed anyone's assistance to recover - hit RESET, wait for machine to reboot, reopen your files..)


Have you actually used Smalltalk?

> Would she have to reset the environment and lose all of her work?

Why would she lose her work?

https://cuis-smalltalk.github.io/TheCuisBook/The-Change-Log....


> My O-chem professor … told me to quit when I didn’t just understand it immediately… He wanted to appear helpful, but then acted resentful when I asked questions, “wasting his time” …

Stories like that really leap out at me these days. For the past few months, I’ve been experimenting with learning through interaction with GPT-4. While some subjects work better than others and GPT-4’s lack of personality can make the lessons a bit dry, it never gets annoyed at my questions or belittles my ignorance. I suspect more and more people will find that they prefer learning from chatbots to being taught by humans.

(Later) As one example of how I have been learning from GPT-4, below is a conversation I have had with it over the past few days:

https://gally.net/temp/20241017exchangewithGPT-4.pdf


The core problem, I think, is time. Everyone below C-level is packed full of work, whether in academia, government, or the private sector.

So, when someone comes to me with an issue and shows they have at least done the basic legwork, I'll be more likely to help them out than someone who just says "xyz doesn't work" but has done zero work on their own to troubleshoot it, because I don't have time to guide some junior on utterly basic stuff - and that is why so many job postings say "x years of experience" required: the existing staff is drowning and can't handle the enormous workload of training juniors.

Obviously that has negative mid- to long-term consequences, as when no juniors get trained you'll eventually run out of seniors, but I have no idea how to get this fixed, especially not at a societal level.


What makes you think that the C-level is not packed with work?


I've been doing the same. Sometimes chatting with the bot gives me something I can confirm, and the process feels like I'm learning more than I would had I been reading the textbook.

Sometimes the bot can't help, but by the time I come to that conclusion I'm now familiar enough to ask the professor in a way that doesn't feel like a waste of his time.

Your O-chem professor sounds like a jerk, but there probably is some merit, in general, to the idea that we can maximize the efficacy of teachers by thinking a bit harder about the questions first.


I read somewhere that "in the 40's and 50's they built computers for the computation, but it turned out the storage was almost more important.

Now we built AI for the cognition, but it will turn out that the patience is almost more important."


> but it will turn out that the patience is almost more important

You never used a 3D printer or other kind of slow robot, have you?

Computers are infinitely patient and relentless. That has huge implications.


ChatGPT will also teach you the important skills of "trust but verify" and "how to detect straight-faced lying".

I've experimented with asking GPT-4 technical questions, and the moment you have something moderately complex, you start occasionally getting outright incorrect information (library functions which do not exist, missing correct answers entirely, products of entirely wrong category, descriptions of completely wrong things...)


As a professor, I appreciated this article; the poignancy from the ending reminded me to reflect on how I choose to spend my time. I think the one nuance that’s missing is the concept of “prerequisites”.

Most courses at a university have them - other courses or concepts that are expected to be understood before the current material. While I completely agree that it’s almost always a good idea to review the prerequisites, there is a dark underbelly that goes unmentioned here.

I don’t know if this is just more prevalent at a smaller university, but I’ll occasionally encounter students in my (graduate level) machine learning class who are excited and passionate to learn the material, but who’ve chosen to forgo taking the two or three prerequisite courses. (Note that this is often undergrads who can’t fit the sequence into their schedule, rather than because of simple arrogance.) In these cases students often would like to spend a lot of my time outside of class learning the material they were supposed to know already. While that is a more efficient way for them to learn the material better, it takes time that I need to allocate to other things. (After all, it’s well known that 1-1 tutoring is the most successful form of teaching!) So usually at the beginning of class I’ll threaten and harangue to try to convince them not to do this.


On the second day of classes, after an overview of the courses, I teach students:

- How to use a debugger

- How to set up a Github account

A good computer programmer needs to be able to evaluate/simulate the code they are looking at in their head, and the debugger (and other visual tools) are excellent for developing that ability. This is particularly the case for students that aren't "naturals" at CS (I was not a natural.)

https://grugbrain.dev/#grug-on-tools

> grug always recommend new programmer learn available debugger very deeply, features like conditional break points, expression evaluation, stack navigation, etc teach new grug more about computer than university class often!
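For concreteness, here is what those features look like in Python's built-in pdb (a hypothetical session; the file name, line number, and variable names are made up):

    $ python -m pdb report.py
    (Pdb) break report.py:42, total > 1000
    (Pdb) continue
    (Pdb) p [row["qty"] for row in rows]
    (Pdb) up
    (Pdb) list

That is a conditional breakpoint that only triggers once total exceeds 1000, continuing until it fires, evaluating an arbitrary expression against the live frame, walking up the call stack, and listing the surrounding source. GDB, LLDB, and the IDE debuggers all have direct equivalents.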


There's a difference between general "debugging" and "using a debugger". In my experience, interactive debuggers are generally abused. Students need to learn to modularize and modularize and modularize, so that what they write can be unit tested, with logic errors detected by the tests. Tossing in quick print statements, recompiling, and rerunning ends up encouraging better design. When someone can't debug without an interactive debugger, that generally means the code has hit ball-of-spaghetti status.

This becomes especially helpful when writing heavily threaded code as interactive debuggers are almost entirely worthless when it comes to finding and fixing race conditions.

I find I use debuggers for post-crash analysis, whether the crash was unintentional during a program run or intentionally forced by me. Specific tests designed to crash the system in these cases can be cleaned up and added to the general test suite.
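A tiny, hypothetical sketch of the "modularize so the tests catch logic errors" approach described above (the function and tests are invented for illustration): pull the logic out into a small pure function so a plain unit test, rather than a debugger session, is what catches the mistake.

    import unittest

    def median(values):
        # Small, pure function factored out of a larger routine so it can be
        # exercised by tests in isolation.
        ordered = sorted(values)
        mid = len(ordered) // 2
        if len(ordered) % 2:
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2

    class MedianTest(unittest.TestCase):
        def test_odd_length(self):
            self.assertEqual(median([3, 1, 2]), 2)

        def test_even_length(self):
            # If this fails, a quick print(ordered, mid) inside median() is
            # usually enough to localize the error without a debugger.
            self.assertEqual(median([4, 1, 3, 2]), 2.5)

    if __name__ == "__main__":
        unittest.main()

When the code is factored like this, a failing test name already tells you roughly where to look, which is most of what an interactive debugger would have told you anyway.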


The reality is that some schools do and some don't. Like most things in life, it isn't black and white.

I took an intro to computing systems once and we spent all semester using GDB to debug homework. We even had some extra credit assignments that were simple CTF-like challenges, all within GDB.

Also, I'm pretty certain that if you take a binary exploitation class or similar you will learn to use debuggers and even more advanced tools.


My school did--at least it taught debugging.

In his memoir of his secondary school years, _What's to Become of the Boy: Or, Something to Do with Books_, Heinrich Böll talked about his older brother's method of tutoring him in Latin and mathematics: work back to sound knowledge and then build up from there. Böll writes that he used the technique in tutoring classmates as well.


The answer is no.

What you propose is teaching to the lowest common denominator, which is precisely what happens in the public (grade) school system in America, and is also precisely why our high school students are falling behind their global contemporaries.

A bachelor's in computer engineering indicates a certain minimum bar of proficiency. Those who cannot meet the bar will not earn the degree. A typical semester has twelve weeks of instruction, three hours per week. Spending time on remedial algebra comes at the cost of not covering a relevant curriculum topic. If each course did this, engineers would graduate with major gaps in knowledge.

This is why catering to the lowest common denominator can only lower the bar, and would result in engineers that can't problem-solve their way out of a paper bag, let alone compete globally.


There is a difference between teaching to the lowest common denominator and teaching for mastery. The latter requires making students aware of where their gaps are, even if they must address them on their own for time scheduling reasons. You don't have to spend hours on covering remedial algebra, but you can spend a few seconds reminding students that fluency with algebra is a prereq for the course, and they can't expect to pass unless they address that promptly.


You could have separate remedial classes for these students if they don't already exist.


Yup, and many universities, recognizing the gaps in public education, already do. But they are extra courses that one has to take in addition to the degree requirements. They make it the student's problem, not theirs, which is indeed how it should be.

A university isn't a place that trains a person. That's what colleges and vocational schools are for. Rather, a university is a place where resources are made available so that a person can learn for themselves.


> A university isn't a place that trains a person. That's what colleges and vocational schools are for. Rather, a university is a place where resources are made available so that a person can learn for themselves.

Hardly. YouTube already serves that function, but it doesn't teach or train, which is the university's job. The responsibility should be to set students up for success, not to unload a university's burden onto them.

It would be ideal for students to be self-supporting and self-teaching to the point of not needing teachers, but the ideal is for universities to offer the greatest resources, teachers, and tutors to overcome any difficulties students may have.


> What you propose is teaching to the lowest common denominator, which is precisely what happens in the public (grade) school system in America, and is also precisely why our high school students are falling behind their global contemporaries.

I don't see why schools shouldn't teach the 'lowest common denominator'. Perhaps they should be separated into their own classes?


> Perhaps they should be separated into their own classes?

This is called tracking in educational parlance, and it is largely detrimental to students and has all sorts of problems with exacerbating existing socio-economic issues the students come in with. It's usually better to have additional remedial classes to try to catch those students up, but by the time you're in college there's a certain level of self-starting expected.


It's been said here before, but university CS programs tend to expect the most crystallized knowledge of incoming students of any program. I did not code in my free time in high school nor did I have formal CS education in high school, and I always felt that my university CS courses weren't designed for me. They were designed primarily for the student who self-taught during high school.

As to why this is, I think it's the path of least resistance for the teaching staff and it seems to work so why change it?


They expect the kind of incoming skill an arts program does. Music or what have you.

But they don’t filter applicants the same way, which is kinda weird.


That's interesting. As someone who does a lot of programming in my free time and does (hopelessly simplistic) compsci at school, I'm avoiding CS degrees precisely because they assume you know nothing and move slowly. My info for this is the course structure and talking to current students. I'm in the UK.

When and where was this? Perhaps they have changed over time.


This was in the early 2000s in the US. I wouldn't be surprised if the programs have become more accessible over the years. CS was still pretty niche when I was taking the classes.


This article is not really about debugging, but I would like to add something about debugging, and how the author relates it to programming:

I recall one time I helped a relative diagnose and fix a problem with the plumbing at his cottage. My relative was so elated afterward because he enjoyed the evidence gathering, proposing a hypothesis, going down one path of investigation, crossing that off, going down another path, trying experiments, etc. That was all a new experience for him. I didn't want to ruin his excitement by saying what I was thinking: that it was exactly the same debugging I have done all day at work nearly every day for years, and so I did not share his excitement, although I always enjoy a good debugging session.

This is all to say that I don't think debugging should be seen as exclusively a programming skill. It's a general skill that can be applied to certain types of problem solving. I probably learned it initially by playing with toys like train sets and wooden blocks. Later, I would use it when repairing things around the home. The schools I attended never attempted to teach debugging. My parents never knew how to debug. I think it is a useful skill for any person to know, but maybe I am biased. If we wanted to, we could teach children how to do it with physical mechanisms, and it would probably be intuitive, and maybe even fun.


Definitely a good question, and one I wondered a lot when I worked in the field. There's probably something satisfying to certain teachers watching a bunch of reasonably smart kids fail, especially the ones who used to skate through with intuitive understanding of everything until they get to that one class where they might have to apply themselves. There is also a type of student that traditionally tries all combinations of tricks they learned until something gets the green light and these students are probably annoying in office hours, where the TA or professor has to try to understand a very foreign way of approaching problems to tell the student what's wrong, just for the student to add something to their bag of tricks instead of learning the more general principle they are missing. Those students often fail weed-out classes. There may be some students who cannot learn to generalize properly, I do not know.

I like what is being asked by the OP. Why do we see it as a benefit to gatekeep and weed out rather than educate? If you get to first year CS or EE without a complete understanding of basic algebra, or first year creative writing without comma-use skills, you have been failed by your previous education. A remedial approach will hopefully get a bunch of students back on level. However, if you get through CS without debugging skills or creative writing without being able to edit and improve a piece, then your university failed you. Even those educated with a focus on getting it right the first time will still produce products that need debugging or editing.


Great title; it really exposes who didn't even bother reading and just goes straight to typing words.

If you're wondering: the article (and the original article by Dan) is not talking about software debugging tools or techniques. Not at all. (Systematic) Debugging here is used in a loose meaning: the process of (re)reviewing information at hand and finding gaps and errors in your understanding — "to start at the symptom of a problem and trace backwards to find the source"


Yeah, I thought it was a very clear and thought-provoking article about many academics' inability/unwillingness to teach, and then the comments were all just people talking about literal debuggers.


Some things can actually be taught... some can't.

To program something, you need two things:

- to learn the language (syntax, rules, specifics)

- to break up the real-world problem into smaller step-by-step blocks, that can then be written in a chosen programming language.

The first part (language) is simple and can be taught in schools. But some people are just unable to do enough mental abstract processing to do the second part.

I've intentionally chosen programming as a first example (because of where we are), but this is true for human languages too: the rules of the language are simple, but articulating yourself and, in a more extreme example, writing a book require more than that. Same for woodworking... saw, drill, chisel, router, sander... simple. Being able to mentally transform wood into something useful in your head is hard.

I have no problems with coding pretty much anything, given enough resources (time), but writing a short story (and other texts) in school was always a pain that involved a lot of counting words (back in my time, written by hand on a piece of paper) and "oh shit, I need two more paragraphs of something".


> But some people are just unable to do enough mental abstract processing to do the second part.

Even accepting that this is true, not teaching it because some people may not be able to learn it and instead leaving everybody to figure it out for themselves, some of whom might have figured it out with help, is really fucking shitty gatekeeping behaviour and you should be embarrassed for defending it.


Gatekeeping means limiting access to something. I never said that we should forbid kids from learning programming.

What i was saying, and am saying again is, that programming is a thing that will be done by a small percentage of students, and having a mandatory course in that subject is useless, because those who can't do programming will only "suffer", and those who can, won't learn much, because otherwise the majority would fail the class. Make it an elective, sure.. Mandatory? No.

A class about basic finances? That should be mandatory. Everyone will get a job, work, pay taxes, save for stuff, get a loan, have unexpected expenditures, etc. Everyone should know at least the basics. A general computing class (internet, emails, documents and basic editing, etc.), sure. Coding, no.


To add on to this, in college I knew a woman who could read code just fine but could not write it. It was really odd, but I paired with her (she was 2 years ahead of me) and basically had to write the entire thing myself.

But once I was done she understood it perfectly, to your point about abstract mental processing, she just couldn't come up with the algorithms on her own.


The opportunity to debug something is something everyone will encounter at some point. Not necessarily with respect to code, but the debugging process applies to all things.

Surely that is a more useful use of time than a dedicated basic finance class, a topic that is already thoroughly covered in the standard math curriculum?


> Gatekeeping means limiting access to something. I never said that we should forbid kids from learning programming.

Not giving people the training that is required to be a good programmer is gatekeeping.

Not having a mandatory programming course is gatekeeping. It prevents people who don't have exposure to programming outside of school from even contemplating that career, mostly people already disadvantaged.


It's a cursed problem.

As mentioned elsewhere, immersion in the actual problem domain is the only way you can deeply learn certain things. You can create synthetic problems for students to solve all day, but these won't move the needle on many of the fundamentals.

Most of my debugging expertise came out of working at Samsung's semiconductor plant as a systems engineer for a few years. Nothing else I experienced comes close in terms of rate of learning. By week 4 they have beaten most of the novice assumptions out of you. It is like a vocational program for engineers that does exactly what the author of this article is looking for. The skills I learned there I use literally every day: root cause analysis, asking why a bunch of times, suppressing my urge to jump on a simple answer, coordinating issues between multiple teams & vendors, etc.


Schools are only partially made to teach. They are also serving the interests of the people running them, and of course, they are filters.

What's more, many teachers actually don't produce anything with what they teach, and have a curriculum forced on them by people making it up on a theoretical basis.

The result is that half of the professional training I give has to cover those holes. Having to teach debuggers to pro devs is weird but a reality.

In fact, shameless plug, but this is why I have articles dedicated to pdb, venv and pip on http://bitecode.dev.

Because the demand is very high.


They were also made to train a workforce for industrial production. Some part of the school curriculum is detrimental to today's tasks.


I remember a lesson from a literature course titled "How I Learnt to Swim". The author had read books, watched the experts and got tips from them, but until he floated in the water he had not even begun to learn swimming. This is the issue with academia too. Too much of the stuff, be it electronics, theory of computation, algorithms or software engineering, is taught before most learners have even had a taste of programming. No wonder very few learn much from school.


Counterpoint: My CS program had a class specifically for low-level programming that required massive amounts of debugging C code using GDB. That same program also required students to take an OS class in C where the knowledge from the low-level programming class came in very handy.

For higher level languages I don't think much of an introduction is warranted since it's just using tools provided to you by the IDE and should be left to the students to figure out.


I have found that many things are learnable, but they have to be taught in a way that you can understand. Most teachers only know one or two ways to explain something. Additionally, they just don't have the time to tailor materials to people. I get really skeptical when people say they can't learn things or have trouble with something. People start acting like you need to be a genius to understand how a comma works or to know algebra. I would feel bad as a teacher if I could not explain to an 18 year old how a comma works. It's not that the student is bad; the teacher is bad.

The more time I have spent trying to teach things to people, the more I think it's mostly a failure of the teacher to be able to explain things in a way that people can understand.

You listen to them explain what they are having trouble with, and it seems like they just had bad/incomplete information which makes the process 10x harder. Many people are only really going to learn things if you explain it in X terms, where X is something they really like and think about all the time.

Edit:

I see complaining about dumb/bad students like comedians complaining that the crowd is bad and does not like their jokes. Maybe you just need better jokes, because there are comedians that can get that crowd to laugh.


I'm not sure that all the examples in the article really demonstrate the same thing. Debugging is not the same thing as knowing the rules for how to use commas. Students who are behind on basic punctuation and grammar can be put in remedial classes -- and often are, that's what happened at my university. But debugging seems very different. The condition is "I wrote a program, but it doesn't work and I don't know why," and I don't know if teaching a set of debugging skills or techniques will help a student who can't even form a hypothesis of why their program is broken.

The bigger question is about the teaching of fundamentals generally. Of course schools should teach them, but it is fair for classes to have prerequisites and to expect students to have met them on arrival to a class. It is unfair to the students who have done the work to slow down the class for stragglers who haven't met the prerequisites. Those skills should be taught in a different class that the students take first.

As for the chem professor who didn't want to help the student during office hours, he sounds like a professor who doesn't really like to teach. This is an endemic problem of major research universities that is probably really at the heart of the main question of the article, but which won't be solved by structural changes to curricula, but rather by changes to the incentive structure of university faculty. Professors at major research universities (in the US, at least) aren't incented to teach well, nor are they selected for their teaching ability. (They are incented to get good student feedback evaluations, but that's hardly the same thing.)


The hard truth is: some teachers don't know the material (and surrounding skills) as well as they should. They are embarrassed to have students inadvertently point this out with questions and would rather have that embarrassing "problem" just "go away" instead of learning to be a better teacher.

I say this as the son of two teachers who enjoyed their craft. Boy did I hear a lot of stories growing up ...


From my perspective in IT tech support, I didn't take any programming/coding classes, but for our CompTIA A+ preparation, we did learn the essential steps of troubleshooting any system. In fact, I prepared a custom presentation to demonstrate such a troubleshooting session as my final project.

I thought that this troubleshooting process was indispensable to anyone at a Help Desk or in an IT Support role. Unfortunately, on the front lines you are usually not able to choose a course of action, but you must stick to a script.

That is why Tech Support gets such a bad reputation for asking people to turn it off and back on again, to clear their cookies and cache, and other stupid busywork, because the Support rep needs to recite the script and follow it slavishly, rather than following an actual troubleshooting process, which takes intelligence, a lot of discernment, and usually a lot of cooperation from the user.


A surprising number of responses here seem to be arguing that universities aren't intended to educate, they are intended to provide a series of tests or challenges to prove an education.

I agree with Jake Seliger, I think universities would be more useful if they were trying to optimize for education.


But then they wouldn't be able to make as much money.


I have noticed that data structures are foreign to many less experienced developers with a computer science education. That blows my mind, but then I am self taught.

For example consider the following:

    var1[var2[index][index1]][0]
It’s just a combination of arrays and objects, but it may as well be its own language. The practical problem then is that many developers cannot navigate larger structures that come over a network, cannot recursively walk a file system, cannot traverse a DOM, and cannot do so much more. To me, the person without the computer science education, it looks like their education was a waste of time and money. Understanding data structures is among the most important of all skills in programming.
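
For concreteness, all of that "navigation" usually reduces to a few lines of recursion. Here is a hypothetical Python sketch (the names and payload are made up), whether the nested structure came over the network or off the file system:

    import os

    def walk_json(node, path=()):
        # Recursively visit every leaf of a nested dict/list structure.
        if isinstance(node, dict):
            for key, child in node.items():
                yield from walk_json(child, path + (key,))
        elif isinstance(node, list):
            for i, child in enumerate(node):
                yield from walk_json(child, path + (i,))
        else:
            yield path, node

    def walk_tree(root):
        # Recursively list every file under a directory.
        for entry in os.scandir(root):
            if entry.is_dir(follow_symlinks=False):
                yield from walk_tree(entry.path)
            else:
                yield entry.path

    payload = {"users": [{"name": "a"}, {"name": "b"}]}
    for path, value in walk_json(payload):
        print(path, value)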


My university taught all of this stuff in CS101, and it was reinforced throughout the curriculum.


My systems programming teacher spent an entire week of a 12-week semester (3 classes) going over how to use gdb/vim for one of our projects.

Still my favorite class and professor from my time at school.


I’m not sure about his insistence on teaching comma rules in class. I don’t think knowing comma rules is fundamental. And anyone who wants to learn grammar can look it up. The information is freely available online.

But I agree that education is not about teaching, it’s about creating jobs for academic professionals. Money is the most fundamental topic in our society. But I don’t remember ever being in a classroom where how money works was explained.


> I’m not sure about his insistence on teaching comma rules in class. I don’t think knowing comma rules is fundamental. And anyone who wants to learn grammar can look it up. The information is freely available online.

It's explained in TFA: you teach it, taking a moment to ensure everyone is properly exposed to the knowledge. Why do people continually act surprised when something that is supposed to have been taught simply isn't?


Teaching how to debug is craft. Teaching how debuggers work can be science (human computer interaction, abstraction, state machines...). Teaching craft is vocational training. Teaching science is university-level stuff.

When I hire developers fresh from university, I hope they have a solid foundation in how things work. I want them to know the science. I'll train them on craft.


I don't know how it works in the USA, but here in Poland you earn like 4 times more as a developer compared to a computer science PhD at a university. The incentives aren't exactly healthy. And the programming culture is basically: learn it yourself, go to university so you have a paper proving you can do it.


University provides much more than paper.

What many don't seem to understand is that it is the student's responsibility to learn. The best lecturer in the world can't teach you anything if you don't pay attention, or if you don't have the prerequisite knowledge and understanding. I could attend a lecture on advanced neuroscience and I would be left with pretty much nothing because I have no idea about any of that stuff. In order to gain from it I would need a solid base of knowledge on which the lecturer could build.

So university students fall on a spectrum. Some are really interested, they started writing code on their own before they even enrolled. They challenge themselves and do their own projects for fun and learning. Others don't care, they do as little as they can get away with.

A trick I picked up that I found very helpful was to study before each lecture. They would publish a list of all the lectures and which chapter the lecture was about, so I would study the chapter before the lecture. This allowed me to much better understand the lecture, and I felt like it really helped me gain value from the lectures and learn the material.

My point is nobody can put knowledge in your head but yourself. My experience was that the people complaining most about our studies and lecturers were the people who didn't take responsibility for their own learning. If a lecturer sucked I just skipped the lectures and read the book instead, I didn't care. I don't need a lecturer. Good lectures are just a bonus, they're not supposed to be your main source of learning. You're supposed to study on your own time, even when nobody has given you a specific task and deadline. If you do that, university is a great way to guide your learning and keep you going - and get a paper proving you have put the time in.


> and get a paper proving you have put the time in.

Are you apt to forget that you put time into it?


If I'm hiring a stranger for my company it's nice to have an esteemed institution willing to officially vouch for them having some baseline of knowledge in the field.


While there are always exceptions, the esteemed institutions in question are generally not willing to vouch for some baseline vocational knowledge in the field. In fact, that's what the entire discussion is about, with debugging being a prime example of baseline knowledge they are often not willing to vouch for. Their esteem does not stem from that sort of offer.

No doubt you keep on top of who the exceptions are, and ensure they stay that way on a continual basis; however, that is going to be way more work in the end. More power to you if that's what enthrals you, but that's not work I would consider "nice". That sounds like drudgery to me...

...and most everyone else, it seems. Indeed, there were those couple of years where this notion of yours made it into the mainstream, but it disappeared as quickly as it came.


I'm not saying universities are perfect. I'm just saying a university education is useful, or at least that it was to me and I think it will be to anyone who puts effort into it.

I have actually remarked to colleagues about this exact topic, I think it's strange that debugging was a topic I had to learn on my own and teach to struggling students when I was a TA.

The flip side of that is that debugging is easy. It is definitely a skill that you can develop over time but learning the basics and getting started debugging your code is something you can do in less than an hour.

This is a minor criticism, not a justification for declaring universities useless. I learned tons of other useful stuff. I learned multiple programming languages, math, DSA, databases, networking, all kinds of things I use on a frequent basis.

Whenever I hear people say they didn't learn anything useful in university, I think that says a lot more about them than it does about the university.


> I'm just saying a university education is useful

Sure. I expect there is no activity you can do in life that isn't useful. But the specific claim was that universities officially vouch for a baseline knowledge for the sake of corporate hiring interest. But the larger discussion is about how universities by and large do not vouch for such things.

And why would they? That is decidedly not the business they are in. In fact, if you found yourself hiring someone out of an esteem institution who still lacks those baseline skills and you tried suing the institution for false representation I expect you would be laughed out of the courts as there is no such promise actually made.

> It is definitely a skill that you can develop over time but learning the basics and getting started debugging your code is something you can do in less than an hour.

Sure. There is no programming-related skill that you can't start with in less than an hour. The only thing that really separates a great developer from a beginner is practice, practice, practice. Same with everything in life, really. You can learn what you need to know to play baseball in less than an hour, but it is still a long road to the major leagues.

However, an expectation of baseline knowledge already being present presumes that the hours were already put in. If we accept "it is easy to get started" as good enough, then who cares about the baseline knowledge?

> Whenever I hear people say they didn't learn anything useful in university i think that says a lot more about them than it does about the university.

From my vantage point it says most about neither, but about the one who has not accurately read between the lines. But anyway...


They vouch that a person has attended the university and passed the required exams with the listed results.


I learned debugging (with the VS IDE and .NET tools) on my first job. School never taught me anything about debugging. Certainly not how to use the tools, but more critically never taught me: how to use binary divide techniques to track down which change caused a bug, how to use deductive reasoning to narrow down what can't be the problem, and even how to carefully read error messages to pinpoint the issue.
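
The "binary divide" part, at least, is teachable in a few lines. Here is a hypothetical Python sketch of the idea; is_buggy stands in for whatever check reproduces the failure, and in practice git bisect automates exactly this over commits.

    def first_bad_change(changes, is_buggy):
        # Binary-search an ordered list of changes for the first one that broke
        # things. Assumes changes[0] is known good, changes[-1] is known bad,
        # and that once the bug appears it stays (the same assumptions git
        # bisect makes). Needs only about log2(len(changes)) checks.
        lo, hi = 0, len(changes) - 1
        while hi - lo > 1:
            mid = (lo + hi) // 2
            if is_buggy(changes[mid]):
                hi = mid   # bug already present at mid
            else:
                lo = mid   # still good at mid
        return changes[hi]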

Schools teach theory, and sometimes they teach some versions of practical job skills. But it's pretty hard for a for-profit educational institution to make good decisions regarding education when students are the product and for-profit businesses are the actual customers. You end up maximizing recruiters' ideas of what makes a good worker and makes them marketable products, rather than what other working people have experienced as being important job skills. People say software devs need stronger unions but what we really need is to bring back guilds.


The sentiment from most of the replies, that others have to learn the way you did and that there's not an accelerated way to learn a topic through explicit practice, is deeply exhausting.

Will people learn on their own? Yes. Is there a better or faster way possible? Hopefully.


I think this is probably dependent on the college/university in question and is not necessarily universal. The author is right to think it's a problem, and I think many of the examples show weaknesses in those professors or departments. Debugging strikes me as a much more complicated skill and process than using commas or many other writing skills. Using commas is largely memorizing and internalizing the rules. They are much simpler than debugging, and fixing comma usage in prose is a one-person job; it does not necessarily require multiple people brainstorming like debugging can.

I went to Rensselaer, where our CS department was more closely aligned with the Math department, unlike some schools where CS is more closely aligned with the engineering school and the Computer Systems or EE programs. So we did have a very high load of required theory classes where practice like debugging was not relevant.

But we had a parallel track of required classes which were much more hands on along with a track of elective courses, and there was ample opportunity in those classes for learning debugging. We were often in a lab sitting at the computer working on assignments with the professor + grad students circulating helping with problems, and this is where hands on debugging was taught.

I suppose I am not a good example though either. I'm basically a second-generation software engineer. I got my first lessons in debugging with my father when I was 10 years old and I had made errors typing BASIC program listings in from books or magazines I'd grabbed at the library. So I had a solid 8 years of struggling with debugging before I ever got to college. My debugging skills were tested because in a lot of cases I didn't yet understand important programming concepts needed to solve a problem and I was then trying to solve the problem with a limited set of tools. (E.g., not understanding dynamic memory until I was around 18, so I did lots of things with only static allocation of memory, making some problems much harder.) In any case it became a pleasant and unpleasant skill. I can often debug problems so quickly that managers figure out I might be better to look at a problem than the person who wrote the module that is creating an emergency. I get credit, but I always feel like I'm being punished.

I can agree though, this is not something to be taught in a big lecture hall with 100+ students in the class as that certainly would derail the class. It's for smaller breakout sessions where the teacher:student ratio is much smaller. If breakout sessions aren't a thing maybe the program isn't that great.


"Coding in the Debugger" KENT BECK 2007

https://tidyfirst.substack.com/p/coding-in-the-debugger


Learning unit testing as one of the first things to start with, and then having that always be a part of every future programming class would be something I'd choose.


Putting the "why" aside, debugging is an important skill that is not taught in school.

What kind of startup could fill that need and become a sustainable business?


Depends on the quality of the university; I surely had it as part of my software engineering degree, not that deep, but enough.


US public schools are still teaching that the digit 0 comes after 9.

Why? Because that's what the teachers were taught...


Debugging is a computer science fundamental in the same way that grinding lenses is an astronomy fundamental: it's really not.

I did learn debugging in school. I came to learn Computer Science and left with a head full of Software Engineering instead. I wonder, if I find one of these schools which doesn't teach debugging, whether they'll actually teach me some Computer Science.


Most students at universities are not there to study computer science, they are there to study software engineering. A relatively small number of people are there to study academic computer science, and that's great too, it's just not the predominant interest. It's also good for the software engineering focused folks to learn some computer science and vice versa, but it's weird to focus on preparing people for a career in academic computer science research when their goal is to work at Facebook or whatever.

This is not new. Mechanical engineers learn a bunch of physics, but their courses prepare them for a career as a mechanical engineer, not a career doing academic research into physics.


Yes but those students are enrolled in a program called "mechanical engineering" so it makes sense that it would be the focus and not physics.

I'm not sure how to encourage this change, but if we want educators to take software engineering seriously then I think we ought to disentangle it from computer science.


Yep, this is a very fair point. My opinion is really, "we're doing this all wrong", but I also don't have any idea how to redirect the ship.

My university had the CS program under the engineering department, and I think it served me very well as someone planning on a career building software, but I doubt it served the people hoping to do actual computer science research nearly as well. And I doubt they would have known that when they signed up for the program. It was called "computer science", after all, so to your point, it seems reasonable to assume that's what it would prepare one for.


Why don't schools teach so many things? Why doesn't high school teach you how to balance your budget? Questions like these spawn many conspiracy theories.

But the answer is, they don't know.

Teachers usually don't control what they are teaching. Go up the chain, and the person who did pick what they are teaching is balancing so many things and trying to tighten everything as much as possible, so some things don't make it into the curriculum.


Nitpick: Epstein's book is called "Range: _Why_ Generalists Triumph in a Specialized World", not "Range: How Generalists Triumph in a Specialized World"

[0]: https://a.co/d/g0y80gW


>Epstein's book is called "Range: _Why_ Generalists Triumph in a Specialized World", not "Range: How

It looks like the blog author remembered the older title that had the word "How" but linked the url for the newer title that changed it to "Why". Previous title with "_How_" : https://www.amazon.com/Range-Generalists-Triumph-Specialized...


You have to read Dan Luu's post to understand the meaning of "debugging" - https://danluu.com/teach-debugging/

Everyone learns software debugging in high school (if you do programming) and university, so I'm not sure why people are agreeing with the premise based on this mistaken definition.

I don't agree with Dan Luu; he seems to wave his hands and say it should be taught, but that's exactly what happened to him: that is traditional university. I'm not exactly sure how you'd learn it otherwise.

His comment re systematic debugging - "It takes, at most, half an hour to teach the absolute basics" is somewhat embarrassing.

Then he moves on to "fundamental skills", which is a different topic.


I really think you misunderstood what he wrote

> ...he seems to wave his hands and say it should be taught, but that's exactly what happened to him, that is traditional university

That's not true; he did experience the traditional university, but that's really not how he said it should be taught.

> I'm not exactly sure how you'd learn it otherwise.

By teaching differently (what he calls systematic debugging): figure out why students fail to grasp a concept, and fill in whatever gap of knowledge is causing it. Teach students how to perform this process on their own. Instead of throwing in the towel and saying they won't hack their way through the rest of the course, i.e. the traditional university way.


I guess they want to teach you generic skills you can use everywhere. Then you select a few languages, write programs with those and learn the debugging tools that come with each (considering those debuggers as tools specific to each language - implementation details)?



