> "We point out that programming teaching is useless for those who are bound to fail and pointless for those who are certain to succeed."
Maybe colleges would stop forcing people to take intro programming courses if they believed that, and would skip straight to the more interesting language design courses.
There is a quote like this in the Feynman Lectures. It's nice for faculty to believe: takes the pressure off of us to teach. You're either born with it, or go away.
In physics, now, we know better. We can teach it. I'm sure they'll figure it out in programming, too.
When I was a grad student, we had group finals for calculus (i.e., standards uniform across all professors/TAs). We did not observe any statistically significant difference in grades between professors or (professor, TA) 2-tuples.
We were planning to use grades as a better way to evaluate teaching quality. When it didn't work, the professors (1) decided to go back to student evaluations rather than admit their job didn't matter.
(1) The people in charge of undergrad teaching were faculty not actively doing research, but who did have an interest in teaching well.
> From experience it appears that there are three major semantic hurdles which trip up novice imperative programmers. In order they are:
>
> - assignment and sequence
> - recursion / iteration
> - concurrency
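For reference, the first hurdle is the one the paper's test actually probes: the questions are multiple-choice reads of short assignment sequences, roughly of this shape (rendered here in Python for readability; the paper's items are Java-like):

```python
a = 10
b = 20
a = b
# After these three statements run in order: a == 20, b == 20.
# Novices without a consistent model of assignment often answer
# "a == 10, b == 20" (nothing changed) or "a == 20, b == 10" (a swap).
```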
I don't know about the rest of the paper, but I can anecdotally but strongly confirm these observations from trying to help a bunch of people through programming courses at university - luckily "only" maths & physics students, not future programmers. A lot of people were just completely baffled by assignment and iteration. (I don't think they got as far as recursion, and concurrency would have broken their brains; presumably not for lack of intelligence, since these were people who were doing well in maths & physics.)
I wonder if programming courses would be better if they were explicitly built around those hurdles, taking them one at a time. Courses I've attended (had to attend) or observed always struck me as wrong: they confused people about syntax versus semantics, and rushed through assignment, presumably because "it's only an equals sign". If you don't understand assignment, a 'for' loop must make no sense at all.
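Concretely (a toy Python sketch of my own, not from any course): a 'for' loop is just assignment and sequence in disguise, so a student who hasn't internalised assignment has no chance with it.

```python
# a counting loop...
total = 0
for i in range(4):
    total = total + i

# ...and the assignment-and-sequence machinery it hides:
total = 0
i = 0
while i < 4:
    total = total + i  # rebind total to its old value plus i
    i = i + 1          # rebind the counter
# both versions end with total == 6
```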
I don't know, maybe courses elsewhere are better. Maybe I'm not a good judge of course quality, as I attended all of them when I was already a competent programmer. (damn you, compulsory modules!)
Hm, I still remember how it was when I tried to understand iteration and recursion.
Of course I understood the concept of doing something x times with an increasing counter, but I couldn't see a clear relation between what I wrote and what it did. It seemed a bit like magic, and I wanted to "truly understand" it before mapping the concept to syntax. It felt too arbitrary to rely on such magic in my thinking.
My strategy was to unloop every loop I saw. It was ridiculous: I didn't have enough mental resources to keep even the smaller loops in mind, and I quickly forgot them. After a few days I just gave up and committed the pattern to memory. Only then did I feel that I truly understood.
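To show what I mean by unlooping (a made-up Python example; back then it was a different language):

```python
# the loop...
for i in range(3):
    print(i * i)

# ...written out pass by pass, the way I did it on paper:
print(0 * 0)  # i = 0
print(1 * 1)  # i = 1
print(2 * 2)  # i = 2
```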
I think I tried programming while skipping this step, but I just got irritated and genuinely unsure whether what I was doing made any sense. I remember it being the same with recursion, except a bit more mind-boggling. Now, every time I try to understand something similar, I reduce it to the simplest case and just commit it.
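With recursion, "reduce to the simplest case" looks like this for me (again a toy Python example):

```python
def factorial(n):
    if n == 0:            # the simplest case: commit this, the rest unfolds
        return 1
    return n * factorial(n - 1)

# factorial(3) unfolds to 3 * (2 * (1 * factorial(0))) == 6
```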
I suppose other people may be unaware of what is actually keeping them from understanding and never get that far.
Yep, I think the step of writing everything out without loops, coming to the realisation that you're duplicating a ton of code, and figuring out loops, is a crucial one. I only vaguely remember taking it when I was about 10, but I'm pretty sure I did exactly the same thing as you.
I had a similar transition from just writing all my code in one big block for a while to factoring stuff out into functions. And then putting stuff into data structures rather than primitive variables and arrays. Then fully understanding the stack. And pointers.
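A toy Python illustration of that first transition, invented for this comment:

```python
# one big block: the same computation duplicated inline
xs = [1.0, 2.0, 3.0]
ys = [4.0, 5.0]
xs_mean = sum(xs) / len(xs)
ys_mean = sum(ys) / len(ys)

# factored out: the duplication gets a name
def mean(values):
    return sum(values) / len(values)

xs_mean = mean(xs)
ys_mean = mean(ys)
```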
Oddly enough I never had any trouble with recursion, although I have to say I probably encountered it somewhat later than iteration, at which point I was already very comfortable with the basics. (I only had access to very basic programming books and tools early on - internet? forget it) I guess I had the freedom of having no direction whatsoever and learning this stuff really slowly, something you don't have when taking a course at university. The optimal path is somewhere in between, for sure. I've done a fair bit of un-learning.
I wonder if this is harder today - the temptation of copy & paste coding is there. I've never been especially fast at typing, so typing more than a couple of lines from a book was probably harder than figuring out how the code in the book worked and how to adapt it without much trial and error. Cue comment from someone who learned to program with punch cards. (or twiddling switches...in the freezing cold...at the top of a mountain)
Best part: "There is a test for programming aptitude, or at least for success in a first programming course. We have speculated on the reasons for its success, but in truth we don't understand how it works any more than you do."
I've always had the thought that first-time students should just be given a REPL and follow along line by line with the instructor. Assignment would be so obvious if one could just type the variable and have it spit out the value. Teach good debugging first, and kids will be able to teach themselves the rest, is what I say.
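For the assignment point, something like this Python session (just a sketch; any language with a REPL would do):

```python
>>> a = 10
>>> b = a    # b takes a copy of a's current value
>>> a = 20   # rebinding a afterwards doesn't touch b
>>> a
20
>>> b
10
```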
I think that your causation direction is reversed. Good debugging requires solid understanding of the syntax and behavior first. If you don't have it, then you're just flailing at the tool. I see people do this all the time: using debuggers to try to get information they could get trivially by reading the code, or randomly changing things in the code and retesting instead of carefully watching something in a debugger.
This was how most of my professors taught debugging, actually. Print statements. Reading the code is great, but after a while it really does become hard to see where things go awry, especially in your own code. I think teaching students the basic concepts of debugging (breakpoints, stepping, etc.) will 1) clearly illustrate how bad the print-statement approach is and 2) become annoying enough that it makes them better planners and code readers.
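For instance, instead of sprinkling prints (a minimal Python sketch of my own; breakpoint() drops into pdb, where 'n' steps one line and 'p expr' prints a value):

```python
def running_total(values):
    total = 0
    for v in values:
        breakpoint()  # pauses here each pass: try 'p v', 'p total', then 'n'
        total += v
    return total

print(running_total([1, 2, 3]))
```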
> ... the moderately successful perhaps are the software engineers, those who can program but can’t imagine that they will ever enjoy it, and are content to submit to management discipline and to be drowned in UML (ugh!).
At least I know I'm not alone in thinking that the best programmers are those who simply love what they are doing. However sentimental this may sound, there's nothing else that distinguishes outstanding engineers from merely good ones; nothing else predicts the most important career dilemma in every programmer's life: to be a coder forever, or to abandon coding for management and... UML.
And thanks for the post, enjoyed reading it very much.
Some people aren't cut out for the particular style of thought needed to write algorithms in a programming language, yes. Or at least some people require disproportionately more instruction to succeed and thus are likely to always fail in a college intro programming course.
Agreed. Some people just aren't as effective at abstract symbol manipulation. I've found that people who can pick up programming languages can pick up human languages too, and vice versa.
This doesn't mean that they can't learn programming. It means that they would be much better off focusing on their strengths instead. There's a difference between "can't do it" and "fails course" as you rightly suggest.
Actually, these test results could not be reproduced, even by the authors. So while the test was predictive for the group the paper was written about, unfortunately it isn't any use outside of that.
I remember how I learned to program. My parents had a Timex Sinclair with no tape drive and 4K of RAM, so if you wanted to use a program, you had to type it in, in BASIC, from scratch. The computer came with a book of programs. I just looked at the programs, figured out what they did, and started writing my own modifications. I think I was around 5 years old.
I guess I must have just been "born rational", if you're going to equate programming ability with rationality.
But what makes you think that the "programming gear" I had - the quality of a born hacker, by which the first time we look at any piece of code, anywhere, any time in our life, at any age, we instantly know how to program - is equivalent to "rationality"? Or even "intelligence"?
There are physicists who cannot learn to program, apparently. This shocks me. But if I can learn to program at age five without instruction, and a physicist cannot learn to program with instruction, that makes it pretty darn plausible to me that yes, there is a "programming gear".
PS: How the hell can you be a physicist and not be able to write computer programs? WTF, human brain?
It's easier to learn programming when you're a kid. It's easier to learn natural languages when you're a kid. How long would it take an adult to learn a second natural language to conversational fluency? Some people can do it in six months or less. Some will never achieve it.
Maybe someone who fails this test won't pass a class their first time through, but I find it hard to believe that any test can determine that someone, given time and practice, cannot be a programmer.
Oh, and the pdf link: http://www.cs.mdx.ac.uk/research/PhDArea/saeed/paper1.pdf