As a computer science graduate student, I am always surprised by how rarely my peers seem to know or care about the history of our field. I doubt many of them would write papers about computer science history even if the incentives were better.
I think it is somehow related to the power of computer science to change the human condition. Everyone is thinking about the future. Mathematicians also crave novelty, but I don't think they feel "my work could change the world" in the same way as CS researchers.
Learning about CS history would make us better researchers, and thus more likely to change the world, but that line of motivation is not direct enough to excite people. There is still so much low-hanging fruit that can be plucked with only shallow knowledge of the field.
A favourite quote that appears on my GitHub profile:
> Computing is pop culture. [...] Pop culture holds a disdain for history. Pop culture is all about identity and feeling like you're participating. It has nothing to do with cooperation, the past or the future—it's living in the present. I think the same is true of most people who write code for money. They have no idea where [their culture came from]. - Alan Kay
I particularly like how he had the insight to make the statement true only for "people who write code for money." Growing up in various computing "scenes" (open source, hacking, demoscene) in the '90s and 2000s, nobody did anything for money. And it seemed everybody had the utmost respect for the history of the scene. Demosceners were all about "old school" stuff and the old Amiga days. Hacking ezines and forums retold the tales of Captain Crunch and other legendary hackers of yore. And the open source community was still full of completely antiquated practices that some might have questioned but nobody would have dared to disrespect.
Only when I started programming as a job did I encounter people who, strangely, had no interest in such things. There was plenty of enthusiasm for new languages, but little for their genealogy. It really seemed odd to me, and I think the lack of love for the "craft" and its history is ultimately what drove me out of industry and into academia.
To make new pop culture, cooperation is necessary; otherwise a culture stays niche. Diffusion of new culture works exactly like diffusion of technology, from innovators to late adopters. The paradox with culture is that when an innovation is adopted into the mainstream, late adopters do not perceive it as innovation, because it has become the normal thing to do. Innovators do, because brands validate new behaviors, values and rituals that are contrarian to the status quo.
Nike & BLM, Tesla & global warming, Apple & computing without a 1,000-page manual.
Nike validated Kaepernick at the moment when, for the first time, more kids of color are being born than white kids.
The problem with this blanket rejection of "pop culture" as something lesser is that it's almost always deployed by "young fogey" conservatives, harkening back to some golden imperial age.
And of course these sorts of people don't actually have any in-depth historical knowledge.
Compared to the number of people employed in an industry, few write about the history of trains, or wifi, or road building, and so on. It's normal for only a few academics to be interested.
There are only a few fields that are different - media (movies/broadcast), the military, medicine. Probably because of the built-in human drama that's easily accessible?
As others have mentioned, Computer Science has a little human drama but few have sacrificed their lives or appeared heroic. It's a pretty dry field to document - more similar to road building.
The GP is talking about (future) researchers in particular. Still, there’s a tremendous historical myopia among practicing software engineers as well, leading us as a field to reinvent old ideas in new clothes every ten years or so.
So what? Sometimes ideas have to be reinvented; maybe this time they succeed. There could be two reasons why old ideas (talking about software here) didn't succeed: one is that the hardware wasn't capable, the other is that they didn't cross the chasm - both might be different now. (A third option: the idea is bad, in which case one shouldn't do it again, of course.)
You're still assuming a linear model of history. My point was that history is cyclical: the same trends come and go. Static vs dynamic typing. Thin client vs thick client. Local vs distributed. Key-value vs relational. Monolithic vs micro*. And so on.
Yes, sometimes these have to do with changing requirements or hardware capabilities, but more often they're just about a new generation wanting shiny new things rather than boring old things. Except that the shiny new things were the boring old things of the previous iteration.
Many ideas of yesteryear were not bad or even infeasible; indeed, they were successfully put into real-world use. Until the tide changed and they became unfashionable for whatever reason. And then they became fashionable again, but without a view of the history there's little synthesis of ideas, little learning from past experiences beyond the current cycle.
Researchers might have more use for history... but day to day, in the field, programmers? Not so much. I would think it's the difference between math researchers and accountants... You don't need to know history to balance a checkbook.
The difference is that the way to balance a checkbook doesn't change every five years, or these days more like every year. Maybe if programmers knew a bit more about history, they wouldn't need to reinvent it so often and could be more like accountants.
But programmers don't want to be accountants, they want to use new and shiny things even when it's not the pragmatic choice, even when it would be better engineering to understand things in a broader context, to understand the history behind these "new" (actually old) things.
Accounting has been around for hundreds of years. The reason it doesn't change? Because it's not a "new" field.
Same for stuff like construction and cooking... stuff that's been around a very long time has become rather stable and of course it isn't going to be "reinvented every five years".
> when it would be better engineering
Again... I don't need to understand history to understand Big O and which algorithm to use in the right circumstances, or when procedural programming is preferable to functional.
At no point in my career have I ever NEEDED history to decide which library to use, which sorting algorithm is suitable, or how to create functional APIs or Windows services.
History is nice to know... but still 100% not needed at any stage of the process.
I'm not sure where the disconnect is because I like history... I like learning about American history, world history, programming history... but none of that has ever been relevant in any of my jobs when it comes to day to day decisions and projects.
"don't reinvent the wheel" is a separate topic and while I can agree that many things get reinvented - computers are still a very young industry compared to stuff like accounting and architecture.
Programmers in the field have little use for computer science in general. You don't need to know anything about cyclomatic complexity when assembling a shiny JavaScript widget.
Until you do, of course. At which point it's clear you should've known about it years ago.
Cyclomatic complexity and computer science aren't history.
Not sure where the disconnect is, because at no point have I said you shouldn't learn how to do your job correctly - and understanding Big O, SOLID, YAGNI, etc. are all important to know... but have nothing to do with history.
So again... history isn't needed to be a good engineer. And being a good engineer and KISS programming is possible without understanding 100 years of history.
> Step 1. Show 10+ years of experience
> Step 2. Never talk about history because I don't need history to solve business problems with programming
> Step 3. Make 6 figures.
A made-up anecdote about talking at some conference with big words won't provide business value and isn't useful in day-to-day problem solving.
So again... 99% of programmers won't ever use the history of DARPA and the birth of the internet to land a job.
The optimization bottleneck you're fighting against in the game might have been solved 50 years ago. If you don't know history, you don't have a complete toolkit of all the good tools.
There are good reasons to learn assembler, BASIC, Forth, Lisp, Smalltalk, Pascal, C, C++, Python, Java, etc., even if you don't use them on a regular basis.
I don't need history to know algorithms... so "that was solved 50 years ago" doesn't teach me how to build a balanced tree, nor does it matter if I simply know to use sort library method A instead of B.
> good reasons to learn
Of course... but none of that needs HISTORY to be used effectively.
Knowing functional programming, procedural, async, etc., etc., etc... I can know all of that without needing to know history.
History is not just about time — what happened when.
An equally important part is the ordering and the reasons something happened.
You can know how to write simple programs in all paradigms, but fail to understand what to use when.
You might have memorized a long list of algorithms but not understand the trade-offs involved in choosing one.
Of course history is not the only way to understand this, but it is definitely one good method.
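To make that concrete, here's a minimal sketch in Python (a hypothetical example with made-up data and function names): both functions answer "is this value in the collection?", but the right choice depends on the usage pattern, which is exactly the kind of trade-off I mean.

```python
from bisect import bisect_left

def contains_linear(items, target):
    # O(n) per lookup; no preprocessing, works on unsorted data.
    return target in items

def contains_binary(sorted_items, target):
    # O(log n) per lookup, but requires an O(n log n) sort up front.
    i = bisect_left(sorted_items, target)
    return i < len(sorted_items) and sorted_items[i] == target

data = [42, 7, 19, 3, 88]
print(contains_linear(data, 19))          # True; fine for a one-off lookup
print(contains_binary(sorted(data), 19))  # True; pays off over many lookups
```

For a single lookup, sorting first costs more than it saves; for many lookups over stable data, it's the other way around. Memorizing both algorithms tells you neither of those things.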
Well, like I said... I enjoy history. But I've never used history to solve business problems.
They are related and good to know... but I don't need to know history to flip burgers, run a business or plan my next sprint.
I'm not arguing history is unimportant, or that it isn't good to know, or even possibly useful just to have ideas of what to do - or what not to do... I'm just arguing its importance is over-emphasized.
Just feels like you're arguing that only short-term memory matters. Have you never encountered a problem that became easy once you talked to someone with more background on the problem than you?
Why would you want to handicap yourself or your field by making it harder to "talk" to those who came a little before you?
Your building is resting on their foundations whether you acknowledge it or not. They may have had a solution for the problem you’re facing.
> I've never used history to solve business problems.
As Knuth was trying to point out, history should be more than tables and timelines.
It should provide details of how important advances were made and how solutions to big problems were found.
And the reason is that, since those historical details aren't being recorded, it's more than likely you are in fact solving business problems today using the same techniques discovered in the past; you just don't know it.
What I see is not over-emphasizing the importance of history but de-emphasizing it as a justification for present-day sufficiency.
If you don't like the term, use another one. But without history you're bound to end up with cargo-cult computer science, where what you have to do next is plan your sprint. Why do you need a sprint anyway? You might never know whether you still need a sprint if you don't know what problem it was supposed to fix.
You don't use history to solve the problems. You use history to find when people were fighting similar problems. Then you link that to when other people solved similar problems. And you look for what changed in between, to see if you can leverage that.
At the junior programmer level, yes, that can be true. At the principal engineer level it's not. At that level you have to invent new ways of doing things when existing ways don't cut it. Looking into the past is supercharging.
None of my statements goes against learning history.
I probably don't do as much as I could/should... but I am a successful programmer who has never needed "history" to understand basic programming principles.
I seem to consider history and good engineering to be different topics, while everyone else seems to think you can't understand Big O, YAGNI, SOLID, the difference between functional programming and procedural programming, etc., etc., etc. without understanding the history of how assembly turned into C...
Because history is interesting... but not needed to be a good engineer and a good programmer.
The disdain for history as an expression of your not knowing how to code WordPress is strange considering that, as you allude to, you don't know the latter.
You may want to consider whether learning WordPress is the same as learning the history of HTML at the decade level.
Since you're publicly exposing your thoughts, let me tell you that what you are doing is expressing your ignorance and being proud of it. You may want to drop the second part.
I've not shown any disdain for history or an inability to code WordPress... in fact, I've said the opposite: I like history and simply don't need it to code.
I don't need to know history to code HTML, PHP, JavaScript, etc. I'm a very successful programmer without using the "history" of programming in any shape or form.
"expressing your ignorance" or... I'm simply expressing that I'm a successful programmer who's never used "history" in a decade+ of programming...
You may want to reevaluate "ignorance" as you make claims that you can't support (i.e., "disdain" for history and "can't code WordPress" - again, neither of which are statements I've made). Reread my comments.
Recap: I like history. I don't need history to code WordPress or Windows services.
I was not entirely fair to you when trying to make my point. In fact I could see this exact argument of yours coming up and ignored it completely.
What I want to say is that there are entire domains of competence that are largely irrelevant to doing the day's job. Knowing the history of computing will not necessarily help in making a better program in a way that can be noticed.
But the history of computing is closely related to computing. Unlike, for example, the history of knitting. Although neither will make you a better programmer now, the first has a much better chance at that than the second (although we don't exactly know that).
Thus, when someone downplays the importance of the history of programming to improving a programmer's mastery, I see it as touting ignorance in not seeing the connections between the two: self-sufficiency, the arrogance of an ego trapped by the light of today's fads, a pop culture that doesn't care about the past or the future.
I mean, at a conceptual level, knowing history is important in the "ignorance of history will lead to repeated history" line of thinking - and I don't disagree. ... And knowing why decisions were made can help determine which tools to use (i.e., why use static typing or duck typing, and when to use one over the other... or when to use procedural programming, async, functional, etc...).
Maybe it's a combination of you "projecting" and me not being clear. And trying to discuss what could be deep conversations in little more than a 140-character tweet.
I'm personally a .NET developer, and while I do stick with the latest versions (i.e., .NET Core), I also have enough experience to know the past (i.e., ADO vs Entity Framework). I was trained in school on a mainframe (IBM DB2 with RPG and SQL). I'm not trying to stick with the latest "hotness", as most of MY work is actually done via Windows services, API interactions and moving files around - definitely not the latest fad. With that said, I am working on using good tools to get my job done faster/better - i.e., CI/CD pipelines to automate builds, testing, deployment, etc. Tools that didn't exist 5 years ago could be the latest fad, but I don't think that's what you are suggesting.
I'm more worried about learning different things (i.e., functional, procedural, async, parallel, etc.) than I am about the "history" of those. When I pick up a programming book, I'm less worried about "Microsoft created version 1 in 2000 and version 2 in 2005 and..." and more concerned about do's, don'ts, best practices, etc.
Maybe it's my personality and the way I "deemphasize" history... it's not that I think it's unimportant... I just think it's more important to focus on other things. Learning some history along the way is good and fun, but it's never been my focus, and I've never used what I consider "history" in an interview or day to day on the job. Maybe that's rubbing people the wrong way lol