> The computer was suddenly revealed as palimpsest. The machine that is everywhere hailed as the very incarnation of the new had revealed itself to be not so new after all, but a series of skins, layer on layer, winding around the messy, evolving idea of the computing machine. Under Windows was DOS; under DOS, BASIC; and under them both the date of its origins recorded like a birth memory. Here was the very opposite of the authoritative, all-knowing system with its pretty screenful of icons. Here was the antidote to Microsoft's many protections. The mere impulse toward Linux had led me into an act of desktop archaeology. And down under all those piles of stuff, the secret was written: We build our computers the way we build our cities -- over time, without a plan, on top of ruins.
I repeat the last sentence to my students all the time ("We build our computers the way we build our cities -- over time, without a plan, on top of ruins.")
There's no way to understand why our computers work the way they do without understanding the human, social, and economic factors involved in their production. And foregrounding the human element often makes it easier to explain what's going on and why.
I've always been curious: if our society were to collapse, would the artifacts left behind indicate to a future society how to reconstruct this technology? Is there any object permanence in our creations?
I imagine a wanderer silently plumbing their way through the streets of Manhattan on a makeshift catamaran. They're mostly in search of useful resources, but they can't help being intrigued by these little grey squares with faded pictures on them. What are they? What were they used for?
Perhaps they know something about electronics. They have opened a few of these mysterious squares. There is that tell-tale green circuit board inside. But how does one turn it on and use it? What does it do?
The people of medieval Britain could see the ruins of Roman architecture dotting their landscape. They hadn't seen those people in a long while and had no means to repair the aqueducts or highways. Yet they built on them and around them nonetheless. A building is a building and a wall is a wall.
But computers? And the hardware we use to build these artifacts of the mind? Vastly more complicated and difficult to reproduce from first principles.
> I've always been curious if our society were to collapse would the artifacts left behind indicate to a future society how to reconstruct this technology?
Fundamentally, if there is a collapse, future societies have a VERY tough row to hoe ... we've essentially dug up and drilled ALL of the easily available hydrocarbons and moved the bulk of our science and technology to storage that requires that level of science and technology to access. And encrypted most of it. They'll be trying to climb the technology ladder without coal, oil or natural gas and without any technology documentation from about 2000 onwards.
On the other hand, they get some highly concentrated stores of refined materials, and we will leave lots of those rune-engraved panels that emit an unnatural power when exposed to light.
Anyway, unless we are talking about some really unprecedented disaster, we'll leave plenty of paper books to help them understand those things and bootstrap their society.
Has any book-burning movement in history actually succeeded in getting rid of the books?
Because I don't know of any: they managed to destroy a text here or there, but there were always some comparable texts that survived. AFAIK, we really didn't lose useful knowledge to them.
(And let's not forget that today we made life much harder for the book burners.)
What really does destroy knowledge is it becoming useless for a long period of time. As in losing all of CS because nobody can make a computer yet. But we have an instinct to preserve knowledge, even when it's absolutely useless. This has already saved a lot of technology, and it's hard to imagine how some disaster would stop it.
Personally, I have a hard time imagining something that would throw us at the stone (or bronze, or iron) age, but still fail to destroy all the life on Earth.
Most of the early Christian texts were suppressed and destroyed. In the 1940s some ancient texts that had been buried for nearly 2000 years were found in Egypt, and, not realizing their value, a family member used many of them for kindling. Before the discovery of the Nag Hammadi texts, the existence of these materials was known from references in orthodox texts, but they had been completely lost. Of course we have no idea what was contained in the texts that were burned, and we have no idea what else is out there, so we don't even know what has been lost. Similarly, Parmenides is considered by many philosophers to be the "father of logic", which he brought forth in a long poem. Unfortunately, we have only about 160 of the estimated 800 verses of the poem, so again we are lacking basic knowledge of the origins of our civilization that was written down at one time but has been lost.
Digital information seems especially likely to be lost over time, because extracting it nearly always requires an instrument, and these instruments do not have nearly the durability of books.
The near loss of texts in the Nag Hammadi library and the fragmentary transmission of Parmenides (or Sappho, Thales, Manetho, and countless others, for that matter) is not about book burning. That's playing into ideologically-motivated narratives about knowledge vs. religion and ignores the reality of textual transmission prior to movable type printing making the production of books relatively inexpensive. We've lost most of the texts from the ancient world not because they've been suppressed or deliberately destroyed, but because they tended to be written on organic materials that do not stand the test of time. In a few parts of the world like Egypt, the hot, dry climate preserves some of these ancient manuscripts to varying degrees. But for the most part, ancient texts survive because there was interest in those texts sufficient for scribes to expend considerable effort in making new copies by hand as older copies decayed or wore out. In the ancient world, texts died not from suppression, but from decay combined with lack of interest or neglect.
If you want to talk about early Christian texts, the Didache is instructive. It's orthodox and never was suppressed. It's earlier than much—if not all—of the canonical Greek scriptures. However, the canonical scriptures overshadowed it and it became obscure, to the point where the only known complete copy today is a single 11th century manuscript that was found behind a bookshelf in a monastery in the 19th century.
> The near loss of texts in the Nag Hammadi library and the fragmentary transmission of Parmenides (or Sappho, Thales, Manetho, and countless others, for that matter) is not about book burning.
I can't speak to the reasons for the loss of Parmenides' poem, but scholars seem to think the Nag Hammadi texts were indeed buried to prevent their destruction after Athanasius condemned non-canonical sources (https://en.wikipedia.org/wiki/Nag_Hammadi_library).
Regardless, my point is that at least a written text has the chance of being discovered and understood in the future. The Arctic Code Vault is actually interesting in this regard, but decoding it to find the jewels would likely be extremely challenging. There is also a lot of important technical information that isn't on GitHub.
Elaine Pagels straight up lied about the contents of Athanasius Letter 39, turning an admonition to only use certain canonical and apocryphal texts for teaching and a warning against similarly named heretical texts (like most of those found at Nag Hammadi) into a command to destroy everything except the canon and the apocrypha. Since she was one of the early scholars to work with the Nag Hammadi library, her hiding-to-avoid-destruction theory remained influential for a long time, despite it being based on a lie. More recent scholarship tends to theorize that these texts were a burial deposit—a practice that began long before Christianity in Egypt.
The point, though, is these texts were doomed when there ceased to be a scribal community that cared about them being copied. If you suppressed the heretics in antiquity, you suppressed the transmission of heretical texts, except to the extent those texts were quoted in refutations that scribal communities did care about preserving. Actual destruction of manuscripts was unnecessary at that point, although it still sometimes happened, as with the works of Arius and Nestorius. At least with Constantine's condemnation of Arius, it seemed to be more of a damnatio memoriae than a practical act in furtherance of suppressing heresy.
> Most of the early Christian texts were suppressed and destroyed. In the 1940s some ancient texts that had been buried for nearly 2000 years were found in Egypt and not realizing their value, a family member used many of them for kindling.
But they didn't do that because they were trying to destroy heretical thought. They did it because they didn't realize what they had, they were poor, and needed kindling to stay warm or cook food. That's an important difference. Many ancient texts have been lost this way not out of malice, but just because that's what happens throughout the ages.
> Has any book-burning movement in history actually succeeded in getting rid of the books?
> Because I don't know of any
Of course you haven't, by definition. We find out about the things that survive destruction, not the ones that don't.
edit:
As for the instinct to preserve knowledge, that's not all we have. We also have a profit-maximizing infection that spread to every organ, so to speak. Books printed today don't last very long. Heck, not even newspapers and tech companies give a crap about link rot! Archive.org is a team of volunteers. Wikipedia is very hit or miss, and the ego of some seems to override whatever "instinct" we might have real quick.
So, frankly, I'm not seeing it. Even computing is rotting from under our hands. The Amiga came with schematics of the machine as part of the manual, while nowadays repair shops have to hunt for info. If companies were allowed, they'd each have their own incompatible charger cable, you bet. And let's not even talk about 100MB applications that do 100KB things, because it's easier for the developer to never care about anything close to the machine and just use the bloated toolchain they already know. Recently I asked for a good (as in, other than "Microsoft wants you to") reason to use Win 11, and just got downvoted. Ever since Win 11 was announced I haven't gotten one answer to that question, ever. There are people who turn their nose up at it, and people who use it who don't want to talk about why. I could go on.
> No greater mistake can be made than to imagine that what has been written latest is always the more correct; that what is written later on is an improvement on what was written previously; and that every change means progress. Men who think and have correct judgment, and people who treat their subject earnestly, are all exceptions only. Vermin is the rule everywhere in the world: it is always at hand and busily engaged in trying to improve in its own way upon the mature deliberations of the thinkers. [..] An old and excellent book is frequently shelved for new and bad ones; which, written for the sake of money, wear a pretentious air and are much eulogised by the authors’ friends. In science, a man who wishes to distinguish himself brings something new to market; this frequently consists in his denouncing some principle that has been previously held as correct, so that he may establish a wrong one of his own. Sometimes his attempt is successful for a short time, when a return is made to the old and correct doctrine. These innovators are serious about nothing else in the world than their own priceless person, and it is this that they wish to make its mark.
Book-burning specifically, no, but the Nazis destroyed a lot of research from a German institute that studied gender issues. Knowledge of the place and some of its research still exists, but a lot of the research was simply gone once the Nazis rose to power. That seems like way less of a liability these days with the internet, but even that is prone to government meddling, and when the people that agree with book burnings run the government, well, there's a very famous prescient book on the topic.
the maya codices and the khipu were lost in very nearly their entirety, and the more abundant maya hieroglyphs carved on stone have been only partly deciphered. the khipu are entirely undeciphered except for arithmetic
of carthaginian literature not one book remains
the khwarazmian empire is known from the accounts of its neighbors and destroyers
qin shi huang ordered the burning of the history books for every kingdom he conquered; consequently what we know of most of them today is little more than legend, except for what the histories of qin tell us
in countless smaller-scale examples we don't even know the names of the nations that perished along with their language and books
It was by an incredible amount of luck that Lucretius' On the Nature of Things is still published today. If it wasn't for a group of Florentine hipsters the last copy might have rotted on a shelf in a neglected monastery library. Who knows how the Renaissance might have progressed without it and whether the Enlightenment would have been different.
yes but this wasn't the result of intentional book-burning directed at suppressing epicureanism; though the christians were not really a fan, lucretius wrote too early to directly attack christianity, so it was never banned by the church
What a coincidence, I had that discussion with my friends yesterday.
We came to the conclusion that the scenarios for rebuilding society depend on the type of collapse: the greater the drop (the destruction events) and the longer the gap before rebuilding can start anew, the harder it would get.
A dark-age collapse -- the worst-case scenario -- where people slowly lose knowledge over generations for lack of institutions to protect it, sounds like a game-over scenario. Small clusters of agrarian people would not have the resources to support an engineering department tasked with preserving knowledge that is useless for the time being.
Slowly the tech that could have been re-used or restarted would deteriorate beyond usability, and then beyond repeatability.
It's a scary scenario -- probably a good basis for a book series... there are probably plenty of novels written with that scenario already ;)
I think a slow collapse would be the best case. It would give time for technologies like blacksmithing to flourish. It would also mean that advanced technology could be used even if it couldn't be produced. It would also mean a civilization that had reason to save books. Finally, there is an advantage to knowledge, technology, and science. We forget that pre-industrial technology can be produced on a small scale, and that muskets and steel armor are a huge advantage.
In a hard collapse, all of that would be lost trying to survive. Books would be burned for heat, electronics forgotten, and blacksmiths killed accidentally.
A Canticle for Leibowitz comes to mind, though while the preservation of knowledge after collapse is a central premise it is not what the novel is about, exactly. Worth reading in any case.
> we've essentially dug up and drilled ALL of the easily available hydrocarbons
And basically every accessible ore deposit as well… You can't reach the Bronze Age if there's no available copper or tin; you just end up stuck in the Stone Age forever.
And yet there seems to be shockingly little effort towards making sure that doesn't happen. Why isn't there an underground bombproof chip fab somewhere?
It's as if people don't really care about technological society; they just do what they always do: expand and try to do stuff that is hard and impressive. And if the computers went away... they'd be just as happy as a blacksmith or something...
A modern chip fab needs a supply chain the size of humanity. And you can't bury a reserve humanity in case the regular one goes bust. A buried chip fab would be scrapped for steel.
23 years or so of lost knowledge can be regained in the grand scheme of things. But yes, the lack of easy access to coal and oil would be a problem. Maybe the next civilization will be more careful.
That's why I think we should all be trying to help keep society going. I try to do my bit. I just wish the preppers, survivalists, deniers would get involved in their community and try to save it too.
I guess I'm kind of thinking of it from the point of view that the technologies we are inventing to allow civilization to survive without coal and oil are the ones most likely to be lost.
Unless it's a complete wipeout, there will be plenty of bicycles, solar panels and IKEA knives left - that's a significant leg up in rebuilding civilisation.
Before coal there was wood, before oil there was whaling - when the global population is under 1bln, energy isn't much of an issue.
The wars over it though - that would be the tough part.
I would really like to read a novel about future archeologists who rediscover brute force password lists and rainbow tables as they investigate a ruin. Oh, maybe they could be specifically looking for LLM models, LLMs being no longer possible to train due to energy costs.
LLMs, and even weapons and nuclear bombs, are harmless by themselves. It's about who ends up in control: if something has the potential to do more harm than good, then I could imagine it's downhill from there.
Ever read Ringworld? A society without access to minerals and fuel resources. Untold technology that the natives have zero capacity to understand or harness. That thought was very engaging for me. Until all the rishing. Then it spiraled.
There are still a lot of surface coal deposits left. Big open-pit mines in Wyoming and Germany. It isn't the best coal, and it's far from areas with ores. We will never mine it, since doing so would cause catastrophic warming.
This is why I think that we should resolve the climate change crisis by reducing excess atmospheric carbon by seeding the oceans with iron. Oceanic iron seeding results in plankton blooms that consume atmospheric carbon. When the plankton dies, it accumulates on the seafloor. Over geologic time scales, the accumulated plankton will turn into hydrocarbons, renewing the supply of cheap fuels for a post-human civilization millions of years in the future.
We can't even switch to renewables because that would make some capital owners slightly less money and you're thinking of civilisations millions of years in the future...
Where I live, it's because the taxpayers would have to pay too much and would be exposed to a lot of outside risk (it's usually not that sunny here, and we don't have space for wind turbines nor where to landfill the replaced blades).
Here it's the capital owners pushing renewables because it's one of the easiest, cheapest, least objectionable (in the case of solar) and least risky energy projects you can build on almost any useless piece of land (everywhere is near the grid in this country), but they just can't fulfill the demands of the state-owned energy company with their solar arrays and wind turbines. Hydro is out of the question; the water-environmentalists hate that - for good reasons, I have to say.
There is a nearby country that "did" (legislated) the switch... As a result, they are the most polluting in every metric (per capita, per kWh, per km^2) while having the most expensive electricity on the continent; all metrics are getting worse there every year. Not a good look for the switch, and it certainly doesn't make most voters in my country motivated to even try - and you'd need to convince half the country. Right now the support is around 5-10%.
In conclusion, given that both our worlds coexist at the same time - I think it's not that simple to switch and wouldn't be looking for the reason in either capitalists or states. Perhaps the technology is just not ready for a full-scale switch yet.
If modern society collapses, starting back over with the same approach isn't the best plan. Cheap hydrocarbons are sort of like VC funding. They have allowed society to advance extremely rapidly, in a totally unsustainable way. We have yet to find out if we can stick the landing on transitioning to a sustainable lifestyle. It's possible that humans just aren't wired in a way to responsibly manage the risks introduced by sufficiently advanced technology, at least not at the current rate of change.
In a post-apocalyptic world, survivors should really be rethinking the approach that brought that about. Avoiding the cheap-hydrocarbon consumption phase might result in a much longer-lasting civilization next time around.
In addition to coal that I mentioned above, it should be possible to have industrial society with renewables. It wouldn’t be as high energy as ours or as industrialized. But if they knew the science and some technology they could skip the big iron phase.
Early power could come from burning wood and from windmills. That doesn't provide enough power for smelting iron, but there is already more than enough iron left lying around. Electricity could come from concentrated solar, wind, hydro, or geothermal. Nuclear would be the big jump.
Your comment makes me think you may enjoy the science fiction book 'A Canticle for Leibowitz' by Walter M. Miller (perhaps you already have :-})
It's set at a monastery in the desert post nuclear apocalypse, where scribes copy wiring diagrams and store artifacts like partially destroyed circuit boards without understanding what they are / how they work. And it builds from there.
It also made me think of 'By the Waters of Babylon' by Stephen Vincent Benét. Wikipedia tells me it was also (indeed, originally) published as 'The Place of the Gods'.
To be fair, I've thought about this sometimes too. Like, I contribute to technology myself, but in a very minimal way. And most people don't at all, they consume. But there are a golden few in the world who are doing _everything_.
For example, how many people in the world actually know how to build an EUV machine like ASML's? How many people understand the absolute forefront of integrated circuit design? Maybe like, 1-2 dozen people at most?
It's a pretty crazy thought. And I bet none of those guys are making as much bank as the suited executives above them.
Exactly. A few people have attempted fabricating simple 8-bit CPUs in their garage, but the yields are low without a clean room, and it still requires an incredible amount of technology to get started. A good amount of how to set up a garage fabrication lab is probably available through bits of print media, if we're lucky... but recreating a garage fab from scratch by finding all of that information would be quite the feat.
The answer to this lies in our history. Given that we know so little about past civilizations, especially older than 6000 years ago, and new discoveries sometimes startlingly reveal how advanced they were, the answer would be no.
Right, I imagine they wouldn't be seeking out the little grey squares. Humans are curious, however, and the Wanderer is no different. They might only collect them as they come across them and ponder their meaning and reason for existence in those rare hours they have a little leisure time.
That may have been a reference to the fact that many computers of the early '80s had a BASIC ROM that would be booted into if no operating system was found.
Not in a technical sense, but in a historical/cultural sense, sure. At the time PC/MS-DOS was propagating out in the public, many of us were also working with machines where the "OS" was a BASIC prompt. And the model and syntax of command interaction wasn't so dissimilar.
But yeah, it's a bit of a hand waving generalization.
Some early PCs would boot to a BASIC environment if they couldn't boot to any other OS. Not sure if that extended to machines that could run Win 3.1, but IBM 8088 PC-XTs (and similar, but that's what I had) definitely did that, though normally one would boot them to DOS.
A history lesson for some of us (including me, tbh, because I was only vaguely aware of any of this and wasn't born yet at the time):
> The emergence of microcomputers in the mid-1970s led to the development of multiple BASIC dialects, including Microsoft BASIC in 1975. Due to the tiny main memory available on these machines, often 4 KB, a variety of Tiny BASIC dialects were also created. BASIC was available for almost any system of the era, and became the de facto programming language for home computer systems that emerged in the late 1970s. These PCs almost always had a BASIC interpreter installed by default, often in the machine's firmware or sometimes on a ROM cartridge.
> BASIC was one of the few languages that was both high-level enough to be usable by those without training and small enough to fit into the microcomputers of the day, making it the de facto standard programming language on early microcomputers.
> The first microcomputer version of BASIC was co-written by Bill Gates, Paul Allen and Monte Davidoff for their newly formed company, Micro-Soft.[21] This was released by MITS in punch tape format for the Altair 8800 shortly after the machine itself,[22] immediately cementing BASIC as the primary language of early microcomputers. Members of the Homebrew Computer Club began circulating copies of the program, causing Gates to write his Open Letter to Hobbyists, complaining about this early example of software piracy.
> Partially in response to Gates's letter, and partially to make an even smaller BASIC that would run usefully on 4 KB machines,[e] Bob Albrecht urged Dennis Allison to write their own variation of the language. How to design and implement a stripped-down version of an interpreter for the BASIC language was covered in articles by Allison in the first three quarterly issues of the People's Computer Company newsletter published in 1975 and implementations with source code published in Dr. Dobb's Journal of Tiny BASIC Calisthenics & Orthodontia: Running Light Without Overbyte. This led to a wide variety of Tiny BASICs with added features or other improvements, with versions from Tom Pittman and Li-Chen Wang becoming particularly well known.[23]
> Micro-Soft, by this time Microsoft, ported their interpreter for the MOS 6502, which quickly become one of the most popular microprocessors of the 8-bit era. When new microcomputers began to appear, notably the "1977 trinity" of the TRS-80, Commodore PET and Apple II, they either included a version of the MS code, or quickly introduced new models with it. Ohio Scientific's personal computers also joined ...
> When IBM was designing the IBM PC, they followed the paradigm of existing home computers in having a built-in BASIC interpreter. They sourced this from Microsoft – IBM Cassette BASIC – but Microsoft also produced several other versions of BASIC for MS-DOS/PC DOS including IBM Disk BASIC (BASIC D), IBM BASICA (BASIC A), GW-BASIC (a BASICA-compatible version that did not need IBM's ROM)[28] and QBasic, all typically bundled with the machine. In addition they produced the Microsoft BASIC Compiler aimed at professional programmers. Turbo Pascal-publisher Borland published Turbo Basic 1.0 in 1985 (successor versions are still being marketed under the name PowerBASIC).
Some of the code in the libre software I use every day is 30+ years old. And if you're on a BSD you have code in your kernel copyright 1979.
The modern software environment works, I think, just from the sheer quantity of useful building and patch materials. To run with the building metaphor, it's like having specialized supplies on-hand so you can throw up a passable shed in a weekend with just duct tape and a screwdriver.
I recently read Ellen Ullman's memoir about her life as a software developer in the dot-com bubble era: "Close to the Machine: Technophilia and Its Discontents".
The book was written before I was born, but I can still closely relate to most of the cultural points made. She does a great job defining the anxieties and frictions you experience working in the duality of the very formal computer systems and the subjective, messy working contexts, filled with deadlines, bureaucracy, "rockstars"...
Her takes on the internet are also super relevant today. A favorite extract of mine:
"When I watch the users try the Internet, it slowly becomes clear to me that the Net represents the ultimate dumbing-down of the computer. The users seem to believe that they are connected to some vast treasure trove — all the knowledge of our times, an endless digitized compendium, some electronic library of Alexandria — if only they could figure out how to use it. But they just sit and click, and look disconcertedly at the junk that comes back at them".
"The users seem to believe that they are connected to some vast treasure trove — all the knowledge of our times, an endless digitized compendium, some electronic library of Alexandria..."
Personally, I feel that this is one thing that worked out about as well as could reasonably be expected (it would be unreasonable to expect that the benefits of this could be reaped without effort...)
There is a treasure trove, it's just mixed in with a lot of trash. Like panning for gold, you need to know where the good lodes are and recognise it when you see it.
> Any treasure you find will be found through experiences in the real world.
That's romantic, but why would it be true, unless you just define treasure as something that you find in real life? Specific knowledge and skills are not treasure?
I'm curious if you lived before the days of having so much knowledge instantly accessible, practically for free.
I can say that I've got a much more open mind via conversations I've had with strangers on various platforms over the years. My feminism is much more informed and mature than it would be if I hadn't had access to the differing opinions I've read and discussed over the years, for example.
I also wouldn't be able to do my job without the internet. When I was just starting out, I learnt Perl out of a big thick book (O'Reilly maybe?) and I can say that searching the internet is easier!
I'm not saying it's perfect, but you must get some value out of it, or you wouldn't be here in the first place...
That quote also stood out to me, but probably in a different way. It's missing a crucial piece of information: What is the correct way to use the internet? And computers, for that matter?
A generous interpretation is that users are expected to take the knowledge and do greater things with it, instead of sitting and clicking, but that obviously doesn't make sense after a few seconds' thought. I'm stumped.
Her quote reminds me of Eternal September[1], when AOL started allowing their users to interact with Usenet. The people who were already there were not happy with this influx of the unwashed masses coming in and breaking stuff and ignoring good manners and the protocols that had been established.
In the early days of the internet, there was definitely a different crowd because the barrier to entry was pretty high and required a lot of dedication and problem-solving abilities. As the bar of entry came down, along with it came all of the things that come with football stadiums, shopping malls, and time-share condos.
A more recent, similar event was when Digg shut down and all the users from there flooded onto Reddit. That was the beginning of the end of the golden days of Reddit, imo.
Unfortunate as it may be, we would be remiss to accept Digg's 2010 exodus as recent, especially in internet years. At the same time, I'd be interested to compare and see what is currently purported to be the 'golden age' and what it ends up being.
I suppose it's semantics and concerns one's perception of time, but there's also the general pattern of collective memory decay, attention shifts, and link rot, which sets in shortly, depending on the cultural event or phenomenon of interest.
Once something is past this 'recency' window, we may want to start looking into additional examples more reflective of current times. That is not to say the Digg exodus is an insignificant event, but that there have been a fair few exodi since--StackExchange, Twitter's infosec sphere, streamers between YouTube/Twitch, Snap/Tiktok, failed SVOD services.
Use the internet for the tools it provides. Maps, phone book, connect to news media etc.
The internet is useful when you need some specific knowledge. It is useless when you need nothing and are just browsing. Sitting there watching TikTok videos or reading hackernews is usually going to turn out to be a waste of time.
A passage early on that resonated with me is her report of a talk given by Whitfield Diffie:
> We were slaves to the mainframe! he said. Dumb terminals! That's all we had. We were powerless under the big machine's unyielding central control. Then we escaped to the personal computer, autonomous, powerful. Then networks. The PC was soon rendered to be nothing but a "thin client," just a browser with very little software residing on our personal machines, the code being on network servers, which are under the control of administrators. Now to the web, nothing but a thin, thin browser for us. All the intelligence out there, on the net, our machines having become dumb terminals again.
"When I watch the users try the Internet, it slowly becomes clear to me that the Net represents the ultimate dumbing-down of the computer."
I think this is an asinine quote, to be honest. It comes across as elitist: "these idiots don't use computers how they should be using them, according to -me-".
The internet, the ability to communicate and access data instantly across the globe, has been one of humanity's greatest achievements to date. But because some people look at junk or don't use it efficiently, it's "the ultimate dumbing-down of the computer"? Really? People like this complain about "the unwashed masses", but fail to recognize that the internet would not nearly have been as useful if it were limited to an insular group of similar people until now.
This line of criticism has always had elitism woven into it. The real internet was the internet of those who built their own modems/wrote their own modem code/added specific NNTP headers to their news messages/had their own static IPv4 address/understood what I understand. Mix that with nostalgia over a time when "have you tried restarting it" was the most common piece of advice and you get this line of critique.
It's an alluring tale and appeals to the technologist in all of us old enough to feel nostalgic about a lost time, especially as the internet and then the web has become used by a broader swath of the population. I have a successful uncle who tells us stories about how calculators and CAD dumbed down engineering and only his ilk of paper calculations and slide rule approximation can truly engineer.
> nostalgia over a time when "have you tried restarting it" was the most common piece of advice
I think I missed the memo... that's surely still the single most common go-to when something stops working mysteriously? Or are you suggesting it's no longer "advice" because everyone already knows that by now?
I think restarts of home computers are rarely needed, even in Windows. On the server we have gone down the cattle-not-pets route, so that restarting is abstracted away. (When was the bare metal last restarted on that CDN, VM, etc.?)
Had to do it just yesterday when Bluetooth just stopped working, and yes I tried everything else I could think of first. And I do software restarts (iisreset etc) all the time.
Mind you I also tried restarting my phone(s) a couple of times recently when it started exhibiting strange behaviour. In neither case did it work - one phone I gave up on entirely (no internet when on cellular data, but it had other issues and had been planning to retire it). The other I had to dig around to find some obscure option ("reading mode") that had been activated somehow.
Can you or someone else explain what they liked about Microserfs? I read it after hearing a recommendation for it somewhere online, but it didn’t really connect for me.
I’m young enough that I don’t have any personal knowledge of the time period to compare it with, so maybe I’m missing a nostalgia angle.
I don’t doubt that it is a great book, it just didn’t grab me for whatever reason.
I typed in my first BASIC program in '89. I dropped out of high school in the late '90s to program uh… high-quality adult entertainment websites. I had a small e-commerce site for a while selling weird stuff. I eventually finished school and tried to move on, but I got back into programming for a living, which I do to this day.
Twenty some odd years of doing it professionally. For fun. For curiosity. And looking at trying to keep at it for twenty more. Life’s a trip.
I created my first program on punch cards in '79. I can't even remember what the programming language was although I suspect FORTRAN. I wish I'd kept a listing.
Tangentially, my valued copy of The Lord of the Rings that I loved as a child has the date 'Christmas 1973' inside the front cover. I'm going to read it again this Christmas, 50 years later. I've been avoiding anything Tolkien-related since about 2017, when I last re-read it (and noticed the upcoming anniversary), to come to it as fresh as possible, accepting that I know the story backwards.
That was my first thought too - but my second was "there are people who were BORN after the internet who are now adult members of society and we're still managing software projects the exact same braindead broken way we were doing it back then".
My oldest niece (who is my little sister's daughter) graduated college two months ago and wasn't born until after 9/11. I think that was the most recent thing to make me feel old. I'm class of '99 and currently rewatching Buffy the Vampire Slayer, depicting kids who were also class of '99, and I now identify with Giles the school librarian; in fact I'm exactly the same age as Anthony Head was when he took the role. Charisma Carpenter and Nicholas Brendon have now both been over 50 for years, and Alyson Hannigan will be 50 in 7 months.
I do enjoy, Whedon revelations notwithstanding, that BTVS is in some ways evergreen.
I do not enjoy that one of the defining moments of my adult life -- 9/11, which happened when I was 31 -- is now long enough ago that there are adults who do not or literally COULD not remember it. Time marches on.
Isn't it even worse now because we need JIRA tickets that all start with, "as a user," before continuing on to describe something like obtrusive ad placement that a user absolutely does not want?
I think in general projects are managed worse than they were 25 years ago. As the years go by and Agile gets further entrenched, there is more time lost to ceremonies and rituals and less time spent solving real problems that provide value.
Other memoirs by people who worked in computing and became disillusioned or dissatisfied with it:
Joseph Weizenbaum, Computer Power and Human Reason (1976)
Cliff Stoll, Silicon Snake Oil (1995)
Neither is all memoir like Ullman's book -- both are mostly exposition and argument. But both have some passages of memoir, and in both the author's personality shows through clearly throughout.
I haven't read Ullman's memoir, but I'll just put down "Soul of a New Machine" by Tracy Kidder as a peek into hardware history you may find interesting.
> How much of this is impostor syndrome and how much reality? How different are other fields really? I also feel any new programming task as something foreign, for which I am not ready, basically undoable from my position. But at the same time consider myself able to get into anything and everything, and of fixing and building whatever that can be done. I cannot figure out any straight explanation for such senseless duality, nor feel I adopt the same position for anything else in life.
I believe this is a side-effect of the "creative" aspect of computer programming. Creative jobs tend to have impostor syndrome, because there is no piece of paper that you receive that says "this person is 100% certified to be a song writer" or whatever that creative job is. Since software engineering also has no paper certificate, apprenticeship, mastership, etc, there is no proof that you know what you are doing, and it's all a little too loosey-goosey "figure it out for that one job". There's no certainty that you're doing it right.
A Computer Science degree is about as much evidence of you knowing what you're doing as a warranty on a hammer and chisel is evidence that you know how to cut wood joints. Software engineering is a trade, completely distinct from Computer Science. That's why so many people in the industry don't need a degree. You learn the trade on the job.
It's just bizarre that we don't have apprenticeships or trade organizations to ratify someone as a Real Programmer(TM). I can hire someone with 3 years experience working as a programmer and they'll turn out to be almost incompetent. That shouldn't be possible, especially for a job that pays $140,000. But I guess it happens with construction contractors too, so maybe it's not surprising?
I did a two year technical program, it was called "Information Systems and Data Processing"; this was maybe two or three years before this book was published ('94-'96 maybe?). It wasn't an apprentice program but all of the people teaching the programming classes had worked with the language professionally. It made a big impression on me at the time, I think it prepared me (somewhat) for the big empty spaces that you don't really notice until you start working on a project.
It's clearly no replacement for a computer science degree, but I think there's a real benefit to spending time with people who have done the work professionally.
I think the creative and craft element is some of this (maybe most), but it's also often the case that software engineers are producing code for business domains that they are also not experts in. Most code is for business, and most programmers are not trained or experts in those business domains.
It isn't just the lack of a piece of paper, it's the fact every project is new to some extent (to exactly the extent we can't just reuse existing code) and present somewhat new challenges. Also, some difficulties scale very nonlinearly, with single changes turning the challenge from needing a few extra weeks to solve to being effectively insoluble in any reasonable timeframe, if ever. ("Check this program to see if our code standards are being followed" to "Check to see if this program will halt" can sound equivalent to a manager, but woe betide the person who tries on the second one.)
Has anyone ever written a piece of code that's really so difficult for a human to read and figure out whether it halts? Obviously it's not hard to imagine theoretical examples of a program so large nobody could figure it out within a human lifetime, but I'm not entirely convinced for real programs written by humans that checking for haltability is likely to be harder than checking for following coding standards (unless your coding standards consist purely of such narrowly and explicitly defined rules that it's 100% automatable).
Just because a single algorithm that works for all possible programs can't exist doesn't mean that it's necessarily difficult for any (or even most) actual programs humans might write.
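One standard illustration of how small such a program can be, sketched here in Python purely for concreteness (the snippet and its names are hypothetical, not from the thread or the book): a loop that is trivial to check against a style guide, yet proving it halts for every positive input would settle the Collatz conjecture, which is still open.

```python
# Hypothetical example: a short, readable loop whose termination for every
# positive n is equivalent to the Collatz conjecture -- still an open problem.
def collatz_steps(n: int) -> int:
    """Count the steps until n reaches 1 under the 3n+1 rule."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2  # odd: 3n+1, even: halve
        steps += 1
    return steps

if __name__ == "__main__":
    print(collatz_steps(27))  # 111 steps -- easy to run for any one input
```

Running it for any particular input is quick; certifying that it halts on all inputs means settling a conjecture that has resisted proof for decades, which is roughly the gap between "check the style" and "check that it halts" being pointed at above.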
That may depend a little on the CS degree you do/have done!
Showing my age, but a CS degree at Edinburgh University in the 1980s set me up for a good career... No idea what is currently taught.
That said, I recently terminated a contract early as the person was taking days to do simple tasks, whereas another person was showing gumption and being productive in less time, and seemingly with less of a programming background.
Oh, and I learnt a lot from initially working for a software house as my first job (Logica, as it happens). But I did have a base to build on.
Some of the lecturers in CS dept in my era had degrees in psychology or classics...
I assume tech moves on so fast that you have to rewrite the course every year. But you do have some certifications which go some of the way, for example the Sun certifications for Java. At least you know the holder will have an idea of how it fits together under the hood...
Tech moves but often in circles and not in a cumulative way. What I mean is that if you know the basic enduring time-tested principles of algorithms, data structures, tradeoffs, principles like caching, compilers, a bedrock stable programming language or two, you know the concepts of Unix as of 20 years ago, then you are well equipped to be dropped into 2023 or 2026 and get yourself into the context over a few weeks and learn the specifics on the job.
It's not like the framework churn is so important to remember. You can skip a lot of what happened between 10 and 5 years ago. So two things matter: the long term basics and the very recent specific permutations of it.
You definitely can find crappy construction contractors out there. People simply have greater difficulty judging their quality of work, because who knows about construction as long as the house isn't falling down?
Back when syntax was a challenge for me and documentation was generally more scarce (and, perhaps more than I'd like to admit, to this day), I found myself mentally thinking of programming as 'begging a computer to do things.'
Then it goes from begging a computer to do something to cursing the designers of the latest popular framework for doing The Thing in a Slightly Better But Radically Different Way. Oh great, we get to learn how to do this basic Thing for the eighth time; hopefully this time it sticks (it won't).
It's even how we as biological systems work; there's plenty of evidence of old stuff and earlier iterations still within every body and within every cell. This was even an explicit programming paradigm in the '60s and '70s, "extensible programming", driven for example by the designers of languages such as Smalltalk, aiming to enable systems that can grow not unlike living systems.
And it also defined the moral imperative for software to be malleable, for it to be soft and reshapeable. Needs change, and what meets them is not just a brand new thing we slap on top, but something layered onto and extending the past. Whether we can do that, whether users have software that can be adapted to their contemporary needs, is a core liberty. https://malleable.systems
Personal computers started as toys and tools of hackers. I can't see any other way they could have been created. It was definitely without much grand planning.
>The project begins in the programmer’s mind with the beauty of a crystal. The knowledge I am to represent in code seems lovely in its structuredness. For a time, the world is a calm, mathematical place. […] Yes, I understand. Yes, it can be done. Yes, how straightforward. Oh yes. I see. […] Then something happens. The irregularities of human thinking start to emerge.
It's ok, so-called "software development methodologies" solve this problem by making sure that the project begins in the programmer's mind in just as chaotic and irregular a state as it will eventually end up once the code has been written.
You and the person responding to you are both right. If you don't know Ellen Ullman or Close to the Machine: Technophilia and its discontents, it's kind of sideways.
He does mention Ellen and says
> The book is a memoir about her life
but never explicitly connects the title of the article to the title of the book or explains this is a review of that specific book (though that section is titled "Review.")
So I understand your confusion. I put "Ellen Ullman" in google and that quickly got me to Goodreads.com and seeing a list of her books, including this title. A little mystery left for us to solve. But the author could've also chosen to do a better job of explaining what they were doing here!
Right. I cross-posted without thinking about this, thanks for pointing it out. As you have figured out already the linked post is a short collection of notes on Ellen's book: https://www.goodreads.com/en/book/show/486625
The title of the blog post (which is different from the title of the HN submission) is the book title (I found out after googling the reviewed author).
I read the article, and I see nowhere that the author states that the title of their review is the title of the book. However, whatever was being reviewed seemed interesting enough to ask here. So, except for your arrogant way of telling me so, I am thankful that I now know, so that I can look up the book.
FWIW - I found "The Bug", by Ellen Ullman, to be a really engrossing piece of fiction to read. I wasn't aware she'd authored other works. I'll have to check them out!
The Wikipedia article for Ellen doesn't mention Jeffrey Ullman by name. Is there a reason people prefer not to refer to him, which would explain the coyness, or was it some other computer-science Ullman who adopted Ellen? It does state she was raised in a computer science family.
People deserve to stand on their own two legs, and if it's "not in the shadow, shines by herself", I can get behind that. Or indeed "no, it's just a coincidence, the adoptive parent isn't Jeffrey and his partner".
(I too was raised in a computer science family, 3 generations now. In my own case, regression tended to the mean but I have high hopes of the next one.)
I read this book years ago. One part that I still think about from time to time is how the author was approached for a job that entailed maintaining an ancient mainframe. Despite the pay being so high, the mainframe and its model were dying. Both she and the employer knew this. The author turned the job down just because it seemed like it would have been an unfulfilling and depressing endeavor, despite the high pay. It reminds me that sometimes a line must be drawn between pay and self-fulfillment.
I haven't read the book, but the quotes do really read as timeless.
I was in my early teens during dot-com era, but without a PC yet and still an "internet virgin".
One question that really tickles my curiosity is how much the spoils of the dot-com era shaped the direction of the free software in the next decade?
When looking at (the aged) projects under Apache umbrella, it seems a lot of them kicked off around that time. Is that because of the dot-com money or because Java graduated to a different stage around that time?
I encountered Close to the Machine by chance in my university's library, many years ago. I flipped through the first few pages and was quickly hooked. Definitely a great read.
"The dumbing-down of programming"