Programming Is Forgetting: Toward a New Hacker Ethic (opentranscripts.org)
224 points by Espressosaurus on Dec 18, 2016 | hide | past | favorite | 126 comments



The audio "loss" example sounds plausible in passing (and the diagram looks plausible) but is actually incorrect. The frequency and timing content of analog audio below the Nyquist frequency is preserved perfectly by sampling, which in practice, for the CD/DVD use case, covers the full spectrum of the human ear. This counterintuitive result is explored in some detail in [1].

It is true that the amplitude dimension (only) is quantized, typically to 16 or 24 bits, which you could detect with a very good oscilloscope. However, a 24-bit step is far smaller than anything a human ear can discern. Visually, it is like looking at two stacks of dollar bills that are 6,000 feet high and trying to discern which one has one extra bill.

I suppose that is technically "lossy", but the only thing we are "forgetting" is something no human could perceive, or remember.

[1] https://people.xiph.org/~xiphmont/demo/neil-young.html
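For the curious, the rough magnitude here is easy to reproduce numerically. A minimal sketch (assuming numpy; the 997 Hz test tone and 0.9 amplitude are arbitrary choices) measuring the signal-to-noise ratio of plain 16-bit amplitude quantization:

```python
import numpy as np

fs, n = 44100, 1 << 16
t = np.arange(n) / fs
x = 0.9 * np.sin(2 * np.pi * 997 * t)   # near-full-scale analog test tone

q = np.round(x * 32767) / 32767         # 16-bit amplitude quantization
err = q - x                             # the only thing that is "lost"

snr_db = 10 * np.log10(np.mean(x**2) / np.mean(err**2))
print(f"16-bit quantization SNR: {snr_db:.1f} dB")   # roughly 97 dB
```

The quantization error sits around 97 dB below the signal, in line with the 6.02 dB/bit rule of thumb and far below anything audible.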


It's still lossy! Maybe it isn't lossy to the human ear, but what is the data going to be used for? There are assumptions you're building in here. What if I'm now interested in using the data for a bat ear model? All of a sudden, that information is now of limited use.

She's saying "be aware of your assumptions." Her entire screed is a call to take a step back, and recognize that programming is reifying assumptions and systems of control. Be aware of what those happen to be. Make those choices conscious, rather than unconscious.

When I'm creating a UI, am I assuming the viewer has 20/20 corrected vision on a 1920x1080 monitor with the full set of rods and cones? Am I considering folks that are colorblind, might need to have different zoom levels, or might use screen readers?

When I'm creating a tool for data analysis, am I making it for other programmers? Or can I maybe widen its usage to the business-analysis side, thereby making the tool more useful to more people? When I change a tool, is it breaking someone else's workflow?


> and the diagram looks plausible

I've seen this and similar diagrams used over and over again by a lot of people who should know better. This kind of diagram can be interpreted two ways:

(a) the diagram maker has no idea how sampling works, or

(b) the diagram maker made it to illustrate a pathological case to make quantization noise obvious; that is, the diagram shows fs=infty with a four-level / two-bit ADC and no dithering.

In any case, it has nothing to do with how digitizing audio works.

> It is true that the amplitude dimension (only) is quantized to (typically, 16 or 24) bits, which you could detect with a very good oscilloscope.

Nope, normal scopes work with 8-bit ADCs and manage 10 to 12 bits in ERES and similar modes (... they don't really get to 10- or 12-bit SFDR though ...). Some special scopes have 16-bit ADCs, but you'd still be hard pressed to detect the difference.

Also, while again a "counterintuitive result" in a certain sense, the dynamic range of a 16-bit audio signal is greater than 96 dB; that is, you can encode and actually discern noises below -96 dBFS. This is because the 96 dB figure assumes a white dithering signal, which is not used.

Of course, all that doesn't really matter with pop music. Who needs >100 dB SNR and DNR if the piece you are encoding only has 10 dB DNR anyway?!
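The below-the-LSB point is easy to demo numerically too. A rough sketch (assuming numpy, and plain TPDF dither rather than the shaped dither real mastering uses): a 1 kHz tone at -100 dBFS rounds away to exact silence under bare 16-bit quantization, but survives when dithered.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 48000, 1 << 18
t = np.arange(n) / fs
ref = np.sin(2 * np.pi * 1000 * t)
tone = 10 ** (-100 / 20) * ref          # -100 dBFS: about a third of one 16-bit LSB

def quant16(x, dither):
    # TPDF dither: difference of two uniforms, triangular, +/-1 LSB
    d = (rng.random(n) - rng.random(n)) if dither else 0.0
    return np.round(x * 32767 + d) / 32767

results = {}
for dither in (False, True):
    q = quant16(tone, dither)
    # lock-in estimate of how much of the tone survived quantization
    results[dither] = 2 * np.dot(q, ref) / n
    print(f"dither={dither}: recovered amplitude = {results[dither]:.2e}")
```

Without dither every sample rounds to zero, so the tone is gone entirely; with dither the lock-in estimate comes back at about 1e-5, i.e. the -100 dBFS tone is still encoded, just buried in the dither noise.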


> The audio "loss" example sounds plausible in passing

Not really. Sure, your debunking is sound, but we really don't need to appeal to Nyquist for this one. Just consider the alternatives: analog media? It begins rotting the moment it's recorded. Some abstract representation? Fine, until you forget how to interpret it. Digitizing is the most robust means we've yet invented to forestall "forgetting": it enables precise and efficient replication of audio on myriad forms of media, now and in the future.

The Unicode case is also naive. Every language suffers change as new speakers/writers and new representations appear; that isn't a feature specific to programming or computing at all. On the other hand, thousands of symbols from hundreds of obscure languages are being permanently preserved for posterity in Unicode; how is this "forgetting?"


The argument is not that "forgetting" is fundamentally bad - it's that the choice of what to leave out can be meaningful. Han unification removes some amount of distinction between things that are meaningfully different and instead relies on additional metadata to reconstruct that. What are the wider social consequences of that? I don't know, but it's not clear that those involved in making the decision do either.

The fundamental point here is that hacker culture has often made decisions without considering the effect they have on non-hackers (or even hackers of different backgrounds), and as a result those decisions may result in abstractions that "forget" meaningful data. Uncompressed digitisation of audio is a case where it's unlikely that the difference is important in any way, but there are plenty of examples given where it is. The suggestion that having more information can help us make better decisions shouldn't be controversial.


By the way, people seem to think Han unification was forced on CJK users by evil white people from Unicode (there was an article like this in modelviewculture once), but it was contributed by the relevant Asian governments.

And of course China already made much larger changes in real life by creating Simplified Chinese.


> Just consider the alternatives; analog media?

Now you're offtopic altogether, talking about different possible types of potential loss....

> Every language suffers change as new speakers/writers and new representations appear

That was the point (to wit, the whole discussion about transcription was illustrative).


The idea of Han unification doesn't seem right to me. The Cyrillic alphabet, for example, got its own set of codes, even though some letters resemble Latin letters. But even though those letters look the same, they sometimes sound different and mean different things.

I suppose somebody came up with this idea back then when they were trying to fit everything into 16 bits.
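The contrast is easy to see in the code charts themselves. A quick illustration (Python; U+76F4 is just one commonly cited example of a unified ideograph whose preferred glyph differs between Chinese and Japanese typography):

```python
# Look-alike Latin and Cyrillic letters get distinct code points...
print(hex(ord('A')))    # 0x41   LATIN CAPITAL LETTER A
print(hex(ord('А')))    # 0x410  CYRILLIC CAPITAL LETTER A

# ...while a Han-unified ideograph is a single code point shared across
# Chinese, Japanese, and Korean text; the font/locale decides its shape.
print(hex(ord('直')))   # 0x76f4
```

So Cyrillic look-alikes were disunified while CJK variants were unified, which is exactly the asymmetry being questioned here.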


I think the speaker knows that.

The point is that something is still lost. Maybe this is trivial, maybe not; in making that decision you are making assumptions about why recordings are made and how they will be used. Those assumptions may be correct, but they remain assumptions.


The quantization of amplitude isn't quite that straightforward though.

Usually some form of dithering is involved to transform any quantization errors into a smooth noise.

See http://www.audiocheck.net/audiotests_dithering.php

And I'm pretty sure a DAC isn't supposed to output jagged waveform, no matter how high the resolution.
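A small sketch of what dithering buys you (assuming numpy; the 8-bit depth and very quiet tone are chosen to exaggerate the effect): without dither the quantization error is correlated with the signal and piles up at discrete harmonics, i.e. distortion; with TPDF dither it spreads out into smooth broadband noise.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 48000, 1 << 16
t = np.arange(n) / fs
x = 0.01 * np.sin(2 * np.pi * 1000 * t)   # a quiet tone: ~1.3 LSB at 8 bits

def quant8(sig, dither):
    # TPDF dither: difference of two uniforms, triangular, +/-1 LSB
    d = (rng.random(n) - rng.random(n)) if dither else 0.0
    return np.round(sig * 127 + d) / 127

ratios = {}
for dither in (False, True):
    err = quant8(x, dither) - x
    spec = np.abs(np.fft.rfft(err * np.hanning(n)))
    # how concentrated the error spectrum is: tonal distortion vs. flat noise
    ratios[dither] = spec[1:].max() / spec[1:].mean()
    print(f"dither={dither}: error-spectrum peak/mean = {ratios[dither]:.0f}")
```

Undithered, the error spectrum is sharply peaked at harmonics of the tone; dithered, the peak/mean ratio collapses toward that of flat noise, which is exactly the "smooth noise" the linked page demonstrates by ear.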


Find me an example of a band-limited, finite-time signal in the real world, please. _Technically_ the example is correct (although the diagram is not).

Any time you digitise a signal, what information you are willing to lose is the first question you ask. The answer is different depending on the signal and what you need it for. That's what is being brought to attention. Not the fact that all the information in the signal anyone would care to represent is representable.


I really enjoyed this essay, even though I think I disagree (at least partially) with its primary conclusion. I especially liked the discussion of the map/territory problem and its application to programming.

I think I even agree partially that the most literal version of the hacker ethic is a bit flawed. But I think there is still a lot of value there, even if Levy's book didn't necessarily show it in the best light, or deal equitably with all of his characters.

Here I'd say: the map is not the territory. Levy's book is not the hacker ethic, even though he may have made an attempt to map it. The examples he cites may or may not meet the criteria that he himself specified in all cases, but that says more about his choice of example than it does about the ideas themselves.

I think there's a way of understanding these principles that comports well with a, shall we say, more mature view of the world than perhaps some of us had growing up on them:

> Access to computers should be unlimited and total.

I think this one stands on its own. Access should be unlimited and total. For everyone. As the author rightly pointed out, the 'hackers' denied access to Margaret Hamilton. That makes this event a bad example of the hacker ethic, not a good one. But I think the principle fits rather well with the philosophy of the author, provided the definition of 'access' is properly expanded and contextualized.

> All information should be free.

To me, this is more ideal than implementation. All information ought to be free, in an ideal world. Where possible that information should be free, and we should be working towards a world where it is more so. However, information should never be more free than the maturity and tolerance of the society it's embedded within permits at any given time. As free as possible, but no freer, to put it on a bumper sticker.

> Mistrust authority—promote decentralization.

I think this is an unequivocal good not limited to the hacker ethic. Authority should be mistrusted. Decentralization should be promoted where we can avoid compromising the other ideals (e.g. access).


Even your modified interpretation has dangerous side effects. We can probably modify the access-to-computers line to cover only computing hardware you own, and that probably works, but the other two lines you comment on are often in opposition.

If all information is free, then it's also freely accessible to authority. Authority can do a lot of things with information that few hackers would like: read up on what full transparency leads to in a place like Bridgewater. Once it's clear that we don't want authority to have free access to our information, it's clear that the 'all information should be free' line is not an ideal, it's just hypocritical. What it really means is that hackers should have access to all the information they want, but they should be able to hide all the information they need from authority. Enough of that, and now you have a new authority, which happens to be the people that have access to the information.

All information should be free is a phrase that gives developers permission to create databases that are better at predicting election results than pollsters. It's what allows people to tell if someone is pregnant or has an undisclosed mental illness, to list everyone's favorite kinks, where their cellphone has been since it was purchased, and everything they've bought on the internet in the last 10 years. When it comes to information, quantity has a quality all its own, and I don't think that hackers in the 70s had really internalized this.

This breakdown on the hacker ethos builds two camps: One that finds privacy more important, and doesn't really want information to be free, and those that put the freedom of information first, and are happy to hand tremendous guns to authorities, whether they are governments or silicon valley corporations.

And this is why the hacker ethic breaks, and we should be careful about the future. There are a lot of uses for all that centralized, free information: a lot of things become very convenient, computer systems are capable of pretty much reading our minds, and fraud-detection algorithms become so very good that many things become cheaper. Unfortunately the same tools can be used for far less friendly purposes.

This is a question we'll all be wrestling with for years, as choosing privacy would make most tech giants collapse, while choosing information freedom is just waiting for the right kind of authority to show us that East Germany's Stasi was just a small scale exercise.


> Even your modified interpretation has dangerous side effects. We can probably modify the access-to-computers line to cover only computing hardware you own, and that probably works, but the other two lines you comment on are often in opposition.

I wouldn't even make that modification. Illegally accessing computer hardware in pursuit of a higher or better purpose can be good, provided that purpose avoids harm to others.

Of course, people playing God with their own interpretations of when that standard is met can lead to disastrous results. But sometimes it's ok to break something or break into something to learn about it. It's a thin line, but it is there.

A Chinese hacker poking holes in the great firewall, for instance. I'd say that's a good thing, even though they do not own the hardware.

> If all information is free, then it's also freely accessible to authority. Authority can do a lot of things with information that few hackers would like: read up on what full transparency leads to in a place like Bridgewater. Once it's clear that we don't want authority to have free access to our information, it's clear that the 'all information should be free' line is not an ideal, it's just hypocritical. What it really means is that hackers should have access to all the information they want, but they should be able to hide all the information they need from authority. Enough of that, and now you have a new authority, which happens to be the people that have access to the information.

Ya, I couldn't agree more. That's why I added the proviso that information wants to be free only in the ideal case. It is aspirational, not pragmatic. We want to live in a world where all information can be free, and we should all work towards that. That does not, however, mean going around 'liberating' information arbitrarily. And it also does not mean that, even in our most progressive societies, we are ready for all information to be free.

> This breakdown on the hacker ethos builds two camps: One that finds privacy more important, and doesn't really want information to be free, and those that put the freedom of information first, and are happy to hand tremendous guns to authorities, whether they are governments or silicon valley corporations.

Ya, you have no argument from me there. I do not think all information should be free in the actual world that we live in. But I do think it's an interesting way of modeling an ideal world. Underneath all forms of necessary secrecy lie blights of some kind. Weapons are secret for war, sexual preferences can be secrets due to shame or even criminalization, etc. Sometimes that secrecy is justified by the faults of the world, and sometimes it is used to further the interests of an individual or group. In the latter case we should strive to reveal it, and in the former to address the justification for that secrecy.


> information should never be more free than the maturity and tolerance of the society its embedded within permits at any given time.

I'm not sure what you're talking about here. It sounds like you're justifying censorship, and I'm not sure why.


Some forms of secrecy are good, in certain contexts. A homosexual hiding his or her sexuality in Saudi Arabia, for instance. That is the sort of openness commensurate with the maturity of your society I was referring to.

Ideally, no one should have to hide anything about themselves. But we unfortunately aren't quite there yet.


Ideally, people would be able to do what they want without having to hide or proclaim it.

My sincerest desire has always been to just be left the hell alone.


> Some forms of secrecy are good, in certain contexts

I disagree. Because my ethic says that secrecy always benefits those of ill-intent. I must not make exceptions for those of goodwill, because exceptions disprove the rule. I am not wary of those of other intent.

An ethic doesn't need to be practical. Subscription is not required for it to be virtuous.


> because exceptions disprove the rule.

You may be aware (since you turned it around), but the phrase "the exception proves the rule" is meant to provoke thought about the rule in question, not announce it as broken. This is so because the meaning of "prove" in that phrase is "test", as noted in https://www.merriam-webster.com/dictionary/prove at 2.a.


No offense, but your deflection into another topic of debate (that you seem to care about) has no bearing on my ethics. You completely ignored the point, which was the issue of ethics vs practicality that you seem to misunderstand.


Sorry perhaps I wasn't clear. Maybe I should have said that some forms of secrecy are necessary or justified in certain contexts.

Surely you don't believe that homosexuals living in countries where they might be put to death ought to be outed against their will?


> Surely you don't believe

An ethic doesn't need to be practical (just to stay on the point I was trying to make). An ethic is a goal for some value of "preferred individual behavior", not a life choice. Subscription to an ethic is not required for the virtue of the ethic to exist, e.g. If I believe that secrecy is bad, I may still practice it for pragmatic reasons, but this does not change the ethic as an ideal model.


You should maybe reread my original post, as that is exactly what I said :p.


Indeed.


While it was an interesting read and definitely had several good points, I don't agree this is a "new" hacker ethic.

It bears no semblance to the original, and is of a fundamentally different nature.

The original is a short and simple declarative list of principles towards a goal.

This is a looooong list of ambiguous, wishy-washy questions whose intent is unclear and with little relation to the original.

Trying to brand this as a "new" hacker ethos is just abusing the recognition of the original to push your own (completely different) ideas and agenda.

I'm not going to call it entirely disingenuous, but it does come off with a bad smell of politics.

Count me out.


> Trying to brand this as a "new" hacker ethos is just abusing the recognition of the original to push your own (completely different) ideas and agenda. I'm not going to call it entirely disingenuous, but it does come off with a bad smell of politics.

You didn't say this (and I'm not asserting that you believe it), but it's easy to read this statement and have the feeling that the original "hacker ethos" was devoid of politics or trying to push an agenda. But it was totally politics and it was totally trying to push an agenda.

One may prefer the original to this new ethos, but anyone claiming to speak for a large group of people is engaging in politics. To borrow an idea from the essay, you are forgetting the diversity of that group's opinions when you claim it can be boiled down into short pithy statements.


It's a speaker talking about the ethos they'd like to see the hacker community adopt. If you want to call that "pushing an agenda" or "disingenuous" then it would help if you'd clarify what you thought the speaker said they were going to do. They were up front about the fact that they were discussing the values they think are important to adopt – that's exactly what the talk is about!

The talk is a direct response to the "old" hacker ethos. Each point addresses the same topic as the original, so there's a pretty strong relation. It's hard to "abuse the recognition of the original" when you're making a point by point response.

And, those values and approaches the speaker describes in that response are different from the ones in the original. So in comparison to that old version, this is new.

In short, it's completely unclear to me where your criticisms are coming from.


A new version of something existing should show clearly where it came from. It should represent an iteration.

The original represented a celebration of technology and its possibilities in competent and eager hands, when unconstrained. The new version is all about compromise and politics. It's political first, and only incidentally technological.

The original was short and concrete. This spans 20 pages and makes zero concrete principles.

The author misrepresents the simplicity of the original Hacker ethos as an absolutism, then moves on to show how absolutism has flaws. Shocking, eh?

I could go on, but I think my stance should be fairly obvious by now.


The original was a book! Yes, there was a nice concise list in that book, but this article is a response to more than just that list.

You seem to be thinking that "hacker ethos" means "Hacker Ethos™ by Levy" when the speaker is using it as... the word "ethos". There's no need for a response to be an iteration when there's just a fundamental disagreement about what ethics we should pursue.


> It's political first, and only incidentally technological.

"Information should be free" and "mistrust authority" are both obviously political statements, and most of the others are at least somewhat political in nature. That you don't see them as political is a reflection of you, not of some timeless pure technological nature inherent in them.


> The original is a short and simple declarative list of principles towards a goal.

What was that goal? Could it be ascertained from the "short and simple list" alone, or is it only evident in conjunction with a collection of cultural assumptions?


This is a great talk, and I loved the discussion and criticism of the roots of hacker culture, and also what the author describes as "logical positivism." You can look at something like Bitcoin as a textbook example. The thinking was that by making a simplified model of money, money can be replaced, and the government (authority) circumvented. In reality, money is not just numbers in a ledger, and when you take away one authority you institute others.

That said, mistrusting authorities such as governments is a valid and valuable position, as is standing up for the right to tinker with hardware on principle. Some of the bullet points that are bulldozed in this talk seem to have a philosophical kernel that is sound, if controversial or subversive. It's pretty interesting to me that computer tinkerers in the 70s, before personal computers, couldn't own a computer or take one apart. Is that the lack of "access" the ethic decries? Properly contextualized, is this point just about the inherent value of freedom to tinker? Or is "access should be total" a juvenile rationalization for breaking into computer systems, and "information should be free" an excuse to steal or expose secrets? (And is this ambiguity real or an artifact of Levy's formulation?)

I would separate "philosophy" from "culture." If in a particular subculture like hacker culture, "all that matters is how good you are," this is equalizing in a way -- for example, a man and woman with equal skills will have equal status -- but it can also lead to phenomena such as arrogant rockstars and frequent pissing contests. And what about the people who aren't so good, how are they supposed to feel? So this "ethic" can generate a traditionally male sort of culture that is off-putting or toxic to many people, male and female. The fact is that creating a diverse and inclusive community is its own value and doesn't follow from anything about hacking.


> You can look at something like Bitcoin as a textbook example. The thinking was that by making a simplified model of money, money can be replaced, and the government (authority) circumvented. In reality, money is not just numbers in a ledger, and when you take away one authority you institute others.

Do we believe that the author of bitcoin has any ability to control Bitcoin other than releasing it or any way to compel adoption of bitcoin beyond advocating for it the way any of us could advocate for it?

If they are lacking those abilities they have just provided an alternative which dilutes the authority of traditional money without adding any authority themselves.

I may not understand where you are coming from, if so I apologize.


The rules and structure of bitcoin enshrine the values and ideals of a certain culture, and enforce their traditions with ironclad rules. It has just as much "authority" as any government mandated currency, though it might not seem that way if you feel more favorably about the culture surrounding bitcoin vs the standing political structure.


Bitcoin the community and Bitcoin the software have rules and authority. "Just as much" is an empirical claim that I disagree with. I don't think Coinbase or Gavin Andresen have the same kind of authority as the Venezuelan president. It's a set of narrower, diffuse authorities, rather than a central authority.

But even that is just one branch of Bitcoin. What makes Bitcoin truly anti-authoritarian is that even if some body of power controls the dominant branch, a group of people can simply fork it, and run their own currency completely without the blessing of Gavin or Coinbase or anyone.

That doesn't give them all the benefits of holding mainline Bitcoin, but it gives them some benefits. But anarchism never said everyone can have all the power, it says everyone should have the power to do what they want amongst themselves. It's freedom from interference, not guaranteed access.

Ethereum Classic is a fantastic example of this. As is the proliferation of altcoins, many of which are just Bitcoin forks, and in that sense are Bitcoin.


I think Levy's take on hacker ethic was a bit phony to begin with, typical of journalists who try to describe the spirit of something without practicing it. All information should be free, really? Like your root password? I much prefer the version in the Jargon File:

"A person who enjoys exploring the details of programmable systems and how to stretch their capabilities, as opposed to most users, who prefer to learn only the minimum necessary."

Basically a hacker is someone who enjoys hacking for its own sake, not as part of some moral crusade. The OP is proposing another moral crusade which is just as irrelevant. Whether you like it or not, most discoveries will be made by those who enjoy discovery, not by those who view it as a means to reach utopia.


> All information should be free, really? Like your root password?

A password isn't information, especially in this context, it's data. How the login or encryption system works is information. Of all the misconceptions about hackers this is one of the weirdest to criticize. From the introduction of computers until the mass adoption of the Internet, access to information was the largest issue in hacker culture. From BBSes to hacking groups and computer clubs, it was all about access.

> Basically a hacker is someone who enjoys hacking for its own sake

Not only isn't that what your quote says, it's essentially just the definition of an enthusiast. Hackers are, even in the most watered down definitions, something different. Doing something for fun is part of hacker culture, but it's not the purpose of it. It's things like curiosity, exploration, exchange of ideas etc.

> Whether you like it or not, most discoveries will be made by those who enjoy discovery, not by those who view it as a means to reach utopia.

Rarely. Discoveries (at least "common" ones) tend to be made by people who see a future other people don't. One could argue that people like Elon Musk are very much on a moral crusade and that their motivations are utopian.

I don't really care who or what calls itself "hacker" these days. But if you don't recognize that hacker culture, or the future of technology in general, is something more, you're missing out.


While I get your point, I don't think data is clearly not information, it's just that the whole point of passwords is to be secret.


Information and data are entirely about the intent of them. The same number can mean a movie file or a math equation in different interpretations, yet still have the same physical representation.

The difference between passwords, DRM, and proprietary software is then the intent. They all serve the same function - secrecy - but for different purposes. The secret password is meant to provide security to the holder in regards to the information they hold. Proprietary software and DRM are meant to provide secrecy to the rights holder in regards to software other people hold.
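The "same physical representation, different interpretations" point can be made concrete in a few lines with Python's struct module (the byte string here is an arbitrary example):

```python
import struct

raw = b'\x00\x00\x80\x3f'                  # one physical representation...
as_int = struct.unpack('<i', raw)[0]       # ...read as a little-endian int32
as_float = struct.unpack('<f', raw)[0]     # ...read as a little-endian float32

print(as_int, as_float)   # 1065353216 1.0
```

Identical bytes, two different meanings; which one is "the" information is entirely a matter of the reader's intent.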


>All information should be free, really? Like your root password?

No, that sounds right. Didn't you hear about RMS's famous anti-password crusade when passwords were added to ITS?

Also, ITS had no root, or even permissions: you could log in as any user and do anything. Except browse the directory where the ZORK sources were kept, after Blank et al. hacked in permissions for that directory (and even then, some DEC engineers patched the system to subvert it and grabbed a copy of the source, translating it to FORTRAN and giving us DUNGEON). The Zork developers figured that if they went to that much effort, they deserved the source.


Fair enough. Though even RMS argues today that real names and personal data of online users shouldn't be accessible to everyone.


I imagine that's a bit different. The ability to bend the system to your will, make the choices you need to make, and have it behave as you desire, regardless of what the people who were at the keyboard before you think, is empowering, and conducive to hacking. Having all personal data available to anyone in a mass database... not so much.


> The OP is proposing another moral crusade which is just as irrelevant.

"Have some thought about the people your decisions affect" is a moral crusade?


I think it is, considering that hackers traditionally tended to value hacking above certain other mundane, authority-governed pursuits, as noted by the OP, and now she wants them to care.


It does seem that the author thinks there's a moral component to the subject. But you may want to dig into the word "crusade" a bit more. It's over the top here.


Richard Stallman actually had that viewpoint toward passwords:

https://en.wikipedia.org/wiki/Richard_Stallman

Stallman found a way to decrypt the passwords and sent users messages containing their decoded password, with a suggestion to change it to the empty string (that is, no password) instead, to re-enable anonymous access to the systems. Around 20% of the users followed his advice at the time, although passwords ultimately prevailed. Stallman boasted of the success of his campaign for many years afterward.[17]


I seem to recall that his account on the FSF ftp is still blank.


> most discoveries will be made by those who enjoy discovery, not by those who view it as a means to reach utopia.

This contradicts the common adage "necessity is the mother of invention". I am building software every day to try to solve problems in my neighborhood... you seem to be saying people like me rarely invent anything, but coders who just do projects on the weekend for fun do. Why?


> Whether you like it or not, most discoveries will be made by those who enjoy discovery, not by those who view it as a means to reach utopia.

I like this. A lot!


Somewhat related: all things that have already been discovered will be subverted to reach utopia, because those who enjoy discovery don't enjoy them anymore, while utopia is a Big Thing™ that will be delivered Any Time Soon Now™ (this time for sure), so it needs all the resources it can use.

And the OP likes this a lot :)


This is entryism at its finest. It feels like an attempt to subvert an already existing culture to another ideology.


In order to be entryism, the speaker would have to be an outsider, right? But that's not the case here. It's a member of the community speaking up about the values they care about.

Presumably, that care is genuine, right? Or do you have some reason to suspect the speaker is being disingenuous?


> In order to be entryism, the speaker would have to be an outsider, right? It's a member of the community speaking up about the values they care about.

Hmm, what are the author's membership credentials besides being in awe of the great hackers of a bygone era? I mean, this is the kind of impression I got from reading TFA.


Well for one thing, as they say in TFA, they're a programmer and programming teacher.

What are any of our membership credentials aside from being people who program?


> What are any of our membership credentials aside from being people who program?

Haha, nice provocation. I'll try though: some people will say it's pwning boxes, others will say it's writing open source software (others still - free software ;)). Many will agree it's about finding novel and unintended uses of technology, for good or for bad. TFA itself mentions distrust of authority, meritocracy, seeking beauty in technology being quoted as part of the Hacker Ethic™.

And I think most people who call themselves hackers would agree it's not about having a boring job as a Java developer churning out repetitive proprietary code for some bank 20 months past deadline, so your definition is probably out ;)

Also, "hardware hacking" seems to be a thing and it need not involve programming. Some will accept social engineering too - maybe the closest to what the author seems to be doing ;)


From the transcription:

>I’m not claiming to be a hacker or to speak on behalf of hackers.


You're suggesting a different use of "hacker" than the speaker was there, which will be apparent if you look at the whole quote:

> It’s a privilege to be able to be called a hacker, and it’s reserved for the highest few. And to be honest, I personally could take or leave the term. I’m not claiming to be a hacker or to speak on behalf of hackers. But what I want to do is I want to foster a technology culture in which a high value is placed on understanding and being explicit about your biases about what you’re leaving out, so that computers are used to bring out the richness of the world instead of forcibly overwriting it.

That is, the speaker isn't claiming to be a member of the "highest few". But as a member of the community to which "hacker" is ever applied – programmers – it's legitimate to address what qualifies someone for the honor.


You seem to be confusing "hacker" with "10x developer".


No, the speaker seems to be confusing those things. But if that's the way the speaker is using terminology, it doesn't make sense to base our criticisms on an entirely different linguistic platform.


Entryism is the practice of infiltrating an organisation/group to gain trust then using a variety of tactics to try and subvert the politics/ideological premise of the group. Also misdirecting resources to support the subversive ideology and efforts. And ultimately trying to destroy the group and capture the membership into the subversive organisation that was behind the entryism in the first place.

What I gather from the piece and the reactions here point to a fairly clear example of entryism.


What evidence do you have of infiltration beyond the fact that you don't like what the speaker said?

It's a pretty extraordinary charge that someone would become a computer programmer and become a teacher of programming purely in order to give a conference talk that seems to suggest, at root, that we consider how our programs treat people. What a revolutionary!


The author is trying to replace one system of thought with another; that's a fact. I think it's baseless to assume whether the author's intention was to pursue this motive from the start or to suggest improvements after being a part of the community for a period of time.


It's hard to reconcile lines like "was born in 1981, and as a young computer enthusiast I quickly became aware of my unfortunate place in the history of computing. I thought I was born in the wrong time. I would think to myself I was born long after the glory days of hacking and the computer revolution." with entryism.


[flagged]


I was born in the 70s, and when I was a young computer enthusiast my reading experience was similar: there was a glory era of hacking and experimentation and lisp machines and fun, but now it was over and we had to live in this era of professionalism and msdos and you couldn't phreak a phone anymore.

Then I got to college and discovered the internet just as the web was getting started and I started to realize that even though that old era was gone forever, here's a thing that's new and exciting and at least as world-changing.

I still see that 'wow those old timey hackers were really something, and we'll never have the like again' mindset floating around the internet, even here on HN. (For example, people talking about how great lisp machines were or how great smalltalk environments were and how it's such a tragedy that we'll never see their like again).


There is a similar dynamic with first wave Linux people who love the decentralized community oriented 1990's and 2000's that have been reshaped by the 2010's which have been highly centralizing and corporate driven with systemd, containerization etc.

Are the 2010's going to be looked back at in 30 years as a horrible decade that destroyed the ambitions of thousands of 12 year old female SysVinit scripting wizkids? It's possible, but it seems highly pedantic and silly to think so.

Projecting our current arguments onto history usually ends up as colorful revisionism that is historically inaccurate and politicized. I think people are quite rightly skeptical of such attempts. The tactic is used endlessly in politics, which is certainly not the bastion of accurate factual argument and reasoned discourse.


I'd wager that management denying open access to hackers was more detrimental to Ms. Hamilton's exercises than the activities of said hackers. A good hacker would do root cause analysis instead of picking the most convenient victim to further some personal odyssey.


I think Parrish misunderstands a key component of the hacker ethic, which is that the "richness of the world" is overrated and the structured logic of computer systems is something we should strive for.


"overrated" is kind of a garbage descriptor. All I have to do to make something overrated is rate it highly.

I suspect you have more concrete descriptors for what you are complaining about, I would be more interested in reading those.


Okay.


I found it interesting that her big counter-example to Levy's hacker ethic concerned the modification of hardware. And shared hardware at that. There is a difference of kind rather than degree when it comes to modifying software because software never needs to be shared. That is, you can copy data, modify it to your heart's content, and never affect the original owner of the data.

This does bring up a bunch of interesting points about shared software and data, like who owns it. But one can imagine a world where all software is open, which means anyone is trivially able to hang a shingle and start a new node, but the only really "owned" stuff is the data users have put into it.


Programming is forgetting because forgetting is abstraction. On the other hand, abstraction is not always about forgetting, and programming is not always about the real world[1].

But that's always been part of the hacker ethic I've seen, to the extent it exists at all.

Really, I agreed with a lot of the criticism in the article. There are points definitely worth consideration and reflection. But it also had a couple of peculiarities I've seen again and again in discussions on such topics (whether they're criticisms of programming or some other discipline or science as a whole). It should be considered carefully—dare I say critically?

The first problem is that it, by necessity, doesn't treat "the hacker ethic" with the same nuance as its own ideas. That's mostly fine—a natural consequence of constrained time and space and effort—but it's a bit concerning if a core part of the argument is that the hacker ethic is insufficiently nuanced. You could say it's a response solely to Levy's book, but its conclusions try to apply to hacker culture as a whole. It's a bit like criticizing science (or scientism) based solely on Neil deGrasse Tyson's hilariously simplistic "Rationalia"[2]. The discussion is still valuable, but not nearly as broad as it tries to look.

The other problem is even more classic: the criticism is relevant and natural, but the alternative solution pulled in is arbitrary and does not flow naturally from the criticism. Her suggested alternative questions pull in a lot of baggage unrelated to the shortcomings of the "hacker ethic" that just reflects her particular philosophical abstractions for understanding social systems, power and morality—abstractions that are by no means universal. For example, the focus on "labor" immediately stood out: why single out labor of all things? There is an answer to that question of course, but that's not the point: the point is that the answer to that question does not naturally emerge from the perfectly reasonable criticism in the rest of the talk. The same about most of the other things she wishes to change—or not to change—in the hacker philosophy.

As an illustration, it's not hard to imagine somebody with totally different views agreeing fully with the problems outlined in the body of the talk, but coming up with a totally different replacement for the "hacker ethic". Think about a hardcore utilitarian, or somebody with a strongly individualistic bent, or a full-on anarchist, or whatever else. Or perhaps just imagine the smallest change needed to address the problems of Levy's hacker ethic while keeping the "hacker spirit" however you see it. All those could come up with exactly the same problems in the first part of the talk, but end up with totally different conclusions.

My point, ultimately, is not to address those conclusions in particular. Rather, I just want to point out the disconnect: it's perfectly possible to agree with the setup without admitting anything about the proposed alternatives. Just like it's possible to admit to shortcomings in science without going full-on postmodern about it :).

footnotes

[1]: I say this coming from a particular perspective where the real world is, if anything, an enemy—perhaps you could call it programming language idealism :).

[2]: It's an idealized government based solely on reason and empiricism which is hilariously unrealistic: https://www.facebook.com/notes/neil-degrasse-tyson/reflectio...


The whole article seems to have only the argument of, "Hackers as a whole should think the way I want them to."

This is the new authority to be questioned.


I'm not seeing any particular claim to authority here? In a critical essay, it's customary to argue in favor of your own beliefs.


I feel like nothing about the conclusion follows from the problems. If I say that LA has a public transit issue and my solution is radical libertarianism I should guide my audience between those two points.


I would suggest that computers don't necessarily make your life better, but pursuing value does, and using computers in certain ways naturally follows from that added value.


I like the article. Replace "forgetting" by "abstracting" and it becomes even more reasonable.


If you buy into personality types, the hacker philosophy in part comes from the fact many "hackers" are of ISTP type or similar. These personality types learn by taking things apart and are far more likely to simply ignore rules if they can see a way to solve a problem. As tech expands and becomes comprised of a broader range of people with different takes on how things should be done, then there is room for conflict and it's valid to question how we will overcome this. However, replacing statements with questions feels like a philosophy class. If the author has an effective counter argument then they should be able to define it clearly and succinctly.


Is "you should be asking these questions instead of making those assumptions" such an unclear argument?

The questions are the whole point – that's what the speaker wants us to keep in mind as we make choices as professionals.


Start bracing youselves now for the Eric Raymond counterblast.


I'm not so sure, but if he decides to write one, I'm looking forward to it.

Eric usually makes for an interesting read.


Eric can be good, especially when his writing is purely technical. However, his politics are off-the-wall insane, so his response to this might be similarly so.


>> "if somebody calls you a hacker that’s kind of like a compliment"

Being called a hacker has never felt like a compliment to me.


You frequent a site called Hacker News so at least you don't see hacker as a terrible insult.


Actually, I really dislike the site's name and I would never infer that because someone visits a site that some label applies to them.


This site is just misnamed, is all. "Startup news" would be more apt.


...What community do you frequent?


Stewart Nelson decided to rewire MIT’s PDP-1 as a prank. Later, Margaret Hamilton tried to use the DEC-supplied DECAL assembler on the machine and it crashed repeatedly. The transcript points out that Hamilton is generally credited with doing brilliant work for NASA on the Apollo project. Did Nelson go on to do anything notable? Just curious.


If the length of one's Wikipedia page is any indication, Hamilton kicked a bigger dent in the Universe than Nelson.

  https://en.wikipedia.org/wiki/Stewart_Nelson_(hacker)
  https://en.wikipedia.org/wiki/Margaret_Hamilton_(scientist)


"And it’s a privilege…if somebody calls you a hacker that’s kind of like a compliment. It’s a privilege to be able to be called a hacker, and it’s reserved for the highest few. And to be honest, I personally could take or leave the term."

This feels so passive aggressive and subtly mocking.

This entire article is ridiculous pedantry to me. By default everything a HUMAN DOES is a simplification/abstraction of an unknowable mysterious reality. A better catchphrase is: "EXISTENCE IS FORGETTING".... Human language doesn't communicate our complete feelings, our system of time doesn't capture the fullness of the 4th dimension, our structures are a result of our crude ability to shape the infiniteness of matter... and on and on...

Then the author claims: "Programs aren’t models of the world constructed from scratch but takes on the world, carefully carved out of reality"

Insane! Programs are tools. Is a hammer a 'carefully carved out version of reality'?

Our society's perceptions give a program whatever 'take on reality' it embodies, and that is constantly shifting! Our views on Myspace changed pretty fast... the program actually failed because it didn't adapt to society's version of reality.

And the authors alternative to the hacker ethic is pretty unusable. It's a pedantic restatement of the golden rule....


> Is a hammer a 'carefully carved out version of reality?

Yes. The construction of a hammer takes a particular view about what is to be done with it. When all you have is a hammer, everything looks like a nail.


> This entire article is ridiculous pedantry to me

Then you missed the point. Kids grow up reading "real hacker culture" and these ethics are one of the first things I assimilated when incubating. They are not ethics at all, but justifications and she broke it down pat.

> Programs are tools. Is a hammer a 'carefully carved out version of reality?

There's a similar thread on /. about third party libraries. Programs are universally built on APIs (software or hardware). Yes, they are carved out of their originating context and they break down (as per the PDP-1 story) over time.


Any time I read content written with the voice and philosophy of the author of this article, I am reminded of the excellent paper "Fuck Nuance".

https://kieranhealy.org/files/papers/fuck-nuance.pdf

Just like there is an appeal to oversimplifying and over-abstracting, there is a similar appeal to over-complicating and "over-nuancing". By trying to account for everything, you accomplish nothing, understand nothing, and gain no insights.

The hacker ethos, like any simplified abstraction, does not capture the entire story, but it also doesn't oversimplify. It provides a useful model to work with and to think about, and trying to make it a lot more nuanced dilutes the points it is making and ruins its usefulness.


This talk is a really poor abstraction of fundamental particles. (Even that could be a lot of abstraction and assumptions)


The article does a wonderful job in hiding its actual point, probably because the point is very inane - the semi-serious playful "hacker ethic" must go and be replaced by a moral imperative in the form of a list of questions.


Can someone do an executive summary or tl;dr please?

I honestly tried to read it but found it impenetrable.


The author points out that part of the hacker ethic was always (to many) unethical: "the end justifies the means". And she warns the wannabes to think twice before accepting that.

Also she reminds us that there is always a live world and live people behind any computer or mathematical model, and that all models are incomplete by definition.


It's amusing to see the loss of information going on in these summaries itself. My takeaway from the article was that we as programmers (hackers?) need to go beyond the tl;dr to really understand what it is we are doing, and be conscious of the fact that what we do affects things beyond those we presume our work will affect.


The hacker ethic was born of tacit misogyny and not subscribed to, by the original hackers (i.e. pre-80s).

Most of development is about lossy processes.

All of the original hacker ethics should be changed; revised. She posits an example and attempts to break down the strategy used.


Explain to me how playful cleverness, open access to computers and being measured by skill over appearance is misogynist.

Unless of course you think women aren't clever or skillful...


What are you talking about? Did you even read the OP? I was SUMMARIZING the talk (as per the request).

> The mention of Margaret Hamilton in this passage is one of maybe three instances in the entire book in which a woman appears who is not introduced as the relative or romantic interest of a man. And even in this rare instance, Levy’s framing trivializes Hamilton, her work, and her right to the facility. She was “a beginner programmer” who was “officially sanctioned” but also just “showed up.” And she “complained” about it, leading to repercussions and reprimands. Levy is all but calling her a nag.

I don't know what other way to characterize that as a focused topic (tacit misogyny). It was not further discussed. I think you (along with a bunch of other people debating some nonsense about feminism) are having some weird kneejerk because you choose to interpret that part of the talk differently. The meaning was clear to me.


Sorry, your post was quite a ways down from the parent so I didn't realize you were summarizing. My fault.


I also apologize for my tone, it was unwarranted.


The author is a poet/twine author/programmer at Columbia?

She's upset that some guys appeared immature back in the day towards a female programmer, and attempts to deconstruct the hacker manifesto and suggest a replacement that can be summarized as: "Be aware of your actions"....

Along the way she posits that because technology can't capture the analog infiniteness of our infinite universe, due to its digital nature, "technology is forgetting". She uses examples like loss in converting audio to digital, and compression that discards unneeded information, to support this theory....
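The audio-loss example in that summary can be made concrete. A minimal sketch (my own, not from the talk or the summary): quantize sample amplitudes to a given bit depth, reconstruct, and measure the round-trip error — the part that gets "forgotten".

```python
import math

def quantize(samples, bits):
    # Map each sample in [-1, 1] onto 2**(bits-1) signed levels and back.
    levels = 2 ** (bits - 1)
    return [round(s * levels) / levels for s in samples]

# A few cycles of a 440 Hz sine at an 8 kHz sample rate.
samples = [math.sin(2 * math.pi * 440 * t / 8000) for t in range(64)]

for bits in (4, 16):
    err = max(abs(a - b) for a, b in zip(samples, quantize(samples, bits)))
    print(f"{bits}-bit max quantization error: {err:.6f}")
```

At 16 bits the worst-case error is on the order of 1/65536 of full scale; at 4 bits it is coarse enough to hear. The loss is real either way — whether it matters depends on what the data is for, which is the talk's point.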


I don't think I get deconstruction at a reasonable level. Every time it's brought up I feel like I can't follow the argument.

Thank you for your summary.


That is an admirable job, thank you. I may or may not have gained some new insight. But I feel unqualified to decide one way or the other. But I did enjoy the pictures.


The phrase "Toward a new..." is very nearly a sure sign of relativism at work. I don't know why so many quacks pick it. I guess they think they're revolutionizing the world.

This article pushes for an ethics and feelings based redistribution of programming effort. That's bad. Because resources remain finite, we won't lose less data, we can only shuffle around what we lose. What we need to make this decision, is not feelings or vague ethical codes. We need hard data, especially in an age of big data and machine learning where the detailed pieces involved in making a deep statistical analysis may well be out of the realm of human comprehension.

For example, this article targets the evil gender binary. If only we would think more carefully about our choices. Meanwhile, clinics are scratching their heads trying to figure out how to map rates of ovarian cancer to "demisexuals", or how a local rise in "non-binary asexuals" might affect rates of testicular cancer, and what should be done about funding for screenings.

We lose data because we come from an era where space was limited, and cycles were counted. Sure we made some dumb mistakes like 2 digit years, but 2 digit years still work in a broad array of cases so we still use them all over the place today. They are sometimes good to use, sometimes bad, sometimes they're superfluous and unused anyway.

By all means rethink what fields should be compressed or expanded based upon business needs. But do it based on data and business use cases, not on feelings and someone's personal bias- even if that person promises their ethics are better than the old ones.
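The two-digit-year point — that they "still work in a broad array of cases" — rests on a windowing assumption that is itself a form of forgetting. A minimal sketch (mine, not from the comment; the pivot value 70 is a hypothetical choice):

```python
def expand_year(yy, pivot=70):
    """Expand a 2-digit year using a pivot window:
    years >= pivot are read as 19xx, the rest as 20xx."""
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(5))   # 2005
# A birth year of 1905 stored as "05" now reads back as 2005: the
# compressed field forgot which century it came from.
```

The scheme works exactly as long as all the data falls inside the window — a business-use-case judgment, as the comment says, but one worth making consciously.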


> Meanwhile, clinics are scratching their heads trying to figure out how to map rates of ovarian cancer to "demisexuals", or how a local rise in "non-binary asexuals" might affect rates of testicular cancer, and what should be done about funding for screenings.

This is a great case in point! Simply classing people as male or female already loses data that's therapeutically relevant. The important thing in determining whether someone's at risk of ovarian or testicular cancer isn't whether they're male or female - it's whether they have ovaries or testes. Some people have both, some people have neither and some people do not have the set you'd expect from their gender. Assuming that "Male" or "Female" accurately describes this is exactly the kind of "forgetting" discussed in the article, and people lose out on appropriate healthcare as a result.
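The schema implication can be sketched (a hypothetical model of my own, not from the thread): record the clinically relevant facts directly instead of inferring them from a single sex/gender field.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    gender: str        # self-reported label, stored for its own sake
    has_ovaries: bool  # recorded directly, not inferred from gender
    has_testes: bool

def needs_ovarian_screening(p: Patient) -> bool:
    # Screening keys off anatomy, not the gender label.
    return p.has_ovaries
```

A record like `Patient("male", True, False)` is representable here, where a single M/F column would have silently dropped the fact that actually determines screening.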


> Some people have both

I don't think this particular combination is possible, for the record.


I think the essay was more about double checking the assumptions and particular world views we bring with us when we write code. Even the most mundane software can distinctly embody specific views that some might take for granted, not acknowledging that others have genuine reasons for viewing that specific dynamic differently.

We need to be willing to explore the assumptions, known or unknown, that we carry with us when we create software.

Software doesn't exist in a vacuum. It is part of a rich, ever changing context of culture, social systems, and all the messiness of being human.

Each human action is the expression of a complex set of viewpoints, and technology is not perfectly neutral. I think, occasionally, some (not all) hackerish types are less interested in exploring this side of things and think they can avoid it by sticking with software. The essay was trying to remind people that you can't truly avoid expressing viewpoints with your actions, even in software.


> We need to be willing to explore the assumptions, known or unknown, that we carry with us when we create software.

Hear, hear.

I have two other longstanding favorite examinations-of-assumptions pieces: "Falsehoods Programmers Believe About Names"[0], and "Five Geek Social Fallacies"[1]

[0] https://www.kalzumeus.com/2010/06/17/falsehoods-programmers-...

[1] http://www.plausiblydeniable.com/opinion/gsf.html


Oh boy. I can taste the flames already.

Others have aptly tackled the "computing is forgetting" part of the article, so I want to discuss the Midnight Computer Rewiring Society section.

The argument is that they denied someone access to a computer by re-wiring it. But that's not true, any more than changing the ITS login screen to say "5 losers online" was denying anyone access to that routine. The functioning of the computer had been altered, but Hamilton could have simply popped the front panel and re-wired it back, or translated her program to use the other assembler.

She wasn't denied access, the hackers just broke something.

There is also the claim that Hamilton's access was considered unimportant without even looking at the code, with the implication that it was because she was a woman.

As stated above, Hamilton's access was uninhibited. Her program wouldn't run, but not because the hackers didn't consider her important: they had never considered that someone would use DECAL. It wasn't Hamilton's code they disregarded, it was DECAL itself. In this case, wrongly.

Frankly, I find the continuing undertones of sexism accusations frustrating: firstly, it was the 50s, and secondly, none of them seem to be in any way accurate. And if you're going to claim sexism, at least have the guts to say it straight up.

Hacker ethic and culture may be flawed, but it's the only set of computer ethics and culture that I'd want to be a part of.


> Oh boy. I can taste the flames already.

Please don't. You've been posting a lot to HN, so it's important that your posts not degrade the culture here.

Civil and substantive comments, please, or no comments.


Okay dang. I was only trying to comment on how contentious the article was. But if it could be taken otherwise...


>I find the continuing undertones of sexism accusations frustrating: firstly, it was the 50s

It is possible to regard the historical period (and by implication, the social context) of the events as an excuse for the pervasive sexism (though given the vaunted propensity to distrust authority and disregard social conventions, it is quite weak as excuses go), but it certainly does not make the events any less sexist.


Fair enough. It doesn't. However, it does change our expectations of how sexist the events would be.

But this doesn't nullify my second point, which is that the article here doesn't actually demonstrate any sexism at all.


Because only women were ever prevented from working by Hackers?

As much of an intellectual achievement as Hamilton's code might have been, it wasn't really that relevant to many people (compared to, say, GNU software or Linux, which benefit billions of people). Presumably it therefore simply wasn't the topic of the book Hackers.


I think the bigger problem is her understanding of the "Hands-On Imperative". Yes, it assumes some form of reductionism. That philosophy happens to work really well when you are debugging computers, and indeed in most of life.

But there is not an assumption that hackers are "perfect", or that they can make perfect systems, or that they can perfectly model the world. Rather, it is exactly what it says on the tin: an imperative. In particular, it is the notion that the people experiencing the problem should be the most motivated to fix it.

The other part of hacking is just the Agile manifesto; if the rules don't let you fix the problem, then fix (or break) the rules. She seems to be misinterpreting this when she discusses "the mistrust of authority"; authority there refers to bureaucracy, composed of arbitrary rules, procedures, and restrictions. It's not a blank check to ignore people (individuals) with concerns.

But her final "hacker questions" are mostly of the form "who does X?" so in the end she (mostly) succeeds in repeating the "Individuals and interactions over processes and tools" theme. Still a lot of room for improvement though. If programmers spent all their time figuring out "what the data left out", they'd never get a working product.


My impression was that the implications of sexism were directed at Levy, for glorifying the hackers who made Hamilton's work more difficult while trivializing her work to some degree, not so much at the hackers themselves, who indeed just broke something because they had an incomplete understanding of the system they were working on.


I fail to see how that originates in sexism either...


[flagged]


Please stop violating the guidelines by posting uncivilly like this. It's absolutely possible to say what you have to say without being personally abrasive and insulting, and it's required if you want to keep commenting here.

https://news.ycombinator.com/newsguidelines.html


When a non-hacker sounds off about how my culture needs to change, my inner Naggum cannot be contained.


[flagged]


It's not clear to me exactly what you're trying to say, but it doesn't seem to be civil and substantive like the guidelines ask, and its effect on the thread is just what we don't want. Please don't post like this.


[flagged]


I agree. But to play Devil's Advocate for a second, the article does talk a lot about sexism. (Although it seems hesitant to discuss it outright).


As well to say that fascism caused communism.


[flagged]


I too find myself disposed to think poorly of 'these sorts of articles' about gender and women in tech, etc. I think most of what is said about it is pretty silly on all sides. But this article was actually pretty good - it's worth a read, IMHO.


I wonder what you would consider a non-toxic type of femininity?



