The Black Triangle (2004) (rampantgames.com)
29 points by emontero1 on July 11, 2009 | 10 comments



It makes a nice story, but I think a black triangle moment is something programmers should strive to avoid. Progress should be incremental and visible as much as possible, and the system as a whole should always be in a working and testable state.

Working on something for weeks before getting it into a testable state is a surefire recipe for an over-designed system with superfluous features and long debugging sessions.


I think sometimes "black triangles" are necessary, but you can advance them in a way that is visible to the programmer/technical people. Most programmers don't write the whole thing and then see if it works, but that doesn't mean you can always have additional deliverables (at least from the point of view of the client/stakeholder) at every iteration.

That said, I wonder if he'd demonstrated a need for a custom TCP replacement before he wrote one. That feels a little excessive, but I was born into high-bandwidth network coding, so I don't remember the bad ol' days as well.


Every week you can't show visible progress to a customer is a week your code drifts farther from their actual desires. The feedback loop is essential to writing good software. It may be difficult to achieve at times but it should always be our goal.


You know, I'm very happy for you that your work lets you show real results to customers every week of development and get meaningful feedback. That's awesome.

But before lecturing the rest of us, you might want to consider that not everyone's work is like that. In my own work, it is not at all unusual to go weeks without having anything that could usefully be shown to a customer; and by the same token, the customer's desires are usually simple and unchanging.

Let me give you a concrete example: importing the geometry and topology from a Parasolid XT file. Now, the good news is that the file format is publicly available. The bad news is that it more or less amounts to a direct serialization of their internal data structures, in a format where there is absolutely no room for error: a single misread byte will turn the rest of the file into unreadable garbage. And of course, every version of Parasolid (there are 75 or so now) used a slightly different format, and a number of crucial details were left vague or thoughtfully undocumented.
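
To give a feel for that fragility, here is a minimal hypothetical sketch in Python (not the actual Parasolid XT layout) of what reading a raw serialization of internal structures looks like: every field is consumed strictly in sequence, with no framing to resynchronize on, so one bad count cascades through everything after it.

    import struct

    def read_record(buf, offset):
        """Read one hypothetical fixed-layout record. There is no marker or
        checksum to fall back on: a wrong value here shifts every later read."""
        tag, n_children = struct.unpack_from("<HI", buf, offset)
        offset += struct.calcsize("<HI")
        children = []
        for _ in range(n_children):
            # One misread n_children garbles everything that follows.
            child, offset = read_record(buf, offset)
            children.append(child)
        return {"tag": tag, "children": children}, offset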

As you might suspect from that, it took well over a month to get the basic file parser done. What was I going to show the customers during that time? Me: "Well, here's the dump of this file as I parse it now. As you can see, it gets completely screwed up about three-quarters of the way through." Customer: "That's great progress! But we've been talking it over, and we think your debugging dump should be colored chartreuse."

That would be the norm for my work. On shorter things (adding a simple new feature or fixing a simple bug), it usually takes less than a week, and we get as much feedback from the customer as possible. On longer things, usually the customer's only feedback is "We want to read these files" and there is little meaningful I can show them in the middle of the project.


Slight digression… I think this conversation is pretty much the rational, reasonable version of most of programming.reddit:

* "this is how I work"

* "fool! You should be doing TDD/AOP/Agile/XP/NIH/WTF/BBQ!"

* "actually, that's not possible for reasons X, Y, Z."

The problem is that the respondent always assumes that the writer of the original post is in the same environment, subject to the same conditions. There's a name for this fallacy, but I don't remember it.

A similar problem is when posters assume that their situation is the typical one: I see this all the time as people post stories like "how to do X", omitting the crucial phrase "... in Ruby", or "in the web development world", or "on Windows". Knocking them down with "I think you'll find telecoms is a little different" seems to be a habit of mine.


In your example there are ways to do development incrementally. You could generate some files that use only a tiny subset of Parasolid's features, in only one version. Once you have that working, you can add feature and version support incrementally, guided by your customer's priorities, which they might not even be aware of themselves but which can easily be discovered by testing your code on a sampling of their most important files. Your incremental progress is the loading of files with progressively more features, closer and closer to your customer's real files. It might turn out that you can avoid implementing every version and feature of the format, and it's much easier, faster, and more accurate to figure this out by testing working code than by trying to write a spec ahead of time.
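
A hypothetical sketch of that feedback loop (not anything the original poster actually built): keep a sampling of the customer's real files on hand and have the work-in-progress parser report how many of them already load, so each iteration's visible progress is simply that number going up.

    from pathlib import Path

    def coverage_report(parse, sample_dir):
        """Run the work-in-progress parser over sample customer files and
        summarize which ones already import."""
        ok, failed = [], []
        for path in sorted(Path(sample_dir).glob("*")):
            try:
                parse(path.read_bytes())
                ok.append(path.name)
            except Exception as exc:
                failed.append((path.name, exc))
        print(f"{len(ok)} of {len(ok) + len(failed)} sample files import cleanly")
        for name, reason in failed:
            print(f"  FAIL {name}: {reason}")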


I should point out that I drastically simplified what was involved, attempting to give an idea of the scope rather than the nitty-gritty details. In fact, they helpfully provide a set of schema files defining the exact layout of each class read in each version. Much of the actual work involved figuring out how to interpret those files into useful code. And of course we did not try to support parsing every possible class from every possible version before we tried importing simpler files from the versions we already understood. Believe me, if anything I am too impatient to see real results rather than too patient.
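
As an aside, that schema-driven approach can be pictured as a table-driven reader; the sketch below is hypothetical and drastically simplified compared to the real schema files. The per-class, per-version layouts become data, and one generic routine consumes them, so supporting another version is mostly a matter of adding table entries.

    import struct

    # Hypothetical, drastically simplified layouts: (class, version) -> fields.
    SCHEMA = {
        ("edge", 10): [("id", "<I"), ("vertex_lo", "<I"), ("vertex_hi", "<I")],
        ("edge", 11): [("id", "<I"), ("vertex_lo", "<I"), ("vertex_hi", "<I"),
                       ("tolerance", "<d")],
    }

    def read_class(buf, offset, cls, version):
        """Read one record using the layout table instead of per-class code."""
        record = {}
        for field, fmt in SCHEMA[(cls, version)]:
            (record[field],) = struct.unpack_from(fmt, buf, offset)
            offset += struct.calcsize(fmt)
        return record, offset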

My fundamental point was that the parsing stage both took a long time to develop and offered no opportunity for useful customer feedback during that development. In order to make a marginally usable importer, the code had to be able to parse nearly every type. The customer could not tell me what to skip. There was just no way for them to provide useful feedback at this stage other than "these are the sorts of files we need to be able to read."

As a programmer, I aggressively try not to implement features my customers do not need. It is the only plausible way to write this sort of code as a one-man shop. But as feedback, it usually works the other way around -- I implement the minimum needed to make the files they send me work for them. So it's not me saying "Do you need this?"; it's them saying, "Hey, we need this."


The 'black triangle' moment doesn't need to take a team or weeks. I started on a one-man project last Monday and had my 'black triangle' moment Wednesday after lunch, after which I knew that it could be done the way I envisioned it, that it worked as expected, and that I will be done next Wednesday.


What counts as visible progress? In my experience, simply piling up code for every new feature leads to giant monolithic messes with little structure or organization. Since each feature always seems small and easy to do, it becomes tempting to just hack it into what you have without stopping to design it properly.

Buildings are constructed foundation first, followed by the scaffold. Trying to construct a building room-by-room would be a disaster.


Turns out software isn't like buildings. Buildings can't be refactored during construction, nor do their requirements change halfway through.

Developing software incrementally requires diligence in refactoring to keep the code clean, but development goes faster despite the "extra" refactoring time and the end product is better for it.



