
Ah, the cost of premature optimization: a couple thousand bucks and a couple of days of monkeying around. Better to first figure out where the bottleneck is (and maybe listen to your developer, since he thinks parallelization will help).



Listen to the latest Stack Overflow podcast. Every month earlier that they ship FogBugz is an extra $200k in revenue share for the developers.

The developer may know how to solve the problem in code. But, Joel is the CEO. He has a better idea of how this affects the Balance Sheet and Income Statement. It's kinda, sorta his job.


Except that (a) he spent a couple days of his time being wrong about it, and (b) he then wrote a blog entry about how he was wrong about it.

Imagine if instead he'd said to the guy "okay, spend three hours profiling the build process and get me some suggestions with time estimates", and they'd found some likely prospects, and three days from now he gets to post about how they'll be shipping the next release a month earlier because of the improvements they made to the build process.


No matter how much you speed up a 30 second build, you're not going to save a month on the release process. Be realistic, now.

SSDs provide so many other performance benefits--even just launching apps--that they're going to make our developers a lot happier anyway. And my time is far less valuable than a developer who is in the critical path to shipping.


A 15-second build instead of a 30-second build saves far more than 15 seconds per build. It means I'm more likely to hit compile after a smaller change before moving on to the next thing. It means I can iterate more rapidly on my approach to fixing a bug, keeping the issue hotter in my mind. It means I have half the time to get distracted by something shiny. Any time you can make the build perceptibly faster, you win big.


For the Arora project, one of my default git hooks is to build the project, tools, manualtests, and matching autotests (and run them to check for regressions) before each commit. This slowly crept up to the point where it was taking a minute or so (over the course of just a few weeks). Taking a few hours, I cleaned it up and got it back down to a few seconds at most, and on average less than a second. I did it the more correct way: some quick profiling to see where the time was spent, then fixing that (mostly object files that could have been re-used across different projects when building, but weren't).

Making sure that I never break the build on any commit really pays off. Thanks to this git hook there have been fewer than a dozen build breakages in the entire Arora commit history, and those, I believe, were either breakages introduced on OS X, where a.cpp == A.cpp (breaking the build on Windows/Linux), or cases where we broke the build against older versions of Qt, so it built against 4.5 but not 4.4. When you have a quick build time, things like build hooks become both possible and useful.
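
For anyone curious, such a hook doesn't need to be fancy. Something along these lines works (a minimal sketch, not the actual Arora hook; the qmake/make commands and the autotests "check" target are assumptions you would adjust for your own tree):

    #!/usr/bin/env python
    # .git/hooks/pre-commit -- minimal sketch, not the actual Arora hook.
    # Assumes a qmake/make-based tree with an autotests subdirectory that
    # has a "check" target; adjust the commands for your own project.
    import subprocess
    import sys

    def run(*cmd):
        # Abort the commit if any build or test step fails.
        if subprocess.call(list(cmd)) != 0:
            sys.stderr.write("pre-commit: %r failed, aborting commit\n" % (cmd,))
            sys.exit(1)

    run("qmake")                             # regenerate Makefiles if .pro files changed
    run("make", "-j4")                       # build the project and tools
    run("make", "-j4", "-C", "autotests")    # build the matching autotests
    run("make", "-C", "autotests", "check")  # run them to catch regressions

The important part is just that the hook exits non-zero on any failure, so git refuses the commit.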


Playing devil's advocate, you could also look at it the other way: a developer might not build as often, but instead make sure his/her code is right before building, and thus be more careful about what s/he's writing. (Just for the sake of argument ;))


Right, because I was advocating typing line noise until it makes it past the compiler.

I can't count the number of times I used to make a tiny change that I didn't think was worth running the build for, only to have it be the first thing to pop up as wrong the next time I compiled (now I always run a build, because we made it super fast).

And iterating on a bug can involve a lot of little changes; it can involve writing a ton of unit tests trying to duplicate the problem; it can involve subtle interactions that all seem right until you figure out what the issue is. Yes, you have to think, but sometimes you just need to churn through it too, and frankly it's ridiculous to suggest that a faster build process wouldn't help with this.

And I, personally, get distracted pretty easily. This comment comes courtesy of the 5-minute test suite I'm plowing through in the background right now.


...Now let's move them into separate offices with walls and doors. Now when Mutt can't remember the name of that function, he could look it up, which still takes 30 seconds, or he could ask Jeff, which now takes 45 seconds and involves standing up (not an easy task given the average physical fitness of programmers!). So he looks it up. So now Mutt loses 30 seconds of productivity, but we save 15 minutes for Jeff. Ahhh!... It is from one of your writings...


Listen to the Stack Overflow podcast. The developer estimated that it would take "a few weeks". A day and a half of Joel's time seems like a fair trade to do this experiment.

Listen to the podcast. The conversation starts at about the 16:00 mark, if that helps. I am seeing a ton of comments that don't understand the context of the article.

http://blog.stackoverflow.com/2009/03/podcast-47/


But the developer wasn't going to take this brain-dead approach; he was going to do something that would actually help. Joel spent time and got nothing; the developer wanted to spend time doing something that would probably speed up the build.


Except that he didn't "get nothing" out of the exercise. As he expected it would, the new drive did provide significant speedups for a variety of tasks and will undoubtedly improve his productivity enough to have been worth the time and money he spent on it. As it happens, it didn't improve the build speed specifically, but so what?


Yeah, but I spent 10 minutes reading the Ars Technica article on the drives and got what took him several days to learn.


He got: snappier computers for everyone and a story to write about (key to his marketing).


Bingo! Beat me to it.




