
Good article. It reminds me of Sandi Metz's treatment of well-designed code as a business proposition: the goal is to save money, because a good design makes changes cheaper.

It makes sense to consider the cost of having a given bug vs the cost of fixing it. Of course, such estimates will almost always be hand-wavey.

I would also say that when in doubt, fix it. A bug is, by definition, the software not doing what it's expected to do; I think it's better to make fewer promises and keep them. A user who encounters a bug loses trust in the software, and there's a tipping point where they abandon it. You might not know where that is.

You also might not realize what a bad day it could give someone, even if they're only one person. E.g., if you're an email platform and you have a bug that drops one email in a million, that might seem OK. But if missing that email gets someone evicted...



    It makes sense to consider the cost of having a given bug vs the cost of 
    fixing it. Of course, such estimates will almost always be hand-wavey.
I agree with that approach in theory, but in practice it's a lot easier to estimate the cost of fixing the bug than the cost of having it. As a result, in any ambiguous case our bias will be toward keeping the bug: the cost of having it is the impact of the bug multiplied by the probability of someone hitting it, and it's always easy to lowball that probability. "Oh, no one will notice that," or "Yeah, but that's a really obscure case." And then you find out that all it takes is one obscure case for your trading application to lose hundreds of millions of dollars in a day. Or for hackers to breach your systems and make off with millions of credit card numbers. Or for malware to turn your IoT devices into a botnet.
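
To make the lowballing point concrete, here's a rough back-of-the-envelope sketch. Every number is made up purely for illustration; the point is only that the same impact estimate flips the decision depending on which probability you convince yourself of:

    # Expected cost of keeping a bug = impact if hit * probability of being hit.
    # All figures below are hypothetical, just to show how lowballing flips the call.

    fix_cost = 5_000          # assumed engineering cost to fix the bug
    impact_if_hit = 250_000   # assumed damage if the bug ever bites

    for label, p_hit in [("lowballed", 0.001), ("realistic", 0.10)]:
        expected_cost = impact_if_hit * p_hit
        decision = "fix it" if expected_cost > fix_cost else "keep the bug"
        print(f"{label}: expected cost {expected_cost:,.0f} vs fix cost {fix_cost:,} -> {decision}")

With the lowballed probability the expected cost is 250 and the bug looks cheap to keep; with the realistic one it's 25,000 and fixing is the obvious call. Same bug, same impact, opposite decisions.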



