
The rule of thumb is: Describe the problem, ask for clarification, offer a default solution.

Most often, people just won't be interested in the problem, and will ignore you, hoping you and the problem go away, at which point your default solution wins and is documented. But every now and then, it'll set alarm bells ringing up and down the chain of command, and THAT is why you bring it up.

Of course it's also important to get a feel for what kinds of issues should be brought up, and what issues should just be quietly solved. This skill is half-technical, half-political, and comes with experience.




> and what issues should just be quietly solved

As a pedant, it sometimes makes me uncomfortable to make these decisions "quietly".

While I wholeheartedly agree that this is an important skill for all developers, it raises the question: Why should the dev be responsible for deciding when an issue needs to be brought up? Why should the dev be making these quiet decisions? Isn't this evidence of incomplete requirements?

There should be a protocol here instead of relying on the dev's subjective sense and opinion.

Generally speaking, I feel like this goes outside the bounds of the developer's responsibility. Not every dev has honed this skill; it could be dangerous. I would choose to err on the side of caution and apply your rule of thumb above in almost all situations.


Yes, it's not ideal; it could be dangerous; someone could launch nuclear missiles by mistake.

But the reality is that we live in an imperfect world, where imperfect things can and do happen all the time, and we have to deal with them. You can't have a rule for everything; when you do, nothing can get done because the rule makers can't anticipate everything, and often get the things they DID think about wrong (rule making is remarkably similar to program design, with the same drawbacks and limitations).

So our imperfect world demands that we exercise our own judgment in deciding what to do. If you don't trust your developer's judgment, you shouldn't put them in charge of things that can cause a lot of damage. The alternative is a rule for everything, which is guaranteed to collapse under its own bureaucratic weight.


> The alternative is a rule for everything, which is guaranteed to collapse under its own bureaucratic weight.

That's an important point that's too often missed. There's a cost to creating rules, maintaining them in a list of rules somewhere, having people actually read the whole list and apply those rules to real situations, handling appeals processes when you discover something the rules handle badly, and so on. You need some rules, but they can't ultimately fix the need to trust people.


> Why should the dev be making these quiet decisions, is this not evidence of incomplete requirements?

If the requirements were a complete specification of what the program must do in every circumstance, you could directly compile them to create a working program. Requirements documents are not intended to specify everything—only enough that a competent person would get the important things right. Filling in the trivial details with reasonable behaviour is basically the programmer's job.


> is this not evidence of incomplete requirements?

Yes! But there will never be a complete set of requirements — indeed if you think the requirements are complete, you spent too much time on them and you aren’t looking at the problem carefully enough.

There’s a balance between calling for more complete requirements and being able to work with less complete requirements. The more you can correctly choose to do the latter, the more you “hone this skill”, the more effective you can often be.

(As a bonus, when you do call for more complete requirements, in my experience people will be more open to doing that work. They know you wouldn’t ask if you didn’t “really need it”.)


There are always unforeseen requirements, and some requirements are very loose because there is really no way to know whether something will work until it is tried.

If there were a protocol that required discussion of every little thing, you would never ship or be productive.

So companies rely on the programmer's judgment about what to do when there is no clear resolution or something isn't perfectly spelled out.


Sure, in a perfect world, programmers wouldn't have to make decisions outside their sandbox. But humans get hired to be more than robots, and the same goes for everyone, in every job, ever. That's why you hear cliches about "above and beyond", "extra mile", etc. etc. so much that you probably don't hear them anymore.

And if you are good at operating outside your sandbox when you need to, you should get a gold star (and think about whether you want to look for more responsibility). If you're not, your manager should try to baby-proof that corner.

This is also no different than every other job, ever.


I think I'd rather have high-level specifications than ones with all of the detail in them.


It is indeed indicative of incomplete requirements. Unfortunately, the real world consists of projects with incomplete requirements.

You are going to need this skill as a developer since you’ll end up using it every day.



