
This stems from an unwillingness to make it a job requirement.

There are several things you as a software engineer are expected to do as a part of your job: write code, write tests, participate in code reviews, ensure successful deployment, work effectively with various groups, etc.

It's really simple: state the job requirements up front in the position description and during the hiring process. Make testing part of the code review process, and use it as an opportunity to educate about what makes a good test. Make it part of the performance review and tie raises to it (and, if it goes on long enough, continued employment).

Need to write tests for existing untested areas of the code? Have the team create a developer rotation so they dedicate someone to it for part of each sprint.



I couldn't agree more. I've worked at places where some of the engineers were conscientious about writing tests and having excellent test coverage. Guess what? Our services were still unreliable because the engineers who didn't write tests brought poor quality into the codebase, so we had constant problems.

Even a few engineers on the team who don't write tests can make the product as unreliable, from the customer's point of view, as it would be if none of the engineers wrote tests.

At my current company, test coverage is taken seriously as a job requirement, and it is considered during performance reviews. Consequently, the test coverage is pretty darn good.


Per your previous note on reliability, does reliability at the current company match the test coverage?


I'm in 100% agreement with you up until the point of tying test coverage and test writing to employment. In my eyes, that promotes a culture of writing bogus tests that provide no value other than producing more green check marks. You should be encouraged to write tests by your colleagues and be in a culture that sees the benefits, rather than being forced to do it.

I'm also unsure whether sitting one developer down in a corner for a segment of each sprint, dedicated exclusively to testing legacy code with no other purpose, is valuable. You should be testing legacy code as you come across it: harness it properly, make your modifications, and move on to the next thing (see the sketch below). If you are spending time on something that doesn't close out a bug or a feature, you're spending valuable time testing code that may be completely removed in the future.
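To make the "harness it as you come across it" idea concrete, here is a minimal sketch of a characterization test (pytest assumed; the module and function names are hypothetical). The point is to pin down the legacy code's current behavior before you modify it, not to spec out ideal behavior:

    import pytest
    from billing import legacy_price_calc  # hypothetical legacy function

    # Characterization test: record what the code does today, so any
    # change in behavior shows up once you start modifying it.
    @pytest.mark.parametrize("qty, observed", [
        (0, 0.0),       # empty order
        (1, 9.99),      # single item at the current list price
        (100, 899.10),  # bulk discount currently kicks in at 100 units
    ])
    def test_legacy_price_calc_current_behavior(qty, observed):
        assert legacy_price_calc(qty) == pytest.approx(observed)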


If a PR has bogus tests that provide no value other than to make more green check marks, how do they pass code reviews? That indicates that your code review process is kinda broken--tests should support the code review process by indicating what edge cases the writer of a PR has thought of and then prompting the reviewer to ask what hasn't been thought of.
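To put a concrete (and entirely hypothetical, Python/pytest) face on the difference a reviewer should catch:

    import pytest
    from money import parse_amount  # hypothetical function under test

    # Bogus test: exercises the code but asserts nothing about behavior,
    # so it exists only to turn the coverage check green.
    def test_parse_amount_runs():
        parse_amount("12.50")

    # Meaningful test: documents an edge case the author thought about,
    # and invites the reviewer to ask about the ones that are missing
    # (negative amounts? thousands separators?).
    def test_parse_amount_rejects_non_numeric_input():
        with pytest.raises(ValueError):
            parse_amount("twelve dollars")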


Bogus tests have to be caught in code review. When I talk about educating the team that's what I mean.

I've only ever had to do a test rotation once or twice, and it was like pulling the rip cord on a lawnmower. Requires effort at first and then it becomes self-sustaining over time. It establishes or affirms a culture of testing. The rotation doesn't even need to last long.

You should know which portions of the code are here to stay and which are nearing their end of life. Naturally, you want to spend your time where it will have maximum payoff.


If you are trying to introduce unit testing into a company where it is a new concept, I guess this practice could be acceptable. I think context really matters. I've put a lot of thought into this throughout the day, and I'm completely torn. On one hand I see the benefits of tying compensation to it, but I also see it creating more problems than solutions. Especially if it later becomes a cultural standard in the office, how on Earth are you going to remove that benefit (because you no longer need to encourage it) without pissing people off?

For the latter point, I guess that also depends on context. If you work for a consulting company, you may not have full knowledge of the code base, or even have direction to be touching some things. If you are developing software for your own company, I do agree you need to figure these things out, and maybe having a developer dedicated to it each sprint isn't a bad idea. I overstepped my bounds on that comment: I have never worked for a company that sells its own software, I've only ever done consulting, and I sometimes forget about alternative perspectives, so sorry about that.


No worries. Note that I am not saying you get rewarded for doing the bare minimum (writing tests). You get rewarded for going above and beyond. You are not performing the minimum requirements of the job if you do not include tests.

Of course you combine this with managerial support and coaching around task planning and messaging to other groups.

I've been a consultant, too, and I agree that it can sometimes (for some clients) be difficult to make the case for testing in that environment.



