
Wholeheartedly true. Vague requirements are definitely the bane of my existence. Just last week I implemented a massive story, only to be told that I had misunderstood it. After I pointed out the confusion with the task, product admitted they had got the requirements wrong.

Don't get me wrong, I like the story format for detailing requirements; in most cases it definitely helps (although I know many developers who dislike it), but it is not without its issues. The weak link is always the product owner writing the wrong stories within a task.

In my opinion the biggest contributor to tech debt is NOT a lack of stories in tasks, or poorly written ones leading to vague requirements (although those don't help); it is unrealistic timelines and failing to factor in testing. When people are pushed to achieve a long list of things in a short amount of time, they take shortcuts to get them done, which appeases the product owners but leaves landmines in the codebase waiting to be stepped on.

The biggest issue with Agile, in my opinion, is the lack of time devoted to testing. It works on a cycle of "you estimated we can get this much done within this much time", yet I have never been in a planning meeting that even mentioned testing as part of the normal development cycle; it is always an afterthought once the other big-ticket items are done and time is running out. Time for writing tests and testing should be baked into the Agile process from the start (maybe it is, and my experience has just been unfortunate); this would significantly reduce tech debt.

I think the issue with testing and technical debt, in my experience, has been that the places I have worked like to call themselves Agile: they have the daily standups that always descend into debates and meetings, they have the planning poker sessions, and we do sprints, yet they don't fully adhere to the rules of Agile. When you slip extra tasks that were never estimated into the current sprint, that is not Agile. When you change deadlines and priorities midway through a sprint, that is not Agile. I think this problem is more prevalent than people realise; there are misconceptions about what proper Agile is.




I used to work at a company that addressed testing with a three-pronged strategy:

1) When estimating, dev tests are part of the strategy (and, ideally, stories are written with enough detail to make the testing strategy clear). Sometimes we review the ticket with QA to ensure that we both understand what's being asked for and what needs to be taken into account. Most tests at this point would be unit tests and functional tests (there's a rough sketch of the two below the list).

2) Once a task is done, the story is reviewed with someone from QA to ensure it works. They suggest a couple of things to try that may require us to make improvements. The goal is to catch 80% of the issues at this point with 20% of the effort, and the pair-testing does a great job of flushing out issues. Here the focus is on functional tests and exploratory tests.

3) The QA team runs their own sprints testing the dev team's previous sprint's work. This is mostly performance and integration tests, but sometimes includes functional testing.
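
To make the split between those test types concrete, here's a rough, hypothetical sketch (the discount/checkout functions and pytest are my own stand-ins, not the actual code from that team):

    import pytest

    def discount_for(subtotal):
        # Illustrative pricing rule: 10% off orders of 100 or more.
        return 0.1 if subtotal >= 100 else 0.0

    def checkout_total(order):
        # Illustrative "public" entry point combining pricing and discount.
        subtotal = sum(item["price"] for item in order["items"])
        return subtotal * (1 - discount_for(subtotal))

    # Unit test (step 1): checks one function in isolation.
    def test_discount_applies_at_threshold():
        assert discount_for(100) == 0.1
        assert discount_for(99) == 0.0

    # Functional test (steps 1-2): drives the behaviour through the public
    # entry point, the kind of thing we'd walk through with QA.
    def test_checkout_total_includes_discount():
        order = {"items": [{"price": 50}, {"price": 60}]}
        assert checkout_total(order) == pytest.approx(99.0)

Run with pytest; the exploratory testing in step 2 obviously isn't something you can capture in code.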

I thought the process was good because we were able to measure our software quality and address issues quickly. That said, it felt somewhat cumbersome. This isn't an easy problem to tackle.


All of the projects I've done over the last few years have been "agile" - the company I work for actually offers agile training.

Officially, all projects are written using TDD, and all estimates for all tasks should include time to write acceptance and unit tests following the suggestions in "Growing Object-Oriented Software, Guided by Tests", along with any integration tests with external software that may be required.

Even unofficially, I've never seen anyone consider a story "complete" until there are sufficient automated tests for it and there's been at least some manual testing.
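
For anyone who hasn't read GOOS, here's a rough sketch of the outer-loop/inner-loop rhythm it suggests; the slugify() example and pytest as the runner are purely my own illustration, not from the book or my employer:

    import re

    def slugify(title):
        # Illustrative implementation: turn an article title into a URL slug.
        return re.sub(r"[^a-z0-9]+", "-", title.lower()).strip("-")

    # Outer loop: an acceptance test written first, stating the story
    # ("as a reader I can share an article via a readable URL").
    def test_article_url_is_readable():
        assert slugify("Agile & Testing: A Field Report") == "agile-testing-a-field-report"

    # Inner loop: unit tests added while driving out the implementation.
    def test_slug_lowercases():
        assert slugify("Hello") == "hello"

    def test_slug_collapses_punctuation_and_whitespace():
        assert slugify("a  --  b") == "a-b"

The acceptance test stays red until the story is genuinely done, which is what keeps "complete" honest.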

Another thing we strongly encourage (and do) as part of Agile is regular retrospective meetings on our software development process. If our testing strategy (or lack thereof) were causing fear during refactoring, production bugs, or difficulty during maintenance, this would be brought up at such a meeting and addressed.

I'm not saying you're not "doing Agile", but your experience very strongly does not match mine.


Your experience is definitely unfortunate - stories/features being fully tested before being considered done, and therefore potentially shippable, is a key part of a successful Agile process. Without it, you are missing out on a lot of the possible benefits of the process, and I have seen first-hand the difference that having QA engineers (even not particularly strong ones) integrated into the sprint can make to product quality.

A good first step to take would be to get your team together and work to define a Definition of Done, which should include both unit and integration/E2E testing (manual and/or automated) being complete - any stories not meeting these criteria cannot be considered as "done" and you can't count the points for them in the sprint.

Of course, initially this will probably cause lots of failed sprints and decrease your velocity, but you have to see this as a positive: it raises the visibility of the problems. Once you can identify the specific issues you're having and take the steps needed to resolve them (whether that's a lack of QA resource, a poor testing culture among developers, or whatever), you'll know that when you say something is "done", it actually is done, rather than hiding a load of extra testing work that isn't complete.

On the requirements front, again defining a Definition of Ready can help - these are the criteria stories need to meet before you will estimate or start working on them. This should include things like requirements being clear, designs/UX being complete (if appropriate) and the story being testable and of a suitably small size for you to estimate with a reasonable degree of confidence.

Once you start pushing back on estimating stories because they don't meet these criteria, and educating your product owners on what they can do to improve the situation (for example, breaking stories down into smaller chunks, or defining requirements more clearly), you'll hopefully find things improve.

Of course, none of this is a substitute for talking and working with the product team to help them understand what does and doesn't work for you. But I've personally helped make big differences to a team's quality and productivity by taking small steps like these, in addition to educating the team and the business on what "agile" is (without all the BS that some people will try to sell you!) and why it is important to us as developers, and therefore to the wider organisation.


Same here with vague requirements. Interestingly enough, whether the requirements are a couple of sentences I pull out of a conversation or a 30-page requirements document, vagueness is often a problem in both. In the former case, there just isn't enough information, and it's clear the feature hasn't been thought out. In the latter case, the information is too detailed and there isn't enough room for the developer to interpret small, mostly insignificant (from a product perspective) details. Either way, it ends in a lot of wasted effort and far more time being spent on a project than necessary. Almost always, this is avoidable if the spec writers put more effort into tackling the problem, understanding the system, and understanding what is and isn't possible.



