Always, always, always - when I'm faced with a new project using a new technology, I have two choices:
1) spend an indeterminate amount of time reading documentation until I actually understand what I'm doing, and have nothing to show for the time I spent gaining a deep (or at least decent) understanding.
2) Just jump in and start coding, bugging people to look at error messages for me, googling when that doesn't work, skimming the documentation when that doesn't work, and cutting and pasting examples when that doesn't work, but having something to "show" at the end of the day that is far more complex, slow, and unpredictable than a well-thought-out solution would be.
Every year of my 30-year software development career has been spent under software management fads that insist on the second approach because it provides an illusion of productivity and predictability. I actually had some hope when XP came out in the late '90s that the tide was turning, but XP became Agile, which was "meet the new boss, same as the old boss".
The way to go is to do 2, read what you wrote, then throw it out and do 1. Now you know which docs are worth reading, and you'll digest them faster with the context gained from your throwaway.
It will be only a little slower than 2 on its own, and the result will be written better because it's not just accumulated debris from your learning.
Just as long as you avoid the trick where you complete the POC, and then your manager moves you onto something else and reassigns a junior to productionize the POC.
At least until the deadline passes and he still doesn't have it working; then you have to do the something else you moved on to _and_ productionize the POC that was already overdue last week.
I 100% agree with this. I've always called it `throw-away development`: you build a prototype just to understand what it is that you are actually solving/building. Then, once you understand, you implement the _actual_ thing you need, taking the learnings from the throw-away. It takes a _tad_ longer than if you had magically known what to do from the get-go, but the learning is invaluable.
The problem when you do 2 'just enough' for it to (barely) work is that if you show it to your manager, they'll Ship It and you'll get put on a new project to hack together.
I've come to understand that the most important part of any software project is the Critical Twenty: the first 20% of the project's timeline, where developers need to do things like gain a deep understanding from reading documentation and lay down the infrastructure, architecture, and foundation necessary to support the features built on top.
On a 10-month project, this translates to reading docs and putting down foundation stuff, and delivering no features, for the first two months.
Attempts to cut The Critical Twenty short are what cause the majority of problems later on in most software projects.
I have regular friction with my team because they don't know anything besides option 2. There's no middle ground where we dig in and plan at least a little. It's a strange swamp: they justify it by saying that's what agile is (according to them), yet they're frustrated every month and do less and less work because they think it's too hard and warrants a higher TC.
I'm the lowest pawn on the ladder, but we have time to adjust our ways, so I asked them; they just refuse. It's all block and nap, then rush (with some "can you skip lunch maybe?").
Method 1 is definitely a culture that needs to be encouraged and nourished (and sometimes people need to be trained). And it's fragile: If there's even a hint of "What have you coded in the last week/month" from anywhere up the management chain--poof! It's gone and everyone is motivated to move to Method 2.
Thanks. I'm very much interested in the psychology of social structures / workgroups. These are indeed very brittle, and it doesn't take much to set off a race to the bottom.
I'm always thinking about relay-like handoffs and continuous improvement (team processes and mastery, not CI/CD) between team members to promote curiosity, high drive, creativity, and friendly competition.
The problem with #1 is that it's only useful if the system has real conceptual depth... and the article in question is spot on.
If there is nothing deep to understand, the only thing you can gain from the docs is encyclopedic knowledge of all the toggles and switches. So there's no reason to ever study them instead of just starting to use the thing.
"The management question, therefore, is not whether to build a pilot system and throw it away. You will do that. Hence plan to throw one away; you will, anyhow."
Fred Brooks.
Edit: Now I'm sad; I hadn't noticed that he died last year.
> 1) spend an indeterminate amount of time reading documentation until I actually understand what I'm doing, and have nothing to show for the time I spent gaining a deep (or at least decent) understanding.
Tough to explain to management or colleagues sometimes. (On the other hand, everyone seems to "know" it's the right approach, because even today just looking something up on SO is not seen as the highest standard.)
I usually end up doing it anyway and taking my "lumps" for the week that I spent learning but not producing. It gets frustrating when the #2 crowd starts demanding that I drop everything and help them since I'm now the "expert", though.
I've found that even if you start with #1, you end up doing #2 anyway, whether because the documentation wasn't complete or because you didn't realize what you were trying to build until you started building it. I always start with #2, since, in my mind, it often encompasses #1. I've also found #1 leads to analysis paralysis.