Four rules that should be tattooed on the back of every technical writer’s hands so they can’t accidentally forget them.
1. Poor grammar and bad writing are often a sign of poor comprehension.
2. Good documentation takes time.
3. Deep expertise is not automatically a prerequisite for good documentation.
4. Don’t let working cultures that put too great a premium on knowing everything dominate - i.e. being 'in the know' should be a tool for helping others up rather than beating them down.
Even in a non-native speaker, they are still signs of poor comprehension. If you are going to have someone write your documentation, they had better know the language. There are companies I have never bought from a second time because their documentation was so bad.
And, truthfully, many non-native writers are as good as many natives. Non-native speakers can be more readily identified (if for some reason you want to) through unusual word choices or stilted (overly formal) grammar.
There are other, more reliable signs of the same sort of sloppiness that are independent of native language, and those are what I would look for if I were analyzing a piece of documentation. Inconsistent terminology is probably the biggest one. I struggle with this myself.
It doesn't matter what a "widget" is within your system, but it ought to be one thing and stay that one thing and everything that isn't that thing shouldn't be a "widget". (I'm not even trying to choose a generic word to make a point; I'm literally between two sessions where I'm working on the documentation for my local "widget" concept, which isn't quite like any other widget in the world, most likely.)
"Deep expertise is not automatically a prerequisite for good documentation."
A beginner's mind is a big plus.
When documenting my stuff, or writing tech articles, I have to try hard to recall what things tripped me up or what things feel obvious now but were hardly so when I started.
That can backfire if there's a big gap between the tutorial and the reference documentation. I had a huge and extremely unpleasant initial frustration with Boost Python. The tutorial had me download the Boost source code, build it, and work in the Boost source tree. It was slick and easy. A very exciting start! After I did the tutorial I blew away the Boost source tree, installed the libboost-python package, and got utterly stuck. The reference documentation provided no guidance. There was no libboost-python-dev version of the package at that time, so I thought installing libboost-python would give me everything I needed to develop. Oh well, I eventually figured it out and rebuilt the Boost source.
But to build code that wasn't inside the Boost build tree, I had to read the documentation for the Jam build language so I could modify the build files! I had to learn the damn build language so I could fix the build files so I could get Hello Freakin' World working in a top-level directory. After that I never had to touch the build files in any meaningful way again. A page of documentation could have saved me a whole lot of work.
Partly I blame the tutorial for raising my expectations of the documentation. I've come to accept that software often requires fiddling, but the Boost Python tutorial was so slick and easy that I figured the answer HAD to be in the documentation, right in front of me. The whole time I was working on figuring out the build files, about 10% of my brain was working on the problem and 90% of my brain was screaming, "You idiot! This is a well-documented project! You don't have to think this hard to get Hello World working! Stop fiddling and go back to the documentation and find what you missed!" And I did waste a whole hell of a lot of time re-reading the documentation looking for what I missed.
This means you need to playtest the tutorial under all sorts of different circumstances, making sure that it always works (even on Windows).
Just a little offhand comment. MSFT is totally screwed when this is the average developer's mindset. We were always so worried about 'Linux being ready for the desktop' that nobody noticed that Mac & Linux have owned the developer desktop for years, long enough for the author's comment to be not even noteworthy.
I usually have to struggle to make things work on Windows.
I have a friend -- an electronics hacker -- who insists on using Windows for everything. He recently wanted to play with a couple of web frameworks so he could pick one for a website he's building for an early stage startup. Getting him set up with Symfony and Rails was crazy. Now I fear the day he might want me to set up Django or Pylons :(
I once tried to set up a decent development environment for myself on Windows 7. Needless to say, I failed. Windows is excellent for .NET development (VS has no competition), but for anything else you're better off with a decent UNIX.
> I usually have to struggle to make things work on Windows.
My strategy for this has been to start with a "any Windows support is entirely accidental" approach, and then once someone steps forward to fix it place them in charge of Windows compatibility. There are never very many Windows users who are actually interested in helping though, especially compared to the number who complain about it not working.
We just bought a half dozen MacBookPros so the QA team would be on something the Dev team could write automated testing tools for. After seeing how much time we had been wasting on Windows, when we finally asked for the switch, management didn't even flinch. You could say it was a cost saving measure.
Personally I'd rather someone not write an opensource project than write it and not document it. Understanding someone else's thought process is hard and it gets to be an order of magnitude more difficult without documentation. If it's easier for your average hacker to write his/her own implementation than use your library, you've failed.
I really doubt that's true. You're saying that instead of using a poorly-documented mencoder filter to fade a video in and out, you'd rather have to write your own filter or not be able to use mencoder for the job at all?
While working on a project with a tight deadline, I've needed complex functionality that wasn't well-documented, and I've needed complex functionality that wasn't there. When it wasn't there, I was forced to produce complex functionality that wasn't documented. If I'm generous enough to release that for others to use, reference, and modify for free, then the last thing I want to hear is somebody bitching about how I didn't document it well enough for them. I have other things to do, and for me it was a one-off project.
In a pinch, source code is documentation, and it's pretty hard to make an argument that undocumented source code is worse than nothing at all. At least then when you're implementing your nicely-documented tool (because you'd rather have documentation than get stuff done), you'll have a reference implementation to look at and test against.
Note: I'm not arguing that documentation isn't important, I absolutely agree that it is. I just think it's absurd to rank the importance of documentation above the stuff that it's documenting.
It's my personal preference, so it's true whether you doubt it or not. :-p
Using undocumented code is like programming in a language I hate. Mentally it's a really shitty experience and I'm not in a position where I have to settle for that. If I needed to do it to put food on the table, obviously I would.
You're also creating this false dichotomy between getting stuff done and documenting your code. I've never been in a position to have to choose between the two. If you have time to write a good program but not enough time to document it, I would question whether the program you wrote is actually good. If I'm writing a paper, sure I can spew shit out on a page for a couple hours and produce a "paper". But realistically if I'm actually going to produce a paper I need time for outlining and proofreading.
(Note that "documentation" isn't really what matters, it's that the code is understandable. If the code is self-documenting or the program is small enough, I consider that documented code.)
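A made-up example of the distinction (names, numbers, and the pricing rule are all invented for illustration): both methods below compute the same thing, but only the second explains itself without any external documentation.

```java
public class Pricing {
    // Opaque: the reader has to reverse-engineer what 0.9 and 100 mean.
    static double f(double p, int q) {
        return q >= 100 ? p * q * 0.9 : p * q;
    }

    // Self-documenting: the names and named constants carry the intent.
    static final int BULK_THRESHOLD = 100;
    static final double BULK_DISCOUNT = 0.9;

    static double totalPrice(double unitPrice, int quantity) {
        double total = unitPrice * quantity;
        return quantity >= BULK_THRESHOLD ? total * BULK_DISCOUNT : total;
    }

    public static void main(String[] args) {
        System.out.println(totalPrice(2.0, 100)); // prints 180.0
    }
}
```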
Quite often, "getting stuff done" and documenting your code are two separate steps. The example I made up above was not something I pulled out of the air. I've been doing a lot of different little jobs for a client, and one of those was to create a program that could automatically publish a finished video based on a raw video input (adding an intro and outro sequence, fading in and out, etc.). To do that, I used ffmpeg.
ffmpeg had recently dropped support for an old filter system and begun using a new, not-fully-fleshed-out system. The filter creation process was pretty well documented, and there were a few well-documented filters. There were a filter or two that I needed which were documented poorly but had a couple of usage examples, just enough to get me going. I was very grateful for these! Then there were a few filters I needed that weren't there. I had to write them, which involved some level of trial-and-error to get working correctly.
My line of work is not video filters. The work I typically do for clients does not involve video processing. If I quote a client an amount for a job, the more time I spend on a one-off job, the more money I'm subtracting from my hourly, and the longer until I move onto the next job. Quite literally, I can get more stuff done, or I can document something that I'll never use again.
Reminds me of one of my favorite quotes from an IT manager whose project was slowly going mushy. She assembled the team and told everyone they had to stop all work and "write some emergency documentation." I was working with another group a few cubes over and so had enough detachment to savor the phrase "emergency documentation" without personally wanting to leap from the nearest ledge. The "documentation culture" phrase from the article seems like the opposite-in-a-good-way strategy from "emergency documentation."
Most documentation fails in giving too little overview.
The trees are described, the bark and leaves are commented, but there is no map of the forest.
- p. 165, The Mythical Man-Month (ch. 15, Anniversary Ed.). Some of it is dated, but like the rest of the book, most is incredibly modern.
Totally agree about the lack of overview. E.g. Java's docs on JTree are detailed and there is no shortage of tutorials and samples online, but I've yet to find an overall description of how it fits together. One has to piece it together from carelessly left clues.
Auto-generated documentation can be quite useful for working out how things fit together, and you can click through linked types to follow the connections. This is one advantage of statically typed languages like Java (bonus: combining source and doc-comments - "literate programming" - keeps them synchronized, or helps to). I often scan javadocs by return type and/or arg types to find the method I need. Haskell has an amazing doc tool where you enter a type signature, and it tells you the functions that match. (BTW: I love this, but do haskellers find it useful on real-world projects, in practice?) Python benefits greatly from not having static types; this is one of the few disadvantages of that choice.
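A minimal sketch of the doc-comment idea (the class and method here are invented for illustration): because the comment lives right next to the signature it describes, a doc generator can turn it into browsable, cross-linked reference pages, and a change to the code is hard to make without noticing the stale comment beside it.

```java
/** Small utility class; the doc-comments below feed a javadoc-style generator. */
public class Clamp {
    /**
     * Returns {@code value} limited to the inclusive range [{@code lo}, {@code hi}].
     *
     * @param value the input to clamp
     * @param lo    the lower bound (inclusive)
     * @param hi    the upper bound (inclusive)
     * @return {@code value} if it lies within the range, otherwise the nearer bound
     */
    public static int clamp(int value, int lo, int hi) {
        return Math.max(lo, Math.min(hi, value));
    }

    public static void main(String[] args) {
        System.out.println(clamp(15, 0, 10)); // prints 10
    }
}
```

Running `javadoc` over a file like this produces the linked HTML pages described above, with the parameter table generated from the `@param` tags.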
The author was honest enough to admit some particulars where he didn't follow his advice. It's easy to advise on the right thing to do, but if you find it difficult to get it done yourself, is it really practical advice, under real-world pressures, on those particulars? Otherwise we're just mouthing empty folklore, like Catmull's architect who said to design "inside out" but didn't.