I was also constantly thinking of PowerShell while reading that. A PowerShell-specific list of such advice would actually be rather short, given that most of the pitfalls are already avoided. I still firmly believe that PowerShell is a much more consistent take on the Unix shell, in that several concepts that ought to be separate actually are orthogonal. Let's see:
Input from stdin, output to stdout: Nicely side-stepped in that most cmdlets allow binding pipeline input to a parameter (either by value or by property name, as needed). Filters are trivial to write, too.
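A minimal sketch of both styles (Get-FileLength and Get-Extension are made-up names just for illustration):

function Get-FileLength {
    param(
        # Binds the FullName property of whatever comes down the pipeline (by-property-name binding)
        [Parameter(ValueFromPipelineByPropertyName)]
        [string]$FullName
    )
    process { (Get-Item $FullName).Length }
}

# A filter is just a function whose body runs once per pipeline object ($_)
filter Get-Extension { $_.Extension }

Get-ChildItem *.txt | Get-FileLength
Get-ChildItem | Get-Extension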
Output should be free from headers: Side-stepped as well, in that decoration comes from the Format-* cmdlets that should only ever be at the end of a pipeline that's shown to the user.
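For example, everything before the Format-Table below stays plain objects; only the last pipeline element adds decoration for display:

# Format-Table only decorates the result for the user, so it comes last
Get-Process | Sort-Object WS -Descending | Select-Object -First 5 |
    Format-Table Name, Id, WS -AutoSize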
Simple to parse and to compose: Well, objects. Can't beat parsing that you don't need to do.
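For instance, instead of scraping ls -l output you filter on typed properties directly; sizes are numbers and timestamps are DateTime values, not columns of text:

# No awk/cut-style parsing: Length is an Int64, LastWriteTime a DateTime
Get-ChildItem |
    Where-Object { $_.Length -gt 1MB -and $_.LastWriteTime -gt (Get-Date).AddDays(-7) } |
    ForEach-Object Name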
Output as API: Well, since output is either a collection of objects or nothing (e.g. if an exception happened) there isn't the problem that you're getting back something unexpected.
Diagnostics on stderr: Automatic with exceptions and Write-Error. As an added bonus, errors are on stream 2, warnings on stream 3, verbose output on stream 4 and debug output on stream 5. All nicely separable if needed.
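A sketch of how the streams stay separable (Test-Streams is a made-up name):

function Test-Streams {
    [CmdletBinding()]
    param()
    'result object'                  # stream 1: the actual output
    Write-Error   'something broke'  # stream 2
    Write-Warning 'heads up'         # stream 3
    Write-Verbose 'details'          # stream 4, only shown with -Verbose
    Write-Debug   'internals'        # stream 5, only shown with -Debug
}

# Redirect warnings to a file and silence verbose output; objects stay on the pipeline
Test-Streams -Verbose 3> warnings.txt 4> $null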
Signal failures with an exit status: Automatic if needed ($?), but usually exception handling is easier.
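Both styles in a quick sketch:

Get-Item 'C:\does\not\exist' -ErrorAction SilentlyContinue
$?    # $false: the last statement reported an error

# Usually try/catch reads better than checking $? after every statement
try {
    Get-Item 'C:\does\not\exist' -ErrorAction Stop
}
catch {
    Write-Warning "Lookup failed: $_"
}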
Portable output: That's about the only advice that would still hold and be valuable. E.g. Select-String returns objects with a Filename property which is not a FileInfo, but only a string; subject to the same restrictions that are mentioned in the article.
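For example (assuming some *.log files in the current directory):

# MatchInfo.Filename is a plain string, not a FileInfo object
Select-String -Path *.log -Pattern 'error' | Select-Object Filename, LineNumber, Line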
Omit needless diagnostics: Since those go to either the debug or the verbose stream, they can be silenced easily and don't interfere with the things you care about. Cmdlets have a switch for each, so you only get that output if you actually ask for it.
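For example (the paths are made up), nothing extra is printed unless you explicitly ask for the verbose stream:

Copy-Item .\src -Destination .\backup -Recurse             # silent
Copy-Item .\src -Destination .\backup -Recurse -Verbose    # per-item messages on the verbose stream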
Avoid interactivity: Can happen when using the shell interactively, e.g.
Home:> Remove-Item
cmdlet Remove-Item at command pipeline position 1
Supply values for the following parameters:
Path[0]: _
However, this only ever happens if you do not bind anything to a parameter, which shouldn't happen in scripts. If you bind $null to a parameter, e.g. because pipeline input is empty or a subexpression returned no result, then an error is thrown instead, avoiding this problem.
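The difference in a nutshell:

Remove-Item                  # nothing bound to -Path: prompts interactively, as shown above
Remove-Item -Path $null      # $null bound to -Path: parameter binding error, no prompt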
Nitpick: You'd need ls | % Name or ls | % { $_.Name } there. Otherwise you'd have an expression as a pipeline element, which isn't allowed.
I have never used a computer that had access to PowerShell, but in my new job I may have to do some small stuff to tie some systems together. I'm terrified of learning it because I don't want to be lured into some kind of lock-in scenario.
It's only lock-in if you can't accomplish the same goals in a portable way. Of course, if you can put the knowledge to work as soon as you learn it, you're already starting to recoup your investment.
My issue with PowerShell is that it creates a distance between the scripting language and ordinary executables, which makes it difficult to use just a little bit (and, I suppose, violates the rule about composability).