It’s unfortunate that Go’s standard package `flag` doesn’t follow the standard either, given the language is otherwise a good fit for command-line tools.
I ran into a related issue a couple of years back, where people were using single-dash long flags with a C++ project that used Abseil flags in conjunction with getopt parsing of short flags (for legacy reasons). Why were they using single-dash flags, despite that not appearing anywhere in our documentation? They had copy-pasted from --help.
(I'm happy to say that --help in Abseil has since been fixed.)
But that doesn't preclude mistakes by collision (a run of short flags that together spell a long one) or unpredictable bugs in a long-flag interpreter (a short flag being a substring of a long one). Both are trivially common bugs when this ambiguity is allowed, especially when an API is ported to another environment with less tooling standardization around interpreting the input.
Go doesn't allow running multiple short flags together, or jamming a flag's argument directly against its name, so neither of those is directly relevant here.
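To make that concrete, here's a minimal sketch of what Go's `flag` package actually accepts (the flag names and program name here are made up for illustration):

```go
package main

import (
	"flag"
	"fmt"
)

func main() {
	verbose := flag.Bool("v", false, "verbose output")
	name := flag.String("name", "", "a name to print")
	flag.Parse()
	fmt.Println("verbose:", *verbose, "name:", *name)
}
```

Called as `./prog -v --name=world` or `./prog --v -name world`, both parse the same way, since the package treats one dash and two dashes as equivalent. `./prog -vx`, on the other hand, is rejected as an unknown flag rather than being expanded to `-v -x`.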
Also, that first issue happens with POSIX flags (with the GNU long-flag extension, anyhow): `grep -help` is different from `grep --help`. The former is parsed as `-h -e lp`, so if you type it, grep just waits patiently for you to close stdin.
Which is also why Windows uses backslash (\) as its path separator: forward slash would have collided with the slash option marker Windows inherited from VMS.
That is surprisingly false. Microsoft operating systems accept both / and \ as path separators, going all the way back to DOS.
Early versions of MS-DOS even made it a user preference, via the SWITCHAR setting honored by the COMMAND.COM interpreter: whether to use / for options and \ for path separation, or vice versa.
In longer words: Windows was originally a GUI system on top of DOS, which in turn was influenced by CP/M. The NT kernel did away with DOS, but the influence lives on to this day. A simple example: not being able to name a file "con" (in any mix of upper and lower case) comes all the way from CP/M.
For the uninitiated: OSes from that era didn't have "directories"; everything lived in the root of the drive, including device files. So, to print a file, you could literally do something like:
```
A> type FILE.TXT > PRN
```
When DOS added directories, they retained this "feature" so programs unaware of what directories were could still print by writing to the `PRN` "file". Because of "backwards compatibility", NT still has this "feature" as well.
One thing VMS got right is that each binary declared its supported options and the shell could tell you what they were. And it would take any unique abbreviation.
Powershell scripts and cmdlets work similarly. They probably won't have help text but at least you can see what's available without having to look at the argument parsing section of the script. And you can use the shortest unique prefix as the short form of an argument (though I don't love this since adding an argument can break the shortened form of other arguments)
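To illustrate the mechanism and its failure mode, here's a toy sketch of that resolution rule in Go (the option names are invented, and real VMS/PowerShell resolution has more to it): accept any prefix that matches exactly one declared option.

```go
package main

import (
	"fmt"
	"strings"
)

// resolve expands a possibly-abbreviated option against the declared
// set: an exact match always wins, and otherwise any prefix is
// accepted as long as it matches exactly one declared option.
func resolve(declared []string, arg string) (string, error) {
	var matches []string
	for _, opt := range declared {
		if opt == arg {
			return opt, nil
		}
		if strings.HasPrefix(opt, arg) {
			matches = append(matches, opt)
		}
	}
	switch len(matches) {
	case 1:
		return matches[0], nil
	case 0:
		return "", fmt.Errorf("unknown option %q", arg)
	default:
		return "", fmt.Errorf("ambiguous option %q (could be %s)",
			arg, strings.Join(matches, ", "))
	}
}

func main() {
	declared := []string{"recurse", "force", "filter"}
	for _, arg := range []string{"rec", "force", "f"} {
		opt, err := resolve(declared, arg)
		if err != nil {
			fmt.Println(arg, "->", err)
			continue
		}
		fmt.Println(arg, "->", opt)
	}
}
```

Note how merely declaring `filter` is what turns plain `f` from a handy abbreviation of `force` into an error, which is exactly the compatibility hazard mentioned above.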
It'd make the typing simpler. PowerShell has POSIX-like aliases, like 'rm' and 'cd', but they don't accept POSIX parameters. So you end up with "rm -Recurse", since rm is an alias for Remove-Item.
I like PS in theory but the syntax and naming just absolutely kill me. What were they smoking when they named as simple an operation as delete "Remove-Item"? And what's with all of the capital letters?
That's what happens, I guess, when the people designing it haven't actually used a CLI much day to day, because, well, they're using Windows.
I can't agree. I have used Linux shells for a long time (since '97), and while in the olden days I'd be the one laughing at VBS and all that awfulness, I'd take PowerShell any day.
The short, terse commands and the really awkward, confusing, mistake-prone syntax of sh or bash really rear their ugly heads in scripts.
Interactive shell? No problem. But that's the beauty of PowerShell: verbosity and correctness in scripts, where the IDE quickly expands those long commands, and short aliases for interactive use.
> The short terse commands and the really awkward, confusing, mistake prone syntax
When used in an interactive shell, short commands save time and effort. And they are easy to learn and remember, because in everyday work you need only about 10 commands. For some commands which I use a lot, I have one- or two-letter aliases to type even less, e.g. i=fgrep.
It makes shell scripts less readable for someone who comes from Windows and doesn't even know the common shell commands, but for someone who uses a shell at least from time to time, they should be easy to read.
Yeah I agree with that. Bash (and friends) scripts are awful. PS scripts are nice and readable, and not subject to the insane quirks of bash ([ vs [[ vs test? come on)
Seems like the real solution is separating scripts from interactive use.
Ironically, it already happened: bash for the user interface, while /bin/sh is something else. But bash in that role remains a REPL that was accidentally promoted to a user interface.
> What were they smoking when they named as simple an operation as delete "Remove-Item"?
Simple. All these commands work with providers, of which the file system is just one. Other providers include the Windows Registry, environment variables, certificate stores, and the functions and variables in the PowerShell runtime. More providers can also be created and plugged into the system. PowerShell providers are essentially Windows's FUSE. See [0] for details.
So, for instance, you can do `Get-ChildItem HKCU:` to list entries under HKEY_CURRENT_USER in the Registry, the same way `Get-ChildItem C:/` will list the top-level items on the C: drive. Worth observing: while the console output of these two commands looks similar, the results are in fact different objects underneath (Microsoft.Win32.RegistryKey vs. System.IO.FileInfo).
In short, these commands are an abstraction over file-system-like things. Whether or not that was a good idea is a different question.
It makes a little more sense in context to me. The verbose Verb-Noun naming works because the verbs are designed to be a limited set. E.g. there's Remove- but no Delete- in the standard set (shown by `Get-Verb`). So you can press ctrl+space after typing Remove- and see all the different kinds of things you can remove. If that's too many, you can filter to Remove-<prefix>* etc. The verbosity of cmdlet names when using it as a shell is mitigated by the aliases (e.g. rm), and for parameters by accepting any case and any non-ambiguous shortening (e.g. `rm -rec -fo`).
I guess the capitalisation comes from C# or .NET's casing? I like PascalCase for its great readability/conciseness tradeoff over the others, and it's case-insensitive anyway, as is standard on Windows, so I've never had a huge issue with it.
The tradeoff is that "all the things I can remove" is usually "the set of all things my shell knows about" and not "the set of things related to my task at the moment" -- ChildItem-* would be more helpful!
Neat thing you can do is type "*-Noun" and the tab completion will give you options that fill in the "*". Alternatively "Get-Command *-Noun" will also list out all of the matching commands. Get-Help also supports that kind of wildcard so you get the list of commands along with their help summary.
The "*" can even be in the middle. I open VS solution files all the time from Powershell. Since there are often many other files and folders with similar names alongside them I just type ".\*.sln" and hit tab.
I disagree and agree with the sentiment. As someone more familiar with Linux, I sure would prefer to be able to assume a similar style.
But the biggest thing I'm happy about WRT Powershell is that it's consistent (and pretty well documented). At least it makes sense. Batch scripting really didn't.
Except they did, and I for one wish traditional Unix shells would die. Composing software by having every single program and script include a half-assed parser and serializer is causing a lot of unnecessary waste and occasional security problems in computing. Moving structured data in pipes is just a better idea.
Wish I could (actually, I'd prefer JSONB or some other binary format). Unfortunately, every program in the UNIX ecosystem assumes unstructured text in pipes and makes it my responsibility to glue them together by building ad-hoc parsers out of grep, head, sort, sed, and awk.
A lot of more recent programs (such as the AWS and K8s tools) can easily output JSON. You can make schemas match, but most of the time you'll need something like jq to transform what one program outputs into what makes sense for the other.
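To show the consuming side of that, here's a sketch of a Go filter reading one JSON object per line from a pipe (the record shape is invented; real tools each have their own schema):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// record is the shape we assume the upstream tool emits, one JSON
// object per line. The field names are invented for illustration.
type record struct {
	Name   string `json:"name"`
	Status string `json:"status"`
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		var r record
		// A malformed record is a reportable structural error here,
		// not a silently mis-split line as with ad-hoc text munging.
		if err := json.Unmarshal(sc.Bytes(), &r); err != nil {
			fmt.Fprintln(os.Stderr, "bad record:", err)
			continue
		}
		if r.Status == "running" {
			fmt.Println(r.Name)
		}
	}
}
```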
I always try to design my tools with a "terse" output that makes it easier to pipe it into other programs.
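One way to do that, sketched in Go (the `-json` flag and record shape are invented for illustration), is to default to terse tab-separated lines and keep machine-readable output behind a flag:

```go
package main

import (
	"encoding/json"
	"flag"
	"fmt"
	"os"
)

type item struct {
	Name string `json:"name"`
	Size int64  `json:"size"`
}

func main() {
	// Hypothetical flag: structured output on demand,
	// terse tab-separated text by default.
	asJSON := flag.Bool("json", false, "emit one JSON object per line")
	flag.Parse()

	items := []item{{"a.txt", 120}, {"b.txt", 4096}} // stand-in data

	enc := json.NewEncoder(os.Stdout)
	for _, it := range items {
		if *asJSON {
			enc.Encode(it) // Encode appends a newline after each object
		} else {
			fmt.Printf("%s\t%d\n", it.Name, it.Size)
		}
	}
}
```

The terse default stays friendly to cut, sort, and awk, while the structured mode lets downstream consumers parse records instead of scraping columns.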