> Normal shell usage means doing a lot of one-shot pipelines incrementally.
Sure because no-one ever takes those "one-shot" pipelines full of `cut`s and `grep`s and distributes them in real products. /s
As far as I can tell, PowerShell is designed for scripts that are meant to be distributed, so they must be robust. Normal 'text scraping' Bash is shit at that.
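To make the "text scraping" complaint concrete, here's a hypothetical (not from the thread) example of the failure mode: a pipeline that picks fields by whitespace-separated column position works until the data itself contains the separator.

```shell
# Hypothetical illustration: extracting filenames from ls-style output
# by column number. Field 9 is the filename... until it isn't.
printf 'total 8\n-rw-r--r-- 1 alice staff 120 Jan  1 10:00 notes.txt\n' |
  awk 'NR > 1 {print $9}'
# prints: notes.txt

# The same scrape on a filename containing a space silently truncates:
printf 'total 8\n-rw-r--r-- 1 alice staff 120 Jan  1 10:00 my notes.txt\n' |
  awk 'NR > 1 {print $9}'
# prints: my
```

The pipeline doesn't error out; it quietly returns wrong data, which is exactly why distributing these one-shots is risky.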
If we are not talking about interactive use, then for distributing "robust" scripts you could use the nicer and richer Perl or Python ecosystems.
For interactive use, the conciseness, the human-readability of input/output, and the tolerance of unrelated changes in input/output formats that Unix shells provide are also preferable.
The motivation for PowerShell was to have something good at both. Why switch contexts when moving from interactive use to coded automation if you don't have to?
Because the requirements change depending on which one you're doing: quick shell automation isn't the same as the kind of automation you'd reach for Perl/Python to do.
And yet most of the time I see bash being used for the same kind of automation that you think is good for perl/python.
And not because bash is good, but because it is what people know (since they use it interactively) and because it is installed everywhere.
So if someone had a tool that was installed everywhere, and used interactively, that could also be used to create more robust automation tasks, that seems like a win to me.
It depends on how complex the automation is. I'd no doubt use a pipeline in places where you'd use perl/python.
As for interactive use and robust automation, Bash isn't as bad as you'd think. The reason I'd go to python is because of script complexity, not lack of robustness.
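A sketch of what "robust enough" Bash can look like, assuming bash with standard POSIX/GNU tools (this is my illustration, not something from the thread): instead of scraping formatted output, ask the tool for a null-delimited stream, which survives spaces and other surprises in filenames.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of defensive bash automation.
set -euo pipefail

# Work in a throwaway directory with a deliberately awkward filename.
dir=$(mktemp -d)
touch "$dir/plain.log" "$dir/with space.log"

# -print0 / read -d '' keeps each filename as one intact record,
# where a naive `ls | grep` scrape would split "with space.log".
find "$dir" -name '*.log' -print0 |
  while IFS= read -r -d '' f; do
    printf 'found: %s\n' "$f"
  done

rm -rf "$dir"
```

It's noticeably more ceremony than the interactive one-liner, which is the trade-off being argued about: the robustness is available, you just have to opt into it.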