- bash is available pretty much everywhere, so if you learn it you can always use it. If you learn a weird new shell you'll occasionally be forced to fall back on bash anyway, so people learn just bash for efficiency's sake (learning one shell is painful enough as it is). And any proposed replacement will be non-portable.
- some of the things that make shell scripting terrible can't be fixed in the shell itself; they need changes to the entire ecosystem of console applications. E.g. it would be awesome if every Unix utility output structured data like JSON that could be parsed and filtered, instead of the soup of plaintext that has to be memorized and manipulated with awk, but that almost certainly won't happen. There's also a bunch of backward-compatibility requirements, like VT-100 and terminal escape sequences, limiting the scope of potential improvements.
- there's a great deal of overlap between "people who do a lot of shell scripting" and "people who are suspicious of New Stuff and reluctant to try it"
> it would be awesome if every Unix utility output structured data like JSON
I see this argument a lot, and I think it overlaps heavily with the struggles I see, as a DBRE, from devs trying to grok RDBMS.
Most (?) people working with web apps have become accustomed to JSON, and fully embrace its nesting capabilities. It’s remarkably convenient to be able to deeply nest attributes. RDBMS, of course, are historically flat. SQL:1999 added fixed-size, single-depth arrays, SQL:2003 expanded that to arbitrary nesting and size, and SQL:2016 added JSON. Still, the traditional (and IMO, correct) way to use RDBMS is to treat data as having relationships to other data, and to structure it accordingly. It’s challenging to do, especially when the DB providers have native JSON types available, but the reasons why you should are numerous (referential integrity, size efficiency, performance…).
Unix tooling is designed with plaintext output in mind because it’s simple, every other tool in the ecosystem understands it, and authors can rest assured that future tooling in the same ecosystem will also understand it. It’s a standard.
JSON is of course also a standard, but I would argue that on a pure CLI basis, the tooling supporting JSON as a first-class citizen (jq) is far more abstruse than, say, sed or awk. To be fair, a lot of that is probably due to the former’s functional programming paradigm, which is foreign to many.
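To make the comparison concrete, here’s the same (made-up) one-record extraction in both styles — the awk version leans on knowing the column position, the jq version on knowing the key name:

```shell
# Plaintext: grab field 2 by position (you have to know the column layout).
printf 'eth0 192.168.1.10 up\n' | awk '{ print $2 }'

# A JSON rendering of the same record: grab the field by name with jq.
printf '{"iface":"eth0","addr":"192.168.1.10","state":"up"}\n' | jq -r '.addr'
```

Both print `192.168.1.10`; for a one-liner like this the difference is taste, and jq’s learning curve only really bites once you start mapping and reducing.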
Personally, I’m a fan of plaintext, flat output simply because it makes it extremely easy to parse with existing tooling. I don’t want to have to fire up Python to do some simple data manipulation, I want to pipe output.
If there were some kind of standard or widely-followed convention for Unix tools to print plaintext, I wouldn't mind it so much. It's the fact that you need to memorize a different flag and output format for each tool, followed by an awk command where the intention of the code is generally very obtuse, which bothers me. By contrast, for all its faults, the fact that PowerShell has unambiguous syntax for "select this field from the output" helps a lot with both reading and writing. E.g. to get your IP address, `Get-NetIPAddress | ? {$_.InterfaceAlias -like 'Ethernet' -or $_.InterfaceAlias -like 'Wi-Fi'} | select IPAddress` is a lot clearer in intent than the Unix equivalent regex soup, and it can be written without looking anything up by printing the raw output of `Get-NetIPAddress` to the shell and seeing what you need to filter/select on. You can even get tab completion to help.
A hypothetical Unix equivalent doesn't need to be JSON, I just brought that up as an example. But any structured data would be an improvement over the situation now, and as the PS example shows, with the appropriate ecosystem tooling you can do all the data manipulation over pipes.
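As a sketch of what that could look like on the Unix side — using a stand-in for structured tool output (recent iproute2 can actually emit JSON via `ip -j`) — the filter/select step reads much like the PowerShell pipeline:

```shell
# Stand-in for structured output from a hypothetical network tool
# (real-world analogue: `ip -j addr`). Filter by interface, select the address.
printf '%s\n' '[{"ifname":"eth0","addr":"192.168.1.10"},{"ifname":"lo","addr":"127.0.0.1"}]' |
  jq -r '.[] | select(.ifname == "eth0") | .addr'
```

This prints `192.168.1.10`: `select` plays the role of PowerShell's `?`, and the trailing `.addr` plays the role of `select IPAddress` — all still over pipes.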
I’d guess there’s a solution to almost any set of priorities you have for shell scripting.
The domain is extremely challenging: a shell, by default, executes arbitrary programs on behalf of the caller, takes arbitrary text as input, and is interpreted.
All modern Unix systems let a script hand itself to literally any interpreter via the `#!/usr/bin/env randombinary` shebang incantation.
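A minimal illustration of that incantation (assuming `python3` is on the PATH — any binary would do in its place):

```shell
# Write a script whose interpreter is resolved via env, mark it executable, run it.
cat > /tmp/envdemo <<'EOF'
#!/usr/bin/env python3
print("interpreted by python, not a shell")
EOF
chmod +x /tmp/envdemo
/tmp/envdemo
```

The kernel reads the `#!` line and hands the file to whatever `env` finds first on the PATH, so the launching shell is irrelevant.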
Upshot: bash has its place, and while some of that place is unearned inertia, much of it is earned, and often chosen by experienced technologists.
In my case, if I’m doing a lot of gluing together of text outputs from binaries, bash is my go-to tool. It’s extremely fast and expressive, and roughly 1/4 the length of, say, python, and 1/8 the length of, say, go.
If I’m doing a lot of logic on text: python. Deploying lots of places/different architectures: go
Evolve into what? Something that works and doesn't have foot guns? Then you might as well just use any other programming language that exists today. I won't suggest an alternative, as that just attracts flame wars.
Why hasn't using a rock as a tool evolved? Exactly, because it's god-awful and got superseded, use a real hammer or a cordless screwdriver instead.
Just like the post about improvements to C on the front page today, that list can be made infinite. Language designers have already learned from those mistakes and created Zig, Rust, Go, or even C++, which fix many of them and more. Fixing all the flaws would turn it into a different language anyhow. There's only so much you can polish a turd.
It wouldn't be a worthy bash feature if, after learning about it, you didn't spend a few days figuring out why the damn thing doesn't work the way it would in any other language.
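One classic instance of the genre (my example, not necessarily the one meant above): each stage of a bash pipeline runs in a subshell, so a variable assigned inside a `while read` loop silently vanishes afterwards:

```shell
count=0
printf 'a\nb\n' | while read -r line; do
  count=$((count + 1))   # increments a copy inside the pipeline's subshell
done
echo "$count"            # prints 0, not the 2 you'd expect
```

In most languages mutating a counter in a loop just works; here the fix (process substitution, `lastpipe`, or restructuring) is exactly the kind of thing that eats an afternoon.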
Too many. I still write /bin/sh syntax (I know it's a symlink to bash now, but I mean the old-school sh). Anything that requires bash-that-isn't-sh is usually better written in perl or something else.