I particularly like the golang one, because it results in smallish (~1 MB) files that can crash-land on any Linux with no other bootstrap at all... which, while troubling for replacing your entire library of shell scripts, is great for bootstrapping. (For replacing your entire library of shell scripts, I just use it with `go run` after my bootstrap has installed the go toolchain.)
Li Haoyi just gave a talk at Scala by the Bay entitled "Beyond Bash", which details the shell scripting environment he's been working on that is hosted in a reimplementation of the Scala REPL, called Ammonite.
I wasn't at the talk, but I downloaded it and have been playing around with it. It's really fun!
The path operations are all typed (ie, you can't combine relative and absolute paths in stupid ways), you get all of Scala to operate on files and the filesystem (if you know Scala, this is pretty huge), and it has a handy pipelining syntax that is effectively an extension of the shell `|` operator we all know and love: http://lihaoyi.github.io/Ammonite/#Extensions
There are other niceties built in as well, like syntax highlighting and pretty printing, that gave me the impression that the author really cares about the UX of the software. It's not all academic/pure, in fact it appears to be the kind of pragmatic, practical thing that I wish Scala was known for. I highly recommend giving it a shot, especially if you already know Scala. I definitely will be giving it some time in the coming weeks.
I've been playing around with the werc framework, 9base, plan9port and other Plan 9-derived tooling and have found rc shell to be rather pleasant compared to Bourne and Korn dialects.
Bash isn't particularly hard. Unfortunately, it carries a significant amount of historical baggage that makes things very confusing. Many examples and existing bash scripts use these older features, which can make learning bash even more confusing.
A couple examples of what I mean are:
# old style command substitution (don't use this)
echo "`ls *.mp3 | wc -l` MP3 files in $PWD"
# new style
echo "$(ls *.mp3 | wc -l) MP3 files in $PWD"
# while it is used in some place, the use of $*
# to mean "all arguments" is probably always wrong
for i in $* ; do do_something $i ; done
# instead, you almost always want "$@"
# (and always use quotes on variable expansion)
for i in "$@" ; do do_something "${i}" ; done
A lot of really nasty sometimes-incorrect behavior goes away when you use the modern replacements.
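A minimal demonstration of that difference (the `demo` function here is just an illustrative name), using an argument that contains a space:

```shell
#!/bin/sh
demo() {
    # Unquoted $* re-splits on whitespace: three words here.
    for i in $*; do printf '<%s>\n' "$i"; done
    echo ---
    # Quoted "$@" preserves the original two arguments.
    for i in "$@"; do printf '<%s>\n' "$i"; done
}

demo "one two" three
```

The first loop prints `<one>`, `<two>`, `<three>`; the second prints `<one two>`, `<three>` — only the quoted form keeps the caller's arguments intact.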
Another big thing that confuses people at first with sh/bash-style shell script is that they treat it like a regular programming language. Instead, realize that most of the magic happens as "expansions" of the command line. Thinking of bash as something closer to a fancy macro language doing simple string manipulation can help a lot.
Finally: RTFM. Seriously. Modern versions of bash have a very nice manual. Cargo-culting pieces of existing scripts may solve an immediate problem, but it won't teach you the real language nearly as well as simply reading bash(1) (especially the "EXPANSION" section).
greycat's writings are the best thing for bash, in my opinion. The submitted article is his 'bash pitfalls'; his formal bash guide is here: http://mywiki.wooledge.org/BashGuide
greycat is a guy who idles in #bash on freenode and has helped thousands of people.
The funny thing is, I don't consider myself a very smart person or a programmer, but I've gotten bash down pretty well at this point. And I'm a person who has a looooot of trouble with C++ etc., most of my code is shit, if not just unusable.
Though, I have been using the Linux command line daily for 12 years now for sysadmin tasks and such. I know there's a lot of weird stuff to worry about (especially as noted in the 'bash pitfalls' guide), but I think with a little bit of practice (about 6 months' worth) and good instruction (I love greycat's guides), anyone can get a reasonably good handle on bash.
Agree. Also, http://wiki.bash-hackers.org/doku.php has tons of great info. Sad that TLDP gets all the Google ranking for most bash searches when wooledge and BHW typically have much more detailed and modern (e.g. using "${var}" and [[ instead of [ ) explanations.
Lastly, if you're really stuck on a bash problem, head on over to #bash in Freenode and the awesome people there will very likely help! Greycat hangs out there.
I have yet to see a good rationale for this. If you need the additional operators [[ provides, such as regular expression matching, great. Otherwise, why not just use [ and remain portable?
Referencing the other thread on shell scripting currently on the front page (https://news.ycombinator.com/item?id=10068668), some of these examples show ways in which bash is actually quite verbose:
# POSIX
for i in *.mp3; do
[ -e "$i" ] || continue
some_command "$i"
done
# HYPOTHETICAL SYNTAX 1
map some_command *.mp3
# HYPOTHETICAL SYNTAX 2
some_command *.mp3
It's hard to imagine how to create simple syntax for operations like that while accommodating other syntactic requirements (strings without quotation marks, pipelines, etc.), but I dream about a shell language that lets me do things like the above.
map() {
    local COMMAND="${1:?Argument is required: command to execute, e.g. \"echo\". Example: \"map echo *\".}"
    shift
    local I
    for I in "$@"
    do
        [ -e "$I" ] || continue
        $COMMAND "$I"
    done
}
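A usage sketch (restating the function so the snippet runs on its own; note that a multi-word command must be passed as one quoted string, since it relies on the unquoted $COMMAND expansion being re-split):

```shell
#!/bin/bash
map() {
    local COMMAND="${1:?command required, e.g. \"map echo *\"}"
    shift
    local I
    for I in "$@"; do
        [ -e "$I" ] || continue
        $COMMAND "$I"
    done
}

# Demo in a scratch directory:
dir=$(mktemp -d)
touch "$dir/a.mp3" "$dir/b with space.mp3"

map echo "$dir"/*.mp3     # echoes each existing file, spaces intact
map "wc -c" "$dir"/*.mp3  # multi-word command as one quoted argument
```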
The first command distinguishes broken links, and by changing the -e to different letters it can differentiate between 23 different file attributes and act differently depending on each. Your hypothetical examples do not have that ability. And to do what your commands would imply, usually you really can just run "some_command *.mp3"; it's not hypothetical!
There are two points here. You say you can do "some_command *.mp3"...true if either a) your command is safe to run with garbage input or b) you know that you have >= one file and no whitespace in filenames. Both of these are often reasonable assumptions, but I want a general purpose solution that doesn't have more edge cases than it has to.
The mental model of bash is "this text expands to this text like so, which means x happens." The model I want is "I'm gonna do x to each of these files."
As for the 23 commands, my attitude is that this is usually either ~19 too many, or infinitely too few.
I acknowledge that maybe the way of doing things will start to make sense at some point, but it hasn't happened yet, and I find trying to climb the hill of understanding the shell to be a joyless experience.
> You say you can do "some_command *.mp3"...true if either a) your command is safe to run with garbage input or b) you know that you have >= one file and no whitespace in filenames.
Globbing preserves whitespace. It's up to the target function to discard it, by forgetting to quote arguments.
Everything I know of bash I learned from the Freenode IRC #bash channel, which has a very active bot that always points to this wiki. Yet in so many scripts at work I see the dreaded
for i in `ls`; do...
Which only works due to the fact that we hardly have any files with spaces, newlines etc..
Alternatively, allow everything in filenames, including whitespace and '/', but escape them on the filesystem using something like URL escaping (%20, %2F). No reason the filesystem names have to match the user-friendly names precisely, as long as a lossless bidirectional conversion exists.
Neat idea. If I understand correctly, it'd still be fair to say that those characters wouldn't be allowed in file names, since encoding/decoding the friendly form would be opt-in work that each and every userspace program couldn't be relied upon to do?
A really helpful guide, particularly for someone just starting out on Linux such as myself. Hopefully I will not get into bad habits to begin with :-)
Having looked at my scripts I seem to have been pretty cautious already (Windows batch has already scarred me plenty), but I have shored up a few minor areas.
https://amoffat.github.io/sh/