"your server needs a GPU and an i5 to use our shell, as it provides a graphical interface and some shader animation because that's what attract the money people, they want shiny stuff y'know"
This trend of "new modern shells" that run and start as slowly as some JavaScript code (PowerShell) needs to stop.
People forgot what shells are for, and what scripting is for.
Count me in this camp as well. The hardware guys did their job: our hardware is now faster and more power-efficient than anything that came before it. But now the software guys are letting us down by making everything slow again.
Agreed. I actually wrote a multiline, dual-sided, powerline-inspired prompt with modular segments (info from git/pyenv, etc.) in bash, exactly to get these fancy shiny things without having to introduce new dependencies. I log into any server, pull my config, and Bob's your uncle. (Btw, that expression is so weird.) Of course, to get the powerline look I would need custom fonts on the client terminal, but I usually don't enable them, as I've added other "themes" instead.
Doesn't seem like a real problem. Why are we grumbling about this? No one is making rxvt less of an option. Why the fear? Why the negativity?
Does start-up time matter? Who cares? (I have three terminals that have been open for almost a year.) Are we concerned only about first start, or primarily the faster second starts once all the dynamic libraries have loaded? Does anything actually require an i5? What's wrong with requiring a GPU?
Doesn't it get tiring, being so grumbly about other people having fun & doing cool things? Do you really think we should do as you say & just freeze time, insisting on doing nothing?
I care. I care about start-up time. I care because I just had a blackout, and that requires booting; I care because I mess with my system and have to reboot sometimes. I care because I expect software to improve, not get worse. I care about engineering because bad engineering ultimately wastes my time and everyone else's. Everyone can have fun, and nobody should prohibit others from doing so, but valid criticism is valid criticism, even though I hate it.
With that said, the counterargument should've been that although PowerShell does start up slowly (and has many other issues), it is better than bash in many cases and more performant.
> I care because I just had a blackout, and that requires booting; I care because I mess with my system and have to reboot sometimes.
I have a very hard time believing you are talking about 5 minutes or more per year of wait time. Even two minutes feels suspect.
Personally, kitty or Alacritty or gnome-terminal or terminator or any other graphical terminal I've tried... they're slower to start, but it's under 2s, and faster on second load (let's say 1s). It's hard for me to imagine the amount of agony & bitterness, the "I am being deprived of valuable time", over something that costs, let's generously say, 10 instances of 2s a month: not even a full minute.
And no one is forcing you to switch off what you have. No one is forcing you to stop using serial console or whatever else.
People need to dial down their outrage. This is a huge social problem online. People are vastly overconcerned. Y'all are not being reasonable. You are being absolutist & maximalist about very particular, narrow concerns.
> I care because I expect software to improve, not get worse.
You have an exceedingly narrow & particular view of progress. And it's conservative, in that it recognizes & permits no other forms of growth or advancement. You hold one concern so high that it trumps all other concerns, and nothing but your own particular view matters.
> I have a very hard time believing you are talking about 5 minutes or more per year of wait time. Even two minutes feels suspect.
Well, the device I was whining about was a Pi 4 booting from a USB 3 HDD (so yes, it will not be instantaneous, because it's not an SSD), not an i20 SSD 100-core device. And I'm not outraged: I looked at it, saw it was a hog, and moved on (because, again, I care about that stuff). Though I am slightly outraged at you putting words in my mouth and making a caricature of me. You don't know me, and I don't have an "exceedingly narrow & particular view of progress"; I simply know that a device has limited resources and software uses those resources. Allowing software to get less performant means the system gets less snappy, and that I do hate (mostly because snappiness is what automatically alerts me to viruses (still have that paranoia) or a runaway process hogging the CPU).
I'm glad that you've never encountered this before, and I sincerely wish you never will: getting paged at 3 in the morning due to a server outage you couldn't diagnose remotely. You rushed to the server room, which was only 50F btw, connected to the machine, and brought up a rescue shell. Oh, did I tell you that none of the machines has an integrated GPU?
See, it's not about the time you sit in front of your M1 MacBook and have a nice cup of tea; it's about the situation where everything goes south and your tools and infrastructure can still have your back.
That's a weird strawman. Nobody aims to take away your standard framebuffer/textmode terminal. It's there for a reason, and I don't think this post cares at all about that use case. (Also, you should really invest in iLO or your vendor's equivalent; it pays for itself if you do trips like that.)
> being so grumbly about other people having fun & doing cool things
The downstream effect of people "doing cool things" (making insanely bloated crap) is that we often have to use it.
I don't understand this attitude that every claim and endeavor is immune from criticism as long as you can frame it as someone "having fun" (I'm sure these corporate software projects are super duper fun) or being experimental.
> One of its modules attempts to translate natural language requests into the correct shell commands and syntax. For example, if you typed “compress Documents folder,” CLAI will recommend the corresponding Tar command.
This is such a bad idea I don't know where to start. Shell commands are a dangerous but precise tool, somewhat like a scalpel or another surgical instrument. Dumbing them down so the shell can "guess what you want it to do" is going to result in more people (specifically, people who don't bother to read the docs) breaking things.
> Warren Teitelman originally wrote DWIM to fix his typos and spelling errors, so it was somewhat idiosyncratic to his style, and would often make hash of anyone else's typos if they were stylistically different. Some victims of DWIM thus claimed that the acronym stood for ‘Damn Warren’s Infernal Machine!'.
> In one notorious incident, Warren added a DWIM feature to the command interpreter used at Xerox PARC. One day another hacker there typed delete *$ to free up some disk space. (The editor there named backup files by appending $ to the original file name, so he was trying to delete any backup files left over from old editing sessions.) It happened that there weren't any editor backup files, so DWIM helpfully reported *$ not found, assuming you meant 'delete *'. It then started to delete all the files on the disk! The hacker managed to stop it with a Vulcan nerve pinch after only a half dozen or so files were lost.
Then 99% will give up. I mean, just try to read the man page of any modern GNU utility; it's so long and filled with options that nobody will ever read it all the way through.
What we need to do is dumb down surgical tools. Anyone should be able to do a coronary artery bypass with just a quick google and some AI assistance. That way your analogy will match the goal of these new shells.
If only there existed some thing, some sort of UI, that basically showed you all you could do with the tool, with some sort of checkbox thing for saying you want it to do this and that... gosh, we would be living in the future. Not a CUI where you have to memorize --options, but one that has all the options and lets you pick what you want, and then maybe prints out the full command... we would be living in the future.
(I'm talking about a TUI, this was solved 60? years ago)
One thing not discussed is the libraries used for command-line parsing (parsing argv), and how that might get complicated by shells trying to make the command line into something effectively more than an array of strings.
Having written a non-trivial command-line parser in C, and having used a bunch of them in other languages, it seems to me that this task would benefit from some more standardization and maturation. What is the JSON of the command line? What can we do to increase the level of interoperability between how information is encoded on different tools' command lines? E.g., think of ImageMagick "convert" versus "find" versus "ffmpeg": totally different universes, but each of them, in its own way, turns command-line arguments into a mini-DSL.
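To make the scale of the problem concrete, here is a toy sketch (in Python, with an invented parse() helper) of the loop countless tools end up hand-rolling. It deliberately skips short flags that take arguments and the "--" terminator, which are exactly the corners where real CLIs diverge:

    import sys

    def parse(argv):
        """A toy GNU-ish parser: --name[=value] long options, bundled short flags."""
        opts, positional = {}, []
        for arg in argv:
            if arg.startswith("--"):
                name, _, value = arg[2:].partition("=")
                opts[name] = value if value else True
            elif arg.startswith("-") and len(arg) > 1:
                for ch in arg[1:]:  # "-xvf" becomes x, v, f
                    opts[ch] = True
            else:
                positional.append(arg)
        return opts, positional

    print(parse(sys.argv[1:]))

Every tool that hand-rolls a variant of this makes slightly different choices, and that accumulation of small differences is the interoperability gap.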
Given the prevalence and longevity of GNU-style short and long options, pretty much anything that doesn't follow them is "out of compliance".
However, you also called out some very specific commands that are that way for a reason. For example, the order of options for ffmpeg matters very much, as it's used to construct the processing pipeline. It does make sense for certain things to be custom, but that should only be done when there's a good reason.
It doesn't scale especially well to UIs with tens of subcommands, but I'm a fan of Docopt as a reasonable way to write basic CLI interfaces in many languages with a minimum of fuss.
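The appeal is that the help text is the parser spec. A minimal sketch in Python (docopt's original implementation language), assuming the docopt package and an invented "imgtool" command:

    """Usage:
      imgtool convert <input> <output> [--quality=<q>]
      imgtool (-h | --help)

    Options:
      -h --help      Show this screen.
      --quality=<q>  Output quality [default: 90].
    """
    from docopt import docopt

    if __name__ == "__main__":
        args = docopt(__doc__)  # parses argv against the usage text above
        print(args)             # e.g. {'convert': True, '<input>': 'a.png', ...}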
I have a hard time imagining how we get out of the gravity well of CLI programs handling their own parsing.
A tool I write has a use-case for understanding the syntax of at least ~common CLI tools well enough to pick out args that will be other executables (sudo cat, find blah -exec...), so I have been idly pondering whether there's a humane, declarative, descriptive grammar that can express nearly all CLI interfaces.
It's probably not worth the work for my case, but it might get to be more tractable if it was also an input for better completion, help, linting, etc. tools.
Ideally something that drives enough all-around value that projects would start upstreaming the grammars (and maybe adopting an associated parser?).
Parsers designed for implementing CLI programs are generally too opinionated to handle ~strange commands. (In my terms, I'd say they're prescriptive parsers, as opposed to something that attempts to be flexible enough to describe nearly all existing CLIs.)
Wouldn't it be easier to have a convenient library/parser for almost all of the use cases instead of an immensely complex catch-all solution?
Having custom logic when required should almost always be less complex when such a ~strange command is to be implemented.
As I said, my use case doesn't involve implementing the commands; it involves reliably identifying executables in the arguments to many different commands.
I can't go rewrite awk, find, and sed with an opinionated CLI module. I have to deal with the current reality.
(you're roughly describing what I already do, and it scales poorly)
Completions have in general been of interest, though the shell-specific completions I've looked at so far were all too dynamic.
I'd forgotten all about Fig since I saw your launch post here last year, so thanks for the reminder. (I don't think I had quite started to work on parsing specific external commands yet, so it wouldn't have clicked at the time. I was still focused on just identifying the likely presence of exec in the executables.)
Are you familiar with the parse code? Are you handling painful stuff like combined short flags with a trailing option? (If I ferret out some of the more painful cases I've had to wrangle, I'm curious whether you'd have a gut sense of whether your approach handles them. Would you mind if I reach out? I'm working on this for https://github.com/abathur/resholve)
I've always wondered about expanding stdin, stdout, stderr. Say, a stdjson that doesn't get visually displayed but can be piped (and would only be generated if it's needed on the pipe stream).
    ls | cat </dev/stdjson | string_proc_the_json_for_some_reason
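One way to approximate this today, without kernel support, is to reserve an extra file descriptor by convention. A sketch in Python, assuming a made-up convention that fd 3 carries JSON when the caller has opened it:

    import json, os

    rows = [{"name": "oldmeme.png", "size": 12345}]

    # Human-readable listing on stdout, as usual.
    for row in rows:
        print(row["name"])

    # Structured copy on fd 3, only if someone actually wired it up.
    try:
        os.write(3, json.dumps(rows).encode())
    except OSError:
        pass  # fd 3 not open: nobody asked for the JSON stream

Invoked from bash as, say, "mytool 3> >(jq .)", the JSON side channel stays invisible unless it's requested, which is roughly the semantics described above.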
You seem to be pretty close to Nushell, which was mentioned in the article. The Nushell `ls` (and `ps`, etc.) builtins generate structured data that can be sorted, queried, reduced, and then transformed to many different types of structured data.
    $ ls | get 0 | select modified | to json
    {
      "modified": "2022-08-16 16:38:28 -04:00"
    }
The internal data format looks pretty JSON-like, with the added ability to keep Nushell types intact.
While I'm not ready to replace Fish with Nushell, it's definitely taken the place of jq for me.
I see much potential in adding stdjson as well, but I do caution against opening the floodgates to std* being implemented for every pet format and insignificant corner case.
What about an ioctl that lets you query what formats are accepted by the file descriptor? We already sort of have a precedent for it with giving different output based on isatty().
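That precedent in miniature (a Python sketch with made-up data): tools already branch on isatty() to pick between human output and machine output, so per-descriptor format negotiation is not a huge leap.

    import json, sys

    data = {"name": "oldmeme.png", "size": 12345}

    if sys.stdout.isatty():
        # A human is looking: pretty, columnar output.
        print(f"{data['name']:<20} {data['size']:>8}")
    else:
        # A pipe is consuming us: machine-readable output.
        json.dump(data, sys.stdout)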
(fully daydreaming here) For C, would it be useful to have main() look like:
    int main(int argc, char *argv[], char *envp[], JSON *json)
for some JSON data type that is part of C, kind of analogous to a FILE stream? I'm not sure how the JSON info would get into that fourth argument (it has to be independent of argv), but it would keep std{in,out,err} as is.
You would need to modify the execve system call (or the equivalent on non-Linux OSes) to take the fourth JSON arg for the shell to pass when it execs the new process, and then of course modify the OS kernel to parse and deal with it. But in reality, since you can't break the ABI like that for all existing software, it would end up being execve_2 or something, and you'd have to both convince everyone else it's worth using and deal with the inevitable incompatibilities when not everyone does. Not impossible, perhaps, but certainly an uphill battle.
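For what it's worth, the usual ABI-compatible dodge is to smuggle the structured data through the environment (or an inherited fd) rather than grow execve's signature. A sketch, with an invented ARGV_JSON variable and a hypothetical ./child program:

    import json, os, subprocess

    # Parent (the "shell"): pass structured args alongside argv, not inside it.
    payload = {"paths": ["a.png", "b.jpg"], "recursive": True}
    env = dict(os.environ, ARGV_JSON=json.dumps(payload))
    subprocess.run(["./child"], env=env)  # assumes ./child exists

    # The child would recover it with:
    #   args = json.loads(os.environ.get("ARGV_JSON", "{}"))

It avoids a new syscall entirely, at the cost of being a convention rather than a guarantee.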
Right; every file is just bytes but we get a lot of mileage out of libraries like libpng that parse those bytes into usefully structured info. And I was pondering what more could evolve to parse info from the command line.
I agree that a standardized structured format would be awesome, but I'm not convinced that JSON is it. And there should _not_ be more than one.
One thing that I do like about JSON is that it is ubiquitous - and that makes up for a lot of its other faults. But I would like to see proposed use cases that JSON would not support beforehand, to clearly define where the limitations are, and what limitations the community is willing to accept.
It feels solved in raw PowerShell functions, but running external CLI tools inevitably returns text and ruins the workflow.
"Crescendo" has been marketed as a solution and looks cool, but it means relearning the tool, or the documentation becoming less useful. The sheer amount of time people have already spent learning arcane git syntax means they're not going to switch to a hypothetical "New-GitCommit" function, even if it accepts arrays or PSCustomObject as input.
Is this the Crescendo you meant https://github.com/PowerShell/Crescendo ?
From your comment I initially thought Crescendo was some separate commercial software.
Cool, thanks for the info. It is very intriguing, and I admit that until this HN thread I didn't know about PowerShell, or appreciate this totally different model for how to connect programs together.
This is probably way off-topic now, and my question surely shows my ignorance, but do you know if there's precedent for a program, once packaged to work within PowerShell (or maybe Nushell), to use the associated input and output specification as the starting point for making a web interface to the same code (as bridged by a webserver)? Or have I just described .NET?
I'm no expert on web dev, mostly working in BI/DBA.
Running commands over WinRM or SSH can return objects of any type from remote machines. In the background I believe it's converting them to serialized CLIXML over the wire.
e.g.:

    $RemotePSVersion = Invoke-Command -ComputerName 'SomeOtherComputer' -ScriptBlock { $PSVersionTable }
Rather than the variable $RemotePSVersion being a string, it's an object of type "System.Management.Automation.PSVersionHashTable", just as if you'd run $PSVersionTable locally.
For anything that returns text (e.g. external tools like curl/robocopy), you'll usually convert it to an object in your script before further processing. That way it can be passed to whatever the next steps are in a generic way.
That's less important when working interactively, but one major difference between PowerShell and Bash is the relative focus on scripting vs. interactive terminal use.
It seems very well reasoned, has a stable API and excellent backwards compatibility, and does not require a GPU and an i5 as this one might? Its author also has a proven record and actual experience, which I'm not sure the authors of TFA have, judging solely from their writing.
NixOS has a nice module that can be easily enabled. After that, the default DE, Durden, can be invoked with a single command and interacted with as a normal VM/window alongside your existing DE.
Admittedly I haven't tried making it my default yet, but it seemed fairly adaptable to tiling WM/Openbox needs.
I'm finding the text block with an overall left-to-right gradient surprisingly hard to read. Continuing from the end of one line to the beginning of the next takes more effort than it should. I'm guessing it's the abrupt change in color.
I've found the gradient text trend to be interesting for titles and single lines, but I don't think it works for multi-line text.
Isn't this still confined to what a terminal gives you right now (as it is using current terminal specifications)?
While certainly colorful and fancy, it doesn't really introduce new concepts, does it?
I mean, for starters:
-> Unix introduced text as a universal interface
-> bash made reusing stuff a lot easier via file descriptors, etc. (think: <(input to treat as a file))
-> PowerShell allowed for object-oriented scripting
-> some older systems (name forgotten/unknown) even had interactivity in the CLI: click on parts of command output and stuff happens, even after other commands have been run
The command line ought to hybridize some, please. Having better self-describing interfaces, machine-to-machine capabilities... humans are awesome enough to whip up super wild magic on the fly ("spellcasting on the fly"), but these same tools are much weirder to use as you descend into scripting, as you start bringing "real" programming languages in (which have their own alternate realities: "standard libraries").
The command line ought to bridge & integrate better. Making it more usable from these higher (more pre-baked/automated) levels is one side. And then, reciprocally, how wonderful it would be to see execution flow expressed less in terms of stack traces & more in terms of networks of communicating processes. Create boundary layers; make the CLI tools visible & known operations, sequenced by (but still visible within) higher-level systems.
Amen. Offhand, can we start with, e.g., a 2- or 3-window deal, in which there is a file manager and a document viewer that one can easily copy and paste between? There's stuff out there that sort of does this, but it could be done MUCH better.
Has anyone done anything around just ... mixing images in with the terminal output? Let's say I wanted to check if I had any old memes lying around my home directory, and have a quick look so I can decide to delete or not.
    ~$ ls *{png,jpg}
    oldmeme.png
    ~$ imgcat oldmeme.png
    /----------------\
    | oldmeme.png    |
    | appears right  |
    | here in the    |
    | terminal       |
    \----------------/
    ~$ rm oldmeme.png
The terminfo man page shows some evidence of support for "bit_image" commands, but none of the terminals in my terminfo files seem to have it. I have over 2000 terminfo files, though; I like the idea that if I found some literal teletypewriter from 1973 and figured out some way to hook it up, I'd probably be prepared with the proper escape sequences.
"This is an EFL terminal emulator with some extra bells and whistles such as the ability to display in-line images, video and even play music files, background images, videos, Z-Modem like sending (e.g. SSH into a server and use tysend to send a file back to the local terminal), GPU Accelerated rendering (optional - just set the EFL Elementary toolkit engine to use OpenGL) and much more."
I use WezTerm, which supports sixel, so I can do exactly what you show here. I use lsix and img2sixel in the terminal. Another fun thing you can try: curl v3.wttr.in/Texas.sxl
I love the future that fzf has given us. So many new ideas for selecting and viewing lines of space-delimited records (the bread and butter of the shell) are possible with fzf, and you get to build them in the tradition of small composable tools.
Junegunn Choi is a really talented designer. More, please.
To me this is just more of the same: added convenience on top of what we already have. Converting individual commands to TUIs is a step away from the Unix philosophy.
I think the future of the command line lies in the direction of flow-based programming and spatial representation of complex commands.
I would like to see a terminal that, as I type, _generates_ a flow-based view of my command. Every command would be visualised as a component: ls, awk, sed... Every |, < or > that I type would append a link and a new component to my flow, and ultimately I would be able to manipulate my flow instead of typing: click the ls component, have it output Creation Date instead of Modified Date, then click the awk component and add another output to a new sed component and so on.
It seems to me the solution to the problem posed in the article (and to the wishes of many of the commenters here) is to use JavaScript as your shell. The missing piece is an app that presents a text UI with a JavaScript REPL and renders the DOM inline.
The missing file, device, and information-manipulation applications that shell programmers normally string together would be replaced by JavaScript functions from a library. If you really want JSON, use JSON (JavaScript Object Notation) itself to serialize JavaScript objects.
Why do people write "what I want is..." articles and comments when they could be writing solutions that scratch their itch and meet their needs?
Everyone who builds one of these never fixes the fundamental problem: they don't distinguish between "space" as a character and the next element of the argv array.
Stop making me engage in bizarre escaping rituals and let me just toggle between "string mode" and "array mode".
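That toggle already exists one layer down: argv is a real array, and it's only the shell's string surface that forces the escaping. A sketch in Python (with a hypothetical file name), where "array mode" needs no quoting at all:

    import subprocess

    # Array mode: each element is one argv entry; the space is just a byte.
    subprocess.run(["ls", "-l", "file with spaces.txt"])

    # String mode: the shell re-parses the string, so the space must be escaped.
    subprocess.run("ls -l file\\ with\\ spaces.txt", shell=True)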