As long as Linux distros have such shit accessibility stories, macOS and Windows being available should be a requirement for all systems in government.
A better analogy is if one bulb in the right rear brake light group is burnt out. Technically the car is broken. But realistically you will be able to do all the things you want to do unless the thing you want to do is measure that all the bulbs in your brake lights are working.
That's an awful analogy, because "realistically you will be able to do all the things you want to do" doesn't hold here. If a random GitHub service goes down, there's a significant chance it breaks your workflow. Not always, but the odds are far from zero.
One bulb in the cluster going out is like a single server at GitHub going down, not a whole service.
Ansible exists because it makes things idempotent, which is great when you have to do a thing on 1,000 servers because you can just fix the role and re-run it.
Bash can be idempotent but isn't by default, so you either spend time making an idempotent bash script or you spend time learning Ansible to accomplish the same thing in a reusable way.
Ansible's idempotency depends on the specific module being invoked. What Ansible mainly brings to the table is the parameterized modules, which brings us back to people adopting it because they don't know how to compose one-liners, quote them properly, and wrap them in a for loop.
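To make that concrete, a rough sketch (the hostnames and the "deploy" user are made up, and it assumes root ssh access): doing it "by hand" means writing the existence check yourself, e.g.

    # plain bash over ssh: you supply the idempotency check yourself
    for host in host1 host2 host3; do
      ssh "root@$host" 'id deploy >/dev/null 2>&1 || useradd -m deploy'
    done

The equivalent Ansible user task does that existence check for you inside the module, which is really all the "idempotency" amounts to in this case.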
Many years ago I wrote my own "cloud instance bootstrapper" that would pull a tar off of S3 based on EC2 instance tags / metadata, untar it, then run a script. I never got into Ansible and I hated having to rebuild AMIs for minor changes.
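For the curious, the shape of such a bootstrapper is roughly this (bucket name, tag key and script name are made up; it also assumes the old IMDSv1 metadata endpoint and an instance role that can read the bucket):

    INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
    ROLE=$(aws ec2 describe-tags \
      --filters "Name=resource-id,Values=$INSTANCE_ID" "Name=key,Values=role" \
      --query 'Tags[0].Value' --output text)
    aws s3 cp "s3://my-bootstrap-bucket/$ROLE.tar.gz" /tmp/bootstrap.tar.gz
    mkdir -p /opt/bootstrap
    tar -xzf /tmp/bootstrap.tar.gz -C /opt/bootstrap
    /opt/bootstrap/run.sh

Changing behavior then only means uploading a new tarball, not baking a new AMI.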
Why does the workflow lie at the level of a real or virtual piece of paper and not in the metadata from the applications used to create that piece of paper? Seems like a CAD tool would allow you to identify each element of the drawing, assigning metadata as required.
Only a small set of construction stakeholders participate in the CAD ecosystem (e.g., architects, large GCs) while a broader set of stakeholders (subcontractors, trades, smaller GCs/CMs) do not receive BIM files and work with PDFs. CAD/BIM is a wonderful aspiration but for many the reality is PDFs.
Re. "CAD/BIM", technically speaking CAD doesn't imply BIM, and the industry's promotion of BIM is akin to AI promotion among software engineering teams - the benefits aren't clear upon detailed review of the advertised capabilities. The CAD part, on the other hand, is generally recognized as the essential tooling for the profession and I'm surprised to hear that it just is a "wonderful aspiration".
"The profession" actually is a wide variety of trades, not just architects and contractors. Electricians, plumbers etc. where CAD is not yet widely spread.
Which hopefully will change in the near future, with open source BIM tool chains boosted by generative/agentic AI. Finally, a huge source of confusion and execution hiccups will be overcome.
Oh, you sweet summer child. These drawings are anywhere from 0 to 120 years old and might be anything from something pulled off a floppy disk from the 1970s to scanned-in, coffee-stained pieces of paper that have sat in a desk folded a hundred times.
The world in which metadata is a common thing attached to any file doesn't exist, and probably never will, no matter how much you try to improve the CAD workflow.
It turns out that it's possible for the server to detect whether its response is being piped straight into bash or just being downloaded. Downloading the script, inspecting it, and then running that specific downloaded copy is safer than piping it directly to bash; downloading and inspecting it first doesn't help if you then redownload it and pipe that fresh copy to a shell.
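In other words, if you're going to inspect it, run the copy you inspected rather than pulling it again (the URL here is just a placeholder):

    curl -fsSL https://example.com/install.sh -o install.sh
    less install.sh      # read what you actually downloaded
    bash install.sh      # run that same file, not a second download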
The server can also put malware in the .tar.gz. Are you really checking all the files in there, even the binaries? If you don't, what's the point of checking only the install script?
The latest Firefox build that Debian did took just over one hour on amd64/armhf and 1.5 hours on ppc64el. The slowest Debian architecture is riscv64, and even the last successful build there took only 17.5 hours, so definitely not days. Your average modern developer-class laptop is going to take a lot less than riscv64 too.
> If you don't what's the point of checking only the install script?
The .tar.gz can be checksummed and saved (so you can be sure later on that you're installing the same .tar.gz and that it still has the same checksum). Piping to Bash in one go, not so much. Once you intercept the .tar.gz, you can both reproduce the exploit if there is one (it's too late for the exploit to hide: you've got the .tar.gz, and you may already have saved it to an append-only system, for example) and verify the checksum of the .tar.gz with other people.
The point of doing all these verifications is not only to avoid getting an exploit: it's also to be able to reproduce an exploit if there is one.
There's a reason, say, packages in Debian are nearly all both reproducible and signed.
And there's a reason they're not shipped with piping to bash.
Other projects do offer an install script that downloads a file but verifies its checksum. That's the case for the Clojure installer, for example: it verifies the .jar. Now I know what you're going to say: "but the .jar could be backdoored if the site got hacked, since both the checksum in the script and the .jar could have been modified". Yes. But it's also signed with GPG, and I do religiously verify that the "file inside the script" has a valid signature when it has one. And if the signing key suddenly changed, that rings alarm bells.
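For anyone who hasn't done it, the manual version of that check is only a few commands (the file names, URLs and checksum file layout are placeholders, and it assumes you already trust the project's signing key):

    curl -fsSLO https://example.com/tool-1.2.3.tar.gz
    curl -fsSLO https://example.com/tool-1.2.3.tar.gz.sha256
    curl -fsSLO https://example.com/tool-1.2.3.tar.gz.asc
    sha256sum -c tool-1.2.3.tar.gz.sha256                  # compare against the published checksum
    gpg --verify tool-1.2.3.tar.gz.asc tool-1.2.3.tar.gz   # check the detached signature

The whole point is that the artifact sits on disk where you can re-check it later, instead of evaporating into a shell pipe.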
Why settle for the lowest common denominator security-wise? Because Anthropic (I pay my subscription, btw) sets a very bad example and relies entirely on the security of its website and on piping to Bash? This is high-level suckage. A company should know better: it should sign the files it ships and not encourage lame practices.
Once again: all these projects that suck security-wise are systematically built on the shoulders of giants (like Debian) who know what they're doing and who are taking security seriously.
This "malware exists so piping to bash is cromulent" mindset really needs to die. That mentality is the reason we get major security exploits daily.
> And I do religiously verify that the "file inside the script" does have a valid signature when it has one.
If you want to go down this route, there is no need to reinvent the wheel. You can add custom repositories to apt/...: you only need to do this once and verify the repo key, and then you get this automatic verification and installation infrastructure. Of course, not every project has one.
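The one-time setup looks something like this on Debian/Ubuntu (the repo URL, keyring path and package name are examples; run as root):

    curl -fsSL https://example.com/repo/key.gpg \
      | gpg --dearmor -o /usr/share/keyrings/example-archive-keyring.gpg
    echo "deb [signed-by=/usr/share/keyrings/example-archive-keyring.gpg] https://example.com/repo stable main" \
      > /etc/apt/sources.list.d/example.list
    apt update && apt install example-tool

After that, every install and upgrade of the package gets signature verification for free.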
>Which sounds great, but the way things work now tend to be the exact opposite of that, so there will be no trustable platform to run the untrusted code in.
This is the problem with software progressivism. Some things really should just be what they are: you fix bugs and security issues and you don't constantly add features. Instead everyone is trying to make everything have every feature, constantly fiddling around in the guts of stuff and constantly adding new bugs and security problems.
It's entirely the second paragraph; nothing about certificate expiration, in and of itself, lends itself to a MITM. Firefox tells me what the problem is: expired, wrong name, etc. So it's not just saying "oh no, something is wrong"; I can tell what is wrong before I choose to proceed.
I always turn off swap and that solves this problem. I really don't understand why it is on by default anymore. If you need swap, you are doing something very wrong somewhere else.
So, it depends on how much RAM you have. Also, with swap enabled, the system can swap out some very rarely used memory pages and cache some frequently used files instead, so by disabling swap you rob yourself of this opportunity :)
I have 64 GB of RAM on my workstation, yet I still have swap enabled (but with a lowered swappiness value).
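If anyone wants to try the same, something like this should do it (10 is just an example value; run as root):

    sysctl vm.swappiness=10                                        # apply immediately
    echo "vm.swappiness=10" > /etc/sysctl.d/99-swappiness.conf     # persist across reboots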
You're never going to be able to IPO your space startup with that attitude.