1. Stop putting mission-critical systems on Windows. It's not the reliable OS it once was, since MS has cut most of its QA.
2. AV solutions are unnecessary if you properly harden your system. AV was needed pre-Vista because Windows was literally running everything as Administrator. AV was never a necessity on UNIX, whatever MS bundles in is usually enough.
3. Do not install third-party software that runs in kernel mode. This is just a recipe for disaster, no matter how much auditing the OEM does beforehand. Linux has taught us multiple times that drivers should be developed and included with the OS. Shipping random binaries that rely on a stable ABI may work for printers, not for mission-critical software.
None of this advice is useful for the massive organizations, like banks and hospitals, that got hit by this. They cannot switch off Windows for a number of reasons.
There's nothing they can do right now, but my issue is that this will be forgotten when the next update/purchasing round swings into action.
Take Mærsk, who couldn't operate their freight terminals due to a cyber attack and had their entire operation left dependent on a hard drive in a server that happened to be offline. Have they improved network separation? Perhaps. Have they limited their critical infrastructure to only run whitelisted applications? I assure you they have not. They've probably just purchased a CrowdStrike license.
Companies continuously fail to view their critical infrastructure as critical and severely underestimate risk.
Mærsk is kind of a bad example, because they implemented real security mitigations afterwards.[0] I cannot speak to whether they whitelist applications, but neither can you.
That's the reason why I wrote "stop putting" instead of "throw all of your PCs out of the window". Just like they migrated away from DOS, they should start planning to migrate away from Windows to more modern, sandboxed solutions. There is ZERO reason why a cash register shouldn't boot from a read-only filesystem instead of running AV, and so on.
All of the hardware attached to workstations in our hospital is designed for Windows. Certain departments have specific needs as well and depend on software that is Windows-only. After decades of Windows it develops an insidious grasp that is difficult to escape, even more so when your entire industry is dependent on Windows.
Switching away from Windows wouldn't just be extremely costly from an IT perspective; it would require millions of dollars in new hardware. We are in the red in part because of the pandemic, in part because of existing problems in our industry accelerated by the last few years, and in part because a large percentage of our patients are on Medicare, for which the federal government shrinks fixed service payments every year.
I can't imagine convincing our administration to switch over to Linux across the hospital without a clear, obvious, and, more importantly, short-term financial payoff.
I'm working for a company that has no Windows boxes at all, anywhere. Sure, some Windows software has no alternatives. We're running all of those programs in VMs.
Does this make financial sense? Probably not in the short run, which is an issue for most companies nowadays. But in the long run? I think it's the right choice.
It is not the hardware that's designed for Windows but the driver code, which is most probably written in plain C and can most probably be cross-compiled for use outside Windows. So instead of millions of dollars in new hardware, it is really thousands in porting the drivers and GUIs to the new platform. What works on Windows is, in 90% of cases, an easy porting job for the manufacturer; they just won't do it unless someone stops paying for the Windows version and is willing to pay for a port to an alternative platform.
Anyway, I totally agree with you. The convincing part here falls short of clear and obvious for administration types. Until MS finally bricks its OS and renders it totally unusable, they can continue to do whatever shit they want and keep mocking their loyal customers forever.
Well, there’s this one app, written in VB6 using lots of DCOM that produces XML and XSLT transforms that only work in IE6, and the entire organisation depends on it, and the nephew who built it is now a rodeo clown and is unavailable for consultation.
1/ imagine running >1000 legacy applications, some never updated in 20 years
2/ imagine a byzantine mix of local data centers, VPCs in aws/gcp/azure
3/ imagine an IT department run by a lot of people who have never learned anything new since they were hired
That would be your typical large, boring entity such as a bank, a public utility, or one of the many big public companies.
Yeah, there is no law of physics preventing this, but it's actually nearly impossible to disentangle an organization from decades of mess.
People have continued to run old management systems inside virtual machines and similar solutions. You can sandbox it, reset it, do all kinds of wondrous things if you use modern technologies in an era-appropriate way. Run your old Windows software inside a VM, or tweak it to run well on Wine if you have the source. The reason this mess happened is that all of those systems are literally running a desktop OS in mission-critical applications.
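To make that concrete, here is a minimal sketch of the reset-on-demand idea using libvirt's virsh CLI from Python; the domain name "legacy-pos" and snapshot name "golden" are hypothetical, and you'd adapt them to your own setup:

    # Sketch: revert a legacy Windows VM to a known-good snapshot,
    # e.g. before each shift or after any suspected tampering.
    import subprocess

    DOMAIN = "legacy-pos"   # hypothetical libvirt domain name
    SNAPSHOT = "golden"     # hypothetical known-good snapshot

    def reset_vm(domain: str, snapshot: str) -> None:
        # virsh snapshot-revert restores the saved disk/RAM state;
        # --running resumes the guest immediately afterwards.
        subprocess.run(
            ["virsh", "snapshot-revert", domain, snapshot, "--running"],
            check=True,
        )

    if __name__ == "__main__":
        reset_vm(DOMAIN, SNAPSHOT)

The same idea works with Hyper-V checkpoints or VMware snapshots; the point is that the mission-critical workload resets to a known-good state instead of accumulating drift.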
I have worked as an embedded engineer for a while, and I can't count the amount of nonsensical stuff I've seen incompetent people running on unpatched, obsolete Windows XP and 7 machines. This mess is 100% self-inflicted.
I think these are just technical excuses; the real answer lies somewhere in the fields of politics and economics. If the people in charge make the decision, then we tech nerds will migrate and refactor 1000 applications and update 20 years of byzantine code mess. I have seen entities so large and boring they could barely move one step change rapidly and evolve once their economic stability was at stake, and this is a great example of the kind of disruption that can push them into the chasm of change.
This issue could easily happen on any other OS - Linux, macOS, the BSDs - because it's a third-party kernel driver that would be installed by corporate IT for compliance reasons, regardless of anyone's opinion. Your advice is incompatible with how the real world operates.
Alas, in the world of B2B, contracts from larger companies nearly always come with lists of specific requirements for security controls that must be implemented, and those nearly always include requiring anti-virus.
It's just not as simple as commenters in this thread wish!
The contracts rarely specify stuff like antivirus explicitly; instead they require compliance with one or more security standards like PCI DSS. Those say you have to use antivirus, but they all have an escape hatch called a "compensating control", which is basically "we solved the problem this is trying to solve in this other way that's more conducive to our overall security posture, and got the auditor to agree with us".
My source: I review a lot of contracts. It's very common for things to be explicitly required.
Yes you can go back and forth and argue the toss, but it pushes up the cost of the sale and forces your customer to navigate a significant amount of bureaucracy to get a contract agreed. Or you could just run AV like they asked you to...
Can you propose an example of a compensating control for an "antivirus" that would have a chance of passing? Would you propose something like a custom SELinux/AppArmor setup, maybe plus auditd with alerting? Or some Windows equivalent of those?
compensating controls ftw. the spirit of the law vs the letter of the law. our system was more secure with the compensating controls vs the prescribed design. this meant not having to rotate passwords, because fuck that noise.
Same, I’ve been in an org that got PCI-DSS level 1 without antivirus beyond Windows Defender or any invasive systems to restrict application installation.
It did involve a lot of documentation of inter-machine security controls, network access restriction and a penetration test by an offensive security company starting with a machine inside the network, but it can be done! Also in my opinion it gives you a more genuinely secure environment.
Nothing like that; basically what sitharus said above you. Extra network-level zero trust to minimize lateral movement, and giving the pen testers a leg up by letting them start already within the corporate network.
> AV was never a necessity on UNIX, whatever MS bundles in is usually enough
What prevents someone from pushing a malicious package that takes my user data (which is accessible directly from a logged-in session) and sends it somewhere? Especially in non-system repos, like Maven/NuGet/npm/pip/RubyGems and so on? What about the all-too-widespread practice of piping shell scripts from the web, or applications with custom update mechanisms that might be compromised and pull in malicious code?
I'm not saying that AV software would protect against all of these, but even if users don't do stupid things (which they absolutely will anyway, sooner or later), there are still vectors of attack against any system.
As for why *nix systems don't see that much malware, I've no idea. Maybe it's not as juicy a target because of the lower count of desktop software installations (though the stuff that is on those systems might be more interesting to some, given the more tech-savvy user base), or maybe because a lot of the exploits focus on server software, like most CVEs do.
On Windows, I guess the built-in AV software is okay, maybe with occasional additional scans by something like Malwarebytes, but that's situational.
Nothing; in fact, there have been many cases where Python's and Node.js's package systems were exploited to achieve arbitrary code execution (because that's a feature, not a bug, to allow "complicated installation processes to just work").
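For anyone who hasn't seen it first-hand, here is a deliberately benign sketch of why this is arbitrary code execution: with a source package, everything at module level in setup.py runs during "pip install". The package name is made up:

    # setup.py -- benign demo: this code runs at *install* time.
    from setuptools import setup
    import getpass, platform

    # A malicious package would exfiltrate this instead of printing it.
    print(f"setup.py running as {getpass.getuser()} on {platform.node()}")

    setup(name="innocuous-package", version="0.1")

No exploit needed; running the installer's code is the documented behavior.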
AVs are the wrong way to go about security anyway; it's a reactive strategy in a cat-and-mouse game by definition. For prevention, I think the BSDs are doing some promising work with the "pledge" mechanism. And as much hate as they get, I like AppImages and Snap et al. for forcing people to consider a better segmentation model and permission system for installed software.
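For the curious, pledge(2) can even be reached from a scripting language. A minimal sketch via ctypes, which will only run on OpenBSD, with illustrative promise strings:

    # Sketch: restrict this process with OpenBSD's pledge(2).
    import ctypes, ctypes.util

    libc = ctypes.CDLL(ctypes.util.find_library("c"), use_errno=True)

    def pledge(promises: str) -> None:
        # pledge(2) limits the process to the named kernel interfaces;
        # "stdio rpath" = basic I/O plus read-only filesystem access.
        if libc.pledge(promises.encode(), None) != 0:
            raise OSError(ctypes.get_errno(), "pledge failed")

    pledge("stdio rpath")
    print("still alive")  # allowed under "stdio"
    # Opening a socket from here on would kill the process.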
The CrowdStrike agent is theoretically able to detect that what you just pipe-installed is now connecting to a known command-and-control server, and can act accordingly.
Carbon Black will block any executables it pulls down, though. And I think it may also block scripts. Executables have to be whitelisted before they can run.
It's an extremely strict approach, but it does address the situation you're talking about.
If you write a batch file on a Windows PC with Carbon Black on it, you will not be able to run it. Of course there is customisation available to tweak what is and isn't allowed.
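The underlying mechanism is simple enough to sketch: a hash-based allowlist gate. This is the general idea, not Carbon Black's actual implementation, and the digest below is a placeholder:

    # Sketch: only run a binary whose SHA-256 is on the approved list.
    import hashlib, subprocess, sys

    ALLOWLIST = {
        # placeholder digest -- in reality, hashes of vetted binaries
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def run_if_allowed(path: str) -> None:
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest not in ALLOWLIST:
            sys.exit(f"blocked: {path} ({digest}) not allowlisted")
        subprocess.run([path], check=True)

Real products hook process creation in the kernel rather than wrapping launches like this, which is exactly why they need kernel-mode components.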
Yes, but that's like 1% of the actual surface area for "running a script". I am not a Windows expert but on, say, Linux you can overwrite a script that someone has already run, or modify a script that is already running, or use an interpreter that your antivirus doesn't know about, or sit around and wait for a script to get run and then try to swap yourself into the authorization that gets granted for that, or…there's a whole lot of things. I assume Windows has most of the same problems. My confidence in Carbon Black stopping this is quite low.
If your malicious script starts doing things like running well-known payloads, trying to move laterally, or accessing things it really shouldn't, AV will flag/block it.
No one is suggesting it is 100% coverage, but you would be surprised at the amount of things XDR detects and prevents in an average organization with average users, including the people who can't stop clicking YourGiftcard.pdf.exe.
I am not against trying to protect against people who do that. The problem is that you pay XDR big bucks to stop a lot more than that, and this mostly doesn't work.
In a perfect world, AV software wouldn’t be necessary. We don’t live in a perfect world. So we need defense-in-depth, covering prevention, mitigation, and remediation.
> What prevents someone from pushing a malicious package that takes my user data
That's not an argument in good faith. If you install unvetted packages in your airline control system, bank, or supermarket, the kind of systems that we're talking about here, you have much bigger problems to worry about.
> I'm not saying that AV software would protect against all of these,
Or indeed any of these. Highly privileged users piping shell scripts from untrusted sources is out of scope for any antivirus system, on any platform.
That doesn't mean all platforms are identical, or share the same attack vectors. Installing kernel-mode drivers is much more accepted on the Windows platform, where it is not only accepted but managed through established quality-control programs, than on Linux, where the major vendor will quite literally show you the middle finger on video, for everyone to see, for doing so.
The Linux community is more for doing that kind of work upstream. If some new type of access control or binary integrity checking is required, that work goes upstream for everyone to use. It is not bolted onto running systems with kernel-mode drivers. That is because Linux is more like a shared platform, and less like a "product". That culture goes way beyond mere technical differences between the systems.
> If you install unvetted packages in your airline control system, bank, or supermarket, the kind of systems that we're talking about here, you have much bigger problems to worry about.
Surely we can agree that if it's a vector with an above-zero chance of being exploited, then any method for mitigating it is a good thing. Quite possibly even multiple overlaid methods addressing the same risks. Defense in depth and all - the same reason why many run a WAF in front of their applications even though someone could just say: "Just have apps that are always up to date with no known CVEs".
> Or indeed any of these. Highly privileged users piping shell scripts from untrusted sources is out of scope for any antivirus system, on any platform.
You don't even have to be highly privileged to steal information; e.g., an "app" for running some web service could still serve to exfiltrate data. As others have mentioned, maybe this is not what AV software has been historically known for, but there are definitely pieces of software that attempt to mitigate some of the risks like this.
I'd rather have every binary or piece of executable code be scanned against a frequently updated database of bad stuff, or use heuristics to figure out what is talking with what, or have other sane defaults like preventing execution of untrusted code or limiting what can talk to which networks - not all of which is trivial to configure in the OSes directly (even though it's often possible).
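The "what is talking with what" part doesn't even require an AV. A rough sketch with the third-party psutil package (run with enough privileges to see other users' processes):

    # Sketch: list processes with established outbound TCP connections,
    # so unexpected egress stands out.
    import psutil

    for conn in psutil.net_connections(kind="tcp"):
        if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
            try:
                name = psutil.Process(conn.pid).name() if conn.pid else "?"
            except psutil.NoSuchProcess:
                continue
            print(f"{name:20} -> {conn.raddr.ip}:{conn.raddr.port}")

A real deployment would compare that output against a baseline and alert on deviations, which is roughly what the heuristic products are selling.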
I won't pretend that AV software is necessarily the right place for this kind of functionality, but I also won't pretend that it couldn't be an added benefit to the security of a system, while also presenting its own risks and shortcomings (a threat vector in and of itself, something that impacts system stability at worst, or just a hog on resources and performance in most cases).
Use separate VMs, use secret management solutions, use separate networks, use principle of least privilege, make use of good system architecture, have good OS configuration, use WAFs, use AV software, use scanning software, use dependency management alerting software, use static code analysis, use whatever you need to mitigate the risk of waking up and realizing that there's been a breach and that your systems are no longer your own.
Even all of that might not be enough (and sometimes will actually make things worse), but you can at least try.
In that we can agree. But I would put "build on operating systems intended for the purpose" on top of that list, too. There is no excuse for building airline or bank systems on office operating systems and trying to compensate by bolting on endpoint protection systems.
The issue here is not simply scanning for known malware, "endpoint protection" systems go way beyond that. I have never, in practice, seen any of those systems be a net benefit for security. And I mean in a very serious and practical way. Depending on your needs, there are far more effective solutions that don't require backdooring your systems. There simply shouldn't be any unauthorized changes for this type of systems.
> In that we can agree. But I would put "build on operating systems intended for the purpose" on top of that list, too.
Agreed, most folks should probably use a proven *nix distro, or one of the BSD varieties. That would be a good starting point.
That said, I doubt whether the OS alone will be enough, even with a good configuration, but at some point the technical aspects have to contend with managing liability either way.
Carbon Black, running in DO NOT LET UNTRUSTED EXECUTABLES RUN mode, would not let you run binaries that curl | sh just grabbed unless they were allow-listed.
Windows Defender is more than sufficient for most of these companies, but they need that false sense of security, or maybe they have excess budget to spare, or they are transferring the risk per their risk management plan.
This isn't a Windows issue. For what it's worth, I've had plenty of problems in the past with kernel panics from CrowdStrike's macOS system extension, although they were fairly random - nothing like today's issue.
Linux isn't exactly reliable either... I'm sorry, but that OS is barely capable of outputting a stable HDMI signal; god help you if you are on a laptop with an external monitor.
Across 3 computers (2 of them laptops), I've never _not_ had display bugs/oddities/issues. System upgrades always make me nervous because there is a very real chance of something getting fucked up and my screen staying black the next time it boots, leaving me to go into a TTY and manually fix stuff up, or boot the previous version still saved in GRUB.
We cannot make computers perfect. They are too complicated. That's true for anything in life: as soon as it gets too complicated, you're left in a realm of statistics and emergent phenomena. As much as I dislike Windows enough to keep using Linux, I never had display issues on Windows.
To anyone compelled to reply with a text that contains "just" or "simply": simply just consider that if you are able to think of it in 10 seconds, then I have thought of it as well, and tried it too.
In my comment I was referring to mission-critical systems, which you most definitely don't put on cheap commodity hardware you buy in a brick-and-mortar store.
Linux is used EVERYWHERE for a reason. Most car HUDs now run on some form of embedded Linux, like basically all embedded and low-power devices. The problem here is that people still put embedded mission-critical systems on a desktop OS and slap desktop software on it, which is _a bad choice_.
> Linux isn't exactly reliable either... I'm sorry, but that OS is barely capable of outputting a stable HDMI signal; god help you if you are on a laptop with an external monitor.
This is demonstrably false, given the amount of people that game on Linux nowadays.
> System upgrades always make me nervous because there is a very real chance of something getting fucked up and my screen staying black the next time it boots, leaving me to go into a TTY and manually fix stuff up, or boot the previous version still saved in GRUB.
I had this happen to me once. Timeshift was painless to use, and in about 15 minutes I had my machine up and running again, and could apply all updates properly afterwards. If anything it made me bolder lol.
> Linux isn't exactly reliable either... I'm sorry, but that OS is barely capable of outputting a stable HDMI signal; god help you if you are on a laptop with an external monitor.
It just works for me, and has just worked with every laptop I have had in the last 15 years. My kids and I have several Linux installs, and the only one with HDMI output issues is a cheap ARM tablet that is sold as a device for early adopters.
> Across 3 computers (2 of them laptops), I've never _not_ had display bugs/oddities/issues. System upgrades always make me nervous because there is a very real chance of something getting fucked up and my screen staying black the next time it boots, leaving me to go into a TTY and manually fix stuff up, or boot the previous version still saved in GRUB.
I've run at least that number of machines (I do not know whether you mean three or five in total) for the last 20+ years and can recall one such issue.
> Across 3 computers (2 of them laptops), I've never _not_ had display bugs/oddities/issues. System upgrades always make me nervous because there is a very real chance of something getting fucked up and my screen staying black the next time it boots, leaving me to go into a TTY and manually fix stuff up, or boot the previous version still saved in GRUB.
I have also had Windows Update fuck up my VMs and physical installs multiple times - this stuff just happens _with desktop machines, on desktop OSes_. The point is, lots of companies are using random cheap x86 computers with desktop Windows for mission-critical appliances and systems, which is nonsensical. The rule of thumb has always been: do not put Windows (client) on anything you can't format on short notice at any time. Guess people just never learn.
How is your lack of a stable HDMI signal relevant to whether the world's airlines and supermarkets and banks should run Windows with third-party antivirus software bolted on? That is a platform originally intended for office-style typewriter emulation and games.
Every engineering-first or Internet-native company that could choose chose Linux, for simple reasons. Anything not Linux in The Cloud is a rounding error. Most of the world's mobile phones are Linux. And most cloud-first desktops too. They don't seem to be particularly more troubled with HDMI signal quality or other display issues than other devices.
> I'm sorry, but that OS is barely capable of outputting a stable HDMI signal; god help you if you are on a laptop with an external monitor.
You may have had particularly bad luck with poorly supported hardware, but I don't think this is a normal experience.
I've been using Linux exclusively on desktops and laptops (with various VGA, DVI, DisplayPort, HDMI, and PD-powered DisplayPort-over-USB-C monitors and TVs) since 2002 without any unstable behavior or incompatibility.
Most likely. I think laptops are particularly gnarly, especially when they have both an APU and a discrete GPU. While manufacturers use Windows' amenities for adding their own drivers and modifications to ensure the OS understands the topology of the hardware (so that the product doesn't get mass-RMA'd), there's no such incentive to go out of your way to make Linux support it.
But working with hundreds of computers, running many different distributions of Linux, for decades, I just haven't ever seen what you're describing. It's really hard to reconcile what I read here with my hands-on experience.
2. plenty of malware and C2 systems happily operate off all systems, regardless of how hardened (or how UNIX) they are - IDS/IPS is a reactive way to try to mitigate this
3. you don't need third-party software to compromise the UNIX kernel, you just need to wait a week or two until someone finds a bug in the kernel itself
all that being said, this has SolarWinds vibes. the push for these enterprise IDS systems needs to be weighed, the approach adjusted
Windows RTMs used to ship in a usable state (albeit buggy) for more than a decade. You installed it from a CD and it worked fine; you installed patches every once in a while from a random Service Pack CD you got from somewhere.
Modern Windows has had the habit of being so buggy after release, in such horrendous ways, that I can't imagine being able to use the same install CD for years. This definitely shows less attention to detail, in my view.
The slice of Microsoft I worked in certainly did not have dedicated QA at the time I was there, though it used to have a QA team before, so there is some degree of truth to the statement. I can't speak for other Microsoft teams and offices. It was very disappointing for me, because I have had the opportunity to work with great QA staff before and in my current job, and there is no way a developer dedicating 25% of their time (which is what was suggested as a replacement for dedicated QA) can do a job anywhere near as good.
I have a feeling most commenters (not just here) don't really know what Falcon is and does, if EDR (and more?) keeps getting compared to a plain antivirus.
Depending on the threats pertinent to the org they may require deep observability and the ability to perform threat hunting for new and emerging threats, or detect behaviour based signals, or move to block a new emerging threat. Not all threats require Administrator privileges!
Not installing AV might be fine for a small number of assets in a low-risk industry, but it is bad advice for a larger, more complex environment.
If we're being unbiased here, the apparent CrowdStrike problem could occur on any OS and with any vendor where updates or configuration changes are automatically deployed at scale.
> Do not install third-party software that runs in kernel mode.
You mean don't install Steam or the Epic Store, or many of the games.
Note: I'm agreeing with you except that pretty much the only reason I have a Windows machine is for games. I do have Steam installed. I also have the Oculus software installed. I suspect both run in kernel mode. I have to cross my fingers that Valve and Facebook don't do bad things to me and don't leave too many holes.
I don't install games that require admin.
Oh, and I have Photoshop and I'm pretty sure Adobe effs with the system too >:(
Admin privileges aren't the same thing as a kernel-mode driver. Steam does require admin to be installed, but it does not install a kernel-mode driver.
I've never seen a program running in kernel mode other than AV software. Pretty sure none of the stuff you listed does. Asking for admin permissions doesn't mean it's kernel-mode software.
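If you want to check for yourself rather than being "pretty sure", Windows ships a driverquery tool that lists kernel drivers. A quick sketch that filters by vendor name ("steam" here is just an example string):

    # Sketch: list installed kernel drivers whose display name
    # matches a vendor string, using the built-in driverquery tool.
    import csv, io, subprocess

    out = subprocess.run(
        ["driverquery", "/v", "/fo", "csv"],
        capture_output=True, text=True, check=True,
    ).stdout

    for row in csv.DictReader(io.StringIO(out)):
        if "steam" in row.get("Display Name", "").lower():
            print(row["Module Name"], "-", row["Display Name"])

If nothing prints, no driver with that display name is installed, whatever admin rights the installer asked for.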
this "kernel level = invasive" paranoia that's been going on lately is complete FUD at its core and screams tech illiteracy
no software vendor needs to or wants to write a driver to spy on you or steal your data when they can do all of that with user-level permissions without triggering any AV.
3rd-party drivers are completely fine, and it's normal that advanced peripherals like an Oculus use them
> Linux has taught us multiple times that drivers should be developed and included with the OS.
I've had Linux GPU drivers fail multiple times due to system updates, to the point where I needed to roll back. I've had RHEL updates break systems in a way where even Red Hat support couldn't help me (I had to fix them myself).
I don't see how Linux is any better in this regard than Windows to be honest.
Also:
> AV was never a necessity on UNIX
Sure, why write a virus when you can just deploy your malware via official supply chains?
Do you have/need GPUs on your 'mission critical systems'? I would bet most of us don't.
I quite agree with OP here. VMs are now quite lightweight (compared to the resources available on machines, at least), and I would rather use a light, hardened Linux as my base OS to run Windows VMs, with snapshots for quick rollbacks. Actually, that's what I run on my own PC, and I think it would be the sanest way to operate.
Have some kind of soaking/testing environment for production-critical systems, especially if you're a big business. If you're hip, something like a proper blue/green setup (please chime in with best practices!). If you're legacy, do it all by hand if you must.
Blindly enabling immediate internet-delivered auto-update on production systems will always allow a bad update to cause chaos. It doesn't matter how well you permission things off on your favourite Linux flavor. If an update is to be meaningful, the update can break the software. And clearly you're relying on the software, otherwise you wouldn't be using it.
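One low-tech way to get the soaking behavior, whatever the vendor does: bucket your own hosts into rollout rings and only widen the ring once the update has soaked. A sketch with made-up ring sizes:

    # Sketch: deterministic ring assignment by hashing the hostname,
    # so a small canary ring always gets updates first.
    import hashlib, socket

    def rollout_ring(hostname: str, rings: int = 100) -> int:
        digest = hashlib.sha256(hostname.encode()).digest()
        return int.from_bytes(digest[:4], "big") % rings

    def should_update(ring_limit: int) -> bool:
        # Widen ring_limit (e.g. 1 -> 10 -> 100) as confidence grows.
        return rollout_ring(socket.gethostname()) < ring_limit

    if __name__ == "__main__":
        print("update now" if should_update(ring_limit=1) else "wait")

Of course this only helps for updates whose schedule you control, which was precisely the problem with content updates pushed directly by the vendor.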