> people claiming "AI" can now do SWE tasks which take humans 30 minutes or 2 hours
Yes, people claim that, but everyone with a grain of sense knows it is not true. Yes, in some cases an LLM can write a Python or web demo-like application from scratch, and that looks impressive, but it is still far from really replacing a SWE. The real world is messy and requires care. It requires planning, making some modifications, getting feedback, proceeding or going back to the previous step, and thinking it over again. Even when a change works you still need to go back to the previous step, double-check, make improvements, remove stuff, fix errors, and treat corner cases.
The LLM doesn't do this; it tries to do everything in one single step. Yes, even in "thinking" mode it thinks ahead and explores a few possibilities, but it doesn't do the several iterations that many cases would need. It does a first pass the way a brilliant programmer might in one attempt, but it doesn't review its work. The idea of feeding the error back to the LLM so that it fixes it works in simple cases, but in the more common cases, where things are more complex, it leads to catastrophes.
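For context, the "feed the error back" loop is usually nothing more sophisticated than the sketch below (Python; `complete` is a placeholder for whichever LLM API you use, and the pytest invocation is just an example). It works on toy bugs precisely because nothing in it plans, reviews, or backtracks.

```python
import subprocess
from typing import Callable

def fix_until_tests_pass(source_path: str,
                         complete: Callable[[str], str],
                         max_rounds: int = 5) -> bool:
    """Naive "feed the error back" loop: run the tests, paste the failure into
    a prompt, overwrite the file with whatever the model returns, repeat.
    `complete` is whichever LLM call you happen to use."""
    for _ in range(max_rounds):
        result = subprocess.run(["pytest", "-x"], capture_output=True, text=True)
        if result.returncode == 0:
            return True  # tests pass, stop iterating
        with open(source_path) as f:
            code = f.read()
        prompt = (
            "The following file fails its tests.\n\n"
            f"--- code ---\n{code}\n\n"
            f"--- test output ---\n{result.stdout[-4000:]}\n\n"
            "Return a corrected version of the whole file, nothing else."
        )
        with open(source_path, "w") as f:
            f.write(complete(prompt))
    return False  # gave up; on complex code this loop tends to thrash
```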
Also, when dealing with legacy code it is much more difficult for an LLM, because it has to cope with the existing code and all its idiosyncrasies. In this case one needs a deep understanding of what the code is doing and some well-thought-out planning to modify it without breaking everything, and the LLM is usually bad at that.
In short, LLMs are a wonderful technology, but they are not yet the silver bullet some pretend they are. Use them as an assistant to help you with specific tasks where the scope is small and the requirements well defined; that is the domain where they excel and are actually useful. You can also use them to get a good starting point in a domain you are not familiar with, or for some good help when you are stuck on a problem. Attempts to give the LLM a task too big or too complex are doomed to failure, and you will just be frustrated and waste your time.
> The other might be more humbling: how significant are we? Or, as a statement instead of a question, we are the only significant thing of which we know.
We may assume that we are the only intelligent life in the universe and that life on our planet is highly significant. Humanity itself faces a great challenge in finding its way. We are currently in a dark period of our evolution—one where we have mastered a great deal of technology to make our lives materially comfortable, yet we have not mastered the "demons" within our minds. We fail to control them as individuals, and even less so as societies. These demons were instilled in us by natural evolution, serving us well until the Neolithic age. But in the modern era, they have become our greatest enemy. At this point, the biggest problem facing humanity is human nature itself. We stand on the brink of destroying our planet in numerous ways. Humans have already caused one of the greatest mass extinctions of large animals in Earth's history.
One argument supporting the theory that Earth is the only planet with advanced life is the growing realization of how many rare conditions must be met for life to emerge. In the past, scientists believed it was enough for a planet to be located within the habitable zone of its star. We are now beginning to understand that this is merely one of the most basic requirements among many others.
Earth itself has come close to losing all its life on multiple occasions—such as during the Snowball Earth period—despite the Sun remaining stable and the planet still being within the habitable zone.
One crucial factor for sustaining life is a planet’s internal magmatic activity, which must be powerful enough to generate a stable magnetic field. This field protects the atmosphere from being stripped away by solar winds. Additionally, it seems that magmatic activity played a key role in warming the planet during its early years when the Sun’s radiation was weaker. In fact, the gradual increase in solar radiation over billions of years appears to have offset the decrease in Earth's internal heat, maintaining the planet’s temperature within a range suitable for life to thrive.
However, Earth's prolonged and vigorous magmatic activity appears exceptional, likely because a colossal collision with a rogue protoplanet—the event known as the Giant Impact Hypothesis—not only formed the Moon but also injected an enormous amount of thermal energy into the young Earth. This impact created a long-lasting magma ocean phase, effectively resetting the planet's internal heat and driving rapid mantle convection and differentiation. Such enhanced magmatic activity contributed to the early formation of a stable geodynamo, which has sustained Earth's magnetic field and, consequently, its atmosphere over geological time.
For all we know, Earth may be unique in the universe, but we are far from certain enough to make such a claim.
The other possibility is that intelligent life exists elsewhere, but the barriers imposed by the speed of light—combined with the unimaginable vastness of the universe—may render it impossible for advanced civilizations to find or communicate with one another. Who knows? Perhaps the universe was created by some form of intelligence that ensured life could develop, but only in such rare and distant pockets that no two civilizations could ever reach each other, or even communicate.
EDIT: expanded the paragraph about the giant impact hypothesis.
An airplane is far less energy-efficient at flying than a bird, to the point that it is almost pathetic. Nevertheless, the airplane is a highly useful technology, despite its dismal energy efficiency. On the other hand, it would be very difficult to scale a bird-like device to transport heavy loads or hundreds of people.
I think current LLMs may scale the same way and become very powerful, even if not as energy-efficient as an animal's brain.
In practice, we humans, when we have a technology that is good enough to be generally useful, tend to adopt it as it is. We scale it to fit our needs and perfect it while retaining the original architecture.
This is what happened with cars. Once we had the thermal engine, a battery capable of starting the engine, and tires, the whole industry called it "done" and simply kept this technology despite its shortcomings. The industry invested heavily to scale and mass-produce things that work and people want.
I am now using Modelica with OpenModelica at work to describe electromagnetic systems, and it is an excellent language and, with OpenModelica, an excellent graphical user environment. Sometimes I think of it as SPICE but for multi-physics systems.
The Modelica standard library is quite mature and complete, and the numerical solvers included with OpenModelica are robust and performant.
It took me a while to learn it, but now it is paying off.
In addition, the fact that Modelica is a standard implemented by several vendors, with an open-source implementation available, is great for avoiding vendor lock-in, so it is a technology that is safe to invest in, both as an engineer and as a company.
Modelica is an excellent way to perform these simulations. Exporting a functional mock-up unit (FMU) according to the FMI standard is a first-class capability [1] that is another huge source of value, especially for systems integrators. You are able to have reasonably obfuscated models of your system in untrusted hands, and they get the full benefit of your system model. This is one area where OpenModelica is ahead of competitors including the open-source ModelingToolkit.jl [2] and related library FMIExport.jl [3].
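To give a flavour of how little ceremony consuming an FMU takes on the Python side, here is a minimal sketch using the FMPy package; the FMU file name, the parameter, and the output variable are invented for illustration.

```python
# pip install fmpy  -- a Python package for simulating FMI FMUs
from fmpy import dump, simulate_fmu

fmu = "DriveTrain.fmu"   # hypothetical FMU exported from OpenModelica
dump(fmu)                # print model metadata (variables, default experiment)

# Run the simulation with an overridden parameter and pick the output we care about.
result = simulate_fmu(
    fmu,
    stop_time=2.0,
    start_values={"load.torque": 5.0},   # hypothetical parameter name
    output=["inertia.w"],                # hypothetical output variable
)
print(result["time"][-1], result["inertia.w"][-1])
```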
I remember using OpenModelica to test a Functional Mock-up Unit where I extracted a neural net as ONNX, bundled the ONNX Runtime DLL, and then connected it to Simcenter Amesim, as part of my student work at Siemens. A pretty okay standard compared to how old and crusty the APIs of engineering software usually are.
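The ONNX Runtime side of a setup like that is pretty small. A rough Python sketch (the real thing inside an FMU would go through the C API, and the model file and tensor shapes here are made up):

```python
import numpy as np
import onnxruntime as ort  # pip install onnxruntime

# Load the exported network once, e.g. at FMU initialization time.
session = ort.InferenceSession("surrogate.onnx")   # hypothetical model file
input_name = session.get_inputs()[0].name
output_name = session.get_outputs()[0].name

def evaluate(states: np.ndarray) -> np.ndarray:
    """Called at every co-simulation step with the current model states."""
    x = states.astype(np.float32).reshape(1, -1)   # assumed [1, n] input shape
    (y,) = session.run([output_name], {input_name: x})
    return y.ravel()

print(evaluate(np.array([0.1, 0.2, 0.3])))
```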
I think we as human beings invest almost nothing in these kinds of quality-of-life improvements. Everything is based on commercial interests driven by capitalism, so big projects are undertaken only when there is a proportionate commercial interest in them. I feel we could improve people's quality of life significantly if only we put our resources and technology in service of this purpose.
In this case, you may notice that this project was done just by the municipality, not by any government or commercial entity, and only because of the goodwill of a few people in this town. I would say we need political will at the state level to accomplish these kinds of projects.
I feel that modern-day technology could do marvels to improve people's quality of life. Instead, technology often ends up making people's lives subtly more miserable.
Is this another failure of the US government to provide one of the most basic protections to its citizens? Normally the government should establish rules that prevent this sort of thing from happening.
This is a new problem created by the rise of very complex fintech companies using middlemen to handle transactions. The FDIC is proposing a new rule to require better record keeping to try to prevent issues like this in the future.
I would say that normally, to face "very complex fintech", the government should set a solid set of fundamental rules that ensure security for the people, and then seriously ensure that they are respected. Something like: "do whatever creative fintech you want as long as you respect these rules".
It seems to me that not creating and enforcing such a system of rules, or writing poor rules that leave doors open for abuse or errors, is a failure of the government and of the political establishment.
Right but then they have to litigate every new fancy attempt to dodge the rules individually, and they don't have the staff or the funding for that.
The Supreme Court has, for decades, been carving down regulatory powers to cover only exactly the explicit, specific, literal interpretations of congressional law, so now we get thousands of potential cases that amount to "Stop touching the customer's money" "I'm not touching it, the atoms in my hand are getting close enough to the money to affect it with atomic forces, but nothing can be truly described as touching anything, if you think about it."
Solid fundamental rules don't work if the Court takes every possible chance to redefine every single term to make them not apply.
You should watch the State of the Union when they pan out to show the congressmen, congresswomen, and senators, and estimate roughly their ages from how they look (or, even easier, look it up online). The average age is like 87.7 :) How are they going to set fundamental rules about very complex fintech while talking to their great-granddaughters on their Nokias…
Unfortunately Government regulations are almost never proactive, always reactive. Honestly a lot of these new fintech companies feel like they exist mainly to get around current regulations.
It isn't the government's job to stop people from being stupid with their money. Go look at https://www.withyotta.com/. It is literally a gambling site. If people want to put their life savings in there then, well, that's on them.
This also happens because Italian television dumbs people down. Now we also have social media, so there is very little hope things will get any better.
The Windows ecosystem typically deployed in corporate PCs or workstations is often insecure, slow, and poorly implemented, resulting in ongoing issues visible to everyone. Examples include problems with malware, ransomware, and Windows botnets.
In corporate environments, IT staff struggle to contain these issues using antivirus software, firewalls, and proxies. These security measures often slow down PCs significantly, even on recent multi-core systems that should be responsive.
Microsoft is responsible for providing an operating system that is inherently insecure and vulnerable. They have prioritized user lock-in, dark patterns, and ease of use over security.
Apple has done a much better job with macOS in terms of security and performance.
The corporate world is now divided into two categories:
1. Software-savvy companies that run on Linux or BSD variants, occasionally providing macOS to their employees. These include companies like Google, Amazon, Netflix, and many others.
2. Companies that are not software-focused, as it's not their primary business. These organizations are left with Microsoft's offerings, paying for licenses and dealing with slow and insecure software.
The main advantage of Microsoft's products is the Office suite: Excel, Word, and PowerPoint, though even Word is actually mediocre.
I think you represent the schism in your own post. Retail is hyper-focused on the names Microsoft and Windows. But the enterprise and technical people are focused on rolling back a bad CrowdStrike update. They will spend hours and even days doing that, asking why they were vulnerable to such an update and what they should have done to avert being vulnerable to a bad update.
And for them it will be a bit of a stretch to say Microsoft should have stopped us deploying CrowdStrike. I’m sure Microsoft would love to do just that and sell its own Microsoft Solution.
> it will be a bit of a stretch to say Microsoft should have stopped us deploying CrowdStrike
I read GP's post to mean that if you take a step back, Windows' history of (in)security is what has led us to an environment where CrowdStrike is used / needed.
I can answer this. For the same reason I have run ClamAV on Linux development workstations. Because without it, we cannot attest that we have satisfied all requirements of the contract from the client's security organization.
Also if you are a small business and are required to have cybersecurity liability insurance, the underwriter will require such sensors to be in place or you will get no policy.
For the same reasons there's antivirus software for Mac and Linux.
People coming from Microsoft systems just expect it to be required, so there's demand for it (demand != need). And in hybrid environments it may remove a weak link: e.g. a Linux mailserver that serves mail to Windows users is best off having virus detection for Windows viruses.
I’m not defending CrowdStrike here. This is a clearly egregious lack of test coverage, but CrowdStrike isn’t “just” antivirus. The Falcon Sensor does very useful things beyond that, like USB device control, firewall configuration, reporting, etc.
If your use case has a lesser need for antimalware you might still deploy CrowdStrike to achieve those ends. Which help to lessen reliance on antimalware as a singular defense (which of course it shouldn’t be).
It's not just those darn Windows admins. A lot of the certifications customers care about (SOC 2, ISO whatever, FedRAMP) have line items that require it.
I've had to install server antivirus onto my Linux laptop at 4 different companies. Every time it's been a pain in the ass, because the only antivirus solutions I've found for Linux assume that "this must be a file server used by Windows clients". None of them are actually useful, so I've installed them and disabled them. There, box-checking exercise done.
> For the same reasons there's antivirus software for Mac and Linux.
Because they can also get malware or could use the extra control CS provides, and the "I'm not a significant target so I'm safe" is not really a solid defense? Bad quality protection (as exemplified by the present CS issues) isn't a justification for no protection at all.
Would you ignore the principle of least privilege (least user access) and walk around with all the keys to the kingdom just because you're savvier than most at detecting an attack and anyway you're only one person, what are the chances you're targeted? You're the Linux/MacOS of the user world, and "everyone knows those principles are only for the Windows equivalent of users".
I'm not arguing that Linux or Mac need no protection.
There are serious threats to any Linux machine. And if you include Android, there are probably far more Linux machines out there. Hell, counting her navigation, router, NAS, TV, and car, my 70+ year old mom runs at least 5 Linux machines at her home. It's a significant target. And Mac is quite obviously a neat target, if only because the demographic usually has higher income (hardly any Bangladeshi sweatshop worker will put down the cash to buy a MacBook or iPhone, but they might just own an Android or a Windows laptop).
I'm arguing that viruses aren't a threat, generally. Partly due to the architecture, partly due to their usage.
Neither Linux nor OSX are immune to viruses, though malware is more commonly written to target Windows given its position in the market. Both iOS and Android are frequent malware targets despite neither being related to Windows, and consequently, both have antivirus capabilities integrated deeply into both the OS and the app delivery ecosystem.
Any OS deployed on a user device needs some form of malware protection unless the device is blocked from doing anything interesting. You can generally forgo anti-malware on servers that are doing one thing that requires a smaller set of permissions (e.g., serving a website), but that's not because of the OS they are running.
Sure, “AVG Mobile Security” is available, but nobody needs it, and it isn’t anything like antivirus software on a computer. It provides... a photo vault, a VPN, and “identity protection.”
To tell people that they are vulnerable without something like this on their iPhone is ludicrous.
Nobody needs antivirus software or malware protection like this on their iPhone, unless they like just giving money away.
If you'll scroll up to the comment you originally replied to, you'll see that I said Android and iOS have AV capabilities built into the OS and app delivery ecosystem. That's more than enough for most users: mobile OSes have something much closer to a capability-based security paradigm than desktop OSes, and both Apple and Google are pretty quick to nerf app behavior that subverts user expectations via system updates (unless it was done by the platform to support ad sales).
Your mobile device is a Turing machine, and as such it is vulnerable to malware. However, the built-in protections are probably sufficient unless you have a specific reason to believe they are not.
The only AV software for mobile devices that I have seen used is bundled with corporate "endpoint management" features like a VPN, patch and policy management, and remote wipe support. It's for enterprise customers that provision phones for their employees.
> You can generally forgo anti-malware on servers that are doing one thing that requires a smaller set of permissions (e.g., serving a website), but that's not because of the OS they are running.
It seems to me like you’re trying to have it both ways.
It really is because of the OS that one doesn’t need to run anti-malware software on those servers and also on the iPhone, which you seem to have admitted.
It seems like we're both trying to make a distinction that the other person thinks is unimportant. But if the crucial marker for you is whether anti-malware protection is built into the OS, then I've got great news for you: Windows has built-in AV, too, and it's more than enough for most users.
The distinction I was trying to make is that the anti-malware strategy used by servers (restrict what the user can do, use formal change control processes, monitor performance trends and compare resource utilization against a baseline and expectations inferred from incoming work metrics) is different from the anti malware strategy used by "endpoints" (scanning binaries and running processes for suspicious patterns).
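To make the server-side strategy concrete, here is a toy sketch of the "compare against a baseline" idea in Python using psutil; the baseline numbers and thresholds are invented, and a real deployment would feed a metrics/alerting stack rather than print().

```python
import time
import psutil  # pip install psutil

# Hypothetical baseline for a single-purpose web server, derived from past observation.
BASELINE = {"cpu_percent": 35.0, "rss_mb": 800.0, "connections": 200}
TOLERANCE = 2.0  # alert when a metric exceeds twice its baseline

def snapshot() -> dict:
    # Total resident memory across all processes, CPU load, open inet sockets.
    rss = sum(p.info["memory_info"].rss for p in psutil.process_iter(["memory_info"]))
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "rss_mb": rss / 1e6,
        "connections": len(psutil.net_connections(kind="inet")),
    }

while True:
    current = snapshot()
    for metric, expected in BASELINE.items():
        if current[metric] > expected * TOLERANCE:
            print(f"ALERT: {metric}={current[metric]:.0f} vs baseline {expected:.0f}")
    time.sleep(60)
```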
I'd say very special people need malware protection like this on their iPhone.
Remember NSO Group? Or the campaign Kaspersky exposed last year? Apple successfully made malware on iOS very rare unless you are targeted. But right now, it is impossible for these targeted people to get any kind of protection. Even forensics after being compromised is extremely difficult thanks to Apple's walled garden approach.
The usefulness of a theoretical app that might be able to stop high-power exploits isn’t being debated. The claim I’m objecting to is that everybody should be running (available) antivirus software on their phone.
But if you mean that these highly targeted people would have been helped by running “AVG Mobile Security” or one of the other available so-called “antivirus” apps, then I’ve got an enterprise security contract to sell you. :)
> The claim I’m objecting to is that everybody should be running (available) antivirus software on their phone.
You're objecting to the (much more specific) claim that everybody should be running 3P antivirus software on their phone. Nobody made this claim. You are already running AV software on your phone, and whatever is built into the platform is more than sufficient for most users.
I spent some time on the STIG website out of curiosity. There seem to be down-to-earth practical requirements, but only for Windows, cf. https://public.cyber.mil/stigs/gpo/
Why that justifies running antivirus on Linux is beyond my understanding.
Weak, impotent, speechless IT personnel who cannot stand up to incompetence?
Windows IT admins who don’t use or understand Linux/Mac. Who also buy at the enterprise level. And who probably have to install (perhaps unnecessary) endpoint protection to satisfy compliance checklists.
The amount of Windows-centric IT that gets pushed to Linux/Mac is crazy. I've been in meetings where using Windows-based file storage was discussed as a possibility for an HPC compute cluster (Linux). And they were being serious. This was in theory so that central IT could manage backups.
To make money? Just because CrowdStrike is available for Linux and Mac doesn't mean that a) people buy and use it in substantial numbers b) people need to buy it. It would be interesting to hear from someone using CrowdStrike in a Linux/Mac environment.
We run Crowdstrike on Linux and Macs so that we can tick some compliance checkbox.
Fun fact: they’ve recommended we don’t install the latest kernel updates since they usually lag a bit with support. We’re running Ubuntu LTS, not some bleeding-edge Arch. It now supports using eBPF so it’s somewhat better.
The policies are written by folks who have no understanding of different operating environments. The requirement "All servers and workstations must have EDR software installed" leads to top-level execs doing a deal with Crowdstrike because they "support" Linux, Mac, and Windows. So then every host must have their malware installed to check the box. Doesn't matter if it's useful or not.
Indeed and insurance too. For our business, our professional errors and omissions coverage for years had the ability to cover cyber issues. No more. That requires cybersecurity insurance and the underwriters will not entertain underwriting a policy unless EDR is in place. They don't care if you are running OpenBSD and are an expert in cybersecurity who testifies in court cases or none of that. EDR from our list or no insurance.
For macOS? Because without it you don't have certain monitoring and compliance capabilities that are standard built-ins in windows, plus for windows/linux/mac the monitoring capabilities are all useful and help detect unwanted operation.
> I read GP's post to mean that if you take a step back, Windows' history of (in)security is what has led us to an environment where CrowdStrike is used / needed.
Windows does have a history of insecurity, but it is no different from any other software in this regard. The environment would be the same in the absence of Windows.
Attacks are developed for Windows because attacks against Windows are more valuable -- they have a large number of potential targets -- not because they're easier to develop.
In the case of a bad Linux kernel update I would just reboot and pick the previous kernel from the boot menu. By default most Linux distributions keep the last 3. I'm not an IPMI remote management expert but it may be possible to script this.
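It is scriptable up to a point. Something like the sketch below (Python wrapping ipmitool; the BMC hostname and credentials are placeholders) gets you a power cycle and a Serial-over-LAN console, from which picking the previous GRUB entry is still a manual step unless you pre-arm something like grub-reboot.

```python
import subprocess

# Hypothetical BMC address and credentials; ipmitool must be installed on the admin host.
BMC = ["ipmitool", "-I", "lanplus", "-H", "bmc.example.com", "-U", "admin", "-P", "secret"]

def power_cycle() -> None:
    """Hard-reset the box through its BMC."""
    subprocess.run(BMC + ["chassis", "power", "cycle"], check=True)

def open_console() -> None:
    """Attach a Serial-over-LAN console so you can pick the previous kernel
    in the GRUB menu by hand (GRUB must be configured for the serial console)."""
    subprocess.run(BMC + ["sol", "activate"])

if __name__ == "__main__":
    power_cycle()
    open_console()
```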
All my machines at home run Linux except for my work laptop. It is stuck in this infinite blue screen reboot loop. Because we use Bitlocker I can't even get it into safe mode or whatever to delete the bad file. I think IT will have to manually go around to literally 8,000 work laptops and fix them individually.
You would "just pick the previous kernel from the boot menu". That's funny, cause in this case you could "just delete the file causing the issue." Anything can sound easy and simple if you state it that way.
How do you access the boot menu for a server running in the cloud, which you normally just SSH into (RDP in Windows' case)?
About your last paragraph: we have just started sending out the bitlocker keys to everyone so it can be done by them too. Surely not best practice, but it beats everyone having to line up at the helpdesk.
> You would "just pick the previous kernel from the boot menu". That's funny, cause in this case you could "just delete the file causing the issue." Anything can sound easy and simple if you state it that way.
One small difference, is that choosing the kernel from the boot menu is done before unlocking the encrypted drive, so no recovery keys would be necessary. And yes, choosing an entry from a menu (which automatically appears when the last boot has failed) is simpler than entering recovery mode and typing a command, even without disk encryption.
A better analogue would be a bad update on a non-kernel package which is critical to the boot sequence, for instance systemd or glibc. Unless it's one of the distributions which snapshot the whole root filesystem before doing a package update.
NixOS boots to a menu of system configuration revisions to choose from, which includes any config change, not just kernel updates.
It's not filesystem snapshots either. It keeps track of input parameters and then "rebuilds" the system to achieve the desired state. It sounds like it would be slow, but you've still got those build outputs cached from the first time, so it's quite snappy.
If you took a bad update, and then boot to a previous revision, the bad update is still in the cache, but it's not pointed to by anything. Admittedly it takes some discipline to maintain that determinism, but it's discipline that pays off.
I don't expect to use it much myself but I love the idea of reducing the OS to an interchangeable part. What matters is the software and its configuration. If windows won't boot for some reason, boot to the exact same environment but on a different OS, and get on with your day.
If something is broken about your environment, fix it in the code that generates that environment--not by booting into safe mode and deleting some file. Tamper with the cause, not with the effect. Cattle, not pets, etc.
This sort of thing is only possible with nix (and maybe a few others) because elsewhere "the exact same environment" is insufficiently defined, there's just not enough information to generate it in an OS-agnostic way.
I can't delete a file if the machine doesn't finish booting. Unless you are suggesting removing the drive and putting it in another machine. That requires a screwdriver and 5 minutes vs. the 10 seconds to reboot and pick a different kernel.
I'm not talking about the cloud. I am talking about the physical machines sitting in front of me specifically my work laptop.
I am an integrated circuit computer chip designer, not a data center IT person. I have seen IPMI on the servers in our office. Do cloud data centers have this available to people?
I have a cheap cloud VM that I pay $3.50 a month. I normally just SSH in but if I want to install a new operating system or SSH is not responding then I log in to the web site and get a management console. I can get a terminal window and login, I can force a reboot, or I can upload an ISO image of another operating system and select that as the boot device for the next reboot and install that.
Does your cloud service not have something like this?
I don't know what our corporate IT dept wants to do. We all work from home on Friday and I can't login to check email so I'll just wait until Monday as there is nothing urgent today anyway.
The OS drive is encrypted with Bitlocker. I've seen another thread where corporate IT departments were giving out the recovery key to users. I don't need to get anything done today. I'll go into the office on Monday and see what they say.
Idk if this is a serious question, but you just turn on console access in the cloud provider. It’s super easy. Same concept as VMWare. It’s possible that not all cloud providers do that, I suppose.
MacOS has been phasing out support for third-party kernel extensions and CrowdStrike doesn't use a kernel extension there according to some other posts.
I’m convinced that one reason for this move by Apple was poor quality kernel extensions written by enterprise security companies. I had our enterprise virus/firewall program crash my Mac all the time. I eventually had to switch to a different computer (Linux) for that work.
It wasn’t CrowdStrike, but quality kernel-level engineering isn’t what I think of when I think of security IT companies.
But, also credit Apple here. They’ve made it possible for these programs to still run and do their jobs without needing to run in kernel mode and be susceptible to crashes.
Not only security software, but really any 3rd party drivers have caused issues on Windows for years. Building better interfaces less likely to crash the kernel was a smart move
When I started doing driver development on MacOS X in the early 2000s, there were a number of questions on the kernel/driver dev mailing lists for darwin from AV vendors implementing kernel extensions. Most of them were embarrassing questions like "Our kernel extension calls out to our user level application, and sometimes the system deadlocks" that made me resolve to never run 3rd party AV on any system.
Whether you like macOS or not, they definitely are innovating in this space. They (afaik) are the only OS with more granular data access for permissions as well (no unfettered filesystem access by default, for instance)
It's also a shame CrowdStrike doesn't take kernel reliability seriously
The user can change anything they want, but a process launched by your user doesn't inherit every user access by default. You (the user) can give a process full disk access, or just access to your documents, or just access to your contacts, etc. It's maximizing user control, not minimizing it.
You say they're planning to add a feature in the next release, but what you linked to is merely an uncompleted to-do item for creating a UI switch to toggle a feature that hasn't been written yet. I think you win the prize for the most ridiculous exaggeration in this thread. Unless you can link to something that actually comes anywhere close to supporting your claim, you're just recklessly lying.
The linked Issue #8553 is "just" about creating a toggle for GPU acceleration. It's blocked by Issue #8552 [0], which is the actual Issue about the acceleration and originally belonged to Milestone "Release 4.3". It seems to have been removed later, which I didn't expect or know about. Accusation of lying was completely unnecessary in your comment.
Moreover, the Milestone was removed not because they changed their mind about the Release but for other reasons [1].
Ok, so your [0] shows that the real work has barely been started. The only indication it was ever planned for the next release was a misunderstanding on your part about the meaning of a tag that was applied to the issue for less than one day last fall, and they've stopped tagging issues with milestones to prevent such misunderstandings in the future. It still looks to me like your exaggerated claim was grounded in little more than wishful thinking.
Am I missing something? This is to add a toggle button and the developers say they are blocked because GPU acceleration feature doesn't exist so the button wouldn't be able to do anything.
The issue with CrowdStrike on Linux did not cause widespread failures, so it's clear that the majority of enterprises that do run their servers on Linux were not affected. They were invulnerable because they do not need CrowdStrike or similar.
Linux (or BSD) servers do not usually require third party kernel modules. Linux desktops might have the odd video driver or similar.
>Apple has done a much better job with macOS in terms of security and performance.
I really like their corporate IT products that are going to push MS out as you say. I particularly love iActive Directory, iExchange, iSQLserver, iDynamics ERP, iTeams. Apple's office products are the reason no one uses Excel any more. Their integration with their corporate cloud, iAzure, is amazing. I love their server products in particular; it being so easy to spin up an iOS server and have DFS file sharing, DNS, etc. is great. MS must be quaking in their shoes.
All of those are products that create huge risks when deployed in mission-critical environments, and this is exactly the problem.
The entire Wintel ecosystem depends on people putting their heads in the sand, repeating "nobody ever got fired for buying Microsoft/CrowdStrike/IBM", and neglecting to run even the most trivial simulation of what happens when the well-understood design flaws of those platforms get triggered because a QA department you have no control over drops the ball.
The problem is that as long as nobody dares to recognize the current monoculture around the "market leading providers" for what it is, this kind of event will remain quite likely even when nobody is trying to break anything, and extremely likely once you add well-funded malicious actors (ranging from bored teenagers to criminal gangs and geopolitical rivals).
The problem is that adding fair-weather products that give the illusion of control through fancy dashboards, on the days they work, is not a substitute for proper resilience testing and security hardening, but it is far less disruptive to companies that don't really want to leave the '90s PC metaphor behind.
You have 100,000 devices to manage. How do you handle that efficiently without creating a monoculture?
It's not a "'90s PC metaphor" problem. Swap Chromebooks for PCs and you still have the problem-- how do you handle centralized management of that "fleet"?
Should every employee "bring their own device" leaving corporate IT "hands-off"? There are still monocultures within that world.
Poor quality assurance on the part of software providers is the root cause. The monocultures and having software that treats the symptoms of bad computing metaphors aren't good either, but bad software quality assurance is the reason this happened today.
> Swap Chromebooks for PCs and you still have the problem-- how do you handle centralized management of that "fleet"?
Simplicity (and hence low cost) of fleet management, OS boot-verification, no third-party kernel updates, and A/B partitions for OS updates are among the major selling points of Chromebooks.
It's a big reason they have become so ubiquitous in primary education, where there is such a limited budget that there's no way they could hire a security engineer.
The OP was deriding monoculture. My point was that pushing out only Chromebooks is still perpetuating a monoculture. You're just shifting your risk over to Google instead of Crowdstrike / Microsoft.
re: Chromebooks themselves - The execution is really, really good. The need for legacy software compatibility limits their corporate penetration. I've done enough "power washes" to know that they're not foolproof, though.
ChromeOS is just Linux, isn't it? It's going to suffer from the same problem as NT re: a buggy kernel mode driver tanking the entire OS.
Google gets a pass because their Customers are okay with devices with limited general purpose ability. Google is big enough that the market molds product offerings to the ChromeOS limitations. I think MSFT suffers from trying to please everybody whereas Google is okay with gaining market share by usurping the market norms over a period of years.
> ChromeOS is just Linux, isn't it? It's going to suffer from the same problem as NT re: a buggy kernel mode driver tanking the entire OS.
ChromeOS is not just Linux. It uses the Linux kernel and several subsystems (while eschewing others), but it also has a security and update model that prevents third parties (or even the user themselves) from updating kernel space code and the OS's user space code, so basically any code that ships with the OS.
Therefore, the particular way that the Crowdstrike failure happened can't happen on ChromeOS.
However, Google themselves could push a breaking change to ChromeOS. That, however would be no different than Apple or Microsoft doing the same with their OS's.
I am familiar with Google's walled garden w/ ChromeOS. I didn't mean to give the impression that I was not.
It's "just Linux" in the sense that it has the same Boolean kernel mode/user mode separation that NT has. ChromeOS doesn't take advantage of the other processor protection rings, for example. A bad kernel driver can crash ChromeOS just as easily as NT can be crashed.
Hopefully Google just doesn't push bad kernel drivers. Crowdstrike can't, of course, because of the walled garden. That also means you can't add a kernel driver for useful hardware, either. That limits the usefulness of ChromeOS devices for general purpose tasks.
> That also means you can't add a kernel driver for useful hardware, either. That limits the usefulness of ChromeOS devices for general purpose tasks.
Its target market isn't niche hardware but rather the plethora of use cases that use bog-standard hardware, much like many of the use cases that CS broke a few days ago.
Yes. I said that in a post up-thread. Google is making the market mold itself to their offering, rather than being like Microsoft and molding their offering to the market. Google is content to grow their market share that way.
If CrowdStrike's QA department is all that stands between you and days of no operations, then you chose to live with the near certainty that you will have days rather than hours of unplanned company-wide downtime.
And if you cannot actually abandon someone like Microsoft that consistently screws up their QA, then it's basically dishonest for you to claim that reliability is even a concern for your desktop platform.
And that's essentially what I mean when I accuse the modern enterprise's client-device teams of being stuck in the '90s: those risks were totally acceptable back when the stakes were low and outages only impacted non-time-critical back-office clerical work. But what we saw today was that those high-risk, cost-optimized systems got deployed into roles where the risk/consequence profile is entirely different.
So what you do is keep the low-impact data-entry clerks and spreadsheet wranglers on the Windows platform, but give the customer-facing workers dealing with time-sensitive tasks something a bit less risky.
It might not be as easy as just deploying the same old platform designed back in the '90s to everyone, but once you leave the Microsoft ecosystem, dual sourcing based on open standards becomes totally feasible, at costs that need not be prohibitive, since everything in the Unix-like ecosystem, including web browsers, has multiple independent implementations. You basically just have to standardize on two to four platforms rather than one, which again isn't unfeasible.
It's telling that an Azure region failed this news cycle without anyone noticing: companies just don't tolerate for their backends the kind of risk people take with their Wintel desktops, so most critical services hosted in Microsoft's Iowa datacenter had a second site on standby.
> And if you cannot actually abandon someone like Microsoft that consistently screws up their QA
The last outage I can remember due to an ms update was 7 or 8 years ago. Desktops got stuck on 'update 100% complete'. After a couple of minutes I pressed ctrl+alt+del and it cleared. Before that...I don't remember. Btw MS provides excellent tools to manage updates, and you can apply them on a rolling basis.
> If CrowdStrike's QA department is all that stands between you and days of no operations ...
For companies of a certain large size, I guess. For all but the largest companies, though, there's no choice but to outsource software risks to software manufacturers. The idea that every company is going to shoulder the burden of maintaining their own software is ridiculous. Companies use off-the-shelf software because it makes good financial sense.
> And if you cannot actually abandon someone like Microsoft that consistently screws up their QA, then it's basically dishonest for you to claim that reliability is even a concern for your desktop platform.
When a company has significant software assets tied to a Microsoft platform there's no alternative. A company is going to use the software that best-suits their needs. Platform is a consideration, however I've never seen it be the dominant consideration.
Today's issue isn't a Microsoft problem. The blame rests squarely on Crowdstrike and their inability to do QA. The culture of allowing "security software" to automatically update is bad, but Crowdstrike threw the lit match into that tinderbox by pushing out this update globally.
As another comment points out, Microsoft has good tools for rolling update releases for corporate environments. They're not perfect but they're not terrible either.
> It might not be as easy as just deploying the same old platform ...
When a company doesn't control their software platform they don't have this choice. Off-the-shelf software is going to dictate this.
In some fantasy world where every application is web-based and legacy code is all gone maybe that's a possibility. I have yet to work in that environment. Companies aren't maintaining the "wintel desktop" because they want to.
Blaming CrowdStrike's QA might feel good, but the problem is that no company in the history of the world has been good enough at QA for it not to be reckless to allow day-one patching of critical systems, or for that matter to allow single-vendor, single-design critical systems in the first place. And yet the cybersecurity guidelines required to sustain the pretense that Windows can be used securely all but demand that companies take that risk.
It's also fundamentally a problem of denial: everyone knows there will not be a good solution to any issue around security and stability that does not require the assets tied up inside fragile, monopoly-operated ecosystems to eventually be either extracted or written off, but nobody wants to blaze new trails.
Claiming powerlessness is just lazy. Yes, it might take a decade to get out from under the yoke of an abusive vendor, as we saw with IBM, but since IBM is now a footnote in the history of computing, it's pretty clear that it can be done once people start realizing there is a systematic problem and not just a series of one-off mistakes.
And we know how to design reliable systems; it's just that doing so is completely incompatible with allowing any of America's big IT vendors to remain big and profitable, and that's scary to every institution involved in the current market.
To be fair, IBM products back in the day when that saying made sense never had these kinds of problems. It's straight up insulting to compare them to somebody like Crowdstrike.
Wintel won by being cheaper and shittier and getting a critical mass of fly by night OEMs and software vendors on board.
IBM was more analogous to the way Apple handles things. Heavy vertical integration and premium price point with a select few software and hardware vendors working very closely with IBM when software and hardware analogous to Crowdstrike in terms of access was created.
> I really like their corporate IT products that are going to push MS out as you say. I particularly love iActive Directory, iExchange, iSQLserver, iDynamics ERP, iTeams.
You’re being sarcastic, but do you like those MS products, specifically Teams?
I genuinely believe that any business that doesn't make Teams is doing the lord's work.
I'm stuck with them on my company MacBook and will definitely say, they suck.
In the 5 years I've been here, Outlook has never addressed this bug (not even sure they consider it a bug): Get an invitation to an event. See it on calendar view. Respond to it on calendar view. Go to inbox. Unread invitation is sitting there in your inbox requesting a response.
I don't even need to talk about why Teams is trash. Terrible design is in Teams's DNA.
In enterprise software, you don't need to be good. Just better than your competitors. I distinctly remember doing a happy jig about 6 years ago when we moved from Skype for Business (shudder) to Teams. Did teams drive me nuts? Absolutely. But I was free from the particular hell of SFB.
TBF I have less experience with Dynamics than the others, but yes they are all excellent.
I include Teams in that. I don't think there is another app on the market that does what Teams does. Integrated video conferencing, messaging, and file sharing in one place. All free with the office package my team already use and fully integrated with Azure AD for sso. I use it all day with zero problems. I honestly can't see why anyone would use anything else
The fact that Apple is not trying to be a tentacular behemoth siphoning profits from every enterprise environment does not invalidate the fact that macOS is secure and performant.
Apple is a tentacular behemoth in the consumer space.
Not a single statement you present as "fact" has been true across large-scale deployments in my experience, especially the first part, which tells me you have not experienced working with them as a supplier. I think you mean in your opinion or experience, but please don't present wishful thinking as factual statements. It derails objectivity and discussions.
As ridiculous as it sounds, this does work on a subset of the machines affected based on my experience of the last few hours. With other machines you can seemingly reboot endlessly with no effect.
Dynamics, Teams, Exchange, Active Directory all suck. There are better alternatives but CIOs are stuck in 1996. Apple themselves in their corporate IT environment use none of those things yet somehow are one of the biggest and most profitable companies in the world. Azure is garbage compared to AWS. Using Azure Blob vs S3 is a nightmare. MSSQL is garbage compared to PostgreSQL. Slack is vastly better than Teams in literally every aspect. I just did a project moving a company from AWS to Azure and it was simply atrocious. Nobody at the user level likes using MS products if they have experience using non-MS products. It’s like Bitbucket — nobody uses that by choice.
You've got to admire Apple fanboys' nerve in saying Apple is a better company when it comes to IT in a professional setting.
It appears whatever their basic and narrow use-case is becomes what the whole "corporate IT" is.
Windows sucks and recently Microsoft has been on a path to make it suck more, but saying Apple is better for this part of the IT universe is.. hilarious.
I think if someone wants to criticize Microsoft after experiencing their buggy products for 20 years straight, that is not “baseless,” although I accept that taking responsibility for literally anything our products do goes against the core values of our profession.
They do have some crappy products, but those crappy products make the world move, because nobody really makes better drop-in replacements. The same goes for SAP, Canonical, Android, etc.: none of them are fault-tolerant, they all have issues and will fail if you fuzz them with enough edge cases. And according to this article, CrowdStrike caused the issue, not Windows, which is what I was pointing at.
Do you think MacOS can't fail if you fuck with it long enough? Sometimes you don't even have to, it just fails by itself. My Ubuntu 22.04 LTS at my previous job gave me more issues than Windows ever did. Thanks Snaps, Wayland and APT. No workstation OS is perfect.
If you want a fault-tolerant OS you're gonna have to roll your own Linux/BSD build based on your requirements and do your own dev and testing. Which company has money for that? So of course they're gonna pick an off-the-shelf solution that best fits their needs on their budget. How is it Microsoft's fault what their customers choose to do with it? Did they guarantee anywhere that their desktop OS is fault-tolerant or should be used in high-availability systems and emergency services, especially with crappy endpoint solutions hooked in at kernel level?
lol. i’ll dunk on Apple as much as i’ll dunk on any other OS, but they wouldn’t be as praised for security if they had to manage the infrastructure and users that Windows supports
> I particularly love iActive Directory, iExchange, iSQLserver, iDynamics ERP, iTeams. Apple's office products are the reason no one uses Excel any more.
I see your sarcasm backfiring, as most of what you are listing is just Microsoft dog food with no real usefulness. The only good thing in your list is Excel; all the rest is bloatware. Teams is a resource hog that serves no useful purpose. Skype was perfectly fine for sending messages or having a video call.
I admit I don't have experience as an IT administrator, but things like managing email, accounts, databases, and remote computers can be done with well-established tools from the Linux/BSD world.
Wild that you’d write this comment with such a confident voice then.
I worked at a company whose IT team managed both Windows and Mac computers, and apparently MS's Active Directory is leagues ahead of Apple's offering.
Which makes sense. MS is selling Windows to administrators, not to users.
I'm a die hard FOSS guy, but as someone who has done LDAP work with FreeIPA and OpenLDAP -- AD does a better job.
Admittedly, it's mostly a better job at integrating with Microsoft-powered systems, so it damn well should do a better job, but it's a core business offering and has polish on it in ways that many FOSS offerings don't.
disclaimer: haven't done FreeIPA and LDAP work in the last ~3 years, maybe they got better.
I would disagree. I work in healthcare and we’ve always used SQL Server. While I wouldn’t pick it, it’s been reliable and integrates with auth.
No one “loves” Teams, but honestly it serves its purpose for us at no cost.
No one loves OneDrive but it works.
I think people underestimate how much work it would take to integrate services, train people, and meet compliance requirements when using a handful of the best in class products instead of MS Suite.
People use Teams and OneDrive because it’s “Free” when you use Office. IMO, that’s a bit of an anti-trust problem. Both have good competitors (arguably better competitors) that are getting squeezed because of the monopoly pricing with Office.
But with SQL Server, on the other hand, I think you are right. It is a good piece of software. But it also has high quality competition from multiple vendors. Some of it enterprise (Oracle, DB2), some of it FOSS (Postgres, MySQL). Because of this, it has to be better quality to survive… they couldn’t bundle it to get market share, it actually had to compete.
People use Teams because it's well integrated into Office, 365, Entra and other MS products, they would (and recently do) pay for it. It has functionalities that no other alternative has, e.g. it can act as a full call centre solution through a SIP gateway.
Yeah, sure. But the marginal cost is zero, whereas a Slack subscription for every person in our org will cost about 1 million dollars a year. And it doesn’t integrate as well with every other piece of functional but mediocre software.
The person approving the $1 million dollar budget item doesn’t really care that Teams isn’t “free” in the sense that there is no free lunch, and while they perhaps have moral qualms of antitrust, that’s outside their purview. We’re locked into Office suite and right now there is no extra charge for Teams.
> Teams is a resource hog that serves no useful purpose. Skype was perfectly fine for sending messages or having a video call.
I’m sorry, this is a very silly take. I’m no fan of Teams or Slack but I can’t deny the functionality they offer, which is far above and beyond what Skype does.
> I admit I don’t have experience as an IT administrator
Time was, NeXT was a hard sell into corporations because it required so little administration, and what administration there was was so easily done that IT staffs were hugely cut back after deploying them.
Had to move my Cube this past week-end, and it made me incredibly sad.
Using a NeXT Cube w/ Wacom ArtZ and an NCR-3125 running Go Corp.'s PenPoint (and rebooting into Windows for Pen Computing when I wanted to run Futurewave Smartsketch) was the high-water mark of my computing experience.
It was elegant, consistent, reliable, and "just worked" in a way no computer has since (and I had such high hopes for Rhapsody and the Mac OS X Public Beta).
Then you probably shouldn't speak on software exclusively understood and administered by IT administrators. I've worked in IT for some time, and every single one of those products (aside from Dynamics) has been among the most important parts of our administrative stack.
Even Excel is beginning to be regarded as a dangerous piece of software that gives the illusion of power while silently bankrupting departments that depend on the idea that large spreadsheets are an accurate and reliable way to analyze large or complex datasets.
The '90s are over, but for some reason the average enterprise department has a problem internalizing the fact that the demands today are different than they were 25 years ago.
Meanwhile, while the HN bubble imagines people doing big-data jobs in Excel, in the real world tens or hundreds of millions of people are perfectly satisfied doing small-data jobs in Excel.
The problem is that without tools and processes to systematically validate those results, people might be perfectly happy with completely inaccurate results.
I know I have had to correct one in three Excel sheets I have ever gone over with pen and paper to validate the results, but I am a paranoid sod who actually does this kind of exercise on a regular basis.
Almost all of the disciplines known to rely on Excel have a serious issue with repeatability of results, either because nobody ever attempts it or because it's a messy field without a well-defined methodology.
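For what it's worth, the pen-and-paper step can often be replaced by a few lines of Python that recompute the totals independently of the sheet's own formulas. A sketch with pandas, where the workbook name, sheet names, and columns are all hypothetical:

```python
import pandas as pd  # pip install pandas openpyxl

# Hypothetical workbook: a per-line-item sheet and a "Summary" sheet whose
# totals were typed in or maintained by hand-written formulas.
details = pd.read_excel("budget.xlsx", sheet_name="Details")
summary = pd.read_excel("budget.xlsx", sheet_name="Summary", index_col=0)

# Recompute each department's total independently of the spreadsheet's formulas.
recomputed = details.groupby("department")["amount"].sum()

for dept, claimed in summary["total"].items():
    expected = recomputed.get(dept, 0.0)
    if abs(claimed - expected) > 0.01:
        print(f"{dept}: sheet says {claimed:,.2f}, recomputed {expected:,.2f}")
```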
I work in finance. We have double entry accounting and literal checks and balances to validate our results. It is not a messy field, and has a well defined methodology. We have been the biggest spreadsheet users at many of the companies I have worked with.
I used to run a C++ shop, writing heavy-duty image processing pipeline software.
It did a lot, and it needed to do it in realtime, so we were constantly busting our asses to profile and optimize the software.
Our IT department insisted that we install 'orrible, 'orrible Java-based sneakware onto all of our machines, including the ones we were profiling.
We ended up having "rogue" machines, that would have gotten us in trouble, if IT found out (and I learned that senior management will always side with IT, regardless of whether or not that makes sense. It resulted in the IT department acting like that little sneak that sticks his tongue out at you, while hiding behind Sister Mary Elephant's habit).
But, to give them credit, they did have a tough job, and the risks were very real. Many baddies would have been thrilled to get their claws on our software.
We were in need of an MDM to help staff (non-techs) with their MacBooks. I haven't noticed any issues, nor have two of my staff who are trialling it. What's been your main gripe?
I'm a dev but also manage the IT team of one sysadmin, and I haven't noticed any performance hits. Not yet anyway, but it's only been two weeks.
Installing software is painful (some of this is perhaps related to how much the IT group has restricted for us; I can't even change my screen saver), and there's weirdness like bizarre pop-ups asking for your password from time to time. It just doesn't belong on a developer machine.
The poor quality of Windows and associated software is not the problem here. The problem is that Microsoft especially, but software vendors generally, encourage users to blindly accept updates which they do not understand or know how to roll back. And by "encourage" I mean that they've removed the "no thanks" and "undo" buttons.
Here on Linux (NixOS), I am prompted at boot time:
> which system config should be used?
If I applied a bad update today, I can just select the config that worked yesterday while I fix it. This is not a power that software vendors want users to have, and thus the users are powerless to fix problems of this sort that the vendors introduce.
It's not faulty software, it's a problematic philosophy of responsibility. Faulty software is the wake-up call.
What makes you think the FAANG companies don't use windows? Spent four years at Amazon recently and unless you were a dev, you were more likely to have a windows PC than Mac. Saw zero Linux laptops.
Leave FAANG and most internal developers at large corporations are running Windows. It wasn't until I started at a smaller shop that I found people regularly using Linux to do their jobs, usually in a dual-boot or with a virtual Windows install "just in case" but most never touched it.
I'm presently working supporting a .NET web app (some of which is "old" .NET Framework), but my work machine runs OpenSUSE Tumbleweed. I can't see that flying at the larger shops I have previously worked at. I'll admit that might be different today; I haven't worked at a large shop in more than a decade.
Most corporations have no interest in paying the cost of running a multi-OS IT shop, nor in dealing with the fleet-management challenges that make running Linux and Mac fleets more expensive.
That's before you factor in that almost everyone in IT was born and bred on Windows, and in almost every case people tend to choose what they know best.
Depends on which FAANG, I guess. I'm now approaching 10 years at Google, and I've seen Windows laptops used only by a very few salespeople. Everyone else is using either Macs or Chromebooks.
Fellow Googler here. I'm the exception that proves the rule. After 7 years of Macbook and Linux devices, I needed Windows for a special project, so I got a "gWindows" device and found it very well supported.
Aside from the specific Windows-only software I needed, I would still just ssh into a Linux workstation, but gWindows can do basically everything my Mac can. I was pleasantly surprised.
> The Windows ecosystem typically deployed in corporate PCs or workstations is often insecure, slow, and poorly implemented
Yes, but that's not because of Windows itself (which is fast and secure out of the box) but because of a decades-old "security product" culture that insists on adding negative-value garbage like Crowdstrike and various anti-virus systems on the critical path, killing performance and harming real security.
It's a hard problem. No matter how good Windows itself gets and no matter how bad these "security products" become, Windows administrators are stuck in the same system of crappy incentives.
Decades of myth and superstition demand they perform rituals and recite incantations they know harm system security, but they do them anyway, because of fear and tradition.
It's no wonder that they see Linux and macOS as a way out. It's not that they're any better -- but they're different, and the difference gives IT people air cover for escaping from this suffocating "you must add security products" culture.
Disagree. At least in the context of business networks.
My favorite example is the SMB service, which is enabled by default.
In the Linux world, people preach:
- disabling SSH unless necessary
- using at least public-key-based auth
- better yet, both a public key and a password
- not allowing root login
In Windows, the SMB service:
- is enabled by default
- allows command execution as local admin via PsExec, so it's essentially like SSH except done poorly
- is only password-based
- doesn't even support MFA
- is not even encrypted by default
It's a huge part of why everyone gets hit by ransomware.
I always recommend disabling it using the Windows firewall unless it is actually used, and if it is necessary, defining a whitelist of address ranges. But apparently it is too hard to figure out who needs access to what, and much easier to deploy products like Crowdstrike, which admittedly strongly mitigate the issue.
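The "who needs access to what" question gets less scary once you can see who even exposes the service. A minimal sketch with plain Python sockets (the subnet and timeout are made-up example values, and you should only scan networks you are authorized to scan):

    # Minimal sketch: find hosts in a /24 that expose TCP 445 (SMB) at all,
    # as a starting point for building a firewall whitelist.
    import socket
    from ipaddress import ip_network

    def smb_open(host: str, timeout: float = 0.5) -> bool:
        try:
            with socket.create_connection((host, 445), timeout=timeout):
                return True
        except OSError:
            return False

    for addr in ip_network("192.168.1.0/24").hosts():
        if smb_open(str(addr)):
            print(f"{addr} exposes SMB")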
The next thing is that Windows still allows the NTLM authentication protocol by default (now finally about to be deprecated), which is a laughably bad authentication protocol. If you manage to steal the hash of the local admin on one machine, you can simply use it to authenticate to the next machine. Before LAPS gained traction, the local admin account password was the same on all machines in basically every organization. NT hashes are neither salted nor do they have a cost factor.
I could go on, but Microsoft made some very questionable security decisions that still haunt them to this day because of their strong commitment to backwards compatibility.
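To illustrate the "neither salted nor any cost factor" point: an unsalted fast hash maps equal passwords to equal hashes, so one stolen hash is immediately reusable across accounts and cheap to attack offline. A small sketch, using SHA-256 purely as a stand-in for the MD4-based NT hash (the property being demonstrated is the lack of salt and cost factor, not the exact algorithm):

    # Sketch: unsalted fast hash vs. salted, slow KDF.
    import hashlib, os

    pw = "Winter2024!".encode("utf-16-le")  # NT hashing takes UTF-16LE input

    # Unsalted and fast: the same password on two machines yields the same
    # hash, and offline guessing is limited only by raw hashing speed.
    h1 = hashlib.sha256(pw).hexdigest()
    h2 = hashlib.sha256(pw).hexdigest()
    print(h1 == h2)  # True

    # Salted with a cost factor: the same password yields a different hash
    # per account, and every guess costs ~100k iterations.
    s1, s2 = os.urandom(16), os.urandom(16)
    k1 = hashlib.pbkdf2_hmac("sha256", pw, s1, 100_000)
    k2 = hashlib.pbkdf2_hmac("sha256", pw, s2, 100_000)
    print(k1 == k2)  # False, even though the password is identical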
You don't need Crowdstrike to disable any of these things. You can use regular group policy. I'm not saying Windows can't be hardened. I'm saying these third party kernel hooks add negative value.
Fun fact, these negative value garbage offerings are often “required” by box checking certifications like SOC2.
Sure, if you have massive staffing to handle compliance you might be able to argue you’ve achieved the objective without this trash. The rest of us just shrug and do it.
Some of the “compliance managers as a service” push you in this direction as well.
Why do companies need these "box checking certifications"? I imagine the answer, as usual, is that either they or one of their customers is working with the government which requires this for its contractors. That's usually the answer whenever you find an idiotic practice that companies are mindlessly adopting.
Pretty much. We’re in the healthcare space and most of our customers are large hospital systems. Anything except “SOC2 compliant, no exceptions on report” will take an already long deal cycle (4-18 months) and double or triple it.
If you’re a startup it also means that your core people are now sitting in multiple cycles of IT review with their IT staff filling out spreadsheet after spreadsheet of “Do you encrypt data in transit?”
> > The Windows ecosystem typically deployed in corporate PCs or workstations is often insecure, slow, and poorly implemented
> Yes, but that's not because of Windows itself
Come on. There’s a reason Windows users all want to install crappy security products: they’ve been routinely having their files encrypted and held for ransom for the last decade.
And Linux/BSD generally would not help here. Ransomware is just ordinary file IO and is usually run "legitimately" by phished users rather than via actual code-execution exploits.
I have a similar disdain for security bloatware with questionable value, but one actually effective corporate IT strategy is using one of those tools to operate a whitelist of safe software, with centralized updates
I think having a Linux/BSD might be helpful here in the general case, because the culture is different.
In Windows land it's pretty much expected that you go to random websites, download random executables, ignore the "make changes to your computer?" warnings and pretty much give the exe full permission to do anything. It's very much been the standard software install workflow for decades now on Windows.
In the Linux/BSD world, while you can do the above, people generally don't. Generally, they stick to trusted software sources with centralized updates, like your second point. In this case I don't think it's a matter of capability; both Windows and Unix-land are capable of what you're suggesting.
I think phishing is generally much less effective in the Mac/Linux/BSD world because of this.
Until a lucrative contract requires you to install prescribed boutique Windows-only software from a random company you've never heard of, and then it's back to that bad old workflow.
Yeah, because no one on Linux or Mac would clone a git repo they just found out about and blindly run the setup scripts listed in the readme.
And no one would pipe a script downloaded with wget/curl directly into bash.
And nobody would copy a script from a code-formatted block on a page, paste it directly into their terminal and then run it.
I'm not going to go so far as to claim that these behaviors are as common as installing software on Windows, but they are still definitely common, and all could lead to the same kinds of bad things happening.
I would agree this stuff DOES happen, but typically in development environments. And I also think it's crappy practice. Nobody should ever pipe a curl into sh. I see it in docs sometimes and yes, it does bother me.
I think though that the culture of robust repositories and package managers is MUCH more prominent on Mac/iOS/Linux/FreeBSD. It's coming to Windows too with the new(er) Windows store stuff, so hopefully people don't become too resistant to that.
A developer is much more likely to be able to fix their computer and/or restore from a backup than a typical user is. A significant problem is cascading failures, where one bozo installing malware either creates a business problem (e.g. allowing someone to steal a bunch of money) or is able to disable a bunch of other computers on the same network. It is not that common for macOS to be implicated in these sorts of issues. I know people have been saying for a long time that it’s theoretically possible but it really doesn’t seem that common in practice.
I'd wager if Linux had the same userbase as Windows, you'd see more ransomware attacks on that platform as well. Nothing about Linux is inherently more secure.
> Yeah I don't get where this "Linux is more secure" thing comes from.
It comes from the 1990s and early 2000s. Back then, Windows was a laughingstock from a security point of view (for instance, at one point connecting a newly installed Windows computer to the network was enough for it to be automatically invaded). Both Windows and Linux have become more secure since then.
> Basically any userspace program can read your .aws, .ssh, .kube, etc... The user based security model desktops have is the real issue. Compare that with Android and iOS for instance. No one needs anti-virus bloatware, just because apps are curated and isolated by default.
Things are getting better now, with things like flatpak getting more popular. For instance, the closed-source games running within the Steam flatpak won't have any access to my ~/.aws or ~/.ssh or ~/.kube or etc.
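The underlying point is easy to demonstrate: under the traditional desktop model, any program running as your user can read those credential directories. A trivial sketch (the paths are just the usual defaults; nothing here is specific to any particular app):

    # Sketch: any process running as the current user can read these.
    # Under an app-sandboxed model (flatpak portals, iOS/Android-style
    # isolation), the same access would fail unless explicitly granted.
    import os
    from pathlib import Path

    for secret_dir in ("~/.ssh", "~/.aws", "~/.kube"):
        d = Path(secret_dir).expanduser()
        if d.is_dir():
            for f in d.iterdir():
                print(f, "readable:", os.access(f, os.R_OK))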
What fraction of ransomware attacks would these security products have prevented exactly? Windows already comes with plenty of monitoring and alerting functionality.
Probably close to none, at this point. They may block some things.
But the main reason Windows keeps falling to this is that it's what people use. The only platform that is actually somewhat protected against attacks is the iPhone; the Mac can easily be ransomwared, it's just that the market is so small nobody bothers attacking it; no ROI.
Yeah. The mobile ecosystems are what real security design looks like. Everything is sandboxed, brokered, MACed, and fuzzed. We should either make the desktop systems work the same way or generalize the mobile systems into desktops.
The mobile ecosystem is what corporate IT should be: centralized app store, siloed applications, immutable filesystem (other than the document area for each application), then VMs and special-purpose computers for activities like development. However locked down iOS may be, most upgrades happen without a hitch, and there is no need for security software.
Hard to say, but Windows Defender doesn't stop as many as EDRs can. There are actual tests for this, run by independent parties, that check exactly this. Defender can be disabled extremely easily; modern EDRs cannot.
Yes, average Windows users are significantly less tech-literate for obvious reasons, and there are way more of them. This creates a very lucrative market.
How is desktop Linux somehow inherently particularly more secure than Windows?
Okay, so I did, and he definitely claims Windows is secure out of the box. So again, I ask if he really said that, haha, with a straight face.
SMB 1.0 is enabled, non-admin users have PowerShell access, Defender can be disabled with a single command, the user is an admin by default, and passwords can be reset by booting from external media and swapping a CLI binary onto C:.
There are so many basic insecurities out of the box in Windows.
Apple on the desktop/laptop, Google in the cloud for email, collaboration, file sharing, and the office suite. I ran a substantial-sized company this way for a decade. Then we did a merger and had to migrate to Microsoft: a massive step backwards, and a quintupling of IT problems and staff.
> Companies that are not software-focused, as it's not their primary business. These organizations are left with Microsoft's offerings
I wonder why that is the case. These companies still have IT departments; someone has to manage these huge fleets of Windows machines. So nothing would prevent them from hiring Linux admins instead of Windows admins. What makes the management of these companies consider Windows to be the default choice?
1. Users are more comfortable running Windows and Office because it's Windows they likely used in school and on personal laptops.
2. This is the biggie: Microsoft's enterprise services for managing fleets of workstations are actually really good -- or at least a massive step up from the competition. Linux (and its ilk) is much better for managing fleets of servers, but workstations require a whole different type of tooling. And once you have AD and its ilk running, and thus Windows administrators hired, it's often easier to run other services from Windows too, rather than having to spin up another cluster of management services.
Software focused businesses generally start out with engineers running macOS or Linux, so they wouldn't have Windows management services pre-provisioned. And that's why you generally see them utilising stuff like Okta or Google Workspace
Unfortunately, Google did not succeed in getting further into schools around the globe with Chromebooks, which is a pity in my opinion. That helps the Windows/Office monopoly persist in organizations and businesses that hire people who have never used any software other than Microsoft's.
One reason is that Microsoft lobbies hard against low-end PCs and notebooks that are not aligned with its interests. [1]
Microsoft has a large, entrenched distribution network and market all over the world. That makes it an uphill battle to create low-end programs for schools, universities, governments, and SMBs.
Hence the phrase "no one ever got fired for buying Microsoft". It's too hard a battle to go against the flow.
Inertia, plus integration - AFAIK Exchange and SharePoint don't run on Linux, so if the company buys into that, then it's Windows all the way down.
Still, all this is a red herring. Using Linux instead of Windows on workstations won't change anything, because it's not the OS that's the problem. A typical IT department is locked in a war on three fronts - defending against security threats, pushing back on unreasonable demands from the top, and fighting the company employees who want to do their jobs. Linux may or may not help against external attackers, but the fight against employees (which IT does both to fulfill mandates from the top and to minimize their own workload) requires tools for totalitarian control over computing devices.
Windows actually is better suited for that, because it's designed to constrain and control users. Linux is designed for the smart user to be able to do whatever they want, which includes working around stupid IT policies and corporate malware. So it shouldn't be surprising corporate IT favors Windows workstations too - it puts IT at an advantage over the users, and minimizes IT workload.
>Windows actually is better suited for that, because it's designed to constrain and control users. Linux is designed for the smart user to be able to do whatever they want, which includes working around stupid IT policies and corporate malware.
This just tells me you don't know Linux. Linux can be much more easily hardened and restricted than Windows. It's trivial to make it so that a user can only install whitelisted software from private repos.
Excel. There is no other software that can currently fill Excel's role in business. It's the best at what it does, and what it does is usually very important. Unfortunately.
The situation might have changed since I last used Excel on Mac, but in 2018, the "Excel" on Mac barely resembled the Excel on Windows. Many obvious and useful features were missing.
My guess is that the fact you can buy about two to three cheap Dell desktop machines for the price of one Mac probably factors quite heavily into the equation.
If you’re only doing vacation travel planning, sure. But there’s a long tail of advanced functionality used across all kinds of industries (with plugins upon plugins) that are most certainly not even close to being supported by any of the options proposed.
I don't know, but I would guess that Microsoft Office is what retains people; personal anecdotal experience suggests that anything else (Apple's offerings, Google Docs, LibreOffice &c.) is not acceptable to the average user.
My suspicion is that Microsoft would be very unhappy to have MS Office running successfully on Linux systems.
A lot actually don’t, in any meaningful sense. My partner’s company has a skeleton IT staff with all support requests being sent offshore. An issue with your laptop? A new one gets dispatched from ??? and mailed to you, you mail the old one back, presumably to get wiped and redispatched to the new person that has a problem.
Tooling, infra, knowledge? The only reason people are talking about "issues in Windows" is that people are widely using it.
If Linux had anywhere close to the amount of software that Windows has, it would have experienced the same issues too. After all, it is not just about running a server and tinkering with config files. It is about the ability to manage devices, roll out updates, and so on.
You have to also factor in competition. I think it's a big factor in why corporate IT is generally bad: Microsoft and their partners have no reason to improve on the status quo. If we had viable alternatives, in a market where no entity has more than 20% market share or something like that, the standards would be much higher.
The whole idea of running a backdoor with OS privileges in order to increase system security screams Windows. In Linux, even if Crowdstrike (or similar endpoint management software) is allowed to update itself, it doesn't have to run as a kernel driver. So a buggy update to Crowdstrike would only kill Crowdstrike and nothing else.
And Linux is not even a particularly hardened OS. If we could take some resources from VC smoke and mirrors and dedicate them to securing our critical infrastructure we could have airports and hospitals running on some safety-critical microkernel OS with tailored software.
the comment I am replying to explicitly mentions Linux as an alternative to Windows. In any case, yes, one could use Mac, as I do, but it comes with its own issues, starting from price. I'd happily switch 100% to Linux if I didn't need to work on documents edited with Office. The online version may actually solve this, but it's still buggy as hell.
Word, Excel, PowerPoint, and all the other Windows software. Plus all the people who know how to use the Windows software vs. the Linux equivalents (if they exist).
Purchasing decisions are made by purchasing managers. Purchasing managers spend their time torturing numbers in spreadsheets, writing reports, and getting free lunches from channel sales reps. Microsoft is just a sales organization with some technical prowess, and their channel reps are very effective.
Technical arguments, logic, and sense do not contribute much to purchasing decisions in the corporate world.
I'd say something implementing the ideas of NixOS, i.e. immutable versioned systems and declarative system definitions, is poised to replace the current deployment mess, which is extremely fragile.
With NixOS, you can upgrade without fear, as you can always roll back to a previous version of your system. Regular Linux distributions, macOS, and Windows make me very nervous because that is not the case.
> I'd say something implementing the ideas of NixOS, i.e. immutable versioned systems
NixOS isn't immutable; things aren't mounted read-only. AFAIK, it can't be set up that way.
> With NixOS, you can upgrade without fear, as you can always roll back to a previous version of your system. Regular Linux distributions, macOS, and Windows make me very nervous because that is not the case.
The store is immutable in the functional programming sense, as the package manager creates a new directory entry for each hash value.
Backups could be an option, but it is much better to have a system where two computers are guaranteed to be running the exact same software if configuration hashes are the same.
In other OSes, the state of your system could depend on previous actions.
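As a toy illustration of that property (not Nix itself, just a Python sketch of the idea): if the store key is derived from a hash of the full configuration, then identical configs yield identical keys on any machine, and "rollback" is just pointing back at an older, untouched generation.

    # Toy sketch of a content-addressed "generation" store, in the spirit of
    # the model described above. Not real Nix; names and configs are made up.
    import hashlib, json

    store = {}          # hash -> full system configuration
    generations = []    # ordered history of activated generations

    def build(config: dict) -> str:
        """Derive a store key from the configuration's content."""
        digest = hashlib.sha256(
            json.dumps(config, sort_keys=True).encode()
        ).hexdigest()[:12]
        store.setdefault(digest, config)
        return digest

    def activate(config: dict) -> str:
        key = build(config)
        generations.append(key)
        return key

    def rollback() -> str:
        """Point back at the previous generation; nothing is mutated in place."""
        if len(generations) > 1:
            generations.pop()
        return generations[-1]

    g1 = activate({"kernel": "6.6", "pkgs": ["vim", "git"]})
    g2 = activate({"kernel": "6.7", "pkgs": ["vim", "git", "broken-driver"]})
    assert rollback() == g1   # the old generation is still there, bit for bit
    # Two machines with the same config end up with the same key:
    assert build({"kernel": "6.6", "pkgs": ["vim", "git"]}) == g1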
> Regular Linux distributions, macOS, and Windows make me very nervous because that is not the case.
I'm personally only really nervous when updating Linux distributions. On macOS/Windows, besides security updates, an update usually hardly matters or is even noticeable (well, besides the random UX changes...).
Ideally there would be a usable security-first OS based on something like seL4, with a declarative package system, for slow-to-change, mission-critical appliances.
In NixOS, you have a bootloader to load your OS. Unless you botch your bootloader, you can't paint yourself into an unbootable state. If one system configuration doesn't work, you reboot and, in a menu displayed by the bootloader before the OS begins to load, choose the prior one.
This is also true of most regular Linux setups. Except that in those, you can only choose the kernel. Hence, if you have broken other parts of your configuration, your system might not be bootable. So the safety net is much thinner.
Because you just want stuff to work and couldn't care less about the ideology part?
Also, there's no feature parity (it's not about Windows being "better" than Linux or the other way around; none of that matters): there are no out-of-the-box solutions to replace some of the stuff enterprise IT relies on in Windows and friends, which would mean they'd have to hire expensive vendors to recreate/migrate their workflows. The costs of figuring out how to run all of your legacy Windows software, retraining staff, etc. would be very significant. Why spend so much money with no clear benefits?
To be fair, I'm not sure how Apple figures into this. They don't really cater to the enterprise market at all.
Why? Both things seem pretty tangential. Poorly written software exists or can exist on any platform, just like the IT infrastructure wouldn't somehow automagically become robust if they just switched to Linux.
When I took a Linux course in college I had an old laptop that I installed Linux on. However, for some reason my wireless card wouldn't work. I mentioned it to my professor and the next day he told me "It's actually quite simple, you just have to open up the source code for the wireless driver and make a one line change."
Maybe things have gotten better, but I think that's why people use Mac. It's POSIX but without having to jump through arcane hoops.
The problem with the Linux desktop was usually that most hardware companies either spent no time/effort on non-Windows drivers and compatibility, or, when they did, it was a tiny fraction of the effort that went into working around bugs in the Windows driver APIs.
Today, with the failure of Windows in both the mobile and industrial-control space, we see vendors actually giving a damn about the quality of their Linux drivers.
Today the main factor keeping the enterprise market locked on Windows is the fat clients written around the turn of the millennium, and that's as much a problem for Mac adoption as it is for Linux adoption.
Macs are slick, well-designed devices that speak to a huge segment of the consumer market, so they will eventually find their way into the high-cost niches where no specific dependency on legacy software exists. But they are too expensive and inflexible to replace all of the Wintel systems, so for Microsoft and its partners to have their license to screw over the enterprise sector revoked, Linux (or FreeBSD) will have to play a role too.
Things have definitely gotten better. I remember the painful years. My most recent Ubuntu install on a new laptop was about 3 years ago. As someone who has used Linux as the daily driver for more than a decade (and dual-booted as a second OS for another decade), I was pleasantly surprised that everything just worked! I think that was a first.
It was an HP from Costco, not something special sold with Linux. My wireless worked, dual monitors just worked, even the fingerprint reader that I never use. I remember sitting there thinking "I didn't have to fight anything?" Hopefully that becomes the norm, maybe it is - I haven't needed a new laptop yet.
Because for some people (certainly not all), their objection is not to a "corporate" OS, but to the specific things Microsoft does that Apple does not.
Honestly, Windows out of the box is pretty secure. I don't want to defend Microsoft here, but adding third-party security to Windows has been nothing but regulatory compliance at best and cargo-culting at worst for over a decade now. If you actually look at core Windows exploits relative to market share, they're comparable to Apple's. Enterprises insist on adding extra attack surface in the name of security.
I agree that people who actually know what they're doing are generally running Linux backends, but Microsoft have enterprise sewn up, and this attack is not their fault.
A lot of Active Directory defaults are wildly insecure, even on a newly built domain, and there are a lot of Active Directory admins out there who don't know how to properly delegate AD permissions.
This is true. You are basically one escalation attack on the CFO away from someone wiring money to hackers and a new remotely embedded admin freely roaming your network.
Windows is leagues ahead of MacOS in terms of granularity of permissions and remote management tools. It's not even close. That's mainly why enterprise IT prefers it to alternatives.
downvoted, because in your response you conflate two issues:
1. The problem with using Microsoft
2. The lack of institutional knowledge of securing BSD and MacOS and running either of those at the scale Microsoft systems are being run at.
The vast majority of corporate computer endpoints are running Windows. The vast majority of corporate line-of-business systems are running Windows Server (or alternatively Microsoft 365).
That means a whole lot of people have knowledge of how to administer Windows machines and servers. That means the cost of the knowledge needed to administer those systems is going down as more people know how to do it.
Contrast that with MacOS Server administration, endpoint administration, or BSD administration. Far fewer people know how to do that. Far fewer examples of documentation and fixes for the issues administrators hit are on the internet, waiting to help the hapless system administrator who has a problem.
It's not just about better vs. worse from your perspective; it's about the cost of change and the cost of acquiring the knowledge necessary to run these corporate systems at scale -- not to mention the cost of converting any applications running on these Windows machines to run on BSD or MacOS -- both from an endpoint perspective and a corporate IT system perspective.
It's really not even feasible to suggest alternatives to any of the corporations using Microsoft that are impacted by this outage.
If you want to create an alternative to Microsoft's Corporate IT Administration you're gonna need to do a lot more than point to MacOS or BSD being "better".
The voice quality and pronunciation are excellent. However, the system struggles with acting, so the tone and emotional expression are often wrong during dialogues. Additionally, I have to fragment the text into short paragraphs, making it challenging to set appropriate break durations, resulting in an unnatural rhythm.
Despite the technical quality and my appreciation for the reading voice, I won't continue in this direction.
ElevenLabs is quite expensive, but it would be worth it if the final result were good enough for listeners to purchase the audiobook.
I don't know if using OpenAI's API in English would yield better results. However, OpenAI's performance in non-English languages is not satisfactory.
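For what it's worth, the fragmenting step itself can be automated, even if the rhythm problem remains. A minimal sketch of paragraph/sentence chunking for a TTS pipeline; the character limit, pause lengths, and input file name are made-up placeholder values, and it assumes your TTS tool lets you insert per-chunk silence rather than SSML:

    # Minimal sketch: split long text into paragraph-sized chunks for a TTS
    # service, recording how much silence to insert after each chunk.
    def chunk_for_tts(text: str, max_chars: int = 800):
        chunks = []
        for para in text.split("\n\n"):
            para = para.strip()
            if not para:
                continue
            while len(para) > max_chars:
                # split on the last sentence boundary before the limit
                cut = para.rfind(". ", 0, max_chars)
                cut = cut + 1 if cut != -1 else max_chars
                chunks.append((para[:cut].strip(), 0.4))  # short mid-paragraph pause
                para = para[cut:].strip()
            chunks.append((para, 1.0))                    # longer pause after a paragraph
        return chunks

    for text_chunk, pause_s in chunk_for_tts(open("chapter1.txt").read()):
        print(f"[{pause_s}s pause] {text_chunk[:60]}...")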
Some time ago I set up a server for a website and I was appalled, like many others, by the number of SSH connection attempts. I decided to open SSH only on a randomly chosen port number above 1024, and now I have essentially zero probing attempts. It is trivial, but for me it is a satisfying configuration.
This was true in 2018. In recent years I get 100s, sometimes 1000s of login attempts a day on high addresses.
My servers are on AWS addresses. If someone searches for servers (as opposed to routers, phones, etc.), AWS might be a preferred address range. I have no experience of whether scan rates depend on the address range used.
There are open port scanners that just check what ports are open on which IPs, and there are separate ssh login brute-forcers. Once your machine gets picked up by the former, the latter will pile up.
I have two servers on adjacent IPs, both with ssh listening on a high port. One gets hammered with login attempts and the other does not.
Often with default parameters, such as zmap setting the IP ID to 54321, the TCP initial window at 65535, and no SACK bit set; or masscan with no SACK bit either, the TCP initial window at 1024, and a TCP maximum segment size of 1460 (which is strange to put below the initial window size!) -- older versions had a fixed source port of 61000 or 60000 from documentation examples and no MSS set. All of these are extremely uncommon in legitimate traffic and thus easily identified.
Even those so called "legitimate" scanners (emphasis on the "") seem to use these tools with little or no extra configuration.
With this setup the last time my high-port ssh (key-only) has got an attempt on it was 2023-07-26 (previous intruders get permanently firewalled).
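Those defaults are distinctive enough that you can flag scanner SYNs yourself. A minimal sketch with scapy; the exact fingerprint combinations are just examples drawn from the values mentioned above, so treat it as illustrative rather than a complete detector:

    # Minimal sketch: flag inbound SYNs whose fields match the default
    # fingerprints of common mass scanners (zmap-style IP ID 54321,
    # masscan-style window 1024 with MSS 1460, missing SACK-permitted option).
    from scapy.all import sniff, IP, TCP  # needs root / CAP_NET_RAW

    def looks_like_mass_scanner(pkt) -> bool:
        if not (pkt.haslayer(IP) and pkt.haslayer(TCP)):
            return False
        tcp = pkt[TCP]
        if tcp.flags != "S":  # only bare SYNs
            return False
        opts = {name: val for name, val in tcp.options if isinstance(name, str)}
        no_sack = "SAckOK" not in opts
        return (
            (pkt[IP].id == 54321 and tcp.window == 65535 and no_sack)        # zmap-ish
            or (tcp.window == 1024 and opts.get("MSS") == 1460 and no_sack)  # masscan-ish
        )

    def report(pkt):
        if looks_like_mass_scanner(pkt):
            print(f"probable scanner SYN from {pkt[IP].src} to port {pkt[TCP].dport}")

    sniff(filter="tcp", prn=report, store=False)

From there, feeding the flagged sources into a firewall blocklist (the "previous intruders get permanently firewalled" part) is a separate step and very much depends on your setup.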
This might not matter for your setup, but I would have thought it's bad in general to have sshd listening on a high port, because then any non-root user who finds a way to crash it can replace it with his own malicious ssh server on the same port.
That's a good point, though you could use some firewall rules to rewrite the port number so that the local daemon is listening on the normal port but accessible via an alternate high numbered port.
Maybe that's the case. The machines where I am seeing a lot of ssh login attempts on high ports have been on the same IPv4 address for years. Some since 2018.
Interesting to know. For the moment (several months in), I still have no login attempts, so I suppose that means my server hasn't been picked up by any port scanner yet.