My first day at my first full-time programming gig, I was asked to look at a reporting job that had been failing for months. I logged in, found the error logs, saw that the script just needed a higher memory limit (a one-line tweak in php.ini), and let the team lead know it should run fine that night. He was shocked: "Dude, if you just fixed that report you probably just got a promotion, no one has been able to figure that out for months." He was joking about the promotion, but my boss was just as shocked. I'd realize later that most of the other people on the dev team didn't like Linux and wanted to rewrite everything in .NET and move it all to Windows, so no one even tried to fix anything on the Linux machines.
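For context, the whole fix was on the order of a single php.ini directive; the value below is illustrative, not the exact number I used back then:

    ; php.ini - give the long-running reporting script more headroom
    memory_limit = 512M

(If you'd rather not raise the global limit, the same thing can be done per-script with ini_set('memory_limit', '512M') at the top of the job.)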
I know things have gotten somewhat better, but the amount of time and latency wasted using RDP and the Windows UI for development, testing and production maintenance is insane. Throw in security requirements like RDPing into host 1 just to RDP into host 2, and companies are just wasting money on latency. There is often no appreciation of the administrative cost of delivery. Not necessarily sysadmin costs, but the developer and QA time that goes into delivering and into ongoing maintenance.
>Throw in security requirements like RDPing into host 1 just to RDP into host 2, and companies are just wasting money on latency.
So much this at my job: log in to my laptop, log in to the VPN with a password and 2-factor, check out my admin credentials using my login and 2-factor, log in to the jump box with my admin creds and a different 2-factor, and finally log in to the system I actually need to be on with my admin creds. Multiply that last step by however many systems I need to connect to. And the clipboard and screen resolution are going to get messed up somewhere along the way no matter how much you fiddle with the settings.
Oh yeah, been there. I personally find Windows to be a terrible server OS. I'd rather be stuck with a fleet of AIX machines or something equally uncommon (but still *nix-ish) than with any Windows system.
The reverse also applies: people who like Linux totally ignore Windows maintenance. They'll look down their noses at it and call it insecure, while they spend their time patching Linux and leave Windows to fend for itself against attackers.
Someone must have made the "smart" decision to use servers with a proprietary OS, so surely that someone has also paid the extra cost for MS support. No need to worry, then.
I am not talking about Windows updates here. But even if I were, there are things wrong with them. One example that frequently bites me on my gaming OS:
Windows 10 decides, all by itself: "Hey, I'm going to replace your installed graphics card driver with some other version that I'll find online myself!" The next time I boot that OS, the graphics driver crashes and I have to go to the AMD website for the correct drivers and install them over whatever crap Windows put there. This has happened at least three times already and it is pissing me off. It would probably be better to deactivate automatic Windows updates completely and then pick the installed components manually. But even then I couldn't be sure it won't try to replace my installed drivers with crap.
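What has reportedly worked for some people (I haven't verified it survives every feature update) is telling Windows Update to leave drivers alone via the ExcludeWUDriversInQualityUpdate policy, e.g. from an admin command prompt:

    reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate" ^
      /v ExcludeWUDriversInQualityUpdate /t REG_DWORD /d 1 /f

On Pro/Enterprise this corresponds to the "Do not include drivers with Windows Updates" group policy; whether Home editions actually honor it, I'm not sure.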
I think AMD has that too in its app, but if your graphics card driver is crashing in Windows, your screen freezes. Windows might try to restart the driver, getting the screen working briefly, but then it crashes again and ultimately fails completely; at that point all you can do is press the power button on your machine.