Does anyone else feel like, if Windows were still a classic OS instead of "as a service", these kinds of changes would have a better chance of getting scrutiny during development and hopefully being caught?
Since they supported multiple versions at once, important bugfixes were backported, but risky new features were not. For a year or so after each release, the previous release was more stable. This let the customer choose between security/stability and new features.
I think it’s a great model, but it breaks down when the new version is a feature/usability regression vs the old version.
Once that started happening more often than not, the "as a service" models came out, where the vendors started force-upgrading users instead of trying to compete with version N-1.
This is worse for the users, but better for deadline-focused middle management, so it’s become incredibly popular.
This. Twenty years ago, if you reported a bug to MS, and they agreed it was a bug, and they agreed to fix it, then you might actually have working code in three years' time, when the next version of Windows was released. Five years after that, all of your customers might have upgraded to this new version of Windows. And then you could rely on the fix.
Important customers could get a hotfix. For anyone else, reporting bugs was pointless.
In its consumer operating systems, MS used to not consider it a bug for a process to exploit other processes on the same machine. After all, why would you be running untrusted code?
I'd also add that Microsoft used to be notoriously bad at security, such that even Mac owners would make fun of them. We're still dealing with the design decisions made during that era. It's only since XP SP1 that security was taken more seriously, and it wasn't until Vista that they truly started to grapple with the whole mess from the ground up.