
Odd, this seems to be happening across the industry and is now either reported better or just more prevalent. Besides bricks (well, is it a brick if you can recover?) we have seen post-update BSODs, data loss, browsers going bad, and settings getting reset on the operating systems of all the big vendors out there. I know software never 'used to be perfect', but this type of defect feels bigger than it used to.


I guess two reasons:

1. The OS update policies became more "agile". Both Windows and macOS now update more often than ever: macOS moved from a roughly biennial release to an annual one, and Windows 10 switched to a subscription-like model with feature updates twice a year.

2. They laid off the testing team and stopped testing on actual PCs in favor of VMs. [1] That fits, considering the broken updates are often lower-level bugs that can't be caught in a VM.

[1]: https://www.techradar.com/news/windows-10-problems-are-ruini...


> They laid off the testing team

Must be good for the bottom line.

The cost is absorbed by the unlucky users.


It's another case of focusing on the short term to save money. In the long term it damages their reputation and some frustrated businesses/users might start to consider moving away from their products if a viable alternative exists.


Unfortunately there are only two players in this game (I doubt mass adoption of Linux any time soon). So if they both suck, neither loses the reputation game.


If everyone does the same, then reputation doesn't matter as much.


I think this is the saddest part:

> was meant to resolve issues created by the 10.15.4 update in late March

The "fix some bugs, add some new ones" trend has certainly gotten more common within the past few years. I still remember when the official advice was not to install updates unless you were experiencing the issues it was to fix, but that wasn't really applicable to security fixes, and now official advice seems to be "always install" regardless of security or indeed usefulness.


I think it's always been there. I remember reading a paper claiming that for every two bugs fixed, a new one is created.


I don't think it's more prevalent. I think it's more notable precisely because we've gotten so used to our devices and data being stable. Don't forget that in the past it was pretty much expected you'd lose or break something when upgrading your OS, and that you could pretty much count on a physical hard drive failure costing you data at some point in the life of your PC. BSODs especially were almost certainly more prevalent in the bad old days of Windows. Looking just at the Mac, don't forget about when PRAM resets were part of the normal debugging flow.


> this seems to be happening across the industry and is now either reported better or just more prevalent.

Isn't that because a lot of companies are eliminating QA people and expecting the devs to do the testing?


I've experienced it myself with one of the Android apps I use (a VoIP client). The latest update killed my ability to call out through my office PBX, and I had to fish the old version out of one of my backups, since the Play Store oh-so-helpfully doesn't let you install older versions of an app.
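
A rough sketch of that kind of restore, wrapping adb from Python (the package name and APK path are placeholders, and it assumes adb is installed with USB debugging enabled):

    import subprocess

    # Placeholder package name and backup path, purely for illustration.
    PACKAGE = "com.example.voipclient"
    OLD_APK = "/backups/voipclient-1.2.3.apk"

    # A plain `adb install` refuses to downgrade a release app, so the
    # reliable route is to uninstall the broken newer version first
    # (which does wipe the app's local data).
    subprocess.run(["adb", "uninstall", PACKAGE], check=False)

    # Sideload the older APK from the backup.
    subprocess.run(["adb", "install", OLD_APK], check=True)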


Move fast, break things!

On a serious note, possibly because things are far more capable and complex than they used to be, combined with tight release cycles.


Also, more software systems enable "automatic updates". Whereas in the past only those who chose to upgrade were affected, now everyone is affected except those who chose not to enable automatic updates.



