The title incorrectly implies this is the only bad memory leak in Monterey (even though the author acknowledges it's not the only one, in an edit complaining about 'comment bombing').
While the pointer one may be the worst culprit, it's definitely not the only one - every few days I see Control Center using >3GB (once force-killed, it goes back down to 26MB). It's not a massive issue on my machine (I just check memory usage every day) but it gives a very bad impression that Apple haven't fixed it yet.
I'm sure it's especially challenging for non-technical users, who're more likely to be on a lower-memory system and run out of memory regularly.
> it gives a very bad impression that Apple haven't fixed it yet.
This suggests that it's a common or even universal problem, but are you sure that's the case? I've never seen Control Center use an inordinate amount of memory -- it may very well be a rare or unique issue.
edit: I see someone else ran into it — fair enough! Still, I think it's funny how often I see "I keep running into this bug, why haven't they fixed it!" on HN, considering how software developers should know, better than anyone, how often bugs are hard-to-reproduce edge cases.
>I think it's funny how often I see "I keep running into this bug, why haven't they fixed it!" on HN, considering how software developers should know, better than anyone, how often bugs are hard-to-reproduce edge cases.
We (software engineers) usually aren't any better about this than other customers, despite having the same issues in our own software.
Everyone ships bugs that seem obvious in retrospect. Everyone has some small % of customers who hit a bug yet scream loudly about how terrible quality is and how stupid we are for shipping something so obvious. Everyone has had to make last-minute calls "fix or ship?", knowing that every accepted fix has a chance of introducing a new potentially worse bug. Anyone who deals with hardware understands what delaying software can mean.
If you haven't experienced these things then you just haven't operated at large scale yet.
IMHO it's perfectly fine. When you're the customer, complain about whatever bothers you. That's valuable feedback.
I've seen enough people talking about it that I don't believe it is unique.
I check my activity a couple times a day, and there doesn't appear to be any correlation between use and Control Center randomly taking >1GB of memory. I could, for instance, check when all apps are closed and its memory usage is high, but then after a long day of work it's still floating around 30MB. I'm baffled at what the issue actually is, but Control Center appears to be processing a lot more data now, as all media seems to pass through it in some way.
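For what it's worth, here's a rough sketch (in Swift, shelling out to ps; the process name "ControlCenter" and the five-minute interval are just assumptions) of how one could log its resident memory over a day to look for a correlation:

    import Foundation

    // Return the resident set size (in KB) of the first process whose command
    // path ends with `name`, by shelling out to /bin/ps. Sketch only: the
    // process name "ControlCenter" and the 5-minute interval are assumptions.
    func residentKB(ofProcessNamed name: String) -> Int? {
        let ps = Process()
        ps.executableURL = URL(fileURLWithPath: "/bin/ps")
        ps.arguments = ["-axo", "rss=,comm="]          // rss in KB plus command path
        let pipe = Pipe()
        ps.standardOutput = pipe
        do { try ps.run() } catch { return nil }
        ps.waitUntilExit()
        let output = String(data: pipe.fileHandleForReading.readDataToEndOfFile(),
                            encoding: .utf8) ?? ""
        for line in output.split(separator: "\n") {
            let fields = line.split(separator: " ")    // empty fields are dropped
            if fields.count >= 2, fields[1].hasSuffix(name), let kb = Int(fields[0]) {
                return kb
            }
        }
        return nil
    }

    while true {
        if let kb = residentKB(ofProcessNamed: "ControlCenter") {
            print("\(Date())  ControlCenter RSS: \(kb / 1024) MB")
        }
        Thread.sleep(forTimeInterval: 300)             // sample every 5 minutes
    }

Nothing fancy, but a day's worth of samples would at least show whether the growth is steady or tied to specific events.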
Are the background apps closed as well? I wonder if you have a background app that's constantly using or interacting with the system in a way that Control Center needs to keep updated? Honestly, I'm just throwing out possible ideas.
I recall there was an issue with Google Play services showing higher usage than other apps even when Play wasn't in focus. It turned out to be apps that were using Google Play services for GCD or GSM, causing wake-lock issues. The reason people saw the higher usage was that Google Play services was the process interacting with the system and the network, while it was the third-party app that initiated the call, making Google Play services the scapegoat. Unfortunately, Android doesn't have a way to separate the two.
I wonder if this also applies to macOS. I remember I had an issue with a Windows service consuming abnormally high CPU. I ran malware and antivirus scans and it came up clean, which still didn't explain why the service's usage was so high. So I used ProcMon (Process Monitor) to keep a log of what was interacting with the service. It took me a while to filter it out because there were 100 new entries every second for that service. It turned out a third-party app was hitting the service hard; I assume the app was misbehaving and caused this. So I force-closed the app and restarted it, and the service went back to normal.
I wonder if macOS has a process monitor for checking this out?
Yes, I keep very few apps open on my laptop and zero are backgrounded. I suppose that's why I'm only leaking around 1GB, whereas some people I have seen are seeing Control Center take upwards of 3GB.
Basically my suspicion as well. I did report that lots of SwiftUI apps were using less memory when I started on Monterey. But over a few weeks of use I see a lot of memory creep once you leave them open. Apple Music sits idle at 4GB of memory.
I think the workaround for the Control Center leak is to turn off Airplay target mode in System Pref -> Sharing and then restart (kill) the CC process.
I recently ran into a memory leak in Mojave (Mail was consuming 49 GB of RAM), so I finally upgraded to Monterey. The memory leak is gone, but now I’m seeing incredible CPU usage in Mail (1/2 to 3/4 of my MBP’s total resources, while Mail is idle).
I called AppleCare and they said they are aware of the problem and will be fixing it in an upcoming release.
This is why tools like Valgrind are indispensable. You can run your target process through your test suite under Valgrind (or Apple's Instruments) and it'll tell you if any memory leaked, even if it's just a few bytes.
One way is the classic one, where you get memory from an OS call and then lose the pointer to it. Valgrind will then correctly tell you that there's memory your app can no longer access.
But you can also simply use more and more memory without losing track of where it is. Say you have some function that gets a new piece of memory and then stores the pointer, again and again. Such a program isn't leaking memory in the classic sense, but it's still using more and more memory. Your tool won't know that this isn't supposed to be happening.
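A contrived Swift sketch of both patterns (all names made up for illustration): the first loses the only pointer to an allocation, which a leak checker will flag; the second keeps every byte reachable through a growing cache, which it won't.

    import Foundation

    // Case 1: a classic leak. The only pointer to the first allocation is
    // overwritten, so nothing can ever free it. Tools like `leaks`, Instruments
    // or Valgrind will report that block as unreachable.
    func classicLeak() {
        var buffer = UnsafeMutableRawPointer.allocate(byteCount: 1_000_000, alignment: 8)
        buffer = UnsafeMutableRawPointer.allocate(byteCount: 1_000_000, alignment: 8)
        buffer.deallocate()   // frees the second allocation; the first is lost for good
    }

    // Case 2: unbounded growth that is *not* a leak in the tool's sense.
    // Every element is still reachable through `cache`, so a leak checker sees
    // nothing wrong, yet memory use grows forever.
    var cache: [Data] = []
    func handleEvent() {
        cache.append(Data(count: 1_000_000))   // intended as a cache, never evicted
    }

    classicLeak()
    for _ in 0..<100 { handleEvent() }
    print("still holding \(cache.count) MB of 'cached' data")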
IIRC a common way to leak memory in the second sense is in GC'ed programs where, for instance, you have a closure that owns the pointer, so when the GC marks objects it doesn't think it needs to clean that object up at the next sweep.
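A contrived Swift sketch of that shape (the event-bus names are made up for illustration): each registered handler captures a big object, so everything stays reachable and neither a tracing GC nor ARC will ever free it while the handler is registered.

    import Foundation

    // Hypothetical event bus. Every registered handler lives for the lifetime
    // of the program unless it is explicitly removed.
    var handlers: [() -> Void] = []

    func registerThumbnailHandler() {
        let decodedImage = Data(count: 8_000_000)   // pretend this is a decoded image
        handlers.append {
            // The closure captures `decodedImage`, so the 8 MB stays alive for
            // as long as the handler is registered -- reachable, hence never collected.
            print("thumbnail ready, \(decodedImage.count) bytes still resident")
        }
    }

    for _ in 0..<50 { registerThumbnailHandler() }   // ~400 MB pinned by forgotten handlers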
That's usually called a space leak, to distinguish space-wasting program logic from explicit memory mismanagement. Users of a program generally can't tell a space leak from a memory leak unless they inspect the heap and look for growing chains or pools of similar objects.
I don't know if Valgrind (or Clang msan) would catch this bug as it might not be associated with any calls to *alloc/mmap/free/etc. Seems like a kernel issue, and I don't know if Valgrind has the capability to peer that deep.
I got my first "Your computer has run out of memory" warnings recently on an Intel Mac (MBP 2015), during normal laptop usage. My pointer settings are already the defaults. Maybe the laptop is already too old for Monterey.
It’s possible that changing the pointer setting and changing it back could help.
It is also possible that your MBP is too old to run Monterey smoothly. I’ve noticed that my 2017 MBP (base model CPU) is noticeably slower with common tasks like spotlight.
I’m sure apple is optimizing for their own chips these days, and they don’t mind if this results in worse performance on Intel machines. It’ll speed up the upgrade cycle, which is good for them (and bad for us).
-[NSCursor set] appears to be leaking images, causes 70GB leak in Firefox
This appears to be a regression in one of the recent macOS Monterey Betas.
Two Firefox users reported a very large memory usage. We’re tracking this in https://bugzilla.mozilla.org/show_bug.cgi?id=1735345 .
We have a memory report which blames the allocations on this stack:
#02: _malloc_zone_malloc[/usr/lib/system/libsystem_malloc.dylib +0x1e770]
#03: createImageWithSizeRenderInstructions[/System/Library/PrivateFrameworks/AccessibilitySupport.framework/Versions/A/Frameworks/AccessibilityFoundation.framework/Versions/A/AccessibilityFoundation +0x7430]
#04: -[_AXFMouseCursorGeneratorLayer _createImageForScale:color:][/System/Library/PrivateFrameworks/AccessibilitySupport.framework/Versions/A/Frameworks/AccessibilityFoundation.framework/Versions/A/AccessibilityFoundation +0x66e0]
#05: -[_AXFMouseCursorGenerator createImageForScale:][/System/Library/PrivateFrameworks/AccessibilitySupport.framework/Versions/A/Frameworks/AccessibilityFoundation.framework/Versions/A/AccessibilityFoundation +0x5064]
#06: -[_AXFMouseCursorGenerator _registerWithName:connection:isGlobal:setActive:seed:][/System/Library/PrivateFrameworks/AccessibilitySupport.framework/Versions/A/Frameworks/AccessibilityFoundation.framework/Versions/A/AccessibilityFoundation +0x5418]
#07: _AXFCursorSetAndReturnSeed[/System/Library/PrivateFrameworks/AccessibilitySupport.framework/Versions/A/Frameworks/AccessibilityFoundation.framework/Versions/A/AccessibilityFoundation +0x70b8]
#08: _AXInterfaceCursorSetAndReturnSeed[/System/Library/Frameworks/ApplicationServices.framework/Versions/A/Frameworks/HIServices.framework/Versions/A/HIServices +0x2d1d8]
#09: -[NSCursor _reallySet][/System/Library/Frameworks/AppKit.framework/Versions/C/AppKit +0x1e01d0]
#10: -[NSCursor set][/System/Library/Frameworks/AppKit.framework/Versions/C/AppKit +0x1e0050]
Firefox calls -[NSCursor set] very frequently during mouse motions, so many leaked images might accumulate.
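For anyone who wants to poke at this: a rough, unverified Swift harness (not from the bug report) that just hammers -[NSCursor set] in a loop. The leak reportedly needs the accessibility pointer size/color to be customized, so a default setup may show nothing at all.

    import AppKit

    // Rough, unverified harness: call -[NSCursor set] in a tight loop and watch
    // this process's footprint in Activity Monitor. Whether each call reaches
    // the accessibility cursor generator depends on system settings.
    _ = NSApplication.shared          // establish a window-server connection
    for i in 0..<200_000 {
        // Alternate cursors so each call actually changes the cursor.
        (i % 2 == 0 ? NSCursor.arrow : NSCursor.iBeam).set()
    }
    print("done; check this process in Activity Monitor")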
I can write you a memory leak in any language you want. No GC strategy can solve that as doing so is equivalent to solving the halting problem (at least that's my intuition).
It's not equivalent; you can tag every pointer with a special bit. Then statically you know the accessible roots at every instruction pointer value, and you can simply traverse accessible memory regions, following tagged pointers. Free all memory that is not visited.
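That traversal is essentially mark-and-sweep. A toy Swift sketch over an explicit heap (a real collector walks raw memory and tagged words rather than an array of objects):

    // Toy heap: every allocation is a node that may "point" at other nodes.
    final class Node {
        var children: [Int] = []      // indices into the heap, standing in for tagged pointers
        var marked = false
    }

    var heap: [Node?] = (0..<8).map { _ -> Node? in Node() }
    let roots = [0, 1]                // what the stack/registers can reach

    // Mark phase: follow every pointer reachable from the roots.
    func mark(_ index: Int) {
        guard let node = heap[index], !node.marked else { return }
        node.marked = true
        node.children.forEach(mark)
    }

    // Sweep phase: anything unmarked is unreachable and can be freed.
    func sweep() {
        for i in heap.indices {
            if let node = heap[i], !node.marked {
                heap[i] = nil             // "free" the allocation
            } else {
                heap[i]?.marked = false   // reset the mark for the next collection
            }
        }
    }

    heap[0]?.children = [2, 3]
    heap[3]?.children = [4]
    roots.forEach(mark)
    sweep()
    print(heap.compactMap { $0 }.count, "objects survive")   // nodes 0-4 survive, 5-7 are freed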
A decision problem equivalent to the halting problem might be (depending on the language): does this code free all allocated memory correctly (assuming no garbage collector)?
Cool. That only tells you which memory is no longer actively referenced. It does not tell you whether or not you’ve leaked memory due to stale live references.
My point was that a GC’ed language doesn’t guarantee anything, and this particular instance may have nothing to do with it, especially since they use Swift for a lot of code (which does a form of GC via reference counting). The recommendation to switch to a tracing-GC’ed language is naive because there are better solutions to this problem that don’t require a GC (ARC from Apple, smart pointers in C++, Rust lifetime ownership, etc.). They solve 90% of the problems at far less cost, which matters for mobile, battery-powered devices but also in high-performance servers.
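A minimal Swift sketch of what ARC asks of you instead (types made up for illustration): the first version leaks because the object and its closure retain each other, and the idiomatic fix is a weak capture rather than a tracing collector.

    final class Downloader {
        var onFinish: (() -> Void)?
        deinit { print("Downloader freed") }
    }

    // Leaky: the object holds the closure and the closure strongly holds the
    // object, so the reference count never hits zero. ARC alone can't break the cycle.
    func leaky() {
        let d = Downloader()
        d.onFinish = { print("finished \(d)") }
    }

    // Fixed: a weak capture keeps the closure from owning the object.
    func fixed() {
        let d = Downloader()
        d.onFinish = { [weak d] in print("finished \(String(describing: d))") }
    }

    leaky()   // deinit never runs: this Downloader is leaked
    fixed()   // prints "Downloader freed" as soon as `d` goes out of scope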
It's still unclear whether these are actual leaks (orphaned references), unconstrained growth, or deallocated memory whose blocks haven't been reclaimed by the OS. Automatic memory management would only help with the first case, and both Obj-C and Swift have it by default. Of course, it's entirely possible that the sources of these leaks are written in another language (C++ is fairly common) or Obj-C with ARC turned off (if that's even still allowed within Apple).
Seeing all the comments here I'm asking myself once more how people can bear such crappy products. Especially as the products are overpriced on any reasonable scale.
With a stock Linux you never see anything even close to all those catastrophes.
I'm not aiming at an OS flame war, but I honestly question how this is bearable to otherwise mostly reasonable people. Is there a true core to the Stockholm syndrome meme about Apple users? Why are people using obviously extremely broken stuff, especially when it costs at least a third more than perfectly fine working stuff? I just don't get it. Honestly!
I have been using Linux since last millennium and I can tell you there is no shortage of bugs.
Right now I am fighting a Fedora upgrade and PipeWire, which keep reverting the custom configuration I wrote to prevent automatically suspending audio output to my external DAC. With a normal audio output it doesn't matter when the analog signal stops, but mine is connected over Toslink. When the signal stops, the DAC on the other side also shuts down. Then when the sound comes back it takes time to turn on again (2s of silence + 2s of slow volume ramp-up), which is extremely annoying. This happens by default after 30s of inactivity.
Unfortunately, the PipeWire developers seem to have some pretty strange ideas about how to handle user configuration (basically, they don't care about it).
I had a custom configuration, and it got overridden when they decided to move configuration from /etc to somewhere in /usr. Then I edited it in /usr, but it got overridden by the Fedora update system the next time the package was updated (I guess I could have predicted that). Then I followed the documentation and edited it in my $HOME/.config, but apparently that also stopped working yesterday.
And I just want to keep my audio working. Don't get me started on graphics or dev environment.
I've also been using Linux since the last millennium. I can't confirm such issues.
Using Fedora means using experimental stuff. PipeWire, for example, is not ready for prime time, so it obviously has issues. Simple as that.
But of course I don't claim Linux is without bugs. I've never seen any complex software without bugs… The point is how grave those bugs are and how they could possibly pass QA. Apple especially has a comparably simple job here, as they officially support only a handful of products, which they even build themselves.
And it's not only me who sees one catastrophe following the other at Apple. Just have a look at the discussion thread on the other current front page topic:
To repeat my question: How do people bear that? That's what I just can't understand.
Again, I'm not interested in an OS flamewar. That Apple's products are subpar is a fact at this point, and we don't need to discuss that again. (Just follow the other HN thread). The question is why people are still using this stuff and not voting with their wallet (especially as it would make their wallets happier).
> Again, I'm not interested in an OS flamewar. That Apple's products are subpar is a fact at this point
That second sentence seems like it was specifically crafted to start a flame war to me. You don't even mention on what dimensions Apple products are subpar. All possible ones I presume?
I have a standard AMD CPU+GPU, and Manjaro has spontaneously rebooted (the hardest of crashes) at least 10 times in the last 2 years. I use Windows 10 on this machine about 10 times more often and in the same time-frame it has crashed _maybe_ once.
My first Linux install was Mandriva in 2001, followed quickly by Slackware, then Debian, Ubuntu, Pop!_OS, and now I pretty much exclusively use Arch and Manjaro. I'm pretty sure that in all those 20 years, every time I use up almost all my RAM, the system becomes entirely unusable, to the point where I can't even interact enough to kill programs, and not even the OOM killer saves the system from total lockup, so I need to power off and on. I've _never_ had this experience on Windows. It gets slow, but it can recover.
The only time Windows (the kernel) was less reliable for me than Linux, as a desktop user, was when I had Windows Millennium Edition -- when my journey all began.
I'd still say I prefer Linux, but it's largely because it's better for my work on systems software. It's definitely not nicer to use as a desktop if you hate crappy products and like good ones.
Regarding spontaneous reboots: I have never experienced this on Linux, apart from once having a hardware defect. (I would check the RAM in your case first; Manjaro is an Arch derivative, so it incorporates the latest software, but I'd guess it's not so terribly unstable that it would crash every two months; not sure, of course. I'm avoiding Arch for exactly that reason: it's not stable enough because its software is extremely fresh.)
Regarding the unresponsiveness on low memory: that's a known problem caused by a livelock in the kernel. It has a few solutions by now. Just have a look at Facebook's oomd (or one of the solutions that works in a comparable way):
I disagree, I think I dealt with the two key elements of what you wrote.
> Seeing all the comments here I'm asking myself once more how people can bear such crappy products. Especially as the products are overpriced on any reasonable scale.
Sure, macOS is a crappy product. A product that crashes to reboot is an even crappier product.
> With a stock Linux you never see anything even close to all those catastrophes.
Crashing to reboot is a bigger catastrophe, so I see worse on Linux.
And I don't mean this in a harsh way, but I know it's a known problem, I just said I've been dealing with it in one form or another for about 20 years. The fact that it has been 20 years means it's a crappy product for desktop, and I won't believe otherwise until it's capital f Fixed in standard installs.
One of the first things I do when I get a machine is install Linux, either to disk or in a VM, and it's the best thing to have happened in the OS space in the last 50 years, but it's still a crappier product for desktop than macOS or Windows. I use all 3 regardless.
The core of my question was something completely different.
I probably should have left out any comparison to another OS. That part is only there to underpin my basic assumption: that macOS, as it stands right now, is quite terribly broken.
If even Apple-friendly media start to complain loudly, this gets some traction over here where people are usually quite objective, and at the same time I read how macOS fails at the most basic OS functionality (memory management, which is at the core of an OS), then I take it as a fact that macOS is utterly broken.
Now, discussing whether the other OS I mentioned is even more broken is exactly the flame war I wanted to avoid.
I fully agree that crashing to reboot is an unbearable catastrophic failure. But that's not the point.
The question was how a product which (also) fails in catastrophic manners seems bearable to otherwise mostly reasonable people.
If I had a computer that needed to be restarted at least once a day because the system had eaten all its memory for no reason (or that just restarted itself for no reason), I would look for something else. That's for sure!
In the case of Linux crashing, you can at least try your luck with other hardware (or just another distribution). There are rumors that Linux runs rock stable on some machines. So catastrophic failures like spontaneous reboots aren't, at least, a fundamental and general flaw of the system.
In the case of the current Apple issues, on the other hand, there is not much room to escape. Those issues also seem to be quite fundamental (an OS with more or less completely broken memory management), and more often than not they affect large parts of the user base (if not everybody, as is, or lately was, the case with a lot of other severe macOS bugs). But people stick with those products despite their severe failures. That's the part I don't get; honestly, as I said before. Just as I would not stick with a computer that reboots itself randomly (if I couldn't repair it), I would also not stick with machines that start to show flaws like Windows ME did, especially as there is nothing you can really do about such problems except hope for some goodwill at Apple.
Even though this thread is quite long, I haven't seen good answers so far. People don't say why they stick with a broken system they can't repair themselves, but instead try to point out that other systems could be equally broken, even though that's obviously not true, as for some people even Windows works fine most of the time. Which, again, can't be said about the current state of macOS. By now you can see die-hard fans complaining loudly, usually about issues that fall into the category of catastrophic failures. Still, the majority seems to endure those troubles; otherwise you would see consequences on Apple's balance sheet.
Maybe I'm the one who doesn't see the great value proposition that lets people forget about everything else. That's also part of the reason I'm asking. I was hoping I would at least get some meaningful answers here. But beyond the usual "everything's equally broken, it makes no difference" (which is of course right to some degree, but only to some degree, as pointed out above) or the nonsensical but also usual "the other stuff is even more broken" (which is objectively not true at this point in time), there wasn't much… I'm wondering whether gaining no new insights was worth bringing up a question that provoked (as expected) mostly downvotes. Maybe I have to accept that some human behavior has no rational explanation and that asking about it only causes trouble.
The egregious macOS bug will be fixed shortly. AFAIK it's only happening in some of the big upgrades released in the last few months. Shockingly bad for a company with Apple's resources, but how many people are affected and for how long? Just the people who upgraded promptly or bought a new device? It's an indictment of Apple but most people intentionally don't upgrade promptly (because it's a hassle, or they know to wait for the bugs to be ironed out). It's an _extremely_ effective mitigation strategy to just not do big upgrades until they've been out for about 6 months. For the type of OS that macOS is, that's an extremely low cost strategy.
If as a hypothetical we imagine macOS and Linux have the same number of "regular" users who upgrade after a few months, and "lemming" users who upgrade ASAP (directly, or indirectly by buying a new product that only comes with the new version), and we look at the number of people affected by crappiness in each crappy product, it's basically just "lemming" users for macOS, and all users for Linux. There's no strategic fix for Linux.
This has nothing to do with open source vs commercial software. Most software from both camps is just really crappy from an end-user perspective, and culturally, both camps are largely getting worse with time at about the same rate. I disagree that Linux is any kind of oasis in the desert of crappy software. We have to ask the same "why do we put up with this?" questions of Linux as well.
The problem is that you’re taking a relatively small selection of comments—usually only posted by people who have experienced a problem—and extrapolating an overall picture from it that’s probably not that reflective of reality.
I use both Linux and MacOS roughly the same amount day-to-day. My experience is that they suffer from various niggly failures and issues to roughly the same extent in practice. I’ve had the Mac sometimes lose Bluetooth connections and need lots of switching things on-and-off to work, in the same way I’ve had NetworkManager drop off wifi with no apparent explanation. Or PulseAudio suddenly deciding to use all my my CPU. Or weird lock screen bugs galore!
The products are broadly fine and functional, priced somewhat higher than commodity equivalents but in exchange for benefits which are worth it for some people and not others. I would gently encourage you to consider that other people likely aren’t idiots and are generally making decisions based on their requirements, which in some cases will not match yours!
Thanks! That's the first more reasonable answer, one that doesn't go into OS flamewars.
> I would gently encourage you to consider that other people likely aren’t idiots and are generally making decisions based on their requirements, which in some cases will not match yours!
That's actually the core of my question! I even spelled it out. I just don't get why reasonable people would buy such things.
I'm actually not sure the amount of trouble is the same on average: I don't read almost every day about obscure catastrophes on Linux (broken memory management like here, mice or keyboards not working, USB not working, monitors not working, etc.). The other thing is that Apple supports only a fraction of the hardware Linux does. It actually seems they don't do any QA; all that broken stuff should never reach end customers, especially as almost all they have to do is test against their own products.
Desktop Linux is absolutely riddled with bugs unless you have a suitably generic hardware configuration and even then you will very likely still have problems with:
- Audio
- Wifi
- Bluetooth
- External monitors
- Suspend/Resume and power management in general
- many other things
I abandoned it years ago. I just don't have time for that anymore.
I didn't ask why you don't use Linux. (Also, I can't confirm any of the claimed problems; I've been using Linux as a daily driver for more than 20 years, and all the mentioned things have just worked OOTB for many, many years.)
I asked why people are using an utterly broken OS where even the most basic stuff, like memory management, is defective. (Not even asking how this could possibly pass any QA)…
Even if you think Linux isn't better, it can't be worse!
But Linux is free. (And it actually doesn't have most of the problems Apple has even with its own hardware.)
So it's valid to ask how people bear using a broken system and even paying for it more than elsewhere.
And to repeat myself from an answer given previously: that Apple's software is utterly broken, and that you won't find anything worse on the market, is a fact at this point. Please just follow the other front-page discussion:
> Even if you think Linux isn't better, it can't be worse!
It could be! The truth is complicated. I’ve been using a ryzen desktop running mint for the last year and I just got a M1 pro. I love both of these machines and I still don’t know which I want to be using as a daily driver. (Well, now mint is set up how I like).
These macOS bugs are real, and they feel like amateur-hour junk. As well as the memory leak, I’m also running into a bug on my M1 where WindowServer idles at 100% CPU after I’ve been using my computer for a while. I have no idea what it’s doing or what causes it, but the only way to fix it is a reboot (or kill -9 the process, which logs me out and closes all my programs anyway).
But Linux’s bugs are different. I haven’t rebooted my Linux machine in ages. But on Linux I’m annoyed by other things. Lots of programs have broken smooth scrolling. Keyboard shortcuts for moving the cursor around are inconsistent. I can’t use the Meta key in IntelliJ as a modifier for some reason, so I can’t use my muscle-memory shortcuts (and ctrl+shift+C is awful as a copy shortcut in xterm!). I use a trackpad on my desktop and the sensitivity is way off, so I get misclicks all the time. App distribution is a mess - flatpak, snap, apt, custom dpkgs and manual make / make install? All of the above? I accidentally ended up with 2 copies of Discord installed and didn’t know which one was “right”.
Linux does the core engineering work right but often falls down in frontend UX. (It’s definitely getting better these days though.) Apple makes nice UX but their core engineering is remarkably buggy. Apple, please - I know you think the market wants a new OS every year with new features. You don’t need a new font every year to compete with cheap plastic x86 laptops. Please slow down and fix your shit.
> It could be! The truth is complicated. I’ve been using a ryzen desktop running mint for the last year and I just got a M1 pro. I love both of these machines and I still don’t know which I want to be using as a daily driver. (Well, now mint is set up how I like).
You had better love the way Apple set up their system, because that's the only way you're going to get it.
For me that's the reason to use FOSS OSes (FreeBSD right now). I got annoyed with Apple changing stuff around on me. Sometimes I like it, sometimes not, but I have very strong opinions on how I like things and on Apple it's their way or the highway (or the steep mountain trail of add-ons/work-arounds that Apple will constantly break :) ).
I used to put up with it because of the "just works" thing, but things kinda stopped "just working". I used to see the benefit of opinionated software, but it's not for me anymore. This is also why I'd say the same about things like Ubuntu and GNOME: they're heading in the same direction.
Of course mileage varies, and I'm far from a standard user. But I feel like Apple used to care more about non-standard users back when they only did Macs and their market share was tiny.
How do you find FreeBSD as a desktop operating system? I’m using FreeBSD on my home server and I love it, but I expect that whatever problems Linux on the desktop has will be 10x worse under a more obscure operating system. Though I’d also expect (rightly or wrongly) that FreeBSD would be easier to customise.
So it’s something I’d like to tool around with at some point. Do you recommend it? Do you use any of the Linux emulation stuff to run Linux apps?
FreeBSD is very nice for a well-controlled and stable desktop. I've used it on the desktop since 2005 and have not yet seen a Linux distribution that provides the same 'zen' experience. Especially with ZFS Boot Environments, which make you bulletproof against changes and upgrades.
Maybe my guide will help you to start with FreeBSD desktop:
Yes your site was one of my references when I started with this. Many thanks for writing it!
I had used FreeBSD as a server in 2006 or so but moved off it as I moved from colo to VPS at the time and it wasn't supported. So it was off my radar for a long time.
But it's really a great OS for the desktop for me.
I find it really good. It's super stable, I have it running 24/7 for a month at a time and FreeBSD has some really nice things that Linux is only now catching up on. Like ZFS on root (snapshotting your desktop is so cool). It's never crashed on me. Unlike my Mac.
Hardware issues are present, yes. Your hardware may vary, of course. In my case I had to turn off Bluetooth on my NUC (NUC5i5RYK), because it hangs on boot with some cryptic messages. I probably could have troubleshot it more, but because all my peripherals are wired I didn't really care :)
Also, I don't think sleep works (I don't care, I want it to be ready for me all the time) and I didn't even try WiFi (I have wired ethernet)
In fact the sleep thing is one thing I hated about my Mac mini. Apple always forces some kind of sleep. Every morning when I logged in, my apps would be in this "oh wait, just catching up" state, despite me having switched off all the sleep states with "pmset" and in the GUI, and using an app ("Jiggler") to fake mouse input every few minutes. Yet the apps still went to some kind of sleep somehow, which meant it would take a couple of minutes to become responsive again. I looked it up and it's this "App Nap" thing or something. Cool, but let me turn it off. My desktop takes 5-10W idle, so this is really not something I care about, and during the night I schedule stuff like backups.
Another thing where FreeBSD shines is jails. I've used them for 15 years, and you can even jail GUI apps now. Linux has snaps etc., but they're really slow to start up and clutter your mount table. I prefer this.
But anyway yeah it's really stable. Wayland is not working at the moment but it's being worked on. The developers are extremely quick and responsive at #freebsd-desktop on libera. When a new KDE version comes out they usually have it ready in a day or so. Which is pretty amazing compared to most Linux distros.
I don't use any linux emulation right now. I don't need it. Any apps that were not packaged for FreeBSD I was able to compile natively or use an alternative.
What I also like about FreeBSD is that it's just one OS, instead of one kernel and hundreds of distros (basically OSes) doing things around it in different ways. There's usually one good and well-documented way to do things, and not by hiding powerful configurability from you like most 'modern' OSes do. If you're looking for something, the handbook usually has the answer (though it should be kept more up to date...)
The ports collection is also superb. If you want a package with different compile options than it ships with by default, you can just build it that way.
Thanks! That's the second reasonable answer I've gotten so far.
I guess the distinction between "core engineering" and UX is valid and helpful.
Adapting my point to it: even if the UX is inconsistent or outright broken, the resulting system is at least somewhat usable. If the core of a system is a tire fire, nothing can work, and it makes no difference anymore whether you have the nicest UX or not.
One hint regarding the awful shortcuts: you could try switching Ctrl and Alt/Meta. That's a common "one-click" setting, usually supported OOTB by the desktops. Swapping those keys was the one truly right usability decision by Apple, which I welcome even though I don't like the rest of the macOS UX.
I should have expected "it works for me" to be the go-to reply about why desktop linux is still terrible after all these years.
The web browser is pretty much the best thing that ever happened to linux users because now their $favorite_distro is actually a viable desktop platform.
It may be "terrible"¹ (even though it works just fine for me, and for a lot of others), but at least it's not broken beyond usability! And even if it were as broken as macOS, it's free and supports much more hardware, compared to something where you even pay a premium to be bound to a single vendor.
I switched to macOS after successive updates of Ubuntu made audio a pain in the neck (due to PulseAudio), stopped the machine from waking from sleep, and broke wifi. This was quite a while ago. Audio still doesn't work by default on my Ubuntu 18 system.
My stock Ubuntu LTS desktop crashes all the time, about 3x a week. Still can't figure out why.
I love linux, have used 4 different distros as my daily driver for the last 15-20 years or so. Yet still my linux-using friends and I have this joke: 'but will it hibernate?'
And to answer your question directly:
> Why are people using obviously extremely broken stuff
a) it's not obvious and b) it's not extreme. Apple products work smoothly for most people most of the time. We're on hacker news where the entire point of this forum is the selection bias around tech stories like this.
Have you considered a hardware defect? Even Ubuntu doesn't crash all the time. Also, you usually see at least some obvious reason in the logs if it's software-related.
Regarding the "extremely broken stuff": I came here from another thread:
I couldn't help posting the above after also seeing what's described here. (Given all the downvotes, maybe not the smartest idea, but I honestly couldn't help it :-))
The problems are, in my opinion, indeed extreme, and by now the mess is becoming undeniable even to die-hard fans.
What makes me really scratch my head is why otherwise reasonable people don't vote with their wallet and still buy these things (even at a premium!).
> What makes me really scratch my head is why otherwise reasonable people don't vote with their wallet and still buy these things
This shouldn't make you scratch your head though... other people value different things than you do, and each person has their own personalized formula for whether a Mac is worth it or not. still_grokking's value judgement is not canonical for anyone other than still_grokking.
I can't be the only user who just doesn't care about hibernate.
I definitely understand why other people want it, but I find suspend and hibernate almost disorienting.
I don't want to continue a session where I left off. I'd start it back up, forget what I was doing anyway, and then I'd likely close all the programs and start over, realizing how amusing the effort I'd just gone through was.
I'm willing to admit it might be a personal problem.
I don't care about hibernate either. It's impracticable with the current amounts of RAM anyway.
But I couldn't live without suspend!
I hardly ever shut down my boxes. Not because I like "impressive" output from uptime (I don't care), but because I don't like reopening things.
By now there's even enough RAM around that you hardly have to close anything if you don't want to.
Also, waking from suspend takes no more than a second. That's a much faster path to a workable state than the 20 seconds to boot an average box, plus even longer waiting for all applications to start up.
The only reasons for shutdowns are, IMHO, kernel updates or when you need to lock the computer. (You should not trust Linux screensavers; they've had obscure bugs in the past, even though things got better on Wayland. Also, FDE encryption keys usually remain in RAM throughout suspend.)
:D I don't actually care about hibernate. But after it came out on Windows and worked well enough, and Linux started introducing it, it became a meme to laugh at how bad linux suspend/hibernate is
It refers to a default install of any mainstream distribution, with few modifications. E.g., you're bound to run into problems if you're running some strange window manager with a funky desktop. By comparison, I'd expect a base install with GNOME to mostly just work, which it does with supported hardware; e.g., arandr (or whatever the GUI tool is) works fine for arranging monitors.
"Extremely broken" is not an apt description. It's mostly minor annoyances, and they usually get fixed over time if they affect enough people. Apple isn't 100% unresponsive.
Linux is equally, or perhaps more broken in many ways.
I'm getting one of the new macs because the hardware seems great. I got one around the turn of the century for the same reason (and mostly ran linux on it.)