
What an uninformed rant.

When someone rants about memory usage, it is usually a sign that he doesn't know what he is talking about. On virtual memory systems with on-demand paging that use shared libraries and where all file system I/O is mmap(2)-based, memory is managed in a very different way than most people expect. It's understandable: most people don't know, and don't have to know, what virtual memory is, even if they have a superficial understanding of swapping. Most people, even most technical people, don't know about the implications of shared libraries for memory measurement.

The users are presented with data they don't understand. Everybody says things like "this app is using 300MB of RAM", when such statements don't make any sense in the modern world. The way file systems, file system caches, virtual memory, and shared libraries interact is architecturally identical on all major operating systems today, including Windows, Mac OS X, Linux, Solaris, and the BSDs. There are various differences in implementation that make each system optimized for particular workloads, but understanding the differences between the systems is out of reach of most people who complain on their blogs, and those differences only affect exotic workloads anyway. It's funny how much one can advocate for something when all the alternatives are the same.

But all the memory management rants are nothing compared to mentioning Mac OS X's repair disk permissions feature. Of course, this feature doesn't magically repair anything, but it's sold as a panacea. I read the first paragraph about memory management and decided to give it one more chance, but then repair permissions was mentioned as a solution. Sorry, this is not HN-worthy.



Okay, I really think you know much more about memory management than the writer of the original post. The question for me is this: using Chrome with a few tabs (8, for example) and Dropbox besides the default applications, I face huge slowdowns (the system becomes unresponsive for a few minutes). The same happens with Safari. For a system that was described to me as "it just works", this really sucks. This is with 4GB of RAM in my MacBook Pro.

I face no such problems with Linux or FreeBSD on the same hardware.


Open Activity Monitor and check which applications use a lot of memory (look at the "Real Mem" column).

If your computer frequently freezes, the problem is usually a specific application (VirtualBox comes to mind), not Mac OS X itself. 4GB is more than enough for casual browsing.


Activity Monitor is too blunt a tool for this kind of scenario. It pains me: Mac OS X has DTrace, which makes it a breeze to find out what's really happening, yet the only GUI tools built on top of DTrace are profilers for programmers, nothing for the casual user.

Activity Monitor presents data that's not really useful, and does so in an intrusive manner and a confusing display.

Mac OS X already comes with parts of the DTrace toolkit, you can install whatever you're missing: http://www.brendangregg.com/dtrace.html#DTraceToolkit. These tools allow a very deep understanding of performance problems.


If your problem is that you are running out of RAM, Activity Monitor will tell you precisely which app is the culprit. Nothing "blunt" there.

If you want to know why a specific app is frozen (beachballing), you can click the "Sample Process" button in Activity Monitor to perform a time profile.

Yes, if you want more details, you have to use Instruments or write your own DTrace scripts. But I doubt that even a time profile is useful to the "casual user".


Sorry, but no, you're just illustrating the problem I mentioned. In a virtual memory system with on-demand paging, memory-mapped I/O, shared libraries, and copy-on-write pages, even measuring and interpreting memory stats is very difficult and subtle.

The "real memory" column is resident set size (RSS), a completely useless metric for the problem at hand, for several reasons. One is that many of the physical pages can be shared, and indeed most of them usually are: on my system a Chrome process has a 120MB RSS, but on closer inspection 110MB of that is shared, and on even closer inspection 90MB is shared with non-Chrome processes. Closing the process with the top RSS might do very little to decrease memory pressure. Another reason is that physical memory usage is misleading by itself: if the system is swapping, a process has fewer resident physical pages than the virtual pages it uses; that's the reason the system is swapping in the first place! A process can thrash memory while having a relatively small RSS.
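The double-counting is easy to see with a toy model (hypothetical page counts, not real measurements): treat each process's resident set as a set of page IDs, where shared library pages appear in several sets at once.

```python
# Toy model of resident sets: every page has an ID; shared library pages
# show up in the resident set of every process that maps them.
chrome = ({f"libc:{i}" for i in range(90)}            # shared with everyone
          | {f"webkit:{i}" for i in range(20)}        # shared more narrowly
          | {f"chrome-priv:{i}" for i in range(10)})  # truly private
safari = ({f"libc:{i}" for i in range(90)}
          | {f"safari-priv:{i}" for i in range(30)})

# Per-process RSS counts a shared page once for every process mapping it:
naive_total = len(chrome) + len(safari)    # 240 pages
real_total = len(chrome | safari)          # 150 pages actually resident

# Killing the top-RSS process only releases the pages nobody else maps:
freed_by_killing_chrome = len(chrome - safari)  # 30 pages, not 120

print(naive_total, real_total, freed_by_killing_chrome)
```

In this sketch the process with a 120-page RSS gives back only 30 pages when killed, which is exactly why RSS is a poor guide to memory pressure.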

You also simply assume what the problem is without actually testing for it. You need to look at why the system spends time in kernel mode. Maybe it's not swapping; most likely it's not swapping in this particular case. It's more likely the random I/O caused by on-demand paging of memory-mapped files, or something more subtle, like copy-on-write pages being written to. It could also be a million other things.

Even if the problem is caused by memory pressure, "memory pressure" is a remarkably generic term: the VM system has many components, and different components are affected by different workloads. A simple metric like RSS can't tell you much.

Yes, Activity Monitor is a very blunt tool.


I do understand how memory works on modern systems, and I agree that Activity Monitor can still be used as a blunt tool.

My system (MBP, 4 GB of RAM) works beautifully most of the time, but I noticed that sometimes when I went to a new tab in Chrome, there was a multi-second delay. I opened up Activity Monitor, went to a new tab, and noticed that my disk activity had spiked. I figured that the memory system was swapping a lot of pages in and out, so I looked at rough memory usage and saw that the Shockwave Flash plugin had close to 1 GB in RSS. I then realized that using YouTube as a music player is probably not a good idea: Flash was designed to run as the main thing you're doing, not in the background while you're doing other things. I killed the tab, and I had no more problems.
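For what it's worth, the same "sort by Real Mem" view is available from the shell. This is a rough sketch (RSS comes with all the caveats about shared pages discussed elsewhere in the thread), using `ps` format keywords that work on both OS X and Linux:

```python
import subprocess

# List the five largest resident-set sizes system-wide, like sorting
# Activity Monitor by "Real Mem". `ps -eo rss,comm` reports RSS in KB.
out = subprocess.run(["ps", "-eo", "rss,comm"],
                     capture_output=True, text=True).stdout
rows = [line.split(None, 1) for line in out.splitlines()[1:] if line.strip()]
rows.sort(key=lambda r: int(r[0]), reverse=True)
for rss_kb, comm in rows[:5]:
    print(f"{int(rss_kb) // 1024:5d} MB  {comm}")
```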

Blunt tools can still be useful. The problem isn't the tool, it's the knowledge that people have when they use it. For example, I know that RSS overcounts, but I also suspect that the Flash plugin isn't sharing enough to make a big dent in 1 GB. And, it turns out I was right.

You wrote an excellent explanation above about modern memory systems; I find it strange that you're harping on this particular point so much.


Yep, I checked my Activity Monitor and the Flash plugin was also the culprit, using over 890 MB of RAM. I'm running Firefox with multiple tabs (5), Chrome with multiple tabs (7), Pages, X-Lite, Adium, Mail, and Thunderbird on 4GB of RAM with no problems.


You are missing the point. The idea is not to determine exactly what's going on, because that doesn't matter. You mentioned the casual user before, but all these details are only important to specialists trying to debug an issue with a specific application.

When you run out of RAM, it is completely irrelevant which application is thrashing, because the problem is that you ran out of RAM. It doesn't matter if the reason is that an app is accessing a memory-mapped file or modifying a copy-on-write page, when the underlying reason is that you ran out of RAM.

Activity Monitor is perfectly suitable for finding out why you ran out of RAM. Oh, Mathematica is using 2GB of RAM? Maybe I'll try closing that. It doesn't matter if "Real Mem" actually counts some memory twice; it is still a useful measure.

EDIT: Yes, I assumed that the problem is memory pressure. Since this is a common reason for "The whole system becomes sluggish", it seemed reasonable to start testing for this.


Fantastic argument: the user is having a problem, but it doesn't matter what's going on, so let the user try arbitrary things (which I've already explained are folklore, and why they make no sense), and maybe something works.

If that's what you advocate, let's stop this discussion here, as there's no common ground.

When you're having a problem, you first try to understand it in order to solve the root cause. Applying a rule of thumb like "kill the top-RSS process" is as sensible as rules of thumb about running repair permissions, sizing paging files, or hoping arbitrary herbs cure arbitrary diseases.

Activity Monitor is useless because it's impossible to assess how a specific action will affect the system. Users should understand what's going on when they kill a process, and the tools should help them do so. When people do something, they should understand it, even casual users. Activity Monitor exposes data that most users don't understand, while leaving the impression that they do.

Just for trivia: memory pressure hasn't been the primary reason for "the whole system becomes sluggish" for a few years already.


1) Nowhere have I suggested to "kill the top RSS process". 2) If you start Activity Monitor, you immediately see whether memory pressure could be the problem (well, after 30 seconds, because it takes time to start Activity Monitor if you've run out of memory). 3) If you've run out of memory, looking at which apps use a lot of RAM is useful. And it's not as unpredictable as you make it seem: quitting an app frees at least its private memory, you might free some of the shared memory, and other processes (e.g. the window server) might free some more memory as a consequence. That might not be a precise prediction, but it's not quite "impossible to assess".

I do not know how prevalent memory pressure is for other users; it's been the primary reason for a "sluggish computer" for me. If you know more about this topic, I'd be thrilled to hear other possible explanations beyond "it's more complicated than that".


Your argument basically calls for casual users to stop being casual and become experts. Activity Monitor certainly does help casual users understand what's going on. For the casual user using a system normally (meaning having a browser open, checking mail, writing in a word processor), seeing that application X is using the most memory means that's the problem to them, and they're correct to assume they should kill it. It may be hit or miss, but that's all they know, and in most cases things get fixed that way.

As professionals, we often forget what it is to be a casual user. Asking a casual user to learn everything you explained above is simply too much to ask. Everyone who uses a computer should be technically literate to a degree, but that means understanding the basics. To a casual user, the basics are: my computer has a processor that executes tasks, RAM that stores data for quick retrieval, and a hard disk for long-term storage. Each application uses a share of my finite RAM, and when it runs out, my system slows down. Therefore logic dictates that if I kill the app taking up the most RAM, my computer will go faster.

That's all they usually know. We understand that Activity Monitor lies to us and that killing random processes is voodoo, but we also have to take into account how we use our machines. The casual user will be able to solve their problem by killing processes more often than people like us will, because of the way they use their systems; plus there is a placebo effect for them. When they kill a process, they often feel like the system just got faster, regardless of whether it really did.

I liken it to driving a car. Ask some random person about fuel economy. Their thinking is "high-octane fuel has more energy per gallon, therefore if I use it I'll get better fuel economy". They might even know the relationship between tire inflation and fuel economy, if you're lucky. Ask a professional driver about those things and they'll look at the average person like they're crazy. They know how octane, oil, air filtration, shocks, struts, aerodynamics, etc., all contribute to better fuel economy. "If only the average driver knew what I know, they'd save a ton on gas," they'd think. But alas, that's too much to expect, so we just have to make sure they get the basics, and it's up to the professionals to provide the average person with something that just works and to stay one step ahead of users by anticipating their usage patterns. This applies to hardware and software engineering, car manufacturing, and anything else. You just can't expect the user to learn, or even take an interest in, a quarter of what we know.


Well, I know the basics of SystemTap, but I'm not very familiar with low-level stuff. I'll look into DTrace to see what's happening. Thanks for the information.


SystemTap is a Linux thing, so you can use it on Linux to see how this works, but it won't be a pleasant experience.

DTrace is also available on FreeBSD, so if you look at what happens on Mac OS X, it might be helpful or insightful to also look at what happens on FreeBSD.


Activity Monitor is way older than DTrace and I don't think Apple has made any code changes to it in a long time.


The casual user doesn't know what any of the words in your post means.


The private memory column is likely a better measure when looking for memory hogs.
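On Linux, a rough version of that private-vs-shared split can be read straight from /proc; this is only an illustrative sketch of what a "private memory" column computes (on OS X you would reach for vmmap(1) instead, and the field names below are Linux-specific):

```python
def rss_breakdown(pid="self"):
    """Split a process's RSS (in kB) into anonymous (private) pages and
    file-backed pages (mostly shared libraries and mmap'd files)."""
    fields = {}
    with open(f"/proc/{pid}/status") as f:
        for line in f:
            key, _, rest = line.partition(":")
            if key in ("VmRSS", "RssAnon", "RssFile", "RssShmem"):
                fields[key] = int(rest.split()[0])  # values are in kB
    return fields

print(rss_breakdown())  # e.g. {'VmRSS': ..., 'RssAnon': ..., 'RssFile': ...}
```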


I've never had this happen, and I run Mac OS X on a MacBook Pro 5,5 with 4 GB of RAM (Core 2 Duo at 2.33 GHz). This laptop is from mid-2009.

My default set of apps that I run:

1. Safari 2. Mail.app 3. Terminal.app 4. Adium 5. Twitter 6. iCal 7. X11 8. MacVim 9. GitX 10. Activity Monitor

I have no slowdowns, and I regularly compile the entire company's codebase on this little machine. Sure, it isn't fast compared to some of the newer monsters out there, but I have had no issues with slowdowns whatsoever.

That being said, I don't use Chrome, and I don't have Dropbox on my work laptop.

I keep hearing about people having really bad slow downs and I just don't understand it. Mac OS X has never been anything but "Just Works" for me.

11:29 up 69 days, 17:34, 11 users, load averages: 1.38 1.24 1.18


Aside from that, OS X works really fine for me, although I don't use it for serious work, which for me is programming in C++, Python, and R, since I work as a statistician. I will upgrade the memory to 8GB and see if things get better.


I can't directly answer the question, but I get the feeling most memory issues have to do with an individual's use of their system, not the system itself. I have a 2009 iMac and a little netbook running Xubuntu. The iMac has 4GB of RAM; the netbook has just a single gig of RAM and not even a 2GHz processor. The netbook doesn't slow down on me, while the Mac is prone to freeze-ups. I could say "Linux is better than my Mac because it doesn't slow down," but that's not fair. They each have their own pros and cons, and I don't see one as better than the other; it's all a matter of preference.

When the slowdowns bother me, I look at usage. On the Mac I always have 4 spaces open (the dashboard is not set as a space on my Mac), and at all times I have the following apps open: Mail.app, Terminal.app, Chrome with 5 or more tabs, MAMP, CodeKit, iTunes, and Sublime Text. Photoshop is open a lot too, in addition to plenty of others that get opened at times. On the netbook I've got a terminal session, Chrome, and Sublime Text. That's it. It's no wonder the Mac slows down. So I'd say look at how you use the thing. No machine has unlimited performance, and when it comes to memory usage it's a lot like money: the more you have, the more you tend to spend, and you never seem to have enough.


I thought about that some time ago, but I use the laptop mainly to browse the web (never more than 10 tabs) and to log in via ssh to a Linux desktop. Maybe it is slow because of that, but the main problem I have with this is that I get no slowdowns running Linux and FreeBSD on the same computer and with the same use case.

This computer stays up for several days [1], and I suspect the main cause is Chrome leaking memory.

[1] Now: 20:16 up 43 days, 23:41, 2 users, load averages: 0.38 0.31 0.26


The parent told you he was using Linux and the Mac the same way.


But the "inactive" memory on OS X is a real problem.

If I malloc 7GB with 6GB free and 800 MB of inactive memory on my MBP, the system starts paging before it deallocates the inactive memory.

If I Ctrl-C the application, the active and inactive memory suddenly drop drastically, with inactive going down to something like 100 MB, so it obviously could have been given to the process wanting it before paging started.
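Part of what makes this confusing is that allocating and committing memory are separate events: malloc (or an anonymous mmap) only reserves address space, and physical pages are committed one page fault at a time as they're first written. A scaled-down sketch of such an allocation (64 MB instead of 7 GB; running a full-size version while watching vm_stat in another terminal is how you'd observe the paging-vs-inactive behavior described above):

```python
import mmap

PAGE = 4096
SIZE = 64 * 1024 * 1024          # scaled down from the 7 GB in the anecdote

buf = mmap.mmap(-1, SIZE)        # anonymous mapping: address space is
                                 # reserved, but nothing is resident yet
for off in range(0, SIZE, PAGE): # writing one byte per page triggers a
    buf[off] = 1                 # page fault and commits a physical page
buf.close()                      # on exit/Ctrl-C all pages go back at once
```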


+1. It is interesting that there are no complaints that "system A is slower than B". I don't understand why people bother with VM stats if there are no performance issues.


I agree with this. What's left is one guy's opinion, which makes it to the Hacker News front page. Next up: vim vs. emacs.


another uninformed rant (this is also an uninformed rant)



