Hacker News

Wow, that's actually a pretty big limitation. I guess it's tough to do 64 GB with their on-package unified memory.

I wonder if they're working on a version with discrete memory and GPU for the high end? They'll need it if they ever want to get Intel out of the Mac Pro.




This would seem to point toward a tiered RAM configuration that acts somewhat like Apple's old Fusion Drives: On-package RAM would be reserved for rapid and frequent read/write, while the OS would page to discrete RAM for lower priority. Discrete RAM would act as a sort of middle ground between on-package RAM and paging to the SSD.

Then again, maybe their in-house SSD controller is so blazing fast that the performance gains from this would, for most applications, be minimal.


Apple doesn't like writing to their SSDs much, to prevent wear-out.


Let's think about that a little bit. If the RAM is fast and the SSD is fast and the virtualization options are limited, then this is good enough?

Or, inspire me. Which processes really require occupying and allocating bigger blocks of RAM?

I personally don't want to purchase another machine with 16 GB of RAM, but that's mainly because I want the option of running a powerful Windows guest or two at the same time. But if you take that possibility off the table, for now, what if the paradigm has changed just a tad?


SSD latency is still several orders of magnitude higher than RAM latency. Having similar magnitudes of total throughput (bandwidth) isn't enough to make RAM and SSDs comparable and thus remove the need for more RAM. Random access latency with basically no queue depth is very important for total performance. Certainly, SSDs are far better at this than hard drives were... but, SSDs still serve a different purpose.

Intel's Optane SSDs are based on a completely different technology than other SSDs, and their low queue depth latency is significantly better than any other SSDs out there, but good luck talking Apple into using more Intel stuff just when they're trying to switch away, and even then... just having some more real RAM would be better for most of these creator/pro workloads.
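The latency gap the parent describes can be sketched with rough, commonly cited figures (the numbers below are illustrative assumptions, not measurements of any particular device):

```python
# Rough, typical random-access latencies (illustrative assumptions, in ns).
dram_latency_ns = 100          # ~100 ns for a DRAM access
nand_ssd_latency_ns = 100_000  # ~100 us for a NAND SSD random read
optane_latency_ns = 10_000     # ~10 us for a low-queue-depth Optane read

# Even with comparable bandwidth, the per-access gap is enormous.
print(nand_ssd_latency_ns / dram_latency_ns)  # NAND SSD: ~1000x slower than DRAM
print(optane_latency_ns / dram_latency_ns)    # Optane: ~100x slower than DRAM
```

That three-orders-of-magnitude gap at low queue depth is why more RAM still beats a faster SSD for random-access-heavy workloads.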


I have a project that won’t compile on systems with less than 32 GiB of RAM, and I refuse to refactor the hideously overgrown C++ template magic that landed me here.


I suspect "Apple silicon" will not really be very suitable for software engineering.


Their performance claims are the very essence of vague, but Apple sure seems certain it will be great for software engineering. I'm curious. I won't be convinced until we get some real data, but signs seem to point that way. What makes you strongly suspect it won't be great?

    Build code in Xcode up to 2.8x faster.
    
    [...]

    Compile four times as much code on a single charge, thanks to the game-changing performance per watt of the M1 chip.
source: https://www.apple.com/newsroom/2020/11/introducing-the-next-...

I have a hunch it will be adequate for single-threaded tasks and the real gains will come from multithreaded compilation, since its superior thermals should let it run all cores at full blast for longer periods without throttling, relative to the Intel silicon it replaces.


Not everyone's codebase is an over-bloated mess.


I've been building an over-bloated mess on Apple silicon for months now; it's been quite good at it actually.


For now, most developers use MacBooks, and tools like VS Code already have Apple Silicon builds.


Saying that "most developers use MacBooks" requires a very different understanding from mine of what the words "most" or "developers" mean.


You are downvoted, but you are right. In Asia (esp. India, Indonesia, the Philippines, and China), but also in the EU, I see 'most developers' walking around with PC laptops (Linux or Windows, mostly Windows of course). I would say that by a very large margin 'most developers' on earth use Windows machines.

The most vocal and visible (and rich) ones have MacBooks though; I guess that's where the idea comes from.


I'm guessing some took issue with the possibly poorly phrased "our understandings [...] of what [...] 'developers' mean [differ]". It can be read as me dismissing people who use Macs as "not real developers", but my intent was to counteract the opposite: people dismissing those who might be using Windows as "not real developers" on the grounds that otherwise they would be using Macs, which is circular logic I have heard expressed in the past. And I say that as someone who has used Macs at work for the past decade.


According to the Stack Overflow Survey 2019, 45% use Windows, 29% use macOS, and 25% use GNU+Linux.


It's actually from the 2020 survey; in 2019 Windows was at 47.5% and in 2018 at 49.9%. Unfortunately this metric apparently wasn't tracked in 2017, so we'll never be sure whether they were above 50% back then.

So currently the majority of responding devs use a (mostly) POSIX-compliant operating system. That is actual food for thought.

Sources: https://insights.stackoverflow.com/survey/2018#technology-_-... https://insights.stackoverflow.com/survey/2019#technology-_-... https://insights.stackoverflow.com/survey/2020#technology-de...


I suspect your suspicion is going to be very wrong.


    Which processes really require occupying and 
    allocating bigger blocks of RAM?
It's not uncommon to work with e.g. 50GB+ databases these days.

They don't always need to be in RAM, particularly with modern SSD performance, but if you're using a laptop for your day job and you work with largeish data...
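One common middle ground for such largeish data is memory-mapping: the OS pages data in from the SSD on demand, rather than the process allocating the whole dataset up front. A minimal stdlib sketch (the file here is tiny for illustration; the access pattern is what matters):

```python
import mmap
import os
import tempfile

# Stand-in for a large on-disk data file (4 pages of zeros).
path = os.path.join(tempfile.mkdtemp(), "data.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096 * 4)

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        # Random access touches only the pages actually read;
        # the rest stays on disk until (and unless) it's needed.
        chunk = m[4096:4096 + 16]
        print(len(chunk))  # 16
```

With fast NVMe storage this works tolerably well for scan-light workloads, though, as noted above, random-access-heavy workloads still pay the SSD latency penalty on every page fault.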


Virtualization and containers. Especially if you want to run an Electron based code editor next to it.


Containers on Mac still rely on virtualization, don't they? Will the new CPU architecture have native virtualization support? Because if not, I suspect the virtualization layer might break with the translations to and from x86, and/or take a pretty significant performance penalty.

A wild unsubstantiated guess of course, at this point (or rather, a worry of mine).


Containers on Mac still rely on virtualization, but at WWDC Apple showed demos of Docker for Mac running on Apple Silicon and of Parallels running a full Debian VM. Both were running ARM builds of Linux, and Apple added virtualization extensions to the new SoC so it doesn't have to resort to emulation (as long as the guest software is ARM and not x86).

https://developer.apple.com/documentation/hypervisor


They must be... there's no chance they're wiping out their Intel lineup with machines that max at 16GB of RAM. Especially not for the Mac Pro.


I suspect/hope they are for the 16” MacBook Pro, which is still Intel-based.


They only launched their lower-performance machines today: the Air, the Mini, and the two-port Pro.

So that’s the context to interpret the Ram they offer.


The context is that the Mac Mini previously supported 64GB of RAM. In fact it was a pretty powerful machine, with options for a six-core i7, 10Gb Ethernet, and, as I said, 64GB of RAM. Now it's neutered with GigE and 16GB of RAM, despite having much better CPU and GPU performance.


You can still buy an Intel Mac Mini with 64GB of RAM and 10Gb Ethernet. If you don't like the M1 Mini, buy the Intel one.


But then you get worse CPU and GPU performance. What a choice.



