I'm surprised we didn't get any performance numbers. Either raw power or at least power efficiency and projected battery life improvements. Seeing as this is a major reason for the transition (according to them), it feels very weird.
They're shipping a 'Development Transition Kit' Mac mini with an A12Z this week, so it's not like the numbers are going to stay private for a long time. Even if there's an NDA, someone's bound to break it.
There's no indication that the A12Z will be the chip that ships to consumers at the end of the year. So honestly it'd be a bit out of character to boast about the specific performance numbers of a pre-release dev kit chip - especially when that chip has already had Geekbench run on it for a while: https://browser.geekbench.com/ios_devices/ipad-pro-12-9-inch...
Last time around, the dev kits had Pentium 4 processors but the Intel Macs that launched used Core and Core 2 processors— a totally different microarchitecture with drastically different performance and power characteristics. It's a pretty safe bet that the first ARM Macs will be using SoCs that are at least a generational improvement over the A12Z. The higher-power Macs that will probably be released toward the end of the transition will likely use chips that are more drastically different from what's in an iPad Pro.
I'm honestly pretty excited to see what Apple can deliver. The A12Z is passively cooled, yet is on par with the 10th gen i5 in the new 13 inch MBP: https://browser.geekbench.com/macs/macbook-pro-13-inch-mid-2... Just imagine what they can do with active cooling!
While it's certainly possible the A12Z in the Developer Transition Kit is a true drop-in from the iPad, I would be surprised if it's not clocked significantly higher and actively cooled given the relaxed power and thermal restrictions.
I don't imagine the benchmarking software is being run through the App store process, so does the OS really make that big of a difference in the results? I'd think that if anything, the restricted nature of iOS would lower the benchmarks.
I just meant, we know very little about how the underlying technology we're testing actually works: how it prioritizes cores and resources, how instructions get optimized under the hood, etc.
To be picky, that's the hardware, not the OS. And the same argument still applies: If we're flying blind, the benchmarks may underrepresent performance (although I think you're overestimating the opacity of the i-architecture).
It was pretty much the same with PowerPC to Intel...
Steve Jobs demoed OSX first. Then he surprised everybody by saying that OSX had lived a double life in a secret building for many years (with a photo), and that the machine he'd been demoing on all morning was running an Intel Pentium 4. Nothing about performance.
There was also a developer system back then; an Intel Pentium 4 in a PowerMac case.
In a lot of ways, this is far more ambitious, and could mean a lot more for Apple long term, but...
... The one thing that hit me the most was how impressed I was with Apple back then, and how excited I was that a company could do this. Steve Jobs presented it really well, but this time it felt quite flat.
... I really wish they worked a bit on their showmanship. They rushed through so many small things, and the presentation felt unnatural. Like they've all over-rehearsed it but are still reading while presenting (you could even see eye movements). It is just too smooth, too generic, and a bit too polished.
Please slow down, focus on only the most interesting bits, and give us time to digest it...
I think some of that stems from the fact that Steve Jobs was saying his own words, but everyone else is saying marketing's words. The most marketing could do to Steve was tell him he was using a competitor's brand name incorrectly. Everything else was his. So he could speak passionately in his own words.
Nailed it. With Steve Jobs his own passion and enthusiasm really shone through, whereas now we're seeing a rehearsed, scripted presentation. I don't think they're necessarily wrong to take this route, and I think with Apple's much bigger reach it's probably a fairly wise and safe bet, but it does sadden me a bit that we don't get Steve's showmanship anymore.
That wasn't their first processor change, either. It wasn't even the first in the Mac line.
The Apple I and II were MOS 6502 machines except for the Apple IIgs which was a 65c816. Then the early Macs were 680x0 machines. Then PowerPC. Then Intel.
They looked at Intel chips for the iPhone and settled on Arm before launch. I wouldn't be surprised if some very brittle, early development version of iOS was running on an Intel mobile platform at some point.
Your "buts" are dead-on. Everything felt so distant and inauthentic. They should require their execs and presenters to present live without reading from a script.
I don't understand why anyone would care about this. What difference could it possibly make? Are fewer people going to buy ARM Macbooks because the execs sounded a little wooden?
When I saw the PowerPC to Intel move, it felt like a company with ambition and vision who knew what they were doing with technology.
It was a confident CEO that used his own words to passionately live-demo the products his company was developing and selling. He almost apologetically told us that Apple had to make the change to deliver the notebooks he had promised two years before -- But couldn't with PowerPC. It made sense. And then, he showed us that all along Apple had the foresight to plan for this many years ahead.
It was inspiring, and I was really excited about it. As a user and computer scientist, it made me curious about OSX. As a developer I wanted to support their platform, and went on to work on iOS apps a couple of years later. Apple felt like the future.
This time, I feel unenthusiastic and wonder where to go next... despite the fact that I objectively think this has the potential to be far more significant.
Delivery with confidence and passion for the product always matters. A lot.
Meh. You're in tech. The CEO could be a dildo on a stick for all I care; the only things that should matter are the products themselves. Otherwise you're just buying into a cult.
I am sorry, but I really need confidence in the person leading the platform I am developing for. My income depends on it, so I need to feel confident that the platform will actually move forward in the right direction.
I don't have confidence that a dildo on a stick can make the right decisions... But what do I know... I suppose, I have heard stranger things.
> He almost apologetically told us that Apple had to make the change to deliver the notebooks he had promised two years before -- But couldn't with PowerPC. It made sense. And then, he showed us that all along Apple had the foresight to plan for this many years ahead.
And what is the difference with the current switch?
I mean, apart from the presentation, which I do not care about.
I’m sorry, but to me what you just described is a sales pitch. I consider it vitally important to see through them, even when evaluating technical decisions.
It wasn't a surprise to anyone paying attention; NeXTSTEP had run on 68K/x86/Sparc/PA-RISC. Removing architecture support would have been remarkable.
What's important, for those paying attention, is that Apple promoted PowerPC emulation with the first x86 Macs in OS X 10.4 and then removed it after 10.6. If you think Apple won't screw you again, well, go ahead, it's your money.
[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.
The first Intel Macs were shipped in January 2006. Rosetta was dropped with the release of 10.7 in July 2011. Five years' support for a discontinued architecture seems rather generous.
(1) OS X 10.0 through 10.3 were released for PowerPC only. Apple first supported x86 in 10.4 and last supported PPC in 10.6.
(2) It's not 2030. If you're reading this in 2030, HN won't let you reply. That's a separate but related problem.
(3) "Time passed, so fuck you" is not a customer-centered philosophy. "Time passed, so I'm going to remove already-shipped capabilities" is customer-centered only in the sense that it's centered on fucking the customer.
(4) I have an Apple IIe and a MacPro3,1 and a whole bunch in between: fool me once, shame on you; fool me fifteen times, shame on me.
I still think that comments here, which require some minimal creative effort and are attached to identifiable user names, are usually somewhat legitimate, and more likely to be from fanbois than financiers. Voting to amplify or silence perspectives, on the other hand, entirely lacks accountability.
(I'm currently taking bids on an HN account with 4575 interweb points.)
It seems to me that the reverse is more likely at the scale of most people on HN: people emotionally invested in a brand use disposable income to buy shares of the brand (or, the reverse).
It wasn't just lack of performance numbers, there were no actual products announced. They would have had to tip their hand on a lot of info that is not helpful to customers or their ability to keep selling Intel stuff.
One big question, though, will be how this dev kit benchmarks against the current maxed-out Intel Mac mini. I'm curious whether GPU performance beats the current Blackmagic eGPU (RX 580).
I think concerns about how Apple will handle the transition can generally be addressed by the relatively smooth transition from PPC to Intel. Apple has literally done this before.
Apple transitions CPU architectures every 10-15 years.
6502 -> 68k in 1984 via hard-cutover [edit: see cestith's reply, there's more to this story than I knew]
68k -> PPC in 1994 via emulation
PPC -> x86 in 2006 via Rosetta JIT translation
x86 -> ARM in 2020 via Rosetta 2 static recompilation
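Alongside Rosetta 2, this transition leans on universal ("fat") Mach-O binaries that carry an x86_64 slice and an arm64 slice in one file. As an illustrative sketch (header constants come from Apple's `<mach-o/fat.h>`; the bytes below are synthetic, not a real binary), the fat header can be parsed with nothing but the standard `struct` module:

```python
import struct

# Mach-O fat header constants (from Apple's <mach-o/fat.h> / <mach/machine.h>)
FAT_MAGIC = 0xCAFEBABE          # big-endian magic for a universal binary
CPU_TYPE_X86_64 = 0x01000007
CPU_TYPE_ARM64 = 0x0100000C

CPU_NAMES = {CPU_TYPE_X86_64: "x86_64", CPU_TYPE_ARM64: "arm64"}

def parse_fat_header(data: bytes):
    """Return the list of architecture names in a fat (universal) binary header."""
    magic, nfat_arch = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        raise ValueError("not a fat binary")
    archs = []
    for i in range(nfat_arch):
        # each fat_arch record: cputype, cpusubtype, offset, size, align (big-endian)
        cputype, _, _, _, _ = struct.unpack_from(">IIIII", data, 8 + i * 20)
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
    return archs

# Synthetic two-slice header for illustration (offsets/sizes are made up)
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">IIIII", CPU_TYPE_X86_64, 3, 0x4000, 0x1000, 14)
header += struct.pack(">IIIII", CPU_TYPE_ARM64, 0, 0x8000, 0x1000, 14)
print(parse_fat_header(header))  # ['x86_64', 'arm64']
```

On a real Mac, `lipo -archs <binary>` reports the same information.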
You could even argue the transition from Mac OS 9 to Mac OS X was a transition of similar magnitude (although solely on PowerPC processors), with the Classic Environment running a full Mac OS 9 instance. [1]
I disagree that 6502 -> 68k was a "transition." The Apple II and Mac were two separate product lines. The three major early home computer companies (Apple, Atari, Commodore) all did this.
This is true, but note it was released in 1991, many years after the Mac's introduction. By that time, the Apple II was definitely on the way out. The last hold outs (schools...) probably needed encouragement.
Yes, in 8-bit mode. The IIgs runs with the processor in 16-bit mode from everything I've read about it. It might be able to swap modes to run older Apple software, but the IIgs is a 16-bit machine.
Not divulging their hand may be a thing. But they could at least have said something (rehashed) about the A12Z: "it performs better than the CPUs currently shipping in the Mac mini by X% in Y benchmark".
I'm not intrinsically excited for a new Apple product, but if they could have told me "we can deliver 50% extra battery life in your new MacBook at comparable performance", that would build up some hype and maybe mindshare.
> not helpful to [...] their ability to keep selling Intel stuff.
I hope that that's it. If we're going through the pains of a platform transition, I'd like to get something out of it.
> Not divulging their hand may be a thing. But they could at least have
Let's say that the new numbers are mind-blowingly good. So then what? Nobody buys anything from them until next year because they're all waiting? Yikes. This way, fewer people will be scared off from buying something right now instead of waiting.
> But they could at least have said something (rehashed) about the A12Z: "it performs better than the CPUs currently shipping in the Mac mini by X% in Y benchmark".
It's a kit to allow developers to prepare for transitioning their applications to ARM, ahead of future retail macOS/ARM devices. It's not a new Mac mini, and it doesn't make sense to compare the retail machines to this dev kit (which is probably running a yet-to-be-fully-optimised OS).
I'm guessing they're not planning on releasing any A12Z products. They kept going on about how "scalable" their platform is. I'm betting they launch with a significantly more powerful processor (they could easily double core counts and up clock frequencies for a laptop-class processor) on a next-gen process (i.e. 5nm). They probably don't even know what the performance will be like yet.
I think the lack of hardware and lack of benchmarks are related. Apple doesn't know yet what the thermal throttle will be on an A12Z MacBook until they start testing the cooling system.
There is no reason why rx580 would not be supported on ARM or why there would be any meaningful performance delta. AMD does not have any kind of “secret-sauce” driver for that, it is simply LLVM targeted to that architecture that converts HLSL/GLSL/SPIR-V into the architecture specific code.
It's an integrated GPU so it isn't going to compete against serious dedicated GPUs, and no one should expect that. I imagine much like existing Apple devices (and Windows laptops) with dedicated GPUs it will switch as necessary. But at least the integrated GPU will be better.
I think the 5700 series is eGPU compatible on macOS.
These fall short because Apple is orders of magnitude behind Nvidia and AMD in GPU performance, plus this is a chip with a very limited TDP. I think they will be a bit slower than current AMD APUs.
I don't think they even used the term "ARM" at any point. They're calling it "Apple silicon", and they acknowledged it's the same as what the iPhone and iPad use. But I thought it was interesting how they seemed to avoid the term. It's probably just marketing: avoiding getting too "techy".
The first guy in the "lab" scene (Johny maybe?) mentioned plenty of other "techy" terms. I think that they want to distance themselves from other ARM manufacturers and put the focus on Apple's advantages over Qualcomm and others.
ARM doesn't really matter very much to Apple - Apple designs the micro-architecture and many (most?) of the other SOC components themselves.
With the technology moves Apple has made, they could probably switch to RISC-V at this point. However, being able to use ARM devtools probably adds more value to Apple than any cost savings they would gain from moving away from ARM.
No, the custom silicon matters more. They've spent years building infrastructure to make it easy to change late stage code generation to multiple ISAs.
It doesn't. Ninety-nine point five nines of software is architecture-independent, and if you're an App Store sharecropper you'll never notice. It's the users with paid-for x86 binaries who will be screwed, like they were when Apple removed the ability to run PowerPC binaries in OS X 10.7.
They mentioned running Linux in a VM at least twice in the keynote. I'm not sure why, unless it's an acknowledgement that OS X is no longer a usable development environment.
Linux, like any OS written in the past 30 years, is substantially architecture-independent. My day job involves coding for several devices with Linux kernels on ARM (32bit) and Aarch64 and I have no idea which is which, nor any need to.
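For the rare cases where architecture does matter, portable code queries the property it actually needs at runtime instead of hard-coding a CPU name. A minimal sketch using only the standard library:

```python
import platform
import struct
import sys

# Query the few properties that genuinely vary by architecture,
# rather than branching on a hard-coded CPU name.
arch = platform.machine()                # e.g. 'x86_64', 'aarch64', 'arm64'
pointer_bits = 8 * struct.calcsize("P")  # 32 or 64
endianness = sys.byteorder               # 'little' on both x86_64 and aarch64

print(arch, pointer_bits, endianness)
```

The same script runs unchanged on a 32-bit ARM board, an aarch64 server, or an x86_64 desktop, which is the point.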
(‘ARM’ has become meaningless marketing drivel; there are physically existing pairs of 32-bit ‘ARM’ processors that have exactly zero physically existing machine instructions in common.)
It's funny how that works. This story may be only tangentially related, but here goes: A few years back, I was visiting the local ARM offices in Trondheim, Norway. I happened to mention something about the iPhones using their processors, and my host immediately said, “I am not allowed to comment on that”. But everybody knows it, I said. In response, he said yes, but he still can't talk about it. Possibly, he broke the rule by admitting even that much.
I don't think it's strange at all. Consider for a moment: what competitive advantage does Apple gain from advertising that the iPhone uses an ARM CPU? The people who need to know the architecture can find out easily enough; the people who are just buying the latest iPhone know it has an Apple CPU.
Everyone and their dog has known that Apple uses Gorilla Glass on its devices since the original iPhone. I believe it was only recently (this year) that Apple executives acknowledged the relationship in any capacity (one of their SVPs visited the Gorilla Glass factory with press).
FAANG NDAs can be crazy. I've been in a position where, yes, everyone knew something, and the other company openly talked about it, but because of the way the NDA was written there'd be heavy financial penalties for us to talk about what everyone seemed to know anyway.
I also thought this might be a "dry run" for a RISC-V strategy across their platforms, as RISC-V was designed to be extended for specific applications. Most of the IP is no doubt in the peripherals, which are largely independent of the CPU instruction set. We'll see. I say 5 years, tops, to the first Apple RISC-V device.
RISC-V doesn't really get Apple anything. They were a very very early investor in ARM back in the early 90s, and the rumor is that they have a pretty much "do whatever you want" licence to ARM technology.
Maybe people thought I was making some kind of value judgement about web developers not being "real developers" because they're not kernel hackers. That's just my guess though - I doubt any of the downvoters will see or reply to this.
I believe Apple can make processors for laptops and ordinary desktops the way they build the A-series, but I'm curious how they'll do it for the Mac Pro. By adopting a chiplet-style architecture?
I imagine they may use a large number of SoCs. Keep in mind that their laptop game plan may be to build CPUs up to the limits of the TDP and then use their (rather incredible) power management system to reduce power far below Intel chips for normal use. This is the iPad game plan: the chip is rarely ever fully hammered. You may see a 12-core MacBook Air capable of (unsustained) desktop Ryzen-level performance, but it will generally be run as 6 low-power cores and one or two high-power cores, with bursts of insanity when needed. If you can pool a few of those together, it would be easy to beat Intel's server offerings.
They said they'd be using Intel chips for a while; that could be specifically in regard to the Mac Pro.
Even if Apple could make a great server/high-performance chip, that seems like a lot of additional work for little gain in the market.
Keeping Intel chips in some computers may make more sense as a way of keeping a toe in the water with Intel, in case they need to rekindle that relationship for future projects down the road.
The goal is to transition in two years. Given Apple's record at the high end, I believe that goal is optimistic.
I would be astonished if an Ax chip can match the media creation performance of the Xeons in the Mac Pro and iMac Pro any time soon. The top Xeon has 28 cores, and matching that is going to be an interesting challenge.
I think it's more likely there will be further splits in the range, with prosumer iMacs running Ax chips and no-compromise pro models staying on Xeon for at least the next few years.
Intel support was announced during OS X Tiger, and 10.4.4 was the first public release with support for x86. 10.7.0 was the first release without support for PPC. So 10.5 Leopard and 10.6 Snow Leopard were the two major releases with support for both Intel and PPC.
Now, Apple tends to provide security updates for at least a few years for each OS release, so I can envision recent Intel Macs getting security updates for another 4-5 years.
Up until last month I would've said they would keep things around much longer - I mean, they supported old iPods and iPhones far longer than their competitors...
But the killing of OpenGL and 32-bit software is making me wonder about their previously amazing commitment to supporting older things.
Even if the goal to transition is 2 years that still means there will be a long tail of having to support Intel chips for future updates to the operating system. I have a 2011 Mac Pro that is chugging along perfectly fine with zero issues.
Possible; they've yet to drop the i-moniker of the iMac, after all, though resurrecting the iBook after so long, and while the MacBook remains "available", would be a bit odd.
I am interested in seeing real-world performance. I agree that there isn't a lot that Apple needs to do here; the most curious bit will be how much they're able to automate behind the scenes with Rosetta to help out the development community. For most of my workload I am sure it'll be completely transparent. The only bits that'll likely be less performant will be testing in x86 virtual machines, but it isn't like I care too much about that performance. I'd take the 3+ hours on battery.
3. Osborne effect avoided by not crowing about specs: those who are cautious or dependent on x86 will continue buying Intel kit. Once the new product is available and it kicks the daylights out of the legacy hardware, they will already have their Pareto split of interest in the new offerings.
The A12Z is already shipping in the latest iPad Pro, so it's not like its performance is some unknown quantity: benchmarks are aplenty... Although I guess this dev kit could run at a different frequency, and have more/less memory bandwidth. Performance should still roughly be that of the iPad Pro.
The A12Z is itself only a small update on the A12X from 2018, so it's basically two years behind whatever will ship in actual ARM Macs this fall...
It might be running at a somewhat higher frequency, because why not. But the DTK is not a production model, so I don't expect Apple to spend significant resources on it. After all the A12Z should be good enough as-is.
You have to consider that Apple engineers are busy designing new SoCs for the entire Mac line-up: optimizing the A12Z for the DTK is probably not their top priority at the moment. They will want to wow people with a new MacBook (Air? Pro? <blank>?) this fall: that SoC should be their priority...
Apple will almost assuredly have a dedicated event later this year where they will announce new hardware.
Apple has development kits running in modified Apple TVs. This is a chip that has essentially been out for a few years in iPad Pros. Why would Apple announce numbers based on this? It also assumes Apple will ship future laptops without fans or ports, which is how the development kit is coming out.
Apple will most likely have an A14X out later this year in at least one laptop. That's going to be significantly newer and more advanced than the A12Z in development kits.
No, it's not less dependency on third parties; it's more dependency on TSMC. Before, they were able to play Intel against TSMC; that's no longer the case. Then you add the geopolitical issue of Taiwan vs. China, and the risk level keeps increasing.
> I'm surprised we didn't get any performance numbers.
It's the CPU in the iPad Pro, performance numbers are out there in the wild. The only big change is the RAM. This isn't a retail product, it's a developer kit. When they release retail Macs I'm sure there will be some performance numbers.
There's still a huge difference in TDP. The iPad probably has a TDP of maybe 10 watts? The Mac mini's Intel CPUs have 65-watt TDPs. They can deliver more power and cooling to the A12Z than in an iPad, and that should result in much higher performance.
They said they are planning on making a family of SoC for the Mac though, I doubt we are going to see iPad-level processors in the actual ARM-based Macs they will sell.
I wonder if they're holding them for the actual hardware release in Fall? They could still be deciding the tradeoff between battery life and raw power.
It's not a product release - it's an announcement of direction for the Mac product line and the Mac OS platform. Once they have hardware with ARM processors for purchase they'll be speaking to the processor specs and how much better they are at power management.
They had to announce this early to allow developers to get ready. If they could have gotten away with not announcing early they would have. Obviously (if all apps would automatically run natively on ARM without any developer involvement) they would have first announced this with an actual new Mac.
That, however, was not an option. So they have to tread carefully in what they say and they also have to be a bit careful about showing off too much.
They only had to tout the benefits of ARM insofar as to placate the fears of consumers (their Rosetta story plus virtualization story helped there) and to provide some reasonable justification to actually make devs at least a bit excited, even though they have to do additional work.
Plus: No ARM Mac (except the transition kit) currently exists. It’s not even clear if the first Mac they will announce is even finished yet, if only internally. And even if it is finished: Do you think going on stage now and talking about a new MacBook Air that has twice the performance and 50% more battery life as the current MacBook Air – oh, and you can get one in December – would be a good idea?
This is Apple’s tightrope walk to avoid too much of an Osborne effect. I think they are ok with some Osborne effect (if only because they know that even if no one buys an Intel Mac ever again during the next two years transition time they will not go bankrupt, so far from it) but you don’t have to provoke one, right?
I expect plenty of numbers and comparisons when they introduce the actual first ARM Mac.
It doesn't seem weird at all, and the fact that they're sending out devices on the A12Z seems to be intentional sandbagging: they know that people are going to benchmark and the results will likely be simply comparable to current hardware (from a performance perspective...energy efficiency will clearly be much better). When they release the actual devices, where their power and thermal profile is dramatically higher than an iPad Pro, it will actually wow.
That’s my takeaway too. But imagine the impact if the speed is as good or better? They’re shipping with 16 GB of RAM too, so it’s at least not the typical 8 GB minimum.
I would also expect this hardware to be nerfed compared to what actually goes out to customers. That way, no matter what is achieved on it, the real machines will be better for real users.
Those numbers would be meaningless without knowing what the actual geometry of the internals will be, because cooling is a major limiting factor for laptop processor performance.
Don’t expect performance. The Intel DTK was Prescott-based (while AMD had great dual cores and Intel was lagging). Then they released their Core series, which started from mobile and had great performance.
I guess they did some homework before ditching Intel.
The big question is whether they have enough headroom to manufacture reliable chips with sustained high power.
I was replying to the parent thread that observed no performance data was shared. If performance were, say, the same or slightly worse, that would explain why Apple wouldn't release performance data. But they might still want their own chips if they increased profitability.
Unlikely - being able to have full control of your roadmap is a huge strategic advantage. Profits and revenue are nice, but if Apple was interested in that they could dual source x86 from AMD and drive cost down.
You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs? It’s not just about profit margin.
> Profits and revenue are nice, but if Apple was interested in that they could dual source x86 from AMD and drive cost down.
If Apple could do that and play it to their advantage, they would have done so a long time ago.
Higher margins and profits are the drivers in the end. Strategic control or not is just a way they use to achieve that. It is a publicly traded company, after all.
> You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs?
I doubt Oculus cares, given their goal. There are pros and cons to vertically integrating everything into one company.
Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?
Apple has zero presence in data centers.
I read people here writing "double the battery life" without any source, but even if that were the case: I own a laptop that does 2 hours on battery. I use it to run models on a discrete GPU, so power efficiency goes out of the window anyway; doubling the battery life is really not achievable.
The other one can handle average workloads for 12 hours and weighs a bit more than a kilo. If it were smaller or lighter, it would be a much worse laptop than it is (if it's too light it has no stability and you fight constantly to keep it steady on your desk).
> Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?
I think it does. Other than double the battery life (which I wouldn't really need, but my Dad who travels a lot would absolutely love), the big thing is thermals (which were specifically mentioned in the keynote).
The biggest constraint on Laptop performance is thermal throttling. That's why gaming laptops have huge bulky fans, and a current MacBook has pretty decent performance for short bursts, but if you are running something (say a compiler) at full throttle for a few minutes then it gets significantly throttled.
Better thermal performance (which is directly proportional to power usage) could well be the key to unlocking close-to-desktop performance in a laptop form-factor. Which could be a pretty big win for the MacBook Pro market.
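As a back-of-the-envelope illustration of why this matters: dynamic CPU power roughly follows the classic approximation P ≈ C·V²·f, so a chip that runs at lower voltage sheds far less heat at the same frequency, leaving more thermal headroom for sustained clocks. All numbers below are purely illustrative, not measurements of any real chip:

```python
# Toy model: dynamic CPU power scales roughly as P ≈ C * V^2 * f.
# The capacitance, voltage, and frequency values are illustrative only.

def dynamic_power(capacitance_nf: float, voltage: float, freq_ghz: float) -> float:
    """Approximate dynamic power (arbitrary units proportional to watts)."""
    return capacitance_nf * voltage ** 2 * freq_ghz

# Dropping voltage from 1.2 V to 0.9 V at the same 3 GHz clock...
p_high = dynamic_power(10.0, 1.2, 3.0)
p_low = dynamic_power(10.0, 0.9, 3.0)
print(f"{p_low / p_high:.2f}")  # 0.56 -> roughly 44% less heat to dissipate
```

Because the voltage term is squared, even modest voltage reductions buy a disproportionate amount of thermal headroom, which is exactly what sustained laptop performance is gated on.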
> I'm surprised we didn't get any performance numbers.
The fact that all the demos were on their top-of-the-line, most expensive machines felt very weird to me. "Look at this amazing performance" would be great if the demo were on a MacBook Air.
They did have a Mac Pro there as a prop, which was interesting and potentially confusing. There was a brief moment where you could see that it was connected to something in a Mac mini case.