
I'm surprised we didn't get any performance numbers. Either raw power or at least power efficiency and projected battery life improvements. Seeing as this is a major reason for the transition (according to them), it feels very weird.

They're shipping a 'Development Transition Kit' Mac mini with an A12Z this week, so it's not like the numbers are going to stay private for a long time. Even if there's an NDA, someone's bound to break it.



There's no indication that the A12Z will be the chip that ships to consumers at the end of the year. So honestly it'd be a bit out of character to boast about the specific performance numbers of a pre-release dev kit chip - especially when that chip has already had Geekbench run on it for a while: https://browser.geekbench.com/ios_devices/ipad-pro-12-9-inch...


Last time around, the dev kits had Pentium 4 processors but the Intel Macs that launched used Core and Core 2 processors— a totally different microarchitecture with drastically different performance and power characteristics. It's a pretty safe bet that the first ARM Macs will be using SoCs that are at least a generational improvement over the A12Z. The higher-power Macs that will probably be released toward the end of the transition will likely use chips that are more drastically different from what's in an iPad Pro.


I'm honestly pretty excited to see what Apple can deliver. The A12Z is passively cooled, yet is on par with the 10th gen i5 in the new 13 inch MBP: https://browser.geekbench.com/macs/macbook-pro-13-inch-mid-2... Just imagine what they can do with active cooling!


> The A12Z is passively cooled

While it's certainly possible the A12Z in the Developer Transition Kit is a true drop-in from the iPad, I would be surprised if it's not clocked significantly higher and actively cooled given the relaxed power and thermal restrictions.


I'm not disagreeing per se, but "on par" is really hard to say for sure, since you're benchmarking across CPU architectures.

And, one of the comparison points can only ever be run on a highly locked down black box of an OS.


I don't imagine the benchmarking software is being run through the App Store process, so does the OS really make that big of a difference in the results? I'd think that if anything, the restricted nature of iOS would lower the benchmarks.


It is run through the App Store process, and AR on iOS is an... eye-opening... experience when it comes to the performance characteristics of their CPUs.


I just meant, we know very little about how the underlying technology we're testing actually works—how it prioritizes cores and resources, how instructions get optimized under the hood, etc.


To be picky, that's the hardware, not the OS. And the same argument still applies: If we're flying blind, the benchmarks may underrepresent performance (although I think you're overestimating the opacity of the i-architecture).


> So honestly it'd be a bit out of character

This is Apple we're talking about. They boast about unverifiable performance claims in pretty much every product they announce...


But not before they announce it.


It was pretty much the same with PowerPC to Intel...

Steve Jobs demoed OSX first. Then he surprised everybody by revealing that OSX had lived a double life in a secret building for many years (with a photo), and that the demo had been running on an Intel Pentium 4 all morning. Nothing about performance.

There was also a developer system back then; an Intel Pentium 4 in a PowerMac case.

In a lot of ways, this is far more ambitious, and could mean a lot more for Apple long term, but...

... The one thing that hit me the most was how impressed I was with Apple back then, and how excited I was that a company could do this. Steve Jobs presented it really well, but this time it felt quite flat.

... I really wish they worked a bit on their showmanship. They rushed through so many small things, and the presentation felt unnatural. Like they have all over-rehearsed it, but are still reading while presenting (you could even see eye movements). It is just too smooth, too generic, and a bit too polished.

Please slow down, focus on only the most interesting bits, and give us time to digest it...


I think some of that stems from the fact that Steve Jobs was saying his own words, but everyone else is saying marketing's words. The best marketing could do with Steve was tell him he was using a competitor's brand name incorrectly. Everything else was his. So he could speak passionately in his own words.

Everyone else is acting, but they aren't actors.


Nailed it. With Steve Jobs his own passion and enthusiasm really shone through, whereas now we're seeing a rehearsed, scripted presentation. I don't think they're necessarily wrong to take this route, and I think with Apple's much bigger reach it's probably a fairly wise and safe bet, but it does sadden me a bit that we don't get Steve's showmanship anymore.


That wasn't their first processor change, either. It wasn't even the first one in the Mac line.

The Apple I and II were MOS 6502 machines, except for the Apple IIgs, which was a 65C816. Then the early Macs were 680x0 machines. Then PowerPC. Then Intel.

They looked at Intel chips for the iPhone and settled on Arm before launch. I wouldn't be surprised if some very brittle, early development version of iOS was running on an Intel mobile platform at some point.


Not only that, there were also early versions of iOS compiled for PowerPC, for running inside the iPhone simulator app on PowerPC-based Macs.


Apple is known for throwing in the towel early on backwards compatibility, though.


Your "buts" are dead-on. Everything felt so distant and inauthentic. They should require their execs and presenters not to read from somewhere else, and to do it live.


There is a pandemic going on. This fully online WWDC certainly affected the feeling as much as the content.


I don't understand why anyone would care about this. What difference could it possibly make? Are fewer people going to buy ARM Macbooks because the execs sounded a little wooden?


When I saw the PowerPC to Intel move, it felt like a company with ambition and vision who knew what they were doing with technology.

It was a confident CEO that used his own words to passionately live-demo the products his company was developing and selling. He almost apologetically told us that Apple had to make the change to deliver the notebooks he had promised two years before -- But couldn't with PowerPC. It made sense. And then, he showed us that all along Apple had the foresight to plan for this many years ahead.

It was inspiring, and I was really excited about it. As a user and computer scientist, it made me curious about OSX. As a developer I wanted to support their platform, and went on to work on iOS apps a couple of years later. Apple felt like the future.

This time, I feel unenthusiastic, and I'm wondering where to go next... despite the fact that I objectively think this has the potential to be far more significant.

Delivery with confidence and passion for the product always matters. A lot.


Meh. You're in tech. The CEO could be a dildo on a stick for all I care; the only thing that should matter is the products themselves. Otherwise you're just buying into a cult.


No...

I am sorry, but I really need confidence in the person leading the platform I am developing for. My income depends on it, so I need to feel confident that the platform will actually move forward in the right direction.

I don't have confidence that a dildo on a stick can make the right decisions... But what do I know... I suppose I have heard stranger things.


> He almost apologetically told us that Apple had to make the change to deliver the notebooks he had promised two years before -- But couldn't with PowerPC. It made sense. And then, he showed us that all along Apple had the foresight to plan for this many years ahead.

And what is the difference with the current switch?

I mean, apart from the presentation, which I do not care about.


I’m sorry, but to me what you just described is a sales pitch. I consider it vitally important to see through them, even when evaluating technical decisions.


It wasn't a surprise to anyone paying attention; NeXTSTEP had run on 68K/x86/Sparc/PA-RISC. Removing architecture support would have been remarkable.

What's important, for those paying attention, is that Apple promoted PowerPC emulation with the first x86 Macs in OS X 10.4 and then removed it after 10.6. If you think Apple won't screw you again, well, go ahead, it's your money.

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.


The first Intel Macs were shipped in January 2006. Rosetta was dropped with the release of 10.7 in July 2011. Five years' support for a discontinued architecture seems rather generous.


There was almost 10 years between 10.0 and 10.7:

https://en.wikipedia.org/wiki/MacOS_version_history#Releases

So if you're worried about not being able to run your Intel Mac OS X software in 2030, yeah, you should probably avoid these new ARM Macs.


(1) OS X 10.0 through 10.3 were released for PowerPC only. Apple first supported x86 in 10.4 and last supported PPC in 10.6.

(2) It's not 2030. If you're reading this in 2030, HN won't let you reply. That's a separate but related problem.

(3) "Time passed, so fuck you" is not a customer-centered philosophy. "Time passed, so I'm going to remove already-shipped capabilities" is a customer-centered philosophy only in the sense that it's centered on fucking the customer.

(4) I have an Apple IIe and a MacPro3,1 and a whole bunch in between: fool me once, shame on you; fool me fifteen times, shame on me.

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.


> [Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.

It’s extremely weird to me to assume that an opinion someone posts in an online message board will be heavily influenced by their stock portfolio.


I still think that comments here, which require some minimal creative effort and are attached to identifiable user names, are usually somewhat legitimate, and more likely to be from fanbois than financiers. Voting to amplify or silence perspectives, on the other hand, entirely lacks accountability.

(I'm currently taking bids on an HN account with 4575 interweb points.)


It seems to me that the reverse is more likely at the scale of most people on HN: people emotionally invested in a brand use disposable income to buy shares of the brand (or, the reverse).


It's the other way around, some opinions are intended to influence their stock portfolio value.


It wasn't just lack of performance numbers, there were no actual products announced. They would have had to tip their hand on a lot of info that is not helpful to customers or their ability to keep selling Intel stuff.

One big question, though, will be how this dev kit benchmarks against the current maxed-out Intel Mac mini. I'm curious if GPU performance beats the current Blackmagic eGPU (RX 580).


> there were no actual products announced.

Correct me if I'm wrong, but I think they also didn't announce any actual hardware when they introduced Intel Macs.

Interestingly, for the PowerPC->Intel transition, they also had a Developer Transition Kit: https://www.macstories.net/stories/this-is-not-a-product-the...


I believe the first Intel product announced was the [EDIT: iMac - see: https://en.wikipedia.org/wiki/IMac_(Intel-based)#1st_generat...] MacBook Pro at Macworld in January of 2006, long after it was announced at WWDC in 2005.

I think concerns about how Apple will handle the transition can generally be addressed by the relatively smooth transition from PPC to Intel. Apple has literally done this before.


Apple transitions CPU architectures every 10-15 years.

6502 -> 68k in 1984 via hard-cutover [edit: see cestith's reply, there's more to this story than I knew]

68k -> PPC in 1994 via emulation

PPC -> x86 in 2006 via Rosetta JIT translation

x86 -> ARM in 2020 via Rosetta 2 static recompilation

You could even argue the transition from Mac OS 9 to Mac OS X was a transition of similar magnitude (although solely on PowerPC processors), with Classic Environment running a full Mac OS 9 instance [1]

[1] https://en.wikipedia.org/wiki/List_of_macOS_components#Class...


I disagree that 6502 -> 68k was a "transition." The Apple II and Mac were two separate product lines. The three major early home computer companies (Apple, Atari, Commodore) all did this.


Though Apple did sell a card that allowed 6502 apps to run on a Macintosh.

https://en.wikipedia.org/wiki/Apple_IIe_Card


This is true, but note it was released in 1991, many years after the Mac's introduction. By that time, the Apple II was definitely on the way out. The last holdouts (schools...) probably needed encouragement.


They also had the IIgs which was the 65c816 in 16-bit mode. The Lisa was in 1983 before the Mac, and it was a 68k.


A 65c816 runs 6502 code natively.


Yes, in 8-bit mode. The IIgs runs with the processor in 16-bit mode from everything I've read about it. It might be able to swap modes to run older Apple software, but the IIgs is a 16-bit machine.


To be honest, this entire plan looks like copy-paste, kinda like how Star Wars Episode 7 is kinda a remake of A New Hope.

Rosetta 2, Universal 2, the DTK program, the demo on ARM... it's the same plan.
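Incidentally, "Universal 2" works the same way the original Universal binaries did: a fat Mach-O file carries one slice per architecture, listed in a small big-endian header. A minimal sketch of parsing such a header in Python — the CPU-type constants are the real Mach-O values, but the sample header bytes here are synthetic, not taken from an actual binary:

```python
import struct

FAT_MAGIC = 0xCAFEBABE        # big-endian magic of a universal (fat) Mach-O file
CPU_TYPE_X86_64 = 0x01000007  # Intel 64-bit slice
CPU_TYPE_ARM64 = 0x0100000C   # Apple silicon / AArch64 slice

def fat_cpu_types(data):
    """Return the CPU types listed in a fat Mach-O header (empty if not fat)."""
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic != FAT_MAGIC:
        return []  # thin binary, or not Mach-O at all
    # each fat_arch entry is 20 bytes: cputype, cpusubtype, offset, size, align
    return [struct.unpack_from(">IIIII", data, 8 + i * 20)[0] for i in range(nfat)]

# Synthetic two-slice header, like a Universal 2 app shipping x86_64 + arm64:
header = struct.pack(">II", FAT_MAGIC, 2)
header += struct.pack(">IIIII", CPU_TYPE_X86_64, 3, 0x4000, 0x1000, 14)
header += struct.pack(">IIIII", CPU_TYPE_ARM64, 0, 0x8000, 0x1000, 14)
print(fat_cpu_types(header) == [CPU_TYPE_X86_64, CPU_TYPE_ARM64])  # True
```

On a real Mac, `lipo` reports this information for actual binaries; the sketch just shows how little structure is involved.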


This is business, not some entertainment-value focused venture.

Apple did great last go-around, and they are leagues ahead of Microsoft or any other tech company in such a massive/successful platform shift.

This is their 3rd architecture shift, btw - 680x0 -> PowerPC -> Intel -> ARM. They practically wrote the book on how to do it right.


The last time it went very well, so following the same playbook seems a good idea.


Not divulging their hand may be a thing. But they could at least have said something (rehashed) about the A12Z: "it performs better than the CPUs currently shipping in the Mac mini by X% in Y benchmark".

I'm not intrinsically excited for a new Apple product, but if they could have told me "we can deliver 50% extra battery life in your new MacBook at comparable performance", that would build up some hype and maybe mindshare.

> not helpful to [...] their ability to keep selling Intel stuff.

I hope that that's it. If we're going through the pains of a platform transition, I'd like to get something out of it.


> Not divulging their hand may be a thing. But they could at least have

Let's say that the new numbers are mindblowingly good. So then what? Nobody buys anything from them until next year because they're all waiting? Yikes. This way, fewer people will be scared off from buying something right now instead of waiting.



> But they could at least have said something (rehashed) about the A12Z: "it performs better than the CPUs currently shipping in the Mac mini by X% in Y benchmark".

It's a kit to allow developers to prepare for transitioning their applications to ARM, for future retail macOS/ARM devices. It's not a new Mac mini, and it doesn't make sense to compare the retail machines to this dev kit (which is probably running a yet-to-be-fully-optimised OS).


I'm guessing they're not planning on releasing any A12Z products. They kept going on about how "scalable" their platform is. I'm betting they launch with a significantly more powerful processor (they could easily double core counts and up clock frequencies for a laptop-class processor) on a next-gen process (i.e. 5nm). They probably don't even know what the performance will be like yet.


I think the lack of hardware and lack of benchmarks are related. Apple doesn't know yet what the thermal throttle will be on an A12Z MacBook until they start testing the cooling system.


There is no reason why the RX 580 would not be supported on ARM, or why there would be any meaningful performance delta. AMD does not have any kind of "secret sauce" driver for it; it is simply LLVM targeted at that architecture, converting HLSL/GLSL/SPIR-V into the architecture-specific code.


It's an integrated GPU so it isn't going to compete against serious dedicated GPUs, and no one should expect that. I imagine much like existing Apple devices (and Windows laptops) with dedicated GPUs it will switch as necessary. But at least the integrated GPU will be better.


I can assure you that performance will not come anywhere close to an RX580, much less to a more modern Radeon 5700.


Where do you think it will fall? Do you think these will be compatible with eGPUs?


I think the 5700 series is eGPU compatible on macOS.

They fall short because Apple is orders of magnitude away from Nvidia and AMD in performance, plus this is a chip with a very limited TDP. I think they will land a bit slower than current AMD APUs.


    “PC guys are not going to just figure this out.
     They're not going to just walk in.”


I don't think they even used the term "ARM" at any point. They're calling it "Apple silicon," and they acknowledged it's the same as what the iPhone and iPad use. But I thought it was interesting how they seemed to avoid the term. It's probably just a matter of avoiding getting too "techy" in the marketing.


The first guy in the "lab" scene (Johny maybe?) mentioned plenty of other "techy" terms. I think that they want to distance themselves from other ARM manufacturers and put the focus on Apple's advantages over Qualcomm and others.


True. I think you're right.


ARM doesn't really matter very much to Apple - Apple designs the micro-architecture and many (most?) of the other SoC components themselves.

With the technology moves Apple has made, they could probably switch to RISC-V at this point. However, being able to use ARM devtools probably adds more value to Apple than any cost savings they would gain from moving away from ARM.


But the instruction set (ISA) absolutely matters for the entire software ecosystem!


No, the custom silicon matters more. They've spent years building infrastructure to make it easy to change late stage code generation to multiple ISAs.


It doesn't. Ninety-nine point five nines of software is architecture-independent, and if you're an App Store sharecropper you'll never notice. It's the users with paid-for x86 binaries who will be screwed, like they were when Apple removed the ability to run PowerPC binaries in OS X 10.7.
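The architecture-independence point is easy to demonstrate in any high-level language: the same source runs unmodified on x86 and ARM, and only deliberate introspection reveals the ISA. A minimal Python sketch (the printed values vary by host, of course):

```python
import platform
import struct

# The interpreter hides the ISA: this exact script runs unmodified on
# x86_64 and arm64, and only deliberate introspection reveals the host.
arch = platform.machine()        # e.g. 'x86_64' or 'arm64' -- varies by host
bits = struct.calcsize("P") * 8  # pointer width in bits

print(f"running on {arch} ({bits}-bit)")
```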

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.


Running Linux in a VM is essential for devs, so it's difficult to switch to an architecture that Linux doesn't support well.


They mentioned running Linux in a VM at least twice in the keynote. I'm not sure why, unless it's an acknowledgement that OS X is no longer a usable development environment.

Linux, like any OS written in the past 30 years, is substantially architecture-independent. My day job involves coding for several devices with Linux kernels on ARM (32bit) and Aarch64 and I have no idea which is which, nor any need to.

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.


I think mentioning Linux in a VM is their way of telling you that Microsoft Windows will no longer be supported on the new hardware.


Seems more likely an acknowledgement that Windows 10 with WSL2 is a threat to their developer market share.


I'm not saying ARM is not suitable.

Linux is able to support a new architecture, but that doesn't mean the ecosystem is going to support just any architecture. ARM is a great architecture for now.


ARM isn't new. Not even Aarch64 is new.

(‘ARM’ has become meaningless marketing drivel; there are physically existing pairs of 32-bit ‘ARM’ processors that have exactly zero physically existing machine instructions in common.)

[Dis]claimer: I have no long or short in AAPL. Anyone posting or voting in this thread should similarly disclose.


I don't know that Apple ever really talks about the fact that their chips are based on ARM during this kind of event.


It's funny how that works. This story may be only tangentially related, but here goes: A few years back, I was visiting the local ARM offices in Trondheim, Norway. I happened to mention something about the iPhones using their processors, and my host immediately said, “I am not allowed to comment on that”. But everybody knows it, I said. In response, he said yes, but he still can't talk about it. Possibly, he broke the rule by admitting even that much.


How strange. I mean, Xcode shows which architectures you build for. They even mention "armv6", "armv7", "arm64" etc in their own tech notes. https://developer.apple.com/support/required-device-capabili...


I don't think it's strange at all. Consider for a moment, what competitive advantage does Apple have advertising that the iPhone uses an ARM CPU? The people who need to know the architecture can find out easily enough, the people who are just buying the latest iPhone know it has an Apple CPU.


Everyone and their dog has known that Apple uses Gorilla Glass on their devices since the original iPhone. I believe it was only recently (this year) that Apple executives acknowledged the relationship in any capacity (one of their SVPs visited the Gorilla Glass factory with press).


FAANG NDAs can be crazy. I've been in a position where, yes, everyone knew something, and the other company openly talked about it, but because of the way the NDA was written there'd be heavy financial penalties for us to talk about what everyone seemed to know anyway.


Apple also uses Ember.js, but Ember is not allowed to advertise it on their site.


Maybe that leaves them open to switching the internal instruction set to RISC-V down the road without a naming issue.


I also thought this might be a 'dry run' for a RISC-V strategy across their platforms, as RISC-V was designed to be expanded for specific applications. Most of the IP is no doubt in the peripherals, which are largely CPU-instruction-set independent. We'll see. I say 5 years, tops, to the first Apple RISC-V device.


RISC-V doesn't really get Apple anything. They were a very very early investor in ARM back in the early 90s, and the rumor is that they have a pretty much "do whatever you want" licence to ARM technology.


But this is WWDC...


You might see it in the State of the Union, that's more aimed at developers.


Sure, the D stands for Developers, but how many of those developers care about things like CPU microarchitecture?

Aren't they mostly web/app developers?


...Why is this being downvoted?


Maybe people thought I was making some kind of value judgement about web developers not being "real developers" because they're not kernel hackers. That's just my guess though - I doubt any of the downvoters will see or reply to this.


It's because the A stands for Acorn and old rivalries die hard.


The A12Z is already faster than every tablet and phone in existence. Depending on the benchmark, it smokes about 80% of off-the-shelf PC hardware.

They really don't need to do that much.


I believe Apple can make processors for laptops and normal desktops the A-series way, but I'm curious how they'll make one for the Mac Pro. By adopting an architecture like chiplets?


I imagine they may use a mass of SoCs. Keep in mind that their laptop game plan may be to build CPUs to the max limits of the TDP and then use their (rather incredible) power management system to reduce power far below Intel chips for normal use. This is the iPad game plan: the chip is rarely ever fully hammered. You may see a 12-core MacBook Air capable of (unsustained) desktop Ryzen-level performance, but it will generally run as six low-power cores and one or two high-power cores, with bursts of insanity when needed. If you can pool a few of those together, it would be easy to beat Intel's server offerings.


Using a mass of SoCs effectively in a single machine isn't trivial. Even AMD goes for the simple solution on EPYC 7002.


Of course it isn't, but it's very, very valuable for them. Even if they make a custom chip that is the CPU minus the SoC portions.


Tim Cook did say at the end that they still have Intel hardware in the pipeline. Maybe the Mac Pro will stay Intel for a while?


Rumors have it that there's a new iMac with Intel processors.


They said they'd be using Intel chips for a while; it could be specifically in regard to the Mac Pro.

Even if Apple could make a great server/high-performance chip, that seems like possibly a lot of additional work for little gain in the market?

Keeping Intel chips for some computers may make more sense as a way of keeping their toes in the water with Intel, in case they need to rekindle that relationship for future projects down the road.


I also agree that keeping Intel for the Mac Pro is reasonable, but I heard that they plan to transition to their own chips within two years.


The goal is to transition in two years. Given Apple's record at the high end, I believe that goal is optimistic.

I would be astonished if an Ax chip can match the media creation performance of the Xeons in the MacPro and iMacPro any time soon. The top Xeon has 28 cores, and matching that is going to be an interesting challenge.

I think it's more likely there will be further splits in the range, with prosumer iMacs running Ax and no-compromise pro models staying on Xeon for at least the next few years.


Transition = all products have an option within 2 years.

It also means some products (13" MBP) migrate fully and become Apple-only, and some (Mac Pro) don't and keep both options.

Just reading the tea leaves.


You think those 2 years are also the number of macOS versions that will keep supporting Intel?


No way, in 2 years they will only stop shipping new Intel Macs but will continue supporting existing ones for a few years more.

The last PowerPC Macs were released in October 2005, and the first Intel-only Mac OS X was released in August 2009, almost 4 years later.


That's what happened last time:

Intel support was announced during OS X Tiger, and 10.4.4 was the first public release with support for x86. 10.7.0 was the first release without support for PPC. So 10.5 Leopard and 10.6 Snow Leopard were the two major releases with support for both Intel and PPC.

Now, Apple tends to provide security updates for at least a few years for each OS release, so I can envision recent Intel Macs getting security updates for another 4-5 years.


Up until last month I would've said they would keep things around much longer - I mean, they supported old iPods and iPhones far longer than their competitors...

But the killing of OpenGL and 32-bit software is making me wonder about their previously amazing commitment to supporting older things.


I suspect that killing 32-bit was a necessary step, as they aren't going to port deprecated features to new platforms.

They did some of this before Intel, I think with some of the transitional Mac OS -> OS X APIs.


Even if the goal to transition is 2 years that still means there will be a long tail of having to support Intel chips for future updates to the operating system. I have a 2011 Mac Pro that is chugging along perfectly fine with zero issues.


Mayhaps ARM will be for the non-pro lines, at least initially, aka the MacBook (Air?), then the iMac, and possibly later the MBPs and Mac Pro?


I was thinking they would put ARM into an "i" line of laptop and desktop machines. iBook and iMac naming maybe.


Possible; they've yet to drop the i-moniker of the iMac, after all, though resurrecting the iBook after so long, and while the MacBook remains "available", would be a bit odd.


I have to assume they'll just put a bunch of processors in there, go massively parallel.


For Mac Pro prices and TDP they certainly could put out 4-,8-,12-way boxes/mobos.


Massive NUMA... nightmares happen!


I'd expect that the high end Mac Pro is the last model to drop.


I am interested in seeing real-world performance. I agree that there isn't a lot that Apple needs to do here; the most curious bit will be how much they're able to automate behind the scenes with Rosetta to help out the development community. For most of my workload, I am sure it'll be completely transparent. The only bit that'll likely be less performant will be testing x86 code in virtual machines, but it isn't like I care too much about that performance. I'd take the 3+ hours on battery.


> I'm surprised we didn't get any performance numbers.

It makes sense that they wouldn't:

1. It's dev hardware, not final production hardware. They'll want to compare the A13 or A14 (or whatever SoC they finally use) for benchmarks.

2. Until the apps are optimized, it doesn't make sense for Apple to put itself in the position of bad press with premature figures.


3. The Osborne effect is avoided by not crowing about specs - those who are cautious or dependent on x86 will continue buying Intel kit. Once the new product is available and it kicks the daylights out of the legacy kit, they will already have their Pareto split of interest in their new offerings.


The A12Z is already shipping in the latest iPad Pro, so it's not like its performance is some unknown quantity: there are benchmarks aplenty... Although I guess this dev kit could run at a different frequency and have more/less memory bandwidth. Performance should still roughly be that of the iPad Pro.

The A12Z is itself only a small update on the A12X from 2018, so it's basically two years behind whatever will ship in actual ARM Macs this fall...


I expect the DTK performs better than the iPad Pro because it can supply more power and has more thermal budget (maybe it even has an active cooling fan).


It might be running at a somewhat higher frequency, because why not. But the DTK is not a production model, so I don't expect Apple to spend significant resources on it. After all the A12Z should be good enough as-is.

You have to consider that Apple engineers are busy designing new SoCs for the entire Mac line-up: optimizing the A12Z for the DTK is probably not their top priority at the moment. They will want to wow people with a new MacBook (Air? Pro? <blank>?) this fall: that SoC should be their priority...


The devkit has 16GB of RAM; no way it's not at least dual-channel memory.


Apple will almost assuredly have a dedicated event later this year where they will announce new hardware.

Apple has development kits running in modified Apple TVs. This is a chip that has essentially been out for a few years in iPad Pros. Why would Apple announce numbers based on this? It also assumes Apple will ship future laptops without fans or ports, which is how the development kit is coming out.

Apple will most likely have an A14X out later this year in at least one laptop. That's going to be significantly newer and more advanced than the A12Z in development kits.


The developer kit is in a Mac Mini. It has a full complement of the usual Mac Mini ports and, unless they've made major internal changes, a fan.


From what I understood, the transition is of a strategic nature: less dependency on third-party suppliers for vital components.

The performance and/or battery gains are almost incidental.


No, it's not less dependency on 3rd parties; it's more dependency on TSMC. Before, they were able to play Intel vs TSMC; that's not the case anymore. Then you add the geopolitical issue of Taiwan vs China, and the risk level keeps increasing.


>Before they were able to play Intel vs TSMC

How? Did Intel suddenly allow their CPUs to be manufactured by TSMC? Or, vice versa, agree to manufacture Apple's Ax chips?


I found the cost of the Transition Kit in the press release: $500. Not bad at all.


Probably a rental, based on the last transition.


It is indeed a rental, but on the upside, it's only half the cost of the Intel Transition Kit, which was $999.


When the Intel Transition Kit was returned, they gave the devs a first gen Intel iMac for free.


There is no Intel chip to pay for, so they can make it cheaper.


And only a tiny case and no huge cooler ;-)


> I'm surprised we didn't get any performance numbers.

It's the CPU in the iPad Pro; performance numbers are out there in the wild. The only big change is the RAM. This isn't a retail product, it's a developer kit. When they release retail Macs, I'm sure there will be some performance numbers.


There's still a huge difference in TDP. The iPad probably has a TDP of maybe 10 watts? The Mac mini's Intel CPUs have 65-watt TDPs. They can deliver more power and cooling to the A12Z than in an iPad, and it should result in much higher performance.


Less throttling is probably the big one here.


They said they are planning on making a family of SoCs for the Mac, though. I doubt we are going to see iPad-level processors in the actual ARM-based Macs they will sell.


They said the first ARM Macs are coming at the end of this year, so even if there's an NDA it won't be long until we see the real numbers.


I wonder if they're holding them for the actual hardware release in the fall? They could still be deciding the tradeoff between battery life and raw power.


Well, it appeared to be a developer version of the chip and not the final customer one.


Possibly. The developer kit is using an A12Z chip. It's possibly more a proof of concept than a tech demo of what's possible.


It's not a product release; it's an announcement of direction for the Mac product line and the macOS platform. Once they have hardware with ARM processors for purchase, they'll speak to the processor specs and how much better they are at power management.


They had to announce this early to allow developers to get ready. If they could have gotten away with not announcing early, they would have. Obviously (if all apps could automatically run natively on ARM without any developer involvement) they would have first announced this with an actual new Mac.

That, however, was not an option. So they have to tread carefully in what they say and they also have to be a bit careful about showing off too much.

They only had to tout the benefits of ARM insofar as to placate the fears of consumers (their Rosetta story plus virtualization story helped there) and to provide some reasonable justification to actually make devs at least a bit excited, even though they have to do additional work.

Plus: No ARM Mac (except the transition kit) currently exists. It’s not even clear if the first Mac they will announce is even finished yet, if only internally. And even if it is finished: Do you think going on stage now and talking about a new MacBook Air that has twice the performance and 50% more battery life than the current MacBook Air – oh, and you can get one in December – would be a good idea?

This is Apple’s tightrope walk to avoid too much of an Osborne effect. I think they are OK with some Osborne effect (if only because they know that even if no one buys an Intel Mac ever again during the two-year transition they will not go bankrupt, far from it), but you don’t have to provoke one, right?

I expect plenty of numbers and comparisons when they introduce the actual first ARM Mac.


It doesn't seem weird at all, and the fact that they're sending out devices on the A12Z seems to be intentional sandbagging: they know that people are going to benchmark and the results will likely be simply comparable to current hardware (from a performance perspective...energy efficiency will clearly be much better). When they release the actual devices, where their power and thermal profile is dramatically higher than an iPad Pro, it will actually wow.


That’s my takeaway too. But imagine the impact if the speed is as good or faster. They’re shipping with 16 gigs of RAM too, so it’s at least not the typical 8-gig minimum.


I would also expect this hardware to be nerfed compared to what actually goes out to customers. That way, no matter what is achieved on this, the real product will be better for real users.


Those numbers would be meaningless without knowing what the actual geometry of the internals will be, because cooling is a major limiting factor for laptop processor performance.


Don’t expect performance. The Intel DTK was Prescott-based (while AMD had great dual cores and Intel was lagging). Then Intel released its Core series, which started from mobile and had great performance.

I guess they did some homework before ditching Intel. The big question is whether they have enough headroom for manufacturing reliable chips with sustained high power.


Have they ever given performance numbers that weren't like "up to 400% faster" in selected tasks?


A12Z isn't a consumer chip for macOS. That's why you aren't going to get benchmarks from Apple.

They are also being very, very coy about what they have up their sleeves because of how small an upgrade the A13 was over the A12.

Same thing happened during the Intel transition. The first consumer chips were dual cores, but the DTK used Pentium 4s.


If the A12Z is based on the A12, they will never make consumer products with it but will instead use a new A14.


> I'm surprised we didn't get any performance numbers

That's not this presentation. When they actually ship hardware (transition kit not included), then they'll talk performance.


How do I get into the DTK program? We produce a compiler for macOS and obviously will need access to this.

I looked all over developer.apple.com and didn't find anything.



They are probably still working on ARM specific optimizations and benchmarks, and they prefer to wait to show their results, to maximize the impact.


We pretty much know it'll perform fine in native programs; the only real question is how well the x86 translation layer performs.


Why show performance numbers on a machine that will not be going to production?


Maybe it's about profits, not performance?


Why not both?


I was replying to the parent thread that observed no performance data was shared. If performance were, say, the same or slightly worse, that would explain why Apple wouldn't release performance data. But they might still want their own chips if they increased profitability.


Ding-ding-ding. We have a winner.


The major reason for the transition is higher margins, plain and simple.

For customers, both average users and developers, it will be a pain with little to be gained.


Unlikely - being able to have full control of your roadmap is a huge strategic advantage. Profits and revenue are nice, but if Apple was interested in that they could dual source x86 from AMD and drive cost down.

You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs? It’s not just about profit margin.


> Profits and revenue are nice, but if Apple was interested in that they could dual source x86 from AMD and drive cost down.

If Apple could do that and play it to their advantage, they would have done so a long time ago.

Higher margins and profits are the drivers in the end. Strategic control is just a means they use to achieve that. It is a publicly traded company, after all.

> You don’t think companies like Oculus are envious of Apple’s flexibility from not having to rely on Qualcomm for their mobile SoCs?

I doubt Oculus cares given their goal. There are pros and cons of vertically integrating an entire company into one.


No doubt profit margins are a big factor in their thinking.

However, performance/watt matters too.


Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?

Apple has zero presence in data centers.

I read people here writing "double the battery life" without any source, but even if that were the case: I own a laptop that does 2 hours on battery. I use it to run models on a discrete GPU, so power efficiency goes out of the window anyway; doubling its battery life is really not achievable.

The other one can handle average workloads for 12 hours and weighs a bit more than a kilo. If it were smaller or lighter it would be a much worse laptop than it is (if it's too light it has no stability and you constantly fight to keep it steady on your desk).

Who needs more?


> Genuine question: unless you run a datacenter with thousands of CPUs, does it really matter?

I think it does. Other than double the battery life (which I wouldn't really need, but my Dad who travels a lot would absolutely love), the big thing is thermals (which were specifically mentioned in the keynote).

The biggest constraint on laptop performance is thermal throttling. That's why gaming laptops have huge bulky fans, and a current MacBook has pretty decent performance for short bursts, but if you run something (say, a compiler) at full throttle for a few minutes, it gets significantly throttled.

Better thermals (heat output is directly proportional to power usage) could well be the key to unlocking close-to-desktop performance in a laptop form factor. That could be a pretty big win for the MacBook Pro market.


> I'm surprised we didn't get any performance numbers.

The fact that all the demos were on their top-of-the-line, most expensive machines felt very weird to me. "Look at this amazing performance" would be great if the demo were on a MacBook Air.


It was a Pro monitor, but they explicitly said that the demos actually ran on the Developer Transition Kit powered by an A12Z...


They did have a Mac Pro there as a prop, which was interesting and potentially confusing. There was a brief moment where you could see that it was connected to something in a Mac mini case.


Yeah it was done so they could pull the switcheroo later and reveal that everything seen so far was on an ARM Mac


Yep. I don’t think they needed the Mac Pro as a prop though.

