> So, I’m predicting an MBP 13 - 16 range with an extra three hours of battery life+, and 20-30% faster.
I'm predicting the opposite: you won't actually see any difference.
Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU. CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains [1]. Not one thing that they demoed requires any meaningful CPU power.
Similarly, while ARM parts are more efficient than x86 per compute cycle, it's not a dramatic change.
The big changes, I think, are more mundane:
- Apple is going to save $200-$800 cost per Mac shipped
- Apple can start leaning on their specialized ML cores and accelerators. They will probably put that stuff in T2 for Intel Macs. If they're already shipping T2 on every machine, with a bunch of CPU cores, why not just make those CPU cores big enough for the main workload?
Doubling CPU perf is meaningless if you can ship the right accelerators that'll do 100x energy/perf for video compression, crypto and graphics.
[1] for a regular web browsing type user; obviously if you're compiling stuff this may not apply; if that is true you're almost certainly better off just getting a Linux desktop for the heavy lifting
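To put rough numbers on that claim, here's a back-of-the-envelope sketch; the 7 W system average and 1 W CPU share are illustrative assumptions for a light browsing workload, not measurements:

```python
# Back-of-the-envelope bound on battery-life gains from a more efficient CPU.
# The 7 W system average and 1 W CPU figure are assumptions, not measurements.

def battery_gain(total_avg_w, cpu_avg_w, cpu_power_cut):
    """Relative battery-life gain if CPU power drops by the given fraction."""
    new_total = total_avg_w - cpu_avg_w * cpu_power_cut
    return total_avg_w / new_total - 1.0

system_w, cpu_w = 7.0, 1.0   # light web browsing: display/GPU/WiFi dominate
print(f"CPU power halved:  {battery_gain(system_w, cpu_w, 0.5):.0%} more battery")  # ~8%
print(f"CPU power to zero: {battery_gain(system_w, cpu_w, 1.0):.0%} more battery")  # ~17%
```

Even the impossible zero-power CPU stays well under 30% with those assumptions; the bigger gains would have to come from the display, GPU, and radios.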
Apple can start leaning on their specialized ML
cores and accelerators
Thank you for mentioning this. I feel like many have missed it.
I think Apple sees this sort of thing as the future, and their true competitive advantage.
Most are focusing on Apple's potential edge over Intel when it comes to general compute performance/watt. Eventually Apple's likely to hit a wall there too though, like Intel.
Where Apple can really pull away is by leaning into custom compute units for specialized tasks. Apple, with its full vertical integration, will stand alone in the world here. Rather than hoping Intel's chips are good at the things it wants to do, it can specialize the silicon hard for the tasks it wants macOS to do in the future. It will potentially be a throwback to the Amiga days: a system with performance years ahead of competitors because of tight integration with custom hardware.
The questions are:
1. Will anybody notice? The initial ARM Macs may be underwhelming. I'm not sure the initial Mac ARM silicon will necessarily have a lot of special custom Mac-oriented compute goodies. And even if it does, I don't know Mac software will be taking full advantage of it from Day 1. It will take a few product cycles (i.e., years) for this to really bear fruit.
2. Will developers bother to exploit these capabilities as Apple surfaces them? Aside from some flagship content-creation apps, native Mac apps are not exactly flourishing.
1. If done correctly, non-Apple laptops may become significantly less attractive. Just like Android phones.
2. Intel may be in for a tough time, especially with AMD winning big on the console and laptop fronts recently.
3. AMD and Intel may have to compete for survival and to save the non-Apple ecosystem in general. If AMD/Intel can consistently and significantly beat Apple here, it may mean that the non-Apple ecosystem survives and even thrives. It may even mean that Apple looks at Intel/AMD as an option for Pro MacBooks in the future. However, this does seem a little less likely.
4. This could also herald the entry of Qualcomm and the likes into laptop territory.
Looks like a very interesting and possibly industry changing move. This could potentially severely affect Intel/AMD and Microsoft. And all these players will have to play this new scenario very carefully.
But isn't it just a matter of time until the novelty of smartphones wears off, they stop being très chic, and the cheap ones become 'good enough'? It might have taken decades, but eventually Ford bought Cadillac, Fiat bought Ferrari, and VW bought Porsche (and Bugatti and a few more).
Big difference is Ford, VW, et al had local dealer networks that not only fixed the cars, but turned the lessons and data learned in the fixing back into engineering improvements upstream. The net result of this is over a span of years Ford and VW buyers would see the product get better each time they bought a new one.
Android will always be a low budget product as a market, because it's run by Google. Google doesn't care about its customers at all, but for the data they generate and its impact on ad sales.
Every time a user opens the Google app store, they can expect it to be worse than the time they opened it previously. Every time an Android user buys a new device, it's a crap shoot what sort of hardware issues it will have, even if it's Google or Samsung branded.
Market share and attractiveness aren’t necessarily related. A Kia isn’t as attractive to its target customer as a Mercedes but outsells it because of price.
Much more interesting would be the CVS gearbox (which is THE Mercedes advantage), the TCU, the shifter, or the ECU.
100x better but also 100x more expensive. Will not happen. Worked in F1.
I do hope AMD does well here, as Apple's chips with all their custom silicon (T3, etc.) mean the death of booting anything but Apple-signed OS images on that hardware; forget Linux.
And that's not the future I am willing to buy into.
Thank you for expressing this. As much as I like Apple and the wonderful approach they have to design, something felt amiss. This is what I wanted to express.
I'm somewhat confused by this rhetorical question, since the microcode of a processor is vastly different from the userspace and kernel of macOS. Running an OS on bare metal versus in a VM on top of macOS differs across a wide array of things. At a minimum, performance is lower and less predictable in the VM; you now have two different OSes' updates to worry about breaking things, on top of their mutual interface (ask anyone who's done serious Linux dev work on a Mac); you have two different sets of security policies to worry about; the low-level tools to debug performance in a VM don't have the level of access they do on bare metal; and if you're working with hardware for Linux servers and devices in a VM, you are going to have to go bare metal sooner or later.
The abstractions are leaky; the VM is not a pristine environment floating on top of some vaguely well-defined architecture. The software inside it has two extra layers (the VM software and the host OS) between it and the actual platform, and all this is before you start hitting weird corner cases with CPU architecture differences in the layers.
Hmm, really? Since Windows 10 your desktop runs in a guest domain, while the kernel running the drivers is isolated.
Apple has provided this kit for about five years: https://developer.apple.com/documentation/hypervisor Yeah, you have to use the hardware drivers from Apple unless it also supports PCI passthrough; not sure, but with the current user base I guess nobody would do that anyway.
I expect Apple to eventually run their ring -1 off the T chip, with everything else behind a VM abstraction. It’s just the natural evolution of the UEFI approach, and Apple being themselves, they’re doing it “their way” without waiting for the crossfire-infested industry committees to play along.
Nope there was no mention of booting another OS. Craig talked about native virtualization that can be used to run Docker containers and other OS runtimes.
Do many people care about phone CPU performance? Sure, it needs to be good enough, but after that it's really far down on the list of things that matter.
What matters to everyone I know is screen size, camera quality and that a really small selection of apps (messaging, maps, email, browser, bank app) work well. Raw CPU performance is only a very abstract concept.
Raw CPU performance, perhaps not. But people definitely do care about a specific set of user-facing, ML-driven functionality - think speech recognition, speech synthesis, realtime video filtering, and so on.
Many of these are only barely possible on "pre-neural" mobile ARM CPUs, and at a significant cost to power consumption. Developing for newer devices is like night and day.
Google's speech recognition is damn impressive, but I'm talking performance/power consumption, not "quality". Sticking a 2080 into an iPhone won't give you better speech recognition results, but it will give you bad results faster.
> > Many of these are only barely possible on "pre-neural" mobile ARM CPUs
> Speech recognition on my old Pixel 2
I don't think the Pixel 2 can be called "pre-neural". "[...] The PVC is a fully programmable image, vision and AI multi-core domain-specific architecture (DSA) for mobile devices and in future for IoT.[2] It first appeared in the Google Pixel 2 and 2 XL [...]" https://en.wikipedia.org/wiki/Pixel_Visual_Core
When speech recognition starts understanding European Portuguese without me doing stupid accent exercises, and handles mixed-language sentences as well, then I will care about it.
> only one camera, just a 4.7-inch display, and less than Full HD screen resolution
CPU selection is likely driven by industrialization concerns: fewer production lines to maintain, lower price per unit at volume, etc. But they're going to beat that drum loud and proud for all it's worth, while the phone stays cheap in areas that in 2020 _do_ matter.
I know a couple of people trying to port an ML app to iOS. It sounds like the interfaces are a bit of a nightmare, and the support for really basic stuff in the audio domain is lacking.
I don't know the dev ecosystem for apple broadly, but this doesn't bode well for people "bothering to exploit" the hardware.
#1, who can say. #2 might be sidestepped by the compatibility with iOS apps they will gain? (Making it so all those iPhone/iPad developers can ship their apps to Macs, too.)
> I fully expect any reduction in costs for Apple will get sent to their shareholders, not the consumers.
Apple's margins are consistent, if their costs go down significantly, pricing comes down or features increase. The iPad is a perfect example, for years it was $500 and they just kept increasing the feature-set until eventually they could deliver the base product for significantly less.
Shareholders benefit from increased market share just as much as they do from increasing margins, arguably more. The base iPad and the iPhone SE both "cannibalize" their higher end products, but significantly expand their base. I wouldn't be surprised at all to see a $800 MacBook enter their lineup shipping with the same CPU as the iPad.
Considering they're selling a device with a 10.5" touchscreen and an A12 SoC for $500 today, I think they can go even lower than $800 for a device with only a slightly larger LCD and no digitizer.
While they won't be competing with Chromebooks for general education use cases, I could very well see Apple trying to upsell schools on a $599 alternative that happens to run GarageBand, iMovie, and even XCode.
Eh I don't see Apple selling their cheapest education MacBook for $600 instead of $900 simply because one of many components suddenly got significantly cheaper.
I can see them doing that for big volume buys for education. I don't see why they wouldn't just pass on the entire Intel margin to them, getting students using Apple products young has value.
Chromebooks are doing well in education at the moment. If Apple launched a product in that space, they could easily claw half of that back overnight. The ability to run real software is huge, especially for subjects like graphic arts and engineering.
> Considering they're selling a device with a 10.5" touchscreen and an A12 SoC for $500 today, I think they can go even lower than $800 for a device with only a slightly larger LCD and no digitizer.
While there is no digitizer, there is a keyboard and a touchpad. Also, I expect Apple is going to try to keep a gap between the base Mac and the iPad price-wise so they would add to the base storage and maybe RAM.
Then again, considering the pricing on the base iPad, maybe they will bring it down to $600.
Maybe if they take a bet on (or force) the App Store to be the primary method of obtaining software. I’d expect Apple forecasts some amount of yearly revenue per iPhone/iPad and a lower amount per MacBook.
Why do we need to buy so many devices anyway? Why can't I just plug my iPad or iPhone into a dumb dock with a laptop screen and its own storage and battery, and use the CPU and GPU from the phone/ipad?
I don't need VSCode, Docker, or node.js on my phone. I don't want all the clones of the various repositories I'm working on on my phone. Even the best phones lack the RAM, high capacity drives, and video card my computer has. Nor does it have a keyboard or trackpad.
If your phone is good enough to take care of your day to day computing, you can probably get by with an inexpensive all-in-one computer and save the headache of docking.
You'd be surprised how many people would like exactly this, interestingly. There are certainly enough of them to quite literally pay real money for a somewhat lousy facsimile of the real thing; I know from experience.
Then what is the point in docking at all? Now you have to keep track of what's on the dock and what's on the phone. Plus, by the time you integrate all this into a dock, you basically have something that costs as much as an inexpensive PC, so why bother?
You'll need something to connect all those dock components together so you don't have to run several cables to the phone. Something like a motherboard. So you'll have a full computer sans a cpu.
The Surface Book is exactly this: A (x64 Windows) tablet with a laptop dock that contains a stronger GPU and battery.
One problem is that people expect the CPU power of a laptop, which requires much more power and cooling than the typical tablet. As a consequence in tablet mode a Surface Book has about two hours of battery life.
So far: different architectures. But with this announcement it would make running macOS on a future (or even current) iPad quite feasible, so your kind of dock might become true soon. Apple's new magic iPad keyboards use a lot of weight to balance the heavy screen - might as well make that a battery.
When looking for IDEs or tooling on iOS I still have not found anything remotely professionally usable... (I mean Visual Studio + ReSharper-like, not VS Code...) but perhaps somebody could enlighten me...
Because a general purpose device is not good business sense for a company that sells devices. The more they can artificially specialize each thing, the more things you need to buy, and the more money they make. This is a much larger phenomenon than just Apple, or even computers.
An iPhone is a general purpose device compared to an iPod. But maybe Apple has lost the willingness to cannibalise its own sales for the sake of creating stunning new product categories.
You can plug in a USB dock into a lot of Android phones, and if you get a DisplayLink dock, you can add 2-3 monitors. Keyboard, mouse, sound, Ethernet all work with it too.
Unfortunately for high priced premium products, the increase in quality of basic products forces premium products to be better or fail.
Related to your example: $1 burgers are better than you would expect, and increasingly so. The difference between McDonald's midrange line and, say, a burger at a restaurant for $18 is negligible in flavor. I can no longer justify going to a restaurant and paying $18 plus tip for a burger.
Sure, to you. There's a whole lot of not you out there for whom the distinction is worth the price differential. That's true in both hamburgers and hardware. Needs, goals, and use cases differ significantly among people.
I'd argue that the functional difference between a Honda Fit and a Tesla is less than the difference between the best McDonald's hamburger and an $18 hamburger. That's why I drive a Honda Fit. In the face of Tesla's increasing sales it would be pretty strange to assert that my taste was somehow universal.
I would argue that many, perhaps most, people who won't eat a McDonald's hamburger because of some perceived lack of quality probably haven't had one in many years and are instead working off public perceptions and status indicators about what they think it represents and must be like.
And then we've come full circle to Apple products.
I'm a classically trained chef who tends to specialize in bar food. I know more about the marketing, creation, and perception of food than you do— you're wrong.
McDonald's has very high quality preparation standards. Their ingredients and techniques were constructed to facilitate their high-speed, high-consistency process, but prevent them from incorporating things that the overwhelming majority of burger consumers prefer.
For example, the extremely fine grind on the meat, the thin patty, the sweet bread, the singular cheese selection, the inability to get the patty cooked to specification, the lack of hard sear or crust and the maillardization that accompanies it, etc. etc. etc. At a minimum, people prefer juicier burgers with coarser, more loosely-packed texture, usually cooked to lower temperatures (though this depends on what part of the country you're in,) and the flavor and texture differential from a hard sear, be it on a flat top or grill, and toasted bread.
For consumers who, at least at that moment, have a use case that requires their food be cheap, fast, and available, well we know who the clear winner is.
In my new career as a software developer and designer, I use apple products. I am willing to pay for the reliable UNIXy system that can also natively run industry-standards graphics tools without futzing around with VMs and things, and do all that on great hardware. There will always be people who aren't going to compare bits pushed to dollars spent and are going to be willing to spend the extra few hundred bucks on a device they spend many hours a day interacting with.
This isn't about perception at all— Apple products meet my goals in a way that other products don't. If your goals involve saving a few hundred bucks on a laptop, then don't buy one. I really don't understand why people get so mad at Apple for selling the products that they sell.
> I know more about the marketing, creation, and perception of food than you do— you're wrong.
I don't doubt you know more about food. If you applied that knowledge to my actual point instead of what it appears you assumed my point was, this assertion might have been correct.
That's not entirely your fault; I was making a slightly different point than the existing conversation was arguing, so it's easy to bring that context into what I was trying to say and assume they were more related than they were.
The belittling way in which you responded though, that's all on you.
> This isn't about perception at all— Apple products meet my goals in a way that other products don't. If your goals involve saving a few hundred bucks on a laptop, then don't buy one. I really don't understand why people get so mad at Apple for selling the products that they sell.
My point, applied to this, would be to question what other products you've tried? My assertion is that people perceive other products to be maybe 50%-70% as good, when in reality they are probably closer to 85%-95% as good (if not better, in rare instances). That is a gap between perception and reality.
As applied to burgers, I was saying that people that refuse to eat at McDonald's because of quality probably have a very skewed perception of the actual differences in quality in a restaurant burger compared to a McDonald's burger.
I'm fully prepared to be wrong. I'm wrong all the time. I also don't see how anything you said really applies to my point, so I don't think you've really proven I'm wrong yet.
So you're creating metaphors that don't make sense using things that you have a limited understanding of to describe something you think you might be wrong about and getting annoyed that everybody else isn't following along with your deep conversational chess. Right then. I'm going to go ahead and opt out of this conversation.
Feel free. I simply made an observation that was loosely connected to the existing conversation and noted how it seemed to parallel something else.
I wasn't annoyed by you misunderstanding, I was annoyed by you misunderstanding, assuming you understood my position completely because it would more conveniently fit with your existing knowledge, and then using that assumed position to proclaim your superiority and my foolishness.
It's not about deep conversational chess on my part, it's about common decency and not assuming uncharitable positions of others by default on your part. A problem, I'll note, that you repeated in the last comment.
Just the mere perception of quality will increase your satisfaction levels. The perception of a lack of quality will reduce your satisfaction levels.
Thus I still maintain that your "perfect" $18 burger is only marginally better than McDonald's midrange burger. The fact that you actually have to spend time making that burger more appetising is proof that low-cost foods are getting better and better.
While focusing on my analogy, you literally prove my overall point.
30 years ago you weren't necessary, as low cost food wasn't nearly as good as today. Now - you have to exist to justify that premium.
I think you're reading more into my comment than what I actually said, possibly because of someone else's prior comment in this thread.
I was making a point less about McDonald's being equivalent to a restaurant burger and more about people's perceptions of McDonald's and how bad it is. That is, there's probably a lot less difference in the taste of those burgers than a lot of people want to admit.
The other aspect to consider is consistency. I had a $14 burger at a restaurant on Saturday that I would have been happy to swap in any single burger I've ordered from McDonald's in the last 12 months. You may not consider it high quality at McDonald's, but you have a pretty good idea what you're going to get.
All I'm really doing is making a point that there's a bit of fetishism about luxury items going on these days. Are Apple devices generally higher quality than many competitors? Yes. Is the difference in quality in line with most people's perception of the difference in quality? I don't think so.
I haven't had a McDonald's hamburger for many years. You are partly correct that it is because of my perception that it is trash. But when I walk by a McDonald's it doesn't smell like food to me anymore and smells more akin to garbage on a warm day.
> The difference between McDonald's midrange line and, say, a burger at a restaurant for $18 is negligible in flavor.
This may be the single worst analogy I've ever seen.
There is no amount of money you can pay at McDonalds to get a good quality burger.
I don't spend $18 on burgers, since there are a million places where you can pay $5-8 and get a damned good piece of beef. But not at McDonald's.
If the employees are doing it right, it’s not “that bad” of a burger. So, just pay the employees enough to actually care about the burger and it comes out decent.
I’ve eaten at McDonald’s around the world, it really depends but they do have good burgers when they’re cooked right.
It's not the employees. In different countries the entire recipe and production system is different. In many non-US countries, McDonald's is a more upscale "foreign" restaurant and far more expensive than in the US.
90% of these Mac silicon investments would directly benefit their iPhone cash cow—perhaps not this cycle, but certainly in the chips they'll put in future iPhones.
And the remaining 10% would indirectly benefit their iPhone cash cow in the form of keeping people inside the ecosystem.
The Mac silicon is inheriting the investments Apple made in the iPhone CPUs. This will continue. The bits Apple invests to make their existing hardware scale to desktops and high-end laptops won't benefit the iPhone much at all. On future-generation chips, Apple will spread the development costs over a few more units, but since the iPhone + iPad ship several times more units than the Mac, the bulk of the costs will be borne by them.
Indeed, Apple's G4 Cube debuted at $1,800 base in 2000. That's in the same ballpark as their iMac now, and their Mac mini starts at about half that. Meanwhile, inflation would have made that G4 ~$2,700 today.
Silver's gone up. Gold's gone up. Various fixed costs probably had to be further invested in, in the form of contracts with fabs or new fabs built, etc. But really the Mac mini is the closer modern likeness to the G4 Cube, and it now retails starting at $800, less than half the G4 Cube's starting price.
Edit: they also went from being a company with around 8,500 employees in 2000 to 137,000 today. Surely every part of their organization chart has contributed pressure to push up their prices to maintain revenue.
Since silver and gold are priced in USD, their prices are influenced by the actions of the US government. One of those driving forces is current US monetary policy (i.e., a growing budget deficit).
Another factor is perceived risk. Since the markets are always worrying about the current US China trade talks, that uncertainty helps gold and silver as they are seen as safe havens.
This is exactly what Apple has done with all of their products over the 30+ years. The iPad is a perfect example of Apple doing both over the past 10 years. Likewise the iPhone SE & the Apple Watch. It's done it with every product in their portfolio.
I'm actually not so sure about this. Apple's gross margin target is in the ballpark of 38-40% and a savings of $200-800 per MBP would have a substantial upwards impact on that gross margin number. Apple carefully sets pricing to achieve a target gross margin without impacting sales too much (higher price = higher gross margin but likely lower net revenue because they're priced out of the market).
One of two scenarios (or perhaps a mixture of both) is more likely, and I lean towards #1:
1. Apple decreases the price of the Mac to stay aligned with gross margin targets. This likely has a significant upwards impact on revenue, because a drop in price like this opens new markets who can now afford a Mac, increasing market share, driving new customers to their services, and adding a layer of lock-in for iPhone-but-not-Mac customers.
2. Apple uses the additional budget per device for more expensive parts/technology. They are struggling to incorporate technologies like OLED/mini-LED because of the significant cost of these displays and this would help open up those opportunities.
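To make scenario #1 concrete, here's a minimal sketch of the margin math, using the ~38% target from the comment above and a hypothetical $1,299 laptop with a $200 BOM saving (all numbers are assumptions for illustration only):

```python
# Hypothetical example: how far could the price drop while holding gross margin?
# The $1,299 price, 38% margin, and $200 saving are illustrative assumptions.

def price_at_margin(cost, margin):
    """Price p such that (p - cost) / p == margin."""
    return cost / (1.0 - margin)

margin, old_price, saving = 0.38, 1299.0, 200.0
old_cost = old_price * (1.0 - margin)          # ~$805 implied cost
new_price = price_at_margin(old_cost - saving, margin)
print(f"Same {margin:.0%} margin after a ${saving:.0f} saving: ~${new_price:.0f}")
# A $200 cost saving supports a price cut of roughly $320 at constant margin.
```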
The high price of MacBooks is treated as a status symbol and the marketing department clearly knows as much, so I don't think they will be willing to give that up, so I lean towards your second option.
Why not go down the same road with a MacBook SE? I even think this will be the first product out of the pipeline.
Mac Pro buyers usually don't want to be beta testers and will probably be the last to transition, and only once the horsepower is clearly there with measurable gains.
The iPhone SE is less a "cheap iPhone" and more "an expensive basic smartphone". Far more people upgrade to it from a low-cost Android device than downgrade from a different iPhone.
Bingo. Estimates vary wildly, but I've seen figures saying that Axx CPUs cost Apple about $50 each. Even if it's more like $100, that's still an insane amount of additional profit per unit to be extracted. They don't need to deal with single-supplier hassles and they get much more control over what cores go into their SoC.
This is sort-of-OK for consumers but amazing for Apple and its shareholders.
I suspect the big motivation for Apple is less about squeezing a few dollars more profit per system and more about shipping systems which just aren't possible on Intel's roadmap. Just putting the A12Z into the previous 12" MacBook would be a massively better computer with better battery life, better performance, and significantly less expensive. All while Apple maintains their margins.
This isn't a zero-sum game. Being able to ship less expensive computers which perform better is a win for consumers and Apple shareholders at the same time.
> Just putting the A12Z into the previous 12" MacBook would be a massively better computer with better battery life, better performance, and significantly less expensive. All while Apple maintains their margins.
Microsoft doesn't control the hardware very much and definitely doesn't control the software developers whereas Apple completely controls the former and has a lot of leverage with the latter.
You can see that with the latest MacBook Pro 13": if you buy one of the cheaper configurations, it comes with last year's processors. Intel is clearly having problems meeting customer demand.
But customers are going towards an entirely closed everything. iOS is Apple languages, Apple signatures required to run code, Apple processors. Desktop machines are the last bit of freedom in the Apple ecosystem.
This isn't "sort-of-ok", it's "bad-for-customers" and "bad-for-developers".
Why are you implying that they're going to lock down the Mac and make it some kind of iPad Pro? You'll still have complete control to run anything you want on the system. Running unsigned binaries is as simple as a right click on the app to open it on Mac. Or launch it from the command line with no prompt at all.
It looks like from the freedom end of things, the only thing that changes with ARM Macs is they're requiring notarization for kexts, and the fact that other OSes won't boot on the hardware since they don't have support for it. Unless anything changed, the T2 chip already killed linux support before?
This is just my opinion but I think it's great for consumers and a good restriction for developers.
As a consumer you shouldn't be running unsigned software because you're putting not only your data at risk but any data you have access to.
And as a developer on mac you can still run anything reasonably well in a VM.
If you're using node, you should be running that in a virtualized environment in the first place, albeit I'm too lazy myself to always set that all up.
Actually it's pretty amazing that now we'll be able to run an entire x86 OS environment on an ARM chip and get very usable performance too.
> If you're using node, you should be running that in a virtualized environment in the first place
Just curious: why should node be run in a virtualised environment for development? Is it a security concern? Does that apply to languages like Python too? Would you be happy running it in a Docker container from macOS?
How do we know it's "very usable" performance-wise?
I'd say that we've moved away from virtualisation completely, we now use containers, so developers will expect native performance, as we get on other platforms.
You could also argue that significant cuts to costs of already-profitable Mac computers, could lead to significantly higher sales volumes.
Greater marketshare also provides more value to shareholders meaning that shareholders still win, as do consumers.
More people with macs (and probably iPads/iPhones) would also increase other profit centers for Apple such as services (their highest profit center), warranties, and accessories. The profits and loyalty from these could easily far outweigh the $100-$300 of extra margin they might gain from keeping Mac prices the same.
Meaning that price cuts to macs might actually be more strategically beneficial (to EVERYONE) than hoarding higher margins.
The cost of a CPU is the unit cost plus all other costs, including R&D, divided by units. We also don't arrive at a reasonable estimate of unit costs by taking a range of estimates and picking the one most favorable to our position.
I also don't believe it's reasonable to assume that switching to ARM is as simple as putting an iPad CPU into a laptop shell.
Here is an estimate that their 2018 model costs $72 just to make, not to design and make.
The A14 that will power a MacBook is likely going to be more expensive, not less, especially with 15B transistors on the A14 vs. less than 7B on the A12.
The average selling price of an Intel CPU looks to be around $126. This includes a lot of low-end CPUs, which are exactly the kind of CPU Apple fans like to compare against.
Apple may gain greater control and better battery life with the switch, but they won't save a pile of money, and thoughts about increasing performance are fanciful speculation that Apple, the people with the actual expertise, are too smart to engage in.
Indeed. Apple is going to have to eat R&D costs that were previously bundled in Intel's pricing. And Mac sales are relatively small compared to the Windows market, so economies of scale are going to be less significant.
Which means the actual per-CPU fab cost is going to become a smaller part of the complete development and production cost of a run. And that total cost is the only one that matters.
I expect savings can still be made, because Apple will stop contributing to Intel's profits. On the other hand I'm sure Apple was already buying CPUs at a sizeable discount.
Either way it's an open question if Apple's margins are going to look much healthier.
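A toy amortization sketch of that point; every number is a made-up assumption, but it shows why per-unit economics look very different at Mac volumes versus iPhone volumes:

```python
# Per-chip cost = fab/packaging cost + (design & R&D cost amortized over units).
# All figures below are invented assumptions, not estimates of Apple's costs.

def per_unit_cost(fab_cost, rnd_cost, units):
    return fab_cost + rnd_cost / units

fab, rnd = 75.0, 500e6            # assume $75/chip to make, $500M to design
for units in (20e6, 200e6):       # roughly Mac-scale vs. iPhone+iPad-scale volumes
    print(f"{units/1e6:.0f}M units -> ${per_unit_cost(fab, rnd, units):.2f} per chip")
# 20M units  -> $100.00 per chip
# 200M units ->  $77.50 per chip
```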
IMO an important motivation is low power/TDP for AR/VR.
Ax will also eventually give Apple the option of a single unified development model, which will allow OS-specific optimisations and improvements inside the CPU/GPU.
Ax has the potential to become MacOS/iOS/A(R)OS on a chip in a way that Intel CPUs never could.
This only makes sense if you know nothing about Apple's business.
You really think they're doing this to save $50 from ~5m Macs? You really think all this upheaval is for a mere $250m a year in savings? It'll cost them 10x that in pain alone to migrate to a new platform.
Come on now....$250m is nothing at Apple scale. Think bigger. Even if you hate Apple, think bigger about their nefariousness (if your view is that they have bad intentions - one I don't agree with).
I'm not sure how you calculated that, but they sell about 20m Macs per year, not 5m. I also doubt the chips cost them $50 per unit. The savings may be worth a few billion, so it's not exactly nothing. And they would save this every year. Will this change cost them 10x that in pain alone? I doubt it. They already make the chips.
> I'm not sure how you calculated that but they sell about 20m macs per year not 5m
Quarterly numbers come in between 4.5-5m units these days but point taken - I recalled numbers for the wrong timeframe.
> I also doubt the chips cost them 50$ per unit. The savings may worth few billions so it's not really like nothing.
The true cost of this move is reflected in more than the R&D. This is a long multi-year effort involving several parties with competing interests. People are talking here as if they just flipped a switch to save costs.
Let me make this clear. In my view, this is an offensive/strategic move to drive differentiation, not a defensive move to save costs (though if this works, that could be a big benefit down the road). Apple has a long history of these kinds of moves (that don't just involve chips). This is the same response I have to people peddling conspiracy theories that Apple loves making money off of selling dongles as a core strategy (dongles aren't the point, wireless is; focusing on dongles is missing the forest for the trees).
You aren't making anything clear, just straw-man arguments. Apple switches architectures when it suits them; do you think the switch from PowerPC to Intel was for differentiation? Nope, it was cost and performance, aka value.
The question isn't whether it suits them. The question is: "Why did they choose to take on the level of risk in this portion of their business and what is the core benefit they expect?"
If the main reason was cost savings, this would be a horrible way to go about it.
There's a better answer: Intel can't deliver the parts they need at the performance and efficiency levels Apple needs to build the products the way they want to build them. This is not a secret. There is a ton of reporting and discussion around this spanning a decade about Intel's pitfalls, disappointments, and delays. Apple might also want much closer alignment between iOS and MacOS. Their chip team has demonstrated an ability to bring chipsets in-house, delivering performance orders of magnitude better than smartphone competition on almost every metric, and doing it consistently on Apple's timelines. It only seems natural to drive a similar advantage on the Mac side while having even tighter integration with their overall Apple ecosystem.
I think you are spot on. Any kind of cost savings here is going to be gravy and won’t come for a long time. This is going to let Apple reuse so much from phones in the future Mac line - all their R&D on hardware, the developer community, etc. It will be very interesting to see what the actual products are like, and whether the x86 emulation is any good.
Oh, so we are talking about value now? Please stick to an argument after you fail to defend it. You already used your dongle argument no one asked for.
Then don't go on a tangent when the point the parent was making was about potential savings, and big oof when you get your numbers wrong and then try to straw-man points no one is arguing against. No one was arguing about the vertical-integration bonuses Apple gets from their own SoC. You wanted to boil it down to one dimension by dismissing the value Apple can provide with their own chip.
1. I stated quarterly numbers off the top of my head instead of yearly numbers. This mistake doesn't change my point at all at Apple scale - it's a negligible amount of savings relative to the risk. Companies of this scale don't make ecosystem level shifts without a reason far far better than "we can _maybe_ increase yearly profits by 1% (1/100 * 100) sometime in the future". It's just not relevant to bring that up as a primary motivation given what we're talking about.
2. I think you actually missed the point of the conversation. OP said "that's still an insane amount of additional profit per unit to be extracted" and followed that up with "amazing for Apple and its shareholders."
It is not insane at all. And not amazing. It just comes off as naive to anyone who's worked in these kinds of organizations and been involved in similar decisions.
I think it's hard for some people to comprehend that trying to save $1b a year for its own sake at the scale of an org like Apple can in many cases be a terrible decision.
You came with your straw man that it was for its own sake; they just stated it was a profitable move and "amazing for Apple and its shareholders", which is hard to refute. OP even said "They don't need to deal with single-supplier hassles and they get much more control over what cores go into their SoC." It seems you are now arguing with your own points.
> It seems you are now arguing with your own points.
Half the fun is writing down your own thoughts!
> You came with your strawman that it was for its own sake
That's possible. I saw the emphasis placed differently than you did even though we read the same words. Probably describes the nature of many internet arguments. Happy Monday - I appreciate you pushing me to explain myself. Seems like others were able to get value out of our back and forth.
The fact that they are saving $1 billion per year is what makes the transition possible, it's not actually the cause of the transition. They could have done the transition a long time ago if it was just about the money.
It saves them much more over the long term if it lets them get away from having two different processor architectures. It paves the way for more convergence between their OSes. Eventually a macbook will be just an ipad with a keyboard attached, and vice versa.
Yes, they're a big company. But they're also a mature company. A lot of their efforts are going to be boring cost-cutting measures, because that's how mature companies stay profitable.
It's more than just a CPU though - this will make the components of a Mac massively similar to an iPad, and probably save money on many other components.
It also removes any need for a dedicated GPU in their high-end laptops, which is probably $200 alone.
I have no idea how they justify the prices of their lower-end laptops as-is, as they have worse screens and performance than recent iPads in pretty much all cases.
1. This is risky for consumers. Whereas the PPC->x86 move was clearly a benefit to consumers given how PPC was lagging Intel at the time, x86 had proven performance and a massive install base. It was low risk to consumers. This? Less so. Sure iOS devices run on ARM but now you lose x86 compatibility. Consumers need to be "compensated" for this risk. This means lower prices and/or better performance, at least in the short-to-medium term; and
2. This move is a risk for Apple. They could lose market share doing this if consumers reject the transition. They wouldn't undertake it if the rewards didn't justify the risk. They will ultimately capture more profit from this, I'm sure, but because of (1) I think they may well subsidize this move in the short term with more performance per $.
But I fully agree with an earlier comment here: Apple has a proven track record with chip shipments and schedules here so more vertically integrated laptop hardware is going to be a win, ultimately.
If you are a photographer, a developer, a graphics designer, a musician, a teacher, or whatever, and you are looking at buying a new Mac, what is going to get you to buy the new Apple Silicon powered Mac which is almost certain to impact your workflow in some way? If you are making purchase decisions for classrooms, what makes you buy 200 Macs with a new, unknown architecture?
The first generation of Macs on Apple silicon absolutely needs to have a significantly better price/ performance point versus the current generation or they won't sell to anything more than the most loyal fans. If the new Macs come out and pricing is not good, I could seriously see a sort-of anti-Osborne effect where people gravitate towards Intel based Macs (or away from Macs entirely) to avoid the risk of moving to new architecture.
If anything, I expect margins on the first couple generations of Macs to go DOWN as margins on the first couple generations on all Apple products are lower (also public record).
> If you are making purchase decisions for classrooms, what makes you buy 200 Macs with a new, unknown architecture?
Yes, the "unknown" architecture powering the highest performing phones and tablets.
Apple has plenty of problems selling to schools for classroom use because other platforms have invested more in that use case. But ISA being the reason? No. Simply no.
Have you ever been behind the purchase choice for dozens of computers? Hundreds?
IT managers are conservative, if they make a bad call, they have to support crap equipment for the next 5+ years or so. Yes, I'm aware Apple's CPUs are in the iPhone and iPad, but it's a huge change for the Mac and it's a big risk for people making those purchase decisions.
As for this, I have, and I certainly would not buy for the first two-three (if not more) hardware revisions after such a major architecture change until I could evaluate how that hardware has been working out for the early adopter guinea pigs. I'd also need to see where everything stood concerning software, especially the educational software that has been getting written almost entirely for x86 systems or specifically targeting Chromebooks for the last 5+ years. Even then I am not sure the Technology Director is going to be anything but skeptical about running everything in VMs or Docker containers. Chromebooks are cheap, reasonably functional, easy to replace, and already run all district educational software.
Undoubtedly, that's capitalism! But they may also introduce some price cuts. These would probably increase units sold, so they could better take advantage of their increased margin.
I'm also predicting there will be no difference in battery life.
If you check the technical specifications of past MBPs for battery capacity and battery life, you'll notice one thing: the watt-hour capacity keeps decreasing while the rated battery life stays constant (e.g., 10 hours of web scrolling).
Gains in power efficiency are used to shrink components, which allows ever slimmer designs.
Linus Tech Tips recently published a video where they did all kinds of cooling hacks to a Macbook Air, including milling out the CPU heat sink, adding thermal pads to dissipate heat into the chassis (instead of insulating the chassis from the heat), and using a water block to cool the chassis with ice water.
They got pretty dramatic results from the first few options, but it topped out at the thermal pads and nothing else made any difference at all. Their conclusion was that the way the system was built, there was an upper limit on the power the system could consistently provide to the CPU, and no amount of cooling would make any difference after that point.
The obvious conclusion for me was that Apple made decisions based on battery life and worked backwards from there, choosing a chip that fell within the desired range, designing a cooling system that was good enough for that ballpark, and providing just enough power to the CPU/GPU package to hit the upper end of the range.
It could just as well have been: choose a perf level and ensure it will run for 10 hours...
It's actually good engineering to have all the components balanced. If you overbuild the VRMs for a CPU that will never utilize the current, it's just wasted cost.
OTOH, maybe they were downsizing the batteries to keep it at 10H so they could be like "look we extended the battery to 16 hours with our new chips" while also bumping the battery capacity.
> The 16" MacBook Pro, for example, has a 100 Wh battery, which is the largest that Apple has ever shipped in a laptop. This is the largest battery size permitted in cabin baggage on flights.
I agree battery life for casual workloads will probably stay the same. However, if CPU power consumption decreases relative to other components, battery life on heavy workloads should go up.
My new 16" MBP is good for 2-2.5h max when used for working on big projects in Xcode. I expect to almost double that with the new CPUs. The people who have exactly this problem are also the ones who buy the most expensive hardware from Apple.
This isn't always true. The 16" MacBook Pro, for example, has a 100 Wh battery, which is the largest that Apple has ever shipped in a laptop. This is the largest battery size permitted in cabin baggage on flights.
Great, they can make the laptops even slimmer. They're going to make them so thin they won't be able to put a USB-C port and use wireless charging. You'll soon learn that you don't actually need to plug anything into your device. Apple knows best.
> Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU. CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains
This doesn't really seem to match my experience; at least on a 2015 MBP, the CPU is always consuming at least 0.5-1W, even with nothing running. If I open a webpage (or leave a site with a bunch of ads open), the CPU alone can easily start consuming 6-7 watts for a single core.
Apple claims 10 hours of battery life with a 70-something Wh battery, which would indicate they expect total average power consumption to be around 7W; even the idle number is a decent percentage of that.
(Also, has anyone been able to measure the actual power consumption of the A-series CPUs?)
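For what it's worth, the arithmetic behind that ~7W figure, and how big a slice the quoted CPU numbers would be (the 74.9 Wh capacity is the published figure for the 2015 13" model, used here as an assumption):

```python
# Sanity check of the numbers above (battery capacity and CPU wattages as quoted/assumed).
battery_wh, rated_hours = 74.9, 10.0
avg_w = battery_wh / rated_hours
print(f"Implied average system power: {avg_w:.1f} W")          # ~7.5 W

for label, cpu_w in [("idle (low)", 0.5), ("idle (high)", 1.0), ("one busy core", 6.5)]:
    print(f"CPU {label}: {cpu_w} W = {cpu_w / avg_w:.0%} of the budget")
```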
A typical laptop display can consume around 10W all the time so the 1W from the idle CPU is negligible in comparison.
If anything, you should install an adblocker. A single website filled with ads (and they're all filled with tons of ads) can spin the CPU to tens of watts forever, significantly draining the battery.
10W is on the high end of this; the 1080p screen on my Precision 5520 sucks down a paltry 1.5W at mid brightness. The big killer is the WiFi chip, which takes between 1.5 and 5W.
The CPU tends to be quite lean until something needs to be done, then it very quickly steps up to consuming 45W.
I usually consider 5 to 15W for laptop display consumption. Depends on the display, size and brightness.
It's quite variable, the highest brightness can consume double of the lowest brightness for example. One interesting test if one has a battery app showing instant consumption (I know lenovo laptops used to do), is to adjust brightness and see the impact.
Yeah, this is probably harder to do on a MacBook, but Intel's 'powertop' program on Linux has quite high fidelity; it matches the system discharge rate reported by the kernel's battery monitor too.
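If you want the raw number without powertop, here's a minimal sketch of reading the discharge rate straight from sysfs on Linux; the battery name and which attributes exist vary by machine, so treat the paths as assumptions:

```python
# Rough battery power readout on Linux via sysfs (the same data powertop reads).
# Paths are machine-dependent: some batteries expose power_now (microwatts),
# others only current_now (microamps) and voltage_now (microvolts).
from pathlib import Path

bat = Path("/sys/class/power_supply/BAT0")   # may be BAT1, etc. on your machine

def read_micro(name: str) -> int:
    return int((bat / name).read_text().strip())

try:
    watts = read_micro("power_now") / 1e6
except FileNotFoundError:
    watts = read_micro("current_now") * read_micro("voltage_now") / 1e12

print(f"Battery power draw: about {watts:.1f} W")
```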
Anecdotal evidence: On my work notebook (Lenovo X1 Carbon, Windows 10), the fan starts spinning when Slack is on a channel with animated emoji reactions.
I looked up the numbers out of curiosity. The X1 Carbon has a i7-8650U processor which does about 26 GFlops. The Cray-1, the classic 1976 supercomputer did 130 MFlops. The Cray-1 weighed 5.5 tons, used 115 kW of power, and cost $8 million. The Cray-1 was used for nuclear weapon design, seismic analysis, high-energy physics, weather analysis and so forth. The X1 Carbon is roughly equivalent to 200 Crays and (according to the previous comment) displays animated emojis with some effort. I think there's something wrong with software.
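Spelling that comparison out (the FLOPS, power, and cost figures are the ones quoted above; the laptop's ~15 W TDP and ~$1,800 price are rough assumptions):

```python
# Cray-1 vs. a modern ultrabook CPU, using the figures quoted above plus two
# assumptions: ~15 W TDP and ~$1,800 for the laptop.
cray   = {"flops": 130e6, "watts": 115e3, "dollars": 8e6}
laptop = {"flops": 26e9,  "watts": 15.0,  "dollars": 1800.0}

ratio = lambda key: (laptop["flops"] / laptop[key]) / (cray["flops"] / cray[key])
print(f"Raw speed:      {laptop['flops'] / cray['flops']:.0f}x")   # ~200x
print(f"FLOPS per watt: {ratio('watts'):,.0f}x")                   # ~1,500,000x
print(f"FLOPS per $:    {ratio('dollars'):,.0f}x")                 # ~890,000x
```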
Well yes, it's quite noticeably sluggish and bloated on the whole, with even UIs seemingly getting worse over time. Probably doesn't help that everything these days wants to push and pull from multiple networked sources instead of being more self-contained.
That’s because Slack runs on top of basically Chrome, which is a horrible battery hog.
If you run the web versions of Electron “apps” in Safari you’ll get substantially better battery life. (Of course, still not perfect; irrespective of browser, all of these types of apps are incredibly poorly optimized from a client-side performance perspective.)
If large companies making tools like slack had any respect for their users they would ship a dedicated desktop app, and it would support more OS features while using a small fraction of the computing resources.
(Large-company-sponsored web apps seem to be generally getting worse over time. Gmail for example uses several times more CPU/memory/bandwidth than it used to a few years ago, while simultaneously being much glitchier and laggier.)
Yes, Electron is a bit of a battery hog. But the Slack app itself is horrendous. If you read through their API docs and then try to figure out how to recreate the app, you'll see why. The architecture of the API simply does not match the functionality of the app, so there is constant network communication, constant work being done in the background, etc.
I'll turn your anecdote into an anecdatum and say the same; for all devices I've owned. (Linux on a Precision 5520 w/ Xeon CPU, Macbook pro 15" 2019 model, Mac Pro 2013)
On my laptop, scrolling through Discord's GIF list can cause Chrome and Discord to hard-lock until I kill the GPU process. Possibly because of a bug in AMD's GPU drivers on Windows.
It seems very likely to me that Apple's graphics silicon is much more performant and power-efficient than Intel's integrated GPUs. CPUs idling most of the time also points to the advantage of a big.LITTLE-style design, which Apple has been using for iPads etc. for a while. So maybe not 30%, but not negligible either.
They demoed Lightroom and Photoshop, which are surely using meaningful CPU resources?
Agreed on the accelerators and the cost savings. All together probably a compelling case for switching.
Try browsing the web on a semi-decent laptop from, say, 2008.
It's a frustrating experience. It is obnoxious how much CPU power modern websites require.
Honestly, back when my PSU died I just did that. Beyond the lack of video decoding support for modern codecs it was perfectly acceptable as a backup machine.
It's worse than that. At least someone would profit off of those bitcoins being mined. Instead we use all of that power to make the dozens of dependencies play nice with one another.
You know that Apple is going to be making the GPU with the same technology as the CPU right?
And those accelerators don't need to be discrete, Apple can add them to their CPUs.
So, it looks like your point is: Sure, Apple is going to jump a couple process nodes from where Intel is, but everything is somehow going to remain the same?
> Once you look closely at power profiles on modern machines you'll see that most energy is going into display and GPU.
Hard to square this with the simple fact that my 2018 MacBook Pro 13" battery lifespan goes from 8 hours while internet surfing to 1.5 hours for iOS development with frequent recompilations.
I'm predicting a future where the OS is completely locked down and all software for Macs must be purchased from the App Store. Great revenue model for Apple.
And it didn’t help that the Windows Store back then was a store for UWP/Metro apps.
It also took a long time for Microsoft to actually tackle the issues that UWP/Metro and WinUI/XAML faced. It took so long, it doesn’t even matter anymore and even Microsoft has moved on. But there’s quite a bit of hypocrisy, with Microsoft telling others to use WinUI while not using it everywhere themselves while refusing to update the styles of other design frameworks.
Apple will simply use different bins in different products. The A12X is arguably a "binned" A12Z, after all. Higher bins for pro lines, lower bins for consumer lines.
Apple doesn't have the lineup for that. The CPU in the Mac Pro isn't the same silicon as the CPU in the Mini. It has more cores, bigger caches, more memory channels. It's not just the same chip binned differently.
In theory they could offer the Mini with eighteen different CPU options, but that's not really their style.
One question is whether they'll go down the chiplet route for higher end CPUs, then they can share a single die, binned differently, across more of their range, and just bundle them into different MCMs.
The 3990X costs more than ten times as much as the 3700X. It has eight times more cores. On anything threaded it smashes the 3700X. On anything not threaded it... doesn't. In many cases it loses slightly because the turbo clock is lower.
It basically means that the processor with the best single thread performance is somewhere in the lower half of your lineup and everything above it is just more chiplets with more cores. That's perfectly reasonable for servers and high end workstations that scale with threads. I'm not sure how interesting it is for laptops. Notice that AMD's laptop processors don't use chiplets.
Even the highest core count Threadrippers have decent single thread performance. The Epyc lineup has much lower single core performance and that may make it less useful for desktop workloads.
AFAIK the AMD distinction is currently that APUs (mobile or desktop) don't use chiplets.
On the whole my guess would be that we have the iPad Pro and MacBook Air using the same SoC, the MacBook Pro doing… something (it'll still need integrated graphics, but do they really sell enough to justify a new die? OTOH they do make a die specifically for the iPad Pro, and I'd guess it's lowest-selling iOS device v. highest-selling macOS device, and idk how numbers compare!), and the iMac (Pro)/Mac Pro using chiplets.
Don't worry, Apple already tiers most of its hardware by soldering in the RAM/storage and charging an offensive, obviously price-gouging amount to upgrade, even though the maximum spec has a base cost to them of 1/4 to 1/6 of what they charge FOR AN UPGRADE.
The Mac line will start to look like the iOS line very quickly. Binning will be important and you'll likely see processor generations synchronized across the entire product base.
I've been thinking about this. I can't see Apple realistically being able to produce multiple variants (phone, tablet, laptop, speaker, tv) of multiple elements (cpu, gpu, neural accelerator, wireless/network, etc) packaged up on an annual cadence.
The silicon team is going to be very busy: they've got the A-series, S-series, T-series, H-series, W-series, and U-series chips to pump out on a regular roadmap.
The A-series (CPU / GPU / Neural accelerator) is the major work. It gets an annual revision, which probably means at least two teams in parallel?
The A-series X and Z variants seem to be kicked out roughly every second A-series generation, and power the iPads. The S-series seems to get a roughly annual revision, but it's a much smaller change than the main A-series.
I could see the Mac chips on a 2-year cycle, perhaps alternating with the iPad, or perhaps even trailing the iPads by 6 months?
The iOS line looks like the low end device using last year's chip. How does binning help with that? Are they going to stockpile all the low quality chips for two years before you start putting them in the low end devices? Wouldn't that make the performance unusually bad, because it's the older chip and the lower quality silicon?
If you think the bins are determined by yield rather than by fitting a supply/demand curve, I have a bridge to sell you.
Of course, yield is still a physical constraint, but apple sells a wide range of products and shouldn't have any trouble finding homes for defect-ridden chips.
> CPUs mostly run idle. Even if you had a theoretical CPU using zero energy, most people are not going to get 30% battery life gains
I don't agree. Simply disabling Turbo Boost on my MBP 16" nets me around 10-15% more battery life. Underclocking a CPU can even result in two to three times the battery life on a gaming laptop under the same workload.
I actually think total battery life will go up a fair bit and compile times will be much faster, 20-30%, while giving everyone full power even when not on mains. The amount my MacBook throttles when on battery is startling, and stopping that while still giving huge battery life, say 6h at 80% CPU, will be a huge win. Apple wouldn't bother unless they knew the benefits they can bring over the next 10 years will be huge.
All of this is complete speculation of course but I don’t believe it will be a financial decision this one, it’ll be about creating better products.
Multi-core performance is not a strong suit of Apple's ARM architecture, I suspect you're going to see a mild to moderate performance hit for things like compilation.
The rumours are that they're doubling the number of high-performance cores for the laptop chips (so 8 high performance cores and 4 low-power cores). That + better cooling ought to boost the multi-core performance quite significantly.
Is their multi-core performance poor, or have they just made size/power trade-offs against increasing the number of cores? The iPad Pro SoCs are literally the only parts they've made so far with four big cores.
That’s mostly because desktop systems are built with more background services and traditional multitasking in mind. iOS has a different set of principles.
I was looking at the benchmarks of the latest MacBook Air here [1]. In GPU performance it's not competitive with the iPad Pro, and that's quite an understatement. For me the most obvious win of this migration to "Apple Silicon" will be that entry-level MacBook/iMac will have decent GPU performance, at long last...
“Apple can start leaning on their specialized ML cores and accelerators“
I think that hits the nail on the head. Since I only listened cursorily to both the keynote and the State of the Union I may have missed it, but I heard them mention neither “CPU” nor “ARM”. The term they use is “Apple Silicon”, for the whole package.
I think they are, at the core, but from what they said, these things need not even be ARM CPUs.
JS/ads and the wifi chipset seems to be the big culprit across laptops in general in this scenario. Even Netflix doesn't drain my battery as fast as hitting an ad heavy site with lots of JS and analytics and I can watch the power usage for my wifi chipset crank up accordingly. This happens across every laptop, iPad, Chromebook etc that I own.
- the CPU will be a lot more powerful and faster, but it isn't really faster because it's like an accelerator or something.
- if you actually use your computer, get some vague "Linux desktop" or something (which is farcical and borders on parody, completely detached from actual reality). Because in the real world, people actually doing stuff know that their CPU, and its work, is a significant contributor to power consumption, but if we just dismiss all of those people we can easily argue its irrelevance.
My standards for comments on HN regarding Apple events are very low, but today's posts really punch below that standard. It's armies of ignorant malcontents pissing in the wind. All meaningless, and they're spraying themselves with piss, but it always happens.
I was going to follow up with an anecdote about how my computer has used less than 15 minutes of CPU time in the last 2 hours but then again I forgot to stop a docker container that automatically ran an ffmpeg command in the background consuming 70 min of CPU time.
> Apple is going to save $200-$800 cost per Mac shipped
Does Apple actually have its own silicon fab now or are they outsourcing manufacture? If the former, those are /expensive/ and they'll still be paying it off.
This seems very inaccurate to me. Most laptops do not have discrete GPUs, so tasks like rendering a youtube video do require CPU cycles. Zoom is very CPU intensive on basically any mac laptop, and people always have a ton of tabs open, which can be fairly CPU intensive.
In other words, there are definitely gains to be had. My ipad pro offers a generally more smooth and satisfying experience with silent and much cooler running CPU versus my MBP, and they offer similar battery life. Scale up to MBP battery size and I suspect we will be seeing a few hours battery life gain.