
IDK, "look how good our CPU support is by running a several-year-old, GPU-limited game" fell a bit flat for me.


I think your expectations for what emulation is capable of are set a bit high. The fact that it is able to emulate a game that's a few years old at a decent frame rate is more than acceptable. You didn't see Microsoft demoing games for their Surface on ARM systems at all and for good reason.


I mean, if I had things my way they wouldn't be switching to ARM at all and emulation wouldn't be necessary, so I don't think it's wrong to be skeptical.

> You didn't see Microsoft demoing games for their Surface on ARM systems at all and for good reason.

Those were also lower-end computers with poor GPUs.


> Those were also lower-end computers with poor GPUs.

You mean the Surface Pro X? It has similar price tiering to the iPad Pro whose SoC was used in these demos.


...ouch, okay, I just looked up the price of Surface Pro X, and it's $1,000. I forgot how much Microsoft was charging for that thing.

Still, the Pro X has poor graphics. I'm going to assume that a dedicated GPU was being used for Tomb Raider—they would have said something otherwise.


> I'm going to assume that a dedicated GPU was being used for Tomb Raider—they would have said something otherwise.

They said exactly what SoC they were using, and it's not known to have spare PCIe lanes lying unused in existing products. Apple pretty much just demoed an x86 game running on an overclocked iPad Pro.


Interesting—are you expecting the Thunderbolt ports in the Developer Transition Kit not to work then? Since it's using a Mac Mini chassis.


Tech specs on the DTK webpage don’t mention Thunderbolt at all, just USB-C (2 ports) and USB-A (2 ports), plus an HDMI 2.0 port.

Makes me wonder how the Pro Display used in the demos was being driven. DisplayPort Alt Mode over USB-C?


They only said the demos were running off Apple silicon, not that they were running off a Mac Mini DTK machine. They probably have other systems more akin to Mac Pros that they use internally.


They said the demos were running on an A12Z.


It's entirely possible the display was only running at 4K instead of the full, native resolution.


It's not like Apple can't change what connectors are available on the back of the Mac mini. The form factor may not have changed, but the available ports have changed across past releases.

Don't be surprised if there is no Thunderbolt 3 at all, but just USB-C.


It may be that by the time these things are ready for an actual release they can be USB4, which combines USB-C with Thunderbolt technology but is no longer Intel-exclusive.


> Those were also lower-end computers with poor GPUs.

Were you under the impression this $500 developer kit shipping with an iPad Pro CPU/GPU is a high-end computer? While it's a decent chip, it's essentially the same silicon as the prior-generation iPad Pro's A12X, with one additional GPU core enabled.


I can emulate a PS3 on my computer with better graphics at higher framerates, including GPU emulation.

It's not impressive at all.


Shadow of the Tomb Raider is a PS4/XB1 game, not a PS360 game.

---

Edit 2: Please disregard my first edit, below—I was right the first time, then I got the games mixed up.

Edit 1: Oh wait, I forgot: SotTR actually did have an Xbox 360 port! It was one of the last big titles to have one. I think what they showed on screen looked better than the 360 version, although it's admittedly hard to tell on a stream.


> SotTR actually did have an Xbox 360 port!

It did not; the previous game in the series (Rise of the Tomb Raider) did, but Shadow (...) was released only on current-gen consoles.


Not on low settings at fake 1080p with an inconsistent framerate, it isn't. It's only PS4-level if it's running at 60fps on high.

Whereas I can emulate PS3 games at 4K 60fps with improved textures.


When it's running on an iPad Pro chip? It's pretty impressive, considering Microsoft's x86 emulation on the Surface Pro X is abysmal.


Is the target to match the performance of ten-year-old hardware? Then sure, that's matched. But it's not impressive. AMD FX CPUs have better performance than that, by a mile.


Yeah, but this is their existing CPU/GPU designed to fit into the constraints of the iPad form factor. They'll likely have something much more powerful for consumer hardware.


The iPad CPU/GPU is already thermally limited. An unlimited A12Z is right at the TDP of a laptop chip, at ~25-30W (5W per big core per AnandTech, 4 cores, plus GPU and I/O), and that's actually quite a generous estimate.
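
Back-of-envelope, taking AnandTech's ~5W-per-big-core figure at face value (the GPU + I/O number below is just a rough guess on my part, chosen to be generous):

    4 big cores x ~5W     ≈ 20W
    GPU + I/O (guess)     ≈ 5-10W
    package total         ≈ 25-30W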

An unlocked A12Z is likely all you can get away with in a laptop, and inferior to SOTA x86 low-power CPUs.


If the Tomb Raider game was actually running on an A12Z system (without any external GPU; note that this is the same CPU/GPU as in the iPad Pro!), then that demo is actually really impressive, even if the game settings are set to low quality and the framerate is a bit choppy.


It’s not about the game; it’s about running performance-dependent code written for x86 on an ARM chip. It’s a lot harder to fake a stable frame rate in a game than in Photoshop.


I mean, it didn't look like a stable frame rate to me.


It’s mostly OpenGL/Metal. Show me a CPU-intensive task that isn’t already GPU-native or JITed.


On the other hand, it probably doesn't use much CPU just because it's single-threaded. No game ever uses more than 10% of my 16-thread CPU. But that also means that emulation could seriously tank single-thread performance and ruin the game.


It might be the other way around. They might have gotten around the big issue with running arbitrary x86 code on ARM (the much weaker memory model) by pinning all x86 threads in a process to a single core, which would be unfortunate.
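
To illustrate the hazard, here's a contrived C++ sketch (nothing to do with whatever Apple's translator actually does internally): compiled x86 code leans on TSO, so a plain store to a flag is enough to publish data to another core, while on ARM's weaker model the same two stores can become visible out of order unless the translator inserts barriers everywhere, or keeps the threads off separate cores.

    #include <atomic>
    #include <cstdio>
    #include <thread>

    // Both variables are relaxed atomics so the C++ is well defined; relaxed
    // ordering mimics x86 machine code that carries no explicit barriers
    // because TSO already keeps its stores in program order.
    std::atomic<int>  data{0};
    std::atomic<bool> ready{false};

    void producer() {
        data.store(42, std::memory_order_relaxed);
        ready.store(true, std::memory_order_relaxed);  // "publish" flag
    }

    void consumer() {
        while (!ready.load(std::memory_order_relaxed)) { }  // spin on the flag
        // On x86 the two stores above reach memory in program order, so in
        // practice this prints 42; a weakly ordered ARM core may reorder
        // them, and this can print 0.
        std::printf("%d\n", data.load(std::memory_order_relaxed));
    }

    int main() {
        std::thread a(producer), b(consumer);
        a.join();
        b.join();
        return 0;
    }

Pinning both threads to one core sidesteps the reordering entirely (a core always observes its own stores in program order), at the cost of giving up multicore scaling for that process.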


I would have been more impressed with them accessing a game service like Steam and launching random games from there.

Plus, they insinuated they were running on the A12Z, but later demos, including the game, did not state exactly what hardware was being used.

If they are limited to games that support Metal, then that is jettisoning a large number of games. Granted, Catalina already did a lot of that work for them.

Are they going to support non-Apple video cards? They skipped right over that, but I suspect they don't think they need to.

*edit: on that last note, it would be a good reason they never brought Nvidia chips back, as they would know they would not need them.


Not even running it very well. All the graphics settings were turned down, the framerate looked choppy, and it was only running at 1080p.


Again, people need to dial back their expectations here. You aren't going to see cutting-edge games running well through emulation. There is a reason Apple put such a huge emphasis on native apps; native code is always going to run much faster.

They didn't demo gaming to suggest this is a great machine for gaming, they demoed it to show that it was possible at all. The previous version of Rosetta during the PowerPC->Intel transition was not known for performance.

If gaming is important to you and you want a Mac then you want an Intel Mac or whatever games are released for Mac ARM. Emulated games are not going to compete with native.


It’s not a demo of their new gaming-class GPU though.


No, but they weren't exactly showing us performance graphs to see where the game was hitting bottlenecks, either.


So many people don't get this that it's astounding


It was very odd seeing Lara walk through an area with dappled bright light, and her body remain uniformly lit. It may be that the game has a very basic lighting engine though.


It is like many triple-A games in that it has a wide range of settings, all the way from full potato to RTX (ironically, it was one of the first games to support that).


It doesn't, they just had the graphics turned down super low.


I was wondering that. I'm hopelessly out of the loop with exposure to modern game graphics, but to me, that looked worse than 360/PS3-era games.


But does it run better than on the current Intel Mac mini with integrated graphics? All it needs to do is beat Intel in comparable circumstances.

It doesn't really matter what the graphics performance is; on high-end Macs they'll still ship a dedicated GPU from AMD. What matters is that the game is GPU-limited instead of CPU-limited.


> But does it run better than on the current Intel Mac mini with integrated graphics? All it needs to do is beat Intel in comparable circumstances.

Having gone and checked, no. Not even close.

(Nor would it be plausible to expect it to. But it's clear Apple have made a choice here: if you're a user who wants legacy software or desktop gaming, Apple do not care about you compared to their margins. It's that simple.)


The maxed-out Mac mini CPU is a 6-core 3.2GHz i7 with Turbo Boost to 4.6GHz. I wonder if they can beat that with a newly ARM-optimized macOS? The current i7 still has tons of power as an 8th-gen Intel CPU.



