
We're talking about the M3 Ultra here, which is also wall-powered and also expensive. Nobody is interested in dropping upwards of $10,000 on a Mac Studio to get "okay" performance just because an unrelated product is battery powered. Similarly, saving a few bucks on electricity while tripling the much, much more expensive engineer time spent waiting on results is foolish.
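To put rough numbers on that (every figure here is an assumption I'm pulling out of the air for illustration, not a measurement): even a few hundred watts of extra draw costs pennies per day, while a couple of extra engineer-hours spent waiting costs orders of magnitude more.

    # Back-of-envelope only; every number below is an assumption.
    watts_saved = 400            # assumed extra wall draw of a big GPU box vs a Mac Studio
    price_per_kwh = 0.15         # assumed electricity price, $/kWh
    hours_per_day = 8            # assumed hours of inference per day

    extra_wait_hours = 2         # assumed extra engineer time per day if jobs take 3x as long
    engineer_rate = 100          # assumed loaded engineer cost, $/hour

    power_savings = watts_saved / 1000 * hours_per_day * price_per_kwh
    time_cost = extra_wait_hours * engineer_rate
    print(f"~${power_savings:.2f}/day saved on power vs ~${time_cost:.0f}/day of waiting")
    # -> ~$0.48/day saved vs ~$200/day lost, under these made-up numbers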

Also Apple isn't unique in having an NPU in a laptop. Fucking everyone does at this point.




It almost feels like you're deliberately missing the forest for the trees in order to fit some argument that I'm not quite able to suss out here.

The point is that, in terms of practical usage, the M3 Ultra is uniquely competent and highly affordable in a sea of enterprise technology that is decidedly not. I tried to demonstrate why I'm excited about it by pointing out the similar performance of a battery-powered, four-year-old laptop and a quite gargantuan gaming PC pulling over 500 W from the wall, as an example of what several years of additional refinements and improvements to the architecture were expected to bring.

The point is that it's affordable, more flexible in deployment, and more efficient than similarly specced datacenter servers specifically designed for inference. For the cost of a single decked-out Dell or HP rackmount server, I can have five of these Mac Studios with M3 Ultra chips - and without the need for substantial cooling, noise isolation, or other datacenter necessities. If the marketing copy is even in the same ballpark as actual performance, that's easily enough inference to serve an office of fifty to a hundred people or more, depending on latency tolerances; if you don't mind "queuing" work (like CurrentCo does with their internal Agents), one of those is likely enough for a hundred users.
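Rough math on the cost claim, with prices I'm assuming rather than quoting (an M3 Ultra Studio at roughly $10k, a decked-out inference server at roughly $50k, and a per-box concurrency figure that depends entirely on model size and latency tolerance):

    # All figures below are assumptions for illustration, not quotes.
    mac_studio_price = 10_000    # assumed price of a 512 GB M3 Ultra Mac Studio
    rack_server_price = 50_000   # assumed price of a decked-out Dell/HP inference server
    macs_per_server = rack_server_price // mac_studio_price
    print(f"{macs_per_server} Mac Studios for the price of one rackmount server")

    users_per_mac = 20           # assumed concurrent users per box at tolerable latency
    print(f"~{macs_per_server * users_per_mac} users served across those boxes")
    # -> 5 Mac Studios, ~100 users, under these assumptions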

That's the excitement. That's the point. It's not the fastest, it's not the cheapest, it's just the most balanced.


Apple defenders have some special sauce reasoning that makes no sense to anyone but them. Are you a boomer?

I have Apple hardware, but it sucks for anything AI; buying it for that purpose is just extremely dumb, just like buying Macs for engineering CAD work or things of that sort.

If you are buying Macs and it's not for media-production-related reasons, you are doing something wrong.


> Apple defenders have some special sauce reasoning that makes no sense to anyone but them. Are you a boomer?

I continue to be in awe of the lengths some people will go to just to fling insults and shake out some salt. We're, what, ten layers deep? With all the context above, the best you have to contribute to the discussion is baseless accusations and ageist insults?

Your finite time would have been better spent on literally anything else than actively seeking out a comment just to throw subjective, unsubstantiated shade around. C'mon, be better.


Make no mistake, it's not an insult. I'm saying that precisely because I have been there.

Apple is the master at creating desire and building a narrative in their customers' minds about the many things their devices will supposedly let them do. It's very aspirational, and in practice most Macs get used for things that could have been done with a much cheaper option.

It may not be obvious to you, but it's somewhat funny seeing you rationalise all kinds of dreams about what this machine could potentially be, when in practice the people who would really be working on the kind of stuff you're talking about don't even consider it viable, for many good reasons.

It's not that those machines can't potentially do it; it's just that they don't really fit the goal very well.

A lot like people buying a Cybertruck to "haul" stuff when there are plenty of options that are just plain better and make a lot more economic and practical sense.

It's OK to desire the thing and be excited about it, but it really doesn't serve anyone to rationalise it so hard. You're lying to yourself as much as to everyone else, and it's not healthy.

If that wasn't clear: people working on AI professionally really don't have to make do with a Mac Studio; they have access to better hardware. If you want to get one personally to experiment and toy around with, that's fine, but it's not going to be this amazing thing for AI.


$10K doesn't get you 512 GB of VRAM in Nvidia land.


Indeed, it does not.

I'm thinking seven A100s (80 GB each) would be the cheapest way to get there, and that would still be around $80k with good discounts.
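For anyone checking the arithmetic: the seven comes from 80 GB per card, and the dollar figure is my guess at a discounted street price, covering the GPUs alone rather than the host server, interconnect, power, or cooling.

    import math

    target_vram_gb = 512
    a100_vram_gb = 80                  # A100 80 GB variant
    gpus_needed = math.ceil(target_vram_gb / a100_vram_gb)   # -> 7

    assumed_price_per_gpu = 11_500     # assumed discounted price per A100, $
    print(f"{gpus_needed} A100s ~= ${gpus_needed * assumed_price_per_gpu:,}")
    # -> 7 A100s ~= $80,500, GPUs only, under this assumed price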





