With educational pricing this thing starts at $500, and with 16GB of RAM (finally) I think this easily beats any sort of desktop PC you can buy at that price (let's exclude custom builds, they're not the same market).
I think this just became the go-to recommendation I'll give to anybody wanting an entry-level desktop computer of any kind. In fact I might buy one for my parents right now to replace the old mac mini they have. I really can't think of any reasonable competition for it at that price.
One issue to watch out for: sub-4K monitors look surprisingly bad on newer versions of macOS with Apple Silicon Macs. And no, it's not simply a matter of non-Retina obviously not looking as nice as Retina - something like a 1440p monitor will look much worse on macOS than it would on Windows or Linux. This is partly caused by the lack of subpixel rendering for text on macOS, but it doesn't affect just text: app icon graphics and the like seem optimized for high-DPI resolutions only and thus look awful too.
You commonly see people using third-party apps such as BetterDisplay to partially work around this problem by tricking the system into treating 1440p displays as 5K displays and then downscaling, but that doesn't solve the problem completely.
So yes, the price for the machine is fantastic, but you may want to budget for a basic 4K display as well.
> Having experienced 4k I feel impoverished having to return to lower resolutions.
That's what they said. I've been using Retina/HiDPI displays at work for close to a decade now. Still can't say I prefer one over the other. I have no problem seeing pixels, especially now that I've switched to Linux (KDE Plasma) at home. In fact I kind of like being able to catch a glimpse of the building blocks of the virtual world.
What actually does matter (for me) is uniformity and color accuracy. And you can't have that for cheap, especially not in 4K.
Is this with newer Apple Silicon Macs? My 2020 M1 Mac Mini looks unremarkably normal on my 1440p display. I'm also going between that and my 14" M1 Pro Macbook Pro, which of course looks beautiful but doesn't really make the 1440p on the Mini 'bad'.
Edit: Adding that both of these machines are now running macOS 15.1 at this time.
In my experience, you can’t do any sort of scaling with sub-4K displays. This has been the case “since M1”. Intel Macs, even on the latest macOS, can do scaling, e.g. 1.5x at, say, 1440p, which last time I bothered with an Intel Mac required a workaround via Terminal to re-enable.
But that workaround is “patched” on Apple Silicon and won’t work.
So yes, if you have an Apple Silicon Mac plugged into a 1440p display, it will look bad with any sort of “scaling” - because scaling is disabled on macOS for sub-4K displays. What you’re actually doing when you’re “scaling” on, say, a 1440p display is running that display at 1920x1080 resolution - hence it looks like ass. Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up to appear as though you had a 1920x1080 display - since it was still utilizing the full 2560x1440 pixels of the display, “1920x1080” looked nicer than it does now.
So, brass tacks: it’s about how macOS/OS X obfuscates the true display resolution in the System Preferences -> Displays menu. With Apple Silicon Macs, “1920x1080” means “2x scaling” for 4K monitors, and literally “we’ll run this higher-res monitor at 1920x1080” for any display under 4K resolution.
> Back before Apple Silicon, running that 1440p display at “1920x1080” was actually just scaling the UI elements up
I’m almost sure that macOS can’t do that. It’s always done scaling by rendering the whole image at 2x the virtual resolution and then just displaying it on whatever screen you had. For example, for “looks like 1080p” on a 1440p screen it would draw onto a 2160p canvas (and do 2160p screenshots).
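A rough sketch of that mechanism (illustrative arithmetic only, not Apple's actual API): the UI is drawn to a canvas at 2x the chosen "looks like" resolution, and the whole image is then resampled to the physical panel.

```python
def scaled_canvas(looks_like, panel):
    """Return the 2x virtual canvas and the factor it must be resampled by."""
    canvas = (looks_like[0] * 2, looks_like[1] * 2)  # 2x backing store
    downscale = canvas[0] / panel[0]                 # resample factor
    return canvas, downscale

# "Looks like 1920x1080" on a 1440p panel: draw 3840x2160, shrink 1.5x
print(scaled_canvas((1920, 1080), (2560, 1440)))  # ((3840, 2160), 1.5)
# Same setting on a 4K panel: draw 3840x2160, map 1:1, no resampling
print(scaled_canvas((1920, 1080), (3840, 2160)))  # ((3840, 2160), 1.0)
```

The fractional 1.5x resample is what makes 1440p look soft; on a 4K panel the same setting maps pixels cleanly.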
Yeah I was never able to get that to work on M1/M2 Macs. Intel, sure, but none of the workarounds (including BetterDisplay) worked on the ARM Macs. Do they now? I last tried in 2022.
If your 1440p monitor looks “fine” or “good”, it’s because the scale is 1x - for many people, including myself, UI elements are too small at 1x 1440p. I had to buy a 4K monitor so I could have larger UI elements AND crisp UI elements.
You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
The same way someone might not notice motion smoothing on a TV, or how bad scaling and text rendering looks on a 1366*768 panel, or different colour casts from different display technologies. All three took me a while before I could tell what was wrong without seeing them side by side.
> You may just not be seeing the visual artifacts on your screen because you don't know what they look like, or mentally adjust to what that screen looks like.
Does any of that matter, though? Who bothers with the existence of hypothetical artifacts in their displays they cannot even see?
It matters once you get used to something better. Our brains are really good at tuning out constant noise but once you start consciously recognizing things it’ll remain noticeable. If your vision slips you won’t constantly be walking around saying everything is fuzzy but after using glasses you’ll notice it every time you take them off. Low-res displays work like that for many people – back in the 90s, people were content with 800x600 27” monitors but now that would feel cramped and blocky because we’ve become accustomed to better quality.
This is the biggest issue with Mac hardware at the moment.
All because of a decision to make things easier for their own developers (and third-party ones too, I guess), so they could claim to have figured out high-DPI before everyone else.
It comes at a large cost now: either pay more than is reasonable for one of the few compatible displays, or accept a much worse experience. That is just not good for devices at this price.
This is why a big affordable iMac is so necessary, but TC's Apple likes money too much to care about their legacy customers.
After such a long history of Mac OS having better font rendering and in general a better graphics stack (Quartz, where everything is basically continuous PDF rendering), this feels like a big letdown.
The problem is going to improve as more high-DPI displays are released for sale but it has taken a lot of time because most customers like to focus on other characteristics that are arguably more important for other use cases.
There are plenty of premium displays that range from good to great, but you really have to think about how they will work if you buy a Mac. Most likely you'll need to compromise, which feels bad considering the price of admission...
Wait, what kind of people are you talking about, and how niche is that target group?
You're saying Macs are expensive, but at the same time the potential buyers can't afford even a cheap 4K monitor? Those go for like $200 now.
And even if that group exists... it's not like 2560p is torture on a Mac, especially with BetterDisplay's HiDPI trick. I would bet many wouldn't even notice the difference.
Unless you want to use your cheap 4K monitor as the equivalent of a 1080p display (which is not a lot of space by today's standards), it's not at all viable.
2560 is actually 1440p, and no, it's not very good even with BetterDisplay, without even getting into the performance and rendering implications.
The fact is that if you buy an expensive Mac desktop, you need to also buy one of the few expensive displays that work properly with it; otherwise you get a degraded experience or have to compromise, which is unacceptable for hardware at this price.
We are in this situation only because of both engineering and commercial decisions from Apple.
Considering that they sold an entry-level 27" iMac for a lower price than what the Studio sells for, well into 2021, the position is indefensible. Even if they wanted to make an external display, they didn't have to make it that overbuilt/expensive and leave no other choice.
It is a purely profit-motivated move: they want to extract a 50% margin on everything, the old iMac was a low-margin device, and that's really the only reason it doesn't exist anymore. Conveniently, all of that is propped up by an unnecessary engineering decision (they didn't have to remove subpixel rendering).
Why do you feel the need to defend a megacorp's terrible choices, made only to milk you as much as possible?
As for the niche thing, having installed/managed quite a few of those old 27" iMacs, I can tell you they were extremely popular, precisely because they were cost effective. I think you largely underestimate how large the customer base for the 27" iMac was.
As far as I'm concerned, it's way less niche than $650 headphones, but the difference is that they can milk a 50% margin on those, while a big iMac at that margin target would land at a price most wouldn't even consider.
So yes, you can get a cheap display to keep the setup's price contained, but it's really not a great experience, and it doesn't make a lot of sense to compromise if you are getting an expensive computer in the first place.
At this rate you can get an all-in-one 27" PC for not much more than a standalone display; it's not going to be great, but at least it's cheap. If we are going to compromise, at least do it right.
Can confirm, you absolutely need BetterDisplay and a tiny bit of elbow grease to configure the 5k clone to downscale to your real monitor. Not rocket science, but could be more streamlined.
If you say it looks fine without it, I don't know what to say.
Is there a review that demonstrates and corroborates this issue? Is it a difficult problem if one is choosing to buy a new display for a Mac mini anyway? My old display is 10 years old, so I would have to get a new one regardless.
It's most visible with the MacBooks, because you have the Retina display and the low-DPI display next to each other.
In short: you probably want to get at least a 4k display anyway, but if you want to delay that, you should buy BetterDisplay. The difference is night and day.
My 7-year-old QHD monitor pair through an M1 Pro MBP still looks fantastic. Then again, I do spend most of my day in Apple's Terminal, but I'm not really in want of anything more. Some other sibling comments are saying Windows 10/11 looks crappy, and I agree; I have to occasionally switch between the two, and I just don't like working in Windows anymore, mostly because of the poor display.
I use both OSes on the same display, and Windows looks much better on an "old" non-HiDPI display, I can tell you that much.
I used to dislike Windows font rendering, but it's still better than what macOS gives you for "regular" displays. You can fix it somewhat with BetterDisplay but still...
Modern versions of macOS don't support subpixel rendering (what Windows calls ClearType), which is why macOS will always look worse on low-resolution displays.
Using BetterDisplay to force a "2x" resolution will give you better rendering but at the cost of lower usable/effective resolution.
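A conceptual sketch of the difference (not any OS's actual pipeline): grayscale antialiasing produces one coverage value per pixel, while ClearType-style subpixel antialiasing computes coverage per R/G/B stripe, roughly tripling effective horizontal resolution on a standard LCD.

```python
def grayscale_aa(coverage):
    # One coverage value tints the whole pixel uniformly.
    v = round(255 * (1 - coverage))
    return (v, v, v)

def subpixel_aa(cov_r, cov_g, cov_b):
    # Each RGB stripe gets its own coverage, so a glyph edge can land
    # a third of a pixel more precisely.
    return tuple(round(255 * (1 - c)) for c in (cov_r, cov_g, cov_b))

print(grayscale_aa(0.5))           # (128, 128, 128): uniform gray edge
print(subpixel_aa(1.0, 0.5, 0.0))  # (0, 128, 255): edge placed mid-pixel
```

On high-DPI panels that extra horizontal precision stops mattering, which is presumably why Apple felt safe dropping it.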
Yeah, sure, I know that, which is why I'm relating my experience, since it is the result of those engineering choices.
It's pretty funny that you need special hardware for macOS font rendering to stand comparison with Windows.
Microsoft has a lot of problems but they are way more pragmatic in their choices giving a "good enough" experience on most hardware.
But if you don't follow Apple's choice, your experience can go from great to barely passable in an instant.
Very different approaches, but as I'm getting older, I understand why the one from Microsoft is popular and why they deliver more value.
Basically operating at standard pre-Retina Mac DPI levels. The 27" Apple Cinema Display had exactly this resolution, as well as the 27" iMac before it went to 5K.
I agree, it works… fine. But sadly more and more elements of modern macOS will look blurry / aliased because they are only made with hi-DPI in mind.
For example all SF Symbols, as far as I know, are not defined as pixel graphics but only stored as vectors and rasterized on the fly. Which works great at high res and makes them freely scalable, but on low-DPI displays they certainly look worse than a pixel-perfect icon would.
Came here to echo this. Also, it always amazes me how many people respond to warnings like this (as seen in this thread as well) saying lower-resolution displays look just fine. I returned a M2 Mac Mini solely because it looked so awful on all of my monitors -- I tried 2 different 32" 2k displays, plus a handful of 24" displays. Everything was fuzzy and awful looking. Not something that could be tolerated or ignored... Completely unusable. I feel like this fact is not well known enough.
The fact that so many seem to tolerate "low-res" or "mid-res" displays on the current M-series Macs is really puzzling to me... maybe my eyesight isn't as bad as I thought it was and everyone else's is a lot worse!?
This new M4 mini is tempting enough that I might try a Mac again... but this time I am definitely going to have to budget for a 4k/5k display.
Honestly, I am going to say skip 4K and just go to 5K. They are not that much more. I have a 2x5K setup and it is great. The main monitor is in normal orientation and the other is mounted on the left, rotated 90°, centered on the side of the first. I keep my work on the main one and all the documentation, chat, etc. on the vertical one. I hope to be able to ditch the two-monitor setup next year and go to a single 8K display.
There are still good deals in mini PC land. Yes, the M4 is faster, but there are loads of mini PCs with decent CPUs, 32GB RAM, and 1TB of SSD storage for under $600. I think a lot of people doing basic tasks will get more value out of the larger, upgradable SSDs than out of the faster CPU.
I bought one of these once. The specs on paper look good, but the CPUs are weak. They’re like those U series Intel CPUs where you could get say an i7-7700U, with 4 physical cores and 8 total threads, but at 15W TDP you were never really going to benefit from the 4 cores and 8 threads.
I do love those particular boxes for certain workloads. I have a few Lenovo ThinkCentre small form factor boxes in my office. They’ve replaced all my Raspberry Pis. Unlike the Pi, I was actually able to purchase these!
Yeah the Pi is way too expensive. NUCs are a better deal (and roughly the same price), and x86 is obviously going to smoke ARM in certain workloads (e.g. running a VPN server that has to encrypt/decrypt all traffic).
The concept of the Raspberry Pi is great, but the price point was never where it needed to be. Even when you could get one for around MSRP, doing anything with it was so much added cost. Yeah $35 for a little computer is great, but you needed a power supply, case, microSD card, and whatever hats (the hats have always been overpriced IMO).
I know a couple of iOS developers who recently switched to an M4 MacBook Pro, and they swear that in some frequent workloads it feels sluggish and slower than the old Intel MacBook Pros. Being RAM-starved might have something to do with it, though.
> but there's loads of mini PCs with decent CPUs, 32GB RAM and a 1TB of SSD storage for under $600.
I also add that, unlike Apple hardware, these miniPCs are built with extensibility in mind. For example, most NUCs from the likes of minisforum and Beelink ship with a single SSD but support multiple SSDs, with their cases also having room for SATA drives. They even go as far as selling barebones versions of their NUCs, where customers can then pick and choose which RAM and SSDs to add.
From my experience, TCO on most apple products ends up being roughly the same when you factor in resale value.
You'll be able to sell your M4 mac mini in 5 years for $150 for an instant-cash offer from backmarket or any other reseller, while you'd be lucky to get $30 for the equivalent Beelink or BOSGAME after 6 months on ebay.
> From my experience, TCO on most apple products ends up being roughly the same when you factor in resale value.
This reads like the epitome of Apple's reality distortion field. I mean, you're trying to convince yourself that a product is not overpriced compared to equivalent products, and not price gouging its customers, by asserting that you might be able to sell it later. That's quite the logical leap.
No that's an accurate TCO calculation.
It's interesting that on this topic, the inventor of the PC also seems to be caught in that supposed "Apple reality distortion field" and can't confirm the "price gouging" that you're trying to convince yourself Apple practices.
I’m curious about your definition of the word waste. If a $600 Mac lasts 5 years and is still worth $150, and another machine loses all its value in six months, how is the Mac a waste?
600-150 is a bigger number than 30, last time I checked. So even if the $30 machine were to lose all its remaining value instantly, not even scrap metal, you would be far, far ahead.
These are the dollar numbers claimed in the above post.
I do think we should at least use the same measure of time to compare. Even if that means one reaches $0 by the time the other reaches $150.
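Spelling out that arithmetic (using the dollar figures claimed upthread, which I haven't verified, and assuming similar purchase prices):

```python
def net_cost(price, resale_value):
    # Net cost of ownership: what you paid minus what you recover.
    return price - resale_value

print(net_cost(600, 150))  # Mac mini after ~5 years: $450 net
print(net_cost(600, 30))   # comparable mini PC resold later: $570 net
```

Whether the comparison is fair depends on matching the holding periods, as noted above.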
Macs do generally hold their resell value better than PCs, but that doesn’t necessarily have any correlation to usefulness.
I have bought several ThinkCentre small form factor PCs used for about $200 each, and they’ve each been about 5-7 years old. They’re perfectly fine, and I can even get new parts from Lenovo, depending on the part and machine. A fantastic deal. They run loads of services in my home.
> You'll be able to sell your M4 mac mini in 5 years for $150 for an instant-cash offer from backmarket or any other reseller
If you want to put in a bit of elbow grease, you can get a much better deal. M1 Mac Minis in my area are regularly selling for $350+ on FB Marketplace right now.
I just checked out backmarket as I've been shopping for a mini PC with oculink and hadn't thought of them. They have a primary nav across the top of the site which has 5 generic categories (laptops, consoles etc.), one Google product (pixel), 4 Samsung items, and 20 Apple items - more than all the others put together. I guess this very much proves your point.
No need for a black market, there's plenty of public ones (Backmarket, eBay, etc.). That being said $200 seems not terrible given the step change in performance since then (I own a 2019 MBP and think we were very unlucky with our purchase timing). Backmarket seems to sell yours for ~$350-500, so maybe you'll get a little bit more trade-in for it.
I owned a 2014 MBP (~$1200?) for a long time and as late as 2019 it was resellable for $500.
> I think for a lot of people for basic usage they'll get more value out of the larger and upgradable SSDs than the faster CPU
Why exactly?
What are a "lot of people" storing on their computers these days? Photos are in the cloud or on our phones. Videos and music are streaming. Documents take up no space. Programs are in the cloud (for the most part).
None of them have the proper HDMI 2.1 FRL port needed to run a 4K 120Hz monitor. Likely because Iris Xe / the AMD equivalent doesn't support it, and dedicated ITX GPUs are expensive. This isn't a problem with the M4.
I would second this! The N100 is super efficient, and can often be found for around $150. I can also recommend looking at used Intel “NUC” mini PCs if you’re budget conscious. I have a couple of 5th-gen i5 NUCs I got for $60 that run multiple VMs and LXC containers as part of a Proxmox cluster.
Another valid option is a Synology NAS: not only can you build the storage you probably want (I have 12TB with one-drive redundancy and one slot spare, with an SSD read cache), but you can also run containers on them.
Not sure what's best to recommend; what I can say is to stay away from GIGABYTE.
I got a BRIX, which gave me nothing but trouble. Its UEFI is very picky about SSD brands; I wasted money on a couple that are now being used as external drives, and in the end it didn't even work properly with Windows.
It is now collecting dust, waiting for the city hall disposal round.
It starts at €599.00 for a 2(!)-core Celeron. Seems absurd when you can get a Mini for an extra €100 (you can run Linux/Windows in a VM and still get a magnitude or few better perf). Or even a used old NUC or something; you'd need to go back very far to find a crappier CPU...
So the actual starting price seems to be €900-1000 (i.e., if you want an i5...).
The Celeron G6900 has a 46W TDP and seems to be around ~20% (multicore) slower than the <10W N100. Seems absurd that they are pushing garbage like that at such prices (even if it's the base config).
Cirrus7 is expensive because you are paying for a very high quality machined chassis & case that act as a massive fanless heatsink. Those alone are pretty costly. The price cannot be compared with cheap NUC clones and mini PCs, nor with Apple.
I am not endorsing any particular brand, but Cirrus7 is not that expensive within the fanless market, and the quality of the entire build is very high. They also sometimes offer nice discounts for students and SMEs. There are quite a few comparable brands, and also DIY options with cases from Streacom or Akasa. If you want something cheaper, Minix is pretty inexpensive, especially when you take into consideration that they offer a decent fanless enclosure.
The higher-end configs seem fine, even if a bit pricey (still, the Mac Mini seems like great value if you're fine with the OS situation and non-upgradable memory).
I still find it weird/confusing why a reasonably high-end brand would be selling configs with such horrible CPUs (especially on perf/watt, considering the whole fanless thing).
But I suppose they hardly have any options if they want a socketed motherboard. Laptop chips would probably be a lot better value (both cost- and heat-wise), but then it's no longer modular, and e.g. Lunar Lake doesn't(?) even support non-soldered memory...
That is a good question. They sell those CPUs to industrial clients. Note cases can be configured to be completely sealed for high-dust environments, and it is also possible to get industrial motherboards with connections that no regular user needs. The fanless market has a pretty good niche in factory deployments. There, lots of software is designed to run 24/7 on cheap CPUs.
Mac Minis, and in general most Apple products, tend to offer great value in the lowest configuration. But upgrades are expensive. It is a bit of the opposite situation. I have not used Macs for very long, and I prefer Linux, yet the cheapest Mini looks quite appealing. When Intel Core 2 CPUs entered the laptop market, it was a similar situation. The cheapest MacBooks offered really great value compared to competition.
Install your favorite flavor of Linux then. Beelink devices have a good reputation for being quite happy with a new OS. They're more compatible than the latest Apple devices, that's for certain.
Apple has never put any technical or legal obstacles in the way of installing other operating systems on Mac hardware. Nor do they assist in any way, it's consistent benign neglect.
The old Intel machines made excellent Linux boxes, excepting the TouchBar era because the TouchBar sucked (it was possible to install Linux, it would display the fake function keys, they worked, but not a good experience). I've converted two non-TouchBar Mac laptops into Linux machines, with zero complaints, one of them is in current use (not the laptop I'm typing on this instant however).
Now there's Asahi, which as a sibling comment points out, will surely be supported for M4 eventually. This is a great time to buy the M2 Minis and put Linux on them, if that's what you're into. Or you can wait around for the M4 port, whatever suits your needs.
Yet they made BootCamp.
Do you see how foolish you look trying to defend nonsense?
Apple tries to avoid being too heavy-handed with the lockdown because they know the outrage it would cause among their legacy customers. Boil the frog slowly.
But they most definitely are trying to make the Mac more like an iPhone and they would rather not you install any other OS on it.
The bootloader not being completely locked is more for legacy reasons and multi-macOS support (dev/debug), but if you ever have a problem with it, you will need (surprise, surprise) another Mac for a DFU restore, just like an iPhone.
I have a Minisforum mini PC. The first thing I did was wipe Windows and put Pop!_OS on it. Super happy with it. That said, getting anyone who isn't used to Linux to use anything other than Windows is as easy as pulling teeth. People go toward what's familiar, even when what's familiar is objectively trash that spies on you.
I don't try to get others to use Linux anymore. "Anyone who isn't used to Linux" can keep doing whatever it is they're already doing. So long as we can use it, I'm happy. I care about Linux usage only as far as it makes it harder for companies to ignore or block us.
> I think this easily beats any sort of desktop PC you can buy at that price (let's exclude custom builds, they're not the same market).
This is squarely in the NUC/SFF/1l-pc territory, and there is plenty of competition here from Beelink and Minisforum.
I just found the Beelink SER7 going for $509, and it has an 8-core/16-thread Ryzen 7 CPU and 32GB of RAM. The 8845 in the Beelink is very competitive[1] with the M4 (beaten, but not "easily"), and it also supports memory upgrades of up to 256GB.
If local LLMs become mainstream then you want as much memory bandwidth as possible. For regular home and office use two channels of DDR4 is more than enough.
It is no more than 60 GB/s for extreme overclocked DDR4-4000, and sometimes much less than 50 GB/s for regular 3200.
DDR5 is reaching 100 GB/s overclocked for Intel, and 50-70 GB/s stock.
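Where those figures come from, as a back-of-envelope (theoretical peak only, which is the caveat raised in the reply below):

```python
def peak_gb_s(mt_per_s, channels=2):
    # Peak bandwidth = transfer rate x 8 bytes per 64-bit channel x channels.
    return mt_per_s * 8 * channels / 1000

print(peak_gb_s(3200))  # DDR4-3200, dual channel: ~51 GB/s
print(peak_gb_s(4000))  # DDR4-4000, dual channel: ~64 GB/s
print(peak_gb_s(6000))  # DDR5-6000, dual channel: ~96 GB/s
```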
When factoring in motherboard, CPU etc, then yes. The max speed is only theoretical, unlike the Apple chips which actually benchmark on the speed specified.
That's such an Apple fanboy trope.
The bandwidth is shared with the GPU part that actually uses most of it.
You can't starve the CPUs for data in a typical PC with "standard" DDR5 bandwidth, and PCs have much higher bandwidth on the GPU side.
You know, it's almost as if PC industry hardware designers are not complete morons.
There's a huge difference there. Those PCs have to be ordered from AliExpress or some other Chinese site, or else from Amazon via third-party resellers that add their own markup on top.
Neither gets you any kind of useful warranty, at least for most people, who are unwilling to deal with overseas companies.
Apple has actual physical stores, and a phone number you can call.
> Those PCs have to be ordered from Aliexpress, or some other Chinese site, or else from Amazon via a third party resellers that adds their own markup on top
I anticipated this concern, the $509 I gave earlier is the Amazon price that includes the mark-up. The Beelink SER7 costs only $320 on AliExpress.
Modern solid-state electronics are very reliable; most reliability issues for electronics are related to screens or batteries, which desktop computers lack. I guess there was a bad-capacitor problem over a decade ago, but nothing since then. If your risk aversion for a desktop computer is high, you pay the Apple premium (possibly buying AppleCare), or self-insure by buying two SER7s for nearly the same price ($640) as one regular M4 Mac Mini and keeping the second as a spare.
If you're ordering them in the context of a larger buying program, like a university or other office, you'd at least get some sort of account rep and Apple support as well. I'm not sure if you could get that from Beelink, could you? I see some benefit in that use case.
But that's aside from the main topic which was the personal and home use case. On that topic you get a decent set of products as well such as Pages/Numbers/etc. and others along with software support for the Mac Mini. I'm guessing the Beelink runs on Linux? That may be hard for some to work with (which is unfortunate since it's really not), or maybe they have to separately buy a Windows license? Something to consider in the comparison.
1) external storage to become faster and cheaper every year (subject to constraints around interface)
2) more and more digital assets to be cloud-native, e.g. photos stored exclusively on icloud and not on your computer
So I'm less worried about storage than some. If Asahi Linux achieves Proton-like compatibility with games [0], then we're getting closer to the perfect general purpose game console.
With Thunderbolt 5, once external SSD enclosures supporting it exist, there should finally be zero performance penalty for external vs internal storage. Then you can build a 1PB array, if you want.
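The back-of-envelope link math (theoretical line rates, ignoring protocol overhead; the drive figure is an assumed typical spec):

```python
tb3 = 40 / 8    # Thunderbolt 3: 40 Gbit/s -> 5 GB/s
tb5 = 80 / 8    # Thunderbolt 5: 80 Gbit/s -> 10 GB/s
gen4_ssd = 7.0  # fast PCIe Gen 4 NVMe drive, sequential GB/s (assumed)

print(gen4_ssd > tb3)  # True: a Gen 4 drive saturates a TB3 link...
print(gen4_ssd < tb5)  # True: ...but fits comfortably inside TB5
```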
Indeed. If anything, one should realistically weigh the “physical world” hassles of permanent external storage more than the performance ones:
• Risk of accidental unplugging.
• Contacts may become wonky over time → see above.
• The need to sacrifice a port (or the portion of one in the case of a dongle).
• Enclosures tend to have annoying IO lights.
• Takes a bit of space.
All of these can be solved, especially when dealing with a desktop that stays in place. Paradoxically, there was never a better time to be modest with internal storage.
Although I will say:
> photos stored exclusively on icloud and not on your computer
Over my dead body :) If there’s one thing I’ll always happily carve out SSD space for, it’s local copies of my photo library!
We can expect different storage solutions by product depending on how fast things need to be. It doesn’t need to be lightning quick to load a frame in a movie, for instance, which is why streaming dominates there.
Yeah exactly my thoughts.
I have been trying all kinds of services since 2010 and it always comes back to that.
Even with very fast fiber, there is no realistic way to get the latency into desirable territory. It makes a large amount of games borderline unplayable and a whole lot extremely annoying.
Basically, the only things that half-work are slow-paced story games and slow strategy games (mostly turn-based), which ironically require few resources most of the time (so why pay for a cloud service?!).
Like most things "cloud", it is conceptually seductive but in practice extremely compromised.
It's a major hassle and if you are going to get a small box to plug in all kinds of other small box around it with a web of cables, you might as well get a bigger box and put it all inside...
This doesn’t seem to be true, and I don’t even get what it would change if it were true. Developers aren’t the target demographic of the base version with low storage.
Docker in macOS (at least the useful one) just runs in a Linux VM, and I don't see why you couldn't run a VM off an image on an external drive. Maybe the UI doesn't let you select that location?
Apple offering expensive upgrades for storage and memory pre-dates the existence of iCloud storage by decades. It was entirely standard before MobileMe, or Apple offering any kind of "cloud" services.
Apple just charges a lot of money for upgrades, and did even when it was trivial to do them yourself; they're certainly not going to change now that they've made any kind of internal upgrade impossible.
I’m a mid-30s developer, and I use a Mac mini for all my hobby development. I’m planning to get an M4 mini to replace my current M1 mini. I like hooking up my own monitor and peripherals; I don’t like working hunched over a small screen and a cramped keyboard. Plus, an M4 Mac mini with 32GB RAM is only $999, while the most closely specced MacBook Air (an M3 with 24GB RAM) is $1299, and the M4 MacBook Pro with 32GB RAM is $1999. So, on your last point about cost: why should I throw away an extra $1000 for no reason?
IMHO it's not, as NUC-style mini PCs with x86-64 CPUs from AMD and Intel are really cheap, and the 256GB of storage is way too small, making the "real" price $200 higher for any sort of moderate usage.
> I think this easily beats any sort of desktop PC you can buy at that price
Not really. Do some quick googling for cheap mini PCs from brands such as Minisforum or Beelink. Years ago they were selling Ryzen 5 and Intel i5 boxes with 16GB of RAM for around $300. No "educational pricing" bullshit either, just straight from Amazon to anyone who bothered to click a button.
Then you have to factor in supporting those systems, because you will be the one they call. This is one of the major upsides to family & friends buying Macs.
Why not? As long as you discount the price of the new product by the perceived value of new vs used, that’s the correct comparison to make. If a used product is the same quality and $100 cheaper, and just having something that’s new is not worth $100 to you, you should pick the used option. The goal is to get the best value per money spent.
Mental gymnastics; used is not new. Even "new" isn't necessarily new when it's not sold by an authorized seller, because that can invalidate the warranty.
New is new and has legal ramifications; you cannot compare them unless you're throwing in a trustworthy extended warranty that matches -- and pretty much nothing matches Apple in that regard.
16GB base RAM across the board, following the iMac. AI is certainly good for pushing up the baseline RAM that manufacturers can get away with shipping if nothing else.
This is huge for AI/ML, at least for inference. Apple chips are among the most efficient out there for that sort of thing; the only downside is the lack of CUDA.
Lack of CUDA is not a problem for most ML frameworks. For example, in PyTorch you just tell it to use the “mps” (Metal Performance Shaders) device instead of the “cuda” device.
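A minimal sketch of what that device selection looks like in PyTorch:

```python
import torch

# Prefer CUDA where present, fall back to Apple's Metal backend, then CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")
else:
    device = torch.device("cpu")

model = torch.nn.Linear(128, 64).to(device)
x = torch.randn(32, 128, device=device)
y = model(x)  # on an M-series Mac this dispatches to Metal Performance Shaders
print(y.shape, device)
```

As the replies note, this covers stock ops; custom CUDA kernels are a different story.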
That simply isn't true in practice. Maybe for inference, but even then you're running up against common CUDA kernels such as FlashAttention which will be far from plug and play with PyTorch.
I tried training some models using tensorflow-metal a year ago and was quite disappointed. Using a ReLU activation function led to very poor accuracy [0], and training time was an order of magnitude slower than just using the free tier of Google Colab.
To be totally honest, there's enough money in the ML/AI/LLM space now that I fully expect some companies to put forward alternative cards specifically for that purpose. Why Google does not sell their TPUs to consumers and datacenters instead of just letting you rent them is beyond me.
Yep, there are no performance x86 CPUs on the market with ambitious integrated GPUs, only laptop chips. Games are optimized for discrete GPUs; Apple didn't have that software inertia to deal with.
Sort of, obviously quite a few games are optimized for the PS5 and Xbox series X.
GPU cores are generally identical between the iGPUs and the discrete GPUs. Adding a PCIe bus (high latency and low bandwidth) and having a separate memory pool doesn't create new opportunities for optimization.
On the other hand having unified memory creates optimization opportunities, but even just making memcpy a noop can be useful as well.
GPUs are all about (compute and memory) bandwidth. Using the same compute-unit building blocks doesn't by itself make it go fast; you need a lot of compute units and a lot of bandwidth to feed them.
The performance dependency on dGPUs doesn't come from the existence of a PCIe bus and partitioned memory, but from the fact that the software running on the dGPU is written for a system with high-bandwidth memory like GDDR6X or HBM. It creates opportunities for optimization the way hardware properties and machine balances tend to: the software gets written, benchmarked, and optimized against hardware with certain performance properties and constraints (compute/bandwidth balance, memory capacity, and whether the CPU and GPU share memory).
> Apple chips are among the most efficient out there for that sort of thing
Not really? Apple is efficient because they ship moderately large GPUs manufactured on TSMC's latest processes. Their NPU hardware is more or less entirely ignored, and their GPUs use the same shader-based compute that Intel and AMD rely on. It's not efficient because Apple does anything different with their hardware like Nvidia does; it's efficient because they're simply using denser silicon than most competitors.
Apple does make efficient chips, but AI is so much of an afterthought that I wouldn't consider them any more efficient than Intel or AMD.
For inference, Apple chips are great due to a high memory bandwidth. Mac Studio is a popular choice in the local Llama community for this particular reason. It's a cost effective option if you need a lot of memory plus a high bandwidth. The downside is poor training performance and Metal being a less polished software stack compared to CUDA.
I wonder if a little cluster of Mac Minis is a good option for running concurrent LLM agents, or a single Mac Studio is still preferable?
The memory bandwidth on Apple silicon is only sometimes comparable to, and in many cases worse than, that of a GPU. For example, an nVidia RTX 4060 Ti 16GB (not a high-end card by any means) has a memory bandwidth of 288GB/sec, which is more than double that of the M4.
On the higher end, building a machine with 6 to 8 24GB GPUs such as RTX 3090s would be comparable in cost (as well as available memory) to a high-end Mac Studio, and would be at least an order of magnitude faster at inference. Yes, it's going to use an order of magnitude more power as well, but what you probably should care about here is W/token which is in the same ballpark.
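"W/token" here presumably means energy per token, i.e. joules per token (watts times seconds per token). A sketch with hypothetical wattage and throughput figures, purely to show why very different rigs can land in the same ballpark:

```python
def joules_per_token(watts, tokens_per_s):
    # Energy per token = power draw / generation rate.
    return watts / tokens_per_s

print(joules_per_token(2400, 120))  # hypothetical 6-GPU rig: 20 J/token
print(joules_per_token(200, 10))    # hypothetical Mac Studio: 20 J/token
```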
Apple silicon is a reasonable solution for inference only if you need the most amount of memory possible, you don't care about absolute performance, and you're unwilling to deal with a multi-GPU setup.
Edit: since my reply you have edited your comment to mention the Studio, but the fact remains that the M2 Max has at least ~40% greater bandwidth than the number you quoted as an example.
Exactly, the M2 Ultra is competitive for local inference use cases given the 800 GB/s bandwidth, relatively low cost, and energy efficiency.
The M4 Pro in the Mini has a bandwidth of 273 GB/s, which is probably less appealing. But I wonder how it'd compare cost-wise and performance-wise, with several Minis in a little cluster, each running a small LLM and exchanging messages. This could be interesting for a local agent architecture.
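A common back-of-envelope for that comparison: decoding is memory-bound, so each generated token has to stream the active weights through memory once, giving tokens/s <= bandwidth / model size (ignoring KV cache, prompt evaluation, and compute limits):

```python
def max_tokens_per_s(bandwidth_gb_s, model_size_gb):
    # Upper bound on decode speed for a memory-bound LLM.
    return bandwidth_gb_s / model_size_gb

print(max_tokens_per_s(800, 40))  # M2 Ultra, ~70B at ~4-bit: ~20 tok/s
print(max_tokens_per_s(273, 40))  # M4 Pro: ~7 tok/s ceiling
```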
See my sibling reply below, but I disagree with your main point here. M2 Ultra is only competitive for very specific use cases, it does not really cost less than a much higher-performing setup, and if what you care about is true efficiency (meaning, W/token, or how much energy does the computer use to produce a given response), a multi-GPU setup and Mac Studios are on about equal footing.
For reference comparing to what the big companies use, an H100 has over 3TB/s bandwidth. A nice home lab might be built around 4090s — two years old at this point — which have about 1TB/s.
Apple's chips have the advantage of being able to be specced out with tons of RAM, but performance isn't going to be in the same ballpark of even fairly old Nvidia chips.
The cheapest 4090 is EUR 110 less than a complete 32GB M2 Max Mac Studio where I live. Spec out a full Intel 14700K computer (avoiding the expensive 14900) with 32GB RAM, NVMe storage, case, power supply, motherboard, 10G Ethernet … and we are approaching the cost of the 64GB M2 Ultra, which has memory bandwidth more comparable to the Nvidia card's, but with more than twice the RAM available to the GPU.
That's my point. I would absolutely be willing to suffer a 20% memory bandwidth penalty if it means I can put 200% more data in the memory buffer to begin with. Not having to page in and out of disk storage quickly makes those 20% irrelevant.
If you have enough 4090s, you don't need to page in and out of disk: everything stays in VRAM and is fast. But it's true that if you just want it to work, and you don't need the fastest perf, Apple is cheaper!
How is that relevant, when the discussion from the start was a cost-benefit comparison of a two-year-old Mac with a two-year-old GPU?
In any case, how are you going to fit 50+GB into two (theoretically 24+24GB) Nvidia cards without swapping to disk, when the Mac in question has 64GB (also theoretical) available?
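The sizing arithmetic behind that question, as a quick check (weights only; the KV cache and activations add more on top):

```python
def vram_check(params_billions, bits_per_weight, n_gpus, vram_gb=24):
    # A model with P billion params at B bits/weight needs ~P*B/8 GB.
    weights_gb = params_billions * bits_per_weight / 8
    return weights_gb, weights_gb <= n_gpus * vram_gb

print(vram_check(70, 4, 2))  # (35.0, True): 70B at 4-bit fits in 2x24GB
print(vram_check(70, 8, 2))  # (70.0, False): at 8-bit it needs more cards
```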
You seem confused. Please feel free to read my post near the top of this very chain of comments, where I specifically compare a Mac Studio to a machine with 6 to 8 Nvidia GPUs. That was the discussion “from the start.”
> In any case how are you going to fit 50+GB in two (theoretically 24+24 GB) Nvidia cards
What seems like a joke about it? And relevant to what, exactly?
The parent of my initial comment in this thread said: "For inference, Apple chips are great due to a high memory bandwidth... It's a cost effective option if you need a lot of memory plus a high bandwidth."
My post was attempting to explain at a high level how 1) Apple SoCs do not really have high memory bandwidth compared to a cluster of GPUs, and 2) you can actually build that cluster of GPUs for the same cost or cheaper than a loaded Mac Studio, and it will drastically outperform the Mac.
If you want specifics on how to build such a GPU cluster, you can search for "ROMED8-2T 3090" for some examples.
Yeah, sorry, I realized that as well so I edited my post to add a higher end example with multiple 3090s or similar cards. A single 3090 has just under 1TiB/sec of memory bandwidth.
One more edit: I'd also like to point out that memory bandwidth is important, but not sufficient for fast inference. My entire point here is that Apple silicon does have high memory bandwidth for sure, but for inference it's very much held back by the relative slowness of the GPU compared with dedicated nVidia/AMD cards.
It's still "fast enough" for even 120b models in practice, and you don't need to muck around with building a multi-GPU rig (and figuring out how to e.g. cool it properly).
It's definitely not what you'd want for your data center, but for home tinkering it has a very clear niche.
> It's still "fast enough" for even 120b models in practice
Is it? This is very subjective. The Mac Studio would not be "fast enough" for me on even a 70b model, not necessarily because its output is slow, but because the prompt evaluation speed is quite bad. See [0] for example numbers; on Llama 3 70B at Q4_K_M quantization, it takes an M2 Ultra with 192GB about 8.5 seconds just to evaluate a 1024-token prompt. A machine with 6 3090s (which would likely come in cheaper than the Mac Studio) is over 6 times faster at prompt parsing.
A 120b model is likely going to be something like 1.5-2x slower at prompt evaluation, rendering it pretty much unusable (again, for me).
You're mostly correct, though a 4060Ti 16GB is 20-30% cheaper than the cheapest Mac Mini. More importantly though, "fits inside a Mac Mini" is not a criterion I'm using to evaluate whether a particular solution is suitable for LLM inference. If it is for you, that's fine, but we have vastly different priorities.
I'm not sure what you mean. RTX 4060 Ti/4070 Ti Super/3090/4090 cards can be easily purchased at any major electronics store in person or online and have 16GB or 24GB depending on model. Once you get up to 32GB, your point would stand, but 16-24GB GPUs are common.
> I know ancient iGPUs had that thing for setting the GPU memory size in the BIOS, but that's aaaaaancient and completely obsolete. If you still have that, just set it to the minimum value. The rest of memory will be unified.
I hadn’t used a PC in so long, I still thought that bios setting decided the division. TIL.
You do have the option of a 10 gigabit Ethernet port, so you can build out a linux box for local shared storage with components as cheap as you're willing to trust.
As someone who just built a 16TB SSD array over Thunderbolt 3 (the best I could find) at 40Gbps, where the interface is still the bottleneck (disks are fast now!), I can say 10Gbps is going to feel really, really slow vs internal storage.
It's possible to build a faster non-shared array if you aren't price sensitive (Thunderbolt 5 is 80 gigabits a second), but someone with multiple computers and devices gets much better bang for the buck from shared local network storage.
As a bonus, you can back up your computers and iDevices to the shared local storage instead of paying for (probably much slower to access) cloud storage.
It's $800 to go from the 256GB model up to 2TB, which is just criminally overpriced; I can get double that capacity for half the price with a Gen 4 NVMe drive. Weirdly, the 8TB upgrade on the Pro is at least in line with top-of-the-line 8TB NVMe SSDs, though there are cheaper options at about $600 vs the $1200 Apple is charging.
That's regular Apple pricing for you. Great deals on the baseline models, but insane margins on the upgrades that make them usable. And of course the ability to upgrade the devices yourself has been phased out in the name of performance and power efficiency
I just checked some Dell prices: $730 to upgrade an XPS desktop from 512GB to 4TB (Apple charges $1200), or $508 to upgrade an Optiplex tower from 256GB to 2TB QLC, or $654 to upgrade it from 256GB to 2TB TLC (Apple charges $800). Scalping on upgrade pricing is something all the PC OEMs do.
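Normalizing those quoted upgrade prices to dollars per added gigabyte makes the comparison easier (thread figures, not verified):

```python
def usd_per_gb(price_usd, from_gb, to_gb):
    # Upgrade price divided by the capacity actually added.
    return price_usd / (to_gb - from_gb)

print(usd_per_gb(800, 256, 2048))   # Apple 256GB -> 2TB:  ~$0.45/GB
print(usd_per_gb(508, 256, 2048))   # Dell QLC same jump:  ~$0.28/GB
print(usd_per_gb(1200, 512, 4096))  # Apple 512GB -> 4TB:  ~$0.33/GB
print(usd_per_gb(730, 512, 4096))   # Dell XPS same jump:  ~$0.20/GB
```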
Yeah, but the 5-minute job of installing a cheap retail SSD in that Dell machine yourself is still an option, one which Apple has removed from all but the Mac Pro, which in turn offsets any SSD savings by being $3000 more expensive than an equivalently specced Mac Studio.
I think it's more relevant with Apple because they've removed all the competition for basically all upgrades to their devices by either soldering things to the board or bundling them into their SOC. When there's no alternative their prices become the only option.
The Mac Studio technically still has socketed SSDs, which presumably cuts costs by not having to manage a separate motherboard SKU for every SSD capacity, but they went out of their way to design a proprietary SSD module format rather than just using the standard...
Kind of: they were socketed NAND cards, and the controller lived on the mainboard, so it doesn't get you out of the problem of Apple unilaterally setting the prices for everything. As far as I can find, no one has managed to create a compatible card that would open an avenue for DIY upgrades. I've found a few upgrades, but they consist of buying an entire second Mac Studio to harvest the drive from.
Giles from Polysoft is manufacturing third-party Mac Studio NAND cards. There is still a problem sourcing NAND, because Apple doesn't let the OEM sell to anyone but them.
I thought I had seen something about that but couldn't find the actual boards for sale. It sounds even worse, though, because upgrades should be possible, but Apple being Apple has ensured there's no supply for Giles or other companies to perform repairs.
Yes, though that relies on the product being fairly popular and on the chip staying stable for a while for it to be a useful source. Mac Studio NAND chips aren't going to be readily available from them unless they happen to be a part shared with a more popular device.
Apple's segmentation takes place in the NAND firmware. The firmware contains the location in the storage configuration, and this may or may not be rewritable. Iboffrc has done a video explaining how some of it works. It's all from reverse engineering, though.
I think repairable or not, the pricing to upgrade to the max config just isn't something a price-sensitive consumer should take seriously. Beyond one or two upgrades, you might as well pretend it says "call for quote" and just not consider 4+ TB as a realistic option to get from the OEM, because those prices are trying to cause sticker shock. And that goes for any PC OEM—the price-gouged upgrades are so far beyond reasonable that it really doesn't matter whose prices are the most silly or by exactly how many hundreds of dollars. What does matter is whether aftermarket upgrades and repairs are possible.
It matters that there are no after market options for Apple because it means the inflated OEM upgrade price is the ONLY price available for every given upgrade. It matters less with Dell/Asus/Lenovo etc. because that's not the only price available.
The top of the line is also not where Apple is gouging the worst. It's in the middle tiers that are actually relevant to many more people. Most don't have a need for 4+ TB main drives but 1-2 TB is a size that's pretty easy to justify for a lot of people and Apple's price is the only option for them and they're absolutely lining their pockets with cash at the expense of anyone not going for the bargain bin basic tier that can't hold 2 modern games.
By all means, complain about the cost to get a 1TB config, and put that price in proper context. But it still doesn't make much sense to focus on the $1200 upgrades, or any of the other upgrades whose price rounds up to "lol, no".
The gouging is bad at all levels, as are the design's effects on repairability and the issues with obsolescence. If you look back, though, I acknowledged that the $1200 is at least vaguely in line with a top-of-the-line 8TB Sabrent NVMe of the same size.
"Repairability" is a red-herring when the discussion is about user-upgrades and the ability to purchase components from 3rd-party suppliers (who compete against each other and the OEM)
Apple's silicon is good but I don't see what's so special about all the other stuff. Looks like they just solder components to the motherboard instead of using industry standard interfaces.
> I can get double that for half the price with a Gen 4 NVMe drive
It's worse than that -- 4TB gen 4 drives can be had for well under $300, sometimes $225-250, and that's for buying a drive outright, not "trading up" from a 256GB device. I think it'd be more accurate to say that you can get double the capacity for a _quarter_ of the price.
I was ballparking it based on my recent purchase of a Samsung 990 Pro 4TB, and inflated the price a little in my head to closer to $400 than the $330 it actually was.
As a side note, I also try to give the loosest, most favorable (to the other side) comparison, because when I err toward my own side it often turns into a "well, actually" debate about how it's "not quite X times as much, more like X-1" (never mind that X-1 is still quite bad), which is really tedious and annoying, especially when even the favorable version of the comparison is still quite bad for their point.
Very much so. When I bought my "cheesegrater" Mac Pro, I wanted 8TB of SSD.
Except Apple wanted $3,000 for 7TB of SSD (considering the sticker price came with a baseline of 1TB).
I bought a 4x M.2 card and four 2TB Samsung Pro SSDs; it cost me $1,300, I got to keep the 1TB "system" SSD, and it was faster, at 6.8GB/s versus the system drive's 5.5.
Similar with memory. OWC literally sells the same memory as Apple (same manufacturer, same specifications). Apple also wanted $3,000 for 160GB of memory (going from 32 to 192). I paid $1,000.
You can get some decent size external usb SSDs. I have the Samsung T5 2TB. I think they have larger models now. Works pretty well. And with USB-C speeds are very usable. You can probably get faster/bigger stuff via thunderbolt.
I'm considering getting one along with a nice big monitor or TV. It needs to run X-Plane 12 at decent speeds and maybe support a bit of light gaming. My M1 Pro MacBook is actually pretty decent for this, but the screen is too small for me to easily read the instruments. I expect this will do better, even in the base setup.
Otherwise my needs are pretty modest. I'd love to see Steam add some emulation support for these things, as I have some older games that I enjoy playing. I currently play those on a crappy old Intel laptop running Linux. I've also been eyeing a new AMD mini PC with the latest AMD stuff (Beelink's SER9).
That seems pretty nice as well, and looks like more performance for the money. Apple is doing its usual thing of charging you hundreds of euros for 50-euro upgrades. Get the base Mac Studio instead; it probably makes more sense if you are going down that path.
It's USD 2,400 to upgrade the M4 Pro model from 512GB to 8TB, which feels a bit steep, but it's an option.
Alternatively, you can get one of these[1] external Other World Computing NVMe SSDs for USD 1,190 right now. And then you can easily move all your files from your laptop to your desktop when you get home.
There are no slots, it's all soldered directly to the motherboard. Even in the Mac Studio, which does use modular SSDs, they're proprietary modules rather than anything you can easily swap out yourself.
Fortunately I don't really see the point of using a Mac mini, so this doesn't bother me too much, but... it's poor taste. "You're holding it wrong" was not cool the first time.
It kind of undermines the sleek form factor if you need to have a clunky NVMe enclosure dangling off the back though. Even with this tiny new design I bet they could fit a hatch on the bottom with space for a 2230 M.2 drive, but they don't want to because that would let you upgrade to 2TB of fast internal storage for $200 instead of $800.
Until someone brings out little two- or four-drive NVMe enclosures that fit exactly under the Mac Mini with a Thunderbolt bridge/plug that doesn't snag cables, because we all know Apple can't resist gouging buyers by refusing to include two easy-to-access M.2 bays on the underside.
I can't imagine anyone but Apple shareholders (drooling at the thought of overpriced soldered memory) would prefer a smaller Mac Mini case if ~0.5" more height would get you M.2 bays for storage.
There are a number of companies that specialize in making hubs/NVMe enclosures that match the aesthetics of the Mac Mini and sit directly underneath it.
It’s not the size of the NVMe enclosure that makes it clunky. It’s that you now have an extra dongle hanging off the back of your Mac and cluttering up the desk.
There is a reason for the popularity of those enclosure/hub combos that have the same footprint and color as the Mini.
This is good news for me because I usually buy the base machine and accept its performance as a constraint on what I'm doing. I'm not sure it is all about AI though, Apple has been getting a lot of criticism for selling machines with just 8GB of RAM.
The RAM is expandable as well… however I am curious how well the extra RAM performs. Part of the M-series performance gain is from having the RAM dies very close to the processor.
I learned yesterday that all M-chip Macs with enough RAM are getting Apple Intelligence?
This basically proves that Apple shot themselves in the foot for AI on mobile by artificially restricting RAM for so long! Heck, even the Neural Engine has turned out to be basically useless despite all their grandstanding.
So alas, their prior greed has resulted in their most popular consumer iDevices being the least AI compatible devices in their lineup. They could’ve leapfrogged every other manufacturer with the largest AI compatible device userbase.
I think it's great that Apple was able to ship devices that millions of people made happy use of without needing to put additional hardware resources into them. That's efficiency, not greed.
I own almost every Apple ecosystem device, but I definitely wouldn’t call their mobile device RAM capacity as sufficient. It physically hurts me when my iPad Pro M2 and iPhone 16 Pro Max (earlier 15,14,13,12,11) start to swap out live apps - sure some apps retain state, but the majority still don’t. Even Safari randomly reloads tabs for me, while I’m just researching purchases across <10 live tabs.
No, they do. They just don't understand enough about technology/computing to know what's going on or they don't care (or at least pretend to, because they don't have any other experience).
It's really crazy that some android phones at half the price give a better browsing experience than many iPhones, especially the non "Pro" ones.
Yeah, on my S24U right now I have 8 or so apps open and 100 or so tabs in Firefox (enough that it just shows an infinity symbol rather than a tab count). That's with 12GB of memory.
To be fair, it seems to be browsers that kill memory on any platform. The web is now just mobile breakpoints loading needlessly high-res images.
Well, that's not a cheap phone, but yes, if you are going high-end, it's probably a better experience than a comparably priced iPhone.
Of course browsers are problematic for memory, but that's not new, and hilariously the first iPhone was supposed to work only with web apps. It's not as if they couldn't spend a few dollars more on their expensive premium hardware to guarantee a good experience for all use cases.
This is the big problem with Apple today: the milking at every single step. The stupid pricing ladder and Scrooge attitude are extremely distasteful and completely unreasonable given the prices asked.
> This basically proves that Apple shot themselves in the foot for AI on mobile by artificially restricting RAM for so long!
What they shot was us. My 14 Pro won’t do AI despite having a better NPU than an M1, all because Apple chose - intentionally - to ship it with too little RAM. They knew AI was coming and they did this anyway.
Although having played with it on my MBP it’s clear I’m not missing much. But still.
Well, given how heavily they're leaning on concepts like LoRAs with Apple "Intelligence", I don't imagine they've been planning this for as long as you think they have.
And their NPUs weren't added in anticipation of LLMs, imo. You give 'em too much credit.
It's Unified RAM. So that memory is also used for the GPU & Neural Cores (which is for Apple Intelligence).
This is actually why companies moved away from the unified memory arch decades ago.
It'll be interesting to see as AI continues to advance, if Apple is forced to depart from their unified memory architecture due to growing GPU memory needs.
If it's the shift I think you're referring to, I find it strange that you compare computing decisions from the 50s and 60s to today. You're correct, but that was over half a century ago. The reasons for those decisions, such as bus speeds, high latency, and low bandwidth, no longer apply.
Today, the industry is moving toward unified memory. This trend includes not only Apple but also Intel, AMD with their APUs, and Qualcomm. Pretty much everyone.
To me, the benefits are clear:
- Reduced copying of large amounts of data between memory pools.
>This is actually why companies moved away from the unified memory arch decades ago.
I don't understand - wouldn't the OS be able to do a better job of dynamically allocating memory between say GPU and CPU in real time based on instantaneous need as opposed to the buyer doing it one time while purchasing their machine? Apparently not, but I'm not sure what I'm missing.
The usual reasoning that people give for it being bad is: you share memory bandwidth between CPU and GPU, and many things are starved for memory access.
Apple’s approach is to stack the memory dies on top of the processor dies and connect them with a stupid-wide bus so that everything has enough bandwidth.
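As a rough back-of-the-envelope sketch of why that works (in Swift; the 256-bit / 8533 MT/s figures are the commonly reported M4 Pro specs, so treat them as assumptions rather than anything confirmed in this thread):

    // Peak DRAM bandwidth ~= bus width in bytes x transfer rate.
    // All figures below are assumed/reported specs, not measurements.
    func peakGBps(busBits: Double, megaTransfers: Double) -> Double {
        (busBits / 8) * megaTransfers / 1000
    }

    // M4 Pro: reportedly a 256-bit LPDDR5X-8533 interface.
    print(peakGBps(busBits: 256, megaTransfers: 8533)) // ~273 GB/s

    // Typical dual-channel desktop: 128-bit DDR5-5600.
    print(peakGBps(busBits: 128, megaTransfers: 5600)) // ~89.6 GB/s

The wide bus, not exotic memory chips, is where most of the headroom comes from.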
Depart? They just got there, didn't they? And on purpose. There's more memory bandwidth, and also no need to copy from main memory to VRAM. Why would they bail on it?
I think they moved away because system memory was lagging behind the memory used on video cards in speed?
And besides, what Apple is doing is placing the RAM really close to the SoC - I think they are even on the same package - and that was not the case on the PC either, AFAIK.
Apple has an Arm license, but they still buy memory from Samsung (and others); it's not on the M-chip die, it's supplied by Samsung and then packaged above the die.
At this point it feels like (correct me if I am wrong) Apple's AI is often performed "in the cloud". I suspect though that if Apple moves increasingly to on-device AI (as I suspect they will - if not for bandwidth and backend resource reasons then for privacy ones), Apple silicon will adopt more and more specialized AI components - perhaps diminishing the need for off-board memory.
Last I checked, Apple was pretty much the only major player who does everything that they can do on device on device, that is their whole ethos behind it, no?
I mean, there is no need to speculate about any of this, they've put out a number of articles that outline their whole approach. I'm not really sure where the ambiguity lies?
What a great little computer at a very reasonable price. A few interesting things with this announcement:
1. Interesting that they did not have this as part of an event. I think this means either they do not have much else to share around the Mac right now, or the opposite: there just won't be room to talk about the iMac and Mac mini. I am leaning towards the former, as I suspect the other computers in their lineup will just receive a spec bump soon.
2. On the product page (https://www.apple.com/mac-mini/) Apple highlights a number of third-party accessories, notably the PS5 controller and several keyboards and mice from different manufacturers. This seems small, but it would have been almost blasphemy under the Jobs era.
3. This is quite the little powerhouse. Honestly it is so good it eliminates the need for most people to even consider the Mac Studio.
Jobs was extremely pragmatic, which is why, for all his faults, Apple would always come up with a desirable technological solution.
Most things said about Jobs are pop-culture caricature, derived from people talking shit about him (mostly out of jealousy of his success).
> And the iMac's catch copy was "BYODKM" at the very start
I think you mean the Mac mini there. It was definitely built as the gateway into the Mac ecosystem: swap your tower for a Mac mini, and keep using the rest of your old hardware.
It was a weird win-win for Apple if I remember correctly. They were able to sell more iPods, get iTunes installed on all HP machines, and block HP from creating a rival music player. I honestly am not sure what HP got beyond the logo on the back of some iPods and the ability to try and associate themselves with a popular and cool product.
Apple has shown people gaming with PS5 controllers at events for I believe a few years now. Someone can fact check me on that but it's not the first time I've seen it.
I have to think economies of scale are coming into play for Apple. They can cut deals for chips and other components at a scale no one else is really capable of and they have the luxury of being able to pay up front in advance if they need to.
My M2 Mac Mini that I got for $499 is my favorite gaming computer I've had in a long time. It runs games like WoW, Dota, and League of Legends great. Anything it doesn't run due to macOS, I play via GeForce Now over ethernet. And this was with 8GB of unified memory - now with 16GB it'll be even better value.
Very excited to see how the GPU has improved in the M4, especially the Pro model.
I have a Steam Deck, Asus ROG Ally, M2 Mac Mini, M1 Pro laptop, M2 Max laptop (work). All of this runs on either an LG C3 42" OLED or a 34" 1440p ultrawide.
Linux GeForce Now can only do 720p or 1080p, can't remember which. Also, it's just kind of laggy in desktop mode. The Macs run so much smoother.
My current "main" desktop is actually my Asus ROG Ally. I use one USB C hub that is capable of 4k120hz, and I can move it between my Mac laptops and Asus ROG Ally very seamlessly.
The problem for me is Windows. Yesterday my start menu stopped loading for some reason and required a full reboot. Sometimes it refuses to go to sleep. Sometimes it refuses to come out of sleep. Sometimes a Windows update kicks off in the middle of a game and it slows everything to a crawl. Windows drives me crazy these days!
I purchased a Surface Pro 8 a year or so ago, thinking Windows would surely work better than usual when it's Microsoft's own hardware too.
But no: yesterday it got stuck in a boot loop after a Windows update broke the audio drivers somehow. The Windows logs/reliability report could only tell me it "shut down abnormally", without any technical details whatsoever.
I still have to use Windows on my desktop because of Ableton, but I'll never purchase any Microsoft hardware again, and as soon as I can, I'll run Ableton on Linux like the rest of my software.
Get rid of Windows on it! Digital Foundry put out a great video on this exact process the other day, weighing the pros and cons: https://www.youtube.com/watch?v=OwWRCrGoXV0
Neither MacOS nor Windows are very good console OSes - you're really better off using Linux where anticheat isn't concerned. Even on the Ally.
> Neither MacOS nor Windows are very good console OSes
They're great OSes for consumers who don't really work on their computers, and just want something that caters to the lowest common denominator.
For professionals who use computers for work, Linux is really the only option that doesn't eventually get in your way. You can set it up and leave it as-is, with only security updates, and everything keeps working the same way, basically forever.
I've tried to set up an experience like that on both macOS and Windows, but eventually the company will find a way of forcing an update on you, intentionally or not.
> For professionals who use computers for work, Linux is really the only option that doesn't eventually get in your way.
I really hope you're not expecting anyone to take you seriously with this. On principle I get what you're saying but in practice no one who works as a professional in any field has the time (or expertise) to be worried about configuring their operating system.
As a Linux evangelist who begrudgingly daily drives a Mac, this kind of attitude is what does us in. It's the cocksure "akshually Linux is best" even when it materially, experientially, just isn't.
Most professionals use only a handful of programs and don't really care about the OS beyond the basics it should do (file management, connecting to the internet, launching software, ...). Apple and Microsoft insist on doing other stuff that impedes you, while not letting you do basic stuff you want. The main issues with Linux are hardware support (which no one other than the manufacturers can solve properly) and professional development (the distributions are great, but monolithic development like FreeBSD's would have been better).
Linux is best because it lets you use your computer for whatever workflow you need.
What stuff is Apple doing that is keeping me from doing what I want?
I think it's a good thing for 99% of computer users to not be able to just run any random software they download off of the internet. Gatekeeper, XProtect, and notarization are unfortunately necessary in the hostile computing environment we live in today. Aunt Tilly will happily download "PhotoShop" from that sketchy Russian Warez site and infect her machine if these protections didn't exist.
For power users that know what they're doing it is trivial to just use something like Homebrew or to bypass these protections on a case by case basis as needed. I can also run software in a Linux VM quite easily as well for open source software that isn't well maintained on macOS.
This is about using the computer in a professional setting. You don't go and download Photoshop randomly (and if you do, that's irresponsible). But you install Windows Pro and there's a slew of random tiles and widgets everywhere. And you can't remove Apple apps (like TV or Music) from the M-series computers.
I’d argue those protections are even more important in professional settings. You probably know how to properly obtain third party software in a safe manner, but Tina in graphic design probably doesn’t.
Regarding the built in Apple apps, I’m not sure what is gained by removing them other than a negligible amount of disk space. If you remove them from the dock they become out of sight out of mind. Same thing with the pre populated widgets and tiles on Windows.
I'm a professional who is often forced to suffer Windows nonsense. At work Windows routinely wastes my time with absolute bullshit I couldn't care less about and which makes me negative dollars, even though it is basically a glorified Chrome launcher.
Professionals should absolutely take it seriously because time spent updating Windows or even just waiting around while it gets its shit together is time you could have spent doing your job and making money. In fact, Windows and its spontaneous updates with obnoxious focus stealing prompts are major risks to the integrity of your work and might cause you to have to redo it from scratch, lowering the value of your time even further.
Linux boots in less than ten seconds and is already ready to use. There are distributions for all levels of expertise, and if there's an IT department it should be managing those boxes anyway. All that's missing is the Microsoft Office suite and in the end that's what the Windows vs Linux battle always boils down to. People put up with it because they just need muh Excel.
I'll take him seriously on it. MacOS and Windows are terrible for professional purposes, for a number of reasons:
1. Requires Windows Pro or Apple Developer license to unlock full featureset
2. Cannot reasonably disable targeted advertising or ad data collection from either OS
3. Neither come with package managers and do not respect third-party packaging either
4. Can be "managed" insofar as your buggy CPM software allows, often glitched by the OS itself
5. The experience is always getting worse since Apple and Microsoft share a united front of making people spend as much money on useless shit as humanly possible
Now, that's not to say nobody should use these OSes - certainly people are locked into them for some purposes. But as a programmer it's genuinely hard for me to be productive on these OSes because I end up fighting them just for everyday, non-programming purposes.
I think it's entirely possible that MacOS and Windows can be inherently terrible experiences while also being mandatory for certain workflows.
I consider my usage of my Macs to be "professional."
1. An Apple Developer license is only required for distributing software in App Stores and notarizing.
2. I'm not sure what ads you're talking about in macOS. I've only ever seen them in the completely optional App Store.
3. Installing Homebrew is literally a one liner. I've never used it, but Macports appears to be similarly easy as an alternative.
4. I can't speak to this point, so I'll take your word for it.
5. I only started using macOS since the Apple Silicon era, but as far as I'm concerned the experience just keeps getting better and better. Every release of macOS has added features I enjoy and use constantly. Just the seamless integration between all of the Apple products in my house was worth switching from my previous mix of Windows, Linux, and Android.
This isn't really the case anymore, Linux (specifically SteamOS and its kin) serve the console-like market very well. Arguably better than Windows.
Even for non-gaming use cases this idea is a bit dated. Printing is by far the best experience on Linux. The "tweaking" that every Windows/macOS user claims you need to do isn't really a thing these days - sans NVIDIA (I'm not sure what the current status is, but it was borked somewhat recently). Sure, if you want to go beyond what Windows/macOS can offer then tweaking may be required, but the current UIs are extremely comprehensive.
I had an 80-year-old lady up and running in one day with PopOS. If that's not lowest common denominator, I don't know what is.
Professional work can be hit and miss. Depends on how draconian your workplace software is.
Thank you so much for telling me that I am actually not managing my non-profit on my Mac. I was convinced I was working every day on a Mac in a business setting, but I guess I was mistaken.
> Anything that it doesn't run due to MacOS I use GeForce Now over ethernet.
Can you elaborate? Thinking of setting up a MacMini for my kids but worried about lack of gaming options for them (I haven't gamed on a Mac in a dozen years and the state of gaming on MacOS was sad back then).
There's a lot of Mac games on Steam, Apple Arcade, and Battle.net these days. Anything that isn't supported there, I generally use Xbox streaming or GeForce Now streaming.
Here's a list of my most played games on my Mac in the last couple of years:
WoW, Hearthstone, Dota 2, League of Legends, Thronefall, Vampire Survivors, Baldur's Gate 3, Cult of the Lamb, Balatro, Death Must Die, Terraria, Dave the Diver, Mechabellum, Space Haven, Hades 2, Peglin, Stellaris, RimWorld, Dead Cells, Total War: Warhammer 2, Valheim, Civilization 6, Slay the Spire, Don't Starve Together, Cities: Skylines, Oxygen Not Included, SUPERHOT.
The point of such an annoying long comment is to demonstrate that there is a very substantial Mac gaming library. The problem is that a new shiny game comes out that doesn't support Mac and you don't want to be the ONE guy in your group who can't play it because you're on Mac. The latest one for me is Deadlock. Not on GeForce Now, not on console, not on Mac... so I needed to get a Windows PC.
But if you're a kid and just looking for a general gaming machine, it plays a ton of cool stuff.
Thanks for all the info! Great to hear about the availability.
I noticed the other comments mentioned GeForce Now over _ethernet_. What connection speeds do you typically need to play these games over GeForce Now or similar?
My first thought was similar, though followed quickly by "...but it's Apple, so what's the catch?" The relevant extra things to know are that the SSD is soldered, there are no slots for extra SSDs, and choosing a sensible (1TB) drive is >4X the price of buying similar storage at retail. Still a no from me, then.
(The only thing I do often that's CPU-limited is compiling, being faster at that saves me maybe a few minutes in a full working day; I don't care. I am frequently limited by RAM and I really hate shuffling things around to make space on drives.)
My Mac mini M1 is still such a great computer and I really don't need to upgrade, but with the spec bump up to 16GB of RAM, $230 trade in value that Apple is telling me they'll give me, and the $499 education pricing (I'm currently doing a Masters degree) it's too tempting to pass up.
My Intel Mac Mini is still my "tv content" machine. Since it has no problem driving my Samsung OLED TV and keeping up with typical video framerates I suspect I will be holding on to it for many more years to come.
Are these games available on OSX? Or are you somehow booting Windows?
(Apologies if this seems like a stupid question. I've not played games for a very long time, mainly because most stuff doesn't seem to be available on Macs).
It's not a dumb question. I actually used to use an iMac 27" with an Nvidia 680 that I would boot into Bootcamp / Windows for my primary gaming computer. I covered it in "built, not bought" stickers at Quakecon one year.
You can't do x86/x64 Windows on M-series Macs without emulation and it is generally a poor experience. There's a few things like Crossover, Parallels, etc that can help you run Windows games.
But I have found that most of the games I care about are either Mac native or on GeForce Now at this point. There's a surprisingly large game catalog on Mac now.
So the short answer is that some of them run on some sort of Windows compatibility layer, some are Mac native, some I stream. But most of my favorites run native on Mac.
To be honest, there are so many games to play these days that I don't mind missing out on a few titles. Valorant is a good example of a game that I can't play on Mac, GFN, or Crossover. But it's OK, I still have CS2.
I'm wondering if it might actually be easier to install Asahi Linux (or some other distro) on Apple Silicon for gaming via Proton, until Game Porting Toolkit is adopted more.
Yes — World of Warcraft, League of Legends and DotA 2 all have native macOS ports. WoW got an Apple Silicon port relatively recently IIRC (last expansion).
In my experience with gaming on Macs, even when there is a native Mac port of a particular game, the experience is inferior to Windows more often than not. Many of them don't do 4K properly, for example (you get everything rendered at half-res in fullscreen). Things like Cmd+Tab don't work reliably, either.
> Mac mini is made with over 50 percent recycled content overall, including 100 percent recycled aluminum in the enclosure, 100 percent recycled gold plating in all Apple-designed printed circuit boards, and 100 percent recycled rare earth elements in all magnets. The electricity used to manufacture Mac mini is sourced from 100 percent renewable electricity. And, to address 100 percent of the electricity customers use to power Mac mini, Apple has invested in clean energy projects around the world. Apple has also prioritized lower-carbon modes of shipping, like ocean freight, to further reduce emissions from transportation. Together, these actions have reduced the carbon footprint of Mac mini by over 80 percent.
I’m inclined to trust Apple with this information but the skeptical side of me is questioning, how can we fact check this data? If it’s true it is very cool.
Third party auditors that come in to verify it. "We" probably can't verify it, but Apple more than likely has these claims audited so they are prepared when they get sued over them.
I don't know why you're getting downvoted. I think it's a fair question.
The fine print says:
> Carbon reductions are calculated against a business-as-usual baseline scenario: No use of clean electricity for manufacturing or product use, beyond what is already available on the latest modeled grid; Apple’s carbon intensity of key materials as of 2015; and Apple’s average mix of transportation modes by product line across three years. Learn more at apple.com/2030.
https://www.apple.com/2030 which mostly seems to focus on the goal of being 100% carbon neutral in energy use.
It sounds like they're generally only looking at carbon emissions from _energy_ use in transportation and manufacturing, and they're probably using some sort of carbon offset to achieve that "net zero". They're probably also not counting carbon emissions from building construction and they're probably not counting carbon emissions from meat served at corporate events, etc.
Update: I found a breakdown for the Mac Mini (linked from the apple.com/2030 page).
What does it mean for the gold to be "recycled"? I get that the aluminum probably came from a pile of cans, but does this mean that the gold definitely came from a pile of electronics? Or could it be that they melted down a few old $20 coins from the US? It's not like a lot of gold ends up in landfills.
"According to the World Gold Council, recycled gold accounted for 28 percent of the total global gold supply of 4,633 metric tons in 2020; 90 percent of that recycled gold comes from discarded jewelry and the rest from a growing mountain of electronic waste such as cellphones and laptops."
But we offer base models with paltry memory and solder our SSDs and RAM modules (look at the RAM/SSD chips next to the M chip itself under the heat spreader - they could make 'em low-profile socketed for sure).
If we truly want to achieve zero emissions globally we need to take seriously all sources of CO2 emissions, the full carbon footprint of companies. Not just energy use.
It's not entirely unreasonable to ask companies to be responsible for carbon capture or in the short term an offset for their employees breathing on the clock, as funny as that sounds.
We need to take all sources of carbon emissions seriously. This shouldn't be downvoted.
>> "ask companies to be responsible for carbon capture or in the short term an offset for their employees breathing on the clock"
Unless you think their employees breathe more when they are on the clock than off it, I'm not sure this makes sense. When they're off the clock, they might be exercising or playing with their kids, so perhaps they actually breathe less when sitting at their desks on the clock.
Yikes, I hope folks don't think I was referring to CO2 caused by human respiration! I was referring to the CO2 emitted for example in growing the employee's food and getting it to him, his shelter (cement production being particularly high in CO2 emissions), transportation, home heating, the CO2 emitted by the people who educated him and provided his medical care.
Like someone else said, spending is a very good proxy for CO2 emissions, and about 68% of all spending is "consumer spending", which basically means keeping people alive, somewhat happy and somewhat productive.
As far as I am aware, there isn't a single competitor from a big-brand manufacturer at the $599 price point, regardless of size: M4, 16GB RAM, Thunderbolt 4. The SSD is the main failing point, but with TB4 you can easily add an external SSD. You can also get 10Gb ethernet for an extra $100. With EDU or staff pricing this thing starts at $499, which is practically a steal.
I am thinking it may be better for corporate buyers to get this and run Windows in a VM than to buy a PC.
Considering the iPad and iPhone have been replacing 99% of my workflow outside the office, I am wondering if my next computer could be a Mini rather than a laptop.
I’m always confused as to why people are so paranoid about storage size. I got the base MacBook Air and an external 2TB drive for cheap. Super fast and I never worry about anything - I didn’t even manage to get up to 50% of my 256GB drive.
There is a generation of tech users that downloaded TB of media for local storage. It’s just not something a lot of people do anymore but it created a psychological need, even if it’s not a technical necessity.
Totally fine. I attached a 4TB SSD at 700-800 MiB/s for non-work files, never noticed anything slower unless it’s copying a huge file for 20 seconds instead of 5.
Apart from the huge price jump from M4 to M4 Pro, I really like this product line-up.
Last time I bought a Mac Mini was before the 2018 model got introduced, and I almost took it back in to get it exchanged (I was within 30 days of purchase when the 2018 model dropped), but it's been plugging away doing everything I have asked of it for 6 years, and it's still going strong. All the upgrades since have left me a little cool, but this genuinely looks like a contender for an upgrade. Only thing stopping me from getting the credit card ready is waiting to see what the M4 MacBook Air - which is inevitably going to be announced in the next 72 hours - looks like in comparison.
There will be a regular M4 MacBook Pro, I assume, just like there's a regular M3 MacBook Pro now. They did this same release order last year (Pros in the fall and the Air in the spring).
The release cycle is the same every year, aside from occasional refresh omissions and delays. It would be weird if they actually did release the M4 Air tomorrow.
The M4 Pro comes with Thunderbolt 5, which means one cable to run 2x 2160p120. And in the case of MacBooks equipped with TB5, one cable to do two high-res, high-refresh displays + power + plenty of bandwidth for data accessories. Omnomnom.
There's currently plenty of 3rd party VESA mounts for Mac Mini, I'm sure they'll have some for this new Mac Mini as well. They slide down into a "clamp" style bracket. They run about $15 on Amazon.
You can get 'sandwich' enclosures that put the mini between the monitor arm and the monitor itself, or off to the side. That's what I do with an M1 Mac Mini sitting next to me. Maybe it's a blessing in disguise since you can get these cheaper than what Apple would sell them for :)
Honestly, I wish they'd go with an external USB-C power brick.
The only reason they might not is that they want to keep everything across the entire line, and the highest end Mac Studio probably needs more power than USB can offer.
I bought a portable AC battery for my Mac mini for $65. Hard agree though that it would be a lot cooler if the conversion were in the cable so I could just do DC in.
The thing is, M chips run so efficiently that even with the power loss from conversion, it’s still pulling less power than a lot of PCs.
But yeah it running on a sleek dc battery would be a lot cooler
The Apple tech specs page says "Maximum continuous power: 155W", and USB-C PD supports up to 240W these days, so technically it's doable. In practice it would be a nightmare, with people trying to run it off the far more common lower-power supplies - which it would have to reject outright, since it has no battery to fall back on if someone suddenly plugs in a high-draw device.
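For the curious, here's a quick sketch of the standard PD fixed-voltage levels against that 155W figure (Swift; the profile table is the USB PD 3.1 spec as I remember it, so double-check before relying on it):

    // USB-C PD fixed-voltage levels with their maximum currents.
    // SPR tops out at 20 V / 5 A; PD 3.1 EPR adds 28/36/48 V.
    let profiles: [(volts: Double, amps: Double)] =
        [(5, 3), (9, 3), (15, 3), (20, 5), (28, 5), (36, 5), (48, 5)]

    let required = 155.0 // W, Apple's stated maximum continuous power

    for p in profiles {
        let watts = p.volts * p.amps
        print("\(Int(p.volts))V x \(Int(p.amps))A = \(Int(watts))W:",
              watts >= required ? "enough" : "too little")
    }
    // Only the 36 V and 48 V EPR levels clear 155 W, so a hypothetical
    // USB-C-powered Mini would indeed have to reject the vastly more
    // common 100 W-and-under chargers outright.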
It would appear the air intake is on the bottom like the Mac Studio.
As someone who lives in a very dusty 150-year-old house: my Mac Studio does not appreciate the air intake being directly on the desk. It collects all the dust that lands anywhere near it.
I have a large levoit air filter running 24/7 in my office and still end up with this[1] regularly. I wish I could at least reasonably take the thing apart to clean it out.
In many big cities, getting a lot of dust is nearly out of your control, the only factor you can control is how often you gather it all up. I used to live practically next to a four-lane road when I was younger and even if you kept the windows closed, the dust would still creep in with every coming and going. If you ever opened a window, you'd know you'd need to vacuum soon.
Agree on dust removal. But if you have a constant source of pollutant input such as air pollution, dust or pollen, you want to be running a filter 24/7.
Large buildings don't run their HVACs in burst and then turn them off.
Many people are eager to plug in their air purifiers and get started, but they often miss the fine print about checking inside the unit. Leaving the plastic bag on the filter basically turns the purifier into a fan, without any actual filtering. I saw someone post that they ran theirs like that for months before realizing it—no air getting filtered the whole time! Your dust photo reminded me, so just wanted to mention it in case you hadn’t checked for the bag inside.
However, I don't see how this leads to more dust going into the computer compared to e.g. front-facing ventilation.
The dust landing on the desk next to the computer will slowly drift down onto the surface, passing right in front of any opening and being sucked into the device anyway.
I wish they would add a small UPS inside (a supercapacitor or something like that) to provide a way to force sleep when power is cut. It's a neat small device, which must otherwise be accompanied by a huge bulky UPS for reliable operation.
If someone didn't know: fsync on macOS doesn't force data out of the drive's volatile cache, so without a UPS your data is not safe. Not an issue for laptops, obviously, but it is an issue for battery-less devices.
If you need to guarantee that data is, in fact, written permanently, you use fcntl(F_FULLFSYNC).
FWIW while fsync() on Linux does request that the drive flushes its hardware cache, it's up to the drive whether to actually honor this request, and many don't do so synchronously (but still report success). So unless you control the whole setup end-to-end - hardware and software both - you don't actually have the guarantees.
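For anyone wanting to see what that looks like in practice, here's a minimal sketch in Swift (the file path is made up; F_FULLFSYNC is the documented macOS fcntl command, though per the comment above, whether the drive honors the flush is ultimately up to the hardware):

    import Darwin // open, write, fsync, fcntl, F_FULLFSYNC

    // Hypothetical durable-write sketch for macOS.
    let fd = open("/tmp/journal.dat", O_WRONLY | O_CREAT, 0o644)
    guard fd >= 0 else { fatalError("open failed") }

    var record = Array("commit\n".utf8)
    write(fd, &record, record.count)

    // fsync() pushes data to the drive, but the drive may still hold
    // it in its volatile cache. F_FULLFSYNC asks the drive to flush
    // that cache too; fall back to plain fsync() if the device refuses.
    if fcntl(fd, F_FULLFSYNC) != 0 {
        fsync(fd)
    }
    close(fd)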
A missed opportunity though. If it were USB-C powered, it could be even smaller, and Apple could simplify its BOM by including a MacBook Pro charger with it.
Apple really is the king of cherry-picking comparisons. They seem to compare the new Mini with the M1 Mini, the Core i7 Mini, and the M2 Mini, all in different categories, whichever benefits them.
I think I’d miss my USB-A ports if I switched my Mac Studio for this. Apart from that, it looks pretty good. Not really sure if it’s worth saving a couple of hundred when you spec it up to par with an M4 Max Mac Studio when that comes out though. It’s the same price as the base M2 Max Mac Studio when you upgrade the memory and SoC.
I have lots of USB-A devices, so I get what you're saying. But converters are pretty cheap and seem reliable.
And Apple has a long history of making this kind of change ahead of the rest of the market. It's been years since they moved to all USB-C in their laptops, so IMO it was only a matter of time.
And yeah - upgrades are awful price wise. From what I can tell, it's basically only worth it to buy base models unless the machine is making you money. Hopefully they upgrade the Mac Studio to M4 down the line.
I actually recently discovered that my USB DAC was skipping a lot because I had it connected to a hub. Threw it directly onto the Mac Studio and now everything's peachy, so there are definitely downsides to trying to get a bunch of USB-A devices attached to one of these.
Sadly it's been common for USB hubs to be dodgy ever since the advent of USB 3. I rarely had trouble out of 1.x and 2.x hubs, but 3.x+ hubs are consistently trouble. The only ones that haven't been problematic are those integrated into Thunderbolt docks, probably because those undergo more stringent certifications.
How much faster is the M4 vs the M2 for Swift development?
I'd probably get 32GB. I started buying 16GB Macs in 2013. The extra RAM will keep any Mac useful for a few extra years. In fact, my 2013 Intel MacBook Pro would still be great if I could upgrade the OS.
I love its form factor, less so the price difference between the M4 and M4 Pro models ($800 USD, presumably so it doesn’t cannibalize the Studio). It looks small, friendly, and inviting to the user, despite not breaking its industrial aesthetic.
Honestly kind of want one as a desktop, even though my M1 Pro MBP is still insanely powerful for my needs.
I feel the same way. I have a really nice MBP and I cannot justify a dedicated desktop when a single thunderbolt cable to my laptop does the job just fine, but I do love the value and design. Maybe I'll pick one up for the kids.
To me, it really is just an aesthetic thing. What purpose does this actually serve? It's small, but not portable. If it's going under my desk or on a rack never to be moved or looked at, why does it need to be cute?
They may have higher ambitions for this generation! In the presentation (roughly at the 10-minute mark), they show off the standard target demographics and setups for creative work, then complementing that with some more enterprise-flirty stuff about making workers more productive and lowering office energy usage, only to finish off with this:
"And with the industry-leading reliability of macOS, healthcare systems can count on mini when providing critical care."
Given how Apple is pushing the carbon neutral narrative while still not reaching the goal on all its products, I assume just buying the credits would tank their margins enough to push them to actually reduce the footprint first.
This looks to me like one instance where the incentives are decently working, at least to some point.
Maybe. Alternatively it could just be the marketing department milking the narrative over an extended amount of time. Going instantly 100% “carbon neutral” through carbon credits is certainly a worse move in this regard.
> Only after these efforts do we cover residual emissions through high-quality carbon credits that are real, additional, measurable, quantified, and have systems in place to avoid double-counting and ensure permanence.
Better than nothing...
Also interesting:
Maxed out: Mac mini with M4 Pro (64GB memory, 8TB SSD): Product footprint before carbon credits 121 kg CO2e
Min spec: Mac mini with M4 (16GB memory, 256GB SSD): Product footprint before carbon credits 32 kg CO2e
I wouldn't have thought that there is this much of a difference in electronics!
Mac minis are going to be one of the smaller selling product lines, so it's probably easier to offset the carbon emissions with the carbon credits they buy.
Enclosure is: Acasis M.2 NVMe SSD Enclosure 40Gbps
The disk I put in there is an SK Hynix Gold P31 2TB. I am not getting its full speed with this enclosure, so you could probably get a slower drive and see the same results.
I've been using external drives for years and would love to get rid of them now that internal ones of a decent size are available.
It's always been a slightly clunky experience - having to eject them before I can undock my laptop, or the way they never go to sleep (some issue with my CalDigit TB dock...?)
I used to think of them as a backup, but since moving house a couple of years ago my internet is fast enough to make Backblaze viable.
Next time I upgrade I'm just going to have fewer boxes on the desk and less power-drawing crap plugged in all the time.
I hate the price of 8TB storage on these though :(
This seems like such a cool home server... BUT with all the disk encryption stuff, you'd need to be logged in to run things, right? If the power goes out, your server does too?
Does anybody have a guide or tips on how to make one of these better for hosting a website with cloudflare tunnel and being resilient to power outages?
Macs run decently as headless servers except for the limit that you cannot use full disk encryption -- the boot process stops and waits for you to provide the decryption key via local keyboard and there is no way around this. If you are concerned about this then you can look at running an encrypted external disk or a partition of an internal disk as an encrypted volume. You still need to decrypt things before everything starts working again but at least the system can boot for remote access. Yes, yes, this is not a secure as having the system fully encrypted and we can all think of various ways something like this can be compromised. It all depends on the threat model you are looking at.
Yes, Apple Screen Sharing is built in. It's based on VNC. You can definitely run it from another Mac, although it's not a great experience with other VNC clients on Windows or Linux. They appear to have their own encoding via VNC, based on H264 or HEVC, and falling back to the encodings supported by other clients is pretty laggy.
I do this exact thing with my M2 Mac Mini (plex, a few home lab things with no display).
I use Chrome Remote Desktop to get into the box remotely. If the box does end up losing power/restarting, I also make sure to have SSH on so I can ssh into the box and start remote desktop before being logged in (Google provides instructions).
I found this to be the path of least resistance to getting it remotely accessible.
Website says it supports 3 6K displays, here you go:
M4 (Thunderbolt 4):
- Up to three displays: Two displays with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 5K resolution at 60Hz over Thunderbolt or 4K resolution at 60Hz over HDMI
- Up to two displays: One display with up to 5K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
M4 Pro (this one has Thunderbolt 5):
- Up to three displays: Three displays with up to 6K resolution at 60Hz over Thunderbolt or HDMI
- Up to two displays: One display with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
OK, so unnecessary and arbitrary restrictions on screen numbers, once again.
There is really no reason you couldn't drive four (or more) lower resolution (4k) screens, given the array of ports.
In case anyone is wondering, the use-case here is a triple-monitor configuration at a desk with a much larger "TV" positioned, or hung, elsewhere in the room.
USB PHYs aren't the same thing as display controller IP blocks. It's obviously possible to design a chip with more of the former than the latter. At the hardware level, nothing is actually an arbitrarily subdividable budget of display bandwidth.
The SoC needs to have a hardware display controller for each display, and Apple only put 3 of them on their chip.
You can add as many extra displays as you want using DisplayLink which runs as a standard USB device and doesn't use the built-in controllers, but has worse performance, probably good enough for a "TV" though.
Somewhat unrelated, but Apple is mainly focusing on Apple Intelligence in these new announcements.
The first version of OS X I used was Mavericks. In hindsight, that was the last great version of OS X for me - the last version where it seems the priorities of the people deciding the direction of development were somewhat aligned with mine.
Many have written about the decline in usability and attention to detail in OS X since then — I guess Apple Intelligence represents this shift in focus perfectly: a black-box interface that may or may not do something along the lines of what you were intending.
> With M4, Mac mini delivers up to 1.8x faster CPU performance and 2.2x faster GPU performance over the M1 model
They're comparing three generations back now?
Oh, I see that they never updated the Mini for M3. So it's only two generations of Mini. Still, I prefer to see one generation comparisons. And it's kind of weird that Apple doesn't keep their smaller product lines more up to date. They certainly have the resources to do so.
My guess is the leap from intel to M1 was significant for an upgrade and M1 vs M2/M3 wasn’t really. I’m personally on an M1 and use it heavily but I don’t think I need the M4 jump still.
Not only that, but they're still comparing the M3 as being "17x more powerful than Intel" for the Air, while neglecting to point out they haven't used Intel for several generations now. Lmaoooo.
Yes it does. We have some in chassis with graphics cards attached and I'm considering whether I can just put these new minis outside the chassis and run extension cords.
Does this still have soldered flash chips for the SSD? This would've looked a lot better without the soldered non-upgradable SSD. It's not great at all.
I've watched that guy's video. The Apple arm64 Macbook Pro doesn't even charge without a functional SSD. I suspect it also doesn't boot off anything else if its main SSD is dead.
Was that an Intel iMac? On the Apple Silicon machines, the internal SSD also contains all the stuff that would be on the firmware flash chip and NVRAM on an Intel machine, so it's required even when you boot from an external drive
I’ve owned Macs for decades and never had an internal drive die, SSD or not. And I’ve worked in the HPC storage industry for years. My expert opinion is that you’re making up reasons why this is bad.
Funny thing: when I look at it now, $600 is objectively cheap, not only by Apple standards. I remember 8 or 9 years ago I really, really wanted a Mac Mini but just couldn't afford the 320 EUR (including a ~10 EUR IBMer discount) they asked for the base model back then, new. Inflation happens in strange ways...
As usual, no upgradability. There's evidence that it's possible with SSDs with no loss of performance. Probably the same would apply to memory, maybe with replaceable memory chips and a simple switch. More future landfill material.
I wish Apple devices were more upgradable (and cheaper and more fixable), but I would speculate that Apple devices are the last devices to end up in a landfill (or more aptly, recycled). If you outgrow a device there is a very robust resale market and that machine will happily fill someone else's needs.
Apple devices seem to stay in use for an eternity.
Are we going to hear this for every product release ad nauseam forever? Not sure about you but at least for myself, I always trade-in/recycle my products with Apple which I hope closes the loop as close as possible.
The days when this actually matters are going away. Your opinion is that of a tiny minority; for the vast majority it does not matter. $200-800 for a tool that generates an enormous amount of value is incredible - I have no desire to upgrade it myself. Think about how rarely a PC gaming machine needs to be upgraded these days; by the time it happens, it's usually a complete overhaul anyway, because a CPU and motherboard upgrade is required.
More pointless defending of multi-trillion-dollar companies for their penny-pinching. Sure it's useful, but it could be cheaper without so much anti-consumer markup. And y'all should be in favour of this, yet you've been taught by Apple's cult atmosphere to be "happy with what master gives us".
TB5 is 15GB/s, so roughly gen 5 equivalent. I'm not saying there are TB5 enclosures in the wild yet, but it's a matter of time. Also, if you're bottlenecked by buffered, linear reads and writes so much that there's a difference between 3GB/s and 7GB/s, then I envy you. Most of what I choke my desktops and servers with is random IO that wouldn't saturate gen2 :)
Thunderbolt 5 has very high data throughput, but the latency of going through the TB port is still higher than going through PCIe directly. In a single large transfer I'd expect TB5 to win; in a millions-of-tiny-transfers scenario, I'm not so sure.
Thunderbolt is PCIe though, just over an external interface. That's why eGPUs worked so well. I can't see a situation where the latency of Thunderbolt has a significant impact on disk usage when eGPUs, where latency is so much more noticeable, worked acceptably?
Thunderbolt provides a tunneling mechanism for PCIe, DisplayPort, USB etc. It's also a mesh network where packets are source-static routed from node to node in the network - so the source sets up the route-to-the-destination and the data packet is transmitted from controller-node to controller-node until it gets to the destination, then it's unpacked and presented as data to the system.
You could see some of this on the venerable "trashcan" Mac Pro, where one of the TB controllers wasn't directly connected to the port, but came through another TB node. The latencies on the ports connected to this TB controller were slightly higher due to the extra transit-time.
Latencies over PCIe are measured in tens of nanoseconds (say 70-100) depending on chipset and how much you pay. Latencies over TB can be several hundreds of nanoseconds. TB presents as a PCI interface, but that's an adaptor-type design pattern, it's not fundamentally PCIe underneath.
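A toy model makes the tradeoff concrete (Swift; every number here is an assumption for illustration, not a benchmark):

    // Toy model: time ~= per-transfer link latency + bytes / bandwidth.
    func xferTime(bytes: Double, gbps: Double, latency: Double) -> Double {
        latency + bytes / (gbps * 1e9)
    }

    let gib = 1024.0 * 1024 * 1024
    // Assumed figures: direct PCIe ~7 GB/s at ~100 ns per transfer,
    // the same drive behind Thunderbolt ~6 GB/s at ~500 ns.
    print(xferTime(bytes: gib, gbps: 7, latency: 100e-9)) // ~0.153 s
    print(xferTime(bytes: gib, gbps: 6, latency: 500e-9)) // ~0.179 s

    // One million serial 4 KiB random reads: latency starts to matter.
    print(1e6 * xferTime(bytes: 4096, gbps: 7, latency: 100e-9)) // ~0.69 s
    print(1e6 * xferTime(bytes: 4096, gbps: 6, latency: 500e-9)) // ~1.18 s

In practice the drive's own flash latency (tens of microseconds) swamps both link latencies, which is part of why external NVMe still feels fine for most workloads.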
Bandwidth vs latency is like a pickup vs lambo I guess. And what the tb limits is the bandwidth, if you catch my drift (although lambos are awd and poor at drifting). So the actual performance that matters (the snappiness) is still there.
It won't ever be able to run as fast as a soldered system.
Have you installed a server CPU?
It's really easy to fuck up and lose a few channels of memory due to bad contact. Right now I've got a socket 3647 Xeon Phi CPU that's refusing to train DIMM A1 for _reasons_.
That’s not an experience Apple wants any user of their products to have.
People running a single desktop machine are way better served by being able to upgrade RAM modules than worrying about single contacts being bad between the RAM stick and motherboard.
Yes? I wasn’t making a claim that it was better to solder everything for everyone. I’m saying the overlap between most Apple users and those people is low.
Give definitive evidence that soldered is faster; in my experience, with decent contact this is not true at all. I think the general confusion is between typical RAM sticks - with a controller onboard and much less I/O broken out - and the raw RAM chips without a controller and with more I/O broken out, which is often what you get soldered.
The BGA socket you chose is more for test or industrial hardware, versus desktop CPU sockets, which are much slimmer and consumer-friendly.
Really disingenuous, imo. It's absolutely possible for Apple to make these chips replaceable, using the heat spreader as the retaining plate.
Sure, they are. Buy an Epyc or Xeon: it burns 200+ watts and has 8 to 12 channels, which requires 8 or 12 DIMMs, which barely fit in 19" racks (the next gen is moving to 21").
Or you could get an M3 Max, run the memory at twice the speed, still have a 512-bit-wide memory bus, and get 10+ hours of battery life. Presumably similar with the M4 Max - rumors claim later today (Wednesday).
Apple puts the memory, CPU, and GPU all in the same package. This generates less waste, as you only need a single package and socket, and it uses less energy during operation.
And many desktops do that today, but like everything it has tradeoffs, such as peak bandwidth and power usage. DDR sockets inherently make this sacrifice, integrated designs will always have wider buses, higher bandwidth, etc. That's also why you don't get sockets for your GPU memory, either. It's a design tradeoff.
> It won't ever be able to run as fast as a soldered system.
Yeah, just take a look at PCIe 5 and its 512Gb/s (~64GB/s) of x16 bandwidth.
> Have you installed a server CPU?
Yeah, and I had none of the problems you mentioned.
> That's not an experience Apple wants any user of their products to have.
Yeah, just look at the older Macs with upgradable components and how easy it was to replace them... So instead of making it easier, let's just remove it altogether.
PCIe is a serial interface, not parallel like modern DRAM interfaces. They're completely different at a hardware level, the electrical design constraints are completely different, the latency characteristics are completely different. I think you are just throwing words and numbers out and don't really know what they mean at all.
As discussed in the iMac thread yesterday, LPCAMM2 makes it possible. There are LPCAMM2 modules with the same 7500 MT/s spec as the M4's integrated memory, and two of them running in parallel would match the M4 Pro.
Even if Apple wanted to support modular memory, which they obviously don't, the ultra-tiny form factor of the new Mini would probably still rule it out though. Soldering the memory down is still more compact.
No way the new Mini is too small to allow upgradability. You can buy a Windows mini PC that is not only smaller than the new Mini but also allows upgrading both RAM and SSD. And that's without using LPCAMM2 - just normal SO-DIMMs. (Example: https://trigkey.com/products/trigkey-green-g4-16g-500g-n100)
That system you linked to is an extremely poor example. It relies on an external power brick, is incredibly underpowered, only gives one PCIe lane to the M.2 slot limiting it to ~800MB/s according to their specs (meaning it's only PCIe gen3), and has only one SODIMM slot (meaning it's operating with just a 64-bit memory bus, half the bandwidth of mainstream consumer PCs).
It's basically a 12 year old PC shrunk into a tiny box and low power budget.
Sure, it's not on the same performance level but this isn't the only option. There is a wide range of options available in the same form factor. Here's something higher end: https://www.bosgamepc.com/products/bosgame-mini-pc-p3-amd-ry... Probably still uses an external power brick but I imagine that's just to reduce costs.
My point is that this size of device is already available with upgradability so the form factor isn't the issue. Apple is significantly better at engineering products than these random companies and they could surely have made this new Mac Mini upgradeable. I do understand why they wouldn't want to though!
I am not talking about DIMMs; I'm talking about the chips themselves. I am pretty sure they don't make different APUs for different memory sizes - it's just a fuse or something like that. If CPUs can use sockets, so can memory chips.
The pin pitch on a BGA memory package is around 0.3mm for the type typically used by Apple. That's 200 0.3mm pins that have to line up, work at 4GHz, and survive you dropping the machine 5 feet.
Apple knows how to make money. I can buy a quality 4TB NVMe drive for $300 (you can definitely go lower if you want to risk it). The upgrade to 4TB on the M4 Pro Mini is $1,200 (it's not supported on the base model), on top of $1,400 for the actual computer.
If I had to guess, most of Apple's margin is on users riding the pricing ladder up into the stratosphere.
I had an experience a few years ago at an Apple store where a clerk refused to sell me the cheapest M1 MacBook Air. There's probably some direction from up top trying to convince people they need the more expensive Macs.
I doubt the internal SSD is going to die before the power supply gets cooked.
If it does, you can get the SSD chips replaced. That is well proven now. Granted it needs a specialist with rework kit but they are starting to become more common now that it's an issue.
> I don't think these even boot once the SSDs die.
All Macs that I know of let you configure the boot drive. I had an older Mac Mini with a spinning HDD. I added an external SSD, set that up as the boot drive, and never touched the slow drive again. I'd be extremely surprised if you couldn't do the same with this.
Wow, are you still using your original 386DX board, with minor upgrades along the way? /s
I actually think Apple's way of managing upgrades isn't as harsh as many people think.
The first step towards sustainability is to use less. If you drop the hardware needed to make a machine easily upgradable, you simplify the machine and use less material. This is one of the reasons Apple does it.
Secondly, they're using a lot of recycled material in this thing. Their lede line on it is that it's carbon neutral. Show me another desktop PC that can make that claim.
Thirdly, the "half-life" of a Mac is kind of insane. When I was buying Thinkpads, Dells, and the like, I'd get 2-3 years down the line and I'd "need" to upgrade the whole thing. I've got a 2017 Mac Mini and a 2015 MBP in regular use. I have a G4 iBook that was in active use by my parents from 2004 until _this Spring_ - they only gave it up because they couldn't upgrade Chrome on it any more, so it's about to become a retro Linux terminal for me, because the hardware is still sound (albeit too under-powered for anything modern).
And lastly, they take old hardware in and recycle it back into the new stuff in the first step. They give relatively decent trade-in prices, and are one of the few consumer brands doing that.
Given that they're shipping it with 16GB of RAM, which is fine for my needs, I'm confident in saying I could buy one, use it for 5-8 years, and then get it recycled when I upgrade at that point, while most PCs with upgradable RAM being sold today are going to landfill within perhaps 4 years.
I think you’re giving PCs way too little credit compared to Macs. AM4 motherboards from 2017 can have 5800x3d or 5700x3d CPU installed, the former of which is still #2 in the majority of gaming benchmarks beating anything Intel can offer for a fraction of price and power consumed.
A 2004 G4 iBook has been unsupported by mainstream software for well over a decade, let alone Chrome, which was first released in 2008. I don't know if they ever made an official PPC version of Chrome, but I doubt it.
You are a liar but a bad one.
You can like Macs, you can prefer it, you can rationalise the added cost any way you want.
But the fact is, Macs do not have any more useful life than PCs if your price matches them.
You play the role of the typical Apple fanboy who compares a $1.5K MacBook to a random crappy Lenovo that was on sale for less than $500.
The Mac is better/longer-lasting. It's almost as if price conveys some sort of information about quality...
>Delivers up to 13.3x faster gaming performance in World of Warcraft: The War Within
This is such an Apple stat especially for a game. What does "faster gaming performance" even mean? Every zone and city hub loads 13.3x faster so loading screens are quicker? They don't say anything about FPS and no one would use "faster" as a synonym for higher FPS.
An MMO is really not the best benchmark tbh
Edit: the notes list the compared spec: "Results are compared to previous-generation 3.2GHz 6-core Intel Core i7-based Mac mini systems with Intel Iris UHD Graphics 630, 64GB of RAM, and 2TB SSD."
So they compared the 2024 M4 to a 2018 8th-gen Intel i7 (i7-8700B). Take that as you will.
It's funny that they advertise this, because the Mac comes with a spreadsheet application that is hard to use and unbelievably slow. If they sent some engineers to work on that program, they could get a 10x-100x improvement on the software side instead of grinding it out on the hardware side.
Yeah, Numbers is nice for making cool layouts, and its multiple-tables-per-page functionality is convenient, but it's also very lackluster in many ways (no pivot tables, for starters).
But the worst thing is by far its abysmal performance. Even simple accounting sheets get absurdly slow in the low hundreds of rows, no matter how powerful your machine is.
So you might as well use a web app like Google Sheets, which has other advantages.
To be honest, it feels like they still offer their office suite just to say they have an alternative to Microsoft's offerings; they stopped caring about any of it a long time ago (around the redesign, IMO).
While that sounds pretty funny, I know people who actually burn CPU on Excel, so that might be significant. Granted, they should not be using Excel for what they do, but you know - it's easier than learning something new!
Once people learn Goal Seek and the matrix extensions to Excel’s macro language, it’s game over for corporate standard-issue Dell 11” laptops, that’s for sure.
Up to three displays: Two displays with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 5K resolution at 60Hz over Thunderbolt or 4K resolution at 60Hz over HDMI
Up to two displays: One display with up to 5K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
Are these set in stone? Would it be enough to run say, two external 2560 x 1440 at 144hz?
I can't say from first-hand experience, but usually those are just example configurations. The limit is typically pixels/sec of bandwidth, so any configuration that requires less bandwidth should work too.
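A quick sanity check along those lines (Swift; raw pixel rates only, ignoring blanking intervals and Display Stream Compression, so treat it as ballpark):

    // Raw pixel throughput: width x height x refresh rate.
    func pixelRate(_ w: Double, _ h: Double, _ hz: Double) -> Double {
        w * h * hz
    }

    let dual1440p144 = 2 * pixelRate(2560, 1440, 144) // ~1.06 Gpx/s
    let single6K60 = pixelRate(6016, 3384, 60)        // ~1.22 Gpx/s

    // Two 1440p/144Hz panels need less raw throughput than one of
    // the 6K/60Hz displays the spec sheet already allows.
    print(dual1440p144 < single6K60) // true

So two 2560x1440 displays at 144Hz should be comfortably within what the listed configurations imply, though per-port limits can still bite.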
Is there a good performance benchmark website/channel for Mac hardware? (Once reviewers get their hands on the hardware)
I'm trying to decide if I should get the Pro or the base model mini. I've been learning Swift and Metal on an old work MacBook and I want to get my own hardware. The only games I've played recently are Factorio and Baldur's Gate 3, so I was thinking perhaps I should get the Pro and not bother upgrading my desktop (an i7-6700K from 2015).
> With M4, Mac mini can support up to two 6K displays and up to one 5K display, and with M4 Pro, it can support up to three 6K displays at 60Hz for a total of over 60 million pixels.
Okay, so how many displays can the base model M4 Mac mini support? Is it one 5K over HDMI and then two 6K over two separate Thunderbolt 4 connections, for a total of three displays?
I've always loved the form factor/pricing of the Mac Mini, but I've never been able to convince myself to buy it. If you're able to afford a MacBook/MBP, is there any reason why someone would purchase the Mini? Seems like the former gets you the same performance with the benefit of portability.
The MBP is over twice as much and you're still going to buy a keyboard, mouse, and screen for your desk. Portability is nice, but for many people I suspect the use case is too narrow -- for most things you need to do when you are not at your desk computer, the smartphone suffices.
Yeah, that's a fair point. I live in a small apt so I often find myself working anywhere other than my desk, but I imagine there are a lot of folks that prefer the niceties of a dedicated workspace (monitor, keyboard, etc).
Looks like the price differential is not big between getting 4x the cheapest Mac mini and a single fully loaded 64GB Mac mini with the M4 Pro. And that way we would end up getting more GPU cores (4x10 vs. 20).
And if this is cross-connected with TB4 networking and using exolab, it might make for a nice local setup.
...to be looking really appealing to pair with one of these new Mac Minis.
$899 MSRP in the US, 5120 x 2880, same dimensions as an Apple Studio Display but a lot cheaper... And B&H just got them in stock.
Just ordered one myself, now I need to pick which variant of the Mac Mini M4 to pair it with. (My goal here is replacing a 27" Intel iMac for map making / CAD / DTP type stuff.)
Ah that's cool, it seems to be better than the one Samsung released.
And the price is not too bad.
If Asus ends up making "cheap" Macs better with their display offering, that would be kind of hilarious.
But yes, if that display is half decent, it makes the entry-level Mac Mini pretty good; at least you can get a sub-$2k setup.
I ordered one, still went for a $2K+ Mac mini, but yeah. My goal is to replace an aged 27" iMac and because of doing DTP type work, I wanted a nice high DPI screen.
Really seriously considered a new iMac, but the mini and a non-Apple-specific monitor wins.
It's too bad target display mode is no longer a thing.
Yes that is exactly the kind of use case the previous iMac shined at.
Which is why I think it's bonkers that they want people to put so much money into their standalone display; it's way too steep for the use case, and you don't need that much performance, which makes the lack of an all-in-one annoying.
Isn't the $2K Mac Mini a bit lackluster though? That's Mac Studio price territory, and you get a lot more for $2K. Then again, the Studio needs a refresh, so if you need something now, sure...
You can convert your old iMac into a display with a controller board if you want to keep it. It would cost about $300, but you get a great display, and those old iMacs don't sell for much anyway.
I was looking at one of the cheaper minis, but between educational discount and wanting a bit of RAM (and cores) for some map tile rendering stuff I want to play with over winter, I nudged it up... I'm hoping for a lot-of-years life out of this, so another $1K or so really isn't that bad.
I looked into the display conversion idea, but that just felt kinda meh. I could sell the old computer for a bit less than $200, or pay about $300 to make a so-so display. While the panel would be nice, the cables just hanging out the back, coupled with the rather-meh controller board just felt kinda hacky.
Instead paying $800 for a purpose-built display that should be sleek and nice... I'm okay with that.
I have a Minisforum UM690S; it's about the same size (5x5x2.25 inches) and works well as a small machine. They just announced the EliteMini AI370 today, which has the latest AMD laptop chip with 12 cores/24 threads - I assume that would also be a decent Linux server (note that the RAM is soldered on that one).
Just ordered a UM690S after this comment - I had been looking for another machine like it and saw the sale. Thanks for sharing!
I just deployed another System76 machine that I’m using for the same purpose — using both for redundant containerized web servers I’m self hosting. I got it with 64GB RAM.
Does anyone know how well Asahi Linux supports M4? I only see M1 and M2 listed on the website, but I’m not sure how often it gets updated. I think Thunderbolt displays are still a pain point as well?
Putting the audio jack on the front is a strange design choice to me if you plan on hooking it up to wired speakers all the time. Did they run out of space to keep it in the back?
On the front is far more practical for wired headphones. Plus you can always do something with USB-C pretty easily if you want to run it out of the back; if I had to choose a dongle for either fixed speakers or headphones, I’d pick them for the speakers.
It's definitely more practical for headphones. I'm curious as to why Apple decided to make this change now. Previous Mac Minis and the Mac Studios have always had them in the back. It seems uncharacteristic of them to go out of their way for wired headphone users after they removed the port entirely from the iPhone, so I wonder if it was a hardware/industrial design thing where it was simply more convenient to move it to the front for the new form factor.
Edit: also I noticed they moved the power button to the bottom corner of the Mac Mini! (It used to be on the back as well.) This makes me think even more that they didn't want to crowd up the back too much.
Probably for public computers and classroom settings (think libraries, schools, etc) for people to plug their wired headphone in.
Would have been nice to have audio jacks both in the front and rear, with front audio overriding rear audio (like on most desktops), but I guess that would have been too much maximalism for Apple.
Presumably servicing wired headphones, but I agree it's an odd choice not to also include one in the back— particularly when, unlike a laptop, there are no built-in speakers, and Apple has been pushing hard on Bluetooth audio since the iPhone 8 in 2017.
Oh that's right yeah— 6/7/8 all had the same form factor as that was the time that the off-year S versions were also dropped, so I always get confused.
And to further muddy the waters, the space in the 7 chassis for the jack was mostly still available, which led to that one madlad bodging in his own headphone jack, for a one-of-a-kind iPhone 7:
The 6 actually had an S release, as well as the X.
But yeah the headphone jack dropping was obviously just to get more people onboard with AirPods that launched at the same time. And you can't say it didn't work! I remember when the first images of people wearing AirPods came out and it was the laughing stock of the internet. People said it looked like you had Q-tips hanging out of your ears, or the tips of an electric toothbrush.
A few years later and they're pulling in tens of billions of dollars per year, just on AirPods sales alone. AirPods could be pulled out into its own business and it would be seen as a wildly successful tech company.
Yeah, good point about the HDMI carrying the audio from the back already. I think I’m out of touch since I use an old receiver that doesn’t have HDMI. It’s funny that this is the one place Apple considered the convenience to its wired headphone users.
The built-in DACs are better than most audiophile DACs; audiophile companies are largely scams and either way don't have the budget to actually do much R&D.
You may still need an amp for electrically incompatible (high impedance) headphones.
The Mac mini 3.5mm jack advertises "advanced support for high-impedance headphones" so you might not even need an amp unless you have some really crazy cans.
My M1 MacBook Pro (the first one in the new chassis) can easily drive my 250-ohm DT 990 Pro to uncomfortable loudness, so it should indeed be fine for 99% of headphones. I've been told it also drives 600-ohm headphones just fine.
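For anyone curious, the loudness claim is easy to sanity-check with P = V^2/R plus the headphone's sensitivity rating. The output voltages below are assumptions for illustration (roughly 1 Vrms for a typical jack, ~3 Vrms for a high-impedance mode), not published Apple specs:

    import math

    def spl(v_rms, impedance_ohm, sens_db_per_mw):
        p_mw = (v_rms ** 2 / impedance_ohm) * 1000  # power into the load, mW
        return sens_db_per_mw + 10 * math.log10(p_mw)

    # DT 990 Pro (250 ohm) is rated around 96 dB SPL/mW
    print(f"1 Vrms: {spl(1.0, 250, 96):.0f} dB SPL")  # ~102 dB, already loud
    print(f"3 Vrms: {spl(3.0, 250, 96):.0f} dB SPL")  # ~112 dB, uncomfortable

So even ~1 Vrms pushes a 250-ohm DT 990 past 100 dB SPL, which matches the "uncomfortable loudness" experience.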
There are a lot of options out there now for USB PC speakers that have the DAC/amp built in, eg these are $50 and natively USB-C: https://www.amazon.ca/dp/B08F57GSJ7
But yeah, anyone "serious" would go discrete for all that stuff regardless. I guess this also lets Apple sidestep a bunch of fuss around non-stereo use-cases, for people who want quadraphonic or 5.1 at their workstation.
I wish there was a way to use my existing M1 iMac as the display in conjunction with the Mac Mini. Ironically, this is preventing me from buying one of these.
Honestly a fantastic update for this machine. So much bang for your buck at the entry level now that they've FINALLY made 16GB of RAM the baseline. And fully specced out with the M4 Pro + 64GB of RAM, this is a serious powerhouse in a tiny box. I love it.
I really want to see what they're going to do with the Studio this/next year. M4 Ultra could be insane.
Almost certainly. Asahi Linux is getting pretty usable on M1 and M2. They don't support M3 yet, let alone M4, but support will surely come eventually.
I would be shocked if they weren't using a test suite, especially given all the platforms and devices Linux supports. POSIX has a test suite, and there are several Linux test suites [1], [2]. That said, I would think an architecture port is fairly straightforward; the hard part is reverse-engineering and writing all the device drivers, but devices generally have a well-known interface (and, therefore, presumably tests). The OpenGL drivers are being tested against the official OpenGL test suite.
Of course, there are no guarantees that it runs correctly. Probably doesn't, given that even Apple and Microsoft's software don't run correctly, either. But saying software doesn't run perfectly in all cases is almost tautological.
I wouldn't be so sure - if marcan loses interest (already looks like it), who is going to keep up with supporting the latest Apple chips?
When the M series chips were the hot new thing, there sure was developer interest - but now that a new chip is released every year, it becomes boring drudgery.
Look at support for T2 Macs - it took a decade to get them supported, not because the hardware was so different, but mainly because the hardware was 'boring'.
Fair enough, I suppose I could be overly optimistic. I just figure it's garnered enough interest that even if there's turnover in the team somebody will carry the torch. Especially since Apple seems to be actively tolerant if not even supportive of the project.
The other comments talk about Asahi Linux, which doesn't support the M3 yet. You can also run Linux in a VM on MacOS, and it runs very well.
For some uses you won't get the best performance compared to native Linux. But for a Plex/Kodi server a VM should be great.
(On an x86 Apple laptop I found the power consumption better with a Linux VM on MacOS than with native Linux, so VMs can be quite efficient for some uses. Software builds sometimes run much faster in a Linux VM inside MacOS than natively in MacOS. On the other hand, I found Qemu inside a Linux VM for Android development was extremely slow.)
One of the reasons Asahi doesn't support the M3 is that Apple never released an M3 Mac Mini, so they can't do continuous integration. [1] That being said, it seems Apple does reuse a lot of the parts on the SoC in each generation, so it's not too different.
- Apple tax on 1GbE to 10GbE ($100 surcharge lol)
- maxes out at 64GB of configurable memory w/ M4 Pro
Got to give it to Apple. The tranches between different configuration levels are "small" enough to convince buyers to step up to the next level.
It’s like “hey, you’re already at $4000 for the M4 Pro with 64GB. Just spend an extra $400-$600 for that bump in storage. It’s no biggie. We’re losing money at this point”.
I wonder how many consumers fall into this sunk cost fallacy scenario that Apple has designed.
Not to mention, the vast majority of people don’t need 10GbE. Most users use WiFi and never touch Ethernet. Adding the functionality would just drive up the cost unnecessarily.
Agreed, it stings that they don't include 2.5G. Even a single spinning disk easily saturates GigE; getting 2.5x that without the price overhead of 10G is pretty decent value. Even $200 firewall widgets often include 4x 2.5G ports.
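Rough numbers behind that claim (the ~94% usable-throughput efficiency is an assumption for illustration, not a measured figure):

    def usable_mb_per_s(gbit, efficiency=0.94):
        return gbit * 1e9 * efficiency / 8 / 1e6

    for gbit in (1, 2.5, 10):
        print(f"{gbit:>4} GbE: ~{usable_mb_per_s(gbit):.0f} MB/s")

That's roughly 118 MB/s usable on GigE vs ~294 MB/s on 2.5GbE; a modern 7200rpm drive doing 200+ MB/s sequential saturates the former but fits comfortably within the latter.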
Well yes, it's the most reasonably priced upgrade. But when you start adding the other options and get into expensive-computer territory, it becomes another thing you need to drop cash on when it feels like it should be the default at this price. Many NAS boxes have a 2.5G or 5G interface now, so it's not like there's no use case at all...
I would like to see 2.5GbE - that's getting pretty common on AMD/Intel machines now too. But AFAIK Apple loves using Broadcom for 1G NICs, and I'm not sure they have a 2.5GbE chipset?
I have to disagree - the baseline models satisfy the needs of many casual consumers and the price point puts it within their reach. For $600, most of us could give this to our parents and it would be great for 7-10 years.
For those who need more performance, better value is found at higher rungs of the product lineup, and this has been Apple's strategy for decades.
I think you should try some of the machines they ship in that range. For a long time the lowest rung was 8GB memory and 128GB storage, meaning you had a machine that booted with around 30-40GB free for the end user, which would then rapidly get eaten into as swap space while browsing, until you hit the "Out of memory, close applications" freeze-up.
Well, the M4 Pro crushes the M3 Pro. Presumably the M4 Max will be better than the M3 Max. No idea if an Ultra will come out; Apple hasn't done one since the M2, and even if they do, it's likely 6 months away.
I'm seeing 3200 points for iPhone 16 Pro single-core and 3800 for an M4 Mac. That's only about a 19% difference. I don’t think a cabled desktop computer is the right choice for this endeavor anyway.
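For what it's worth, taking the quoted scores at face value:

    iphone_16_pro, m4_mac = 3200, 3800
    print(f"delta: {m4_mac / iphone_16_pro - 1:.1%}")  # -> 18.8%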
I've been semi-lightweight gaming on my M1 MacBook Pro with some level of success...
Games that are optimised to run on Apple Silicon natively mostly run great (No Man's Sky easily pushes 120FPS at "almost-maxed-out" settings at native resolution and looks amazing on the built-in display).
Games running in Rosetta also work well; the performance hit of Rosetta is only a couple percent.
Less demanding games such as Minecraft, Factorio, various MMOs, etc. will all run very well.
For Windows-only games, Parallels works shockingly well. I can run Skyrim with 3rd party shaders at 60 FPS without any issues in Parallels.
There is no 1st party support for SteamVR. You can supposedly get it working with some older Vive models. I couldn't get my Quest 2 working, even in Parallels. Some games with aggressive (rootkit) Anti-Cheats will probably also not work.
No. Game compatibility is still a big issue for many AAA titles, and the GPU is still not nearly as powerful as a dedicated graphics card from NVIDIA or AMD.
However, this is a fantastic general purpose machine for things like light web browsing, text editing, coding, etc.
Depends on what you play. It can probably play most games already. For high-end graphics, you would need GeForce Now (which works on M series Macs flawlessly btw).
People made rack mount kits for the previous generations. Not sure if the dimensions changed; if they did, I'm sure products will be updated and you'll be able to 3D print your own or buy a professional version from Amazon shortly.
I've been pondering giving up on Macs for a while and this blind and rather stupid deferral to "AI" is going to accelerate the process.
Almost nobody asked for this. I personally would have wanted one of my programs not to stop launching with a cryptic message after upgrading to macOS 15.1 earlier today. But hey, crazies like me who want decently working software are apparently not welcome in the customer base.
The only reason I am still staying with Apple for my desktop needs is that I paid $8000 for my iMac Pro, and that was just a short 5 years ago.
But as time goes by, buying 1-2 specialized text-rendering displays and going full Linux looks more and more attractive, especially with Fedora and Manjaro now offering "immutable" distros, i.e. you can frak around with your environment and then revert everything if you don't like it (or, on the contrary, do a DB-commit of sorts, i.e. have your changes persist after reboot). Those features make backing up entire workstations even easier.
Sprinkle in an external ZFS server and the ability to just zfs send/recv entire disks with encryption and compression, and I think in just some 2-3 short years I'll be laughing at Macs.
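A minimal sketch of that zfs send/recv flow, for the curious. Dataset names and the "backup" host here are hypothetical; -w sends the raw stream, so an encrypted, compressed dataset travels over the wire exactly as stored on disk:

    import subprocess

    snap = "tank/home@nightly"  # hypothetical dataset/snapshot name
    subprocess.run(["zfs", "snapshot", snap], check=True)
    # raw send (-w) preserves encryption/compression; recv -F rolls the
    # target back to the matching snapshot before applying the stream
    send = subprocess.Popen(["zfs", "send", "-w", snap], stdout=subprocess.PIPE)
    subprocess.run(["ssh", "backup", "zfs", "recv", "-F", "vault/home"],
                   stdin=send.stdout, check=True)
    send.stdout.close()
    send.wait()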
Apple keeps dropping the ball. iOS 18 lost all my tab groups in Safari as well. And Photos randomly chooses not to show pictures in the big Photos feed; you have to know which day they were saved to be able to see them.
/facepalm
Apple is now in decline; it couldn't be more obvious from these fairly outrageous bugs, plus the fact that they are now regular followers like everyone else, jumping on the "AI" bandwagon.
I've been saying this since the butterfly-keyboard fiasco. At that point it was clear that professional users weren't the focus anymore, and I promptly left the ecosystem. I still have an iPhone (12 mini) that I only keep because it still works, but every time CarPlay tries to murder me in traffic, I get one step closer to just yeeting that phone into the sea.
Apple silicon Macs have been many people's favorite computers they've ever purchased (myself included) and blew out the value proposition for buying practically any other computer, unless you specifically need Windows for some reason.
iPhones have gotten so good they can barely come up with anything new to add year-over-year. Young people in the west are almost exclusively buying iPhones because people like them.
They basically own the tablet space; if you're looking for a tablet, there's almost no reason to go with anything other than an iPad. Same for smart watches: they make the best-selling watch in the world.
For the amount of devices they move, they're shockingly reliable and have a smoother customer support / coverage system than any other company I've had to deal with. That's why people keep coming back.
It's pretty bizarre to say they're in decline. The only area I can see active decline is how badly they let Spotify eat their lunch with music streaming when they used to basically own digital music distribution.
People buy their hardware for entirely different reasons than what you make up.
They have financial success but that doesn't mean much.
I have had 2 Apple Watches and I mostly find it to be a deeply flawed, poorly-thought-out product, especially for the price.
Exactly. The 2016-2018 models were the low point, but Apple has listened and has improved since then.
The 2019 16” was a step in the right direction. More ports, better cooling, better keyboard.
The M1 Pro/Max line up brought back HDMI, MagSafe, SD card slots, and are seriously fast, quiet, & cool. The M2 and M3 releases have been iterative performance improvements and haven’t made any stupid decisions.
Apple’s also invested development effort into useful tooling for developers like their virtualisation framework - this has made Docker on Mac vastly more pleasant for instance.
If you find my statement bizarre then you're either super lucky or haven't paid much attention. Even their Photos app started making mistakes since iOS 18.
Oh I agree, I cringed hard back then as well, but at least the Mac software and speed were decent, so I figured the beginning of their slow downfall wouldn't affect me for a while, and I was right.
Nowadays though... I am stuck with former workstation-grade hardware where Neovim needs FIVE SECONDS TO START because macOS is auditing each and every one of its syscalls. I switch to my (now allegedly ancient) all-AMD laptop with a 5500U CPU and the only thing that needs more than 0.5 secs to start is Firefox; I was not able to find one thing that did not react instantly. I am seriously considering just plugging my 35" gaming display into the Linux laptop and making that my main work setup.
And you are right -- pro users made Apple rich but are now undesired because they apparently demand too much ("Who needs stable software, bleeeerrrrgh! Am I right guys?"). Yeah, screw Apple. I am back on the hamster-wheel employment grind now, sadly, but once I stabilize a bit more I'll just move to 2 HiDPI displays and assemble a future-proof Linux workstation to go with them. Pretty sure that with periodic thermal-paste maintenance it can easily last me 10 years, and I'd only change it if there's something seriously tempting out there (about which I am very doubtful; the tech giants were only ever worried about becoming oligopolists and care not about their users' needs).
Apple had its opportunities. They wasted them. Sure, many people will consider them the top for a while still and will keep buying, but their pricing policy has made it blindingly obvious that fewer people are buying, and they are now doing their damnedest to compensate by including less in the package, making the carton packaging itself cheaper, or just making all products except the base models outrageously overpriced. That's how they keep the profit margins. The curse of being a publicly traded company and all that.
Those policies will work for them. For some time more. I wonder what happens after.
Something's seriously wrong with your system. I just launched nvim on mine and couldn't measurably detect the interval between hitting enter to exec the command and seeing the welcome screen. Even opening and immediately quitting my substantially customized Emacs took 1.7s.
Your setup should not be taking many seconds to do those routine things. Last time I experienced something like that, I had a dying harddrive that was logging a continual series of read timeouts.
I don’t have that one anymore to test it. I’d suggest using Disk Utility to check the entire drive for errors and ruling that out first.
I don’t want to sound like “it works on my machine”, but what you’re describing is so far out of expectations that I’ve gotta wonder if it’s an error condition.
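If anyone wants to compare numbers instead of impressions, Neovim can log its own startup timings via its --startuptime flag; something like this (log path arbitrary):

    import pathlib, subprocess, tempfile

    log = pathlib.Path(tempfile.gettempdir()) / "nvim-startup.log"
    # --headless avoids needing a tty; "+q" quits right after startup
    subprocess.run(["nvim", "--headless", "--startuptime", str(log), "+q"],
                   check=True)
    # the first column of the log's last line is total elapsed time in ms
    print("total startup:", log.read_text().splitlines()[-1].split()[0], "ms")

A healthy setup should report tens of milliseconds, maybe a few hundred with heavy plugins; five full seconds would point at something pathological.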
I really don't think it'd be those mitigations. Estimates I've seen floating around put it at a 15-25% total performance hit when heavily loaded.
For additional context, last weekend I went around to every old Intel Mac in a medical office to upgrade their OSes and some of the apps on them. None of them were speed demons, but they were all just proportionally slower than my much newer Mx Macs. Regular "small" apps still loaded quickly and were perfectly usable. This is in a busy office where any slowdowns that kept people from working at full speed would be fixed.
And just because it works for me doesn't automatically mean it's got to work for you, of course. I'm not going to doubt your experience. It's more that what you're describing is so very different from what I'm seeing that it feels like there's got to be something more at play here.
There is a long tail of consequences for short-sighted policies and people monitoring stock prices always seem to be taken by surprise by them when they occur.
You should include additional factors in your analysis.
User goodwill. I'm a programmer and I can easily accept they don't care about us. Fine. But I've been hearing very casual users loudly complaining as well.
At some point sales drop below what executives find acceptable. That has already started for iPhones and has been the case for several years.
Depends on what you are doing? Our daughter has the original M1 Mac Mini with 8GiB, has a habit of keeping everything (PowerPoint, Word, DTP, web browsers, etc.) open, and the machine is still lightning-fast.
Developer that wants to run IntelliJ, a load of Docker containers, etc.? Sure, that's going to be tight, better get 32 or 64 GiB.
Remember that many Mac users are just folks that do web browsing, Office, and a bunch of other things and 16GiB is going to be enough for a few years.
That is correct[0], as known from the iPad M4 analysis.
I will say SVT-AV1 has had some significant ARM64 performance improvements lately (~300% YoY, with bitrate savings at a given preset [1][2], so call it a 400% increase), so for many use-cases software AV1 encoding (rather than hardware encoding) is likely the preferred modality.
The exceptions, IMO, are concurrent gaming-plus-streaming (niche on macOS?) and video server transcoding. But even these exceptions are tenuous: because Apple Silicon doesn't play x86's logical-core / boost-clock games, and considering the huge multi-threaded performance of the M4, I think SW AV1 encoding is quite feasible (for single streams) for both streaming and transcoding. x86 needs a dedicated AV1 encoder more so because of the single-threaded perf hit from running a multi-threaded background workload. And the bitrate efficiency will be much better with SW encoding.
That said, latency will suffer and I would still appreciate a HW AV1 encoder.
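For reference, a software SVT-AV1 encode is a one-liner through ffmpeg's libsvtav1 wrapper; the filenames are placeholders, and preset 6 / crf 32 are just plausible starting points, not settings from anything above:

    import subprocess

    subprocess.run([
        "ffmpeg", "-i", "input.mov",
        "-c:v", "libsvtav1",
        "-preset", "6",   # lower preset = slower but more efficient
        "-crf", "32",     # quality target; lower = higher quality
        "-c:a", "copy",
        "output.mkv",
    ], check=True)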
Edit to clarify: I don’t think reducing our carbon emissions is nonsense; that should be our top priority as a society. That’s also (part of) why AI is quite shit, honestly; I couldn’t care less if they were just burning money developing nothingburgers, but they’re also burning through all our natural resources at insane rates.
However, I do think the term “carbon neutral” is quite nonsensical and just seems like a term to make the consumer feel less guilty about themselves; hell, sometimes it’s even used to make the company execs feel better about themselves. I didn’t forget about that HORRIBLE, TERRIBLE “mother earth” commercial Apple ran. DISGUSTING.
Dunno, seems the opposite. Dysons are plastic, don't last long, and generally aren't serviceable.
Apple has been improving repairability; their laptops have many replaceable parts, and they generally last longer than PCs.
Compare the new price vs. the 4-year-old used price for Apple vs. any PC. In my experience used Apple machines hold up well, are reliable, still get years of OS updates, and generally age better than PCs.
So Apple just released their new caste-defined product line. Sure, they're technically good. But I don't know how they can claim any vision aligned with any legacy, they're basically a shiny walled-garden Dell.
From what I gather from the x.com gamedev corner of the web and elsewhere [1], Apple hardware is still completely unusable for them, or even more so after the switch to the ARM architecture, which is alien relative to desktop x86-64 PCs.
I wish Apple would invest in gaming, so that such capable hardware wouldn't be stuck with a puny market share of only 2% according to the Steam survey. [2]