My takeaway is that somebody, somewhere, had to get a $50k Mac replaced and Apple has one to refurbish. I suppose it's not that surprising given that used luxury cars are a thing, but a used workstation at this price point is rare.
Curious about the low amount of SSD storage (compared to the rest of the specs), given that raw video tends to be very large. I guess they would be using networked storage for everything?
If you're really doing video, I suspect you'd use storage arrays (perhaps SSD arrays at the high end) instead of trying to cram it all into your workstation. Not to mention you'd want redundancy (and sharing) for production-type work.
A lot of editors will keep all of their footage on some form of networked storage until they need to work on a specific piece of it. Bring it to the machine, edit it, and send it back to the storage array.
I work in the same industry, and will say you don't need to max out every component on a PC for peak real-time graphics performance. Yeah, you might need to max out a few parts, but not every part.
They threw a silly amount of RAM in it, so ended up using one of the AMD server CPUs/mainboards rather than a Threadripper, since the Threadripper could only leverage 256G of RAM. $1200 a stick... woof. I doubt the workload they put on things used that much memory. It would have been interesting to pair a cheaper 24-core CPU ($1300) with 256G of memory ($1000) or less and run the same test.
If we're going for comparisons, the $43k+ refurb Mac Pro linked at the top of this page still has 50% more RAM than that PC does.
I think the video's conclusion is spot on -- blindly comparing specs isn't really helpful for judging real world usability of hardware that is this high end. You should get the specific resources matched to your workload. That might be a Mac Pro, it might be a server in a closet with a TB of RAM, or it might be a PC with a couple of Quadros in it. Or maybe it's just a netbook with a Celeron :)
> They threw a silly amount of RAM in it, so ended up using one of the AMD server CPUs/mainboards rather than a Threadripper, since the Threadripper could only leverage 256G of RAM. $1200 a stick... woof.
Are you arguing that no one needs >256G of memory? Or are you arguing that no one needs ECC memory? Or are you arguing that server chips and motherboards shouldn't exist? Or is your complaint that Apple sells a workstation? Or maybe you think workstations should be illegal?
Should car makers only sell cars that you, personally, would like to buy? Should all manufacturers only sell things that you personally find useful? And if you don't find this computer useful how valuable is it for 100 people to chime in to say that? (IMHO opinion polls are better suited to that level of commentary.)
> I doubt the workload they put on things used that much memory
All anecdata to the contrary, some workloads do in fact benefit from obscene amounts of memory. And if you've ever looked into it, supporting >1TB of memory requires expensive dense memory modules, because there are only so many memory buses on the chip and only so many sockets you can put on a bus.
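As a rough illustration (the channel and slot counts here are typical-sounding assumptions on my part, not figures from the video):

```python
# Capacity ceiling = memory channels x DIMMs per channel x module density.
# 8 channels / 2 DIMMs per channel are assumed values for illustration.
channels = 8
dimms_per_channel = 2
slots = channels * dimms_per_channel            # 16 sockets total

print(slots * 64, "GB with commodity 64GB sticks")   # 1024 GB: the ceiling with cheap modules
print(slots * 128, "GB with dense 128GB sticks")     # 2048 GB: possible, but $/GB jumps sharply
```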
> Are you arguing that no one needs >256G of memory?
For context, I'm working in cloud infrastructure. Due to customer demand (esp. from customers with large in-memory DBs), our newest set of ultra-large VM flavors goes up to 3 and 6 TiB RAM. I'm not directly involved with compute, but from what I gather, it's an interesting challenge esp. on the operations side. You can migrate a 4 GiB VM in a pinch if the hypervisor is looking bad. Migrating a 6 TiB VM, however, takes a substantial amount of time (22 minutes if you have a 40 Gbps link and can saturate it, which you usually can't).
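That 22 minutes is just the raw transfer time; a quick sanity check, assuming a fully saturated 40 Gbps link and ignoring the dirty-page re-copies a live migration actually has to do:

```python
# How long does 6 TiB take over a saturated 40 Gbps link? (best case, pure transfer time)
ram_bits = 6 * 2**40 * 8        # 6 TiB of guest memory, in bits
link_bps = 40 * 10**9           # 40 Gbps line rate

print(f"{ram_bits / link_bps / 60:.1f} minutes")   # ~22.0 minutes
```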
I'm thinking about what a workstation-class machine should look like, I guess. All things being equal, I'd love there to be no (realistic) upper limit on memory. ECC should have been supported by Threadripper too, IMHO. I've only got 128G on my workstation and am already considering filling the extra four slots. 64-512G is probably the normal memory range for a workstation-class machine.
My complaint is the testing methodology. Server vs. server is a different problem with different requirements. They compared against a workstation, then built a "workstation" with memory requirements more typical of a server-class machine. I suspect a cheaper non-server CPU, with a memory footprint more typical of a workstation load, would have held up. One of the likely reasons AMD gimped the Threadripper was to ensure it didn't compete with the Epyc product line.
As for the questions: no, no, no, no, no, no, no, and no. :)
I was curious as to whether this would indeed be possible. But the CPU (W-3275M) itself starts at $6k, the Vegas are $8k each, and while I can't find any catalogue prices, each slab of RAM comes in at around $600, which puts us at $30k. It's not unthinkable that the remaining hardware adds another $5k to the bill, so you're looking at a $35k machine.
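Rough tally, assuming a dozen 128GB modules to hit 1.5TB (my assumption; list-price guesses, not actual quotes):

```python
# Rough parts tally (list-price guesses, not quotes)
cpu  = 6_000          # Xeon W-3275M
gpus = 2 * 8_000      # two workstation Vegas
ram  = 12 * 600       # assuming 12 x 128GB sticks to reach 1.5TB

print(cpu + gpus + ram)            # 29200 -> "roughly $30k"
print(cpu + gpus + ram + 5_000)    # 34200 -> "~$35k" with the remaining hardware
```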
You can probably replace some of these with consumer-grade parts, but I'll be impressed if you can build a machine with two Titans, 28 cores, and 1.5TB of memory for under $10k.
It seems like they're maybe Apple-exclusive, so it's possible they're drastically marked up. The Titan RTX is $2,500 apiece and seems somewhat comparable.
I think your issue may be with the exponential scaling of GPU cost at the highest end. As a consumer, you wish that a card with ~2x the Titan's TFLOPS only cost twice as much, but it doesn't work that way on either platform, and of course it isn't helped by the workstation card market being smaller by unit volume than the retail/OEM gaming card market.
That said I did not downvote you; it’s an understandable frustration.
They "still have unrevealed Intel Macs" as per the keynote; given that the 2013 iMac is still supported for Big Sur, we almost certainly still have 7 years of support for both architectures in MacOS.
Jobs said the same thing (about new Macs) about PowerPC in 2006. By 2007, Macs were all Intel, and the Intel-only Snow Leopard debuted in 2009. Complete abandonment of Leopard in 2011.
All I can think is: haven't we heard all this before, multiple times? Is this just the next generation going through what sounds like a good idea but in practice isn't?
There's no need to play that game, and it's far more rewarding to define and play your own game with a smaller audience. I had cut myself off from social media and blogging but got back into it recently because I felt there was an audience I could benefit: https://blog.kowsheek.com/should-i-start-a-blog/
That data structures and algorithms book and course is the root of so much distasteful smugness in the tech world; it's really bizarre.
[Edit]: It's almost like it was the first course we found very difficult, so we impose that imprint of challenge on everyone else.