
They have SeaMicro: http://www.seamicro.com/ And given that Nvidia has never really tried to do anything on the server side, it might be that AMD is already ahead of many others.



Nvidia ships quite a bit of Tesla hardware for GPGPU data center use; Amazon just bought a massive shipment of these racks for use through AWS.[1]

What's notable about Nvidia's Tesla offerings is that they sit in a separate 1-2U chassis on top of the compute box. The space and power costs of operating Nvidia GPGPUs in a datacenter are nontrivial.

If AMD ships a solid ARM product with some good on-die GPGPU components, that might compete with Nvidia, but otherwise the two are in different spaces even within the server world.

[1] http://vr-zone.com/articles/amazon-orders-more-than-10-000-n...


Tesla boards haven't shipped in a separate 1U form factor for a few years; they're all passively-cooled PCIe boards inside an x86 server chassis now.


Actually, both setups are possible. Sometimes vendors put the Tesla PCIe cards in a separate chassis and link that chassis to the host via a PCIe cable, e.g.:

http://www.dell.com/us/business/p/poweredge-c410x/pd
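
Worth noting: either way, the host sees the same thing. Whether a Tesla board sits in the server chassis or in an external PCIe enclosure like the C410x, the CUDA runtime just enumerates it as another PCIe device. A minimal sketch, assuming the CUDA toolkit is installed (device names and PCI bus/device IDs are whatever your particular setup reports):

    // enumerate.cu -- list the CUDA devices visible to this host,
    // along with where each one sits on the PCIe fabric.
    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            std::fprintf(stderr, "cudaGetDeviceCount failed: %s\n",
                         cudaGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // pciBusID/pciDeviceID show the board's location on the PCIe bus,
            // regardless of whether it's in the host or an expansion chassis.
            std::printf("GPU %d: %s (PCI %02x:%02x)\n",
                        i, prop.name, prop.pciBusID, prop.pciDeviceID);
        }
        return 0;
    }

Compile with nvcc and run on the host; externally attached boards show up in the list just like internal ones.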


I was thinking a great deal about SeaMicro the moment I read this announcement.



