
Nvidia ships quite a bit of Tesla hardware for GPGPU data center use; Amazon just bought a massive shipment of these racks for use through AWS.[1]

What's notable about Nvidia's Tesla offerings is that they sit in a separate 1-2U chassis on top of the compute box. The space and power costs of operating Nvidia GPGPUs in a datacenter are nontrivial.

If AMD ships a solid ARM product with some good on-die GPGPU components, that might compete with Nvidia, but otherwise the two are in different spaces even within the server world.

[1] http://vr-zone.com/articles/amazon-orders-more-than-10-000-n...




Tesla boards haven't shipped in a separate 1U form factor for a few years; they're all passively-cooled PCIe boards inside an x86 server chassis now.


Actually both setups are possible. Sometimes vendors put the Tesla PCIe cards in a separate chassis and link that chassis to the host via a PCIe cable, e.g.:

http://www.dell.com/us/business/p/poweredge-c410x/pd
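Either way the software side looks the same: whether the boards sit in the host chassis or hang off an expansion chassis like the C410x, they enumerate as ordinary PCIe devices. A minimal sketch using the CUDA runtime API (assuming a host with the CUDA toolkit installed; the printed PCIe IDs just show where each board lands on the bus):

    #include <cstdio>
    #include <cuda_runtime.h>

    int main() {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            fprintf(stderr, "cudaGetDeviceCount: %s\n", cudaGetErrorString(err));
            return 1;
        }
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            // pciDomainID/pciBusID/pciDeviceID locate the board on the PCIe
            // topology, regardless of which physical chassis it sits in.
            printf("GPU %d: %s (PCIe %04x:%02x:%02x)\n",
                   i, prop.name, prop.pciDomainID, prop.pciBusID, prop.pciDeviceID);
        }
        return 0;
    }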



