They have SeaMicro: http://www.seamicro.com/
And given that Nvidia has never tried to do anything on the server side, AMD might already be ahead of many others.
Nvidia ships quite a bit of Tesla hardware for GPGPU data center use; Amazon just bought a massive shipment of these racks for use through AWS.[1]
What's notable about Nvidia's Tesla offerings is that they sit in a separate 1-2U chassis on top of the compute box. The space and power costs of operating Nvidia GPGPUs in a datacenter are nontrivial.
If AMD ships a solid ARM product with some good on-die GPGPU components, that might compete with Nvidia, but otherwise the two are in different spaces even within the server world.
Actually, both setups are possible. Sometimes vendors put the Tesla PCIe cards in a separate chassis and link that chassis to the host via a PCIe cable, e.g.: