The cost of electricity to run a single RTX 4090 for a year: about $800 (calculation below). Cost to the company of the median developer: $200,000.

Assume a cost of 20 cents per kWh; then an RTX 4090 with a 450 W power draw running for a year costs 0.45 kW * 24 h/day * 365 days * $0.20/kWh = $788.40.
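
A quick sketch of that arithmetic in Python (the variable names are illustrative; assumes a constant 450 W draw and a flat $0.20/kWh rate):

    # Annual electricity cost for one GPU running non-stop.
    power_kw = 0.450            # RTX 4090 power draw, in kW
    hours_per_year = 24 * 365   # 8,760 hours
    rate_usd_per_kwh = 0.20     # assumed flat electricity rate
    annual_cost = power_kw * hours_per_year * rate_usd_per_kwh
    print(f"${annual_cost:.2f}")  # -> $788.40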

Moreover, considering the cost of the card itself at $1,600, the electricity expense becomes relatively minor. In fact, the card's purchase price would cover roughly two years of non-stop operation ($1,600 / $788.40 per year ≈ 2 years). This highlights how the cost of electricity is a small factor in the overall picture.
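
For what it's worth, the same back-of-the-envelope comparison, continuing the sketch above (assumed figures: $1,600 card, $788.40/year in electricity):

    card_price = 1_600.00
    annual_electricity = 788.40   # from the calculation above
    # Years of non-stop electricity the card's purchase price would buy.
    print(f"{card_price / annual_electricity:.1f} years")  # -> 2.0 years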


No, this highlights that electricity is virtually all of the total cost of ownership (TCO) over the life of the device. It doesn't sound like you've done a lot of TCO work before. In any case, we're talking about facilities containing tens of thousands of these devices, so the TCO of one of them is irrelevant.


I don't understand your point. Yes, electricity is virtually all of the TCO of the device (other than the upfront cost). Isn't that the case for any consumer electronics part: processor, memory, HDD, etc.? That fact doesn't change what I'm trying to say.

My point is that this cost (of electricity) is negligible relative to two other costs: the cost of time (in other words, the cost of human capital) and the upfront cost of the device itself.
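
To put the magnitude in perspective, continuing the same sketch (assumed figures: $200,000/year developer, $788.40/year in electricity):

    median_dev_cost = 200_000.00
    annual_electricity = 788.40
    # Roughly how many GPU-years of electricity one developer-year buys.
    print(f"{median_dev_cost / annual_electricity:.0f}x")  # -> 254x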

This is a response to your comment "Wait, why? Is it more important to organizations to wait a little less time, or to spend less money (on electricity)?"

