It is still not viable.

Assume your datacenter runs at 50C (122F) and the temperature outside is -40C (-40F): using the datacenter heat to generate electricity has a theoretical maximum efficiency of 27.85%. If your datacenter is at 23C (73F) and the outside temperature is 0C (32F), then the theoretical maximum is 7.77%. Note the words 'theoretical maximum'; in reality it is probably at least 10 times worse than that.
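
For concreteness, a minimal sketch of the Carnot calculation behind those percentages (plain Python; temperatures must be converted to kelvin):

    # Carnot limit for a heat engine between a hot and a cold reservoir.
    # Temperatures must be absolute (kelvin); results match the figures above.
    def carnot_efficiency(t_hot_c, t_cold_c):
        return 1.0 - (t_cold_c + 273.15) / (t_hot_c + 273.15)

    print(f"{carnot_efficiency(50, -40):.2%}")  # 27.85% (50C datacenter, -40C outside)
    print(f"{carnot_efficiency(23, 0):.2%}")    # 7.77%  (23C datacenter, 0C outside)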



You assume DCs are cooled by the whole volume of air inside, when in fact you can put heat pipes on components and get closer to a 70-90 degree heat source. Even years ago when I worked in datacenters, the air was circulated from outside to the server rack and then back outside, never crossing from the rack into the server room.
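
To put numbers on that, a sketch reusing the Carnot formula from above, assuming those 70-90 degrees are Celsius and carrying over the 0C outside temperature from the earlier example:

    # Carnot limit with the hotter 70-90C source that heat pipes could provide.
    # 0C outside is an assumption carried over from the earlier comment.
    for t_hot_c in (70, 90):
        eff = 1.0 - 273.15 / (t_hot_c + 273.15)
        print(f"{t_hot_c}C source: {eff:.1%}")  # 70C -> 20.4%, 90C -> 24.8%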

Newer generations of CPUs are going to be exchanged less frequently but will produce more heat as they offer denser computing. This makes a case for investing more in the server hardware.

Even then, understand that 7% of a huge amount of energy is still a huge amount of energy.
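
As a back-of-envelope illustration (the 100 MW facility size is hypothetical, not a figure from this thread):

    # 7% of a large datacenter's heat output is still a lot of power.
    datacenter_heat_mw = 100  # hypothetical facility; essentially all input power ends up as heat
    ceiling = 0.07            # the ~7% theoretical maximum discussed above
    print(f"{datacenter_heat_mw * ceiling:.0f} MW recoverable at the theoretical ceiling")  # 7 MW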

Edit: Apparently a "Facebook datacenter in Denmark is supplying around 10.000 homes with heating" (source: another poster in this thread).

So... you need to rethink your definition of what is and what is not viable.


You can supply heat, no problem. What I was talking about was recovering electricity from heat, which is physically limited by the Carnot efficiency. The percentages I gave were Carnot efficiencies. Note that it is impossible to build a true Carnot engine in real life, so actual efficiency is much, much lower.
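
A rough sketch of that gap between the Carnot ceiling and practice; the 30-50% "fraction of Carnot" range here is an assumption typical of organic Rankine cycle systems on low-grade heat, not a figure from this thread:

    # How far real heat engines fall below the Carnot ceiling.
    carnot_limit = 1.0 - 273.15 / 296.15   # 7.77%: 23C datacenter, 0C outside
    for fraction in (0.3, 0.5):            # assumed achievable fraction of Carnot (ORC-style)
        print(f"{fraction:.0%} of Carnot -> {carnot_limit * fraction:.1%} overall")
    # 30% of Carnot -> 2.3% overall, 50% of Carnot -> 3.9% overall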



