Ask HN: What is the ideal thermostat setting change when away from home?
2 points by idatum 3 months ago | 7 comments
Using home automation, when no one is home I turn the heating setpoint down and the cooling setpoint up. The goal is to avoid keeping the house warm/cool when no one is present.

Questions:

1. Is there an ideal thermostat setting based on the outside temperature?

2. Ignoring solar radiation warming, is the heat/cool loss in my home linear? Or is the rate based on the difference between inside and outside temperature?

For #2, in winter for example: if it's non-linear, then perhaps it's best to find the lowest setback temperature that minimizes the rate of heat loss from the house.




> Or is the rate based on the difference between inside and outside temperature?

Both conductive heat loss and "air leak" heat loss are dependent upon the temperature difference between inside and outside.

For conductive loss, the temperature differential determines the amount of heat that "moves" per unit time from the hot side to the cold side.

For "air leak" heat loss, the temperature difference between outside and inside determines the amount of "heat" lost/gained by the leaks per unit time (assuming the leaks are constant).

Are both "linear"? To a first-order approximation they are probably "linear enough" to treat them as linear for your use case (lower/raise the interior temp when no one is home). But also keep in mind that the greater the raise/lower, the longer any recovery will take, so factor that into your calculations as well.
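
A minimal sketch of that "linear enough" model, assuming Newton's law of cooling with made-up UA and time-constant values (not measurements of any real house), comparing the heat lost over an away period when holding the setpoint versus letting the house coast:

  import math

  # All numbers here are illustrative assumptions.
  UA_W_PER_K = 250.0   # assumed combined conduction + air-leak coefficient (W/K)
  TAU_H = 30.0         # assumed thermal time constant of the house (hours)
  T_OUT = 0.0          # outside temperature (C)
  T_SET = 21.0         # occupied setpoint (C)
  AWAY_H = 8.0         # hours away

  def loss_holding_kwh(t_in: float, hours: float) -> float:
      """Heat lost (and therefore supplied) while holding t_in constant."""
      return UA_W_PER_K * (t_in - T_OUT) * hours / 1000.0

  def loss_coasting_kwh(t_start: float, hours: float, steps: int = 1000) -> float:
      """Heat lost while the heating is off and the house drifts toward T_OUT."""
      dt = hours / steps
      total_wh = 0.0
      for i in range(steps):
          t_in = T_OUT + (t_start - T_OUT) * math.exp(-(i * dt) / TAU_H)
          total_wh += UA_W_PER_K * (t_in - T_OUT) * dt
      return total_wh / 1000.0

  print(f"hold {T_SET:.0f} C for {AWAY_H:.0f} h:  {loss_holding_kwh(T_SET, AWAY_H):.1f} kWh lost")
  print(f"coast from {T_SET:.0f} C for {AWAY_H:.0f} h: {loss_coasting_kwh(T_SET, AWAY_H):.1f} kWh lost")
  # Coasting always loses less, because the delta-T (and so the loss rate)
  # shrinks as the house cools; the heat the fabric gives up does have to be
  # paid back during recovery, which is where the recovery-time caveat applies.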


With heat pumps, the energy required to heat or cool the air is nonlinear, with larger temperature differences costing increasingly more. The relevant difference is between the air intake and the output within the house. This only matters for heat pumps with a variable-speed compressor, which can reduce that temperature difference to save energy. These are often advertised with "inverter" in the name or description.

The practical effect is that if you have a heat pump with a variable-speed compressor, heating or cooling your home quickly costs significantly more than doing so slowly, so you are best off leaving the thermostat at a constant temperature unless the price of energy changes significantly through the day.
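
A rough illustration of that nonlinearity, using a generic Carnot-fraction model with assumed numbers (not any particular heat pump): pushing the compressor harder increases the effective temperature lift between evaporator and condenser, which drags the COP down, so each kWh of heat delivered costs more electricity.

  # Illustrative toy model only; CARNOT_FRACTION and the lift values are assumptions.
  CARNOT_FRACTION = 0.45   # assumed real-world fraction of the Carnot limit

  def heating_cop(t_indoor_c: float, t_outdoor_c: float, extra_lift_k: float) -> float:
      """Approximate COP: the refrigerant must run hotter than the room and colder
      than the outside air, and higher compressor speed widens that gap."""
      t_cond_k = t_indoor_c + 273.15 + extra_lift_k
      t_evap_k = t_outdoor_c + 273.15 - extra_lift_k
      carnot_cop = t_cond_k / (t_cond_k - t_evap_k)
      return CARNOT_FRACTION * carnot_cop

  for extra_lift in (5.0, 10.0, 15.0):   # gentle, medium, flat-out (assumed values)
      cop = heating_cop(21.0, 0.0, extra_lift)
      print(f"extra lift {extra_lift:4.1f} K -> COP ~ {cop:.2f} "
            f"-> {1.0 / cop:.2f} kWh electricity per kWh of heat")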


Do you mean running at a lower flow temperature when you say "reduce the temperature difference"?


The heat pump runs the compressor motor at a lower RPM, which reduces the power needed for a given amount of heat pumped. Most heat pumps also reduce the blower fan speed proportionally, but the power savings come almost entirely from reducing the compressor speed.


OK, but is that while achieving the same flow temperature or a lower one?


The second law of thermodynamics comes in handy. And Stefan Boltzmann already thought about this, too :)

That given, the answer to #2 is: "the lowest temperature that minimizes the rate of heat loss is equal to the outside temperature. Equilibrium."

It makes sense to turn off the heat when leaving the house. During that time the house loses energy, and it makes sense not to spend more energy heating it to hold a certain temperature. The house will lose that heat one way or another anyway, so pulling the temps down makes total sense.


Heat loss/gain is proportional to internal-external delta T (and fabric U or R values, and ventilation).

In winter, allowing for time to warm up when you next expect to be there, you should consider the lowest temperature that will avoid damage from freezing of pipes and so on. Plus your fridge/freezer and a few other things will not enjoy getting very cold. FWIW I built in 6C as the 'frost protect' temperature for a heating control currently on the UK market.
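
A toy version of that winter policy (the constants are illustrative assumptions, not the commenter's actual product): clamp the away setpoint at a frost-protect floor and start reheating early enough, based on an assumed warm-up rate.

  # All constants are illustrative assumptions.
  FROST_PROTECT_C = 6.0      # floor mentioned above, for pipe/appliance protection
  COMFORT_C = 20.0           # occupied setpoint
  AWAY_SETBACK_K = 5.0       # how far we'd like to drop while away
  WARMUP_RATE_K_PER_H = 1.5  # assumed recovery rate of the heating system

  def away_setpoint() -> float:
      """Away setpoint, never below the frost-protect floor."""
      return max(COMFORT_C - AWAY_SETBACK_K, FROST_PROTECT_C)

  def preheat_lead_hours(current_c: float) -> float:
      """Hours before the expected return to switch back to the comfort setpoint."""
      return max(COMFORT_C - current_c, 0.0) / WARMUP_RATE_K_PER_H

  print(f"away setpoint: {away_setpoint():.1f} C")
  print(f"start reheating {preheat_lead_hours(15.0):.1f} h before returning, from 15.0 C")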



