Using home automation, when no one is home I lower the thermostat's heating setpoint and raise its cooling setpoint. The goal is to avoid keeping the house warm/cool when no one is present.
Questions:
1. Is there an ideal thermostat setting based on the outside temperature?
2. Ignoring solar gain, is the heat/cool loss in my home linear over time, or does the rate depend on the difference between inside and outside temperature?
For #2, in winter for example, if the loss is non-linear then perhaps it's best to find the setback temperature that minimizes the rate of heat loss from the house.
Both conductive heat loss and "air leak" (infiltration) heat loss depend on the temperature difference between inside and outside.
For conductive loss, the temperature differential determines how much heat "moves" per unit time from the warm side to the cold side.
For "air leak" heat loss, the temperature difference between outside and inside determines the amount of "heat" lost/gained by the leaks per unit time (assuming the leaks are constant).
Are both "linear"? To a first order approximation they are probably "linear enough" to treat them as "linear" given your use case (lower/raise interior temp when no one is home). But also consider that the greater the raise/lower, the longer any recovery will take, so consider than in your calculations as well.