Energy impact of a high-temp water heater with a thermostatic mixing valve?
How much more or less energy will my water heater use if I increase the internal setpoint and install a thermostatic mixing valve?
Okay, let’s say I have a bog-standard electric resistive water heater set to 120F to protect against accidental scalding while still at least discouraging Legionella from breeding.
I then install a thermostatic mixing valve above the heater and bump the internal setpoint to 160F. The valve is set to mix in enough 55F cold water to produce a 120F flow into the house. This has two notable benefits: a higher output of usable hot water without having to get a bigger tank, and the elimination of any Legionella in the tank.
Using a calculator from the OmniCalculator site, I enter 400 lbs (roughly 50 gallons) of water heated from 55F to 120F at 100% efficiency. That takes 7.6 kWh. Changing the target to 160F takes about 60% more energy: 12.3 kWh. Oof.
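For anyone who wants to double-check without the calculator, here is the same arithmetic as a quick Python sketch (the constant and function name are mine, not from the calculator):

```python
# Energy to heat water: Q = m * c * dT, with c = 1 BTU/(lb*F) for water.
BTU_PER_KWH = 3412  # 1 kWh is about 3412 BTU

def heating_energy_kwh(mass_lb, t_in_f, t_out_f):
    """kWh to raise mass_lb of water from t_in_f to t_out_f at 100% efficiency."""
    return mass_lb * (t_out_f - t_in_f) / BTU_PER_KWH

print(heating_energy_kwh(400, 55, 120))  # ~7.6 kWh, the 120F baseline
print(heating_energy_kwh(400, 55, 160))  # ~12.3 kWh at the 160F setpoint
```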
But remember that I’m not drawing as much hot water anymore! If my internal setpoint is 160F, my cold water is 55F, and my valve setpoint is 120F, then 160p + 55(1-p) = 120 solves to p = 65/105, or about 60% hot water mixed with 40% cold. Put another way, I only need about 0.6 gallons of tank-heated water for every gallon of 120F water delivered, since the rest of the heat in that gallon comes along with the “cold” water.
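That mixing equation, rearranged and put into code (the function name is mine):

```python
def hot_fraction(t_hot_f, t_cold_f, t_mix_f):
    """Fraction p of tank water in the blend, from t_hot*p + t_cold*(1-p) = t_mix."""
    return (t_mix_f - t_cold_f) / (t_hot_f - t_cold_f)

print(hot_fraction(160, 55, 120))  # ~0.619, i.e. roughly 60% hot / 40% cold
```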
I think I can then redo my calculation with 400 lbs * 0.6 = 240 lbs of water to heat. Re-running the numbers, I get only about 7.4 kWh. That’s less energy than the baseline!
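Here is that redo in code, reusing the two functions above. One thing I noticed while writing it: the answer is sensitive to rounding the mixing fraction to exactly 60%, so I’ve printed both the rounded and unrounded versions:

```python
mass_rounded = 400 * 0.6                       # 240 lb, using the rounded 60% figure
mass_exact = 400 * hot_fraction(160, 55, 120)  # ~247.6 lb with the unrounded fraction

print(heating_energy_kwh(mass_rounded, 55, 160))  # ~7.4 kWh
print(heating_energy_kwh(mass_exact, 55, 160))    # ~7.6 kWh, same as the baseline
```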
How realistic is this? Is my logic or math off anywhere? Has anybody done any real-world testing on this who can report actual numbers?
(This question was reposted without the link to the calculator, since this forum doesn’t like links)