Why thicker on top?
In the process of reading about building codes and retrofits it
seems a foregone conclusion that the roof always gets the thickest
insulation. It never seems to get questioned, simply done, often
as a stackup of multiple different insulation types and techniques
to reach the desired [and sometimes obscenely high] R-value. I
always wondered about just why this is, especially in the context
of a tight house in a heating-dominated climate.
Stopping almost all the air leakage should virtually eliminate
stack effect, allowing warm air to distribute fairly uniformly
and not immediately "head upstairs" to find a way out. Interior
convection doesn't seem to be much of a factor with human-sized
ceiling heights. Thus, in the absence of other factors I'll get
to in a second, it would seem that heat leakage through the walls
and roof would be equal per unit area anywhere around a uniform
envelope, with the same delta from inside to outside. And in the
"cathedralized attic" setup with insulation at the roofline, is
a surface at a 12-in-12 pitch a roof or a wall? It's somewhere
in between, really.
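To put a number on that intuition: steady-state conduction per unit
area is just delta-T over R, so two surfaces at the same R-value and
the same inside/outside delta leak identically per square foot no
matter what we call them. A quick sketch (the R-values and
temperatures here are placeholders, not my actual assembly):

    # Steady-state conduction per unit area: q = dT / R
    # (dT in F, R in ft2*F*hr/BTU, q in BTU/hr per ft2)
    def flux(dt_f, r_value):
        return dt_f / r_value

    dt = 70 - 20   # placeholder: 70F inside, 20F outside
    for name, r in [("wall @ R-25", 25), ("12-in-12 roof @ R-25", 25), ("roof @ R-60", 60)]:
        print(f"{name:22s} {flux(dt, r):.1f} BTU/hr*ft2")
    # same R and same delta-T -> same loss per square foot, wall or roof;
    # only a higher R (or a colder outside surface, more on that below)
    # changes the number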
Is it a matter of physical practicality, where typical construction
only allows so much wall thickness but we can dump a lot more fluffy
stuff into an attic space so we might as well go ahead? But even
the folks with double-stud or Larsen truss setups still cling to
something like the 10/20/40/60 model and want more up above.
I'll grant right away that it makes sense for buildings in heavy
cooling climates, to keep them from turning into solar ovens. But
that can depend quite a bit on the roofing material used. With only 4"
polyiso on the roof and walls here and *no* additional cavity/rafter
fill, I barely had to cool at all over the whole summer, and during
an extended absence with a setpoint of 78 or so, the kWh reading
after I got back told me that the system *never ran the entire time
I was gone*. In this case the high-reflectance metal roofing and
the "underspec" roof insulation perform just fine as a system.
On the other hand, I might have realized an interesting subtlety
about winter: night-sky radiation. The attic isn't directly supplied
from the main duct system and participates only a very modest amount
in the ventilation, so it's no surprise to me that it runs maybe 6 - 7
degrees colder than downstairs, but it doesn't get *inordinately* cold,
which leads me to conclude that the roof and wall insulation are
performing about the same as each other, and that's what I'd expect.
The walls get some additional benefit from the drywall on the inside
and whatever old excuse for insulation might have been in there
already, whereas the attic rafter bays are just open so I'm calling
it all roughly R-25 minus windows/doors. On some of the clear, cold
overnights in the low single digits, it almost felt like the 1.5-ton
heat pump was running a disproportionate amount of time, not exactly
struggling to keep up but close to it. Besides the potential low-
ambient "polyiso penalty" discussed recently in another thread,
consider the two included pictures. They show the thin strip of
roofing coming down alongside the shed dormer, at 45 degrees, but
with a fairly good view to the night sky past the trees in the yard.
It was about 30F and quite calm the night these were taken. The
little orange square is a patch of black gaff tape like I've stuck on
many other surfaces in an attempt to have uniform infrared emissivity
and get more accurate surface temperature measurements. The roof is an
unvented assembly of metal panels, slip-sheet and underlayment, upper
deck, 2" + 2" polyiso, and original decking in toward the attic.
The metal roof isn't bare shiny metal; it's a painted coating with
higher emissivity than that, but it still gives a misleading view
because it's also somewhat reflective in the IR and therefore looks
much colder when it's reflecting some of the night sky back to the
imager. The tape
normalizes that, and strongly suggests that the roof's effective
"white plate" temperature is considerably colder than the walls
of the house! The temp endpoints in the FLIR shots are locked
to similar values, and note that everything in view is pretty cold
anyway -- it's not like the house is hemorrhaging a lot of heat
in general; this is just to get some relative measurements. We can
see that the patch of sky beyond the roof and trees is registering
far colder than everything else. [If I point the imager straight up,
it's off the bottom of the scale at "< -40F".] My initial conclusion,
especially after reading about some of the successful NSR cooling
experiments in New Mexico, is that on clear nights the roof itself
must be getting colder than ambient and increasing the effective temp
delta across the upward-facing surfaces of the house.
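Here's the napkin math behind that hunch: treat the clear sky as a
blackbody at some effective temperature, balance the roof's radiation
to it against convection from the air, and see where the surface
settles. Every input here (effective sky temperature, emissivity,
film coefficient, roof area) is a guess for illustration, not a
measurement:

    SIGMA = 5.67e-8                     # Stefan-Boltzmann constant, W/m2*K4

    def f_to_k(t_f):
        return (t_f - 32) * 5 / 9 + 273.15

    # Assumed inputs -- illustration only
    T_AIR  = f_to_k(30)                 # calm ~30F night like the photos
    T_SKY  = f_to_k(-10)                # effective clear-sky temperature (guess)
    EMISS  = 0.85                       # painted metal roofing (guess)
    H_CONV = 8.0                        # convective film coefficient, W/m2*K (guess)

    def net_gain(t_surf):
        """Net heat flow into the roof surface, W/m2: convection in minus
        long-wave radiation out to the sky (conduction from inside ignored)."""
        radiated  = EMISS * SIGMA * (t_surf**4 - T_SKY**4)
        convected = H_CONV * (T_AIR - t_surf)
        return convected - radiated

    # Bisect for the surface temperature where gains and losses balance
    lo, hi = T_SKY, T_AIR
    for _ in range(60):
        mid = (lo + hi) / 2
        if net_gain(mid) > 0:
            lo = mid                    # still gaining heat: equilibrium is warmer
        else:
            hi = mid                    # losing heat: equilibrium is colder

    t_eq_f = (hi - 273.15) * 9 / 5 + 32
    extra_dt = 30 - t_eq_f              # how far below the 30F air it sits
    print(f"roof surface settles near {t_eq_f:.0f}F, ~{extra_dt:.0f}F below ambient")
    # extra conduction that adds through R-25, for a made-up 1000 ft2 of roof:
    print(f"extra loss ~ {extra_dt * 1000 / 25:.0f} BTU/hr")

With those guesses the panels settle a good ten-plus degrees below
the air on a clear, calm night.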
Okay, so maybe I'm having trouble with, uh, nocturnal emissions.
At a theoretical ballpark around 500 BTU/hr over the relevant area of
my roof it doesn't seem like it would make a huge difference, but one
could still argue that a tad more insulation on upper surfaces *would*
make sense here even in winter. That likely goes for any other roofing
materials as well since there's nothing particularly special about the
emissivity of the metal panels. What I should really do is figure out
a way to bury a temp sensor under one of the panels, to determine if
this effect really is chilling down the upper decking to ten or more
degrees below everything else. If all the exterior house surfaces
were simply showing outgoing heat flux in the infrared and the roof were
losing more than the walls, on a windless night it would look warmer
up there, not colder. I suspect that the vinyl siding experiences
some amount of NSR too, as it consistently scans a little below
local ambient, and it's not in direct contact with the walls
except where the vertical strapping and the window bucks are.
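That would square with the usual isotropic-sky view factor,
F = (1 + cos(tilt)) / 2: a vertical wall only sees about half as much
sky as a flat roof, so it gets the same treatment but a milder dose.
Same caveat that the numbers below are guesses:

    import math

    SIGMA = 5.67e-8                      # W/m2*K4

    def f_to_k(t_f):
        return (t_f - 32) * 5 / 9 + 273.15

    # Guesses for illustration: surface and ground near air temp, cold clear sky
    T_SURF, T_SKY, T_GROUND = f_to_k(30), f_to_k(-10), f_to_k(30)
    EMISS = 0.88                         # painted metal / vinyl, rough guess

    def net_loss(tilt_deg):
        """Net long-wave loss (W/m2) for a surface at air temperature, split
        between cold sky and near-ambient surroundings by the view factor."""
        f_sky = (1 + math.cos(math.radians(tilt_deg))) / 2
        q_sky = f_sky * EMISS * SIGMA * (T_SURF**4 - T_SKY**4)
        q_gnd = (1 - f_sky) * EMISS * SIGMA * (T_SURF**4 - T_GROUND**4)
        return q_sky + q_gnd

    for name, tilt in [("flat roof", 0), ("12-in-12 roof", 45), ("wall / siding", 90)]:
        f_sky = (1 + math.cos(math.radians(tilt))) / 2
        print(f"{name:14s} sky view {f_sky:.2f}, net loss ~{net_loss(tilt):.0f} W/m2")
    # the siding sheds heat to the sky at roughly half the rate of a flat
    # roof, so it should scan a bit below ambient but nowhere near as far
    # below as the roof does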
Well, enough of that nerdfest for now; what other reasons are in the
collective wisdom for having more insulation at the top of a building??
Posted Mon, 02/10/2014 - 12:20