Why thicker on top?
In the process of reading about building codes and retrofits it
seems a foregone conclusion that the roof always gets the thickest
insulation. It never seems to get questioned, simply done, often
as a stackup of multiple different insulation types and techniques
to reach the desired [and sometimes obscenely high] R-value. I
always wondered about just why this is, especially in the context
of a tight house in a heating-dominated climate.
Stopping almost all the air leakage should virtually eliminate
stack effect, allowing warm air to distribute fairly uniformly
and not immediately “head upstairs” to find a way out. Interior
convection doesn’t seem to be much of a factor with human-sized
ceiling heights. Thus, in the absence of other factors I’ll get
to in a second, it would seem that heat leakage through the walls
and roof would be equal per unit area anywhere around a uniform
envelope, with the same delta from inside to outside. And in the
“cathedralized attic” setup with insulation at the roofline, is
a surface at a 12-in-12 pitch a roof or a wall? It’s somewhere
in between, really.
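To put numbers on that intuition: steady-state conduction through an assembly is just the temperature difference divided by the R-value, per unit area, regardless of the surface's orientation. A minimal sketch — the temperatures and R-values below are illustrative assumptions, not measurements from my house:

```python
# Steady-state conductive loss per square foot: q = deltaT / R.
# The delta-T and R-values here are made-up illustrative numbers.

def flux_btu_hr_ft2(delta_t_f, r_value):
    """Heat flux (BTU/hr per ft^2) through an assembly of a given R-value."""
    return delta_t_f / r_value

delta_t = 68 - 20  # 48F indoor-to-outdoor difference
wall_or_roof_r25 = flux_btu_hr_ft2(delta_t, 25)  # 1.92 BTU/hr per ft^2
roof_r60 = flux_btu_hr_ft2(delta_t, 60)          # 0.80 BTU/hr per ft^2
```

Same R, same delta-T, same loss per square foot — whether the surface is vertical, horizontal, or pitched at 12-in-12.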
Is it a matter of physical practicality, where typical construction
only allows so much wall thickness but we can dump a lot more fluffy
stuff into an attic space so we might as well go ahead? But even
the folks with double-stud or Larsen truss setups still cling to
something like the 10/20/40/60 model and want more up above.
I’ll grant right away that it makes sense for buildings in heavy
cooling climates, to keep them from turning into solar ovens. But
that can depend quite a bit on roofing material used. With only 4″
polyiso on the roof and walls here and *no* additional cavity/rafter
fill, I barely had to cool at all over the whole summer and during
an extended absence with a setpoint of 78 or so, the kWh reading

after I got back told me that the system *never ran the entire time
I was gone*. In this case the high-reflectance metal roofing and
the “underspec” roof insulation perform just fine as a system.
On the other hand, I might have realized an interesting subtlety
about winter: night-sky radiation. The attic isn’t directly supplied
from the main duct system and participates only a very modest amount
in the ventilation, so it’s no surprise to me that it runs maybe 6-7
degrees colder than downstairs. But it doesn’t get *inordinately* cold,
which leads me to conclude that the roof and wall insulation are
performing about the same, which is what I’d expect.
The walls get some additional benefit from the drywall on the inside
and whatever old excuse for insulation might have been in there
already, whereas the attic rafter bays are just open so I’m calling
it all roughly R-25 minus windows/doors. On some of the clear, cold
overnights in the low single digits it almost felt like the 1.5 ton
heat pump was running a disproportionate amount of time, not exactly
struggling to keep up but close to it. Besides the potential low-
ambient “polyiso penalty” discussed recently in another thread,
consider the two included pictures. They show the thin strip of
roofing coming down alongside the shed dormer, at 45 degrees, but
with a fairly good view to the night sky past the trees in the yard.
It was about 30F and quite calm the night these were taken. The
little orange square is a patch of black gaff tape like I’ve stuck on
many other surfaces in an attempt to have uniform infrared emissivity
and get more accurate surface temperature measurements. It’s an
unvented assembly of metal panels, slip-sheet and underlayment, upper
deck, 2″ + 2″ polyiso, and original decking in toward the attic.
The metal roof isn’t shiny bare metal; it’s a painted coating with
much higher emissivity than bare metal, but it still gives a misleading
view because it’s also somewhat reflective in the IR and therefore looks
much colder when it’s reflecting some of the night sky back to the imager.
normalizes that, and strongly suggests that the roof’s effective
“white plate” temperature is considerably colder than the walls
of the house! The temp endpoints in the FLIR shots are held locked
to similar values, and note that everything in view is pretty cold
anyways — it’s not like the house is hemorrhaging a lot of heat
in general, this is just to get some relative measurements. We can
see that the patch of sky beyond the roof and trees is registering
far colder than everything else. [If I point the imager straight up,
it’s off the bottom of the scale at “< -40F”.]
My initial conclusion, especially after reading about some of the successful NSR cooling
experiments in New Mexico, is that on clear nights the roof itself
must be getting colder than ambient and increasing the effective temp
delta across the upward-facing surfaces of the house.
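A back-of-envelope Stefan-Boltzmann check supports this. The inputs here are assumptions — painted-metal emissivity around 0.85 and an effective clear-sky temperature of -25C — not measured values:

```python
# Net radiant exchange between a surface and the night sky,
# treated as gray-body exchange with an effective sky temperature.
# Emissivity and sky temperature below are assumed, not measured.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def net_radiative_loss_wm2(t_surf_c, t_sky_c, emissivity=0.85):
    """Net radiant flux (W/m^2) from a surface to an effective sky temp."""
    ts_k = t_surf_c + 273.15
    tsky_k = t_sky_c + 273.15
    return emissivity * SIGMA * (ts_k**4 - tsky_k**4)

# Roof skin at 30F (-1C) facing an assumed -25C effective clear sky:
q_wm2 = net_radiative_loss_wm2(-1.0, -25.0)  # roughly 80 W/m^2
q_btu_ft2 = q_wm2 / 3.154                    # 1 BTU/hr·ft^2 = 3.154 W/m^2
```

With no sun and calm air, the roof skin keeps cooling until that radiant loss is balanced by convection from the ambient air plus conduction up from below — which is exactly how it ends up sitting below ambient.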
Okay, so maybe I’m having trouble with, uh, nocturnal emissions.
At a theoretical ballpark around 500 BTU/hr over the relevant area of
my roof it doesn’t seem like it would make a huge difference, but one
could still argue that a tad more insulation on upper surfaces *would*
make sense here even in winter. That likely goes for any other roofing
materials as well since there’s nothing particularly special about the
emissivity of the metal panels. What I should really do is figure out
a way to bury a temp sensor under one of the panels, to determine if
this effect really is chilling down the upper decking to ten or more
degrees below everything else. If all the exterior house surfaces
simply illustrated outgoing heat flux in the infrared and the roof was
losing more than the walls, on a windless night it would look warmer
up there, not colder. I suspect that the vinyl siding experiences
some amount of NSR as well, since it consistently scans a little below
local ambient, and it’s not in direct contact with the walls
except where the vertical strapping and the window bucks are.
Well, enough of that nerdfest for now; what other reasons are in the
collective wisdom for having more insulation at the top of a building??
I think you have accurately named the reasons that codes and traditional building practices call for higher R-values for ceilings than for walls.
By far the most important reason: in a conventional house with a vented unconditioned attic, it's much cheaper to add insulation to the attic floor than it is to walls. Deeper insulation makes sense for attics, because the incremental cost to go from 8 inches to 16 inches is less than it is for walls.
Also, ice dams are a major concern in a cold climate with snow. A roof needs a high insulation value to minimize snow melt and the associated damaging ice formation. Walls do not share this concern. An R-20 wall can be fine, but an R-20 roof can be a really bad thing even if the heat flux from all points in the building envelope is the same.
Night sky cooling stopping assembly:
Same metal roof laid over 1.5" strapping over foil-faced foam. Would not much of the night-time cooling be stopped right there at the foil?
I am always amazed at how well a heat shield works with just a small gap... like the ones we use around chimneys and woodstoves, or the shield you'll see on your vehicle's exhaust system where it exits the engine, there to keep the spark plug wires from heat damage.
It's common for roof surfaces to drop as much as 20F or more below the ambient air temp on clear nights (limited only by the dew point of the air), or rise to 50F or more above the ambient air temp on sunny days. The radiational cooling & solar gains of walls aren't nearly as severe, since the walls are facing a landscape, half or more of which is radiating at or near the ambient air temp rather than the coldness of outer space in the night sky, or the intense radiation of the sun during the day.
On a clear winter night (or even a clear DAY), go ahead and point the IR-cam at the sky (away from the sun, if it's a clear day), then point it at the ground. You could also do this with an IR thermometer. It's pretty amazing how cold the radiant temp of the sky is when the moisture content of the air is low, even during the day. But during the day that high-intensity hot spot brings the AVERAGE radiant temp WAY up.
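For anyone who wants to ballpark that sky radiant temperature from ordinary weather data, here's a rough sketch using the Berdahl-Martin clear-sky emissivity correlation. Treating the sky as a gray body at the air temperature is a simplification, and cloud cover and elevation corrections are ignored:

```python
# Approximate effective clear-sky radiant temperature from air temp
# and dew point, via the Berdahl-Martin clear-sky emissivity
# correlation. Simplified: no cloud, elevation, or diurnal terms.

def clear_sky_temp_c(t_air_c, t_dewpoint_c):
    """Effective clear-sky radiant temperature in deg C (rough)."""
    e_sky = 0.711 + 0.0056 * t_dewpoint_c + 0.000073 * t_dewpoint_c**2
    t_air_k = t_air_c + 273.15
    return t_air_k * e_sky**0.25 - 273.15

# Calm 30F (-1C) night with a -10C dew point:
t_sky = clear_sky_temp_c(-1.0, -10.0)  # roughly -28C, far below air temp
```

Drier air means lower sky emissivity, so the apparent sky temperature drops fast as the dew point falls — consistent with the "< -40F" reading when pointing the imager straight up on a dry night.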
Trying to do vented strapping would be interesting as there's
a whole 'nother solid plywood deck over the foam. Maybe strapping
that followed the rafters between the foam and the plywood, with
the headloks run all the way through the combination ... we were
actually thinking about something like that for the retrofit but
it sounded like more of a PITA than it was worth especially with
the occasional difficulty finding rafters at all.
Heh, do I just need to put together a giant Reflectix blankie for
the house now??
I was doing some informal "audit" around an acquaintance's house the
other night, and as they live on a hill where you could almost look
down on their roof by walking up the street a ways, I thought that
might be a nice way to view heat leakage at the roof deck -- *if*
there hadn't been snow on it, which there was, so we didn't try.
Sounds like the NSR problem would make that a fairly useless exercise
anyway, unless perhaps one could do it on a night with heavy overcast
but not precipitating. We did see a lot of the classic "sagged in
the bay" insulation issues, though.
These days I'm being amused by the melt patterns on the roofs I
pass, which probably tells you a lot more than any IR scan could.
Big valleys with little ridges 16 on center? Typical attic with
insulation a bit lacking. Little gullies at the rafters between
thicker "pillows"? Conditioned attic with cavity-fill alone.
Fairly uniform surface? Thermal breaks on the rafters! Big bare
patch in the middle? Oh, that must be where the bathroom fan lets
out underneath, or maybe the mold colony is making its own heat...