When you get into the world of building science, it’s inevitable that you’ll hear about the Oak Ridge study proving that fiberglass insulation loses about half its R-value. Maybe, though, you’re a homeowner who’s been told about this problem by an insulation contractor warning you not to put fiberglass insulation in your home. Or perhaps this article you’re reading now is the first time you’ve heard of the issue.
However you came to it, it’s important to understand what that Oak Ridge study showed and how much of what I said above is actually true.
First, let’s look at the actual research paper on this study (something that many of the people who cite it haven’t done themselves). Titled Thermal Performance of Fiberglass and Cellulose Attic Insulations, the paper describes the research done by Kenneth E. Wilkes and Phillip W. Childs at Oak Ridge National Laboratory (ORNL) in the early 1990s. (Download their paper along with the technical bulletins from Owens Corning and Johns Manville described below.)
They set up an attic test module that simulated temperature differences across an insulated attic floor. You can read all the details in the paper — the sketch from their paper is shown below — but basically they put a whole roof and attic assembly into a big chamber and measured the R-values of three insulation types: fiberglass batts, loose-fill fiberglass, and loose-fill cellulose.
What they found is that the fiberglass batts and loose-fill cellulose performed as expected across the whole range of temperature differences. The loose-fill fiberglass, however, showed a significant reduction in R-value as the attic got colder and the temperature difference got larger.
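To see what an R-value loss like that means in practice, it helps to remember that heat flow through an assembly is inversely proportional to R-value: q = ΔT / R. Here's a minimal sketch of that arithmetic. The R-19 rating and the 50% loss figure used below are illustrative assumptions for the example, not measurements from the ORNL paper.

```python
# Sketch: what a 50% R-value loss means for heat flow through an attic floor.
# Uses the steady-state relation q = deltaT / R (BTU/hr per square foot).
# The specific numbers here are assumed for illustration.

def heat_flux(delta_t_f, r_value):
    """Heat flow per square foot of attic floor, in BTU/hr*ft^2."""
    return delta_t_f / r_value

rated_r = 19.0   # assumed rated R-value of the loose-fill fiberglass
delta_t = 70.0   # temperature difference across the attic floor, deg F

q_rated = heat_flux(delta_t, rated_r)
q_degraded = heat_flux(delta_t, rated_r * 0.5)  # after a 50% R-value loss

print(f"At rated R-value:   {q_rated:.2f} BTU/hr*ft^2")
print(f"At half the R-value: {q_degraded:.2f} BTU/hr*ft^2")
print(f"Heat flow increase: {q_degraded / q_rated:.1f}x")
```

The point of the sketch is simply that halving the R-value doubles the heat flow, so a 35% to 50% loss is not a rounding error in an energy bill.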
In fact, the loose-fill fiberglass lost 35% to 50% of its resistance to heat flow at temperature differences of 70°F to 76°F. The loss of R-value started…