Using renewables on-site vs. grid-tie
I have a question that I am hoping may lead to some debate. This is a theoretical question and not tied to a specific project that I am working on.
I was talking with a senior energy expert at the Department of Energy about reaping the benefits of onsite renewables, e.g. solar panels on your roof, wind turbine, solar thermal, etc.
With electricity in particular, we discussed how hard it is to account for the environmental benefits of power fed into the grid. Grid-delivered power is surprisingly inefficient: roughly two out of three units of primary energy are lost between the fuel burned at a thermal power plant and the electricity that arrives at your meter (most of that in generation, with transmission and distribution adding a few percent more). In addition, it's hard to know whether the 2 kW that you fed into the grid at 12 p.m. really allowed the utility to reduce its coal-fired output by 2 kW. In fact, there are good reasons to conclude that nothing of the kind happened. And that's to say nothing of what happened at 2 p.m., when a thunderstorm rolled in and you needed that coal-fired juice to run your A/C and lights while your solar panels twiddled their thumbs.
The upshot of our conversation was that the best use of onsite renewables, from a thermodynamic and environmental point of view, is onsite consumption.
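To make the accounting argument concrete, here is a back-of-envelope sketch comparing the primary (fuel) energy a grid avoids when onsite generation is used onsite versus exported. The loss fractions and plant efficiency are illustrative assumptions, not measured figures:

```python
# Back-of-envelope sketch: avoided primary energy for onsite use vs. export.
# All numbers below are illustrative assumptions, not measured data.

TD_LOSS = 0.05        # assumed transmission & distribution loss fraction (~5%)
THERMAL_EFF = 0.35    # assumed thermal-plant conversion efficiency (~35%)

def primary_energy_avoided(kwh_generated, used_onsite=True):
    """Primary (fuel) energy the grid avoids burning, in kWh-thermal."""
    if used_onsite:
        # Onsite use displaces delivered grid electricity directly,
        # so the grid also avoids the T&D losses on that delivery.
        delivered_avoided = kwh_generated
    else:
        # Exported power loses a T&D slice before it displaces anything.
        delivered_avoided = kwh_generated * (1 - TD_LOSS)
    # Each delivered kWh would have required 1/THERMAL_EFF kWh of fuel.
    return delivered_avoided / THERMAL_EFF

print(primary_energy_avoided(2.0, used_onsite=True))   # 2 kWh used onsite
print(primary_energy_avoided(2.0, used_onsite=False))  # 2 kWh exported
```

Under these assumptions the onsite case always edges out the export case, and the gap grows with the assumed T&D losses; the sketch ignores the harder question of whether the utility actually throttles a coal plant in response.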
My question is — should we be more real with homeowners and building owners about the hard-to-see benefits of grid-tied renewables? Where does this lead? Toward more off-the-grid homes? Homes with some systems off-the-grid and others on the grid? Homes that use PVs to make ice for off-peak cooling? Does this tell us to move away from PV to solar thermal?
Or, do you disagree with the premise?
Thanks in advance for your thoughts.