In the last year (April 1, 2012 to March 31, 2013), we used 6,046 kWh of electricity and produced 8,576 kWh, for a net surplus of 2,530 kWh. That’s awesome, right?
But 72% of the electricity we used was supplied by the grid when solar could not supply enough power to cover the need at that moment, like at night or on a cloudy day*.
Another way to look at it: although we produce more than we use, most of what we produce we don’t really use directly. Only about 28% of our consumption comes straight from the panels, which works out to roughly 20% of what we produce. The rest goes back to the grid to pay back what we used when the sun wasn’t shining and to build up a surplus for a rainy day.
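To make the arithmetic concrete, here is a quick sketch using the year’s totals from above and taking 72% of usage as grid-supplied (the variable names are just illustrative):

```python
used_kwh = 6046       # total consumption for the year
produced_kwh = 8576   # total solar production for the year
grid_fraction = 0.72  # share of usage the grid supplied (from the hourly tally)

net_surplus = produced_kwh - used_kwh          # headline surplus: 2,530 kWh
direct_solar = used_kwh * (1 - grid_fraction)  # ~1,693 kWh used straight from the panels
exported = produced_kwh - direct_solar         # ~6,883 kWh sent back to the grid

print(net_surplus, round(direct_solar), round(exported))
```

The headline surplus looks great, but the direct-solar number is the one that tracks how much of our demand the panels actually covered in the moment.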
So even with all that sun, we still draw a lot of power from the grid that requires coal and other nasties to be burnt to serve our electricity needs.
It makes sense. Most of our heavy use (hot water for showers, cooking, washing dishes) occurs early in the morning or in the evening, when the sun is not at its brightest or best angle. The more we time our usage to when the sun is shining, the less we demand of the grid.
There has to be a common industry term for this. Anyone know? Something like the percentage of power supplied by the grid compared to total usage when solar or another renewable is in the mix?
To me this seems like a much more important number to track if you have solar and are concerned with your direct carbon footprint.
* To find how much energy we used from the grid, I added up, hour by hour, every bit of usage that exceeded what solar was producing at the time. For example, from 5-6am total demand was 1,000 Wh. The sun was just coming up and the system was only producing 200 Wh, so the grid supplied the remaining 800 Wh; 80% of that hour’s usage was grid-supplied. Now do that for every hour in a year. Hint: it helps if all your energy values are stored in a database.
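The hourly tally described above can be sketched in a few lines. This assumes each record is an (usage, solar production) pair in Wh, one per hour of the year; the function and variable names are hypothetical, not from my actual database:

```python
# Sketch of the hourly grid-supply tally described in the footnote.
# Assumes a list of (usage_wh, solar_wh) pairs, one per hour.

def grid_supplied_fraction(hours):
    """Return the fraction of total usage that the grid had to supply."""
    grid_wh = 0.0
    total_usage_wh = 0.0
    for usage, solar in hours:
        total_usage_wh += usage
        # The grid only covers the shortfall when demand exceeds solar output;
        # hours where solar exceeds demand contribute nothing to the grid total.
        grid_wh += max(usage - solar, 0.0)
    return grid_wh / total_usage_wh if total_usage_wh else 0.0

# The 5-6am example from the footnote: 1,000 Wh demand, 200 Wh of solar.
print(grid_supplied_fraction([(1000, 200)]))  # 0.8
```

Feed it all 8,760 hours of the year and you get the 72% figure in one pass.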