Capacity factor: The ratio of the electrical energy produced by a generating unit for the period of time considered to the electrical energy that could have been produced at continuous full power operation during the same period. (EIA)
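The definition reduces to a simple ratio of actual output to maximum possible output. The sketch below illustrates the arithmetic for a single generating unit; the numbers are hypothetical, chosen only for illustration, not EIA data.

```python
# Minimal sketch of the EIA capacity factor calculation, using hypothetical
# values for a 100 MW generating unit over one 30-day month.
nameplate_mw = 100.0            # rated (nameplate) capacity, MW
hours = 30 * 24                 # hours in the period considered
energy_generated_mwh = 25_000   # metered output over the period, MWh (assumed)

max_possible_mwh = nameplate_mw * hours   # output at continuous full power
capacity_factor = energy_generated_mwh / max_possible_mwh

print(f"Capacity factor: {capacity_factor:.1%}")  # -> 34.7%
```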
The US EIA Electric Power Monthly uses the above definition for both fossil and non-fossil generators. However, the definition is more appropriate for intermittent renewable generators (wind and solar) than for other types of generation, since the output of these renewable generators has first priority on the grid. Their full output is used, except in circumstances when that output exceeds the contemporaneous demand on the grid. Therefore, their capacity factors are an accurate measure of what they are capable of generating “for the period of time considered”.
The output of wind and solar generators varies, uncontrolled, over timeframes of seconds, minutes, hours, days, weeks, months, seasons and years. Over the shorter timeframes, output can vary from 100% of nameplate capacity to zero. Over the longer timeframes, wind generator output can vary from approximately 24–47% on a monthly basis and from approximately 32–35% on an annual basis. Over the longer timeframes, solar output can vary from approximately 12–33% on a monthly basis and from approximately 23–26% on an annual basis. These numbers represent national averages for existing generating facilities.
The non-renewable generators supplying the grid are operated to generate the difference between the contemporaneous grid demand and the output of the intermittent renewable generators. Their “capacity factors” are therefore not weather limited, as is the case with the intermittent renewable generators, but are instead “utilization factors” controlled by the output of the intermittent renewable generators and the contemporaneous grid demand. As a result, the “capacity factors” of the non-renewable generators decrease as the quantity of renewable generation supplied to the grid increases, with the exception of the nuclear generators, which are typically operated at full capacity because the variable cost of the generation they provide is low.
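This displacement effect can be shown with a simple residual-demand calculation. The figures below are hypothetical, chosen only to illustrate how increasing renewable output lowers the utilization of the dispatchable fleet serving the same demand.

```python
# Hypothetical illustration: renewable output displaces dispatchable
# generation and lowers its utilization ("capacity") factor.
grid_demand_mw = 800.0              # contemporaneous grid demand (assumed)
dispatchable_capacity_mw = 1_000.0  # installed non-renewable capacity (assumed)

for renewable_output_mw in (0.0, 200.0, 400.0):
    # Dispatchable units supply whatever demand the renewables do not.
    residual_mw = grid_demand_mw - renewable_output_mw
    utilization = residual_mw / dispatchable_capacity_mw
    print(f"Renewables {renewable_output_mw:5.0f} MW -> "
          f"dispatchable utilization {utilization:.0%}")
# Output: 80%, 60%, 40% as renewable output rises from 0 to 400 MW.
```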
Nuclear generators are typically capable of operating at rated capacity approximately 95% of the year, natural gas combined-cycle generators approximately 90% of the year, and coal generators approximately 85% of the year. The portion of the year when they are unavailable is typically scheduled during the shoulder months, when grid demand is well below peak demand.
The lower “capacity factors” (utilization factors) reported by EIA for these generators are driven directly by contemporaneous grid demand and indirectly by weather impacts on intermittent renewable generation output.
Ultimately, the Administration’s goal is to replace dispatchable fossil generation with renewable generation plus storage. Assuming that storage can be recharged at approximately the same rate that it can be discharged, the maximum capacity factor for storage would be approximately 50%, in situations in which storage was discharged and recharged daily. However, in situations in which longer-duration storage was charged during periods of high monthly or seasonal renewable availability for use during periods of lower monthly or seasonal renewable generation availability, storage capacity factor would be significantly lower. That has economic consequences, since storage is currently significantly more expensive than renewable generation.
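The roughly 50% ceiling follows directly from the definition: every hour spent recharging is an hour the storage is not delivering energy to the grid. A sketch under assumed duty cycles (12-hour daily cycling versus discharge concentrated in a 90-day low-availability season) is below; the cycle lengths are assumptions for illustration.

```python
# Hypothetical storage duty cycles, showing why storage capacity factor
# (discharge energy / full-power energy over the period) tops out near 50%
# when charge and discharge rates are similar, and falls for seasonal storage.
rated_power_mw = 100.0
hours_per_year = 8_760

# Daily cycling: discharge ~12 h/day, recharge the other ~12 h (assumed).
daily_discharge_mwh = rated_power_mw * 12 * 365
daily_cf = daily_discharge_mwh / (rated_power_mw * hours_per_year)

# Seasonal cycling: charge during high-output months, discharge at full
# power over roughly 90 days of low renewable availability (assumed).
seasonal_discharge_mwh = rated_power_mw * 24 * 90
seasonal_cf = seasonal_discharge_mwh / (rated_power_mw * hours_per_year)

print(f"Daily-cycled storage capacity factor:      {daily_cf:.0%}")    # ~50%
print(f"Seasonally-cycled storage capacity factor: {seasonal_cf:.0%}") # ~25%
```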
Originally published here.