----- Original Message -----

From: Jed Rothwell 

> The worst performing U.K. turbine had a load factor of just 7 percent. These 
> figures reflect a poor return on investment."
 
> JR: This makes no sense.... Everyone knows that actual power generated is 
> less than nameplate capacity. 


Yes, of course they do, but the devil is in the details - and one point of the 
article is that the performance seems to have been badly miscalculated by the 
"experts". From the report it appears that this could be an endemic problem 
going all the way back to the planning stages - i.e. the fact that some sites 
can have such surprisingly low load factors makes one wonder whether they 
bothered to test them thoroughly in advance.

If they had known in advance that the actual load factor (as opposed to the 
predicted one) was going to be so low, they probably would not have invested 
in wind energy at all. 

The load factor for nuclear, by comparison, often exceeds 100% since the 
planners tend to be rather more cautious from the start in stating capacity - 
whereas the promoters of wind have apparently erred on the side of optimism. 

The actual numbers do not lie. The problem is in reconciling them with what had 
been predicted in the planning stages - and then in using that knowledge for 
future planning.

If energy from either wind or nuclear costs $4 per watt (nameplate) installed, 
and wind delivers only one fourth of that as its load factor, then it is at a 
minimum four times more costly, and it is hard to paint that picture any other 
way... 

... except to say that it is actually worse for the users, in practice, because 
the peak usage for consumers is at mid-day to mid afternoon, and that tends to 
be the time of day when wind is the least reliable.
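The arithmetic above can be sketched in a few lines. This is just an illustration of the point being made - the $4/watt figure and the 25% load factor are the hypothetical numbers from the paragraph above, not actual industry data:

```python
def effective_cost_per_watt(installed_cost_per_watt, load_factor):
    """Cost per watt of *average delivered* power, given the installed
    cost per nameplate watt and the load factor (0..1)."""
    return installed_cost_per_watt / load_factor

# Hypothetical: both technologies installed at $4 per nameplate watt.
wind = effective_cost_per_watt(4.0, 0.25)    # wind at a 25% load factor
nuclear = effective_cost_per_watt(4.0, 1.0)  # nuclear at ~100% load factor

print(wind)     # 16.0 - four times the cost per delivered watt
print(nuclear)  # 4.0
```

Dividing by the load factor converts nameplate cost into cost per watt actually delivered on average, which is the comparison that matters to the investor.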

Jones
