On 4/20/11 8:53 AM, Jim Lux wrote:
On 4/20/11 8:26 AM, Poul-Henning Kamp wrote:
In message <[email protected]>, paul swed writes:

That's what I also thought, though I did not plan to test it.
As for mains stability: they are indeed stable over time, weeks and months,
and are corrected.
Are you sure ?

Here in Europe that was lost in the "privatization" of the grid: nobody
was charged with paying for the extra power needed to capture lost
cycles, so now they just try to keep it close to 50.0 Hz and don't
care about the integral.

I would be surprised if it were any different in the USA.

It's probably somewhat better, because there are long-distance transmission lines that rely on careful management of relative phases (and, by extension, frequencies) to control the power flow on the line. California consumes about 50 GW (peak) (26 GW for the California Independent System Operator as I type this). The two Pacific Intertie lines (one AC and one HVDC) carry 7 GW-ish. That AC line is quite the challenge to stabilize (it's 1000 km long, and I've heard that transients take hours to die out). (And, of course, they use GPS heavily to provide an accurate time reference for reporting instantaneous phase and amplitude of the lines.)
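To see why relative phase controls the power flow, the textbook lossless-line approximation is P = (V1*V2/X)*sin(delta), where delta is the phase angle between the line ends. A small sketch with made-up numbers (the 500 kV level and 100-ohm reactance are illustrative assumptions, not Pacific Intertie data):

```python
import math

def line_power_mw(v1_kv, v2_kv, x_ohms, delta_deg):
    """Approximate real power flow (MW) on a lossless AC line:
    P = V1 * V2 / X * sin(delta).
    With voltages in kV and reactance in ohms, kV*kV/ohm = MW."""
    return (v1_kv * v2_kv / x_ohms) * math.sin(math.radians(delta_deg))

# Illustrative (made-up) numbers for a 500 kV line, X = 100 ohms:
for delta in (5, 10, 20):
    print(f"delta = {delta:2d} deg -> {line_power_mw(500.0, 500.0, 100.0, delta):6.0f} MW")
```

The point is only that a few degrees of phase shift moves hundreds of megawatts, which is why the phasor measurements in the CAISO material below chase fractions of a degree.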

I seem to recall a site somewhere that gave statistics (in quasi real time) of the system frequency here in Southern California.

I found this in a generator interconnection agreement:

This frequency response control shall, when enabled at the direction of CAISO, continuously monitor the system frequency and automatically reduce the real power output of the Asynchronous Generating Facility with a droop equal to a one-hundred (100) percent decrease in plant output for a five (5) percent rise in frequency (five (5) percent droop) above an intentional dead band of 0.036 Hz

---

Here's a nice training presentation about how they measure and manage system frequency and relative phase (to fractions of a degree)
http://www.phasor-rtdms.com/downloads/guides/CAISO_RTDMS-Training01312006.pdf

There are some pictures of actual frequency disturbances during transients, and I leave it as an exercise for the reader to turn that into an AVAR spec.

That's all about short-run behavior (taus of tens to hundreds of seconds)...
The system generally regulates frequency over 100k seconds (a day) to keep electric clocks reasonably "on time". I think the standard is "no more than 2 seconds deviation from UTC", which implies, what, something like 2E-5 ADEV at a tau of 100,000 seconds?
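As a back-of-envelope sanity check on that number (a crude time-error-over-tau ratio, not a real ADEV computation from phase data):

```python
# "Clocks within 2 s over a day" as a fractional-frequency bound:
time_error_s = 2.0      # allowed accumulated time error
tau_s = 100_000.0       # roughly one day
frac_freq = time_error_s / tau_s
print(frac_freq)        # 2e-05
```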

tvb's data at http://leapsecond.com/pages/mains/ seems to show similar statistics. His plot of "phase data" is labeled in seconds, and if that's right, then his local power is substantially worse than the "2 second error" metric.

_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
