Prior to the change, DTV stations that had a first-adjacent lower NTSC station had to be within 3 Hz of a specific (5+ MHz) offset from that NTSC station's visual carrier in order to prevent interference to the NTSC chroma. The only practical way to do that was for both stations to lock to GPS and each maintain, by gentlemen's agreement, a 1.5 Hz tolerance. Some of the stations required to do this actually did. In the Los Angeles area there were several that were stable and within 0.1 Hz. That is sufficient for checking or calibrating a service monitor in the field, but not nearly tight enough for what we try to accomplish.
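To put those numbers in fractional-frequency terms, here is a quick back-of-the-envelope calculation. The 57 MHz pilot frequency is an assumed low-VHF example (actual pilots run up through UHF, so the fractions would be correspondingly smaller there):

```python
# Fractional-frequency equivalents of the tolerances discussed above.
# NOTE: pilot_hz is an assumed example (a low-VHF DTV pilot), not a
# value from the original post.
pilot_hz = 57e6  # assumed example pilot frequency

tolerances = [
    ("3 Hz FCC offset tolerance", 3.0),
    ("1.5 Hz gentlemen's-agreement share", 1.5),
    ("0.1 Hz observed LA stability", 0.1),
]

for label, err_hz in tolerances:
    # fractional frequency error = offset error / carrier frequency
    print(f"{label}: {err_hz / pilot_hz:.1e} fractional")
```

At that assumed frequency, 0.1 Hz works out to roughly 2 parts in 10^9, which is why it is good enough to sanity-check a service monitor but well short of time-nut territory.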

The pilot signal is a 10 dB spike on the lower edge of the "box" as seen on a spectrum analyzer. Measuring it is like measuring any AM carrier. Now that NTSC analog is gone, there is no requirement for anything other than the normal 1000 Hz tolerance on the pilot signal. Some stations may choose to lock or reference to a GPS standard, but this is not a requirement. As for what signals may be within the data channel, I have no idea what's there other than a lot of boring programming.
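For comparison, the post-transition 1000 Hz tolerance is orders of magnitude looser than the old adjacent-channel requirement. A sketch, with the 605 MHz UHF pilot frequency being an assumed example rather than a figure from the post:

```python
# How loose is a 1000 Hz pilot tolerance, fractionally?
# NOTE: pilot_hz is an assumed UHF example, not from the original post.
pilot_hz = 605e6      # assumed example UHF pilot frequency
tolerance_hz = 1000.0  # the normal pilot tolerance mentioned above

fractional = tolerance_hz / pilot_hz
print(f"1000 Hz at {pilot_hz / 1e6:.0f} MHz = {fractional:.1e} fractional")
```

That is a few parts in 10^6: fine for broadcasting, but of no use as a calibration reference unless the station happens to be GPS-locked anyway.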

Burt, K6OQK


From: Hal Murray <[email protected]>
Subject: [time-nuts] Time/freq from digital TV



My radio has many news stories about the end of analog TV.

What sort of time or frequency can I get from a digital TV signal?

Now that frame buffers are common, does each station use its own master clock?

Burt I. Weiner Associates
Broadcast Technical Services
Glendale, California  U.S.A.
[email protected]
K6OQK

_______________________________________________
time-nuts mailing list -- [email protected]
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
