Re: DRM broadcast disrupted by leap seconds

2003-07-21 Thread Peter Bunclark
On Sat, 19 Jul 2003, Markus Kuhn wrote:

 All modern digital broadcast transmission systems introduce significant
 delays due to compression and coding. It is therefore common practice
 today that the studio clocks run a few seconds (say T = 10 s) early, and
 then the signal is delayed by digital buffers between the studio and the
 various transmitter chains for T minus the respective transmission and
 coding delay. This way, both analogue terrestrial and digital satellite
 transmissions end up with closely synchronous audio and video.
 Otherwise, your neighbour would already cheer in front of his analogue
 TV set, while you still hear on DRM the live report about the football
 player approaching the goal.
But that's exactly what does happen: analogue TV is ahead of digital,
often leading to asynchronous cheering from different parts of the house.

 There are a couple of problems, though, with delayed live transmission:

   - One is with the BBC. They insist for nostalgic reasons on
     transmitting the Big Ben chimes live, which cannot be run 10
     seconds early in sync with the studio clock.

   - Another is live telephone conversations with untrained members of
     the radio audience who run a loud receiver next to the phone. The
     delay eliminates the risk of feedback whistle, but it now adds
     echo and human confusion. The former can be tackled with DSP
     techniques; the latter is trickier.
But then there's often a deliberate delay introduced so that the editor
can push the cut-off button on the first f...


   - The third problem is that in the present generation of digital
     radio receivers (DAB, DRM, WorldSpace, etc.), the authors of the
     spec neglected to standardize the exact buffer delay in the
     receiver.
Interestingly, I have noticed Radio 5 Live is synchronous with, or even
slightly ahead of, analogue on Digital Terrestrial. I put it down to the
relatively instantaneous compression/decompression of audio streams
compared with video. (NICAM is near-instantaneous on 15-year-old
technology.)

Pete.


Re: DRM broadcast disrupted by leap seconds

2003-07-19 Thread Ed Davies
Markus Kuhn wrote:
 When the scheduled transmission time
 arrives for a packet, it is handed with high timing accuracy to the
 analog-to-digital converter,

I assume you mean digital-to-analog.

 ...
 [In fact, since short-wave transmitters frequently switch between
 programmes at the full hour, in discussions the hope was expressed that
 in practice nobody will notice.]
 ...

Don't they also often transmit the time signal on the hour?  Ironic, huh?

This also raises the point that, because the transmission is delayed a
few seconds for buffering, the studio presumably needs to work a few
seconds in the future if time signals are to be transmitted correctly.

 Either having a commonly used standard time without leap seconds (TI),
 or having TAI widely supported in clocks and APIs would have solved the
 problem.

Absolutely - and the second suggested solution doesn't need to take 20
years to be implemented.

Ed.


Re: DRM broadcast disrupted by leap seconds

2003-07-19 Thread Markus Kuhn
Ed Davies wrote on 2003-07-19 09:15 UTC:
  When the scheduled transmission time
  arrives for a packet, it is handed with high timing accuracy to the
  analog-to-digital converter,

 I assume you mean digital-to-analog.

Yes, sorry for the typo.

 This also raises the point that because the transmission is delayed a few
 seconds for buffering there is presumably a need for the studio to work
 in the future by a few seconds if time signals are to be transmitted
 correctly.

All modern digital broadcast transmission systems introduce significant
delays due to compression and coding. It is therefore common practice
today that the studio clocks run a few seconds (say T = 10 s) early, and
then the signal is delayed by digital buffers between the studio and the
various transmitter chains for T minus the respective transmission and
coding delay. This way, both analogue terrestrial and digital satellite
transmissions end up with closely synchronous audio and video.
Otherwise, your neighbour would already cheer in front of his analogue
TV set, while you still hear on DRM the live report about the football
player approaching the goal.
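
A minimal sketch, with invented delay figures, of the equalization
arithmetic just described: each transmitter chain buffers for T minus
its own coding and transmission delay, so that every output path emits
the signal at the same wall-clock instant:

  /* Delay equalization across transmitter chains.  The studio runs
   * T seconds early; a chain with coding + transmission delay d
   * buffers for T - d.  Chain names and delays are made up. */
  #include <stdio.h>

  int main(void)
  {
      const double T = 10.0;              /* studio lead time, seconds */
      const struct { const char *name; double delay; } chain[] = {
          { "analogue terrestrial", 0.1 },
          { "DAB",                  4.0 },
          { "DRM",                  7.5 },
          { "digital satellite",    6.0 },
      };

      for (size_t i = 0; i < sizeof chain / sizeof chain[0]; i++)
          printf("%-20s coding delay %4.1f s -> buffer %4.1f s "
                 "(total %4.1f s)\n", chain[i].name, chain[i].delay,
                 T - chain[i].delay, T);
      return 0;
  }

Every path's total delay comes out to T = 10 s, exactly the studio's
lead, so all outputs coincide with true time.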

There are a couple of problems, though, with delayed live transmission:

  - One is with the BBC. They insist for nostalgic reasons on
    transmitting the Big Ben chimes live, which cannot be run 10
    seconds early in sync with the studio clock.

  - Another is live telephone conversations with untrained members of
    the radio audience who run a loud receiver next to the phone. The
    delay eliminates the risk of feedback whistle, but it now adds
    echo and human confusion. The former can be tackled with DSP
    techniques; the latter is trickier.

  - The third problem is that in the present generation of digital
    radio receivers (DAB, DRM, WorldSpace, etc.), the authors of the
    spec neglected to standardize the exact buffer delay in the
    receiver.

Mostly for the last reason, the time beeps from digital receivers still
have to be used with great caution today (some stations even leave them
out, preferring to send none rather than wrong ones).
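
To make that concrete, here is a small C sketch with invented figures:
if the spec had fixed the receiver buffer delay, the network could
pre-advance its pips by exactly that amount; with the delay left to
each manufacturer, the pip lands off true time by an unknown,
model-dependent offset:

  /* Why an unstandardized receiver buffer delay breaks time beeps.
   * The network pre-advances its pips assuming a nominal receiver
   * delay; any model that buffers differently emits the pip off by
   * the difference.  All figures are invented for illustration. */
  #include <stdio.h>

  int main(void)
  {
      const double nominal = 2.0;       /* delay the network assumes, s */
      const struct { const char *model; double delay; } rx[] = {
          { "receiver A", 1.2 },
          { "receiver B", 2.0 },
          { "receiver C", 3.5 },
      };

      for (size_t i = 0; i < sizeof rx / sizeof rx[0]; i++)
          printf("%s: pip heard %+.1f s off true time\n",
                 rx[i].model, rx[i].delay - nominal);
      return 0;
  }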

  Either having a commonly used standard time without leap seconds (TI),
  or having TAI widely supported in clocks and APIs would have solved the
  problem.

 Absolutely - and the second suggested solution doesn't need to take 20
 years to be implemented.

The engineer involved in this project to whom I talked was actually very
familiar with my API proposal at

  http://www.cl.cam.ac.uk/~mgk25/time/c/

and agreed that the problem would never have come up if that proposal
had been widely supported by Linux, NTP drivers, and GPS receiver
manufacturers. But we are not there yet.
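
As a rough illustration of what such support looks like, here is a
minimal C sketch using the Linux-specific CLOCK_TAI clock rather than
the names from the proposal (the kernel must have been given the
current TAI-UTC offset, e.g. by an NTP daemon, for CLOCK_TAI to differ
from CLOCK_REALTIME):

  /* TAI timestamps advance uniformly through a leap second, so
   * interval arithmetic on them is always correct, whereas
   * differences of UTC-based time_t values are ambiguous during an
   * inserted second. */
  #define _GNU_SOURCE
  #include <stdio.h>
  #include <time.h>

  int main(void)
  {
      struct timespec t0, t1;

      if (clock_gettime(CLOCK_TAI, &t0) != 0) { perror("CLOCK_TAI"); return 1; }
      /* ... hand one packet's worth of samples to the modulator ... */
      if (clock_gettime(CLOCK_TAI, &t1) != 0) { perror("CLOCK_TAI"); return 1; }

      double dt = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
      printf("elapsed: %.9f s, valid even across a leap second\n", dt);
      return 0;
  }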

The current discussion on removing leap seconds will no doubt also
delay efforts to make TAI more widely available: what is the point of
improving the implementations if the spec might soon change
fundamentally?

I don't care much whether we move from UTC to TI, because both
approaches have comparable advantages and drawbacks, which we understand
today probably as well as we ever will. But it would be good to make a
decision sooner rather than later, because the uncertainty that the
discussion generates about how new systems developed today should
handle leap seconds can be a far greater hassle. It would be
unfortunate if at the end of this discussion we change nothing and all
we have accomplished is to delay setting up mechanisms to deal with leap
seconds properly. I personally certainly do not feel motivated to press
ahead with proposals for handling leap seconds better if there is a
real chance that there might soon be no more of them.

Markus

--
Markus Kuhn, Computer Lab, Univ of Cambridge, GB
http://www.cl.cam.ac.uk/~mgk25/ | __oo_O..O_oo__