Hello tech@.

aucat(1) manual says:

Streams created with the -t option export the server clock using MTC,
allowing non-audio software or hardware to be synchronized to the audio
stream.  The following sample rates (-r) and block sizes (-z) are
recommended for maximum accuracy:

      o   44100Hz, 441 frames
      o   48000Hz, 400 frames
      o   48000Hz, 480 frames
      o   48000Hz, 500 frames

For me, it was unclear why the manual suggests different block sizes
for a single sample rate until I understood that the MTC resolution
is 96, 100 or 120 ticks per second, which is described much earlier
in the manual.

This is because I expected somewhat self-explanatory text here, so
that I could think "aha, I saw these numbers earlier" or "ok, let's
search for why these are 96, 100 and 120". Something like

     o   44100Hz, 441 frames (MTC resolution is 100)
     o   48000Hz, 400 frames (MTC resolution is 120)
     o   48000Hz, 480 frames (MTC resolution is 100)
     o   48000Hz, 500 frames (MTC resolution is 96)
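The arithmetic behind these pairs can be sketched as follows (a
hypothetical illustration, assuming block size = sample rate / MTC
resolution, which is what the -z description implies):

```python
# Hypothetical sketch: for each recommended (rate, MTC resolution)
# pair, the block size is simply rate divided by resolution.
# Resolutions 96, 100 and 120 ticks/s come from the -z description.
for rate, res in [(44100, 100), (48000, 120), (48000, 100), (48000, 96)]:
    frames = rate // res  # block size in frames
    print(f"{rate}Hz, {frames} frames (MTC resolution is {res})")
# -> 44100Hz, 441 frames (MTC resolution is 100)
# -> 48000Hz, 400 frames (MTC resolution is 120)
# -> 48000Hz, 480 frames (MTC resolution is 100)
# -> 48000Hz, 500 frames (MTC resolution is 96)
```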

Moreover, the above explanation of MTC and the examples somewhat
duplicate/overlap the earlier -z description:

-z nframes
        The audio device block size in frames.  This is the number of
        frames between audio clock ticks, i.e. the clock resolution.  If
        a stream is created with the -t option, and MTC is used for
        synchronization, the clock resolution must be 96, 100 or 120
        ticks per second for maximum accuracy.  For instance, 120 ticks
        per second at 48000Hz corresponds to a 400 frame block size.

Is there a better way to explain this? With the current text I am
forced to jump up and down to understand what is what.

Alexey
