You didn't say what your data value size was.  From looking at the data,
it appears to be 4 bytes (i.e., 32-bit audio -- maybe only 24 of those
32 bits are actually used).

You didn't say how many data values per sample you were using.  From
looking at the data, it appears to be one (i.e., mono).

The documentation file you mentioned talks about frames, periods, and
buffers.  The manual page explains what a period is but not a buffer or
a frame.  It appears that a frame is the same as what I've been calling
a "sample".

You didn't say what you were using for periods/buffer.  From what you
wrote in an earlier comment, it appears to be 2.

Maybe you can understand why I'm feeling a little frustrated.  It would
be nice to have real answers to my questions instead of having to guess.

The documentation file doesn't explain how the latency is calculated.
From looking at the example, it appears to be frames/period times
periods/buffer divided by the sample rate.  If I understand correctly,
this is the right formula for the output latency, but it is wrong for
the input latency.  The input latency should be frames/period divided by
the sample rate.

Putting together the numbers I guessed for your session, the calculated
latency would be 256 * 2 / 44100 = 11.6 ms.  But that figure is only the
output latency; the input latency is 256 / 44100 = 5.8 ms.  Raising the
frames/period to 512 would double both values.
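The two formulas are easy to sanity-check with a quick sketch.  Note
that 256, 2, and 44100 here are the values I guessed above, not anything
you actually reported:

```python
# Latency formulas as discussed above, using the guessed session values.
frames_per_period = 256
periods_per_buffer = 2
sample_rate = 44100  # Hz

# Output latency: a newly written value sits behind the whole buffer,
# so the full buffer must drain before it reaches the hardware.
output_latency_ms = frames_per_period * periods_per_buffer / sample_rate * 1000

# Input latency: a newly captured value becomes visible as soon as the
# current period completes.
input_latency_ms = frames_per_period / sample_rate * 1000

print(f"output: {output_latency_ms:.1f} ms")  # 11.6 ms
print(f"input:  {input_latency_ms:.1f} ms")   # 5.8 ms
```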

Why are input latencies computed differently from output latencies?  To
explain, let's consider a simple example.  Suppose we use 256
samples/period and 8 periods/buffer.  If we try to change the output
right now, the changed value goes at the end of the current buffer and
it doesn't reach the hardware until 256*8 sample times have elapsed.
Hence the output latency is 2048 sample times (and of course, the sample
time is one over the sampling rate).

Now suppose we use the same parameters for input.  If the input changes
right now, the changed value will get stored in the current period, and
we will see it when the current period elapses.  Thus the latency will
be 256 sample times, not 2048.
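The asymmetry in that example can be restated as a toy model of the
ring buffer (the 256/8 figures are just the example numbers from the
paragraph above):

```python
# Toy model: why output waits for the whole buffer but input only waits
# for the current period.
FRAMES_PER_PERIOD = 256
PERIODS_PER_BUFFER = 8

# Playback: the application writes at the tail of the queue, so a
# changed value sits behind every period already queued for the
# hardware -- a full buffer in the worst case.
output_wait = FRAMES_PER_PERIOD * PERIODS_PER_BUFFER
print(output_wait)  # 2048 sample times

# Capture: the hardware writes into the current period, and the
# application sees it when that one period elapses.
input_wait = FRAMES_PER_PERIOD
print(input_wait)   # 256 sample times
```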

Part of our problem here is that the settings you specify affect how
jack communicates with the kernel's audio driver, but they don't
directly reflect how the kernel's audio driver communicates with the USB
driver.  The usbmon trace for 3.5 shows that the audio driver was really
using the equivalent of 5.5 frames/period and 8 periods/buffer.  This
means the actual latency (as far as the kernel is concerned) was 5.5 /
44100 = 0.125 ms, as I mentioned earlier.  The usbmon trace for 3.8
shows that the audio driver was really using the equivalent of 11
frames/period and 8 periods/buffer, for a latency of 11 / 44100 = 0.25
ms.  Undoubtedly, either of these is a value you could easily live with.
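The kernel-side numbers work out the same way; here 5.5 and 11 are the
period equivalents read off the usbmon traces, and only the input
formula (frames/period over the rate) applies:

```python
# Kernel-side input latency per the usbmon traces.
sample_rate = 44100  # Hz

for kernel, frames_per_period in [("3.5", 5.5), ("3.8", 11)]:
    latency_ms = frames_per_period / sample_rate * 1000
    print(f"{kernel}: {latency_ms:.3f} ms")  # 0.125 ms and 0.249 ms
```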

I don't know how the audio driver converts the settings it gets from
jack to the settings it sends to the USB driver, but that appears to be
where the problem lies.

-- 
You received this bug notification because you are a member of Ubuntu
Bugs, which is subscribed to Ubuntu.
https://bugs.launchpad.net/bugs/1191603

Title:
  raring-updates: regression: 3.8.0-24.35 causes sporadic failure of USB
  audio devices

To manage notifications about this bug go to:
https://bugs.launchpad.net/ubuntu/+source/linux/+bug/1191603/+subscriptions
