Once again I'm working on MIDI timing (when synced with the PCM
device). I've finally been able to trace through much of the code, and I
think I've found some problems.
The attached patch does two things:
1) In snd_seq_timer_interrupt(), the resolution is passed in from the
PCM timer. The patch updates tmr->period to this new resolution,
whereas the old code overwrote the passed-in resolution with the stale
tmr->period. In practice this made no difference in my tests, but it
seems logically correct.
2) In snd_seq_timer_open(), we need to get the resolution from the PCM
device instead of using the default. This was not being done before,
so we were using the wrong resolution for the timer. (A sketch of both
changes follows this list.)
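Here is a minimal sketch of the intent of both changes (tmr->period and
tmr->timeri follow the sequencer timer code, but treat this as an
illustration, not the literal diff; snd_timer_resolution() stands in
for however the slave timer's resolution is actually queried):

	/* (1) snd_seq_timer_interrupt(): trust the resolution handed to
	 * us by the PCM timer and store it, instead of clobbering it
	 * with the stale tmr->period as the old code did. */
	if (resolution != 0)
		tmr->period = resolution;  /* was: resolution = tmr->period; */

	/* (2) snd_seq_timer_open(): ask the slave (PCM) timer for its
	 * actual resolution rather than assuming the default. */
	resolution = snd_timer_resolution(tmr->timeri);
	if (resolution > 0)
		tmr->period = resolution;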
With these changes, PCM sync is working almost perfectly, but there is
some drift. With a period size of 32, the drift is quite apparent
within a few minutes. On my computer, a period size of 32 gives a
timer resolution of 725623 nanoseconds. I suspect that because this
value stays constant, we are losing a fraction of a nanosecond on each
interrupt, and that adds up to noticeable drift.
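To make the arithmetic concrete (assuming a 44.1 kHz sample rate, which
is where the 725623 ns figure comes from):

	#include <stdio.h>

	int main(void)
	{
		double rate = 44100.0;   /* assumed sample rate */
		double frames = 32.0;    /* period size */
		double true_ns = frames / rate * 1e9;  /* 725623.58... ns */
		long stored_ns = (long)true_ns;        /* truncated: 725623 ns */
		double loss_ns = true_ns - stored_ns;  /* ~0.58 ns lost per interrupt */

		/* interrupts per second = rate / frames (~1378), so: */
		double drift_us_per_min =
			loss_ns * (rate / frames) * 60.0 / 1000.0;

		printf("true period:  %.2f ns\n", true_ns);
		printf("stored:       %ld ns\n", stored_ns);
		printf("drift:        ~%.0f us per minute\n", drift_us_per_min);
		return 0;
	}

That works out to roughly 48 us of drift per minute at period size 32;
the same calculation for period size 128 gives about 7 us per minute,
which matches the slower drift I'm seeing.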
With a period size of 128, the drift is not noticeable within 4-6
minutes. I suspect it would still be noticeable after 10 or so, since
we haven't eliminated the drift, only made it much slower.
Now for my questions:
Each time the interrupt gets called, it tries to update the timer
resolution. This results in a call to snd_pcm_timer_resolution(), which
pulls the value from substream->runtime->timer_resolution.
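For reference, the path I traced looks roughly like this (paraphrased
from sound/core/pcm_timer.c, not the literal source):

	static unsigned long snd_pcm_timer_resolution(struct snd_timer *timer)
	{
		struct snd_pcm_substream *substream = timer->private_data;

		/* returns whatever the driver last stored in the runtime;
		 * it is never corrected against the real interrupt spacing */
		return substream->runtime ?
			substream->runtime->timer_resolution : 0;
	}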
If we are going to eliminate drift, we need to modify this value.
Would it be appropriate to constantly update the timer resolution with
the distance in microseconds between successive interrupts? This could
be obtained from rdtscl(); a rough sketch of what I mean is below. And
if that is the correct solution, what do we do for PPC and other
arches?
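Something like this is what I have in mind (rdtscl() and cpu_khz are
x86-only, which is exactly the portability problem):

	#include <asm/msr.h>	/* rdtscl() -- x86 only */
	#include <asm/timex.h>	/* cpu_khz */

	static unsigned long last_tsc;

	/* distance between successive interrupts, in microseconds */
	static unsigned long interrupt_distance_us(void)
	{
		unsigned long now, cycles;

		rdtscl(now);		/* low 32 bits of the TSC */
		cycles = now - last_tsc;
		last_tsc = now;

		/* cpu_khz is TSC cycles per millisecond */
		return cycles / (cpu_khz / 1000);
	}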
If that's not the correct solution, what would be? How do we eliminate
this timer drift? I'm willing to do the implementation if someone will
point me in the right direction.
jack.