I'm having a very strange problem with my OpenGL ES/MusicPlayer app. My
UIPanGestureRecognizer misses render frames ONLY when I have a
MusicPlayer/MusicSequence allocated. By missing frames I mean that when
tracking a moving touch with UIPanGestureRecognizer, the expected (and
usual) behavior is that the recognizer calls its callback on every
screen refresh (as long as the touch has moved). But for some reason, when I
introduce a MusicSequence/MusicPlayer into the mix, there are intermittent
misses where the callback isn't called and the locations of the UITouches
aren't updated for a couple of screen refreshes at a time. This only
happens with the combination of an OpenGL context being refreshed by a
CADisplayLink and a MusicPlayer.

The way I found the problem is that I noticed the choppy tracking stops
when I stop and start the MusicPlayer; in fact, it happens more often if I
DON'T call MusicPlayerPreroll(). I have created a small demo on GitHub at
https://github.com/dave234/AudioTouchConflict where the bug can be
reproduced. It's a small app where you drag a blue square around and can
optionally start a MusicPlayer.

If you try the demo, the way to reproduce the bug is to first drag the
blue square around to witness the choppy tracking; if the tracking isn't
choppy, kill the app and try again (it might take 2-3 tries). Once you see
the choppiness, press the play/stop button and watch the choppiness stop.
Weirdest bug ever.
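For context, here is a minimal sketch of the kind of MusicPlayer setup involved. This is illustrative only, not the exact code from the demo repo, and it skips the error handling and MIDI loading a real app would need:

```swift
import AudioToolbox

// Allocate a sequence and a player (a sketch of the setup described above;
// the demo repo additionally loads MIDI data into the sequence).
var sequence: MusicSequence?
var player: MusicPlayer?

NewMusicSequence(&sequence)       // allocate an empty MusicSequence
NewMusicPlayer(&player)           // allocate the MusicPlayer

if let player = player {
    MusicPlayerSetSequence(player, sequence)
    MusicPlayerPreroll(player)    // skipping this step makes the glitch worse
    MusicPlayerStart(player)
}
```

Simply having the player and sequence allocated alongside the CADisplayLink-driven render loop is enough to trigger the dropped gesture callbacks; starting or stopping playback is what makes the choppiness go away.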

Any help is good,
Dave
Coreaudio-api mailing list      (Coreaudio-api@lists.apple.com)