On Jul 23, 2012, at 3:10 AM, Jay Reynolds Freeman <[email protected]> wrote:
> The issue is synching animations -- visual ones -- with sounds. Think of it
> as lip-synching. You will recall that what I wanted to do was have a
> cat-face appear, open its mouth, say (play sound) "meow", then close its
> mouth and go away. For this little effect to look best, the open-mouth
> animation should end just as the sound begins, and the close-mouth animation
> should start just as the sound ends. I know I can't count on that happening
> if the user has too many processes running and the underlying Unix
> scheduler decides to swap at just the wrong time, but let's assume that my
> app is getting plenty of cycles.
>
> There is no problem issuing [meow play] as the next instruction after the
> end of the open-mouth animation; the issue is doing the best job of making
> the close-mouth animation start just as the sound ends. The problem with a
> callback method, like the NSSound delegate method "sound:didFinishPlaying:",
> is that that method evidently works by posting an event of some sort to the
> main event loop, or perhaps to a dispatch queue somewhere, and if there are
> other events preceding it in either the event loop or the dispatch queue,
> the delivery of "sound:didFinishPlaying:" may be delayed. (My app is a Lisp
> REPL; it occasionally does some big chunks of work. Fractional-second delays
> do happen, and a few of those in the queue would noticeably mess up the
> timing of the animation.)

NSSound is not the appropriate API for this. It makes no guarantees about its responsiveness. Look into AVFoundation.

--Kyle Sluder
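To make the suggestion concrete, here is a minimal Swift sketch of the AVFoundation route. Instead of reacting to a completion callback (which can be delayed behind other main-loop work, exactly the problem described above), AVAudioPlayer exposes `duration`, `deviceCurrentTime`, and `play(atTime:)`, so both the sound's start and the close-mouth deadline can be computed up front from the audio device's own timeline. The `MeowPlayer` name, the sound URL, and the `closeMouth` closure are illustrative assumptions, not anything from the thread.

```swift
import AVFoundation

/// Plays a sound and schedules a closure to run when it should end,
/// using the sound's known duration rather than a finish callback.
final class MeowPlayer {
    private var player: AVAudioPlayer?

    func playMeow(from url: URL, closeMouth: @escaping () -> Void) throws {
        let player = try AVAudioPlayer(contentsOf: url)
        player.prepareToPlay()            // preload buffers so playback can start promptly
        self.player = player

        // Schedule playback a short, fixed interval ahead on the audio
        // device's clock, so the actual start time is known precisely.
        let latency: TimeInterval = 0.05
        player.play(atTime: player.deviceCurrentTime + latency)

        // Fire the close-mouth animation at start + duration. This dispatch
        // is still subject to main-queue latency, but the deadline itself is
        // computed from the audio timeline, so it cannot drift the way a
        // didFinishPlaying-style callback queued behind other events can.
        DispatchQueue.main.asyncAfter(deadline: .now() + latency + player.duration) {
            closeMouth()
        }
    }
}
```

If tighter alignment is needed, the same idea extends to AVAudioEngine, where an AVAudioPlayerNode can schedule buffers at sample-accurate times.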
