Hi,

I’m writing an iOS audio player utilizing AVAudioEngine with an 
AVAudioPlayerNode and some effect nodes. 

I want the audio to continue as seamlessly as possible across an audio route 
change. In other words: when audio is playing through the external speakers 
and headphones are plugged in, playback in the headphones should continue 
exactly where the speakers left off.

According to the docs, AVAudioEngine is stopped and its connections are 
severed when such an audio route change occurs; thus the audio graph 
connections have to be re-established and playback has to be started afresh 
(buffers have to be enqueued again, etc.). When this happens, a notification 
is posted (AVAudioEngineConfigurationChangeNotification).
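
For context, I'm observing the notification roughly like this (a simplified 
Swift sketch of my setup; the class structure and handler name are just 
placeholders):

    import AVFoundation

    final class Player {
        private let engine = AVAudioEngine()
        private let playerNode = AVAudioPlayerNode()
        private var configObserver: NSObjectProtocol?

        init() {
            engine.attach(playerNode)
            engine.connect(playerNode, to: engine.mainMixerNode, format: nil)

            // The engine posts this when a route change tears the graph down.
            configObserver = NotificationCenter.default.addObserver(
                forName: .AVAudioEngineConfigurationChange,
                object: engine,
                queue: .main
            ) { [weak self] _ in
                self?.handleConfigurationChange()
            }
        }

        private func handleConfigurationChange() {
            // Re-wire the graph and restart; buffers are re-scheduled separately.
            engine.connect(playerNode, to: engine.mainMixerNode, format: nil)
            do {
                try engine.start()
                playerNode.play()
            } catch {
                print("Failed to restart engine: \(error)")
            }
        }
    }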

In response to this notification, I wanted to simply re-enqueue the previously 
enqueued audio buffers, possibly skipping a number of samples from the start 
of the buffer that was playing at the time of the interruption, so that the 
part that already made it out of the speakers isn't played again through the 
headphones.
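
The re-enqueueing I have in mind is something along these lines (a sketch 
living in the same player class as above; `pendingBuffers` and 
`framesAlreadyPlayed` are hypothetical bookkeeping I'd have to maintain 
myself):

    // Hypothetical bookkeeping kept alongside the player node:
    var pendingBuffers: [AVAudioPCMBuffer] = []        // everything scheduled so far
    var framesAlreadyPlayed: AVAudioFramePosition = 0  // what actually reached the output

    func rescheduleBuffers() {
        var framesToSkip = framesAlreadyPlayed
        for buffer in pendingBuffers {
            let length = AVAudioFramePosition(buffer.frameLength)
            if framesToSkip >= length {
                framesToSkip -= length   // this buffer played out completely
                continue
            }
            // The first partially played buffer would need `framesToSkip` frames
            // trimmed off its front before being scheduled again; the rest go in whole.
            playerNode.scheduleBuffer(buffer, completionHandler: nil)
            framesToSkip = 0
        }
    }

The sticking point is filling in `framesAlreadyPlayed` reliably, which is 
exactly what I can't do after the route change: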

But there’s an issue here: by the time this notification is posted, the 
engine’s internal state seems to be torn down (the nodes are stopped and their 
`lastRenderTime` is nil), so I can’t figure out exactly where the playback was 
interrupted...
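
This is how I'd normally read the position, but inside the notification 
handler the guard fails because `lastRenderTime` already comes back nil:

    // Current playback position in frames, relative to the player's timeline.
    func currentFramePosition() -> AVAudioFramePosition? {
        guard let nodeTime = playerNode.lastRenderTime,
              let playerTime = playerNode.playerTime(forNodeTime: nodeTime) else {
            return nil   // what I get inside the notification handler
        }
        return playerTime.sampleTime
    }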

Have I missed an API that would let me query the playback time after such an 
interruption? 

What is the recommended approach for handling these route/configuration changes 
seamlessly? 

Calculating playback time from “wall time” (i.e. mach_absolute_time) feels a 
bit icky to me when working with a high-level API like AVAudioEngine...
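
(For completeness, this is the kind of workaround I mean: record the host 
time when play() is called, then diff against it when the route change hits.)

    import Darwin

    // Wall-clock fallback: seconds elapsed since a recorded host time
    // (e.g. taken when play() was called).
    func secondsSince(_ startHostTime: UInt64) -> Double {
        var timebase = mach_timebase_info_data_t()
        mach_timebase_info(&timebase)
        let elapsedTicks = mach_absolute_time() - startHostTime
        let nanos = Double(elapsedTicks) * Double(timebase.numer) / Double(timebase.denom)
        return nanos / 1_000_000_000
    }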

Best regards,
Tamás Zahola