Tamás,

I gave up on AVAudioEngine for the same reasons a while back. I achieve similar 
results by relying on AVAudioSession route-change notifications (which I 
believe is what AVAudioEngine uses internally anyway) and low-level AudioUnits 
to synchronize buffers during interruptions.
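To make that concrete, here is a minimal sketch of observing the session's route changes directly (rather than the engine's configuration-change notification). The `onRouteChange` callback and the `framesToSkip` helper are my own placeholder names, not Apple API; the idea is to capture the playback position inside the handler, before rebuilding the graph:

```swift
import Foundation
#if os(iOS) || os(tvOS)
import AVFoundation
#endif

// Pure helper: given the seconds already played through the old route,
// compute how many frames to skip when re-enqueueing the interrupted buffer.
func framesToSkip(elapsedSeconds: Double, sampleRate: Double) -> Int64 {
    return Int64((elapsedSeconds * sampleRate).rounded())
}

#if os(iOS) || os(tvOS)
// Observe route changes on AVAudioSession itself.
final class RouteChangeObserver {
    var onRouteChange: ((AVAudioSession.RouteChangeReason) -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: AVAudioSession.sharedInstance(),
            queue: .main
        ) { [weak self] note in
            guard
                let raw = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt,
                let reason = AVAudioSession.RouteChangeReason(rawValue: raw)
            else { return }
            // Capture the current playback position here, then tear down,
            // rebuild the graph, and resume from the captured position.
            self?.onRouteChange?(reason)
        }
    }
}
#endif
```

The reasons worth reacting to for plug/unplug are `.newDeviceAvailable` and `.oldDeviceUnavailable`; other reasons (e.g. category changes) may not warrant a resync.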

My understanding is that AVAudioEngine is a wrapper on top of good old 
AVAudioSession and AudioUnits! My main reason for not going further with 
AVAudioEngine was its poor performance with real-time tap nodes, whereas 
classic AudioUnits are controllable down to the slightest detail.
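For the resume problem Tamás describes below: since `lastRenderTime` is already nil by the time the notification arrives, one workaround is to snapshot the player-relative sample time periodically while the engine is still running (e.g. via `playerNode.playerTime(forNodeTime:)`), and keep simple bookkeeping of what was scheduled. A sketch, with names of my own invention:

```swift
import Foundation

// Hypothetical bookkeeping for resuming after a configuration change.
// Assumes you periodically snapshot the player's sample time while the
// engine is running, because that time is no longer queryable once the
// graph has been torn down.
struct PlaybackPosition {
    // Frames belonging to buffers fully played before the current one.
    var framesScheduledBeforeCurrentBuffer: Int64
    // Player-relative sample time captured at the last snapshot.
    var lastCapturedSampleTime: Int64

    // Offset into the interrupted buffer at which playback should resume.
    var offsetIntoCurrentBuffer: Int64 {
        max(0, lastCapturedSampleTime - framesScheduledBeforeCurrentBuffer)
    }
}
```

On resume you would re-enqueue the interrupted buffer starting at `offsetIntoCurrentBuffer`, then the remaining buffers unchanged. The position is only as accurate as the last snapshot, so there is an inherent trade-off between snapshot frequency and overhead.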

If anyone can shed light on these observations it would be great.

Arshia Cont
www.antescofo.com 

Sent from my iPhone

> On 15 Apr 2019, at 00:10, Tamás Zahola <tzah...@gmail.com> wrote:
> 
> Hi,
> 
> I’m writing an iOS audio player utilizing AVAudioEngine with an 
> AVAudioPlayerNode and some effect nodes. 
> 
> I want the audio to continue as seamlessly as possible on an audio route 
> change, in other words: when playing audio through the external speakers and 
> we plug in a headphone, then the audio in the headphones should continue 
> exactly where the speakers have left off.
> 
> According to the docs, AVAudioEngine is stopped and connections are severed 
> when such an audio route change occurs; thus the audio graph connections have 
> to be re-established and playback has to be started afresh (buffers have to 
> be enqueued again, etc.). When this happens, a notification is posted 
> (AVAudioEngineConfigurationChangeNotification).
> 
> In response to this notification, I wanted to simply re-enqueue the 
> previously enqueued audio buffers, possibly skipping a bunch of samples from 
> the start of the buffer that was playing at the time of the interruption, so 
> that those parts that made it through the speaker won’t be played again in 
> the headphones.
> 
> But there’s an issue here: by the time this notification is posted, the 
> engine’s internal state seems to be torn down (the nodes are stopped and 
> their `lastRenderTime` is nil), so I can’t figure out exactly where the 
> playback was interrupted...
> 
> Have I missed an API that would let me query the playback time after such an 
> interruption? 
> 
> What is the recommended approach for handling these route/configuration 
> changes seamlessly? 
> 
> Calculating playback time from “wall time” (i.e. mach_absolute_time) feels a 
> bit icky to me when working with a high-level API like AVAudioEngine...
> 
> Best regards,
> Tamás Zahola
> _______________________________________________
> Do not post admin requests to the list. They will be ignored.
> Coreaudio-api mailing list      (Coreaudio-api@lists.apple.com)
> Help/Unsubscribe/Update your Subscription:
> https://lists.apple.com/mailman/options/coreaudio-api/arshiacont%40antescofo.com
> 
> This email sent to arshiac...@antescofo.com