Hi gang,

I'm using an AVAudioEngine player node in a project where I need to play a
stereo buffer, and I want the option to store my left and right channels as
two separate buffers of Doubles.

Currently I'm handling playback by using a function from the Accelerate
framework to convert and copy the Doubles into the player's
AVAudioPCMBuffer... which is where I would love some sage advice from you!
There are two problems with the approach I'm using (simplified sketch below):

1) At the moment I have to write data to .floatChannelData by arranging
samples like so: [ L0 L1 L2 L3 ... R0 R1 R2 ... ]. It would be more flexible
if I could send the two buffers separately. Is using an AVAudioEngine mixer
the *only* way to do this?

2) (Less important) It would be nice if I could send Doubles without
converting to Float32, but I get the impression that's not supported? I'm
not sure I'm sticking with Doubles anyway, so it's not a huge issue (I have
code for both).
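
For reference, here's a simplified sketch of the convert-and-copy step I
mean (names like makeStereoBuffer, leftSamples, and rightSamples are just
placeholders, not my actual code):

import AVFoundation
import Accelerate

// Simplified sketch -- leftSamples / rightSamples stand in for my two
// separate Double buffers.
func makeStereoBuffer(leftSamples: [Double],
                      rightSamples: [Double],
                      sampleRate: Double) -> AVAudioPCMBuffer? {
    let frameCount = AVAudioFrameCount(min(leftSamples.count, rightSamples.count))

    // The "standard" format is deinterleaved Float32, so floatChannelData
    // gives one pointer per channel.
    guard frameCount > 0,
          let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                                     channels: 2),
          let buffer = AVAudioPCMBuffer(pcmFormat: format,
                                        frameCapacity: frameCount),
          let channels = buffer.floatChannelData else {
        return nil
    }
    buffer.frameLength = frameCount

    // vDSP_vdpsp converts Double to Float32 while copying.
    leftSamples.withUnsafeBufferPointer { src in
        vDSP_vdpsp(src.baseAddress!, 1, channels[0], 1, vDSP_Length(frameCount))
    }
    rightSamples.withUnsafeBufferPointer { src in
        vDSP_vdpsp(src.baseAddress!, 1, channels[1], 1, vDSP_Length(frameCount))
    }
    return buffer
}

(The vDSP_vdpsp call is the Double-to-Float32 conversion I mention in 2.)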

The best-case scenario would let me keep the buffers up to date and, when
it comes time to play them, just hand over two pointers to those buffers.
That would be pretty peachy. Anyone know if it's possible?

Thanks!

Charles

PS: I'm hoping I haven't triple-posted this email. I keep getting messages
from the admin group that it was rejected. If so, apologies for that too!