On Sun, Feb 22, 2015 at 12:44 PM, Ilya Konstantinov <
[email protected]> wrote:
> When calling AudioUnitRender, we're passing a list of AudioBuffers:
>
>
> struct AudioBuffer
> {
>     UInt32 mNumberChannels; // <-- why this?
>     UInt32 mDataByteSize;
>     void*  mData;
> };
>
> Why does it have to contain an mNumberChannels member?
>
> After all, we've already set up the stream format with an
> AudioStreamBasicDescription, including mChannelsPerFrame:
>
>
Because an AudioBuffer is not tied to an AudioStream. AudioBuffers can be
passed into something that will feed them into a stream, and the buffer
format and the stream format may differ. Consider the trivial case of
merging two single-channel AudioBuffers into a two-channel stream.
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com