But essentially, the ring buffer functions as a way of temporarily
holding in memory the audio samples produced by the render callback on
the realtime thread, and then quickly reading those samples for some
other purpose elsewhere? Using the timestamp to coordinate what,
exactly? And wouldn't the ring buffer eat its tail eventually if the
realtime callback were dumping sample data into it faster than I could
write it to disk? I'm sorry, but I find this just a bit difficult to
negotiate in my head.

On Mon, Jan 11, 2016 at 12:43 PM Paul Davis <[email protected]>
wrote:

> On Mon, Jan 11, 2016 at 12:15 PM, Alexander Bollbach <
> [email protected]> wrote:
>
>> Thinking a little more: given that I'm recording an arbitrarily long
>> amount of audio, preallocating a buffer will not be possible, right? Is
>> that where a circular buffer comes in? Would I use the widely used
>> CARingBuffer? If so, could I get some intuition about the timestamps?
>> I've followed chapter 8 of Chris Adamson's Learning Core Audio book, and I
>> sort of have a sense for using a ring buffer, but I'm not entirely clear on
>> the concept of the timestamps you pass into it.
>>
>
> Yes, you need a ring buffer.
>
> I can't comment further because my audio software is cross-platform so I'm
> not familiar with the infrastructure offered by OS X or iOS.
>
>
 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com

This email sent to [email protected]