So I actually read that a few days ago. I guess because the function was
part of Core Audio, I didn't make the connection that it was doing file
i/o. But right, I do recall that i/o restriction in the callback. What do
you think was happening? Was it only writing a fraction of the total frame
bytes before each callback returned? And how do you account for the fact
that the resultant WAV file came out longer or shorter depending on how
long I ran the program?

Anyway, now that I know this won't work, what is the right approach? Do I
need to fill up a buffer in RAM and then offload those bytes into the
audio file from a non-realtime thread?
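
In case it helps to be concrete, here's a minimal sketch of the pattern I
have in mind, assuming a single-producer/single-consumer lock-free ring
buffer (the names rb_write/rb_read and the capacity are made up, not a
Core Audio API): the render callback only copies bytes in and never
blocks, and a separate writer thread drains the buffer and is the only
place AudioFileWriteBytes gets called.

#include <stdatomic.h>
#include <stddef.h>

#define RB_CAPACITY (1 << 20)     /* bytes; power of two for cheap wrap */

typedef struct {
    unsigned char   data[RB_CAPACITY];
    _Atomic size_t  write_pos;    /* advanced only by the audio thread  */
    _Atomic size_t  read_pos;     /* advanced only by the writer thread */
} ring_buffer;

/* Called from the render callback: copy what fits, drop the rest.
   No locks, no allocation, no syscalls. */
static size_t rb_write(ring_buffer *rb, const void *src, size_t n)
{
    size_t w = atomic_load_explicit(&rb->write_pos, memory_order_relaxed);
    size_t r = atomic_load_explicit(&rb->read_pos,  memory_order_acquire);
    size_t space = RB_CAPACITY - (w - r);
    if (n > space)
        n = space;                /* drop rather than block */
    for (size_t i = 0; i < n; i++)
        rb->data[(w + i) & (RB_CAPACITY - 1)] = ((const unsigned char *)src)[i];
    atomic_store_explicit(&rb->write_pos, w + n, memory_order_release);
    return n;
}

/* Called from the non-realtime writer thread, which then does the
   actual AudioFileWriteBytes call with whatever it drained. */
static size_t rb_read(ring_buffer *rb, void *dst, size_t n)
{
    size_t r = atomic_load_explicit(&rb->read_pos,  memory_order_relaxed);
    size_t w = atomic_load_explicit(&rb->write_pos, memory_order_acquire);
    size_t avail = w - r;
    if (n > avail)
        n = avail;
    for (size_t i = 0; i < n; i++)
        ((unsigned char *)dst)[i] = rb->data[(r + i) & (RB_CAPACITY - 1)];
    atomic_store_explicit(&rb->read_pos, r + n, memory_order_release);
    return n;
}

The writer thread would just loop: rb_read() into a local buffer, call
AudioFileWriteBytes with however many bytes it got, and sleep briefly when
the buffer is empty. Is that roughly the right shape?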

On Mon, Jan 11, 2016 at 7:36 AM Paul Davis <[email protected]>
wrote:

> On Mon, Jan 11, 2016 at 2:08 AM, Alexander Bollbach <
> [email protected]> wrote:
>
>>
>> I am trying to record small chunks of PCM for this beat-making app I'm
>> writing, so I figured it would be easy enough to just do
>> AudioFileWriteBytes in this callback, multiplying the number of frames
>> by the bytes per frame and writing that, but the file winds up being a
>> garbled mess.
>>
> You cannot do file i/o from a render callback or anywhere else in the
> realtime audio thread(s).
>
> Please read this, which isn't about audiounits but is about realtime audio
> programming.
>
>
> http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing
>
>