Michael Tyson, author of TheAmazingAudioEngine, recommended this article to
me:
http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing

It doesn't address your question directly, but it's a good baseline for
explaining the types of things that should be avoided.
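
To make that concrete, here's a minimal sketch of the core pattern the article
advocates: the UI thread and the render callback share state through a
lock-free atomic rather than a mutex, so the audio thread can never block.
(This is my own illustration with made-up names - PlayerState, renderCallback -
not code from the article, and it assumes C11 atomics.)

#include <string.h>
#include <stdatomic.h>
#include <AudioToolbox/AudioToolbox.h>

typedef struct {
    atomic_bool muted;  // written by the UI thread, read by the render thread
} PlayerState;

static OSStatus renderCallback(void                       *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp       *inTimeStamp,
                               UInt32                      inBusNumber,
                               UInt32                      inNumberFrames,
                               AudioBufferList            *ioData)
{
    PlayerState *state = (PlayerState *)inRefCon;

    // Safe here: an atomic load never locks, allocates, or blocks.
    // A pthread_mutex_lock() in the same spot could stall the audio
    // path whenever the UI thread happened to be holding the lock.
    if (atomic_load(&state->muted)) {
        for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
            memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
    }
    return noErr;
}

The UI-thread side is just atomic_store(&state->muted, true); neither side
ever takes a lock.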

I'm not an expert, but I would suspect that AudioUnitGetProperty() is safe to
call, since it is a C function and satisfies the "your audio thread code
should be only C - never Obj-C" dictum. Being plain C is necessary but not
sufficient, though: a C function can still take a lock or allocate memory
internally. So it's possible that I'm completely wrong - take that with a
grain of salt!
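
To make the C-vs-Obj-C part of that dictum concrete: an Obj-C property access
compiles to an objc_msgSend, and the runtime can take internal locks along
that path, whereas a direct ivar read through an __unsafe_unretained pointer -
the pattern TheAmazingAudioEngine itself uses with its "THIS" arguments - is
just a C struct access. A sketch, with made-up names (MyPlayer, _gain):

// Risky inside a render callback: a property access is an objc_msgSend,
// and the Obj-C runtime may lock or allocate along that path.
//     float gain = player.gain;

// Safer: read the ivar directly through an __unsafe_unretained pointer;
// this compiles to a plain struct-offset load, with no runtime call.
static inline float currentGain(__unsafe_unretained MyPlayer *THIS) {
    return THIS->_gain;
}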

Mark Wise
646.241.7126

On Tue, Jan 27, 2015 at 10:07 AM, Leo Thiessen <[email protected]>
wrote:

> Hi folks,
>
> I’m new to Core Audio programming and audio programming in general.  Having
> read through some online materials, I’m trying to wrap my head around what
> is safe to do in a realtime audio thread render callback and what’s not.
> As far as I can determine, I should really only be calling functions that
> have guarantees about the way they behave: fast, no memory
> allocation/freeing, consistent CPU demand rather than high spikes, etc.
>
> This is maybe really obvious, but help me out: how do I know, or find out,
> what’s safe to do?  For example, can I call AudioUnitGetProperty() in a
> render thread?  I suspect the answer might be of the “that depends…”
> variety, but any pointers to help me locate the answers would be much
> appreciated.  Included here is an example of what I’m doing in a render
> thread I’m working on.  Is this sane/OK to do?
>
> I’m using theamazingaudioengine.com, targeting Mac OS X 10.9+ and iOS 7+
> executables.
>
> static OSStatus _renderCallback2(__unsafe_unretained VEAETrack *THIS,
>                                  __unsafe_unretained AEAudioController *audioController,
>                                  const AudioTimeStamp *time,
>                                  UInt32 frameCount,
>                                  AudioBufferList *audio) {
>
>     // Do the main audio processing. The superclass uses an AUAudioFilePlayer
>     // to render the audio, then applies a gain and pan filter using Apple's
>     // vDSP functions.
>     THIS->_superclassRenderCallback(THIS, audioController, time, frameCount, audio);
>
>     // Get our current time data
>     if(noErr == AudioUnitGetProperty(THIS->_au,
>                                      kAudioUnitProperty_CurrentPlayTime,
>                                      kAudioUnitScope_Global,
>                                      0,
>                                      &THIS->_audioTimeStamp,
>                                      &THIS->_audioTimeStampSize)) {
>         UInt32 currLoopCount = floor(THIS->_audioTimeStamp.mSampleTime / THIS->_mFramesToPlay);
>         THIS->_currentTime = (THIS->_audioTimeStamp.mSampleTime -
>             ((float)currLoopCount * THIS->_mFramesToPlay)) / THIS->_outSampleRate;
>
>         // Check for callbacks to be done
>         if(THIS->_completionBlock) {
>             if(THIS->_isLooping) {
>                 // If we are on a new loop number, trigger the completion callback
>                 if(currLoopCount > THIS->_numLoopsCompleted) {
>                     THIS->_numLoopsCompleted++;
>                     // Does not lock/block the realtime thread
>                     AEAudioControllerSendAsynchronousMessageToMainThread(audioController,
>                         _notifyCompletion, &THIS, sizeof(VEAETrack*));
>                 }
>             } else {
>                 // If we're in the last renderCallback of a non-looping channel,
>                 // trigger the completion callback
>                 UInt32 remainderPlusFramesThisRender =
>                     ((UInt32)THIS->_audioTimeStamp.mSampleTime % THIS->_mFramesToPlay) + frameCount;
>                 if(remainderPlusFramesThisRender >= THIS->_mFramesToPlay) {
>                     AEAudioControllerSendAsynchronousMessageToMainThread(audioController,
>                         _notifyCompletion, &THIS, sizeof(VEAETrack*));
>                 }
>             }
>         }
>     }
>
>     return noErr;
> }
>
> The part I’m wondering about is my call to AudioUnitGetProperty(…) - how
> would I know, or find out, whether that’s OK to do on the realtime thread
> as above?  How about other Core Audio functions, such as calling
> MusicDeviceMIDIEvent() on an Apple AUSampler instrument audio unit from the
> realtime audio thread?  Or must I run my own separate thread, parallel to
> the realtime audio thread, to do these types of things?