When rendering offline, using the following loop as an example:

while engine.manualRenderingSampleTime < AVAudioFramePosition(someMasterBuffer.frameLength) {
    do {
        ...
        let status = try self.engine.renderOffline(framesToRender, to: outBuffer)
    } catch {
        // handle the render error
    }
}
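
For context, I'm assuming the standard manual-rendering setup before that loop, roughly along these lines (the format, the maximum frame count, and the buffer names are just placeholders from my code):

let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 2)!

// Put the engine into offline manual rendering mode before starting it.
try engine.enableManualRenderingMode(.offline,
                                     format: format,
                                     maximumFrameCount: 4096)
try engine.start()

// outBuffer is allocated once at the engine's maximum render block size;
// someMasterBuffer is the full-length destination I want to fill.
let outBuffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                                 frameCapacity: engine.manualRenderingMaximumFrameCount)!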


*        Does each call render to the beginning of outBuffer, overwriting the previous values, or does it append on each loop? (The documentation doesn't explain this; a sketch of how I'm currently handling it follows these questions.)

*        Is outBuffer.frameLength being updated for me on each render, or do I need to set it myself? The frame capacity can't be used as an output reference because it will be larger than the frame length.

*        Is there a signal or sentinel that indicates when all processing has 
occurred?
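
For reference, here is roughly how I'm consuming outBuffer at the moment (inside a throwing context), on the assumption that each renderOffline call writes from frame 0 of outBuffer and sets outBuffer.frameLength to the frames actually produced; that assumption is exactly what I'm asking about above:

while engine.manualRenderingSampleTime < AVAudioFramePosition(someMasterBuffer.frameLength) {
    let remaining = AVAudioFrameCount(AVAudioFramePosition(someMasterBuffer.frameLength)
                                      - engine.manualRenderingSampleTime)
    let framesToRender = min(remaining, outBuffer.frameCapacity)

    let status = try engine.renderOffline(framesToRender, to: outBuffer)
    switch status {
    case .success:
        // Assumes outBuffer now holds outBuffer.frameLength freshly rendered
        // frames starting at frame 0; copy them out at the current write position.
        appendFrames(of: outBuffer, to: someMasterBuffer)   // hypothetical helper
    case .insufficientDataFromInputNode, .cannotDoInCurrentContext, .error:
        break   // handle as appropriate
    @unknown default:
        break
    }
}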

If I play audio from a scheduled buffer, I could use an 
AVAudioNodeCompletionHandler, but that tells me when playback completed, not 
when processing completed.
The audio could have passed through delay, reverb, or varispeed units, which 
obscures when the last sample reached the output node.
If I have looped audio on one channel, that channel will always have audio, so I 
would then need to place a sentinel on another channel to know when the rest of 
the material has completed, regardless of the delays in place.
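
For illustration, a naive version of the "keep rendering until the effect tails drain" idea might look like this, where sourceLength stands in for the scheduled material's frame count and the tail cap and silence threshold are arbitrary placeholders:

let sourceLength = AVAudioFramePosition(someMasterBuffer.frameLength)
let sampleRate = engine.manualRenderingFormat.sampleRate
let tailLimit = AVAudioFramePosition(10 * Int(sampleRate))   // arbitrary 10 s cap
let silenceThreshold: Float = 1e-6                           // arbitrary threshold

var tailsDrained = false
while !tailsDrained && engine.manualRenderingSampleTime < sourceLength + tailLimit {
    let status = try engine.renderOffline(outBuffer.frameCapacity, to: outBuffer)
    guard status == .success, let data = outBuffer.floatChannelData else { break }

    // Peak of this block across all channels; once we're past the scheduled
    // content and a whole block is effectively silent, assume the tails are done.
    var peak: Float = 0
    for ch in 0..<Int(outBuffer.format.channelCount) {
        for frame in 0..<Int(outBuffer.frameLength) {
            peak = max(peak, abs(data[ch][frame]))
        }
    }
    if engine.manualRenderingSampleTime >= sourceLength, peak < silenceThreshold {
        tailsDrained = true
    }
}

That obviously falls apart with the looped channel described above, which is why I'm hoping there is an explicit signal.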

Thank you,


W.

