Shouldn't I be able to do the following to just create a passthru AUv3?
- (AUInternalRenderBlock)internalRenderBlock {
    // Capture in locals to avoid ObjC member lookups. If "self" is captured
    // in render, we're doing it wrong. See sample code.
    return ^AUAudioUnitStatus(AudioUnitRenderActionFlags *actionFlags,
                              const AudioTimeStamp *timestamp,
                              AVAudioFrameCount frameCount,
                              NSInteger outputBusNumber,
                              AudioBufferList *outputData,
                              const AURenderEvent *realtimeEventListHead,
                              AURenderPullInputBlock pullInputBlock) {
        // Do event handling and signal processing here.
        pullInputBlock(actionFlags, timestamp, frameCount,
                       outputBusNumber, outputData);
        return noErr;
    };
}
Before calling 'pullInputBlock' I can see that the buffers are all 0.0 samples,
and afterwards they contain reasonable audio samples.
I am able to connect from a host app with no complaints anywhere, but I still
get silence.
Coreaudio-api mailing list ([email protected])