I'm using an AUGraph with a kAudioUnitSubType_VoiceProcessingIO unit and a
44.1 kHz client and hardware format. When I play a 48 kHz MP4 with AVPlayer,
my render callbacks are affected by sample rate conversion (940/941
inNumberFrames instead of 1024). That's expected. However, when I query the
audio unit, it still reports 44.1 kHz for both the client and hardware
formats. Ok.

If I swap the I/O unit out for a kAudioUnitSubType_RemoteIO unit, I receive
the same 940/941 inNumberFrames as before, but the unit now reports a
48 kHz hardware format (which is what I'd have expected from the
VoiceProcessingIO unit).
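In case it matters, this is roughly how the formats can be queried; a sketch that assumes `ioUnit` is the AudioUnit obtained from the graph's I/O node (e.g. via AUGraphNodeInfo) and that the graph is initialized:

```c
#include <AudioUnit/AudioUnit.h>
#include <stdio.h>

/* Print the client-side and hardware-side sample rates of an I/O unit.
 * On an I/O unit, element 0 is the output (speaker) bus and element 1 is
 * the input (mic) bus; the scope closest to the hardware carries the
 * hardware format, the other scope carries the client format. */
static void PrintIOFormats(AudioUnit ioUnit) {
    AudioStreamBasicDescription asbd;
    UInt32 size = sizeof(asbd);

    /* Output bus (element 0), input scope: the client format we render into. */
    AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 0, &asbd, &size);
    printf("client render format:  %.0f Hz\n", asbd.mSampleRate);

    /* Input bus (element 1), input scope: the hardware-side mic format. */
    size = sizeof(asbd);
    AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Input, 1, &asbd, &size);
    printf("hardware mic format:   %.0f Hz\n", asbd.mSampleRate);

    /* Input bus (element 1), output scope: the client-side mic format. */
    size = sizeof(asbd);
    AudioUnitGetProperty(ioUnit, kAudioUnitProperty_StreamFormat,
                         kAudioUnitScope_Output, 1, &asbd, &size);
    printf("client mic format:     %.0f Hz\n", asbd.mSampleRate);
}
```

(This fragment needs the AudioUnit framework and a live audio unit, so it's illustrative only.)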

Is this expected behavior?

I'm looking into this because when it happens with my VoiceProcessingIO
unit, my microphone samples get distorted. It's almost as if each buffer
contains only 940/941 valid samples instead of 1024 (i.e. the audio is
'choppy'). If I swap the unit out for RemoteIO, the mic samples are not
choppy.

I need to use a 44.1 kHz client format because I'm mixing a lot of
video/audio into my AUGraph (with AUConverter nodes converting source
assets to 44.1 kHz) and sending the 44.1 kHz samples out over a network
stream.

Any insight is helpful.

Mark
 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com

This email sent to [email protected]
