Paul, thank you. That makes perfect sense. How do I switch from processing one 
sample at a time to processing the entire buffer at once?
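Since the host decides how many frames arrive per render call, an FFT-based effect typically accumulates incoming samples into its own buffer and processes a whole block once enough have arrived, at the cost of one block of latency. Below is a minimal sketch of that pattern in plain C++; the class and method names (`BlockProcessor`, `processBlock`) are illustrative, not part of the AudioUnit SDK, and the "DSP" just doubles each sample so the buffering is observable:

```cpp
#include <cstddef>
#include <vector>

// Sketch: accumulate host-supplied frames into a fixed-size block,
// process each full block, and play the previous block back out.
class BlockProcessor {
public:
    explicit BlockProcessor(size_t blockSize)
        : blockSize_(blockSize), output_(blockSize, 0.0f), outPos_(0) {}

    // Called from the render callback with however many frames the host delivers.
    void render(const float* in, float* out, size_t frames) {
        for (size_t i = 0; i < frames; ++i) {
            input_.push_back(in[i]);
            if (input_.size() == blockSize_) {
                processBlock(input_.data(), output_.data(), blockSize_);
                input_.clear();
                outPos_ = 0;
            }
            // Emit the most recently processed block (one block of latency;
            // the very first block out is silence).
            out[i] = output_[outPos_];
            outPos_ = (outPos_ + 1) % blockSize_;
        }
    }

private:
    // Placeholder for the real per-block DSP (e.g. FFT -> spectral edit -> IFFT).
    void processBlock(const float* src, float* dst, size_t n) {
        for (size_t i = 0; i < n; ++i) dst[i] = 2.0f * src[i];
    }

    size_t blockSize_;
    std::vector<float> input_;   // samples waiting for a full block
    std::vector<float> output_;  // last processed block
    size_t outPos_;
};
```

Note that if you are building on Apple's AUEffectBase, the kernel `Process()` call already hands you the whole host buffer (`inFramesToProcess` frames at once), so internal accumulation like the above is only needed when your algorithm requires a fixed block size different from whatever the host happens to deliver.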

Sent from my iPhone.

> On Nov 30, 2015, at 7:43 AM, Paul Davis <[email protected]> wrote:
> 
> AudioUnits do not get to control the buffer size delivered via a render call. 
> The host decides this.
> 
>> On Mon, Nov 30, 2015 at 12:08 AM, Daniel Wilson <[email protected]> 
>> wrote:
>> Does anyone know how to change the frame size when doing the digital signal 
>> processing on an audio unit? Currently my audio unit is set up so that it 
>> receives a single sample, does the signal processing, outputs the sample, 
>> and repeats the process for each sample of the audio signal. I have created 
>> quite a few audio units with this setup, but now I want to process multiple 
>> samples at the same time to do the FFT/IFFT, etc. Does anyone know how to do 
>> this? It seems like most people are using audio units for iOS, but my audio 
>> units are for OS X, to be used in programs like Logic Pro. Don’t know if that 
>> makes a difference.
>> 
>> -Daniel
>>  _______________________________________________
>> Do not post admin requests to the list. They will be ignored.
>> Coreaudio-api mailing list      ([email protected])
>> Help/Unsubscribe/Update your Subscription:
>> https://lists.apple.com/mailman/options/coreaudio-api/paul%40linuxaudiosystems.com
>> 
>> This email sent to [email protected]
> 