On Mar 16, 2015, at 9:09 PM, "Patrick J. Collins" 
<[email protected]> wrote:
>> I would most definitely look at the new AVAudioEngine API for doing the
>> kinds of things you are trying to do.
> 
> Yeah, that API definitely simplifies everything. I'm just hesitant to
> use it at this point, as it would mean my app requires OS X 10.10...
> 
> I'm hoping someone here can help me solve this audio unit graph problem
> with the actual C APIs.

I don't have time to dig up the code at this hour, but several years back I 
developed something for a client that sounds like what you described earlier. I 
think what you want is an offline host for an AudioUnit graph. Such a setup can 
run faster than real time: it takes your buffer of samples, processes it 
through the low-pass filter, and saves the result to a new file. Rather than 
saving to a new file, you could probably just store the data in a new buffer. 
The key point is that this kind of code certainly doesn't require 10.10.

The only complication is that if you want to actually hear the audio through 
the device output, then you're locked into real time. That makes it a little 
more restrictive than offline rendering through a generic output unit, where 
you can process something like a low-pass filter much, much faster.

Brian


 _______________________________________________
Do not post admin requests to the list. They will be ignored.
Coreaudio-api mailing list      ([email protected])
Help/Unsubscribe/Update your Subscription:
https://lists.apple.com/mailman/options/coreaudio-api/archive%40mail-archive.com

This email sent to [email protected]
