I posted this earlier but the images caused problems.  

Hi.  After a three-year break from CoreAudio, I am back.  I am working on an 
iOS educational app for kids that will let them hook the microphone up to a 
number of different filters and processors.  From what I see of AVAudioEngine, 
I am not sure whether nodes can be connected in parallel (I don't mean 
sample-accurate parallelism, just fan-out); the data seems to need to flow 
through the graph serially.  Can I do this:

https://www.dropbox.com/s/hanpcy7dtdzbfcn/PastedGraphic-2.png?dl=0

Where the microphone sound data is connected to three or more units?
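
In code, I imagine the picture above looking something like this untested 
sketch, where the three effect units are just placeholders and, as I 
understand it, the connect() overload that takes an array of 
AVAudioConnectionPoints (iOS 9) is what would do the fan-out:

import AVFoundation

// Untested sketch: fan the mic input out to three effect units at once.
// Assumes the audio session / mic permission is already set up elsewhere.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

let delay = AVAudioUnitDelay()           // placeholder effect 1
let distortion = AVAudioUnitDistortion() // placeholder effect 2
let reverb = AVAudioUnitReverb()         // placeholder effect 3
[delay, distortion, reverb].forEach { engine.attach($0) }

// One source bus feeding several destinations at once
engine.connect(input, to: [
    AVAudioConnectionPoint(node: delay, bus: 0),
    AVAudioConnectionPoint(node: distortion, bus: 0),
    AVAudioConnectionPoint(node: reverb, bus: 0)
], fromBus: 0, format: format)

// Bring the three branches back together on separate mixer input buses
let mixer = engine.mainMixerNode
engine.connect(delay, to: mixer, fromBus: 0, toBus: 0, format: format)
engine.connect(distortion, to: mixer, fromBus: 0, toBus: 1, format: format)
engine.connect(reverb, to: mixer, fromBus: 0, toBus: 2, format: format)

try engine.start()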

If not, can I use the tap on a node to build my own dispatcher, calling the 
different clients with the data as it arrives?  Or am I better off just going 
back to Audio Units?
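
By "dispatcher" I mean something along these lines (untested sketch; 
BufferClient is a protocol I made up purely for illustration):

import AVFoundation

// BufferClient is hypothetical: each processor gets every buffer the tap sees.
protocol BufferClient: AnyObject {
    func process(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime)
}

let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

var clients = [BufferClient]()

// The tap hands me each buffer as it arrives; I pass it along to every
// client.  (My understanding is that tap blocks are not called on the
// realtime thread, which is part of what I am asking about.)
input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
    for client in clients {
        client.process(buffer, at: time)
    }
}

try engine.start()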

My next question is: can I have multiple unrelated paths in one AVAudioEngine, 
or do I need to create multiple engines:

https://www.dropbox.com/s/v1hzql57omicrnq/PastedGraphic-4.png?dl=0
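
By "unrelated" I mean two chains that share nothing until the output mixer, 
something like this untested sketch:

import AVFoundation

let engine = AVAudioEngine()
let mixer = engine.mainMixerNode

// Path 1: player -> reverb
let player1 = AVAudioPlayerNode()
let reverb = AVAudioUnitReverb()

// Path 2: player -> pitch shifter
let player2 = AVAudioPlayerNode()
let pitch = AVAudioUnitTimePitch()

[player1, reverb, player2, pitch].forEach { engine.attach($0) }

// The two chains never touch except at the mixer
engine.connect(player1, to: reverb, format: nil)
engine.connect(reverb, to: mixer, fromBus: 0, toBus: 0, format: nil)

engine.connect(player2, to: pitch, format: nil)
engine.connect(pitch, to: mixer, fromBus: 0, toBus: 1, format: nil)

try engine.start()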

I'll be building a lot of simple generators and filters: sine, square, 
sawtooth, lowpass, pitch shift, time shift, etc., plus a "backwards" unit.  
Except for a few, I would like all of these to work in real time.  Again, 
should I stay with the old CoreAudio API, or will the new AVAudio framework 
give me the flexibility I need?  I think I can recall most of what I learned 
before with CoreAudio, and I remember that things weren't easy.  That's why I 
am giving the new framework a chance, in case it makes my life easier.
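
For the generators, my rough plan is to loop one precomputed cycle through an 
AVAudioPlayerNode, along these lines (untested sketch; the 440 Hz and the 
buffer math are placeholders):

import AVFoundation

// Untested sketch: loop one precomputed sine cycle through a player node.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let sampleRate = 44100.0
let frequency = 440.0  // placeholder pitch
let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                           channels: 1)!

// One cycle, rounded to whole frames (which detunes 440 Hz slightly)
let frames = AVAudioFrameCount(sampleRate / frequency)
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames)!
buffer.frameLength = frames
for i in 0..<Int(frames) {
    buffer.floatChannelData![0][i] =
        Float(sin(2.0 * .pi * frequency * Double(i) / sampleRate))
}

engine.connect(player, to: engine.mainMixerNode, format: format)
try engine.start()

player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
player.play()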

Thank you in advance for your help.

-mahboud
