Hi Mahboud,

Thanks for your suggestion. I will give it a look.

Yes, I am writing the app from scratch.

I have already implemented a simple version of it using the RemoteIO audio
unit: the sending device streams from the mic directly out of its recording
callback, and the receiving device pulls the incoming stream from a circular
buffer and processes it in its render callback.
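
To sketch the receiving side of that (greatly simplified; RingBuffer and
RingBufferRead here are stand-ins for my actual circular buffer code):

#include <AudioToolbox/AudioToolbox.h>

/* Stand-ins for my circular buffer implementation. */
typedef struct RingBuffer RingBuffer;
extern void RingBufferRead(RingBuffer *rb, AudioBufferList *ioData, UInt32 inFrames);

static RingBuffer *gIncomingStream;   /* filled by the network receive code */

/* Render callback feeding RemoteIO's output element: pull decoded frames
   from the circular buffer straight into ioData. */
static OSStatus PlaybackCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData)
{
    RingBufferRead(gIncomingStream, ioData, inNumberFrames);
    return noErr;
}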

Now, I want to build an AUGraph that combines the RemoteIO unit with a
Multi-Channel Mixer (on the receiving side) in order to process multiple
incoming audio streams.
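
The topology I have in mind is roughly this (just a sketch; error checking,
stream formats and audio session setup are omitted, and the bus count is
only an example):

#include <AudioToolbox/AudioToolbox.h>

static AUGraph   gGraph;
static AudioUnit gMixerUnit;
static AudioUnit gIOUnit;

static void BuildGraph(void)
{
    UInt32 numInputBusses = 4;   /* e.g. one bus per remote participant */

    AudioComponentDescription ioDesc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponentDescription mixDesc = {
        .componentType         = kAudioUnitType_Mixer,
        .componentSubType      = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode ioNode, mixerNode;
    NewAUGraph(&gGraph);
    AUGraphAddNode(gGraph, &ioDesc,  &ioNode);
    AUGraphAddNode(gGraph, &mixDesc, &mixerNode);
    AUGraphOpen(gGraph);
    AUGraphNodeInfo(gGraph, ioNode,    NULL, &gIOUnit);
    AUGraphNodeInfo(gGraph, mixerNode, NULL, &gMixerUnit);

    /* Tell the mixer how many input busses (elements) to expose. */
    AudioUnitSetProperty(gMixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0,
                         &numInputBusses, sizeof(numInputBusses));

    /* Mixer output element 0 -> RemoteIO output element 0 (speaker side). */
    AUGraphConnectNodeInput(gGraph, mixerNode, 0, ioNode, 0);

    AUGraphInitialize(gGraph);
    AUGraphStart(gGraph);
}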

I understand all of the concepts I need except for the assignment of streams to 
particular input busses on the audio mixer unit.

I looked at Apple’s sample code "Using an AUGraph with the Multi-Channel
Mixer and RemoteIO Audio Unit", which loads and plays two audio files from
disk. Unfortunately, I just don’t see where each of those files is assigned
to its own input bus on the mixer unit.
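
From what I can piece together from other examples, my best guess is that
the assignment happens wherever a render callback is installed on each mixer
input element, and that the inBusNumber argument tells the callback which
source to pull from, e.g. (ReadSourceFrames is hypothetical):

#include <AudioToolbox/AudioToolbox.h>

/* Hypothetical: fills ioData with inNumberFrames frames from source
   'sourceIndex' (a decoded network stream, a file reader, etc.). */
extern void ReadSourceFrames(UInt32 sourceIndex, AudioBufferList *ioData,
                             UInt32 inNumberFrames);

/* One callback can serve every mixer input; inBusNumber identifies which
   bus is pulling, so bus N stays tied to source N. */
static OSStatus MixerInputCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
{
    ReadSourceFrames(inBusNumber, ioData, inNumberFrames);
    return noErr;
}

static void AttachSourcesToMixer(AUGraph graph, AUNode mixerNode, UInt32 busCount)
{
    AURenderCallbackStruct cb = {
        .inputProc       = MixerInputCallback,
        .inputProcRefCon = NULL
    };
    for (UInt32 bus = 0; bus < busCount; ++bus) {
        /* Whatever this callback feeds when bus 'bus' pulls is, by
           definition, the stream on that bus. */
        AUGraphSetNodeInputCallback(graph, mixerNode, bus, &cb);
    }
}

Is that the right way to think about it, or is the mapping established
somewhere else?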

The reason I want to assign specific audio streams to specific inputs is to
allow selective muting and panning of participants in a conversation, as
well as direct pass-through of one’s own mic signal if desired.
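
(Once I know which bus carries which participant, my understanding is that
the per-participant control would just be AudioUnitSetParameter on that
input element; the bus indices below are only examples:)

#include <AudioToolbox/AudioToolbox.h>

/* Mute/unmute the participant on a given mixer input bus. */
static void SetParticipantMuted(AudioUnit mixer, UInt32 bus, Boolean muted)
{
    AudioUnitSetParameter(mixer, kMultiChannelMixerParam_Enable,
                          kAudioUnitScope_Input, bus,
                          muted ? 0 : 1, 0);
}

/* Pan the participant on a given bus, -1.0 (left) .. +1.0 (right). */
static void SetParticipantPan(AudioUnit mixer, UInt32 bus, Float32 pan)
{
    AudioUnitSetParameter(mixer, kMultiChannelMixerParam_Pan,
                          kAudioUnitScope_Input, bus, pan, 0);
}

/* Gain for the participant on a given bus, 0.0 .. 1.0. */
static void SetParticipantVolume(AudioUnit mixer, UInt32 bus, Float32 gain)
{
    AudioUnitSetParameter(mixer, kMultiChannelMixerParam_Volume,
                          kAudioUnitScope_Input, bus, gain, 0);
}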

So I really need to know which mixer element is handling which stream at any 
given time.

Anyway, I hope this makes sense, and thanks again for the source example
suggestion.

Cheers!

Cara
---
iOS design and development - LookTel.com
---
View my Online Portfolio at:

http://www.onemodelplace.com/models/Cara-Quinn

Follow me on Twitter!

https://twitter.com/ModelCara

On Feb 1, 2016, at 12:14 AM, Mahboud Zabetian <[email protected]> wrote:

Are you writing it from scratch or taking over an existing app and adding 
multi-channel to it?  Is it using the new AVAudioEngine?

A good starting point might be to look at 
https://github.com/SilentCircle/silent-phone-ios


> On Jan 31, 2016, at 8:46 PM, Cara Quinn <[email protected]> wrote:
> 
> Hello All, firstly thanks for this list!
> 
> My apologies for such a basic mixer question, but I have been tasked with 
> writing a VoIP application and have had no previous experience with audio 
> units until just recently.
> 
> I need to add multi-channel support to the VoIP app at the moment, so I am 
> researching the standard Multi-Channel Mixer audio unit.
> 
> I have looked at some sample code (both from Apple and from third parties) to 
> get an idea of what is happening and mostly get what is going on with one 
> exception.
> 
> I cannot, for the life of me, figure out where the channel or bus 
> assignments are happening; i.e., if I have several audio streams and I wish 
> to assign each stream to a specific mixer input, I cannot see where this 
> happens in any of the examples I am looking at.
> 
> Also, I have seen people say online that busses and mixer channels are two 
> different concepts, but again, I cannot see where this is demonstrated in 
> the code. Any help on the correct paradigm to think about this in would be 
> greatly appreciated.
> 
> I am sure I must be overlooking something basic here so if anyone can point 
> me in the right direction I would sure be grateful! :)
> 
> Thanks so much and please do have a great day!
> 
> Cheers!
> 
> Cara

