Hello W.,

To answer your question re “drive audio source(1) using audio source(2)” - have 
you tried using AudioUnitAddRenderNotify()?

Recently, as a test, I successfully subclassed and used AVAudioUnitEffect with 
the following init:

- (instancetype)init {
    AudioComponentDescription component =
        VEComponentDescriptionMake(kAudioUnitManufacturer_Apple,
                                   kAudioUnitType_Effect,
                                   kAudioUnitSubType_PeakLimiter);
    if ( (self = [super initWithAudioComponentDescription:component]) ) {
        _audioUnit = self.audioUnit;
        _didAddRenderNotify =
            VECheckOSStatus(AudioUnitAddRenderNotify(_audioUnit,
                                                     VEAVLimiterRenderCallback,
                                                     (__bridge void * _Nullable)(self)),
                            "AudioUnitAddRenderNotify");
    }
    return self;
}


…with a render function like:

OSStatus VEAVLimiterRenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData) {
    if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) && inRefCon) {
        __unsafe_unretained VEAVLimiter *THIS = (__bridge VEAVLimiter *)inRefCon;
        // This is essentially a "tap": do something here, e.g. calculate the
        // peak amplitude of ioData, store that value, and use it elsewhere to
        // "modulate" the pan on the source(1) AVAudioNode.
    }
    return noErr;
}

If I understand you correctly, I think this could work.

Kind regards,
Leo Thiessen
http://visionsencoded.com




> On Jun 6, 2017, at 2:00 PM, [email protected] wrote:
> 
> Date: Tue, 6 Jun 2017 16:59:05 +0000
> From: Waverly Edwards <[email protected]>
> To: "[email protected]" <[email protected]>
> Subject: Seeking advice for modifying audio input source
> 
> I've been reworking an existing project that uses Core Audio to now use 
> AVAudioEngine, and I am in need of some assistance.
> I am currently forcing AVAudioEngine to render offline, using AudioUnitRender 
> on the output node - the setup is non-standard, but it works.
> I am using Swift on macOS, not iOS; it's been a bit of a roller-coaster, but 
> I'm making (slow) progress.
> 
> I am seeking direction on how you can drive audio source(1) using audio 
> source(2). My specific need is to perform a panning effect back and forth at 
> a specific frequency.
> I believe I could also perform other audio magic, such as fade-in and 
> fade-out, using the same basic idea. It would be so much better if there is 
> a high-level way to manage this; however, I will go low-level if needed.
> 
> Here are the ideas that I've investigated or attempted:
> 
> Add an audio unit effect.
> Add a tap and use the output as a source to make adjustments to the other source.
> Add a render callback to an existing mixer node.
> Add an AVAudioIONode?
> Use an AUAudioUnit to access the buffer list data (latest idea being 
> investigated).
> 
> I looked at AudioUnitV3Example and was completely out of my depth, 
> considering writing an audio effect.
> Unfortunately, I can't use a tap to monitor and drive the source because, 
> working offline, the tap doesn't get data - I tried. I would also need to 
> decrease the buffer size significantly for granularity's sake if it did.
> I've been attempting to access the underlying AudioUnit of one of the mixer 
> nodes, or to see if it is possible to add or access a render callback - not 
> successful yet.
> It took me a couple of weeks to get a render callback on the input node 
> working, so I thought I could do the same on a mixer node - that battle is 
> still being fought.
> In researching AVAudioIONode, I haven't determined how you get access to 
> the input stream.
> 
> Here is the latest idea being investigated: AUAudioUnit.
> I looked into AUAudioUnit and have been unable to determine how you can 
> create a node that accepts input.
> I'm not getting the input, but do I need it, since I have access to the 
> output? Is access to the input provider necessary?
> 
> Q1: should the componentType be kAudioUnitType_Mixer to get input and output?
> Q2: can this be non-interleaved on macOS?
> Q3: how do you get the input provider working? Is this necessary, since I 
> have access to the output provider?
> Q4: how do you create a component and insert it between nodes - let's say 
> between a mixer and a distortion node, not just at the end?
> 
> Thank you,
> 
> 
> W.
> 
>        do {
>            let audioComponentDescription = AudioComponentDescription(
>                componentType: kAudioUnitType_Output,        // Q1: should this be kAudioUnitType_Mixer to get input and output?
>                componentSubType: kAudioUnitSubType_HALOutput,
>                componentManufacturer: kAudioUnitManufacturer_Apple,
>                componentFlags: 0,
>                componentFlagsMask: 0)
> 
>            if auAudioUnit == nil {
>                try auAudioUnit = AUAudioUnit(componentDescription: audioComponentDescription)
> 
>                let upstreamBus = auAudioUnit.inputBusses[0]
> 
>                let audioFormat = AVAudioFormat(
>                    commonFormat: AVAudioCommonFormat.pcmFormatFloat32,
>                    sampleRate: Double(sampleRate),
>                    channels: AVAudioChannelCount(2),
>                    interleaved: false)                      // Q2: can this be non-interleaved?
> 
>                auAudioUnit.isInputEnabled = true
>                auAudioUnit.isOutputEnabled = true
> 
>                auAudioUnit.inputHandler = { (actionFlags, timestamp, frameCount, inputBusNumber) in
>                    print("handling input, calling method to fill audioBufferList")  // Q3: not working, but is this necessary, since I have access to the output?
>                }
> 
>                auAudioUnit.outputProvider = { (actionFlags, timestamp, frameCount, inputBusNumber, inputData) -> AUAudioUnitStatus in
>                    print("handling output, calling method to fill audioBufferList")
>                    return 0
>                }
>            }
>        }
> 
> End of Coreaudio-api Digest, Vol 14, Issue 27
> *********************************************
