Hi W., Thanks for the feedback!
The https://forums.developer.apple.com/thread/72674 post doesn’t seem to explain why AVAudioUnitNode is the “correct” one. Subclassing AVAudioUnitNode should work just as well, from what I know. Instead of subclassing anything, I think it’s possible to simply instantiate AVAudioUnitNode with the correct AudioComponentDescription and then use AudioUnitAddRenderNotify() on the contained audioUnit, though I’ve not tested this myself.

In my case I was intentionally testing out wrapping the Apple-provided peak limiter audio unit, so AVAudioUnitEffect made sense, and the docs seemed to indicate that it should be subclassed. Then I decided to try adding a render notify because I wanted faster metering than I was getting via a standard AV node “tap” (and the workarounds for reducing the tap lag time seemed likely to break at some point). If you want, I can send you a zip of my test project that used the subclassing approach and you could play around with it from there; just let me know.

If you end up using some kind of direct connection to the audio units like this, keep in mind: http://atastypixel.com/blog/four-common-mistakes-in-audio-development/. TAAE2 (https://github.com/TheAmazingAudioEngine/TheAmazingAudioEngine2), by the author of that blog, also has a neat solution for safely communicating between the realtime audio thread and the main thread. Depending on what you’re doing it may be overkill, though; the simplest solution is also easier to debug than the sophisticated one (this is stuff I’m learning... ;)

You’re not alone in taking a long time… I suspect a notable portion of the wasted time comes from quirks like the “passUnretained” vs. “passRetained” one that you discovered; these quirks often compile and run but don’t do what was intended.
I also think a whole lot fewer of those “gotchas” crop up when working in Objective-C and C instead of Swift at this point. In fact, over the last few weeks I’ve completely divorced my UI work from my “audio engine” work, and I’m doing the audio engine only in Objective-C. YMMV, but I see far fewer Xcode crashes and spinning beachballs, and even less lagginess in my iOS app. After several years of primarily using Swift, I’m now doing a project in Objective-C only (even the UI), and so far it’s been relative bliss. The benefits of Swift’s elegance and type safety are, in my experience, dwarfed by the benefits of a speedier, more stable Xcode and far fewer “what on earth?” moments, though you’re still likely to stumble into at least some of those, just fewer :D

Good luck on your work!

Regards,
- Leo

> On Jun 9, 2017, at 2:00 PM, [email protected] wrote:
>
> Send Coreaudio-api mailing list submissions to
> 	[email protected]
>
> To subscribe or unsubscribe via the World Wide Web, visit
> 	https://lists.apple.com/mailman/listinfo/coreaudio-api
> or, via email, send a message with subject or body 'help' to
> 	[email protected]
>
> You can reach the person managing the list at
> 	[email protected]
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Coreaudio-api digest..."
>
>
> Today's Topics:
>
>    1.
> RE: Seeking advice for modifying audio input source (Waverly Edwards)
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Thu, 8 Jun 2017 22:00:36 +0000
> From: Waverly Edwards <[email protected]>
> To: "[email protected]" <[email protected]>
> Subject: RE: Seeking advice for modifying audio input source
> Message-ID: <sn1pr0401mb17415b7630f913b28e57cc3584...@sn1pr0401mb1741.namprd04.prod.outlook.com>
> Content-Type: text/plain; charset="utf-8"
>
> >>>
> > To answer re “drive audio source(1) using audio source(2)” - have you tried
> > using AudioUnitAddRenderNotify()
> >
> > Recently, as a test, I successfully sub-classed and used AVAudioUnitEffect
> > with the following init:
> <<
>
> I had not considered using AudioUnitAddRenderNotify() because I’ve been
> unsure of a safe place to use it. It actually took me weeks to figure out
> how to use a callback on the input node, but seeing your response has
> revived my interest. Why did it take weeks? I don’t know if this is a bug
> or just different behavior for iOS versus macOS, but 95% of the time my
> code would crash, so after repeatedly getting BAD_EXEC errors in the
> debugger, I tracked the issue to a single line. What I found is that I
> needed to pass the retained reference to self to the callback on macOS.
>
> AudioUnitAddRenderNotify(testNode!, renderCallback,
>     Unmanaged.passUnretained(self).toOpaque())  // This works on iOS (passUnretained)
>
> AudioUnitAddRenderNotify(testNode!, renderCallback,
>     Unmanaged.passRetained(self).toOpaque())    // This works on macOS Sierra (passRetained)
>
> During my research weeks ago, I came across this thread:
> https://forums.developer.apple.com/thread/72674
> This mostly waved me off from subclassing AVAudioUnitEffect, but I had been
> thinking about it A LOT yesterday – wondering if I should do it anyway or
> use AUAudioUnit.
> I was concerned that not all of the properties and methods I needed would
> be present, or that when I performed a super init there would be an
> unpleasant surprise from something additional that I didn’t want.
>
> Again, seeing this response has renewed my interest in subclassing. I
> think I will subclass AVAudioUnitEffect and, if I’m successful, I will
> pursue subclassing AUAudioUnit.
>
> BTW, I love your mission and values statement.
>
> Thank you VERY much,
>
> W.
>
>
> From: Leo Thiessen [mailto:[email protected]]
> Sent: Wednesday, June 07, 2017 3:58 PM
> To: [email protected]
> Subject: Re: Seeking advice for modifying audio input source
>
> Hello W.,
>
> To answer re “drive audio source(1) using audio source(2)” - have you tried
> using AudioUnitAddRenderNotify()
>
> Recently, as a test, I successfully sub-classed and used AVAudioUnitEffect
> with the following init:
>
> - (instancetype)init {
>     AudioComponentDescription component =
>         VEComponentDescriptionMake(kAudioUnitManufacturer_Apple,
>                                    kAudioUnitType_Effect,
>                                    kAudioUnitSubType_PeakLimiter);
>     if ( (self = [super initWithAudioComponentDescription:component]) ) {
>         _audioUnit = self.audioUnit;
>         _didAddRenderNotify =
>             VECheckOSStatus(AudioUnitAddRenderNotify(_audioUnit,
>                                                      VEAVLimiterRenderCallback,
>                                                      (__bridge void * _Nullable)(self)),
>                             "AudioUnitAddRenderNotify");
>     }
>     return self;
> }
>
> …with a render function like:
>
> OSStatus VEAVLimiterRenderCallback(void                       *inRefCon,
>                                    AudioUnitRenderActionFlags *ioActionFlags,
>                                    const AudioTimeStamp       *inTimeStamp,
>                                    UInt32                      inBusNumber,
>                                    UInt32                      inNumberFrames,
>                                    AudioBufferList            *ioData) {
>     if ((*ioActionFlags & kAudioUnitRenderAction_PostRender) && inRefCon) {
>         __unsafe_unretained VEAVLimiter *THIS = (__bridge VEAVLimiter *)inRefCon;
>         // This is essentially a “tap”: here you could do some calculations,
>         // such as determining peak amplitude, then store those values and
>         // use them to do something like “modulate” the “pan” on the
>         // source(1) AVAudioNode.
>     }
>     return noErr;
> }
>
> If I understand you correctly, I think this could work.
>
> Kind regards,
> Leo Thiessen
> http://visionsencoded.com
>
> ------------------------------
>
> End of Coreaudio-api Digest, Vol 14, Issue 30
> *********************************************
