>> To: CoreAudio API <coreaudio-api@lists.apple.com>
>> Subject: Re: Channel Specific MIDI Reverb with an AUGraph
>> Message-ID: <8c7920c8-37bd-4c0b-b8c5-c3bb681ae...@me.com>
Right - MIDISynth is designed for basic 16-channel GM use, minus reverb and
chorus. Channel-by-channel functionality and FX can be handled with individual
instances of the Sampler AU (at the C-language layer) or the AVAudioUnitSampler
(at the Objective-C layer), as described here.
-DS
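
A minimal sketch of the per-channel approach described above: one AVAudioUnitSampler per MIDI channel, each feeding its own reverb, all summed at the main mixer. Instrument loading is omitted, and the preset and mix values are illustrative, not a recommendation.

```swift
import AVFoundation

let engine = AVAudioEngine()
var samplers: [AVAudioUnitSampler] = []

for _ in 0..<16 {
    let sampler = AVAudioUnitSampler()
    let reverb = AVAudioUnitReverb()
    reverb.loadFactoryPreset(.mediumHall)
    reverb.wetDryMix = 30  // 0-100; per-channel reverb amount

    engine.attach(sampler)
    engine.attach(reverb)
    engine.connect(sampler, to: reverb, format: nil)
    engine.connect(reverb, to: engine.mainMixerNode, format: nil)
    samplers.append(sampler)
}

do {
    try engine.start()
    // Dispatch incoming MIDI by channel: events on channel n go to samplers[n],
    // e.g. a note-on routed to the sampler for channel 3:
    samplers[3].startNote(60, withVelocity: 100, onChannel: 0)
} catch {
    print("Engine failed to start: \(error)")
}
```

Because each channel owns its sampler node, per-channel effect chains (reverb, chorus, anything else) drop in between the sampler and the mixer without affecting the other channels.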
Timothy,
One minute definitely sounds WAY too long.
I cannot imagine the V3 API has a new implementation from scratch; it probably
encapsulates the V2 C API.
What you could do is write a quick & dirty test app, build your graph
using AVAudioEngine and AVAudioUnitSampler, and load your ...
Thanks,
Tim
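
A quick test app along those lines could attach a single sampler, load a sound bank, and time the load to see where the minute is going. The path in `bankURL` is a placeholder; 0x79/0x00 are the standard melodic bank MSB/LSB values for the Apple sampler (kAUSampler_DefaultMelodicBankMSB / kAUSampler_DefaultBankLSB).

```swift
import AVFoundation

let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()
engine.attach(sampler)
engine.connect(sampler, to: engine.mainMixerNode, format: nil)

let bankURL = URL(fileURLWithPath: "/path/to/bank.sf2")  // placeholder path
let start = Date()
do {
    try sampler.loadSoundBankInstrument(at: bankURL, program: 0,
                                        bankMSB: 0x79, bankLSB: 0x00)
    print("Load took \(Date().timeIntervalSince(start)) s")
} catch {
    print("Load failed: \(error)")
}
```

If the AVAudioEngine-based load is fast, that would point at the V3 wrapping layer rather than the underlying sampler itself.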
Date: Wed, 21 Feb 2018 19:20:19 +0100
From: Sven Thoennissen <bioch...@me.com>
To: CoreAudio API <coreaudio-api@lists.apple.com>
Subject: Re: Channel Specific MIDI Reverb with an AUGraph
Message-ID: <8c7920c8-37bd-4c0b-b8c5-c3bb681ae...@me.com>
Content-Type: text/plain
Hello,
I can confirm your findings; I could not find a way to accomplish reverb with
MIDISynth. In my app I use an AVAudioUnitSampler for each MIDI channel. This
solution works fine for me, especially since my app needs to connect 3rd-party
AU extension effects. Even if MIDISynth supported reverb