Thanks for the input, Paul, and for the extended response, Brian.

In my case, I very specifically want a songwriting tool - a sequencer that
triggers my external MIDI gear ... and I haven't been happy with the iPad
offerings of late. I'd eventually like to be able to control other synths
on, say, the iPad ... but that isn't my primary goal.

I was a keyboard musician in a past life and have always wanted to create
my own version of an "MPC meets Ableton Live" sequencer, so I'm looking
forward to getting my head around this!

Thanks,
-Luther


On Thu, Jul 9, 2015 at 12:16 PM, Paul Davis <[email protected]>
wrote:

> On Thu, Jul 9, 2015 at 12:21 PM, Paul Davis <[email protected]>
> wrote:
> > On Thu, Jul 9, 2015 at 12:18 PM, Luther Baker <[email protected]>
> > wrote:
> >> I'm pretty naive ... if I plug an MAudio USB keyboard into my iPad and
> >> want to record the MIDI notes ... I guess you're suggesting I'd use the
> >> clock of the provided A/D interface? Is that accessible via Core Audio
> >> or Core Midi?
> >
> > You're going to get a callback from the coreaudio internals to process
> > (generate/consume) audio samples. That's your clock.
>
> I should note that your software design will be significantly
> different depending on whether you plan to work with MIDI only (i.e.
> no interactions with an audio interface at all, no synthesis or audio
> processing) or with audio and MIDI. As Brian notes, in the first case
> you can just read and write MIDI event timestamps. In the latter case,
> the audio interface clock is your master reference and most
> interesting things will happen in the callback that is executed on
> your behalf.
>
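
For anyone reading along later, here is a minimal sketch of the MIDI-only
case Paul and Brian describe, where the MIDI event timestamps themselves are
the time reference. It uses CoreMIDI's classic MIDIReadProc API; the client
and port names, the printf, and the structure of main() are purely
illustrative - a real sequencer would store the timestamped events rather
than print them:

/* Sketch only: the MIDI-only case, no audio interface involved.
 * Each incoming MIDIPacket carries a timeStamp in mach host-time
 * units, and that timestamp is the sequencer's time reference. */
#include <CoreFoundation/CoreFoundation.h>
#include <CoreMIDI/CoreMIDI.h>
#include <mach/mach_time.h>
#include <stdio.h>

static void myReadProc(const MIDIPacketList *pktlist,
                       void *readProcRefCon, void *srcConnRefCon)
{
    mach_timebase_info_data_t tb;
    mach_timebase_info(&tb);

    const MIDIPacket *pkt = &pktlist->packet[0];
    for (UInt32 i = 0; i < pktlist->numPackets; ++i) {
        /* Convert the host-time timestamp to nanoseconds and record
         * the event; a real sequencer would store it, not print it. */
        uint64_t ns = pkt->timeStamp * tb.numer / tb.denom;
        printf("MIDI event at %llu ns, %u bytes\n",
               (unsigned long long)ns, (unsigned)pkt->length);
        pkt = MIDIPacketNext(pkt);
    }
}

int main(void)
{
    MIDIClientRef client;
    MIDIPortRef   inPort;
    MIDIClientCreate(CFSTR("SketchClient"), NULL, NULL, &client);
    MIDIInputPortCreate(client, CFSTR("Input"), myReadProc, NULL, &inPort);

    /* Connect every available source (e.g. a USB keyboard). */
    for (ItemCount i = 0; i < MIDIGetNumberOfSources(); ++i)
        MIDIPortConnectSource(inPort, MIDIGetSource(i), NULL);

    CFRunLoopRun();  /* keep the process alive so the read proc fires */
    return 0;
}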
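
And a sketch of the audio-plus-MIDI case, where the callback CoreAudio runs
on your behalf is the clock: inTimeStamp->mSampleTime plus the frame count
defines the slice of the timeline each callback covers, and MIDI events are
scheduled against that sample position. The Sequencer struct and the
commented-out sendEvent helper are hypothetical; on iOS a callback like this
would typically be attached to a RemoteIO unit via
kAudioUnitProperty_SetRenderCallback:

/* Sketch only: the audio + MIDI case. The render callback CoreAudio
 * executes on your behalf defines the clock; inTimeStamp->mSampleTime
 * plus inNumberFrames is the slice of the timeline this call covers. */
#include <AudioUnit/AudioUnit.h>
#include <string.h>

typedef struct {
    double sampleRate;       /* e.g. 44100.0, from the audio session   */
    double nextEventSample;  /* absolute sample position of next event */
} Sequencer;                 /* hypothetical sequencer state           */

static OSStatus renderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    Sequencer *seq = (Sequencer *)inRefCon;
    double start = inTimeStamp->mSampleTime;
    double end   = start + inNumberFrames;

    /* Emit every MIDI event that falls inside this buffer, expressed as
     * a frame offset so it can be delivered sample-accurately. */
    while (seq->nextEventSample >= start && seq->nextEventSample < end) {
        UInt32 offsetFrames = (UInt32)(seq->nextEventSample - start);
        /* sendEvent(seq, offsetFrames);  -- hypothetical helper */
        (void)offsetFrames;
        seq->nextEventSample += seq->sampleRate;  /* e.g. one event/second */
    }

    /* Not synthesizing audio in this sketch, so just output silence. */
    for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
        memset(ioData->mBuffers[i].mData, 0,
               ioData->mBuffers[i].mDataByteSize);
    return noErr;
}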