Hi Luther,

I haven't done MIDI on iOS yet, but Mac OS X provides CoreMIDI. CoreMIDI 
handles time stamping for recording and playback, so you don't have to. There 
are a few different ways to use CoreMIDI, and some leave more of the work up 
to your app. Generally, if you're handling all of the data scheduling 
yourself, you merely need to pick a reliable lead time and make sure that 
every MIDI event is handed to CoreMIDI slightly before its time stamp 
requires it to fire. That lead time gives the CoreMIDI API a chance to pass 
the data along to the interface, potentially taking advantage of technology 
such as MIDI Time Stamping (MTS). Even dynamically generated MIDI should work 
flawlessly, without significant latency. Recording is similar, in that all 
the data you receive will carry a time stamp generated by CoreMIDI. The one 
exception to the above is that some software MIDI endpoints do not offer 
automatic scheduling (whereas all MIDI hardware has its scheduling handled by 
CoreMIDI and the OS).
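
For illustration, here is a minimal sketch in C against the OS X CoreMIDI and 
CoreAudio headers, covering both directions: scheduling an outgoing event 
slightly ahead of its time stamp, and reading the time stamps that CoreMIDI 
attaches to incoming data. The 100 ms lead time and the function names are my 
own example choices, not anything CoreMIDI mandates, and the client, port, 
and endpoint setup is assumed to exist already:

#include <stdio.h>
#include <CoreMIDI/CoreMIDI.h>
#include <CoreAudio/HostTime.h>  /* AudioGetCurrentHostTime, etc. (OS X) */

/* Playback: hand a note-on to CoreMIDI about 100 ms before it should
   sound. CoreMIDI (and, where supported, the interface hardware) does
   the final scheduling against the time stamp. */
static void SendNoteOnAhead(MIDIPortRef outPort, MIDIEndpointRef dest)
{
    Byte buffer[64];
    MIDIPacketList *list = (MIDIPacketList *)buffer;
    MIDIPacket *packet = MIDIPacketListInit(list);

    MIDITimeStamp when = AudioGetCurrentHostTime()
                       + AudioConvertNanosToHostTime(100000000ULL); /* 100 ms */

    const Byte noteOn[3] = { 0x90, 60, 100 };  /* ch 1, middle C, vel 100 */
    packet = MIDIPacketListAdd(list, sizeof(buffer), packet,
                               when, sizeof(noteOn), noteOn);
    if (packet != NULL)
        MIDISend(outPort, dest, list);
}

/* Recording: every packet arrives already carrying a CoreMIDI-generated
   time stamp (this would be installed via MIDIInputPortCreate). */
static void MyReadProc(const MIDIPacketList *list, void *refCon,
                       void *srcRefCon)
{
    const MIDIPacket *packet = &list->packet[0];
    for (UInt32 i = 0; i < list->numPackets; i++) {
        printf("packet at host time %llu, %u bytes\n",
               (unsigned long long)packet->timeStamp,
               (unsigned)packet->length);
        packet = MIDIPacketNext(packet);
    }
}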

A quick check of the internet hints that CoreMIDI was added in iOS 4.2, which 
sounds like a long time ago. None of my clients have needed MIDI support 
there, so I don't know the details of whether CoreMIDI differs between OS X 
and iOS.

As Paul Davis has mentioned, CoreAudio provides clocks that allow you to 
figure out what time stamps to apply to outgoing MIDI, so that CoreMIDI will 
align the MIDI events with the audio. The same goes for incoming MIDI: its 
time stamps can be compared against the CoreAudio clocks for reference. This 
is a deep subject, though, because CoreAudio has tons of information about 
the latency between the time stamps in your code and the actual audio events 
on the other side of your audio interfaces. You'll need to take these 
latencies into account if you want perfect alignment between audio and MIDI. 
Thankfully, Apple has done the hard work of making the information available 
to the application, but you need to figure out how to obtain all of it from 
the various sources (buffer delays, interface conversion delays, bus latency, 
channel latency, etc.).
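
As a rough sketch of what that bookkeeping looks like (again C on OS X; the 
helper names are mine, but the CoreAudio calls and property constants are 
real), converting a CoreMIDI time stamp to seconds and fetching one of the 
several latency figures you would need to add up:

#include <CoreMIDI/CoreMIDI.h>
#include <CoreAudio/CoreAudio.h>
#include <CoreAudio/HostTime.h>

/* CoreMIDI time stamps are host-clock ticks; CoreAudio converts them. */
static Float64 HostTimeToSeconds(MIDITimeStamp hostTime)
{
    return (Float64)AudioConvertHostTimeToNanos(hostTime) / 1.0e9;
}

/* One piece of the total output latency: the device's own latency, in
   sample frames. For the full picture you would also query the stream
   latency (kAudioStreamPropertyLatency), the safety offset
   (kAudioDevicePropertySafetyOffset), and account for your I/O buffer
   size. */
static UInt32 DeviceOutputLatencyFrames(AudioDeviceID device)
{
    UInt32 latency = 0;
    UInt32 size = sizeof(latency);
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyLatency,
        kAudioObjectPropertyScopeOutput,
        kAudioObjectPropertyElementMaster
    };
    AudioObjectGetPropertyData(device, &addr, 0, NULL, &size, &latency);
    return latency;
}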

By the way: Welcome! I have found programming CoreMIDI to be very enjoyable, 
having worked with MIDI since its inception. Good luck with your sequencer.

Brian Willoughby
Sound Consulting


On Jul 9, 2015, at 9:08 AM, Luther Baker <[email protected]> wrote:
> I am interested in writing a MIDI sequencer (iOS and eventually Mac) and was 
> wondering, amongst other things, what the general consensus was regarding 
> timers for recording and playback.
> 
> Is there a good open source implementation someone would suggest I read 
> through?
> 
> I'd be curious what folks suggest for storing the MIDI information, as well 
> as suggested data structures and strategies for playback. Are most packages 
> just storing time stamp offsets, or are they creating a 384 PPQN grid ... and 
> if so, any high-level thoughts on how they manage that grid quickly?
> 
> Would it read like a game ... with a game loop leveraging a CADisplayLink 
> timer? Do I need a high-precision timer? etc ... 
> 
> I'm hoping someone wrote a white paper to introduce potential gotchas and 
> hoping someone could lead me in that direction.
> 
> Thanks in advance,
> -Luther
> 
