Pick up sfront 0.85 -- 10/13/02 at:

http://www.cs.berkeley.edu/~lazzaro/sa/index.html

[1] Mac OS X support for real-time MIDI control, using the -cin
coremidi control driver. Up to four external MIDI sources are
recognized. Virtual sources are ignored; expect virtual source
support in a future release.

[2] Mac OS X memory locking now works in normal user processes, and
is no longer limited to root.

-----

All the changes in 0.85 are OS X specific, but I thought I'd post this
here in case people are curious about OS X porting ...

With this release, all of the real-time examples in the sfront
distribution run under Mac OS X. Specifically, it's now possible to
use OS X as a Structured Audio softsynth -- I've been running my
PowerBook this way with 2ms CoreAudio buffers, with MIDI input from my
controller via an Edirol UM-1S USB MIDI interface and audio output via
the headphone jack on the PowerBook, and everything runs glitch-free.

Also, because audio and MIDI are both virtualized under OS X, it's
possible to run multiple ./sa softsynths in parallel (i.e. from
different Terminal windows) and get usable layering ... although in
most cases, you'd be better off doing your layering inside a single
SA engine.

To see the -cin coremidi control driver in action, run the
sfront/examples/rtime/linbuzz softsynth; it will find external MIDI
sources (up to 4, no virtual source support ...) and use them to
drive the SA program in real-time. In the linbuzz example, the pitch
wheel (set up to do vibrato), mod wheel (spectral envelope), and
channel volume controllers are all active -- you can look at the
linbuzz.saol SAOL program to see how they are used.
The actual CoreMIDI code is in:

sfront/src/lib/csys/coremidi.c

The most interesting aspect of this code is that a single AF_UNIX
SOCK_DGRAM socketpair pipe (named csysi_readproc_pipepair) is used
for communication between an arbitrary number of CoreMIDI readprocs
(one for each active source) and the SA sound engine (which runs
inside the CoreAudio callback -- the actual main thread sleeps and
does nothing).

Writing the pipe is blocking (but should rarely block, and never for
a significant time), while reading the pipe is non-blocking. The
semantics of AF_UNIX SOCK_DGRAM (AF_UNIX is reliable, SOCK_DGRAM
guarantees the messages from the CoreMIDI readprocs don't mix) make
it a good choice for doing the multi-source MIDI merge. Each message
sent down the pipe consists of a preamble identifying the readproc,
followed by the MIDI commands from one MIDIPacket (error-checked for
SA semantics).

At this point, the Linux and OS X real-time implementations support
all of the same features (audio input, audio output, MIDI In, RTP
networking) ... I'm not sure if AudioUnits support makes sense for
sfront; I'll probably take a closer look at the issue soon ...

-------------------------------------------------------------------------
John Lazzaro -- Research Specialist -- CS Division -- EECS -- UC Berkeley
lazzaro [at] cs [dot] berkeley [dot] edu www.cs.berkeley.edu/~lazzaro
-------------------------------------------------------------------------
