I'm writing on behalf of a friend who's having trouble dealing with MIDI latency in a soft synth (possibly Yoshimi) ...
Given a JACK period of N frames, the MIDI latency with the original code effectively ranges from N frames to 2 * N frames: events are only picked up once per period, so an event arriving just after a period boundary sits for almost a whole period before it's even looked at, on top of the period it then takes to get through the audio path. I guess that qualifies it as jittery.

So far my friend has tried a few things, but there's no workable solution as yet. What seemed most promising was breaking the audio generation into smaller blocks and applying pending MIDI events between blocks (roughly the shape of the first sketch in the P.S. below). Sadly, that drags the creation and destruction of note objects into the realtime JACK process callback path. Latency improves, but the number of notes you can get away with before it all falls in a heap is significantly reduced.

Getting the destruction of dead notes out of the realtime path is trivial (second sketch below); the creation of new ones is not. Even with a pool of pre-allocated note objects, it seems the amount of initialization code per note is still a real limiting factor on how busy things can get before it all falls apart.

Such is life (for my "friend").

cheers,
Cal
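P.S. For anyone curious, the sub-block approach looks roughly like this. It's a simplified sketch, not the actual Yoshimi code: SynthEngine and its noteOn()/noteOff()/render() methods are made-up stand-ins, SUBBLOCK is an arbitrary internal block size, and the JACK client/port setup is omitted.

/* Sub-block sketch, not the actual Yoshimi code.  SynthEngine and its
 * noteOn()/noteOff()/render() methods are made-up stand-ins, and the
 * JACK client/port setup is omitted. */

#include <jack/jack.h>
#include <jack/midiport.h>
#include <algorithm>
#include <cstdint>

struct SynthEngine {                        /* stand-in for the real synth */
    void noteOn(unsigned char, unsigned char, unsigned char) { /* ... */ }
    void noteOff(unsigned char, unsigned char)                { /* ... */ }
    void render(float *, float *, jack_nframes_t)             { /* ... */ }
};

static const jack_nframes_t SUBBLOCK = 64;  /* internal block size */

static SynthEngine *synth;
static jack_port_t *midi_in, *out_l, *out_r;

static void dispatch(const jack_midi_event_t &ev)
{
    if (ev.size < 3)
        return;
    const unsigned char status = ev.buffer[0] & 0xf0;
    const unsigned char chan   = ev.buffer[0] & 0x0f;
    if (status == 0x90 && ev.buffer[2] != 0)
        synth->noteOn(chan, ev.buffer[1], ev.buffer[2]);    /* note on */
    else if (status == 0x80 || status == 0x90)
        synth->noteOff(chan, ev.buffer[1]);                 /* note off */
}

static int process(jack_nframes_t nframes, void *)
{
    void  *midibuf = jack_port_get_buffer(midi_in, nframes);
    float *left  = static_cast<float *>(jack_port_get_buffer(out_l, nframes));
    float *right = static_cast<float *>(jack_port_get_buffer(out_r, nframes));

    const uint32_t nevents = jack_midi_get_event_count(midibuf);
    uint32_t next = 0;
    jack_midi_event_t ev;
    if (nevents)
        jack_midi_event_get(&ev, midibuf, next);

    /* Render the period in SUBBLOCK-sized slices; before each slice,
     * apply every event whose timestamp falls inside it, so events take
     * effect within SUBBLOCK frames of where they were stamped instead
     * of only at period boundaries. */
    for (jack_nframes_t pos = 0; pos < nframes; pos += SUBBLOCK) {
        const jack_nframes_t len = std::min(SUBBLOCK, nframes - pos);
        while (next < nevents && ev.time < pos + len) {
            dispatch(ev);
            if (++next < nevents)
                jack_midi_event_get(&ev, midibuf, next);
        }
        synth->render(left + pos, right + pos, len);
    }
    return 0;
}

The nice part is that events now take effect within SUBBLOCK frames of their timestamps; the not-so-nice part is that dispatch() ends up starting and stopping notes from inside the process callback, which is exactly the problem described above.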
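P.P.S. The "trivial" half, getting note destruction out of the realtime path, is basically a pair of lock-free single-producer/single-consumer rings: the JACK thread only ever pops pre-built notes and pushes dead ones, while a normal-priority worker thread does the actual tear-down and re-initialization. Again just a sketch with made-up names (Note, grabNote(), retireNote(), workerIteration()), and it only helps with the generic part of the setup; the note-on specific initialization still has to run in the JACK thread, which is where it all bogs down.

/* Sketch of the recycling scheme, again with made-up names.  Two
 * single-producer/single-consumer rings mean the JACK thread never
 * calls new or delete: it pops ready-made notes from one ring and
 * pushes finished ones onto the other. */

#include <atomic>
#include <cstddef>

struct Note { /* the real thing is rather bigger */ };

template <size_t N>                     /* N must be a power of two */
class SpscRing {
    Note *slot[N];
    std::atomic<size_t> head{0}, tail{0};
public:
    bool push(Note *n) {                /* one producer thread only */
        size_t t = tail.load(std::memory_order_relaxed);
        if (t - head.load(std::memory_order_acquire) == N)
            return false;               /* full */
        slot[t & (N - 1)] = n;
        tail.store(t + 1, std::memory_order_release);
        return true;
    }
    Note *pop() {                       /* one consumer thread only */
        size_t h = head.load(std::memory_order_relaxed);
        if (h == tail.load(std::memory_order_acquire))
            return nullptr;             /* empty */
        Note *n = slot[h & (N - 1)];
        head.store(h + 1, std::memory_order_release);
        return n;
    }
};

static SpscRing<256> freshNotes;   /* worker -> JACK thread: ready to use */
static SpscRing<256> deadNotes;    /* JACK thread -> worker: to recycle   */

/* Called once at startup, before the JACK thread is running. */
static void prefillPool()
{
    for (int i = 0; i < 256; ++i)
        freshNotes.push(new Note);
}

/* In the process callback: take a pre-built note instead of new'ing one
 * (nullptr means the pool ran dry, so steal a voice or drop the note),
 * and hand finished notes back instead of deleting them. */
static Note *grabNote()          { return freshNotes.pop(); }
static void  retireNote(Note *n) { deadNotes.push(n); }

/* In a normal-priority worker thread, run periodically: do the expensive
 * tear-down and re-initialization here, then top the pool back up. */
static void workerIteration()
{
    while (Note *n = deadNotes.pop()) {
        *n = Note();                    /* placeholder for the real reset */
        freshNotes.push(n);
    }
}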
