I've been thinking about the recent "programs-as-plugins vs IPC"
discussion, and especially how LADSPA fits into the big picture. As
I see it, most audio apps work like this:

0...n inputs --> { audio app internals } --> 0...n outputs

Inputs and outputs are links to the outside world (files, soundcards,
etc.), while everything else happens inside the app. The current LADSPA
design is best suited to internal processing: simple, easily usable
building blocks driven by the host app.
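To make the "simple building block" idea concrete, here's a rough sketch of what such an internal processing unit looks like from the host's side. All the names here are hypothetical, this is not the actual ladspa.h interface, just the general shape of it: the host owns the buffers, wires them up, and calls a process callback per block.

```c
#include <stddef.h>

/* Hypothetical building-block plugin: the host connects the buffers
 * and control values, then calls plugin_process() once per block.
 * Names are illustrative, not the real ladspa.h declarations. */
typedef struct {
    const float *in;   /* host-owned input buffer  */
    float       *out;  /* host-owned output buffer */
    float        gain; /* a single control value   */
} plugin_t;

static void plugin_process(plugin_t *p, size_t frames)
{
    for (size_t i = 0; i < frames; i++)
        p->out[i] = p->in[i] * p->gain;
}
```

The point is that the plugin itself does no i/o at all; it only transforms buffers the host hands to it.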

As for the "other plugin API", I don't see it as a competitor to (or
replacement for) LADSPA, but rather as a different kind of API. It
would be used at the "-->" points in the figure above.
In other words, it would be an interface that simulates IPC
between audio apps using function callbacks - running programs
as plugins.
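One way to picture this: the standalone app's read-process-write loop gets inverted into a small set of callbacks that the host drives at each "-->" point. The following is only a sketch under that assumption; every name in it is made up for illustration.

```c
#include <stddef.h>

/* Hypothetical "program as plugin" interface: what used to be the
 * app's own main loop becomes callbacks the host calls. */
typedef struct {
    void *(*open)(void);                             /* was startup code       */
    void  (*process)(void *st, const float *in,
                     float *out, size_t frames);     /* was the inner i/o loop */
    void  (*close)(void *st);                        /* was cleanup            */
} app_plugin_t;

/* A trivial pass-through "app" implementing the interface. */
static void *pass_open(void) { return NULL; }
static void  pass_process(void *st, const float *in, float *out, size_t n)
{
    (void)st;
    for (size_t i = 0; i < n; i++)
        out[i] = in[i];
}
static void  pass_close(void *st) { (void)st; }

static const app_plugin_t pass_through = { pass_open, pass_process, pass_close };

/* Host side: push one block of audio through a "program". */
static void host_run_block(const app_plugin_t *p, const float *in,
                           float *out, size_t frames)
{
    void *st = p->open();
    p->process(st, in, out, frames);
    p->close(st);
}
```

In a real design open()/close() would of course bracket the whole session, not a single block; the sketch just shows who calls whom.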

Most of the "advanced" topics we've discussed - multichannel support,
generic audio formats, converters, custom GUIs, etc. - fit more
naturally into this design.

Another thing I've found interesting is the close relation between
the soundcard<-->app and app<-->app interfaces. I've already spent a
lot of time talking about the ALSA pcm-loopback API, but we should
also remember the normal pcm API. For instance, the ALSA pcm API
already has solutions for many of our current design problems:
multichannel data, multiple audio formats, latency, etc. I think we
should take advantage of this. After all, the ALSA and OSS APIs are
familiar to us all. And, of course, if I had to make a plugin version
of my standalone xxx softsynth, it's the soundcard i/o routines I'd
have to replace with plugin callbacks.

I'm sorry if this sounds a bit incoherent, but I wanted to write 
this down somewhere... :)

-- 
Kai Vehmanen <[EMAIL PROTECTED]> ---------------- CS, University of Turku .
 . audio software for linux ... http://www.eca.cx                .
 . armchair-tunes mp3/wav/ra .. http://www.wakkanet.fi/sculpcave .
