off the top, if people are serious
about coming up with a good
architecture, someone should sit
down and codify all the implicit
specification requirements that
are floating around.  this sort of
thing should be done before the
API, to make clear what the goals
are.  discussions of "what if i
want to do _____" will become a
lot clearer.

  i guess the obvious goal is
standardization so that everyone
can use each other's plugins.  my
first question is then, "what is
a plugin?".  more specifically....

there seem to be different ideas
about the processing granularity
plugins should support.  on one side
is a view of plugins as little bits
of equipment you might have on the
rack in the studio, each with a set
of knobs and audio in and out.
another conception leans towards
plugins on the scale of
fundamental dsp operators, things
like FFT, derivative, convolution,
etc.

*** it appears to me this distinction
hinges on whether we want plugins to
generate control signals which drive
other plugins ***

i think once this granularity question
is determined, a lot of other answers
will settle out for free. 

______________________________________

for large grain: 

  metaphor is plugging together hardware 
  in the studio.

  the questions about FFTs and cyclic graphs
  are probably unimportant since that sort 
  of thing wouldn't ever be done.

  specification of the control signals
  probably only needs to take place between
  developers rather than at the runtime level.
 
  xml descriptions of control signals are
  only of interest once you get up inside
  the gui layer.  they are useful for things
  like pluggable look-and-feels etc.  the
  connection between plugins and the gui
  layer probably shouldn't be burdened with
  this.  a particular host might like xml
  descriptions of the plugins it uses.  since
  the values don't really mean anything
  outside the context of the plugin, there
  is no reason they shouldn't be scaled 0-1.

  one thing that seems to trouble people
  is plugins which generate status signals.
  my take on this is that the gui layer
  would have to manage polling the plugin
  in the same way that somewhere in the
  X event system, someone goes out and
  polls/services the mouse.  this may not
  mesh nicely with GTK, but i think the
  level of abstraction for a plugin should
  be the same as that for a mouse... sitting
  somewhere down close to the hardware in
  the realtime domain.


--------------------------------------------

for fine grain:

  metaphor is some pictures from your
  college dsp textbook

  being able to use the output of one
  plugin as the control signal for another
  becomes very important.  maybe the
  results of an FFT or tree-structured
  filter bank are control signals which
  affect a synthesis bank?  things like
  implementing reverb using circular
  connections become necessary.

  xml descriptions of control signals are
  not particularly useful.  you would probably
  rather have strongly typed data.

--------------------------------------------


my impression is that the large
granularity case is a lot more
interesting.  lower level dsp 
code is better shared as a function
library than kludged out of plugins.
if you want a modular synth or a
reverberation model, build it and 
feed its output into a network of
other plugins.  i would want my 
audio workstation to emulate an entire
studio rather than a particular
dsp algorithm.

someone is going to tell me that
they want to implement a system
which supports both granularities.
this strikes me as a mistake. 
yes it's probably possible to make
a totally generic API which will
let you interconnect anything and
everything just like your old
modular synth.  however, what host
is ever going to actually want to
support a true mixture?  aiming
for something totally generic is
a noble cause but such projects
collapse under their own weight
or never actually get used because
they have absurd learning curves
etc.

sorry this got so long.  i'm new 
to the list so slap me if i'm out 
of line ..... my interests lie 
towards a system with a GUI which 
is actually a bunch of hardware 
knob boxes i can use to control
a realtime, linux-based sound 
engine in a live performance 
situation.



regards,
charless
