On Monday 16 March 2009, Luis Garrido wrote:
> Hi!
>
> I have been investigating how to convert qgiged into an InstrumentEditor.
>
> The problem I see with the API right now is that IE is very limited in
> its functionality, probably as a consequence of its delicate
> interaction with the sampler process.

Yeah, the interaction on instrument resources in general, especially but not
only with instrument editors, is a bit delicate. And the
current "InstrumentEditor" base class is not a clean design made from
scratch either. It grew step by step with our rising needs for certain
live features in gigedit. Our first priority was to get the mandatory
gigedit features done ASAP; a clean design came second. But of course we're
open to cleaning things up.

> However these limitations get in the way of a comfortable and
> sophisticated user experience.
>
> I'll give an example. As things stand now, if you open a gigedit
> plugin from qsampler and, without closing it, load another instrument
> in the sampler strip the editor status becomes inconsistent.

In which way does it become inconsistent? Internally, LinuxSampler has a
class called "InstrumentResourceManager". Sampler channel strips AND
instrument editor instances act as "InstrumentConsumer"s and request the
respective instrument(s) from that singleton InstrumentResourceManager class,
which loads the instruments (and files) on their behalf and returns them a
pointer to the loaded instrument. The consumers (channel strips and
instrument editors) inform the manager when they no longer need the
respective instrument, and when nobody needs an instrument anymore, the
manager frees it from memory (unless the user modified this policy, since
the user can also request that an instrument always be kept in memory). The
manager also informs all consumers when a shared instrument gets modified or
updated (e.g. reloaded from disk).

So when you open an instrument on a channel strip, open it with an instrument 
editor and then load another instrument on that channel strip, the editor 
still uses the instrument previously active on the selected channel strip. 
But that's intentional. The instrument editor is opened upon the selected 
instrument, not upon the selected channel strip.

The only inconsistency that currently exists in this case is that the
virtual keyboard of the instrument editor still sends note on/off events to
that sampler channel strip, even though it's not using that instrument
anymore. You can see that as a minor bug, which we haven't fixed yet due to
other priorities.

> This is because the IE class is too encapsulated and there is no
> communication path with the main application. There are 4 main actors
> here:
>
> 1) The sampler.
> 2) The GUI controlling the sampler (qsampler.)
> 3) The editor object, arbitrating the access to the instrument loaded
> in the sampler at the time of its creation.
> 4) The GUI controlling the editor.
>
> 1 & 2 are tightly coupled, same as 3 & 4. The problem lies in the
> interaction between both groups.
>
> One could design some external mechanism to connect 2 and 4 (OSC,
> dbus...) but the identification and registration of newly spawned
> plugins with the sampler GUI would be complicated.

Why would you want to connect the sampler frontend application and the editor 
GUI? What are the events you have in mind which should be sent between the 
frontend applications and the instrument editors?

> There is also the fact that the editor GUI creation, which is a heavy
> task, must be done by the editor object. It would be much more
> convenient if different editors could reuse a GUI by rerouting signals
> between them than have every editor spawn its own GUI.

I think that's a bit out of the scope of the sampler. If you want an audio GUI 
toolkit (like graphical wave editor widgets, loop editor widgets, envelope 
editor widgets, etc.) to be shared among other applications, e.g. other 
instrument editor applications or sampler frontend applications, you could 
bundle that toolkit as a separate C++ library, independent from the sampler. 
It would be more useful for other projects that way as well. And I don't see 
an advantage in incorporating such a toolkit e.g. into liblinuxsampler.

And even if we decided to incorporate such an audio GUI toolkit directly into 
the sampler: what would it be based on? GTK? Qt? Plain X11? Native Windows / 
OSX, ... calls? For many people that's almost a kind of ideology. Some 
people won't touch anything based on Qt, some anything based on Gtk, and 
some don't like anything that has Java under the hood, and so on.

At the beginning of this project (around 2002 or 2003) we also had a long 
discussion about probably having our own GUI toolkit that could be reused for 
several sampler format GUIs. But such a GUI toolkit is a huge project on its 
own. Don't underestimate it!

> 1) For applications using liblinuxsampler extend
> LinuxSampler::InstrumentManager with something like
> ConnectInstrumentToEditor(instrument_id_t, *InstrumentEditor). That
> way an application can create and control its own IE and use it to
> access the instrument memory structure. This is, of course,
> disregarding LSCP.

You mean for allowing a frontend application to assign another instrument to 
an already opened instrument editor?

CU
Christian

_______________________________________________
Linuxsampler-devel mailing list
Linuxsampler-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/linuxsampler-devel