On 29 October 2012 00:05, D. Michael McIntyre <[email protected]> wrote:

> On 10/28/2012 02:10 PM, Chris Lyon wrote:
>
> >     GMIDI Monitor) I see the MIDI info echoing on the MIDI outputs of
> >     Rosegarden on channels 15 or 16 and can see no way of actually
> >     selecting those specified channels. I presume they are a component
> >     of the Instrument definition but can't see where the mapping to MIDI
> >     channel is made.
>
> Mapping to MIDI channels depends on the setting of Channel under
> Instrument Parameters in very recent versions of Rosegarden.  If set to
> "fixed" then the channel maps 1:1 with the instrument number.
> Instrument #16 uses channel 16.  If set to "auto" then you never know
> what channel a given segment on a given track is going to use, as
> channels are allocated and freed on a revolving, as-needed basis.  Less
> recent versions of Rosegarden just had the "fixed" channel arrangement
> with no option to float them.
>
> In the short term, getting to channels 15 and 16 is a matter of using
> instruments #15 and #16 and setting them to use the "fixed" channel.
>
> In the broader picture, this is the second report of something to do
> with MIDI thru routing and channel 16.  This really gets into territory
> where we need a developer who has a complex MIDI setup with exotic
> hardware.  Pretty much all of us have a General MIDI mindset, and I
> myself have never had anything more complex than one controller keyboard
> and one GM-capable sound module.
>
> I just don't have the experience to relate to any of these problems, and
> I don't have the equipment to test any possible solutions.  It's all
> beyond me, and Rosegarden's usability for MIDI "power users" probably
> suffers greatly as a result.
>
> There just isn't anything I can do about that but leave it for someone
> else to figure out, and hope they do.  I don't think anybody else around
> here has much more than I do to draw upon in the way of hands on
> experience.  If anybody is in a better position than me to at least have
> an opinion and some insight into any of this on the
> development/internals side, it's probably Tom Breton.  He's the guy who
> developed the automatic channel floating "logical instruments" idea.
> I'm afraid I'm responsible for laying out a lot of his requirements, and
> while he more than satisfied them, the requirements probably weren't
> quite right, because I have no idea how any of the more exotic MIDI
> stuff works.
>
> I'm more than willing to see all your higher end requirements satisfied,
> I simply have nothing to draw upon trying to make that happen with my
> own two hands.
> --
> D. Michael McIntyre
>
Thank you for the swift reply.

I have always been suspicious of buttons marked Auto, so what you say
doesn't surprise me :)
I have the Rosegarden Companion book, and you indicate there that this is an
area of Rosegarden that needs attention. Perhaps I can provide some of
that? I've not contributed to open source projects before, and my expertise
is more in the area of Python, but I do have some faintly esoteric MIDI
bits and pieces and enough software knowledge to be faintly dangerous!
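If I've followed your explanation correctly, "fixed" is a 1:1 instrument-to-channel mapping, while "auto" hands channels out from a pool as segments need them and takes them back afterwards. Here is a toy sketch of my mental picture (all names are my own invention, not Rosegarden's actual internals):

```python
# Toy model of the two channel-assignment modes as I understand them.
# This is NOT Rosegarden code; it is just my mental picture of the idea.

class FixedChannels:
    """'Fixed': instrument N always plays on MIDI channel N."""
    def channel_for(self, instrument: int) -> int:
        return instrument  # instrument #16 -> channel 16

class AutoChannels:
    """'Auto': channels are allocated and freed on a revolving, as-needed basis."""
    def __init__(self, n_channels: int = 16):
        self.free = list(range(1, n_channels + 1))  # channels not yet in use
        self.in_use = {}                            # segment id -> channel

    def acquire(self, segment_id: str) -> int:
        ch = self.free.pop(0)            # take the next channel in rotation
        self.in_use[segment_id] = ch
        return ch

    def release(self, segment_id: str) -> None:
        self.free.append(self.in_use.pop(segment_id))  # goes to the back of the queue

fixed = FixedChannels()
auto = AutoChannels()
first = auto.acquire("seg-a")    # channel 1
auto.release("seg-a")
second = auto.acquire("seg-b")   # channel 2 - rotation continues, so you
                                 # "never know" which channel a segment gets
```

Which, if it's anywhere near right, explains exactly why an "auto" segment can end up anywhere, and why I couldn't find a place to pin one to channel 15 or 16.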

Perhaps, as a user with subtly different requirements, I might be able to
make suggestions on how I would like things to work? Certainly the idea of
MIDI thru' is relevant. Most decent MIDI kit offers the option of turning
off local MIDI support: effectively, the virtual MIDI lead from the
keyboard is disconnected from the sound modules, and the device responds
as two separate devices. Very confusing for the beginner (why is my
keyboard not working?) but very useful in complex situations. If this
conceptual function is supported, then the 'natural' response would be for
the input MIDI data to be echoed straight to the output ports. The
confusion then becomes: how does a specific output port know that it is
being addressed? That information would really need to be extracted
from the instrument or device definitions.
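In other words, the thru echo I have in mind could be as little as a routing table, built from the instrument/device definitions and consulted once per incoming event. A sketch of that idea (all names invented, just to make the point concrete):

```python
# Sketch of a MIDI-thru router: each incoming event is echoed to an output
# port looked up from the instrument/device definitions. Names are invented
# for illustration; this is not any real Rosegarden or ALSA API.

def route_thru(event, routing_table, default_port="out-1"):
    """Return the output port for an event's channel, falling back to a default."""
    return routing_table.get(event["channel"], default_port)

# channel -> output port, as derived from the device definitions
routing = {15: "nord-port", 16: "synth-port"}

ev = {"type": "note_on", "channel": 15, "note": 60, "velocity": 100}
port = route_thru(ev, routing)            # "nord-port"
fallback = route_thru({"channel": 3}, routing)  # unknown channel -> "out-1"
```

The interesting part is entirely in how that table gets populated; the echo itself is trivial once the mapping exists.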

I understand the desire for a complete separation of the various objects
involved, but Rosegarden pushes this to the extreme. The Event list and the
MIDI display in the transport both show notes and velocity, but give no
indication of channel or of which output port they are routed to.
Presumably, again, this is because they don't actually have any knowledge
of the final destination, but it does make MIDI fault-finding
particularly difficult once one takes into account the possibly differing
requirements when the sequencer is in Play, Record, Stop and other modes.

Certainly trying to establish some form of 'default' MIDI setup is almost
doomed to fail, but in the same way that the different possibilities of
ALSA, JACK et al. lead to a great variety of configurations, the MIDI
implementation would probably benefit from a staged approach. The
possibility of transmitting a raft of System Exclusive data is certainly
something power users will use, but it often ends with MIDI devices
constantly resetting to a default patch (in my case this just happens to
be a sequence on my Nord, which would be most embarrassing in a live
situation). A bare-bones implementation that only handles Note On/Off
and similarly simple messages might actually be of great value, because
if your mental picture of what is going on and the actuality are at
variance, you don't get surprised!
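The bare-bones stage could literally be a whitelist filter in front of the output: only the simple message types pass, and SysEx never reaches the device to reset it. A sketch, again with names of my own invention:

```python
# Sketch of a bare-bones output stage: pass only Note On/Off and similarly
# simple messages, so SysEx (and its patch-resetting side effects) is never
# sent. Purely illustrative; not Rosegarden's actual message handling.

SIMPLE_TYPES = {"note_on", "note_off", "pitch_bend"}

def filter_simple(events):
    """Keep only messages whose type is on the whitelist."""
    return [e for e in events if e["type"] in SIMPLE_TYPES]

stream = [
    {"type": "note_on", "note": 60, "velocity": 100},
    {"type": "sysex", "data": [0x7E, 0x7F]},   # dropped: would reset the patch
    {"type": "note_off", "note": 60, "velocity": 0},
]
safe = filter_simple(stream)   # the sysex message is gone
```

A later stage could widen the whitelist as the user opts in, which is roughly what I mean by a staged approach.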

I realise I'm probably adding ten or so dialogs here, with all the
associated development dances, and as ever it's easy to design if you're
not involved in the code wrangling, but having spent a fair bit of time
reading around Rosegarden I get the impression that this area has
dissuaded several users from the software. Now, if the application only
addresses the requirements of the developers, then that is really nothing
more than the issue that surrounds a lot of open source code. But
personally I believe you have a strong enough brand to coalesce some of
these features, rather than watching the development of a set of
surrounding tools that end up with several different ways of solving what
is actually the same problem.

I'm working with a flautist and a violinist at the moment, trying to play
one or two seventies and eighties pieces (Lament by Gryphon; Cutting
Branches for a Temporary Shelter by the Penguin Cafe Orchestra; Three
Friends by Gentle Giant), and both of them are hungry for notated score
with guitar accompaniment. These are mildly complicated exercises in time
signature and rhythm, and the ability to turn around accurately controlled
scores is of great benefit. The biggest problem is that when one of them
wants to quickly demonstrate a small passage on a keyboard, there is a
fairly major dance to ensure what they hear is actually what they want to
hear. At present they view Rosegarden with trepidation, altho' they both
sense the available power.

So there you go, another lump of opinion from out of the blue. All I can do
is offer some help. Please don't feel offended by this post; it's not
written without a little consideration, and I am using the code (version
12.04, build key bc105e54d6, which I compiled myself on Ubuntu Studio
rather than loading the distribution version). If there is some
documentation I could do to help this process then please don't hesitate to
ask. I have a little time at the moment, so it might actually happen, and
if you'd like to test something against my MIDI rig then, once again,
please don't hesitate to ask.

Chris Lyon



_______________________________________________
Rosegarden-user mailing list
[email protected] - use the link below to unsubscribe
https://lists.sourceforge.net/lists/listinfo/rosegarden-user
