So it is back on list, good.
Something I can make available for bristol in the short term is a VI-like
interface: navigate left/right through the synth parameters with h/l and
change their values down/up with j/k (with shift key accelerators and
control key decelerators). The nice thing about this interface is that
parameter control can be done quickly with one hand whilst the other plays
the piano; a readline interface might need both hands on the QWERTY to
change a value.
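To make that a bit more concrete, the dispatch really is no more
complicated than something like the following rough sketch - this is NOT
the actual brighton code, and the parameter table and step sizes are
invented purely for illustration:

/*
 * Sketch of a VI-style parameter editing loop: h/l select a parameter,
 * j/k change its value, shift accelerates, control decelerates.
 */
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

#define NPARAMS 4

static const char *names[NPARAMS] =
	{ "cutoff", "resonance", "attack", "release" };
static float values[NPARAMS] = { 0.5f, 0.1f, 0.2f, 0.3f };
static int cur = 0;

/* one short line of text per change: parameter name and value */
static void show(void) { printf("%s = %.3f\n", names[cur], values[cur]); }

int main(void)
{
	struct termios saved, raw;
	int c;

	/* raw-ish mode so single keystrokes arrive without Enter */
	tcgetattr(STDIN_FILENO, &saved);
	raw = saved;
	raw.c_lflag &= ~(ICANON | ECHO);
	tcsetattr(STDIN_FILENO, TCSANOW, &raw);

	while ((c = getchar()) != EOF && c != 'q') {
		float step = 0.01f;	/* default step size */

		switch (c) {
		case 'h': cur = (cur + NPARAMS - 1) % NPARAMS; show(); break;
		case 'l': cur = (cur + 1) % NPARAMS; show(); break;
		case 'K': step = 0.1f;	/* shift accelerates, fall through */
		case 'k': values[cur] += step; show(); break;
		case 'J': step = 0.1f;	/* shift accelerates, fall through */
		case 'j': values[cur] -= step; show(); break;
		/* ctrl-k decelerates; ctrl-j (0x0a) is the same byte as
		 * Enter, so the real code has to be a little smarter there */
		case 0x0b: values[cur] += 0.001f; show(); break;
		case 0x0a: values[cur] -= 0.001f; show(); break;
		}
	}

	tcsetattr(STDIN_FILENO, TCSANOW, &saved);
	return 0;
}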
Each change on these keys results in a short line of text output giving the
parameter name and value, plus the audible signal output from the synth
itself. I have this operational already for bristol, although I would give
it a couple of days to clean up some pretty gruesome edges (for example, at
the moment you need two terminals, one for the GUI and one for the engine,
to get it to work), but I can clean them up a lot by sometime next week.
The code does not need X11 to be active to work, but it does still need
X11-dev to compile and the X11 runtime libraries to operate. It kind of
cheats and presents a totally dummy window to the GUI - it thinks it has a
window open, which I need because of the signal paths used to set
parameters from the GUI into the engine.
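If it helps to picture the cheat, the shape of the idea is roughly this -
a caricature only, the real brighton structures are rather more involved:

#include <stdlib.h>

/*
 * The GUI layer is handed a window structure that looks real enough for
 * the parameter signal paths to keep working, but nothing is ever
 * requested from the display.
 */
typedef struct {
	unsigned long id;	/* would normally be a server-side window id */
	int width, height;
	int mapped;		/* never set, so nothing is ever rendered */
} dummyWindow;

static dummyWindow *openDummyWindow(void)
{
	dummyWindow *w = calloc(1, sizeof(dummyWindow));

	if (w == NULL)
		return NULL;

	w->id = 1;		/* any non-zero placeholder handle will do */
	w->width = w->height = 1;

	return w;		/* the GUI code is none the wiser */
}

int main(void)
{
	dummyWindow *w = openDummyWindow();

	/* ...hand w to the GUI parameter paths here... */

	free(w);
	return 0;
}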
Of course, if you don't like VI this is going to be a bit of a pain in the
butt - I love it, which is why I took it as a starting point. Medium term I
can have a mapping file for the keys you want to use.
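That file would not need to be anything clever - purely hypothetically
(none of these names exist yet), a few lines of 'action key' pairs:

	# hypothetical key mapping file
	paramPrev   h
	paramNext   l
	paramDown   j
	paramUp     k
	memorySave  s
	memoryLoad  r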
I am not sure I could code another audible tone to represent the value of a
parameter. The issue is that the audio comes from the engine emulator, but
the GUI or CLI interface is to brighton; to present an audible signal of
the parameter value I would have to rework the engine as well. It is
possible, but not in the short term.
This can be a prototype for people to test next week; after that we can
work on the necessary refinements. The code is only for the Prophet-1
emulator at the moment. Memory save/load options can also be made
operational. If people really object to VI I can work more on the arrow
navigation keys, but I would have to configure a very raw interface to get
that.
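For reference, the extra rawness is because arrow keys arrive as multi-byte
escape sequences rather than single characters (up-arrow on a typical
terminal is the three bytes 0x1b 0x5b 0x41, i.e. ESC [ A), so the key
reader from the sketch above would have to grow into something like this -
again only an illustration:

#include <stdio.h>

enum { KEY_UP = 1000, KEY_DOWN, KEY_RIGHT, KEY_LEFT };

/*
 * Read one keypress, decoding the common arrow key escape sequences.
 * Assumes the terminal is already in raw mode as in the earlier sketch;
 * a real implementation would also want a read timeout so a lone ESC
 * does not block.
 */
int readKey(void)
{
	int c = getchar();

	if (c != 0x1b)
		return c;			/* ordinary single character */

	if (getchar() == '[')
		switch (getchar()) {
		case 'A': return KEY_UP;	/* ESC [ A */
		case 'B': return KEY_DOWN;	/* ESC [ B */
		case 'C': return KEY_RIGHT;	/* ESC [ C */
		case 'D': return KEY_LEFT;	/* ESC [ D */
		}

	return 0;				/* something we do not handle */
}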
Regards, nick.
"we have to make sure the old choice [Windows] doesn't disappear”.
Jim Wong, president of IT products, Acer
> Date: Fri, 9 Apr 2010 11:54:34 +0200
> From: [email protected]
> To: [email protected]
> CC: [email protected]
> Subject: Re: [LAD] Interface development for the blind (starting from
> Bristol)
>
> Hello Lorenzo!
> Using custom tones to mimic movement of sliders or to mark access to an
> element (button, list, etc.) is not completely off. I mean, graphical
> apps use a lot of signalling sounds, and so do some text based apps. Yet
> in the context of audio software I think it's not very feasible, as my
> ears would be attuned to the audio I'm working on, and in any case a
> realtime app should perform some change when I change a slider or push a
> button.
> I hope that helps
> Kindly yours
> Julien
>
> --------
> Music was my first love and it will be my last (John Miles)
>
> ======== FIND MY WEB-PROJECT AT: ========
> http://ltsb.sourceforge.net
> the Linux TextBased Studio guide
> ======= AND MY PERSONAL PAGES AT: =======
> http://www.juliencoder.de
> _______________________________________________
> Linux-audio-dev mailing list
> [email protected]
> http://lists.linuxaudio.org/listinfo/linux-audio-dev
_______________________________________________
Linux-audio-dev mailing list
[email protected]
http://lists.linuxaudio.org/listinfo/linux-audio-dev