It's actually quite encouraging, as it's a method for finding and
invoking functionality within an app that is centred around using the
keyboard.
That's in stark contrast to the direction most operating systems and
devices have taken lately, where pointing and clicking, or tapping on a
touch screen, is the order of the day.
And as it's built on existing toolkits like GTK and Qt, hopefully it
won't be too bad from an accessibility point of view.
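From the description, the HUD sounds like a fuzzy text search over an
application's menu entries. Here's a minimal Python sketch of that idea;
the menu labels are made up, and a real implementation would pull them
from the app's exported menus rather than a hard-coded list:

    # Rough sketch of HUD-style fuzzy matching over menu entries.
    # The labels below are illustrative; nothing here is the actual HUD code.
    import difflib

    MENU_ITEMS = [
        "File > New Window",
        "File > Save As...",
        "Edit > Preferences",
        "View > Zoom In",
        "Tools > Check Spelling",
    ]

    def hud_search(query, items=MENU_ITEMS, limit=3):
        """Rank menu entries by similarity to the typed query."""
        return sorted(
            items,
            key=lambda item: difflib.SequenceMatcher(
                None, query.lower(), item.lower()
            ).ratio(),
            reverse=True,
        )[:limit]

    print(hud_search("spelling"))  # 'Tools > Check Spelling' ranks first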
I'm not that familiar with OS X; how do Speakable Items work? The blog
post does mention integrating voice control into the HUD.
On 24/01/12 21:24, Dave Hunt wrote:
I'm more curious than worried; just wanted to pass this along. Thanks
for your thoughts on how it may work. One can already search for parts
of the gnome-control-center app from Unity and the GNOME Shell; not
sure about other apps. For instance, if you search for "keyboard" in
the GNOME Shell or Unity, you'll get the "keyboard" page from the
control center as a search result. Pretty cool, actually, if you don't
know what the app is called, or where it is. So, if the HUD is just
taking this kind of search ability further, that's interesting.
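As a toy illustration of that kind of shell search, here's a simple
substring filter over a hard-coded list of panel names; the real shell
gathers results from the applications themselves, so treat this purely
as a sketch of the behaviour described above:

    # Toy version of searching control-center panels by name.
    # The panel list is hard-coded here purely for illustration.
    PANELS = [
        "background", "display", "keyboard", "mouse and touchpad",
        "network", "sound", "universal access",
    ]

    def search_panels(query):
        """Return panels whose names contain the query, case-insensitively."""
        q = query.lower().strip()
        return [name for name in PANELS if q in name]

    print(search_panels("keyboard"))  # ['keyboard']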
Now, if we could make something analogous to the "speakable items" in
OS X, that would be great.
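For what it's worth, a speakable-items analogue could be little more
than a table from recognized phrases to actions. A hedged sketch,
assuming some speech recognizer (not shown) hands us the phrase as
plain text; the phrases and commands below are purely illustrative:

    # Sketch of a "speakable items"-style dispatch table. The recognizer
    # itself is out of scope; assume it calls on_phrase() with plain text.
    import subprocess

    # Illustrative phrase -> command table; not anything OS X actually ships.
    COMMANDS = {
        "open terminal": ["gnome-terminal"],
        "lock screen": ["gnome-screensaver-command", "--lock"],
    }

    def on_phrase(phrase):
        """Run the command mapped to a recognized phrase, if any."""
        argv = COMMANDS.get(phrase.strip().lower())
        if argv is None:
            print("No speakable item for: %r" % phrase)
            return
        try:
            subprocess.Popen(argv)
        except FileNotFoundError:
            print("Command not installed: %s" % argv[0])

    on_phrase("open terminal")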
Cheers,
Dave Hunt
I tweet as wx1gdave
Voice chat on sip:[email protected]
On 01/24/2012 04:09 PM, Paul Hunt wrote:
Interesting.
Well, I don't think it's anything to worry about for now, from a blind
user's perspective.