Re: On Braille and Linux

2022-04-04 Thread Samuel Thibault
Frank Carmickle, on Mon, 04 Apr 2022 10:02:26 -0400, wrote:
> Seems like an approach similar to the way keystrokes get passed from the
> keyboard to the screen reader, as well as to the other applications, should
> be put in place?

Concerning the protocol by itself (X or Wayland), there can be an
extension that defines events that applications receive, yes.

Then comes the question of letting a screen reader catch some of the
events (screen reader gestures) and not others (left for the window
manager, desktop bar, etc.).
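One way to picture the split described here is a single dispatcher that classifies each gesture and hands it to the right consumer. This is purely a sketch: the gesture names and the callable "consumer" interfaces are made up for illustration, not part of any existing protocol or extension.

```python
# Gestures the screen reader claims for itself (an assumption, for
# illustration only; the real set would be negotiated or configured).
SCREEN_READER_GESTURES = {"swipe-right", "swipe-left", "double-tap"}

def route_gesture(gesture, screen_reader, window_manager):
    """Send a gesture to the screen reader if it claims it, else pass it on
    to the window manager (or desktop bar, etc.)."""
    if gesture in SCREEN_READER_GESTURES:
        screen_reader(gesture)
    else:
        window_manager(gesture)

# Minimal demonstration, with lists standing in for the two consumers.
sr_events, wm_events = [], []
for g in ["swipe-right", "three-finger-drag", "double-tap"]:
    route_gesture(g, sr_events.append, wm_events.append)
```

The open question in the thread is exactly who runs this dispatcher, since neither the screen reader nor the window manager is a neutral party.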

Samuel



Re: On Braille and Linux

2022-04-04 Thread Frank Carmickle


> On Apr 4, 2022, at 9:42 AM, Samuel Thibault  wrote:
> 
> Frank Carmickle, on Mon, 04 Apr 2022 09:36:53 -0400, wrote:
>> 
>>> On Apr 4, 2022, at 9:15 AM, Samuel Thibault  wrote:
>>> 
>>> Frank Carmickle, on Mon, 04 Apr 2022 08:58:08 -0400, wrote:
>>>> Please excuse my ignorance. It seems to me that we don't have a native
>>>> Linux touch interface that is accessible, or did I totally miss something?
>>> 
>>> Touch screens do work on Linux, they show up as a mouse.
>> 
>> Sorry if I wasn't clear, I was talking about having a set of multitouch 
>> gestures for a window manager that allow for visually impaired folk to 
>> navigate the UI. Especially important would be the swipe to element 
>> navigation method.
>> 
>> Is the window manager the right place for such a driver? 
> 
> No, it'd rather be the screen reader that grabs the touch screen device
> so as to get its events and interpret them.
> 
> That being said, one might want to be able to have gestures both toward
> the screen reader and toward the window manager, in which case it'd have
> to be "something else" that grabs the touch screen device and report the
> gestures both to the screen reader and the window manager. I don't know
> if such infrastructure exists already.

Seems like an approach similar to the way keystrokes get passed from the
keyboard to the screen reader, as well as to the other applications, should be
put in place?

I would imagine that we'd want this gesture handler to be written in a more
performant language, since the target use case is older mobile devices?

If Rich is looking to do some significant contributing, is this a good place to 
start?

--FC



Re: On Braille and Linux

2022-04-04 Thread Samuel Thibault
Frank Carmickle, on Mon, 04 Apr 2022 09:36:53 -0400, wrote:
> 
> > On Apr 4, 2022, at 9:15 AM, Samuel Thibault  wrote:
> > 
> > Frank Carmickle, on Mon, 04 Apr 2022 08:58:08 -0400, wrote:
> >> Please excuse my ignorance. It seems to me that we don't have a native 
> >> Linux touch interface that is accessible, or did I totally miss something?
> > 
> > Touch screens do work on Linux, they show up as a mouse.
> 
> Sorry if I wasn't clear, I was talking about having a set of multitouch 
> gestures for a window manager that allow for visually impaired folk to 
> navigate the UI. Especially important would be the swipe to element 
> navigation method.
> 
> Is the window manager the right place for such a driver? 

No, it'd rather be the screen reader that grabs the touch screen device
so as to get its events and interpret them.

That being said, one might want to be able to have gestures both toward
the screen reader and toward the window manager, in which case it'd have
to be "something else" that grabs the touch screen device and report the
gestures both to the screen reader and the window manager. I don't know
if such infrastructure exists already.
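The "something else" described above would most likely start by taking an exclusive grab on the touch screen's evdev node, so raw events stop reaching other clients while it interprets them. Below is a minimal sketch of just the grab step; the device path is a placeholder, opening it needs the right permissions, and production software (BRLTTY, libinput) does the equivalent in C.

```python
import fcntl
import struct

# EVIOCGRAB is the evdev ioctl for an exclusive grab of an input device.
# Its number comes from the kernel's _IOW('E', 0x90, int) macro:
#   direction (write = 1) << 30 | sizeof(int) << 16 | 'E' << 8 | 0x90
EVIOCGRAB = (1 << 30) | (struct.calcsize("i") << 16) | (ord("E") << 8) | 0x90

def grab_touchscreen(path="/dev/input/event0"):
    """Open an evdev device and take an exclusive grab on it.

    The path is a placeholder; the real touch screen node varies per
    system (see /proc/bus/input/devices).
    """
    f = open(path, "rb")
    fcntl.ioctl(f, EVIOCGRAB, 1)   # 1 = grab, 0 = release
    return f
```

Once grabbed, the process would decode the multitouch events itself and re-broadcast recognized gestures to both the screen reader and the window manager.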

Samuel



Re: On Braille and Linux

2022-04-04 Thread Frank Carmickle


> On Apr 4, 2022, at 9:15 AM, Samuel Thibault  wrote:
> 
> Frank Carmickle, on Mon, 04 Apr 2022 08:58:08 -0400, wrote:
>> Please excuse my ignorance. It seems to me that we don't have a native Linux 
>> touch interface that is accessible, or did I totally miss something?
> 
> Touch screens do work on Linux, they show up as a mouse.

Sorry if I wasn't clear, I was talking about having a set of multitouch 
gestures for a window manager that allow for visually impaired folk to navigate 
the UI. Especially important would be the swipe to element navigation method.

Is the window manager the right place for such a driver? 

--FC



Re: On Braille and Linux

2022-04-04 Thread Samuel Thibault
Frank Carmickle, on Mon, 04 Apr 2022 08:58:08 -0400, wrote:
> Please excuse my ignorance. It seems to me that we don't have a native Linux 
> touch interface that is accessible, or did I totally miss something?

Touch screens do work on Linux, they show up as a mouse.

Samuel



Re: On Braille and Linux

2022-04-04 Thread Frank Carmickle
Please excuse my ignorance. It seems to me that we don't have a native Linux 
touch interface that is accessible, or did I totally miss something?

Braille input on a touch screen is a nice-to-have. We need a way to navigate a
touch-based GUI first, unless I've missed something.

--FC

> On Apr 3, 2022, at 5:55 PM, Rich Morin  wrote:
> 
> Thanks to Samuel and Devin for their input.  As Devin said:
> 
>> Smart phones use touchscreen braille input to make typing faster. ...
>> This attempts to bring this to Linux smart phones and such like that.
> 
> This is precisely my goal.  I'd like to provide touchscreen braille input for 
> Linux smart phones.  I'm particularly interested in supporting postmarketOS 
> (pmOS), because it targets older (and thus more economical) phones.  However, 
> I'd hope that my code would work on any Linux (or for that matter, BSD) 
> variant.
> 
> Since I have NO interest in trying to create a screen reader, I need to 
> understand the existing packages and what I'd need to do in order to support 
> them.  Please feel free to correct and/or supplement these notes...
> 
> # BRLTTY
> 
>> BRLTTY is a background process (daemon) which provides access to the 
>> Linux/Unix console (when in text mode) for a blind person using a 
>> refreshable braille display.  It drives the braille display, and provides 
>> complete screen review functionality. Some speech capability has also been 
>> incorporated. -- https://brltty.app/
> 
> 
> This seems pretty promising, but I have no clue how I should send event 
> messages to BRLTTY.  I sent a note to their mailing list, but other advice 
> and suggestions would be very welcome.
> 
> # Orca
> 
> Orca appears to support Braille input and output.  I read that:
> 
> - "The Orca screen reader can display the user interface on a refreshable 
> Braille display."
> - "Orca supports contracted braille via the liblouis project."
> 
> So, it would seem reasonable to take advantage of Orca's Braille input 
> capabilities.  I gather that Orca prefers to use the AT-SPI protocol on 
> D-Bus.  As Samuel pointed out, I could support this via 
> atspi_generate_keyboard_event.
> 
> # Yasr
> 
> I read that "Yasr is a general-purpose console screen reader for GNU/Linux 
> and other Unix-like operating systems."  I suspect that I'd want to talk to 
> it via uinput, but perhaps Samuel can clarify and/or correct this.
> 
> -r
> 



On Braille and Linux

2022-04-03 Thread Rich Morin
Thanks to Samuel and Devin for their input.  As Devin said:

> Smart phones use touchscreen braille input to make typing faster. ...
> This attempts to bring this to Linux smart phones and such like that.

This is precisely my goal.  I'd like to provide touchscreen braille input for 
Linux smart phones.  I'm particularly interested in supporting postmarketOS 
(pmOS), because it targets older (and thus more economical) phones.  However, 
I'd hope that my code would work on any Linux (or for that matter, BSD) variant.

Since I have NO interest in trying to create a screen reader, I need to 
understand the existing packages and what I'd need to do in order to support 
them.  Please feel free to correct and/or supplement these notes...

# BRLTTY

> BRLTTY is a background process (daemon) which provides access to the 
> Linux/Unix console (when in text mode) for a blind person using a refreshable 
> braille display.  It drives the braille display, and provides complete screen 
> review functionality. Some speech capability has also been incorporated. -- 
> https://brltty.app/


This seems pretty promising, but I have no clue how I should send event 
messages to BRLTTY.  I sent a note to their mailing list, but other advice and 
suggestions would be very welcome.

# Orca

Orca appears to support Braille input and output.  I read that:

- "The Orca screen reader can display the user interface on a refreshable 
Braille display."
- "Orca supports contracted braille via the liblouis project."

So, it would seem reasonable to take advantage of Orca's Braille input 
capabilities.  I gather that Orca prefers to use the AT-SPI protocol on D-Bus.  
As Samuel pointed out, I could support this via atspi_generate_keyboard_event.

# Yasr

I read that "Yasr is a general-purpose console screen reader for GNU/Linux and 
other Unix-like operating systems."  I suspect that I'd want to talk to it via 
uinput, but perhaps Samuel can clarify and/or correct this.

-r



Re: On Braille and Linux

2022-04-03 Thread Devin Prater
Oh no, I wasn't meaning to put more work on Joanie. I meant Rich could work
on that, since he'd have to get used to working with other technologies
anyway, and he'd have to patch that into Orca regardless, so why not get used
to the internals now. Also, this could help bring trackpad navigation to
Orca as well.
Devin Prater
r.d.t.pra...@gmail.com




On Sun, Apr 3, 2022 at 12:08 PM Samuel Thibault 
wrote:

> Devin Prater, on Sun, 03 Apr 2022 12:02:59 -0500, wrote:
> > I'd work on getting Orca working on it first before working on a text
> > input method,
>
> Well "Orca" is mostly about Joanie alone, so I'd say avoid putting yet
> more load on her and work on a text input method.
>
> Samuel
>
>


Re: On Braille and Linux

2022-04-03 Thread Samuel Thibault
Devin Prater, on Sun, 03 Apr 2022 12:02:59 -0500, wrote:
> I'd work on getting Orca working on it first before working on a text
> input method,

Well "Orca" is mostly about Joanie alone, so I'd say avoid putting yet
more load on her and work on a text input method.

Samuel



Re: On Braille and Linux

2022-04-03 Thread Devin Prater
Smart phones use touchscreen braille input to make typing faster. This
attempts to bring this to Linux smart phones and such like that. I'd work on
getting Orca working on it first before working on a text input method, but
that's just me lol.
Devin Prater
r.d.t.pra...@gmail.com




On Sun, Apr 3, 2022 at 11:17 AM Samuel Thibault 
wrote:

> Hello,
>
> Rich Morin, on Wed, 30 Mar 2022 21:29:48 -0700, wrote:
> > AFAIK, there isn't any braille screen input support for Linux,
>
> AIUI, you mean using a touch screen or touch pad to type Braille? I'm
> not aware of anything like that.
>
> > One major issue is that I'm not at all clear on how to deal with the
> output.  What back-end programs should I target and what interface(s) do
> they generally want to deal with from input devices?  For example, is there
> an easy way in Linux for a user mode process to emulate a keyboard device?
>
> There are various ways to do that. If your output is really keypresses
> you can emulate a keyboard thanks to Linux' uinput. If your output is
> rather text, you can use atspi's atspi_generate_keyboard_event. You can
> also rather implement an ibus module.
>
> > More generally, I'd like to get some feedback on a11y, system and user
> interfaces, etc.  For example:
> >
> > - What back-end programs should I target?
> > - What kinds of gestures would folks want?
> > - What sorts of interfaces should I present?
> >
> > Advice, caveats, clues, and pointers would all be welcome...
>
> I'd say start by determining precisely what the users' needs are, before
> looking at the technical details.
>
> Samuel
>
>


Re: On Braille and Linux

2022-04-03 Thread Samuel Thibault
Hello,

Rich Morin, on Wed, 30 Mar 2022 21:29:48 -0700, wrote:
> AFAIK, there isn't any braille screen input support for Linux,

AIUI, you mean using a touch screen or touch pad to type Braille? I'm
not aware of anything like that.

> One major issue is that I'm not at all clear on how to deal with the output.  
> What back-end programs should I target and what interface(s) do they 
> generally want to deal with from input devices?  For example, is there an 
> easy way in Linux for a user mode process to emulate a keyboard device? 

There are various ways to do that. If your output is really keypresses
you can emulate a keyboard thanks to Linux' uinput. If your output is
rather text, you can use atspi's atspi_generate_keyboard_event. You can
also rather implement an ibus module.
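For the uinput route mentioned above, the core of a keyboard emulator is writing `struct input_event` records to the uinput device. This is a stripped-down sketch, not a complete program: the device setup (the UI_SET_EVBIT / UI_DEV_SETUP / UI_DEV_CREATE ioctl sequence on /dev/uinput) is omitted, and KEY_A is just an example key code.

```python
import struct

# Layout of struct input_event: struct timeval (two C longs), then
# __u16 type, __u16 code, __s32 value.
EVENT_FORMAT = "llHHi"
EV_KEY, EV_SYN, SYN_REPORT = 0x01, 0x00, 0
KEY_A = 30  # from linux/input-event-codes.h

def pack_event(ev_type, code, value):
    """Pack one input_event with a zeroed timestamp (the kernel fills it in)."""
    return struct.pack(EVENT_FORMAT, 0, 0, ev_type, code, value)

def press_and_release(dev, key):
    """Write press, release, and sync events for one key to a uinput device."""
    for value in (1, 0):                      # 1 = press, 0 = release
        dev.write(pack_event(EV_KEY, key, value))
        dev.write(pack_event(EV_SYN, SYN_REPORT, 0))
```

In a real program, `dev` would be /dev/uinput opened for writing after the device has been created; to every other process the injected events then look like they came from a hardware keyboard.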

> More generally, I'd like to get some feedback on a11y, system and user 
> interfaces, etc.  For example:
> 
> - What back-end programs should I target?
> - What kinds of gestures would folks want?
> - What sorts of interfaces should I present?
> 
> Advice, caveats, clues, and pointers would all be welcome...

I'd say start by determining precisely what the users' needs are, before
looking at the technical details.

Samuel



On Braille and Linux

2022-03-30 Thread Rich Morin
AFAIK, there isn't any braille screen input support for Linux, so I've been 
considering trying to implement some.  To be clear, I'm thinking about a 
program that detects and characterizes touch events, recognizes gestures, and 
finally reports them as Unicode sequences.
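The Unicode part is the simple bit: Unicode reserves U+2800-U+28FF for braille patterns, with one bit per dot. A sketch of the mapping from a recognized chord (a set of pressed dots) to a character:

```python
def braille_char(dots):
    """Map a set of pressed dots (integers 1-6, or 1-8 for computer
    braille) to the corresponding Unicode braille pattern character.

    Dot n sets bit n-1 of the offset from U+2800 (the blank cell).
    """
    mask = 0
    for dot in dots:
        mask |= 1 << (dot - 1)
    return chr(0x2800 + mask)
```

For example, `braille_char({1, 4})` yields "⠉" (U+2809, dots 1-4, the letter "c" in English braille); the back end would then translate such cells to text, e.g. via liblouis.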

Approach

My approach is based on the Elixir programming language.  For the curious, I'm 
planning to construct a directed acyclic graph (DAG) of lightweight processes, 
flowing from the touch screen to a back end application which would handle 
braille input and feed it to various Linux CLI tools.  The graph would be 
created and used by means of an Elixir pipeline:

start_reporter()
get_action() |> add_gesture()

start_reporter() starts up a "reporter" process which listens for touch and 
gesture events, boils them down a bit, and reports the resulting Unicode to the 
client application.  add_gesture() is the front end for a set of gesture 
recognition processes.  It starts these up as needed, broadcasts messages about 
actions to them, then mostly gets out of the way.

Issues

One major issue is that I'm not at all clear on how to deal with the output.  
What back-end programs should I target and what interface(s) do they generally 
want to deal with from input devices?  For example, is there an easy way in 
Linux for a user mode process to emulate a keyboard device? 

More generally, I'd like to get some feedback on a11y, system and user 
interfaces, etc.  For example:

- What back-end programs should I target?
- What kinds of gestures would folks want?
- What sorts of interfaces should I present?

Advice, caveats, clues, and pointers would all be welcome...

-r