> On Apr 4, 2022, at 9:42 AM, Samuel Thibault <[email protected]> wrote:
> 
> Frank Carmickle, on Mon, Apr 4, 2022 at 09:36:53 -0400, wrote:
>> 
>>> On Apr 4, 2022, at 9:15 AM, Samuel Thibault <[email protected]> wrote:
>>> 
>>> Frank Carmickle, on Mon, Apr 4, 2022 at 08:58:08 -0400, wrote:
>>>> Please excuse my ignorance. It seems to me that we don't have a native 
>>>> Linux touch interface that is accessible, or did I totally miss something?
>>> 
>>> Touch screens do work on Linux, they show up as a mouse.
>> 
>> Sorry if I wasn't clear, I was talking about having a set of multitouch 
>> gestures for a window manager that allow for visually impaired folk to 
>> navigate the UI. Especially important would be the swipe to element 
>> navigation method.
>> 
>> Is the window manager the right place for such a driver? 
> 
> No, it'd rather be the screen reader that grabs the touch screen device
> so as to get its events and interpret them.
> 
> That being said, one might want to be able to have gestures both toward
> the screen reader and toward the window manager, in which case it'd have
> to be "something else" that grabs the touch screen device and reports the
> gestures both to the screen reader and the window manager. I don't know
> if such infrastructure exists already.

So something similar to the way keystrokes are passed from the keyboard to both 
the screen reader and other applications should be put in place?
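To make the "something else grabs the touch screen" idea concrete, here is a minimal 
sketch: a pure swipe classifier, with the evdev glue shown only in comments. The 
python-evdev calls mentioned (InputDevice, grab, read_loop) are real, but the device 
path, thresholds, and helper name are my assumptions, not an existing design.

```python
# Sketch of a gesture-grabber process. The classifier below takes a list of
# (x, y) touch positions collected for one finger-down..finger-up sequence
# and names the dominant-axis swipe, if any.

def classify_swipe(points, min_dist=100):
    """Return 'left', 'right', 'up', or 'down' for a swipe whose dominant-axis
    travel is at least min_dist device units; otherwise return None."""
    if len(points) < 2:
        return None
    dx = points[-1][0] - points[0][0]
    dy = points[-1][1] - points[0][1]
    if abs(dx) >= abs(dy):                      # horizontal movement dominates
        if abs(dx) < min_dist:
            return None
        return 'right' if dx > 0 else 'left'
    if abs(dy) < min_dist:                      # vertical movement too short
        return None
    return 'down' if dy > 0 else 'up'

# With python-evdev, the grabbing side could look roughly like:
#   import evdev
#   dev = evdev.InputDevice('/dev/input/event5')  # assumed path
#   dev.grab()   # EVIOCGRAB: exclusive access, so the WM stops seeing raw events
#   for ev in dev.read_loop():
#       ...collect ABS_MT_POSITION_X / ABS_MT_POSITION_Y into points,
#       ...call classify_swipe(points) on finger-up
```

The grabber would then forward the recognized gestures (over D-Bus, say) to both the 
screen reader and the window manager, which is the "report to both" part Samuel 
describes.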

I would imagine we'd want this gesture handler written in a performant language, 
since the target use case includes older mobile devices?

If Rich is looking to do some significant contributing, is this a good place to 
start?

--FC
