On Wednesday, April 3, 2019 4:00 PM, Samuel Thibault <samuel.thiba...@ens-lyon.org> wrote:
> Olivier Fourdan, le mer. 03 avril 2019 14:39:29 +0200, a ecrit:
> > > What we currently have in Wayland is support for AccessX, directly
> > > implemented in mutter. The rest of the accessibility features are
> > > currently mostly broken.
> >
> > Well, instead of "mostly broken", I'd say it's a WIP on the GNOME
> > side. To recap the state of the accessibility features in GNOME:
>
> Some of these cover what I mentioned indeed, but that does not cover
> the Orca needs, for instance, and is GNOME-only. I'd really rather not
> push for a GNOME-only solution that would leave disabled people with
> only one choice of desktop, unable to use other people's computers,
> which may not be running GNOME.
Do you have a list of the accessibility features you'd like to have, and a
description of what they do and what they look like?

> > - For at-spi on Wayland, there are no global coordinates and no plan
> > to add them. So, to be able to generate pointer events without
> > global coordinates and keyboard events without XTest for a
> > replacement of dogtail that would work with GNOME on Wayland, there
> > is ponytail:
> >
> > https://gitlab.gnome.org/ofourdan/gnome-ponytail-daemon
>
> I am aware of this, but I thought this was mostly considered a hack
> only for testing and not for production use?
>
> Apparently RecordWindow and Introspect are mutter/gnome-shell-specific;
> are they supposed to be implemented by other compositors as well?
>
> > > We could think of moving the implementation to a shared library
> > > that compositors would use, on the same principle as libinput.
> > > Such a library could implement all the accessibility features that
> > > a compositor should implement itself.
> >
> > Yet, I think this is quite different from what libinput does.
>
> Sure, the kind of processing is different. I just mean that such a
> library could be shared by compositors, just like libinput is shared
> by compositors.
>
> > These features are very deeply rooted in the event loop of the
> > compositor; in the case of mutter, this is clutter.
> > The reason for this is that it needs to be able to delay, replace or
> > inject both pointer events (mousekeys) and keyboard events
> > (slowkeys, stickykeys, bouncekeys).
>
> Sure, that's what I mentioned above.
>
> > It also interferes with the xkb state for modifiers (stickykeys). I
> > reckon moving the logic out of the compositor is certainly doable,
> > but it would not necessarily make the code much simpler for other
> > compositors...
>
> It seems to me that the current AccessX code is already quite involved
> compared to the interface it would require. And there is more that we
> would like to add for other accessibility features.
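To make the "delay, replace or inject" point above concrete, here is a rough sketch of a slowkeys filter of the kind the compositor's input pipeline has to host. This is purely illustrative; the names and the event representation are invented, and real implementations (such as mutter's AccessX code) arm timers inside the compositor event loop rather than post-processing a list:

```python
# Illustrative slowkeys sketch (hypothetical, not mutter's actual code):
# a key press is only accepted if the key stays held past a delay.

SLOWKEYS_DELAY_MS = 300  # key must remain held this long to register

def filter_slow_keys(events, delay_ms=SLOWKEYS_DELAY_MS):
    """Filter a list of (key, press_ms, release_ms) tuples.

    Returns the keys whose press was held at least `delay_ms`.  A real
    compositor would instead start a timer on press and only deliver the
    event to clients when the timer fires, which is why this logic ends
    up tangled into the event loop rather than a simple pure function.
    """
    accepted = []
    for key, press_ms, release_ms in events:
        if release_ms - press_ms >= delay_ms:
            accepted.append(key)
    return accepted

# 'a' held for 400 ms is accepted; 'b' tapped for 50 ms is dropped.
print(filter_slow_keys([("a", 0, 400), ("b", 500, 550)]))  # ['a']
```

Bouncekeys and stickykeys need the same kind of hook, but additionally have to rewrite the xkb modifier state, which is why factoring this into a shared library is more involved than the filter above suggests.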
> > > It could also provide interfaces for plugging in accessibility
> > > features implemented in separate processes (e.g. a screen reader).
> > > To avoid keyloggers and such, this interface needs to be available
> > > only to trusted accessibility processes.
> >
> > That does not need to be Wayland protocols though; these could be
> > D-Bus.
>
> Sure.

A few things to keep in mind:

- If your feature needs to be strongly tied to Wayland (e.g. it needs to
  display a surface), Wayland protocols would probably be better.
- Some compositors don't have D-Bus interfaces (e.g. KDE, Sway).

> > But I guess we need to define what a "trusted accessibility process"
> > is and how the compositor can enforce that.
>
> Yes. AIUI there were already some discussions about this on
> wayland-devel?

There were some discussions about security, yes. As of now the consensus
mostly is: if the compositor starts the client, then it can provide it
with access to privileged interfaces.

> > > About the screen reader shortcuts, one issue we have with the
> > > current implementation is that for each keyboard event the toolkit
> > > has to wait for the screen reader to tell whether it ate the event
> > > or not. That makes all keyboard events get delayed. This should
> > > probably be replaced by a registration mechanism: the screen
> > > reader tells which shortcuts it is interested in, and only the
> > > events for them are captured and provided to the screen reader,
> > > asynchronously.
> > > Opinions on the whole thing?
> >
> > If the screen reader issue is just about keyboard shortcuts, then a
> > D-Bus interface would be doable.
>
> There are also other use cases mentioned on the wiki page: e.g. a
> virtual keyboard needing to inject key presses, and a screen reader
> needing key event and mouse notifications.
>
> But really, my main concern is that a mutter/gnome-shell-only solution
> is really not a proper solution; we need to define something that
> other compositors can easily implement too.

Agreed.
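The registration mechanism Samuel proposes could look roughly like the sketch below. Everything here is hypothetical (no such interface exists yet, and all names are invented): the point is only that once the screen reader declares its shortcuts up front, the compositor can deliver ordinary key events immediately and divert only the registered ones, so typing is never stalled waiting for an "ate it or not" answer:

```python
# Hypothetical sketch of a shortcut-registration mechanism.  A direct
# callback stands in for what would really be an asynchronous D-Bus
# signal (or Wayland event) to the trusted screen reader process.

class ShortcutRouter:
    def __init__(self):
        # (frozenset of modifiers, keysym) -> screen reader callback
        self.grabs = {}

    def register(self, modifiers, keysym, callback):
        """The screen reader registers interest in one shortcut."""
        self.grabs[(frozenset(modifiers), keysym)] = callback

    def route(self, modifiers, keysym):
        """Return True if the event was diverted to the screen reader.

        Unregistered events are delivered to the focused client right
        away, with no round trip; registered ones are handed off
        asynchronously to the screen reader.
        """
        callback = self.grabs.get((frozenset(modifiers), keysym))
        if callback is None:
            return False          # normal delivery, no delay
        callback(modifiers, keysym)  # would be an async notification
        return True

router = ShortcutRouter()
spoken = []
router.register({"Insert"}, "t", lambda mods, key: spoken.append("time"))

print(router.route({"Insert"}, "t"))  # True: diverted to screen reader
print(router.route(set(), "x"))       # False: passes through undelayed
print(spoken)                         # ['time']
```

This also suggests where the trust boundary would sit: only a client the compositor itself launched as a trusted accessibility process would be allowed to call `register` at all.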
_______________________________________________
wayland-devel mailing list
wayland-devel@lists.freedesktop.org
https://lists.freedesktop.org/mailman/listinfo/wayland-devel