On 2022-04-03 at 14:44 -0700, Rich Morin <[email protected]> wrote:

> I've been speculating about supporting touch screen input for Braille on
> Linux-based cell phones.
I have been thinking about this too. My vision has been that this would be
completely independent of any screen reader, though screen reader support
might make it more capable.

> What I have in mind is a user-mode program (written in Elixir) that would
> recognize input events and gestures, then send messages to various screen
> readers.

I don't know for certain, but I assume that recognizing touch gestures has
quite tight real-time requirements. I'm not sure an interpreted,
garbage-collected language is the best fit for that kind of task.

> Can someone give me some clues about the most reasonable way to send these
> messages to BRLTTY? For example, should I use uinput? Any other advice,
> comments, or suggestions would be welcome.

I think uinput is a good way to go. It may even be possible to bypass
BRLTTY's braille key translation entirely and inject input straight into
the Linux console (if that is where you want to run this).

-- 
Aura

_______________________________________________
This message was sent via the BRLTTY mailing list.
To post a message, send an e-mail to: [email protected]
For general information, go to: http://brltty.app/mailman/listinfo/brltty
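To make the uinput suggestion above concrete, here is a minimal C sketch (C
rather than Elixir, and not anything taken from BRLTTY itself) of how a
user-mode gesture recognizer could inject a key event through /dev/uinput, so
that the console, BRLTTY, or any other input consumer sees it as ordinary
keyboard input. The device name "braille-touch", the single-key example, and
the `inject_key` helper are placeholder assumptions of mine; the code needs
write access to /dev/uinput and a kernel with UI_DEV_SETUP (4.5 or later).

```c
/* Sketch only: inject one key press/release via the kernel's uinput
 * facility.  A real gesture recognizer would map a recognized braille
 * chord to a keycode before calling inject_key(). */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>
#include <linux/uinput.h>

/* Write one input event to fd; returns 0 on success, -1 on failure. */
static int emit(int fd, unsigned short type, unsigned short code, int val)
{
    struct input_event ev;
    memset(&ev, 0, sizeof ev);
    ev.type = type;
    ev.code = code;
    ev.value = val;
    return write(fd, &ev, sizeof ev) == (ssize_t)sizeof ev ? 0 : -1;
}

/* Create a virtual keyboard, press and release one key, tear it down. */
static int inject_key(unsigned short keycode)
{
    int fd = open("/dev/uinput", O_WRONLY | O_NONBLOCK);
    if (fd < 0) {
        perror("open /dev/uinput");   /* usually needs root or a udev rule */
        return -1;
    }

    /* Declare what the virtual device can emit before creating it. */
    ioctl(fd, UI_SET_EVBIT, EV_KEY);
    ioctl(fd, UI_SET_KEYBIT, keycode);

    struct uinput_setup usetup;
    memset(&usetup, 0, sizeof usetup);
    usetup.id.bustype = BUS_VIRTUAL;
    strcpy(usetup.name, "braille-touch");   /* placeholder device name */
    ioctl(fd, UI_DEV_SETUP, &usetup);
    ioctl(fd, UI_DEV_CREATE);

    sleep(1);   /* give userspace a moment to notice the new device */

    /* A press, then a release; EV_SYN marks the end of each event batch. */
    emit(fd, EV_KEY, keycode, 1);
    emit(fd, EV_SYN, SYN_REPORT, 0);
    emit(fd, EV_KEY, keycode, 0);
    emit(fd, EV_SYN, SYN_REPORT, 0);

    ioctl(fd, UI_DEV_DESTROY, 0);
    close(fd);
    return 0;
}
```

If something like this works for the raw-console case, the same virtual device
should also be visible to BRLTTY, since uinput devices are indistinguishable
from physical keyboards at the evdev level.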
