On Thu, Oct 22, 2020 at 01:00:38AM -0400, Nicolas Pitre wrote:
> On Wed, 21 Oct 2020, Dave Mielke wrote:
>
> > [quoted lines by Lars Bjørndal on 2020/10/21 at 21:23 +0200]
> >
> > > I'm working on a project where I need to be able to quickly distinguish
> > > between normal letters and braille patterns in the range 0x2800-0x28FF
> > > on the braille display, with BRLTTY in the console. On my system, 0x2801
> > > is displayed as dot 1, like a normal a. It's ok if it's displayed as a
> > > question mark or whatever, just for this project. Is there a simple
> > > solution?
> >
> > I'm not sure what you're trying to do. Brltty shows the right symbols for
> > the braille patterns. Why should it be doing something different with
> > them?
>
> I think he wants to distinguish a braille 'a' from the regular letter
> 'a'. Right now they're indistinguishable unless you use DESCCHAR on the
> whole screen.
>
> My initial thought was to remove the definition for those Unicode
> braille characters from the braille table, but there doesn't seem to be
> any such definition.
>
> So the quick solution would be to execute brltty with the
> -X lx:unicode=no argument to disable Unicode altogether. But then you
> won't be able to use DESCCHAR to identify them either.
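[Editor's aside: the codepoint test being discussed, i.e. whether a character
falls in the Unicode Braille Patterns block U+2800-U+28FF, can be sketched
outside BRLTTY in a few lines of Python. The function name here is
hypothetical and not part of BRLTTY.]

```python
# Hypothetical helper (not part of BRLTTY): classify a character as a
# Unicode braille pattern by checking whether its codepoint lies in the
# standard Braille Patterns block, U+2800 through U+28FF.

def is_braille_pattern(ch: str) -> bool:
    """Return True if ch is in the Unicode Braille Patterns block."""
    return 0x2800 <= ord(ch) <= 0x28FF

# U+2801 is BRAILLE PATTERN DOTS-1, which a braille display renders the
# same way as a regular letter 'a' -- hence the ambiguity in the thread.
print(is_braille_pattern("\u2801"))  # True
print(is_braille_pattern("a"))       # False
```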
Ah thank you so much! I'll try that.

_______________________________________________
This message was sent via the BRLTTY mailing list.
To post a message, send an e-mail to: [email protected]
For general information, go to: http://brltty.app/mailman/listinfo/brltty
