(sorry, out of thread; copying from the marc.info post so
References/In-Reply-To aren't set)

> I am looking to understand / enhance the OpenBSD experience for
> blind users.

While not blind, I occasionally attempt to do some screenless testing
with accessibility-tech on OpenBSD, FreeBSD, and Linux.  I also hang
out in the blinux mailing list for blind Linux users, so am
interested in making the BSDs more accessible.

> Do we have any blind users reading misc that can offer any insight
> into their usecases / pain points / work flows / wants?
> I am sure OpenBSD is lacking on this front, so use cases in *nix
> would also be helpful.

From some recent experiences:

- using a serial port or SSH has proven the most reliable.  For some
  users, the machine is attached to an external serial-driven speech
  synth or Braille device.  For others, it's a serial/terminal program
  on another machine that is already accessible, or SSH from that
  other machine.  However, as powerful as the CLI is, it doesn't grant
  access to GUI tools like a real browser.

- yasr isn't available as a package (it's my go-to console
  screen-reader) but can be installed from source.  It does ship a
  sample config file but still needs a fair amount of work to set up:
  getting speech-dispatcher to listen on an inet socket rather than a
  unix socket, pointing yasr at speech-dispatcher, and making sure
  speech-dispatcher itself is configured properly (rough sketch after
  this list).  Also, speech-dispatcher times out after 5 seconds with
  no client connection, so you have to know to start yasr within that
  window.

- attempting to `pip install fenrir-screenreader` fails because it
  uses some Linux-specific headers
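
For anyone wanting to try the yasr route above, the speech-dispatcher
side boils down to a couple of settings plus starting things in the
right order.  A rough sketch (option names as I remember them from the
sample speechd.conf; treat the exact names, paths, and port as
assumptions and check the sample configs that ship with each program):

    # ~/.config/speech-dispatcher/speechd.conf
    # switch from the default unix socket to an inet socket so yasr
    # can connect over TCP
    CommunicationMethod "inet_socket"
    Port 6560

    # start the daemon, then launch yasr within the ~5-second
    # no-client window, with yasr.conf pointed at 127.0.0.1:6560
    # (the sample yasr.conf documents the synthesizer/port settings)
    $ speech-dispatcher -d
    $ yasr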

Getting Orca set up is a bit of a bear.  Doable, but it already
assumes you have access to the system.  Roughly, it involves
installing Gnome (plus configuring GDM, which is mostly a matter of
following the docs, but certainly not out-of-the-box easy), Orca,
eflite, etc.  While GDM comes up with options to turn on
text-to-speech, you have to know the Alt+Super+S shortcut to enable
it, and you have to know how to *use* Orca to navigate it.  All of
that is pretty difficult to do if you're blind and on your own.
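
One small mitigation: if you can already reach the box another way
(ssh or serial), the toggle behind that shortcut can also be flipped
from a shell, which is at least scriptable.  Something like the
following, assuming Gnome's usual gsettings key applies to the OpenBSD
packages as well (and noting that, over ssh, gsettings needs to talk
to the session's dbus):

    # enable the Orca screen reader for the current Gnome session
    $ gsettings set org.gnome.desktop.a11y.applications \
        screen-reader-enabled true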

Additionally, latency in Orca is pretty horrible on my test machine
here, even under light usage (in this context, running Gnome and the
Orca settings panel; no extra programs or non-default OBSD services
running).  It's not a powerhouse machine (3GB of RAM, dual-core 2GHz),
but those aren't unreasonable specs for an older machine.

So in the end, ssh/serial from a remote machine or yasr +
speech-dispatcher locally have been the most usable setups I've been
able to get working.  It would be nice to get Orca working usably so I
could test with a GUI browser.
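
For completeness, the serial route on the OpenBSD side mostly amounts
to redirecting the console and dialing in from the accessible machine.
Roughly (device names assume the first onboard port and a BSD-ish
client; adjust for the actual hardware):

    # on the OpenBSD box, in /etc/boot.conf:
    stty com0 115200
    set tty com0

    # and enable a login on the serial line by turning on the tty00
    # entry (std.115200, "on secure") in /etc/ttys

    # on the machine running the accessible screen reader:
    $ cu -l cua00 -s 115200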

As for things that could be improved, a few ideas:

- adding yasr to the package repos

- perhaps some meta-package or a tutorial on getting
  speech-dispatcher + yasr + flite/festival/espeak/whatever working
  together (rough outline after this list)

- tweak Gnome or whatever launches Orca so that it comes up with a
  tutorial mode and/or its settings dialog on first run.
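
For the second item, the tutorial (or a pkg-readme for a meta-package)
would basically be a short checklist; the package names below are
placeholders, since availability is exactly the open question and yasr
currently has to come from source:

    # 1. install the speech stack (hypothetical package names)
    # pkg_add speech-dispatcher espeak

    # 2. build yasr from source until it's packaged

    # 3. configure speech-dispatcher for an inet socket and a working
    #    output module (see the speechd.conf sketch earlier), and
    #    point yasr.conf at 127.0.0.1:6560

    # 4. start them in order
    $ speech-dispatcher -d && yasr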

I'd be glad to test other configurations if needed.

-tkc
(@gumnos)

