Sorry for the double-post, but I forgot to mention one detail:
The old DOS screen readers worked in two ways: (a) by hooking
interrupt 21h, so they were the first to know when a program wrote to
the screen and could grab (and act upon) keystrokes; and (b) by
reading the video buffer directly, which let the user browse screen
content regardless of whether it had gone through int 21h or had been
mov'd there directly (I've sketched that second part below).
I hope I'm making sense. I'm more of a mathematician than a
programmer, but I'm doing my best to use the terminology correctly,
and I feel this is very old code we're talking about here.
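In case it helps to see (b) spelled out, here is a minimal sketch in
C, assuming a 16-bit DOS compiler such as Turbo C, whose <dos.h>
provides the MK_FP macro; in colour text mode the screen lives at
segment B800h, two bytes per cell (character, then attribute):

  #include <stdio.h>
  #include <dos.h>

  int main(void)
  {
      /* far pointer to the 80x25 colour text buffer */
      unsigned char far *video = (unsigned char far *)MK_FP(0xB800, 0x0000);
      char line[81];
      int col;

      /* copy the characters of the top row, skipping the attribute bytes */
      for (col = 0; col < 80; col++)
          line[col] = video[col * 2];
      line[80] = '\0';

      printf("Top row of the screen: %s\n", line);
      return 0;
  }

A real screen reader would of course walk all 25 rows and follow the
cursor, but that is the basic trick.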
Best,
Felix

On Sun, 15 Mar 2020 at 18:35, Felix G.
<constantlyvaria...@gmail.com> wrote:
>
> Hello Mateusz,
> there is no such thing as a dumb question when asked in the spirit in
> which you are asking. Let me clarify inline below:
>
> > FreeDOS - and DOS in general - is a text-based system, hence one
> > could technically imagine a virtualization platform providing an
> > embedded screen reader that reads whatever is present in the VGA
> > buffer. Whether such a contraption exists, I have no clue.
>
> And neither do I, which is why I chose to run DOSBox and redirect its
> serial port output to an emulated speech synth on the host. Were I
> given a way to browse the VGA buffer in some VM, I would be overjoyed.
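> For concreteness, one way to wire this up is via DOSBox's null-modem
> serial mode in dosbox.conf; treat this as a sketch, since serial
> option names and defaults can vary between DOSBox versions:
>
>   [serial]
>   # expose COM1 as a null-modem server on a TCP port of the host
>   serial1=nullmodem port:5000
>
> The emulated speech synth on the host then connects to that TCP port
> and speaks whatever the DOS screen reader writes to COM1.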
>
> > Questions: how can a blind user install any operating system at all
> > on a PC? Are there some tricks that allow such a feat, or is this a
> > step that always requires sighted assistance?
>
> Most operating systems have built-in accessibility features
> accommodating blind users. For example, during Windows setup one can
> press Ctrl+Windows+Enter to start Narrator, the native Windows screen
> reader. On macOS you would bring up VoiceOver with Command+F5, and on
> Ubuntu you would press Alt+Super+S to start Orca. Pretty much
> every operating system that's been around for more than two decades
> has evolved some way to do this. I was actually hoping FreeDOS could
> be counted among that lot.
>
> > You mention a serial port and a hardware speech synth. I can only
> > suppose that blind users would connect such a synth to an RS-232 port
> > and give the program or OS appropriate instructions so that it
> > outputs meaningful descriptions over this port. But you say these
> > hardware gimmicks aren't sold any longer - how do blind people
> > interact with computers these days? Are there some software standards
> > or APIs for screen reader emulation?
>
> There are screen readers for Windows, most Linux distributions and
> macOS, and they all use software speech synthesizers accessed through
> dedicated APIs. On Windows this API is called SAPI, while on Linux it
> is Speech Dispatcher, a stand-alone daemon that screen readers such as
> Orca talk to (BRLTTY handles the braille side). DOS didn't have
> memory-resident software speech synthesizers, which is why people
> connected hardware ones to RS-232 ports just as you assumed, and used
> special TSR programs to grab text as it was displayed and to browse
> the VGA buffer. The installation itself wasn't accessible, of course,
> but then again this was the 20th century, and now we can do better,
> or so I hope.
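> To give a flavour of the modern API side, here is a minimal sketch in
> C against Speech Dispatcher's client library (libspeechd); the calls
> are from its public header, but take it as an illustration rather
> than production code:
>
>   #include <stdio.h>
>   #include <libspeechd.h>
>
>   int main(void)
>   {
>       /* connect to the Speech Dispatcher daemon */
>       SPDConnection *conn = spd_open("demo", "main", NULL, SPD_MODE_SINGLE);
>       if (conn == NULL) {
>           fprintf(stderr, "could not connect to Speech Dispatcher\n");
>           return 1;
>       }
>       /* speak one sentence at normal text priority */
>       spd_say(conn, SPD_TEXT, "Hello from the software synth side.");
>       spd_close(conn);
>       return 0;
>   }
>
> (Link against libspeechd; the pkg-config module is typically called
> speech-dispatcher, though names may differ between distributions.)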
> All the best,
> Felix

