Hi all. So I’ve been thinking about the accessibility of both Mac and 
Windows apps. While Apple has clearly laid out how its accessibility API 
works, developers usually don’t know the details, either because they’re 
buried deep in the developer guides or because developers just don’t worry 
about that kind of thing. This isn’t only a problem with complex apps; it 
hits little apps too, the apps you’d expect to work flawlessly, like 
Atlantis, the MUD client for Mac. I’d love to be able to use it, but nope. 
Why not, I ask Apple. “It’s up to developers to make their apps 
accessible.” Why? Why should an app be unusable with the system screen 
reader just because its developer never hooked up the accessibility API?
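
To be fair, the API itself isn’t hard to adopt. As a rough sketch in 
Swift, here is about all it takes to make a custom AppKit view visible to 
VoiceOver; the StatusLightView name and its isConnected state are made up 
for illustration, not taken from any real app:

import AppKit

// A custom view that would otherwise be invisible to VoiceOver.
// Overriding a few NSAccessibility protocol methods gives it a
// role and a spoken label.
class StatusLightView: NSView {
    var isConnected = false

    override func isAccessibilityElement() -> Bool {
        return true
    }

    override func accessibilityRole() -> NSAccessibility.Role? {
        return .image
    }

    override func accessibilityLabel() -> String? {
        return isConnected ? "Connected" : "Disconnected"
    }
}

With that in place, VoiceOver reports the view as an image named 
“Connected” or “Disconnected” instead of skipping over it entirely.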
I think that the accessibility engineers have been going about this the 
wrong way. First of all, if a developer builds on a custom UI toolkit that 
doesn’t support accessibility, there is no way of fixing that from the 
outside, and we can’t expect developers to rebuild their apps just for us. 
Take the Alter Aeon MUD app, for example. Or maybe an app is mostly 
accessible and just needs a little more tweaking that the developers won’t 
be bothered with. Or maybe it’s an app like OpenEmu, where the preferences 
dialog is almost accessible but the tabs along the top of the window can’t 
be reached from the keyboard. We can’t expect developers to get it all 
right. I think VoiceOver should copy what Windows screen readers have done 
in the past, an approach that has made countless apps accessible: just 
gather information about what’s actually drawn on the screen and make it 
available to VoiceOver alongside the OS X APIs.
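
For anyone curious what that fallback could look like, here’s a very 
rough sketch using Apple’s own frameworks: capture a window’s pixels and 
run text recognition over them. This assumes a recent OS with the Vision 
framework; the recognizeText function and its windowID parameter are 
placeholders I made up, not anything VoiceOver actually ships:

import CoreGraphics
import Vision

// Capture one window and OCR it, printing each piece of text with
// its on-screen bounding box. Text-plus-position pairs like these
// are the raw material a screen reader needs to build a navigable
// off-screen model of an otherwise inaccessible window.
func recognizeText(inWindow windowID: CGWindowID) {
    guard let image = CGWindowListCreateImage(
        .null, .optionIncludingWindow, windowID, .boundsIgnoreFraming
    ) else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        for observation in observations {
            if let best = observation.topCandidates(1).first {
                print(best.string, observation.boundingBox)
            }
        }
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}

None of this is magic; it’s the same idea as the off-screen models JAWS 
and other Windows screen readers built years ago, just with Apple’s own 
OCR doing the reading.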
