Hi, all. Still a bit confused about some things and am hoping to get a
bit of help.

As a brief reminder, I'm both new to Android development and totally
blind. While it would make sense as a new developer to start simple,
if the phone itself isn't accessible then "Hello, world" is fairly
useless. :) So, unfortunately, I've had to dive into the deeper end
and start writing a screen reader. TalkBack was unfortunately too
basic, and I think what I've done so far is already better, but I may
have run up against the limits of the API.

Currently I've written a screen reader that lets me navigate the UI
fairly well. Does the accessibility API offer anything beyond the
stream of events delivered to the accessibility service? And, if so,
how would I use such a stream to accomplish the following use case?
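For context, the core of what I have so far is essentially this (a minimal sketch; the class name is mine, and the TTS handling is simplified):

```java
import android.accessibilityservice.AccessibilityService;
import android.speech.tts.TextToSpeech;
import android.view.accessibility.AccessibilityEvent;

public class MinimalScreenReader extends AccessibilityService {
    private TextToSpeech tts;

    @Override
    public void onServiceConnected() {
        // Initialize TTS once the service is bound.
        tts = new TextToSpeech(this, null);
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Each event carries a (possibly empty) list of text chunks;
        // speak whatever the event gives us.
        StringBuilder utterance = new StringBuilder();
        for (CharSequence chunk : event.getText()) {
            if (chunk != null) {
                utterance.append(chunk).append(' ');
            }
        }
        if (utterance.length() > 0) {
            tts.speak(utterance.toString(), TextToSpeech.QUEUE_ADD, null);
        }
    }

    @Override
    public void onInterrupt() {
        if (tts != null) {
            tts.stop();
        }
    }
}
```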

I enter the email application. The accessibility events my app
receives provide me with a great deal of context, but not enough in
some instances. For instance, when I'm setting up my IMAP account, I
land on a field with the text "143". *I* know that this refers to the
port number, but I don't see a way for my application to make this
discoverable. It seems to me one of two things is needed. Maybe
either or both of these exist and I'm just missing them.

1. An AccessibilityEvent needs to dump the entire screen/window
content to my application such that I can snag it and make it
available to the user for review. I thought that was what events with
fullScreen=true were, but those only seem to contain window titles.

2. There needs to be a mechanism for accessing the widget hierarchy,
complete with contents. I understand this is a security risk, but I'm
not sure how to implement compelling accessibility otherwise.
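To illustrate the gap, this is roughly all the context a single event exposes today as far as I can tell; nothing here links the "143" field back to its port-number label:

```java
import android.util.Log;
import android.view.accessibility.AccessibilityEvent;

public class EventDumper {
    private static final String TAG = "EventDumper";

    // Log every field of an incoming event. Note the absence of any
    // handle back into the widget hierarchy or to a field's label.
    public static void dump(AccessibilityEvent event) {
        Log.d(TAG, "type=" + event.getEventType()
                + " class=" + event.getClassName()
                + " package=" + event.getPackageName()
                + " text=" + event.getText()
                + " contentDescription=" + event.getContentDescription()
                + " fullScreen=" + event.isFullScreen());
    }
}
```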

Am I just missing something?

Next, while most apps provide fairly useful AccessibilityEvents,
WebKit falls down completely. None of the events I receive contain
text, and all come from classes like WebView, WebDialog, or something
similar. Is this a known issue? Based on the debug logs of events I'm
getting, I don't see how to make the browser or any other WebKit apps
accessible.
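In case my service registration is the problem: I'm requesting every event type from every package, which I'd expect to surface anything WebView sends. A sketch of that setup (the surrounding method is mine):

```java
import android.accessibilityservice.AccessibilityServiceInfo;
import android.view.accessibility.AccessibilityEvent;

// Inside the AccessibilityService subclass:
@Override
public void onServiceConnected() {
    AccessibilityServiceInfo info = new AccessibilityServiceInfo();
    info.eventTypes = AccessibilityEvent.TYPES_ALL_MASK; // every event type
    info.feedbackType = AccessibilityServiceInfo.FEEDBACK_SPOKEN;
    info.notificationTimeout = 0;   // deliver events immediately
    info.packageNames = null;       // null = events from all packages
    setServiceInfo(info);
}
```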

Finally, I'm running into performance issues. Ideally, keystrokes
should terminate speech such that typing isn't painfully slow. When
typing into text areas, I can easily get 10-15 characters ahead of
TTS. What I need is to install a listener for keyboard input that
flushes the TTS queue on any keypress. This will also allow
for speech interruption, a feature available in most speech-based
access solutions. Is it possible to do this, and if so, can someone
point me to the relevant classes/interfaces? I'd also like to
implement screen reader commands for reading statistics such as
battery level/signal strength/message count, which requires
implementing global hotkeys. As a new developer jumping in over his
head by necessity, it isn't immediately obvious where to look for
this. :)
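The TTS half of the interruption story seems straightforward; it's the key-event hook that I can't find. Assuming such a hook exists, I'd expect the flushing side to look something like this (helper class and names are mine):

```java
import android.speech.tts.TextToSpeech;

public class InterruptibleSpeech {
    private final TextToSpeech tts;

    public InterruptibleSpeech(TextToSpeech tts) {
        this.tts = tts;
    }

    // Call on every keypress: QUEUE_FLUSH discards anything pending,
    // so speech can never fall 10-15 characters behind typing.
    public void speakInterrupting(String text) {
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
    }

    // Call for bare interruption with nothing new to say.
    public void silence() {
        tts.stop();
    }
}
```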

Thanks a bunch.
--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google
Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to
[email protected]
For more options, visit this group at
http://groups.google.com/group/android-developers?hl=en
-~----------~----~----~----~------~----~------~--~---