Thanks Mark. I can easily detect touch events by their coordinates, but not so easily identify what UI elements lie underneath - unless I define and lay out each of these UI elements graphically myself in terms of screen coordinates. I think your proposed TalkBackOverlayView would require that too, if I understand you correctly. My main screen already uses different touch screen events for gross positions like center, top middle, left middle, bottom left and bottom right, and speaks the relevant user actions. Here I have done without an exploration mode, although that should have been easily possible by handling ACTION_DOWN and ACTION_UP events separately in dispatchTouchEvent(). This very basic level of touch screen use is indeed accessible for blind persons, as are the keyboard shortcuts that I have defined for phones with a physical keyboard.
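For what it's worth, the down/up split I mean could be sketched like this. This is a plain-Java model of the idea only (no Android dependency); in the real app the logic would sit in dispatchTouchEvent() using android.view.MotionEvent. The region names, the 3x3 grid split, and the speak() stub are illustrative assumptions, not my app's actual layout:

```java
// Sketch: on a "down" event, announce the gross screen region under the
// finger; only a lift ("up") in the same region activates the action.
public class ExplorationSketch {
    // Mirror the values of MotionEvent.ACTION_DOWN / ACTION_UP.
    static final int ACTION_DOWN = 0;
    static final int ACTION_UP = 1;

    private final int width, height;
    private String touchedRegion; // region under the finger since ACTION_DOWN

    public ExplorationSketch(int width, int height) {
        this.width = width;
        this.height = height;
    }

    // Map a coordinate to a gross region on a 3x3 grid (illustrative split).
    String regionAt(float x, float y) {
        String[] cols = {"left", "middle", "right"};
        String[] rows = {"top", "center", "bottom"};
        int c = Math.min(2, (int) (3 * x / width));
        int r = Math.min(2, (int) (3 * y / height));
        if (r == 1 && c == 1) return "center";
        return rows[r] + " " + cols[c];
    }

    // Returns the region to activate, or null if nothing fires yet.
    String onTouchEvent(int action, float x, float y) {
        String region = regionAt(x, y);
        if (action == ACTION_DOWN) {
            touchedRegion = region;
            speak(region);      // announce only; do not activate yet
            return null;
        }
        if (action == ACTION_UP && region.equals(touchedRegion)) {
            return region;      // lift in place => activate
        }
        return null;            // lifted elsewhere => treat as cancel
    }

    void speak(String text) {
        System.out.println("TTS: " + text); // stand-in for a TTS call
    }
}
```

So exploration costs nothing extra: ACTION_DOWN is already a natural "announce" hook, and activation simply waits for ACTION_UP.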
However, at some point one would want to make good use of standard UI elements (list views, buttons, radio buttons, checkboxes, etc.) for which Android determines the concrete layout instance, and render these standard elements accessible, rather than having to define a range of dedicated touch views corresponding to each and every activity view that sighted users would use. I have no ambition to reinvent the Android layout framework. The iPhone, through VoiceOver, offers a generic touch screen solution for this as part of the OS. I feel that any accessibility "hook" for standard UI elements that I add at application level to my specific Android app should only be an interim solution and a stopgap until Android catches up and offers a similarly general accessibility solution as part of the OS. Otherwise, blind users will in the future at best have a few accessible point solutions, but no general accessibility on phones that lack a physical keyboard and/or d-pad.

Now, while trying to bridge the current accessibility limitations of d-pad-less Android phones, apparently even a simple key remapping is not that, erm... simple, with the current set of event listeners and onWhatevers(). :-(

On Dec 17, 11:37 am, Mark Murphy <[email protected]> wrote:
> On Fri, Dec 17, 2010 at 4:31 AM, blindfold <[email protected]> wrote:
> > This currently requires physical keys, because
> > Android still lacks the kind of touch event model of the iPhone where
> > (blind) users can explore the screen by touch and hear GUI elements
> > spoken without immediately activating whatever elements they are
> > touching or sliding over.
>
> If this is your own app, I would think you could do this on your own.
> Have a "blind mode" that puts a TalkBackOverlayView on top of the
> regular UI. TalkBackOverlayView would use the same techniques as
> GestureOverlayView to detect touch events and selectively pass them
> through.
> In particular, in an "exploration" mode, it would behave as
> you describe -- when the screen is touched, find out the widget
> underneath it at those pixel coordinates, and announce what it is.
>
> Offering this for other apps would require firmware mods, of course.
>
> And, of course, I haven't tried this, nor have I looked at the
> GestureOverlayView code to see how tough it would be to implement.
> It's just an idea that popped to mind while reading your post.
>
> --
> Mark Murphy (a Commons Guy)
> http://commonsware.com | http://github.com/commonsguy
> http://commonsware.com/blog | http://twitter.com/commonsguy
>
> _The Busy Coder's Guide to *Advanced* Android Development_ Version 1.9
> Available!

--
You received this message because you are subscribed to the Google Groups "Android Developers" group.
To post to this group, send email to [email protected]
To unsubscribe from this group, send email to [email protected]
For more options, visit this group at http://groups.google.com/group/android-developers?hl=en
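P.S. The key remapping I was attempting boils down to a translation table consulted before the default key handling. A minimal plain-Java sketch of that table follows; in an Activity it would be consulted from onKeyDown() or dispatchKeyEvent() on the incoming android.view.KeyEvent keycode. The sample mapping below (24 and 19 happen to be Android's KEYCODE_VOLUME_UP and KEYCODE_DPAD_UP values, but treat them as illustrative here) is my own assumption, not anything the framework provides:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: remap incoming keycodes before the default handling sees them.
public class KeyRemapper {
    private final Map<Integer, Integer> remap = new HashMap<>();

    // Register a remapping from one keycode to another.
    public void map(int fromKeyCode, int toKeyCode) {
        remap.put(fromKeyCode, toKeyCode);
    }

    // Return the translated keycode, or the original if unmapped.
    public int translate(int keyCode) {
        Integer mapped = remap.get(keyCode);
        return (mapped != null) ? mapped : keyCode;
    }
}
```

The table itself is trivial, of course; the hard part on Android is getting at the event early enough, which is exactly the "onWhatevers()" trouble I was complaining about.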

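P.P.S. Mark's suggested "find out the widget underneath it at those pixel coordinates, and announce what it is" could be modeled as a simple back-to-front hit-test. This is a flat plain-Java sketch; a real TalkBackOverlayView would instead walk the Activity's View tree (getLocationOnScreen(), getWidth(), getHeight()) to compute each widget's bounds, and the Widget class and labels here are purely illustrative:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch: given touch coordinates, find the topmost widget whose bounds
// contain the point, and return its label for the TTS announcement.
public class HitTestSketch {
    static class Widget {
        final String label;
        final int left, top, right, bottom;
        Widget(String label, int left, int top, int right, int bottom) {
            this.label = label;
            this.left = left; this.top = top;
            this.right = right; this.bottom = bottom;
        }
        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    private final List<Widget> widgets = new ArrayList<Widget>();

    void add(Widget w) { widgets.add(w); }

    // Later-added widgets are drawn on top, so search back-to-front.
    String announceAt(int x, int y) {
        for (int i = widgets.size() - 1; i >= 0; i--) {
            Widget w = widgets.get(i);
            if (w.contains(x, y)) return w.label;
        }
        return null; // nothing under the finger
    }
}
```

The overlay would call something like announceAt() on every ACTION_DOWN in exploration mode, and only pass the touch through to the widget on a confirming gesture.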
