James Austin wrote:

Microsoft seem to be saying that their new UI Automation will allow screen readers and the like to interact with controls that are not yet available, which is certainly an interesting prospect for those using Windows. But surely as Blind/Visually Impaired users, we should be able to integrate as fully as possible with our sighted colleagues and friends? From what the article said, UIA does not represent the screen as I am told VO does;

You're drawing a false distinction, I think. VoiceOver makes use of the Apple Accessibility API. Apple's programming libraries basically create two trees of objects like text, images, controls, links, menus, and windows. One tree is the series of visual objects displayed on screen. The other tree is a series of programmatic objects that can be converted into a non-visual representation, navigated, and interacted with by assistive technology and automation software.

In general, the accessibility tree mirrors the visual tree. If a button is inserted into the visual tree, a button is also inserted at the equivalent branch in the accessibility tree. This means that when developers use the standard controls Apple provides, their applications get a lot of accessibility built in without extra effort. If no standard control looks or behaves exactly like what the developer wants, they'll need to write code to create a custom control. They will then hopefully choose to write some more code to say how their custom control should be represented in the accessibility tree. When new complex applications like Apple's and Microsoft's office suites prove partially inaccessible, it's often because they make use of a lot of custom controls and either the developers have simply failed to write such code, or there is a problem with the underlying accessibility frameworks that makes representing the controls in the accessibility tree difficult.
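To make that last step concrete, here is a rough sketch (my own illustration, not anything from the article) of what "writing some more code" can look like on the Mac. It assumes an AppKit application written in Swift and uses the NSAccessibility overrides that NSView provides; the control and its label are invented for the example.

import Cocoa

// Hypothetical custom control: a view that draws its own toggle rather than
// using the standard NSButton. Without the overrides below, VoiceOver would
// see little more than an anonymous region with no role, label, or actions.
class ToggleSwitchView: NSView {

    var isOn = false {
        didSet { needsDisplay = true }
    }

    // Expose the view as a single element in the accessibility tree.
    override func isAccessibilityElement() -> Bool {
        return true
    }

    // Tell assistive technology what kind of thing this is...
    override func accessibilityRole() -> NSAccessibility.Role? {
        return .checkBox
    }

    // ...what to call it...
    override func accessibilityLabel() -> String? {
        return "Enable notifications"
    }

    // ...what state it is in...
    override func accessibilityValue() -> Any? {
        return isOn ? 1 : 0
    }

    // ...and how to operate it without a mouse.
    override func accessibilityPerformPress() -> Bool {
        isOn.toggle()
        return true
    }
}

Standard controls such as NSButton already do the equivalent of this internally, which is why sticking to them gets an application so much accessibility for free.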

The big difference between Apple's Accessibility API and Microsoft's old Accessibility API (MSAA) is twofold:

1. A greater proportion of Mac OS X developers make use of Apple's standard controls than Windows developers make use of Microsoft's standard controls.

2. Apple's Accessibility API is a much newer API, better able to represent complex controls than Microsoft's old API.

As a result, Windows screen readers have to hook into a lot of other sources of information (e.g. intercepting commands sent to the graphics display) in order to provide a reasonable level of accessibility.

IAccessible2 and UI Automation are both attempts to provide a new accessibility API for cross-platform use. IAccessible2 attempts to ease the transition by piggybacking new complexity inside the MSAA framework; UI Automation instead tries to create an entirely new framework. While there are doubtless differences in approach between Apple's Accessibility API and UI Automation, they do not consist in the former somehow representing the screen in a way the latter does not.

Hope that helps.

--
Benjamin Hawkes-Lewis
