Hello Everyone
   We are trying to create an accessibility service that can detect the
icons of buttons, for example, and then report this information to the
TalkBack screen reader. Is it possible to complement TalkBack's speech
output? One way I could think of is to provide hint text for the button,
but it seems we cannot modify any AccessibilityNodeInfo property from an
accessibility service. Are there any APIs provided by TalkBack directly to
which we could send an intent, perhaps, to speak a phrase in the context
of that particular button?
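
   To make the question concrete, below is roughly the kind of service we
have in mind. classifyIcon() is just a hypothetical placeholder for our
icon detector, and the TextToSpeech call at the end is the workaround we
would like to avoid by going through TalkBack instead:

import android.accessibilityservice.AccessibilityService;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

// Sketch only. The service is assumed to be declared in the manifest with
// an accessibility-service config that requests accessibility-focus events.
public class IconAnnouncerService extends AccessibilityService {

    private TextToSpeech tts;

    @Override
    protected void onServiceConnected() {
        super.onServiceConnected();
        // Our own TTS instance, used only because we have not found a
        // public TalkBack API for injecting speech.
        tts = new TextToSpeech(this, status -> { /* ignore init result for brevity */ });
    }

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        if (event.getEventType() != AccessibilityEvent.TYPE_VIEW_ACCESSIBILITY_FOCUSED) {
            return;
        }
        AccessibilityNodeInfo node = event.getSource();
        if (node == null) {
            return;
        }
        CharSequence className = node.getClassName();
        if (className != null && className.toString().contains("Button")) {
            String description = classifyIcon(node);  // hypothetical detector
            if (description != null) {
                // Speaks alongside TalkBack rather than through it; this is
                // exactly the part we would like to do properly.
                tts.speak(description, TextToSpeech.QUEUE_ADD, (Bundle) null, "icon");
            }
        }
        node.recycle();
    }

    // Placeholder: the real service would run our icon recognizer here.
    private String classifyIcon(AccessibilityNodeInfo node) {
        return null;
    }

    @Override
    public void onInterrupt() { }

    @Override
    public void onDestroy() {
        if (tts != null) {
            tts.shutdown();
        }
        super.onDestroy();
    }
}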

   Any help in this regard would be much appreciated. Please let me know
if my question is not clear.
