Link is at the bottom of this message.
What's New in iOS 26 Accessibility for Blind and DeafBlind Users
Fall is almost here in the Northern Hemisphere, which means it’s time for
another major release of iOS for the public. While some of the information
in
this article may be new to readers, this year’s changes started being
discussed in the spring. Prior to the Worldwide Developers Conference
(WWDC), Apple announced several new features coming to their devices in the
fall. Fast forward to
June, and the mainstream also received 
a lot of announcements about what's coming in iOS 26.
Many outlets online will cover iOS 26’s mainstream changes in great detail.
Some of them include a new design throughout the entire operating system
called
"Liquid Glass.” Apple describes Liquid Glass as follows: "… a new dynamic
material that combines the optical properties of glass with a sense of
fluidity.
Liquid Glass refracts content from below it, reflects light from around it,
and has a gorgeous lensing effect along its edges." Because of this dramatic
change, I would strongly recommend all low vision users take a look at
something other than their main device before installing. Other mainstream
changes
include Live Translation & Visual Intelligence through ChatGPT, CarPlay
enhancements such as integration of Live Activities, Live Activities
themselves
opening up to third parties, new calling and texting features, and much
more. For a well-documented list of mainstream features in iOS 26, this
article from Wired is a good place to start. iOS 26 is compatible with the
iPhone 11 and later. For a list of all supported devices, consult this
article from CNET.
We're here to discuss the changes in accessibility for those who are blind
and DeafBlind. As in every other year, I'm unable to comment firsthand on
any
low vision changes. It’s my hope that others will share their experiences
from that perspective to help educate low vision users on the update's
impact.
Space constraints do not permit me to explore such things as Voice Control,
Switch Control, Assistive Access, Assistive Touch, Head Tracking and other
changes. I encourage all users to go to their favorite accessibility
settings and visit the "What's New" section Apple has provided in many
locations.
Sharing Is Caring
One of the main additions to the accessibility suite of tools is the ability
to temporarily share your accessibility settings. This can be done between
2 devices running iOS 26 that are in the same location or signed into the
same iCloud account. Once you are done using the shared settings from the
host
device, it is possible to dismiss them, returning the other device to its
settings. When connected, the Accessibility Shortcut will also include the
option
to drop the sharing session. In either case, the first step is to go to
Settings>Accessibility>Share Accessibility Settings and activate the Share
Accessibility Settings button on this screen. This only needs to be done on
the host
device. After activating the continue button, a list of devices near the
user will
be shown. This includes iOS Devices not signed into your iCloud account and
also those not running iOS 26, though iOS 26 is required for this feature to
work. After choosing the target device, you will need to switch to that
device and accept the request. Once the connection has been made, a
confirmation is displayed on both devices. As a braille user, this was a little
frustrating since I could not leave this confirmation without hitting the
touchscreen.
I did find that many of the settings transferred successfully from my host
device, including audio changes, VoiceOver's speech settings and haptic
settings.
What wasn't carried over were the braille display commands that I had
customized on my primary device. Also, if VoiceOver is not enabled on the
target device, press and hold 3 fingers on its screen to accept the request,
and VoiceOver will start speaking on that device. It's a
wonderful
way to share settings with others for whatever reason may be needed, but it
would be even nicer if you could take the settings from the host device and
copy them permanently to the device you are sharing with. Even better would be
the ability to specify which settings would be merged, for example, all Voice
Control or VoiceOver settings.
Accessibility Nutrition Labels
This first feature is something users can find on the App Store. Rather than
describing it again, here is a quote from Sarah Herrlinger, Apple's senior
director of Global Accessibility Policy and Initiatives, taken from the
interview 
conducted by members of the AppleVis Editorial Team:
block quote
"We really wanted to create a consistent way for developers to highlight
accessibility features. And we want it to be a way that's super easy for
users
to find and understand. Accessibility nutrition labels are an extension of
kind of the longstanding work that Apple has done to provide developers with
tools, documentation, and training to create great accessible experiences.
We're really excited for these labels to come to the Apple ecosystem, and we
expect it'll bring a whole new level of accessibility awareness, both for
users and developers alike."
block quote end
Accessibility Nutrition Labels are something developers can fill out, listing
all of the accessibility features their app is designed to support. Users on
iOS 26 can check out an example of a completed label with the Please Don't
Rain application. Accessibility Nutrition Labels are optional for the time
being,
with a requirement for developers to provide this information expected to
come in the future.
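For developers wondering what it takes to truthfully check the VoiceOver box
on one of these labels, the work is the familiar accessibility markup. The
following is only a rough sketch of my own, a hypothetical SwiftUI weather
card, not code from Apple's documentation or from the Please Don't Rain app:

import SwiftUI

// Hypothetical weather card showing the kind of VoiceOver support an app
// would declare on its Accessibility Nutrition Label: a label, a value, a
// hint, and decorative images hidden from the screen reader.
struct RainCard: View {
    let chanceOfRain: Int   // e.g. 40, meaning 40 percent

    var body: some View {
        VStack {
            Image(systemName: "cloud.rain")
                .accessibilityHidden(true)   // decorative, so VoiceOver skips it
            Text("\(chanceOfRain)%")
        }
        .accessibilityElement(children: .combine)
        .accessibilityLabel("Chance of rain")
        .accessibilityValue("\(chanceOfRain) percent")
        .accessibilityHint("Double tap for the hourly forecast")
    }
}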
VoiceOver Changes
VoiceOver has undergone many updates with iOS 26, with a lot of attention
being paid to braille. Whether you use a braille display or Braille Screen
Input,
Apple has made some significant upgrades to both sets of tools.
It Magically Won't Play
The Two-Finger Double-Tap gesture for VoiceOver users has always been known
as the "Magic Tap." While it can be very helpful, Magic Tap can also cause
some annoyances. For example, if you are performing the Two-Finger
Double-Tap to end a phone call, and instead get to listen to the audio last
playing
on your device, that can produce some interesting results. The caller may
not have hung up yet, and may get to listen in as well. If you're DeafBlind
and
not using the Sound Curtain feature, this could lead to everyone thinking
you are throwing a party with your phone while you walk around, without you
even
knowing it. iOS 26 brings a new option, which should eliminate some of these
challenges. If the user goes to
Settings>Accessibility>VoiceOver>Commands>Magic
Tap, there is now an option to exclude the playback of media from this
gesture.
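For developers curious how an app decides what Magic Tap does, the gesture is
routed to accessibilityPerformMagicTap() on the frontmost responder. The
snippet below is a minimal sketch with a hypothetical call screen, not Apple's
Phone app code; returning false hands the gesture back to the system, which is
where the default media playback behavior comes from:

import UIKit

// Hypothetical call screen. Returning true tells VoiceOver the two-finger
// double-tap was handled here; returning false lets the system fall back to
// its default behavior, such as toggling media playback.
class CallViewController: UIViewController {
    var callIsActive = true

    override func accessibilityPerformMagicTap() -> Bool {
        guard callIsActive else { return false }
        endCall()   // hypothetical helper that hangs up the call
        return true
    }

    private func endCall() {
        callIsActive = false
        // Tear down the call here.
    }
}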
Reset It!
With all of the settings which can be modified from within VoiceOver, it can
be confusing to remember what you have changed and what you haven't. If
something
is going wrong and it's not possible to pinpoint the issue, or if you just
want to start fresh, you now have an option for only resetting VoiceOver
settings
instead of resetting the entire device. Navigate to
Settings>Accessibility>VoiceOver>Reset VoiceOver Settings. Activating this
resets VoiceOver to the
factory default settings. When performing this reset, I was happy to find
that my Bluetooth connection to my braille display did not get lost. All
other
settings had been reset as best I could tell.
Is Siri Listening?
It is not always easy to tell when Siri is listening when you don't have
access to the screen in front of you. Apple understands this, and has added
a
feature for VoiceOver users which should help offset this challenge. It can
be found by going to Settings>Accessibility>VoiceOver>Audio>Always Use Siri
Sounds; turn this on to give it a try.
Cursor Output
One of the challenges for users of screen readers who have been used to
other operating systems is the way in which speech output conveys where the
cursor
is located with VoiceOver. By default, VoiceOver is set to announce content
as the cursor passes it. iOS 26 also gives the user the option to speak the
text to the right of the cursor. To enable this, go to
Settings>Accessibility>VoiceOver>Typing>Cursor Output.
What's New And Old in the Rotor?
When visiting Settings>Accessibility>VoiceOver>Rotor>Rotor Items, the user
is presented with over 50 options that they can add and remove from their
rotor.
Naturally, it can take some time to find the item you'd like, or at least it
did until iOS 26. With this release, there is now a search feature located at the bottom
of
this menu which gives users another way in which they can manage their rotor
settings. Speaking of this search box, many of the search boxes throughout
the operating system have gone from being near the top of the screen to near
the bottom. If a user can't find a search box where it was in iOS 18 and
earlier, the bottom of the screen is likely its new location.
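As an aside for developers, the entries VoiceOver users flick through in the
rotor are not limited to Apple's built-in ones; apps can contribute their own
with UIAccessibilityCustomRotor. This is a minimal sketch of my own, with the
"Flagged Messages" rotor name and the flaggedViews array being hypothetical:

import UIKit

// Sketch of a custom rotor entry. VoiceOver calls the closure each time the
// user flicks up or down on this rotor, asking for the next or previous match.
func makeFlaggedRotor(flaggedViews: [UIView]) -> UIAccessibilityCustomRotor {
    UIAccessibilityCustomRotor(name: "Flagged Messages") { predicate in
        let current = predicate.currentItem.targetElement as? UIView
        let currentIndex = current.flatMap { flaggedViews.firstIndex(of: $0) }

        let nextIndex: Int
        switch predicate.searchDirection {
        case .next:
            nextIndex = (currentIndex ?? -1) + 1
        case .previous:
            nextIndex = (currentIndex ?? flaggedViews.count) - 1
        @unknown default:
            return nil
        }
        guard flaggedViews.indices.contains(nextIndex) else { return nil }
        return UIAccessibilityCustomRotor.ItemResult(targetElement: flaggedViews[nextIndex])
    }
}

// Attach it to a view or view controller:
// view.accessibilityCustomRotors = [makeFlaggedRotor(flaggedViews: flaggedMessageViews)]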
But Wait... There's More To Copy
A new rotor function has been added called Copied Speech. This option allows
you to access more than just the most recent instance of speech you have
copied to the clipboard. For those who may be unaware, the Copy Speech to
Clipboard function does just that: it copies the last thing VoiceOver spoke to
the clipboard. Before now, once your clipboard was erased or something
replaced that item, it was no longer available to the user.
Note that when turned on, if there isn't anything to copy, the rotor option
will not be available.
Other Changes
Specific to Maps, when focused on a point of interest, VoiceOver users can
now perform a 3-finger single tap. This will pull up a list of more
information
about the Point of Interest. Pan-Indian voices have been added for Gujarati
and Marathi. Under Settings>Accessibility>VoiceOver>Verbosity>Custom Labels,
there is now an option to modify or delete any custom labels the VoiceOver
user has added. Under Verbosity settings, it is now possible to control how
the position in a list is communicated. Options include Speak, Braille, or
Do Nothing.
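Maps' new 3-finger tap is Apple's own addition, but it resembles something
third-party developers already do with custom actions, which VoiceOver exposes
through the actions rotor. Here is a small hypothetical sketch, with the
action names and the print statements standing in for real behavior:

import UIKit

// Hypothetical point-of-interest cell. Custom actions are the usual way an
// app offers "more information" style options to VoiceOver users.
final class PlaceCell: UITableViewCell {
    func configureAccessibility() {
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Show opening hours") { _ in
                print("Showing opening hours")   // stand-in for real behavior
                return true
            },
            UIAccessibilityCustomAction(name: "Get directions") { _ in
                print("Starting directions")     // stand-in for real behavior
                return true
            }
        ]
    }
}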
Braille
Braille Access
Apple has brought some old features in with some new ones to develop a suite
of services for braille display users called Braille Access. To activate
Braille
Access, press dots 7 and 8 together on a Perkins-Style keyboard. For users
of QWERTY keyboards, pressing VoiceOver Modifier Shift Y will enable Braille
Access. Pressing VoiceOver Modifier with Y will enable braille keyboard
input, which is required to interact with some of the features listed below,
such as Braille Notes. The user can then move through the menu with standard
navigation options. To launch any form of context menu where available,
users
can press dot 7. Activating items is done with dot 8. Exiting Braille Access
can be done by pressing dots 7 and 8 together, by pressing space with dots
1-2-5, space with dots 1-2, or VoiceOver Modifier Shift and Y. Note that
space with 1-2 (b) also functions as a back button. Before exploring the
various
features of Braille Access, it may be best to also examine the list of
settings which can be configured to best support each user in the Braille
Access
experience. Check out the options under
Settings>Accessibility>VoiceOver>Braille>Braille Access. It's possible to
configure which features appear in the
Braille Access menu, to control whether menu items are spoken, whether to
speak list items, whether to show a visual representation of the Braille
Access
content with its print equivalents, whether Braille Access should remember
your last position in Braille Access on re-launch, whether there is typing
speech
feedback spoken by VoiceOver, and individual settings for each feature. Note
that in order to get typing speech feedback, typing feedback for hardware
keyboards needs to be enabled in VoiceOver's Typing settings. To set this,
go to Settings>Accessibility>VoiceOver>Typing>Typing Feedback>Hardware
Keyboard>Characters.
For example, the Braille Notes feature's settings permit the user to define
how created notes should be sorted when displayed. BRF Files currently has
only one option, controlling whether content should be fit to the user's
specific device. The Calculator allows the user to select either Nemeth or
Unified English Braille Math. The setting under Live Captions allows the user
to choose whether to caption audio from the iOS device's microphone or its
audio output. Unless configured differently in settings,
pressing dots 7 and 8 together will present the user with each
menu option: Launch App, Choose Item, Braille Notes, BRF Files, Calculator,
Live Captions, and a running display of the time in seconds viewable at the
bottom of the menu. Each option will be discussed below.
Launch App
Ever since 
iOS 17,
braille display users have been able to quickly launch apps by pressing dot 8
(or space with dot 8 when in 8-dot mode), typing the name of the app they are
looking for, and launching it. While this ability still exists in iOS 26, accessing it has
changed. To do so, press dots 7 and 8 to launch Braille Access. You can also
press this command to exit Braille Access. The user will notice the first
option "launch app" appears with a blinking cursor at the end. Begin typing
the
name of the app desired, and press enter to pull up a list of matches, or to
automatically launch the app if only 1 match is found. What has improved
with
this option is that it was only possible to launch apps from the home screen
in iOS 17 and 18, but this option can be utilized even from within other
applications,
so there is no longer a need to go back to the home screen before going to
another app. It is also now more difficult to accidentally activate while on
the home screen if you forget to lock your device and bump dot 8.
Item Chooser
iOS 17 brought this kind of quick access to launching apps, while iOS 18
brought it to the Item Chooser, which is the second option now found under
Braille Access. It works exactly as it did under iOS 18; only the location has
changed.
Braille Notes
Though many braille displays on the market have their internal notetaking
features, this is a huge leap forward specifically for users of the 
NLS eReader.
Though this device is distributed to patrons of NLS services freely, it does
not have a notetaking application. Whatever braille code a person throws at
it comes back exactly as written; iOS records the dot combinations inside
these notes instead of attempting to interpret them. For example, I wrote out
a note
half in mixed contracted and uncontracted braille, and reading it back
yielded the same result. Typing in Spanish braille, which has some different
symbols, also read back exactly as I had written it. Translation only happens if the user
wishes to send that composed note to something or somewhere else, such as a
text field outside of Braille Access. Notes are saved on iCloud, though the
syncing between other devices on my iCloud account seemed to take much
longer
than when composing a note within the standard Notes application. One of the
neat things about this set of features is that the user can also utilize
VoiceOver
while, for example, taking notes. This requires that the user can hear, but
it allows a user to do research on their iOS Device while also taking notes.
Many of the standard commands apply when editing a note. If the user wishes
to send their note to, say, a text field on iOS, this is also possible but
will need to be translated. One way to get these notes from your device to a
text field is by copying and pasting them. To do this, while in a braille
note, press space with dots 2-3-5-6 to select all of the text. Now, press
space with dots 1-4 to copy it. Then, press dots 7 and 8 to exit Braille
Access
and return to iOS. Find the text field where you'd like to paste the note's
text, and then press space with dots 1-2-3-6 to paste it. It will
translate the note based on your iOS Device's braille input settings.
BRF Files
One of the challenges with BRF files is that they are designed specifically
for braille users to be consumed on braille-first devices. When a user gets
a BRF file and would like to read it on a mainstream device such as an
iPhone, it requires workarounds. Either the user must convert the file to a
more
usable format on iOS, or use an app which requires other settings to be
changed. iOS 26 brings the readability and writability of BRF content to
iOS. To
create a BRF file, after launching the feature, press dot 8 or a cursor
routing button on "new...". From here, a new file or folder can be created
by selecting
file name and then pressing dot 8. Then, the user can type their file. When
done, simply press space with dots 1-2 to save the file. It will then be
available
after the "new...." option. Pressing dot 8 on a selected file will open the
file in a read-only mode. To access the context menu for any given file,
press
dot 7 while it is selected. This gives the user options to edit, move,
delete, or rename the file. At the moment, the move option does not work
unless
the user first creates a folder, based on my testing of 2 different devices,
but files can still be moved with the Files app. When the user presses dots
7 and 8 together to launch Braille Access, a folder will be created on that
user's iCloud account called "BRF Files." Any BRF files put in this folder
can then be accessed with the BRF files feature inside of Braille Access.
This includes files you have created in Braille Access, but also those which
you move to the BRF Files iCloud folder. I moved 
On the Air: The Encyclopedia of Old Time Radio
into the BRF files folder from my Dropbox and was able to load it on my
iPhone in approximately 3 seconds. This is quite impressive, as the book is
3.3
megabytes in size. When I left Braille Access and returned an hour later, it
had retained my position in the content. Space with f for find works well
and is able to search through large amounts of content quite quickly. While
a file is open, dot 7 launches a menu. Options are available to find again
and also to create a bookmark. Note that it is also possible to create
multiple bookmarks in the same file.
Calculator
Braille Access also has a Calculator which can display Nemeth or UEB math
codes. Not only can basic arithmetic be used, but it's also possible to type
a math expression in Nemeth or UEB and see the computed result. For example,
the user can use parentheses, fractions, radicals, exponents, constants like
e and pi, and functions such as sin, log, etc. Like with Braille Notes and
BRF Files, it's possible to copy the result. The visual interface will
render
the math expression you typed in Nemeth or UEB as a visual math formula.
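To give a hypothetical example of the kind of expression involved, entering
the Nemeth or UEB equivalent of the square root of 49 plus 2 to the third
power should produce 15, since the square root of 49 is 7 and 2 cubed is 8,
and the visual interface would show that expression in print math notation.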
Live Captions
Originally introduced as part of 
iOS 16 in 2022,
Live Captions has continually had one issue for braille users. When
accessing Live Captions, each time new text arrives and the individual is
still reading
previous captioning, the new text would force the braille display to jump
back to the beginning of what they were reading. For example, if the reader
was
accessing the 3rd sentence of what someone had said, and then a 4th sentence
was added, the user would be sent back to the top of the text. They would
then need to pan all the way back to the sentence they were reading.
Meanwhile, if more captions arrive, they would again be sent back to the
beginning
of the text. Apple has taken the time to develop Live Captions specifically
for braille users which, I'm happy to report, does not suffer from the same
challenge.
After launching Live Captions from the Braille Access menu, the user will
encounter captions already in progress or "listening." When captions begin
arriving,
pan forward or backward as normal to read captioning. If more captions have
arrived after what the user is reading, an 8-dot full cell will be located
at each end of the braille display. Pressing a Cursor Routing button when
the 8-dot cell indicators are present will move focus to the end of the
captions.
Pressing dot 7 here will also launch a menu giving the user several options.
These include pausing/resuming, the toggle of captioning the microphone or
device's internal audio, and for those on Apple Intelligence-capable
devices, the ability to summarize the captions received. Pressing dot 8 will
give
the user the option to make use of the Live Speech function. I set up a
custom braille display command (discussed further below) to launch captioning
immediately so that it is available on demand. When done with the captioning,
pressing dots 7 and 8 together returns me to where I was in iOS. As of the
iOS 26.0 release, I was unable to get Live Captions to work with FaceTime
video or audio calls.
Accessing Time And Date
The final Braille Access option is to view the time and date. When
navigating to this feature within the Braille Access menu, the user will
find the time
displayed with seconds included. If the user presses enter, they will
encounter the date and then the time.
Item Overview
With a single-line display, there are very few ways to scan, for example, a
full Home Screen of apps to activate the correct one. Even more challenging
is when you have a huge list of links on a web page that you may be familiar
with. Item Overview is a new feature in iOS 26 which displays the first few
cells of each item on the user's screen. To activate Item Overview, press
dots 6-7-8 together. The default is to display the first 3 cells of each
item,
but the user can adjust how many cells should display with each item. Find
this setting by going to Settings>Accessibility>VoiceOver>Braille>Item
Overview
and choosing anywhere between 2 and 7 cells. As an example, I launched Item
Overview on my Home Screen and could select any of the first 5 apps with my
20-cell display. Panning forward gives me the next 5
items. I can activate any of the items listed by using a Cursor Routing
Button.
More Selection Options
There are many ways to select text on iOS. For braille display users, there
is a new method in iOS 26 using the Cursor Routing Buttons. To select a word,
press a Cursor Routing button within that word twice quickly.
Pressing this button 3 times quickly will select the full line. Note that
these
selection options are also available for use inside Braille Access. I've had
to get used to working with this method and found the command space with
dots
1-3-5-6 (z) to undo to be very handy.
Cursor Clarity
One of the challenges some individuals face as braille display users on iOS
is determining where the braille cursor is located. This is because, by
default,
the cursor is represented by dot 7 in 1 cell and dot 8 in the next. A new
option found under Settings>Accessibility>VoiceOver>Braille>Use Underline
Cursor can be enabled, which will show the cursor as dots 7 and 8 together in
the same cell.
New Options For Assigning Braille Keyboard Commands
More functions within the operating system are getting the ability to be
assigned braille keyboard shortcuts. For example, there are options to
assign
a braille keyboard command to each of the Braille Access features covered
above. They can all be found by navigating to
Settings>Accessibility>VoiceOver>Braille>your device>More Info>Commands, and
then selecting the Braille category. Other
additions include activating the Accessibility Shortcut, Reachability, the
option
to take a screenshot as well as to activate Spotlight search. All of these
new options can be found under the Device category.
Tapping Feedback
It's now possible to have the same type of keyboard echo with Braille Screen
Input that is found for both the onscreen and hardware keyboards. It can be
found in the same menu as those options:
Settings>Accessibility>VoiceOver>Typing>Typing Feedback. Braille Screen Input
is a new heading at the bottom of this menu. The
options are to echo characters, words, both characters and words, or to do
nothing.
Commanding Customization!
All of the options available for customization of commands for users of
braille displays are now available to Braille Screen Input users. It is
possible
to customize the braille commands for Braille Screen Input Command Mode and
Braille Keyboard Input. To check these new choices out, head to
Settings>Accessibility>VoiceOver>Commands>Braille Screen & Braille Keyboard
Input.
Learning The Dots
After calibrating, it's still possible that the input may be slightly
different from what was set. A new option under
Settings>Accessibility>VoiceOver>Braille>Braille
Screen Input>Learn Dot Positions will allow the dots to slowly drift towards
the user's tap positions over time. When they are comfortable with how the
positioning is set, this setting can be turned off so that the dot positions
will be retained.
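Apple has not documented how this learning actually works, but conceptually it
resembles a running average that nudges each calibrated dot toward where the
user really taps. The sketch below is only that concept, with the 0.1
smoothing factor being my own assumption rather than Apple's behavior:

import CoreGraphics

// Conceptual sketch only: move a calibrated dot position a small fraction of
// the way toward each new tap (an exponential moving average). The 0.1
// smoothing factor is an assumption, not Apple's actual algorithm.
struct DotCalibration {
    var position: CGPoint
    static let smoothing: CGFloat = 0.1

    mutating func learn(from tap: CGPoint) {
        position.x += (tap.x - position.x) * Self.smoothing
        position.y += (tap.y - position.y) * Self.smoothing
    }
}

var dot1 = DotCalibration(position: CGPoint(x: 40, y: 300))
dot1.learn(from: CGPoint(x: 48, y: 305))
// dot1.position is now (40.8, 300.5), slightly closer to where the finger landed.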
BSI Single Handedly
Several new input methods are available for BSI users in iOS 26 which
require the use of only 1 hand. If Use Activation Gestures is set to on,
users can launch single-hand BSI by double-tapping and then holding 3 fingers
on the
screen. This will also set the orientation to Portrait Mode, regardless of
whether
the user’s orientation is locked in Landscape or not.
1 hand, 4 different ways to type
One Handed BSI has brought 4 different ways of inputting braille on the
screen. The choices can be found by going to
Settings>Accessibility>VoiceOver>Braille>Braille
Screen Input and choosing from Right Hand, Left Hand, Slate and Stylus, or
Reversed Slate and Stylus. With Right Hand selected, input is done by column
and uses 3 fingers, such as the index, middle and ring fingers. To make the
letter g (dots 1-2-4-5), for example, the user can press dots 4 and 5 with 2
fingers, swipe right with 1 finger to get to the next column (sometimes this
currently needs to be done twice), and then tap dots 1 and 2 together. When
Left Hand is selected, the user will use the left side of the column first.
Slate and Stylus allows the user to utilize their touchscreen as a giant 6-dot
cell. As with a conventional slate and stylus, the user has dots 1-3 on the
right and dots 4-6 on the left side of the cell. After tapping the desired
dots
in the cell, swipe right with 1 finger to move on to the next cell. There is
also the option of Reverse Slate and Stylus style, which puts dots 1 through
3 on the left side of the cells and dots 4 through 6 on the right side.
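To make the column idea described above concrete, here is a tiny sketch of my
own showing how two column taps combine into one cell. The dot numbering is
standard braille; the code is purely illustrative and not Apple's
implementation:

// Standard braille dot patterns for a few letters; g is dots 1-2-4-5.
let dotsToLetter: [Set<Int>: Character] = [
    [1]: "a",
    [1, 2]: "b",
    [1, 2, 4, 5]: "g"
]

// Combine two column taps into a single cell and look up the letter.
func combine(firstColumn: Set<Int>, secondColumn: Set<Int>) -> Character? {
    dotsToLetter[firstColumn.union(secondColumn)]
}

// Right Hand mode example from the text: dots 4 and 5, then dots 1 and 2.
print(combine(firstColumn: [4, 5], secondColumn: [1, 2]) ?? "?")   // prints "g"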
Hearing
Live Captions
Not to be confused with "Live Captions" under Braille Access, this is the
mode that users have had access to since 2022. It still has the same
challenges
outlined above for braille users, but has gained some new functionality.
Newly supported languages include English (Australia), English (United
Kingdom),
English (India), English (Singapore), German (Germany), Spanish (Spain),
Spanish (Mexico), Spanish (United States), French (Canada), French (France),
Japanese
(Japan), Korean (South Korea), Cantonese (China), Chinese (China), and
Chinese (Hong Kong). iOS 26 also brings the ability to save call captions.
The user
has 24 hours after a call to access the transcription. If the user wishes to
save the transcript beyond that, it is possible to screenshot the text for
future reference.
Name Calling
Don't worry, I'm not here to insult anyone by calling them a name, though
you could set this feature up to recognize one. Name Calling is a new
feature
found under the renamed option of "Sound and Name Recognition." Adding a
name to the list can be done by typing the word, but the user is also given
the
option to record their name and hear an audio representation of what iOS
will listen for. It's also possible to add multiple names or words for
recognition.
Testing this, I found that if someone called my name loudly to get my
attention, it worked every time. When in conversation, it would often not
recognize
that my name was said, but would on occasion. It did sometimes think it
heard my name, for example, with the phrase "it's got" which sounds quite
similar
to "Scott." I found that other words which would sound similar, like "plot,"
"rot," "slot," etc. did not produce false positives.
Feel The Vocals
Music Haptics was introduced in iOS 18 and allows one to experience Apple
Music tracks through haptic feedback. New in iOS 26 is the ability to have
the haptic pattern reflect only the vocal nuances of the track. There are also
new options to control the haptic intensity. Choose
from
Light, Medium, or Strong. My only disappointment is that no other music
services have taken advantage of this technology.
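Apple does provide a Music Haptics API that streaming services can adopt,
which I have not worked with, but the basic idea of mapping intensity in the
audio to vibration is easy to picture with Core Haptics, which any app can
use. The sketch below is purely illustrative; the intensity values are made up
rather than derived from a real track:

import CoreHaptics

// Rough illustration only: play three transient taps whose strength follows a
// made-up "vocal intensity" curve. Real Music Haptics patterns are generated
// by Apple from the track itself.
func playVocalPulse() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let intensities: [Float] = [0.3, 0.9, 0.5]   // hypothetical values
    let events = intensities.enumerated().map { index, value in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [CHHapticEventParameter(parameterID: .hapticIntensity, value: value)],
            relativeTime: TimeInterval(index) * 0.25
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: 0)
}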
What was that again?
With MFi-supported hearing aids, Live Listen now has a Rewind feature. I was
not able to test this, since I do not use hearing aids which are MFi
compatible,
so it's my hope that someone else can review this functionality. Also new
with Live Listen is the ability to control Live Listen sessions with nearby
devices.
The user can now utilize other devices, such as an Apple Watch, to serve as
a remote control to start, stop, or rewind Live Listen sessions.
Background Sounds Added
New sounds have been added to the list of background sounds. Users can
now also choose from airplane cabin noise, rain, night ambiance, and
crackling
fire.
Conclusion
iOS 26 offers a mixture of new features and quality-of-life improvements for
blind and DeafBlind users. It is always good to see Apple focusing resources
on braille, as traditionally this has been one of Apple’s weaker areas. For
low vision users, I strongly recommend experiencing the Liquid Glass
interface
on another device prior to upgrading your own. For VoiceOver and braille
users, iOS 26 should be a relatively safe upgrade, with the caveat that
everyone’s
situation is unique and the decision whether to install an update is a
personal one. For a list of known bugs, check out our bug list when it
becomes available.
https://applevis.com/blog/whats-new-ios-26-accessibility-blind-deafblind-users#new
