Screen Readers in Cross-Platform Mobile App Development

Screen readers

Working in the digital health space continues to highlight the importance of integrating screen readers into our apps and development practices. In this blog post I’ll explore how this can be done, with the specific case of Xamarin mobile app development in mind.

Screen readers utilise synthesised speech to convey content in an app that would otherwise be inaccessible to the user. Screen reader users are typically people who are blind or visually impaired and require an alternative means of using apps. Many screen readers exist; below is a summary of a few key ones for iOS and Android:

VoiceOver

iOS uses VoiceOver as its screen reader. VoiceOver follows Apple’s general theme of providing a closed, secure, and tightly integrated ecosystem, with the operating system kept in control of as much as possible. This manifests in some VoiceOver quirks, which we discovered, funnily enough, during a client meeting where VoiceOver suggested interpretations of images without being explicitly told to do so.

TalkBack

Android uses the TalkBack screen reader. TalkBack follows more of an open-source ethos, with its source code fully visible. This has several benefits: developers can leverage that visibility to produce a quality, somewhat more app-tailored screen reader experience.

Voice Assistant

A screen reader used on Samsung devices prior to TalkBack’s widespread adoption across Android.

Preliminary philosophical notes

At the outset, it should be stated that there is no universally accepted best screen reader to date; this is a field of software development and UX that is constantly undergoing new innovation and development as our collective understanding improves. Different platforms, including the two most commonly used, iOS and Android, adopt different conventions and stylistic choices. However, a general principle that is often discussed in the accessibility world is to prioritise three aspects: name, role, and value (which I will abbreviate as NRV).

This provides a good general rule of thumb for what to consider when setting accessibility properties on visual elements: the name of an element, its role in the application, and the value it provides to a user via gestures such as tap, double-tap, and swipe.

“Basic” Xamarin.Forms screen reader accessibility

Xamarin.Forms provides a class called AutomationProperties, which has two basic accessibility properties utilised throughout the app:

AutomationProperties.Name

NRV Classification: “Name”

This property can be set on any element to define the text that is read aloud by screen readers.
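As a minimal sketch, assuming a hypothetical image button used as a close control, the name can be set from code-behind (the XAML equivalent is AutomationProperties.Name="Close"):

```csharp
using Xamarin.Forms;

// Hypothetical example: give the close icon a meaningful name so that
// VoiceOver/TalkBack read "Close" rather than nothing (or a file name).
var closeButton = new ImageButton { Source = "close_icon.png" };
AutomationProperties.SetName(closeButton, "Close");
```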

AutomationProperties.IsInAccessibleTree

This property can be set on any element to control whether it is detectable by a screen reader. In a custom control, it is often helpful to hide as many elements as possible by setting AutomationProperties.IsInAccessibleTree="false" and only setting AutomationProperties.IsInAccessibleTree="true" on the key elements (those that should be highlighted by the screen reader).
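A minimal sketch of that pattern, assuming a hypothetical SummaryCard custom control: the inner image and label are hidden from the accessibility tree, and only the containing frame is exposed with a single name.

```csharp
using Xamarin.Forms;

// Hypothetical custom control: a card built from several child views.
// Only the outer frame is exposed to screen readers, so the card is
// announced as a single element rather than three separate ones.
public class SummaryCard : Frame
{
    public SummaryCard(string title)
    {
        var icon = new Image { Source = "info.png" };
        var label = new Label { Text = title };

        // Hide the inner elements from the screen reader.
        AutomationProperties.SetIsInAccessibleTree(icon, false);
        AutomationProperties.SetIsInAccessibleTree(label, false);

        // Expose and name only the card itself.
        AutomationProperties.SetIsInAccessibleTree(this, true);
        AutomationProperties.SetName(this, title);

        Content = new StackLayout { Children = { icon, label } };
    }
}
```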

Accessibility Effects

Aside from the basic Xamarin.Forms functionality, additional accessibility effects were created specifically to leverage platform-specific quirks of VoiceOver and TalkBack. The following are attached (static) properties that can be set in the XAML of a view, as well as in the code-behind (the .xaml.cs files).
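To give a flavour of how such properties can be declared in the shared project, here is a sketch of an Accessibility class exposing DoubleTapText as an attached bindable property; the names mirror those used below (they are illustrative, not the exact implementation), while the platform-specific behaviour lives in platform effects sketched in the following sections.

```csharp
using Xamarin.Forms;

// Sketch of the shared attached-property surface. Only DoubleTapText is shown;
// AccessibilityTraits and AndroidRole would follow the same pattern.
public static class Accessibility
{
    public static readonly BindableProperty DoubleTapTextProperty =
        BindableProperty.CreateAttached(
            "DoubleTapText", typeof(string), typeof(Accessibility), default(string));

    public static string GetDoubleTapText(BindableObject view) =>
        (string)view.GetValue(DoubleTapTextProperty);

    public static void SetDoubleTapText(BindableObject view, string value) =>
        view.SetValue(DoubleTapTextProperty, value);
}
```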

Accessibility.AccessibilityTraits (iOS-Specific)

NRV Classification: “Role” (typically, but can be used for other more nuanced use cases)

iOS currently supports a collection of traits for its views (see the UIAccessibilityTrait enum for more details on the specific traits available).
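As an illustrative sketch (the effect name and wiring are assumptions rather than the exact implementation), an iOS platform effect can apply a trait to the native view behind a Xamarin.Forms element:

```csharp
using UIKit;
using Xamarin.Forms.Platform.iOS;

// iOS-only sketch: mark the native view as a button so VoiceOver
// announces the "button" trait after the element's name.
public class AccessibilityTraitsEffect : PlatformEffect
{
    protected override void OnAttached()
    {
        if (Control is UIView nativeView)
        {
            nativeView.AccessibilityTraits = UIAccessibilityTrait.Button;
        }
    }

    protected override void OnDetached()
    {
    }
}
```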

Accessibility.DoubleTapText

NRV Classification: “Value”

The concept of “DoubleTapText” originates from the text of the “double tap” action on Android. It is useful for providing context on what activating a custom control will achieve in the app. For example, if clicking a button opens a PDF document, the double tap text will be read as “Double tap to open PDF”. Other examples could be “Double tap to navigate” on buttons that cause navigation, or “Double tap to select filter” for selecting filters. This concept of double tap text can be applied generally to any action, providing a mechanism for screen reader users to understand what different elements do.
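One way to realise this on iOS, sketched under the assumption of the Accessibility.DoubleTapText attached property from the earlier sketch, is to map it to the native AccessibilityHint, which VoiceOver reads after the name and role; on Android the equivalent is typically a custom click-action label applied through an accessibility delegate.

```csharp
using UIKit;
using Xamarin.Forms.Platform.iOS;

// iOS-only sketch: surface DoubleTapText as the VoiceOver hint,
// e.g. "Double tap to open PDF".
public class DoubleTapTextEffect : PlatformEffect
{
    protected override void OnAttached()
    {
        if (Control is UIView nativeView)
        {
            nativeView.AccessibilityHint = Accessibility.GetDoubleTapText(Element);
        }
    }

    protected override void OnDetached()
    {
    }
}
```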

Accessibility.AndroidRole (Android-specific)

NRV Classification: “Role”

Allows for setting the role of a control on Android. This is particularly helpful when creating custom controls. On Android, an element is generally announced by TalkBack (or other screen reader software, e.g., Samsung Voice Assistant) in the following sequence:

<AutomationProperties.Name>, <AndroidRole>, <DoubleTapText>
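TalkBack derives the spoken role from the accessibility node’s class name, so one plausible implementation (a sketch; the class and names here are assumptions) is an AccessibilityDelegate that overrides the reported class on the native view:

```csharp
using Android.Views;
using Android.Views.Accessibility;

// Android-only sketch: report a different class name so TalkBack
// announces the desired role (e.g. "button") for a custom control.
public class RoleAccessibilityDelegate : View.AccessibilityDelegate
{
    private readonly string _className;

    public RoleAccessibilityDelegate(Java.Lang.Class roleClass)
    {
        _className = roleClass.Name; // e.g. "android.widget.Button"
    }

    public override void OnInitializeAccessibilityNodeInfo(View host, AccessibilityNodeInfo info)
    {
        base.OnInitializeAccessibilityNodeInfo(host, info);
        info.ClassName = _className;
    }
}

// Usage, e.g. inside an Android platform effect or custom renderer:
// nativeView.SetAccessibilityDelegate(new RoleAccessibilityDelegate(
//     Java.Lang.Class.FromType(typeof(Android.Widget.Button))));
```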

Accessibility Service

IAccessibilityService is an accessibility service that can be added to Xamarin apps. It contains a variety of helpful methods and properties that enhance the existing accessibility features.
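A sketch of what such a contract could look like, based on the capabilities described in the rest of this section (the member names are illustrative rather than the exact interface):

```csharp
using Xamarin.Forms;

public interface IAccessibilityService
{
    // True when VoiceOver, TalkBack, or another screen reader is running.
    bool ScreenReaderActive { get; }

    // Speak an ad-hoc announcement, e.g. "loading" / "finished loading".
    void Announce(string message);

    // Move screen reader focus to the given element, e.g. after a layout
    // change or when a modal/alert appears.
    void ChangeFocus(View view);
}
```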

Check if a screen reader is active

The property IAccessibilityService.ScreenReaderActive can be used in the shared Xamarin.Forms code to determine whether a screen reader is active.
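The platform implementations behind that property could look roughly like the following sketches (shown together for brevity; in practice each lives in its own platform project, and the class names are assumptions):

```csharp
using Android.Content;
using Android.Views.Accessibility;
using UIKit;

// iOS sketch: VoiceOver exposes a dedicated flag.
public class IosScreenReaderStatus
{
    public bool ScreenReaderActive => UIAccessibility.IsVoiceOverRunning;
}

// Android sketch: ask the AccessibilityManager; touch exploration being
// enabled is the usual signal that TalkBack (or a similar reader) is active.
public class AndroidScreenReaderStatus
{
    public bool ScreenReaderActive
    {
        get
        {
            var manager = (AccessibilityManager)Android.App.Application.Context
                .GetSystemService(Context.AccessibilityService);
            return manager != null && manager.IsEnabled && manager.IsTouchExplorationEnabled;
        }
    }
}
```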

Announcements

Announcements are useful for events and other miscellaneous information that needs to be conveyed to the user. Some examples of events that are currently announced are:

  • “loading” and “finished loading”
    This was particularly useful when a page takes some time to load and would otherwise be confusing to screen reader users, as the page lacks any elements while it loads.
  • “refreshing” and “finished refreshing”
    This was similarly useful for refresh actions, where progress is otherwise only indicated visually.
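Under the hood, announcements can be posted with the native accessibility APIs. The following is a sketch (each class would live in its platform project; the class names and the way the current Activity is obtained are assumptions):

```csharp
using Android.App;
using Foundation;
using UIKit;

// iOS sketch: post a VoiceOver announcement notification.
public class IosAnnouncer
{
    public void Announce(string message) =>
        UIAccessibility.PostNotification(
            UIAccessibilityPostNotification.Announcement, new NSString(message));
}

// Android sketch: any attached view can announce on behalf of the app;
// here the current activity's decor view is used.
public class AndroidAnnouncer
{
    private readonly Activity _activity;

    public AndroidAnnouncer(Activity activity) => _activity = activity;

    public void Announce(string message) =>
        _activity?.Window?.DecorView?.AnnounceForAccessibility(message);
}
```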

Managing screen reader focus

From collaboration in the digital health space, one of the key takeaways for a quality screen reader experience is the ability to manage focus when the layout changes or a modal/alert appears. To this end, a ChangeFocus method was implemented in our IAccessibilityService that accepts a Xamarin.Forms.View and focuses that element with the currently active screen reader.
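A sketch of how ChangeFocus can be implemented per platform, assuming the native view behind a Xamarin.Forms element is resolved via Platform.GetRenderer (the class names are illustrative):

```csharp
// iOS sketch: posting a LayoutChanged notification with the native view
// as the argument moves VoiceOver focus to that view.
public class IosFocusChanger
{
    public void ChangeFocus(Xamarin.Forms.View view)
    {
        var renderer = Xamarin.Forms.Platform.iOS.Platform.GetRenderer(view);
        if (renderer?.NativeView != null)
        {
            UIKit.UIAccessibility.PostNotification(
                UIKit.UIAccessibilityPostNotification.LayoutChanged,
                renderer.NativeView);
        }
    }
}

// Android sketch: raising an accessibility-focus event asks TalkBack
// to move its focus to the resolved native view.
public class AndroidFocusChanger
{
    public void ChangeFocus(Xamarin.Forms.View view)
    {
        var renderer = Xamarin.Forms.Platform.Android.Platform.GetRenderer(view);
        renderer?.View?.SendAccessibilityEvent(
            Android.Views.Accessibility.EventTypes.ViewAccessibilityFocused);
    }
}
```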

Screen reader support continues to evolve as we expand our knowledge and capabilities in software development and user experience design, with ongoing innovation driving new development in this space and ensuring greater levels of accessibility are achieved.