
WWDC 2021: Accessibility In-Depth

Apple puts great effort into making its accessibility tools easy to implement and use. Providing developers with the tools they need to add accessibility to their apps is just one of the ways that they display their commitment to accessibility. Every year Apple builds on the previous year’s functionality. In this article, we cover some of the tools and concepts new to Apple’s accessibility suite.


SwiftUI accessibility

As Apple developers, we all know where to look for the tools of accessibility implementation. This year, Apple introduced Accessibility Preview, which allows inspection of accessibility elements in real time. Developers now have much better visibility into just how accessible their app really is as they write it. Adding accessibility has historically been difficult for many reasons, the most obvious being the lack of an easy way to check your work. Until now, it was far too easy to overlook key areas of an app; entire regions were often left unaddressed, leaving users with an incomplete experience. Accessibility Preview is a game-changer because it lets us instantly see how each view in the app is perceived at the accessibility level.

Another new addition is the ability to make custom controls more accessible with the accessibilityRepresentation modifier. This modifier substitutes one view's accessibility representation for another's, meaning your custom control can expose the accessibility behavior of a standard, well-understood control. Instead of struggling to define how your custom view should behave for assistive technologies, you can now borrow from a similar, already-accessible view.

// Custom controls
import SwiftUI

struct BudgetSlider: View {
    @Binding var value: Double
    var label: String

    var body: some View {
        VStack(alignment: .leading) {
            ...
        }
        .accessibilityRepresentation {
            // Expose this custom control to assistive
            // technologies as a standard Slider.
            Slider(value: $value, in: 0...1) {
                Text(label)
            }
        }
    }
}

Alongside this new functionality is the updated ability to enhance the accessibility of a parent view with accessible child views. Specifically, it allows the conversion of an accessibilityElement into an accessibilityContainer. This is beneficial because the element retains its own accessibility properties while merging its accessibility children into a single parent element. With this, people using VoiceOver will clearly hear what a section is about without sifting through the individual child elements.

// Navigation
import SwiftUI

struct FriendsView: View {
    var users: [User]

    var body: some View {
        ScrollView(.horizontal, showsIndicators: false) {
            HStack {
                ForEach(users) { user in 
                    FriendCellView(user: user)
                        .accessibilityElement(children: .combine)
                        .onTapGesture { ... }
                }
            }
        }
    }
}

To deliver an optimal experience to VoiceOver users, developers need to spend some time on rotors. Rotors are Apple's solution for interacting with a screen without being able to see it. In short, a rotor can be thought of as a set of bookmarks that users can navigate through. For example, by selecting Headings on the rotor and swiping, a user can skim between the sections available on screen. While much of this functionality is automatic, some cases need special attention. Developers now have the ability to customize the rotor experience for their app using the SwiftUI Rotors API.

// Rotors
import SwiftUI

struct AlertsView: View {
    var alerts: [Alert]
    @Namespace var namespace

    var body: some View {
        VStack {
            ...
        }
        .accessibilityElement(children: .contain)
        .accessibilityRotor("Warning") {
            ForEach(alerts) { alert in 
                if alert.isWarning {
                    AccessibilityRotorEntry(alert.title, id: alert.id, in: namespace)
                }
            }
        }
    }
}

Lastly, new to SwiftUI is the AccessibilityFocusState property wrapper. It gives the developer the ability to set accessibility focus programmatically. Typically, focus is taken care of for you, but sometimes custom views or actions don't automatically gain it. Here developers can adjust the accessibility experience to match the user experience, providing smooth transitions between views.

// Focus
import SwiftUI

struct AlertNotificationsView: View {
    @Binding var notification: Notification?
    @AccessibilityFocusState var isNotificationFocused: Bool

    var body: some View {
        ZStack(alignment: .top) {
            ...
            if let notification = notification {
                NotificationBanner(notification: notification)
                    .accessibilityFocused($isNotificationFocused)
            }
        }
        .onChange(of: notification) { _ in
            if notification?.priority == .high {
                isNotificationFocused = true
            } else {
                postAccessibilityNotification()
            }
        }
    }
}


Charts

Visual graphs are so commonly used that we tend not to give them a second thought. The main purpose of a graph is to convey a lot of information at a glance, but that information is lost on someone with vision difficulties. Apple has been making progress in this area and has recently released new ways developers can make their charts and graphs accessible to VoiceOver.

In typical Apple fashion, they have made it very easy to implement accessibility. Adding accessibility enhancements to visual data charts is as simple as defining the accessibilityContainer, creating an accessibility label for the chart, and defining what elements are contained in the new container. The system takes care of the rest.

The first step to adding accessibility to your chart is to create an accessibilityContainer. Doing so improves VoiceOver navigation and defines the parts of your accessible chart. This is done by overriding the accessibilityContainerType and returning .semanticGroup.  

extension ChartView {
    public override var accessibilityContainerType: UIAccessibilityContainerType {
        get {
            return .semanticGroup
        }
        set {}
    }
}

The next step is to override the accessibilityLabel. This should ideally be the title of the chart and is what VoiceOver speaks for its description. 

extension ChartView {
    public override var accessibilityContainerType: UIAccessibilityContainerType { ... }
    public override var accessibilityLabel: String? {
        get {
            return self.model.title
        }
        set {}
    }
}

The last step is to override and define the accessibilityElements within your chart. Each element takes the parameters that define the remainder of what is needed for accessibility, such as the parent container, a frame for where on the screen the accessibility element is located, and the data point's value for VoiceOver. With these simple changes, VoiceOver users can now navigate the chart and learn about individual data points.

extension ChartView {
    public override var accessibilityContainerType: UIAccessibilityContainerType { ... }
    public override var accessibilityLabel: String? { ... }

    public override var accessibilityElements: [Any]? {
        get {
            return model.dataPoints.map { point in 
                let axElement = UIAccessibilityElement(accessibilityContainer: self)
                axElement.accessibilityValue = "\(point.x) cups, \(point.y) lines of code"
                axElement.accessibilityFrameInContainerSpace = frameRect(for: point)
                return axElement
            }
        }
        set {}
    }
}


Supporting audio graphs 

Adding a basic level of accessibility to your chart is a great first step, but let's make sure we provide the best experience possible by adding audio graph support. Audio graph support adds an audio component that lets people with vision impairments understand and interpret your chart more easily.

First, conform your chart view to the AXChart protocol. This is simple because there is only one property to implement: accessibilityChartDescriptor. The chart descriptor provides VoiceOver with everything it needs to enable the audio graph experience.
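As a sketch, the conformance looks like this, assuming the same ChartView class from the earlier snippets; the property body is built up in the steps that follow:

```swift
import UIKit
import Accessibility

extension ChartView: AXChart {
    public var accessibilityChartDescriptor: AXChartDescriptor? {
        get {
            // Built up step by step below: axis descriptors,
            // a data series descriptor, and finally the full
            // AXChartDescriptor.
            return nil
        }
        set {}
    }
}
```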

Next, define axes using AXNumericDataAxisDescriptor for numeric data and the AXCategoricalDataAxisDescriptor for categorical data. We use this data object to define the title for the axis, the range of the axis, and valueDescriptionProvider to define how VoiceOver will speak the data for that axis. One axis descriptor is needed for each axis.  

public var accessibilityChartDescriptor: AXChartDescriptor? {
    get {
        let xAxis = AXNumericDataAxisDescriptor( ... )
        let yAxis = AXNumericDataAxisDescriptor(title: model.yAxis.title,
                                                range: model.yAxis.range,
                                                gridlinePositions: [],
                                                valueDescriptionProvider: { value in 
                                                    return "\(vaalue) lines of code"
                                                })
    }
    set {}
}
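The snippet above elides the x-axis. If the x values were categories rather than numbers, the axis could be defined with AXCategoricalDataAxisDescriptor instead; this is a minimal sketch with hypothetical day-name categories:

```swift
import Accessibility

// Hypothetical categorical x-axis: day names instead of numeric values.
// The category order determines the order VoiceOver traverses them.
let xAxis = AXCategoricalDataAxisDescriptor(
    title: "Day of week",
    categoryOrder: ["Mon", "Tue", "Wed", "Thu", "Fri"]
)
```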

Then, create your dataSeriesDescriptor. The dataSeriesDescriptor is what defines each series of data for VoiceOver. Charts can have multiple data series contained within; therefore, multiple definitions are allowed. The isContinuous property required by the object initialization denotes how VoiceOver will interpret the data. True is used to define a chart with a continuous string of data, such as a line graph. False is used to say that it has discrete data points such as a bar graph.

public var accessibilityChartDescriptor: AXChartDescriptor? {
    get {
        let xAxis = AXNumericDataAxisDescriptor( ... )
        let yAxis = AXNumericDataAxisDescriptor( ... )
        let series = AXDataSeriesDescriptor(name: model.title,
                                            isContinuous: false,
                                            dataPoints: model.data.map { point in
                                                AXDataPoint(x: point.x,
                                                            y: point.y,
                                                            additionalValues: [],
                                                            label: point.name)
                                            })
    }
    set {}
}

Lastly, we need to return an AXChartDescriptor composed of all the previous objects. It is important to consider the summary property. This is what will be spoken by VoiceOver when you want to learn more about the chart. It is intended to be where developers would communicate the most important insights about their chart.

public var accessibilityChartDescriptor: AXChartDescriptor? {
    get {
        let xAxis = AXNumericDataAxisDescriptor( ... )
        let yAxis = AXNumericDataAxisDescriptor( ... )
        let series = AXDataSeriesDescriptor( ... )
         return AXChartDescriptor(title: model.title,
                                  xAxis: xAxis,
                                  yAxis: yAxis,
                                  additionalAxes: [],
                                  series: [series])
    }
    set {}
}


Tailoring the VoiceOver experience

This year, Apple has highlighted new and easier ways to integrate the VoiceOver experience into your SwiftUI views. While SwiftUI takes care of many of the more mundane accessibility integrations, a few things are new and noteworthy. First is the ability to hide certain accessibility elements using accessibilityHidden. To many, this may seem counterproductive, but Apple has laid out some simple, concise guidelines for VoiceOver that really bring this idea together. The key concept is that delivering content in measured portions is crucial to the VoiceOver experience. Too much data may overwhelm the user, and too little may leave the user with an incomplete experience. With these things in mind, it is easy to see how these new tools fall into place.
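For example, a purely decorative element only adds noise for VoiceOver users and can be hidden. This is a minimal sketch, assuming a hypothetical cell with a decorative background image:

```swift
import SwiftUI

struct TrophyCell: View {
    var body: some View {
        ZStack {
            // Decorative only: hide it so VoiceOver skips straight
            // to the meaningful content.
            Image("confetti-background")
                .accessibilityHidden(true)
            Text("Gold Trophy")
        }
    }
}
```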

Accessibility custom content is not new to Apple's accessibility APIs, but it is now available in SwiftUI through the accessibilityCustomContent modifier. Developers can use it to surface additional accessibility content in their app. Alongside it comes AccessibilityCustomContentKey, which lets you specify label and identifier information globally, so accessibility content can be altered in a single place.

extension AccessibilityCustomContentKey {
    static var age: AccessibilityCustomContentKey {
        AccessibilityCustomContentKey("Age")
    }
}

struct DogCell: View {
    var dog: Dog
    var body: some View {
        VStack {
            ...
        }
        .accessibilityElement(children: .combine)
        .accessibilityCustomContent(.age, dog.age, importance: .high)
    }
}


Apple Watch

This year, Apple has brought to the accessibility suite a new and innovative way to interact with Apple Watch. It's called AssistiveTouch, and it enables users with motor impairments, such as limited use of an arm or hand, to have full use of their Apple Watch without touch. AssistiveTouch lets the user navigate the watch interface using hand gestures and motions: tapping an element or bringing up interface action menus by clenching a fist, and moving between elements with pinch gestures. For those who cannot use hand gestures, an alternative is the motion pointer, which uses wrist motion to move a cursor around the screen. Similar to a spirit level's bubble, the cursor moves in conjunction with the tilting of the wrist. To make the best use of this new technology, Apple encourages developers to take a few simple steps to deliver the best accessibility experience for everyone.

The first requirement is understanding the focusable elements in your Apple Watch app's views: the elements that need to be focusable must actually be so. In SwiftUI, many standard element types are focusable by default, including most standard controls such as buttons, toggles, and links. Static elements that don't fit into this category, such as text and images, can achieve a focusable state by adding the accessibilityRespondsToUserInteraction modifier. This is a quick and clean way to add focusability to elements that need to be part of the VoiceOver experience.

// Make static element focusable
var body: some View {
    VStack(spacing: 10) {
        FreeDrinkTitleView()

        FreeDrinkInfoView()

        HStack {
            CancelButton(buttonTapped: $didCancel)
            AcceptButton(buttonTapped: $didAccept)
        }
    }
    .onTapGesture {
        showDetail.toggle()
    }
    .sheet(isPresented: $showDetail, onDismiss: dismiss) {
        DrinkDetailModalView()
    }
}
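For a purely static element, the modifier mentioned above can be applied directly. This is a minimal sketch with a hypothetical banner view:

```swift
import SwiftUI

// Hypothetical static banner: plain text is not focusable by default,
// so opt it in for AssistiveTouch and Switch Control users.
struct PromoBannerView: View {
    var body: some View {
        Text("Free drink with your next order!")
            .accessibilityRespondsToUserInteraction(true)
    }
}
```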

Since AssistiveTouch uses a cursor to highlight elements in the view, it is important to ensure the cursor does not interfere with or obscure the underlying layout. By default, the cursor is confined to the element's frame and drawn along its outside edge. Depending on the element's size, the cursor can overlap underlying text and images if care is not taken. With the contentShape modifier, developers can change the shape and size of the cursor region relative to the underlying element, giving the user a better experience without much modification.

// AssistiveTouch Cursor Frame
var body: some View {
    HStack(alignment: .firstTextBaseline) {
        DrinkInfoView(drink: currentDrink)

        Spacer()

        NavigationLink(destination: EditView()) {
            Image(systemName: "ellipsis")
                .symbolVariant(.circle)
        }
        .contentShape(Circle().scale(1.5))
    }
}

Last is the action menu, a view-contextual menu used to interact with the device and application. Here we find default actions such as "Press Crown" or scrolling. If your application has already defined custom actions for the view, these appear at the beginning of the action menu list. By default, the button created for a custom action displays only the first letter of the action's name. Now, Apple has given developers the ability to customize the experience further by introducing an accessibilityActionLabel, which can display more useful information to the user with a custom label and image.

// AssistiveTouch Action Menu

PlantContainerView(plant: plant)
    .padding()
    .accessibilityAction {
        // Edit Action
    } label: {
        Label("Edit", systemImage: "ellipsis.circle")
    }


Picture and code sample sources:

Swift concurrency: Update a sample app. Uploaded by Apple, 09 June 2021, https://developer.apple.com/videos/play/wwdc2021/10119/.

Bring accessibility to charts in your app. Uploaded by Apple, 08 June 2021, https://developer.apple.com/videos/play/wwdc2021-10122/.

Tailor the VoiceOver experience in your data-rich apps. Uploaded by Apple, 11 June 2021, https://developer.apple.com/videos/play/wwdc2021/10121/.

Create accessible experiences for watchOS. Uploaded by Apple, 09 June 2021, https://developer.apple.com/videos/play/wwdc2021/10223/.