Janet Ingber

Apple hosted its annual Worldwide Developers Conference (WWDC), where it introduced a host of new features, both mainstream and accessibility-specific. The conference also included several sessions for those seeking to improve accessibility in their products and services. In this article, we will first detail some of the accessibility sessions that were available at the conference (and that can still be viewed online) before exploring the latest accessibility features arriving in the upcoming Apple OS updates.

Developer Sessions

During the conference, Apple offered a number of sessions on accessibility, along with labs where a developer could work one-on-one with one of Apple’s accessibility experts. In this section, we will detail a few of the sessions, focusing on those most useful to developers working on access for people who are blind or have low vision. A full list of sessions with watch links is available on Apple's website.

Session Descriptions

Perform accessibility audits for your app: The presenter began by telling the group that, worldwide, one in seven people have a disability. He talked about the Accessibility Inspector, a tool that ships with Xcode, Apple’s integrated development environment. He loaded an app and used the Accessibility Inspector to highlight accessibility issues within it, then explained accessibility audits and gave examples.
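
For developers who want to try this themselves, the same audits can also be run automatically from a UI test using the performAccessibilityAudit API that Xcode 15 introduced alongside this session. Below is a minimal sketch; the test class and method names are hypothetical.

    import XCTest

    final class AccessibilityAuditTests: XCTestCase {
        func testMainScreenPassesAccessibilityAudit() throws {
            // Launch the app under test.
            let app = XCUIApplication()
            app.launch()

            // Audit the current screen for common issues such as missing
            // labels, clipped text, and insufficient contrast. A subset
            // such as [.contrast, .dynamicType] can be passed to narrow
            // the audit.
            try app.performAccessibilityAudit()
        }
    }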

Build accessible apps with SwiftUI and UIKit: The presenter explained how recent improvements in Apple’s UI frameworks make it easier to build accessible apps. Topics covered in the session included assigning priority to VoiceOver announcements, refining VoiceOver’s direct touch experiences, and keeping accessibility attributes up to date.
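
As one illustration of the announcement-priority topic, the sketch below assumes the iOS 17 SwiftUI APIs shown in the session (AccessibilityNotification and the accessibilitySpeechAnnouncementPriority attribute); the view and message text are hypothetical.

    import SwiftUI

    struct DownloadButton: View {
        var body: some View {
            Button("Download") {
                // A low-priority announcement: VoiceOver may drop it if
                // other speech is already in progress, rather than
                // interrupting the user.
                var message = AttributedString("Download started")
                message.accessibilitySpeechAnnouncementPriority = .low
                AccessibilityNotification.Announcement(message).post()
            }
        }
    }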

Create accessible spatial experiences: In this session, the two presenters talked about making spatial experiences accessible to everyone. When speaking about visionOS, one of the presenters said, “At Apple, we recognize that access to technology is a fundamental human right. This platform contains the largest list of accessibility features we’ve ever included in the first generation of a product.”

The presenter spoke about making apps accessible for people who are blind or have low vision. He explained that when designing an app to work well with VoiceOver, there are several considerations, including VoiceOver support, visual design, and motion. He opened an app and showed how it can be used with VoiceOver, explained how to add VoiceOver to the accessibility shortcut, and spoke about the gestures to be used in the app.

While demonstrating, he came to a part of the app that was not accessible: VoiceOver did not respond. He fixed the problem by using the new accessibility component in RealityKit, which lets the developer configure aspects such as labels, traits, and custom actions, and he walked through each step of the fix.
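
For readers who develop for visionOS, the sketch below assumes the RealityKit AccessibilityComponent API demonstrated in the session; the entity, label, and action names are illustrative.

    import RealityKit

    // Expose an otherwise silent 3D entity to VoiceOver by attaching
    // an AccessibilityComponent to it.
    func makeAccessible(_ entity: Entity) {
        var accessibility = AccessibilityComponent()
        accessibility.isAccessibilityElement = true
        accessibility.label = "Basketball"       // what VoiceOver announces
        accessibility.traits = [.button]         // tells VoiceOver it can be activated
        accessibility.customActions = ["Bounce"] // extra action offered to VoiceOver users
        entity.components[AccessibilityComponent.self] = accessibility
    }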

New Accessibility Features

On May 16, Apple released a list of new accessibility features. The headline feature for individuals who are blind or have low vision is called Point and Speak. It is accessed through Detection Mode in the Magnifier app: the user points the phone at an object, and the app reads any available text aloud. The example that Apple used was that the feature can help visually impaired people “interact with physical objects such as household appliances.”

To use this new feature, you will need a phone with a LiDAR scanner; the iPhone 12 Pro, 13 Pro, and 14 Pro (including the Pro Max models) have one. It is rumored that the new iPhone Pros will also include it. In addition to Point and Speak, Apple announced the following access features:

  • Siri will have a more expressive voice and Siri’s speech rate will be adjustable.
  • Apple is making it easier for Mac users with low vision to adjust text size.
  • Assistive Access is a new feature designed to help individuals with cognitive disabilities navigate their iPhone or iPad. Features can be customized to fit the accessibility needs of each person.
  • Live Speech lets someone who is non-verbal type responses that are then spoken aloud during phone calls and in-person conversations.
  • Personal Voice gives anyone the ability to create a synthesized voice that sounds like them. Although it was developed for people at risk of losing their ability to speak, anyone can use it.

The Bottom Line

It is exciting that Apple is taking accessibility seriously with the launch of visionOS. Virtual reality is still a largely unexplored field when it comes to accessibility, but considering the realistic soundscapes often present in virtual reality apps, there is a great deal of promise for people who are blind or have low vision.

Though the WWDC keynote focused primarily on mainstream updates, it was heartening to see the number and depth of the access and inclusion sessions available at the conference. If you develop apps for the Apple ecosystem, or know someone who does, the sessions are well worth watching. Also remember that if you are interested in the Google side of accessibility, Syed Hassan from the AFB Talent Lab published a blog post detailing updates from this year's Google I/O.
