Imagine playing a piano with VoiceOver. You'd have to find the key you want to play and then double-tap it. It would be a very difficult experience. With the .allowsDirectInteraction accessibility trait, VoiceOver passes touch gestures straight through to the view.

The GarageBand app is open. It shows how the top area of the app, with all the regular controls, can be used with VoiceOver as usual. The bottom part of the app, with the piano keys, has the .allowsDirectInteraction accessibility trait so it can be used with touch directly.

Use it carefully! And only when it really makes sense to handle controls directly with touch. Other examples could be a drawing app or some games.
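
Here's a minimal UIKit sketch of how the trait can be applied, assuming a hypothetical PianoKeyboardView that handles its own touch events (the class and label are illustrative, not GarageBand's actual implementation):

```swift
import UIKit

// Hypothetical custom view that draws piano keys and handles touches itself.
final class PianoKeyboardView: UIView {

    override init(frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityLabel = "Piano keyboard"
        // With this trait, VoiceOver passes touches straight through to the
        // view instead of requiring the usual select-then-double-tap gesture.
        accessibilityTraits = .allowsDirectInteraction
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```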

You may also find interesting...

The Accessibility APIs are generic and flexible; they're not just for VoiceOver. Implement them well once and the result will very likely work great for VoiceOver, Voice Control, Switch Control, Full Keyboard Access, and more. That's why, to start with, we tend to focus on VoiceOver, the same way you might focus on keyboard navigation on the web. A great VoiceOver experience will get you most of the way to a good experience with the other assistive technologies. We've seen one example with Custom Actions. One implementation works for:

- VoiceOver: https://x.com/dadederk/status/1550099327053451266
- Switch Control: https://x.com/dadederk/status/1551236244088279040
- Full Keyboard Access: https://x.com/dadederk/status/1551874732504629249
- Voice Control: https://x.com/dadederk/status/1552253520182640645

Of course, that doesn't mean you can skip testing with the other technologies. But before feeling overwhelmed, or for small teams, making sure your app works well with VoiceOver is a great start.
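
As a rough illustration of what "one implementation" looks like, here's how custom actions can be exposed in UIKit; the cell class and action names are hypothetical, not taken from the posts linked above:

```swift
import UIKit

// Hypothetical table view cell exposing custom actions. The same
// accessibilityCustomActions array is what VoiceOver, Switch Control,
// Full Keyboard Access, and Voice Control all surface to the user.
final class PostCell: UITableViewCell {

    func configureAccessibilityActions() {
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Reply") { _ in
                // Trigger the reply flow here.
                return true
            },
            UIAccessibilityCustomAction(name: "Share") { _ in
                // Trigger the share flow here.
                return true
            }
        ]
    }
}
```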

UINotificationFeedbackGenerator has a “success” feedback type. Consider triggering it when a task completes successfully, alongside any other visual or audio cues. Using multiple modalities makes your app easier for everyone to understand.
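
A minimal sketch of how that could look, assuming you call a helper like this (hypothetical function name) at the point where your task finishes:

```swift
import UIKit

/// Call this when a task completes successfully, alongside any
/// visual or audio confirmation the app already shows.
func notifyTaskSucceeded() {
    let feedbackGenerator = UINotificationFeedbackGenerator()
    feedbackGenerator.prepare() // optional: reduces latency before the event fires
    feedbackGenerator.notificationOccurred(.success)
}
```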

Today I want to share something I use a lot. You can turn any article into a “podcast” by enabling Speak Screen in Accessibility settings, switching to Safari’s Reader Mode, and swiping down with two fingers from the top of the screen. I think it's a good example of how, if we all knew more about the assistive technologies available in iOS, we'd find ourselves using them more often. It illustrates quite well that accessibility benefits everyone.
