Abstracting your interface so it can offer multiple input and output mechanisms is key when developing software with an accessibility mindset. Apple has taken this to the next level in visionOS.

Create accessible spatial experiences

[Drawing of the Apple Vision Pro with arrows pointing to the interaction mechanisms it supports: eye tracking, gestures, VoiceOver (and its Direct Gesture mode), Dwell Control, Pointer Control (which works with the eyes, head, wrist, or index finger), and Switch Control.]

Drew Haas, an engineer on Apple's accessibility team, says in the WWDC session: "Allow multiple avenues for physical interaction. Plan and design for your app to support different inputs. This is the best way to ensure you don't accidentally exclude people."
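One way to follow that advice in SwiftUI is to lean on standard controls rather than custom hand-tracking, so the system can map every input method onto the same action. A minimal sketch (the view and its names are hypothetical):

```swift
import SwiftUI

// A standard Button works with every input above out of the box:
// eyes + pinch, Pointer Control, Dwell Control, VoiceOver's direct
// gestures, and Switch Control all trigger the same action.
struct PlayPauseButton: View {
    @State private var isPlaying = false

    var body: some View {
        Button {
            isPlaying.toggle()
        } label: {
            Label(isPlaying ? "Pause" : "Play",
                  systemImage: isPlaying ? "pause.fill" : "play.fill")
        }
    }
}
```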

You may also find interesting...

All the accessibility capabilities you can check for have counterpart notification names you can observe in case the user changes their preferences while using your app. https://x.com/dadederk/status/1577435144129892352
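For example, UIKit's UIAccessibility capability check isReduceMotionEnabled pairs with reduceMotionStatusDidChangeNotification (the same pattern applies to isVoiceOverRunning and voiceOverStatusDidChangeNotification, and so on). A minimal sketch, with a hypothetical observer class:

```swift
import UIKit

// Checks a capability once, then observes its counterpart notification
// for changes the user makes while the app is running.
final class ReduceMotionObserver {
    private(set) var shouldReduceMotion = UIAccessibility.isReduceMotionEnabled
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.reduceMotionStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            // Re-read the capability whenever the preference changes.
            self?.shouldReduceMotion = UIAccessibility.isReduceMotionEnabled
        }
    }

    deinit {
        if let observer { NotificationCenter.default.removeObserver(observer) }
    }
}
```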

Configuring the header accessibility trait, when appropriate, is one of my favourite accessibility quick wins. In this example, a single swipe down gets you from Podcasts to Artists in the app, instead of 12 swipes to the right.
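Applying the trait is a one-liner in both UIKit and SwiftUI (the label and its text are placeholders):

```swift
import SwiftUI
import UIKit

// UIKit: mark a section title as a header for VoiceOver.
let sectionTitle = UILabel()
sectionTitle.text = "Artists"
sectionTitle.accessibilityTraits = .header

// SwiftUI equivalent.
struct SectionHeader: View {
    var body: some View {
        Text("Artists")
            .accessibilityAddTraits(.isHeader)
    }
}
```

With the Headings rotor selected, VoiceOver users can then swipe down to jump from one header to the next instead of stepping through every element in between.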

An alternative layout for large font sizes can be provided with Auto Layout by having three sets of constraints (common constraints, default constraints, and alternative constraints) and activating or deactivating them depending on the content size category, as in the sketch below.
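A minimal sketch of that approach, assuming a hypothetical view where an avatar sits next to a label by default and stacks above it at accessibility text sizes:

```swift
import UIKit

final class ProfileView: UIView {
    private let avatarView = UIImageView()
    private let nameLabel = UILabel()

    // Constraints that apply regardless of text size.
    private var commonConstraints: [NSLayoutConstraint] = []
    // Avatar and label side by side for regular sizes.
    private var defaultConstraints: [NSLayoutConstraint] = []
    // Avatar stacked above the label for accessibility sizes.
    private var largeTextConstraints: [NSLayoutConstraint] = []

    override init(frame: CGRect) {
        super.init(frame: frame)
        for view in [avatarView, nameLabel] {
            view.translatesAutoresizingMaskIntoConstraints = false
            addSubview(view)
        }
        commonConstraints = [
            avatarView.leadingAnchor.constraint(equalTo: leadingAnchor),
            avatarView.topAnchor.constraint(equalTo: topAnchor),
            nameLabel.trailingAnchor.constraint(lessThanOrEqualTo: trailingAnchor)
        ]
        defaultConstraints = [
            nameLabel.leadingAnchor.constraint(equalTo: avatarView.trailingAnchor, constant: 8),
            nameLabel.centerYAnchor.constraint(equalTo: avatarView.centerYAnchor)
        ]
        largeTextConstraints = [
            nameLabel.leadingAnchor.constraint(equalTo: leadingAnchor),
            nameLabel.topAnchor.constraint(equalTo: avatarView.bottomAnchor, constant: 8)
        ]
        updateConstraintsForContentSize()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Swap constraint sets when the preferred text size changes.
        if traitCollection.preferredContentSizeCategory != previousTraitCollection?.preferredContentSizeCategory {
            updateConstraintsForContentSize()
        }
    }

    private func updateConstraintsForContentSize() {
        NSLayoutConstraint.activate(commonConstraints)
        if traitCollection.preferredContentSizeCategory.isAccessibilityCategory {
            NSLayoutConstraint.deactivate(defaultConstraints)
            NSLayoutConstraint.activate(largeTextConstraints)
        } else {
            NSLayoutConstraint.deactivate(largeTextConstraints)
            NSLayoutConstraint.activate(defaultConstraints)
        }
    }
}
```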
