VoiceOver has a very cool gesture called the Magic Tap (double tap with two fingers). It should execute the most important task for the current state of the app. Examples: start/stop timer, play/pause music, take a photo, compose a tweet...

With the Twitter app open, it doesn't matter where the VoiceOver focus is: if I double tap with two fingers, the compose tweet screen is presented. Presumably Twitter overrides accessibilityPerformMagicTap(), runs the code that presents that screen, and returns true.

To capture that gesture in your own app, you just need to override accessibilityPerformMagicTap(), execute the desired code, and return true if it was handled successfully (see the sketch below).

https://developer.apple.com/documentation/objectivec/nsobject-swift.class/accessibilityperformmagictap()
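A minimal sketch of what such an override could look like in a view controller; the playback state and helper methods are hypothetical stand-ins for your app's own logic:

```swift
import UIKit

final class PlayerViewController: UIViewController {
    // Hypothetical playback state, standing in for your app's real model.
    private var isPlaying = false

    // VoiceOver calls this when the user performs the Magic Tap
    // (a two-finger double tap) while this controller is on screen.
    override func accessibilityPerformMagicTap() -> Bool {
        isPlaying.toggle()
        isPlaying ? startPlayback() : pausePlayback()
        return true // true tells VoiceOver the gesture was handled
    }

    private func startPlayback() { /* start the audio engine */ }
    private func pausePlayback() { /* pause the audio engine */ }
}
```

If the focused element doesn't handle the gesture, the system keeps looking up the hierarchy until it reaches the application delegate, so implementing the override there is a good place for an app-wide action like Twitter's compose screen.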

You may also find these interesting...

If your watch app has good VoiceOver support, chances are you'll also have good AssistiveTouch support. One improvement you can still make is to implement a quick action (triggered with a double pinch) when there is a main action the user can perform; see the sketch below.
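As a rough sketch, assuming the SwiftUI accessibilityQuickAction(style:content:) modifier available on watchOS 9 and later; the workout names and state here are made up for illustration:

```swift
import SwiftUI

struct WorkoutView: View {
    // Hypothetical workout state for this sketch.
    @State private var isRunning = false

    var body: some View {
        VStack {
            Text(isRunning ? "Workout running" : "Workout paused")
            Button(isRunning ? "Pause" : "Start") { isRunning.toggle() }
        }
        // The AssistiveTouch quick action (double pinch) triggers
        // the button declared inside this modifier.
        .accessibilityQuickAction(style: .outline) {
            Button(isRunning ? "Pause Workout" : "Start Workout") {
                isRunning.toggle()
            }
        }
    }
}
```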

VoiceOver's escape gesture (a two-finger Z-shaped scrub) should dismiss the current modal view or go back a screen. To capture it, you can override the accessibilityPerformEscape() function, dismiss your view there, and return true if you handled it successfully.
https://developer.apple.com/documentation/objectivec/nsobject-swift.class/accessibilityperformescape()
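A minimal sketch of the override in a modally presented view controller; the controller name is hypothetical:

```swift
import UIKit

final class FilterSheetViewController: UIViewController {
    // VoiceOver calls this when the user performs the escape gesture
    // (a two-finger Z-shaped scrub) while this controller is presented.
    override func accessibilityPerformEscape() -> Bool {
        dismiss(animated: true)
        return true // the gesture was handled: the sheet is dismissed
    }
}
```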

You can also add an observer for UIContentSizeCategory.didChangeNotification to listen for changes in the Dynamic Type content size category, in case that is more convenient than overriding traitCollectionDidChange(_:).
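A minimal sketch, assuming a label whose font you want to refresh by hand (labels using preferredFont(forTextStyle:) with adjustsFontForContentSizeCategory set to true update automatically):

```swift
import UIKit

final class ArticleViewController: UIViewController {
    private let bodyLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Posted whenever the user changes their preferred text size.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(contentSizeCategoryDidChange),
            name: UIContentSizeCategory.didChangeNotification,
            object: nil
        )
    }

    @objc private func contentSizeCategoryDidChange(_ notification: Notification) {
        // Re-apply any fonts or layout that don't adjust on their own.
        bodyLabel.font = .preferredFont(forTextStyle: .body)
    }
}
```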
