When you set isAccessibilityElement to true, assistive tech like VoiceOver stops looking for other accessible elements in that view's hierarchy. So if we make a view accessible, its subviews, including buttons and labels, won't be accessible.

A tweet is composed of several UIKit elements: a UIImageView for the profile picture; maybe three UILabels for the name, username, and date; a UIButton for more options; probably a UITextView for the text of the tweet; four more UIButtons for comments, retweets, likes, and share; etc. With just these elements (most of which have isAccessibilityElement set to true by default), VoiceOver would focus on nine or so UI elements. If we set isAccessibilityElement to true on the container view instead, VoiceOver will treat the whole thing as a single accessibility element, and everything else inside the view will be ignored.
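Here's a minimal sketch of that idea, assuming a hypothetical TweetCell with made-up subviews and label format. The point is that the cell itself becomes the single element VoiceOver lands on, with a label that summarises what the subviews show.

```swift
import UIKit

/// Hypothetical tweet cell that exposes itself as a single accessibility element.
final class TweetCell: UITableViewCell {
    let avatarImageView = UIImageView()
    let nameLabel = UILabel()
    let usernameLabel = UILabel()
    let dateLabel = UILabel()
    let tweetTextView = UITextView()

    func configure(name: String, username: String, date: String, text: String) {
        nameLabel.text = name
        usernameLabel.text = username
        dateLabel.text = date
        tweetTextView.text = text

        // Make the cell the accessibility element; VoiceOver will now ignore
        // the image view, labels, buttons and text view inside it.
        isAccessibilityElement = true

        // Combine the subviews' information so the user still gets all of it
        // in a single swipe.
        accessibilityLabel = "\(name), \(username), \(date), \(text)"
    }
}
```

In a real cell you'd typically also expose reply, retweet, like, share and more options as accessibilityCustomActions, so those actions stay reachable even though their buttons are no longer individual elements.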

You may also find this interesting...

The equivalent of UIKit's .semanticGroup accessibilityContainerType in SwiftUI is the .accessibilityElement(children:) modifier with the .contain option. Here's a refresher with some use cases: https://x.com/dadederk/status/1558790851496742914
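As a rough sketch of the two sides, assume a simple rating view made of three labels (the names are made up for illustration). In both cases the children stay individually focusable, but VoiceOver treats them as one named group:

```swift
import UIKit
import SwiftUI

// UIKit: mark the container as a semantic group, keeping the labels as
// separate elements inside it.
func makeRatingView() -> UIView {
    let container = UIStackView(arrangedSubviews: [UILabel(), UILabel(), UILabel()])
    container.accessibilityContainerType = .semanticGroup
    container.accessibilityLabel = "Rating"
    return container
}

// SwiftUI: .contain keeps the children as separate elements inside a named group.
struct RatingView: View {
    var body: some View {
        HStack {
            Text("Design")
            Text("Performance")
            Text("Accessibility")
        }
        .accessibilityElement(children: .contain)
        .accessibilityLabel("Rating")
    }
}
```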

Meet the rotor: a menu that you activate, and change options in, by rotating two fingers on the screen. It lets you select different navigation modes and customizations, like navigating through headings or changing VoiceOver's speaking rate.
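The rotor is mostly a user-facing feature, but developers feed some of its options. For example, a label only shows up under the rotor's Headings navigation if it carries the header trait. A tiny sketch, with a made-up title:

```swift
import UIKit

let sectionTitle = UILabel()
sectionTitle.text = "Trending"
// The header trait makes this label reachable through the rotor's
// Headings navigation mode.
sectionTitle.accessibilityTraits = .header
```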

If you send announcement notifications that can step on each other, by default the newest one will interrupt any ongoing announcement. But you can pass attributed strings as the parameter too, which lets you specify that an announcement should be queued instead.
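A small sketch of both behaviours; the announcement text is made up. The queuing is driven by the accessibilitySpeechQueueAnnouncement attribute on the attributed string:

```swift
import UIKit

// Plain string: interrupts whatever VoiceOver is currently saying.
UIAccessibility.post(notification: .announcement, argument: "Download complete")

// Attributed string with the queue attribute: waits for the current
// announcement to finish before being spoken.
let queued = NSAttributedString(
    string: "3 new tweets available",
    attributes: [.accessibilitySpeechQueueAnnouncement: true]
)
UIAccessibility.post(notification: .announcement, argument: queued)
```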
