Creating UIAccessibilityElement instances, combined with a .semanticGroup accessibilityContainerType, can also help you make components as complex as charts accessible.
Example from the "Bring Accessibility to Charts in Your App" WWDC21 session:
https://developer.apple.com/videos/play/wwdc2021/10122/
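
For instance, here is a minimal UIKit sketch of the idea; the chart view, data model, and labels are hypothetical and not taken from the session's sample code:

```swift
import UIKit

/// Hypothetical data model for one bar in the chart.
struct DataPoint {
    let label: String
    let value: Double
}

final class BarChartView: UIView {
    var dataPoints: [DataPoint] = [] {
        didSet { configureAccessibility() }
    }

    private func configureAccessibility() {
        // Group the elements so assistive technologies treat the chart as one unit.
        accessibilityContainerType = .semanticGroup
        accessibilityLabel = "Sales per month"

        // Expose one accessibility element per bar, navigated in order.
        accessibilityElements = dataPoints.map { point in
            let element = UIAccessibilityElement(accessibilityContainer: self)
            element.accessibilityLabel = point.label
            element.accessibilityValue = "\(point.value)"
            // In a real chart this would be the frame of the corresponding bar.
            element.accessibilityFrameInContainerSpace = bounds
            return element
        }
    }
}
```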

You may also find interesting...
If you want to know everything about how to "Tailor the VoiceOver experience in your data-rich apps" with the Accessibility Custom Content API, there is a WWDC21 session: https://developer.apple.com/videos/play/wwdc2021/10121/

When you implement accessibilityCustomContent for supplementary information, you return an array of AXCustomContent objects, and VoiceOver announces them in that order, reading the value of each AXCustomContent first, then its label. In VoiceOver's verbosity settings, users can configure whether VoiceOver says that there is more content available, plays a sound hinting that there is, or does nothing at all. So it really should be optional content, as users might miss it.
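
As a rough sketch of what a conformance can look like (the cell, labels, and values here are made up for illustration; the session has Apple's own sample):

```swift
import Accessibility
import UIKit

final class RecipeCell: UITableViewCell, AXCustomContentProvider {
    var accessibilityCustomContent: [AXCustomContent]! = []

    func configure(name: String, prepTime: String, difficulty: String) {
        accessibilityLabel = name

        let prep = AXCustomContent(label: "Preparation time", value: prepTime)

        let level = AXCustomContent(label: "Difficulty", value: difficulty)
        // .high importance makes VoiceOver always announce this entry; the default
        // importance leaves it as on-demand content users can easily skip.
        level.importance = .high

        accessibilityCustomContent = [prep, level]
    }
}
```
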
The SwiftUI equivalent of UIKit's .semanticGroup accessibilityContainerType is the .accessibilityElement(children:) modifier with the .contain option. Here's a refresher with some use cases: https://x.com/dadederk/status/1558790851496742914
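
A minimal SwiftUI sketch, assuming a hypothetical rating row whose stars should be grouped:

```swift
import SwiftUI

struct RatingRow: View {
    let rating: Int

    var body: some View {
        HStack {
            ForEach(0..<5) { index in
                Image(systemName: index < rating ? "star.fill" : "star")
            }
        }
        // Similar in spirit to UIKit's .semanticGroup container type: the children
        // remain individually focusable, but are contained in a named group.
        .accessibilityElement(children: .contain)
        .accessibilityLabel("Rating")
    }
}
```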

The .accessibilityElement(children:) modifier with the .ignore argument does something similar to making the container view an accessibility element in UIKit. It is the default argument, so you can simply write .accessibilityElement(). Because the children are ignored, you'll need to use other modifiers to make the element accessible, manually configuring an accessibility label, value, traits... when necessary. https://developer.apple.com/documentation/swiftui/view/accessibilityelement(children:) https://developer.apple.com/documentation/swiftui/accessibilitychildbehavior/ignore
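
A minimal sketch, assuming a hypothetical download progress view whose children are ignored and replaced by a single, manually configured element:

```swift
import SwiftUI

struct DownloadProgressView: View {
    let fileName: String
    let progress: Double // 0.0 ... 1.0

    var body: some View {
        VStack(alignment: .leading) {
            Text(fileName)
            ProgressView(value: progress)
        }
        // .ignore is the default, so .accessibilityElement() would do the same.
        .accessibilityElement(children: .ignore)
        // Because the children are ignored, the label and value are configured manually.
        .accessibilityLabel(fileName)
        .accessibilityValue("\(Int(progress * 100)) percent downloaded")
    }
}
```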