Attributed accessibility labels are a thing! They let you (for the whole accessibility label or just a portion of it) specify the language VoiceOver uses, have it read out punctuation marks, spell text out character by character, correct the pronunciation, or even change the pitch.

@RobRWAPP has a very detailed blog post explaining each one of these attributes: https://mobilea11y.com/blog/attributed-accessibility-labels/

And here's Apple's official documentation for them: https://developer.apple.com/documentation/uikit/speech-attributes-for-attributed-strings
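Here's a minimal sketch (UIKit, with a hypothetical label and strings) of how a couple of these speech attributes can be combined in a single attributed accessibility label:

```swift
import UIKit

let titleLabel = UILabel()
titleLabel.text = "Bonjour! Your code is A1B2."

let spokenLabel = NSMutableAttributedString(
    string: "Bonjour!",
    attributes: [.accessibilitySpeechLanguage: "fr-FR"] // read with a French voice
)
spokenLabel.append(NSAttributedString(string: " Your code is "))
spokenLabel.append(NSAttributedString(
    string: "A1B2",
    attributes: [.accessibilitySpeechSpellOut: true]    // spell it out character by character
))

titleLabel.accessibilityAttributedLabel = spokenLabel
```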


One thing I find very useful when testing (or doing demos!) is to have VoiceOver's caption panel enabled. It is shown at the bottom of the screen at all times, so you can see exactly what VoiceOver is saying.

Images that convey important information should have the .image accessibility trait and provide alternative text in the accessibility label. VoiceOver will add "Image" to its utterance, and users will be able to use Image Explorer. Image Explorer is fairly new, introduced just a couple of years ago, but if you were already configuring the image trait appropriately, your users got this new functionality for free. Isn't that awesome? With VoiceOver on, open Image Explorer by swiping up on an image and double tapping. Using on-device intelligence, it lets users find people (with a basic description and their position in the photo), objects, or text in images. It is very cool!
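A minimal sketch of that configuration in UIKit, assuming a hypothetical photo and description:

```swift
import UIKit

let photoView = UIImageView(image: UIImage(named: "launch-party")) // hypothetical asset name
photoView.isAccessibilityElement = true
photoView.accessibilityTraits = .image
photoView.accessibilityLabel = "The team celebrating the app launch on the office rooftop"
```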

Check the traversal order of elements in your app. Sometimes the default top-left to bottom-right order is not the most logical one. Sometimes you may consciously want to tweak the order. Other times, grouping is the answer.
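As a sketch, here's one way to override the order in UIKit, assuming a hypothetical card view with three subviews:

```swift
import UIKit

final class ProductCardView: UIView {
    let titleLabel = UILabel()
    let priceLabel = UILabel()
    let buyButton = UIButton(type: .system)

    override init(frame: CGRect) {
        super.init(frame: frame)
        [priceLabel, titleLabel, buyButton].forEach { addSubview($0) }

        // VoiceOver visits the elements in exactly this order,
        // regardless of where they end up on screen.
        accessibilityElements = [titleLabel, priceLabel, buyButton]
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
```

And when grouping is the better fit, another option is to make the container itself a single accessibility element (isAccessibilityElement = true) with a combined accessibility label.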
