If both screenChanged and layoutChanged notifications signal changes in the UI and allow you to move VoiceOver's focus somewhere else, what's the difference? To the user, screenChanged plays a sound indicating that they have been moved to a new screen; layoutChanged does not.
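Here is a minimal sketch of posting each one; the view controller, titleLabel, and errorLabel are hypothetical, just to illustrate where each notification fits:

```swift
import UIKit

final class CheckoutViewController: UIViewController {
    // Hypothetical views, for illustration only.
    private let titleLabel = UILabel()
    private let errorLabel = UILabel()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // screenChanged: plays the "new screen" sound and moves
        // VoiceOver's focus to the element passed as the argument.
        UIAccessibility.post(notification: .screenChanged, argument: titleLabel)
    }

    private func showValidationError() {
        errorLabel.isHidden = false
        // layoutChanged: also moves VoiceOver's focus, but silently;
        // meant for in-place updates within the same screen.
        UIAccessibility.post(notification: .layoutChanged, argument: errorLabel)
    }
}
```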

You may also find this interesting...
The Accessibility APIs are generic and flexible. They're not just for VoiceOver. If you implement them right, you can do it once and it will very likely work great for VoiceOver, Voice Control, Switch Control, Full Keyboard Access, and more. That's why, to start with, we tend to focus on VoiceOver, the same way you may focus on keyboard navigation for the web. A great VoiceOver experience will get you most of the way to a good experience with the other assistive technologies. We've seen one example with Custom Actions. One implementation works for:

- VoiceOver: https://x.com/dadederk/status/1550099327053451266
- Switch Control: https://x.com/dadederk/status/1551236244088279040
- Full Keyboard Access: https://x.com/dadederk/status/1551874732504629249
- Voice Control: https://x.com/dadederk/status/1552253520182640645

Of course, that doesn't mean you don't have to test and check how the experience is with the other technologies. But before feeling overwhelmed, or for small teams, making sure your app works for VoiceOver is a great start.
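As a sketch of that write-once pattern (the MessageCell and its reply/delete behaviors are hypothetical), a single accessibilityCustomActions assignment is what all four technologies pick up:

```swift
import UIKit

final class MessageCell: UITableViewCell {
    // Hypothetical callbacks supplied by the owning screen.
    var onReply: (() -> Void)?
    var onDelete: (() -> Void)?

    func configureAccessibility() {
        // One implementation: VoiceOver exposes these in the actions
        // rotor, Switch Control in its menu, and Full Keyboard Access
        // and Voice Control pick them up by name.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Reply") { [weak self] _ in
                self?.onReply?()
                return true // the action was handled
            },
            UIAccessibilityCustomAction(name: "Delete") { [weak self] _ in
                self?.onDelete?()
                return true
            }
        ]
    }
}
```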
Check isReduceTransparencyEnabled and lower transparency when it returns true. A great example is Spotlight: not only is the transparency removed, but the main color of the background is kept, so it still feels personalized and contextual while reducing noise and improving contrast.
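A minimal sketch of one way to honor the setting, assuming a hypothetical BannerView that normally sits on a blur; the opaque fallback color is an illustrative choice, not a prescribed one:

```swift
import UIKit

final class BannerView: UIView {
    private let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .systemMaterial))

    override init(frame: CGRect) {
        super.init(frame: frame)
        addSubview(blurView)
        applyTransparencyPreference()
        // Re-apply if the user changes the setting while the app is running.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(applyTransparencyPreference),
            name: UIAccessibility.reduceTransparencyStatusDidChangeNotification,
            object: nil
        )
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func applyTransparencyPreference() {
        if UIAccessibility.isReduceTransparencyEnabled {
            // No blur: keep an opaque take on the content's main color
            // instead, Spotlight-style.
            blurView.isHidden = true
            backgroundColor = .systemBackground
        } else {
            blurView.isHidden = false
            backgroundColor = .clear
        }
    }
}
```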

You should convey important information through multiple cues, not just color. If you still have to rely on color, at the very least complement that information with other cues, such as symbols, when the user has enabled Differentiate Without Color.
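For example, a sketch of a status indicator that adds a symbol next to the color when the setting is on; statusDot, statusIcon, and the chosen SF Symbols are hypothetical:

```swift
import UIKit

func updateStatus(isOnline: Bool, statusDot: UIView, statusIcon: UIImageView) {
    // Color is always shown...
    statusDot.backgroundColor = isOnline ? .systemGreen : .systemRed
    // ...but when the user asked to differentiate without color,
    // complement it with a symbol as well (iOS 13+).
    if UIAccessibility.shouldDifferentiateWithoutColor {
        statusIcon.isHidden = false
        statusIcon.image = UIImage(systemName: isOnline ? "checkmark.circle" : "xmark.circle")
    } else {
        statusIcon.isHidden = true
    }
}
```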