I'm wondering if, on macOS, there's a way to programmatically make VoiceOver interact with a specific element.
Let me explain the context a bit more.
I have an accessibility request for a developer, but I'm trying to figure out whether my request is even feasible, given the macOS accessibility APIs.
The main issue I'm facing is that when I'm in a table of tasks and navigating with the arrow keys, VoiceOver (VO) won't announce whether an item is collapsed or expanded, or what level the item is at in the tree. The only way to get VO to announce that is to make sure I'm interacting with the table before I start using the arrow keys.
The idea I have is the following:
- When I press a key to move to the table of tasks, the app could "force" VO to interact with the table
- When I'm editing a cell in this table and press Escape to cancel, the app could "force" VO to interact with the table again, not with a specific cell
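To make the idea concrete, here's a rough sketch of what I imagine the developer could try with the public AppKit accessibility API. The `TaskTableView` subclass name is made up by me, and I can't confirm that posting `focusedUIElementChanged` actually moves the VoiceOver cursor into interaction mode — that's exactly the part I'm asking about:

```swift
import AppKit

// Hypothetical sketch — TaskTableView is an assumed name, and whether
// VoiceOver responds to these notifications by interacting with the
// table is unverified.
final class TaskTableView: NSTableView {

    // Called by the app when the user presses the shortcut
    // that moves to the task table.
    func moveVoiceOverFocusToTable() {
        // Give the table keyboard focus first...
        _ = window?.makeFirstResponder(self)
        // ...then tell assistive technologies that the focused UI
        // element changed, hoping VO moves its cursor to the table.
        NSAccessibility.post(element: self,
                             notification: .focusedUIElementChanged)
    }

    // Escape during cell editing ends up here via the responder chain.
    override func cancelOperation(_ sender: Any?) {
        super.cancelOperation(sender)
        // After the edit is cancelled, point VO back at the table
        // itself rather than the cell that was being edited.
        NSAccessibility.post(element: self,
                             notification: .focusedUIElementChanged)
    }
}
```

A fallback I've seen mentioned, if VO's interaction state can't be controlled directly, is posting an `.announcementRequested` notification so the app itself speaks the expanded/collapsed state and tree level — but that duplicates what VO should be doing, so I'd rather avoid it.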
Is this even possible right now? I suspect I'm working around an issue in how VO itself works, so I want to verify with people who have written accessible macOS apps.