At Apple’s WWDC event in June, our favourite fruit company announced they would be transitioning Macs to run on their own processors, which they dubbed Apple Silicon. This isn’t the first time they’ve done this: they moved from Motorola 68k processors to PowerPC, and from PowerPC to Intel. In 2015, Apple’s engineers got fed up with Intel’s Skylake chips, so they decided to take their ball and go home to design their own processors. For VoiceOver users, this is a big deal. It might not be as monumental as hearing the iPhone 3GS run VoiceOver for the first time in 2009, but it does represent a quantum leap forward for VoiceOver and the Mac. I had a few thoughts I wanted to share about this.
System on Chip (SOC)
This was a popular term thrown around at WWDC. It essentially means Apple can etch the most fundamental and important parts of MacOS onto their chips at the fabrication stage—something they could not do with Intel chips. Tasks like networking, memory management, the file system and security can all be baked into the hardware. This includes VoiceOver. This leap forward is similar to the way Intel started including multimedia support on its chips back in the late 90s and early 2000s, eliminating the need for dedicated sound and graphics cards.
VoiceOver running directly off the chip means instant-on speech and rock-solid reliability. Imagine being able to select from multiple operating systems at startup, log in without hearing clicks or beeps when FileVault is turned on, or check the progress of a MacOS upgrade with the Alex voice instead of the 25-year-old Fred voice when your Mac restarts to install the latest version. These are the kinds of features that could roll out once Apple Silicon arrives.
For the system admins among us, this means needing less sighted help when administering Macs and more flexibility when it comes to helping your hapless sighted colleague get her new laptop up and running.
Better Speech
If you’ve ever tried to run one of the Siri voices on MacOS, you may have found the performance a bit sluggish. On Apple Silicon, Macs will be able to take advantage of the cutting-edge technology that allows these voices to perform quickly on iPhones and iPads. Furthermore, Apple Silicon means Macs can receive the same machine learning hardware that allows more expressive and natural-sounding voices. Imagine reading through AppleVis and hearing the frustration when someone uses three exclamation points to rant about unlabeled buttons, or a nearly identical Morgan Freeman voice reading you a book in the desktop version of Books.
In addition to better speech, an Apple Silicon chip means VoiceOver can perform much faster and more efficiently in complex documents, programs and web pages. Think of how sluggish VoiceOver is in a very large Numbers sheet or a complex Keynote presentation versus the same file open on an iPad Pro. We can hopefully kiss the “busy, busy, busy” message goodbye once Macs are running on Apple Silicon.
VoiceOver and AI
The move away from Intel means Apple Silicon can turbocharge VoiceOver to take advantage of features not available on Intel chips, such as AI and machine learning. One potential application is true and accurate image detection for those buttons and controls that VoiceOver can’t detect or operate in poorly coded programs. Other features might include multicore processing, artificial vision that would allow object detection with your iSight camera, better 3D audio support for audio cues, and features not yet dreamt of by the developers of the future—some of whom are reading this post.
It isn’t clear where this technology can go for a screen reader, but Apple Silicon chips are going to be much more powerful than Intel chips and will open up an entirely new set of features we have yet to tap into.
Faster Upgrade Cycles
With Apple rolling out its own silicon, software and hardware releases are no longer tied to Intel’s development schedule. This means VoiceOver may finally get the love it deserves when software and hardware teams are all working off of the same release schedules. With VoiceOver being part of the SOC, the core functionality of VoiceOver could remain rock solid while add-on features run off the SSD until they’re deemed worthy enough to be baked into the next chip.
VoiceOver may seem as if it has been stuck over the past 15 years on the Mac, simply because of the limitations of Intel chips. Apple Silicon will now allow for huge leaps forward for the screen reader.
Your Thoughts
So, what do you think is next for VoiceOver when Apple switches to its own silicon? Will we see artificial intelligence features slowly make their way into our screen readers? What features are you expecting when we have the same chips in our Macs that are in our iPads and iPhones? Will we begin to hear realistic human-sounding voices read us the news? Will Siri and VoiceOver have some kind of AI baby that will be like the movie Her? Please leave your comments below.
Comments
Maybe
We'll see. FYI, my favorite fruit is not an apple but a banana.
I agree with the previous…
I agree with the previous post.
Let's not get ahead of ourselves until we see it in action, and yes, I know we saw it at WWDC, but that wasn't with VoiceOver running.
yes I agree.. if the Macs…
Yes, I agree.
If Macs can get a little more affordable, I'd happily switch to them, if all that the author mentioned comes to life.
This is, frankly, mostly bullcrap
First, the first point, about VoiceOver on the chip, is probably true, although it's probably just VoiceOver being more present on the recovery partition. The other points, besides Siri voices and VoiceOver recognition, are crap. VoiceOver's problems are not hardware issues. Is VoiceOver a piece of hardware? Is there a VoiceOver co-processor? No.
Try the Terminal app when you get a chance. It's pretty bad. Try doing ping google.com. VoiceOver interrupts itself constantly. VoiceOver doesn't have a speech queue.
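The difference a speech queue makes can be sketched in a few lines of toy Python. This is purely illustrative — a made-up model, not how VoiceOver or NVDA are actually implemented:

```python
from collections import deque

class ToySpeech:
    """Toy model of screen-reader speech output (all names invented)."""

    def __init__(self, interrupt: bool):
        self.interrupt = interrupt  # True: new text cancels pending speech
        self.pending = deque()
        self.spoken = []

    def say(self, text: str):
        if self.interrupt:
            self.pending.clear()  # cut off whatever was waiting to be spoken
        self.pending.append(text)

    def drain(self):
        # "Speak" everything left in the queue, in order.
        while self.pending:
            self.spoken.append(self.pending.popleft())

# Four ping replies arrive faster than any one line can finish speaking.
ping_lines = [f"64 bytes from 142.250.0.1: icmp_seq={i}" for i in range(4)]

interrupting = ToySpeech(interrupt=True)
for line in ping_lines:
    interrupting.say(line)
interrupting.drain()   # only the last reply survives

queued = ToySpeech(interrupt=False)
for line in ping_lines:
    queued.say(line)
queued.drain()         # all four replies are spoken in order
```

With interrupting behaviour you hear one line out of four; with a queue you hear all of them, which is roughly the difference the commenter is describing between VoiceOver in Terminal and a queuing screen reader.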
This isn't a problem at all on Windows, or even Linux. Try the text editor on freeCodeCamp. It works fine with NVDA on Windows. Why is it that the Playgrounds sidebar is only becoming accessible this year? These things aren't related to the chip the computer uses. It's related to the people programming the operating system.
Personally, I switched back to Windows. A reliable screen reader, NVDA, with good braille support, no stupid Touch Bar, no need to interact with a lot of stuff, no lag, and VS Code works much better than Xcode ever will. I just don't have time to idealize companies any more. Apple's iPhone is great. macOS isn't. Microsoft's Windows is now very good. Google's Android isn't.
Interesting Stuff
Interesting post. I fully realize that I'm probably in the minority when I say this, but I've been quite impressed with VoiceOver on both devices I own. Sure, it has bugs. But doesn't all software have bugs, even for non-AT users? I started out on OS X Mavericks, and ever since then I've seen--or rather heard--bugs being squashed all the time. Take the default low-battery notification. That was broken with VoiceOver for a little while, but Apple fixed it. Or at least it has worked on my Mac ever since. As for my iPhone... Well, I'll just repeat what I've said a few times in the past. I didn't think I could even use one of these, due to the flat screen. But I've been happily using it for a while now, and I'm not looking back. So I'm looking forward to the future of Apple. Even if only some of this stuff comes to fruition--or perhaps none of it at all--I just know it'll be good. By the way, that last part about none of it coming to fruition was just sarcasm on my part. But I know there are those Apple users who will respectfully disagree with me, and I'm ready and willing to accept that.
Thanks for the Comments
Thanks for your comments. No, we did not see VoiceOver running on Apple silicon at WWDC, but it was referenced in passing during one of the SOC sessions. Yes, we will see.
I see your point.
I see your point regarding everything not being baked into the hardware. The hope for me, at least, is that because VoiceOver may be running quite similarly between iOS and ARM Macs, it may be able to squash bugs like this much quicker.
It's still unfortunate that the Terminal is such an unusable mess, and it should have been fixed ages ago. Here's hoping that this transition will allow more people to be working on VoiceOver, using Apple's more fine-tuned control over the processors.
VoiceOver on a chip
I think having VoiceOver baked into a hardware chip is a bad idea. Maybe having some speech synthesis on a chip could make tasks like managing the BIOS more tractable for a blind person (if Macs use a BIOS as Windows PCs do), but there are probably other ways of overcoming that issue if developers put their minds to it.
Ages ago lots of accessibility features were built into hardware. You might remember the hardware boards companies sold to do screen magnification, speech synthesis, etc. The problem with such a paradigm is that hardware isn't easily changed when new features and capabilities come along. We've moved to a paradigm now where accessibility features like screen magnifiers, speech synthesis, etc. can be delivered as software rather than hardware. Not only does this make the access technology a lot less expensive, but, as I pointed out, it is much more amenable to upgrades, modifications, bug fixes, etc.
What has changed over the past few decades is the power of our processors. Ages ago, PC processors couldn't handle the extra work of running the access technologies, so these technologies were baked into hardware with their own processors and associated chips. We've come a long way, and I think it would be a big mistake to start baking accessibility into hardware that can't be changed.
Just think of all of the VoiceOver bugs people constantly report and complain about. If VoiceOver were baked into the hardware, one would not only have to wait for the next version of the hardware to be released, but would then also have to ditch their working computer and purchase a new one. Now, with accessibility run in software, we simply wait for the next update of the software.
Yes, there are probably some design aspects of a custom PC chip that could make the accessibility software work a bit more efficiently, but I wouldn't want my assistive technology built into the hardware.
Anyway, that is just my opinion based on what I've observed as these technologies have marched along over the decades.
--Pete
VO on Hardware
To Peter's point, the entire VO package may not run off the chip; perhaps only the barebones parts of the screen reader that would make it functional, with the full version being a combination of the two. I'm only speculating, as Apple may just decide to leave VO alone for the first few iterations.
I love both.
I love Windows and Mac, but I do not like Android, because the screen reader (TalkBack) is not good for me. I love VoiceOver and NVDA equally. I do not like JAWS though.
They need to fix a lot
I'm hoping that this transition means Apple actually starts giving a damn about VoiceOver for macOS. Catalina was such a trainwreck for me that I sold my MacBook and am now happily using a Windows machine which actually has working braille support and where QuickNav doesn't break 15 times a day. I was slowly building an angry, sweary laundry list of bugs and complaints about it before I finally found a buyer who would take the thing off my hands, despite the impending hardware transition.
If benchmarks leaking from the developer transition kits are anything to go by, then this is going to be a big leap in performance. I just hope the software also leaps forward. It needs a lot of work before I'm willing to give it another try. I'd very much like to have access to iMovie and Final Cut again. Perhaps they can fix the accessibility bugs in those too? One can dream, especially when you pay so much money to still be disappointed. Really, I don't mind the hardware. Even the Touch Bar is fine. Just please please please please, fix all the bugs.
Most of this is plain wrong.
First, you don't need Apple Silicon to implement most of the features at startup. The reason the Alex voice isn't used in recovery has nothing to do with hardware; it's just that the Alex voice is way too big to be included in a recovery partition or during an OS upgrade. Since Fred is lower quality and as such smaller, it is used instead. Don't expect this to change with Apple Silicon. Also, Apple could easily make an accessible boot loader, but it wouldn't be speech enabled as it would, again, use too much storage.
Better speech.
This is more down to Apple's poor optimization of text-to-speech on the Mac in general. Even things like speech recognition are slow on the Mac, and that has nothing to do with Intel, either. Windows is much better at it.
VoiceOver and AI.
Most of the AI features described can be implemented without special hardware. I have an older device without the AI engine, and iOS 14 recognizes images and buttons 95% of the time without a problem. This is a new feature we're going to see in September, by the way.
As for performance in general, this can mostly be attributed to software, not the processor.
The users who will see something from this will be those of us who run performance intensive tasks, and those who want to buy into the ecosystem at a lower price.
NDA officially broken, @king of the north, and other musings
First off, only public beta testers and developers would know about iOS 14 accessibility features. Posting that image recognition is a new VoiceOver feature: well, I know, you violated Apple's NDA, as that information is not available through Apple's official channels. I personally think that VoiceOver being a hardware feature for startup and recovery is a great option. To the poster who said it is not: that is, in my view, putting updates above efficiency. Think of all the blindness notetakers still around. They're not by any means up to date, but they're efficient and stable. Who cares that we the blind get stuck with not-updated software? It works, and be happy we have specialist devices at all! Apple putting VoiceOver on the hardware, from Apple's perspective, saves lots of money on developing the software that VoiceOver currently is. This will also be a boon for Apple, as they will make loads of money off of the blind community. I hope they do this for all accessibility features. And another thing: I hope Apple once again makes all operating system upgrades paid! Tons of engineering work goes into them; they deserve to make money and pass it on to the people within Apple who do the momentous work every year on all of Apple's software and hardware.
Sincerely,
Daniel Angus MacDonald
No violation
Apple has published this information both in the release notes for iOS 14 and in the interview with AppleVis. Whether you follow any of those is a different thing, but the poster certainly didn't violate anything. Moreover, as someone already said, this does not mean VoiceOver is suddenly switching and will fully be on the hardware; that's pretty much impossible and unreasonable.
Nope.
No NDA has been broken, as the accessibility features are listed right on Apple's own preview page, and I'm testing them at the moment. Apple has never enforced such things.
As for the rest of your comment, I'm not sure how that relates to anything I said.
Daniel
All the money Apple would make from blind people. LOL. Do you know how many times blind people on AppleVis say a four-dollar app should be free? Get out of here with that logic. We aren't even 1 percent of the population, so Apple really doesn't have to do a lot for us; that is reality. They will never make back the money they put into accessibility. It's why, quite frankly, I think the petition for the Apple Watch that Jonathan wrote is silly. In the United States only around 40% of blind adults have a job, so again: not even 1% of the population, and only 40% of us working. I just don't see the economic value there, but let's keep smoking the smoke, I guess.
Exciting stuff!
This sounds really exciting. Any ideas as to when we are likely to see things like this happen? Stay safe.
has to be sarcasm
Okay, I mean, what? Have you read the "What's new in iOS/macOS" pages? It talks briefly about the image recognition stuff there. It is public knowledge. Lol. Wow. I don't think I've ever been such an Apple fanboy. Also, Macs won't be popular in the blind community until:
These are big issues, and no amount of beefing up a CPU chip will fix them. Eventually, hardware is meaningless, yes, meaningless, in the face of software issues. In other words, bug fixes and optimization > faster hardware.
I sold my MacBook Air in…
I sold my MacBook Air in early 2018, and used Windows exclusively as my desktop operating system. In April this year, I got another MacBook (this time a Pro) for work, because we are now working on implementing VoiceOver support in Firefox. I had loosely followed AppleVis and the happenings around Mojave and Catalina, so I knew there were no big leaps in VoiceOver.
And from what I see and hear of Big Sur, at least on the VoiceOver side, there has not been much of a redesign happening. It still insists on interacting with various things, adding a level of complexity that simply isn't there on iPadOS, or on Windows or Linux in any screen reader. Moreover, even when transporting apps from iOS to MacOS via Catalyst, as seen in the publicly available WWDC session on Catalyst accessibility from 2020, Apple even insists on adding interaction models for the containers developers are using. They demoed that in a small app, but it is to be expected that Messages, which is now a Catalyst app, will require a lot of fiddling to get to conversations and their contents. And do we want to bet on whether VO+J will still properly jump between the last read message in a conversation and the input field, so one can quickly check if there was something new said?
None of that will change if that same operating system runs on Apple Silicon, I suspect. For VoiceOver to be a more viable option in the future, it must be rewritten to get rid of this overly complex model of interacting with everything before you can get to the content, turn the trackpad into an actually useful touch-target environment that quickly allows access to all elements, and otherwise do proper keyboard access in all kinds of apps, whether AppKit, Catalyst, or SwiftUI.
The one thing that excites me about Apple Silicon is the better performance that is to be expected. But unless Apple really get down and change the design paradigm on VoiceOver, and execute it well, its complexity and bugs and inconsistencies will remain regardless of the platform it runs on.
I disagree on the interaction model.
As someone who uses the QuickNav feature, I personally find I am much faster with VoiceOver than with Windows or Linux screen readers. It's incredibly useful for skipping around Wikipedia articles, for example, as well as apps that are heavy on buttons and text fields. I find I can't be as efficient with NVDA, but maybe I just need more experience with it.
MacOS Navigation vs Windows
I had to switch to MacOS after several years on Windows because of all the bugs I was experiencing with JAWS on the various laptops I had. Inevitably, no matter how good the laptop was or how up-to-date JAWS was, something major would always fail. I couldn't check my email without the whole app crashing, and Google Chrome became really sluggish as well. I love MacOS because of all the customization options it offers, and I find the hierarchical navigation system quite efficient. Also Google Docs does seem to work a lot better on Mac than on Windows.
NVDA and Windows
Windows focuses a bit more on keyboard commands, whereas with Apple you pretty much have to navigate or search for something. And yeah, the rotor on Mac makes things a bit easier, but everything else on the web just negates it all. "Busy, busy, busy", rich text editors like those on Free Code Camp and Google Docs being a real pain to use, and stuff like that just make it harder to use than Windows, which generally works well with everything.
I'm not so sure.
In my experience, NVDA on Windows seems to suffer from serious performance and responsiveness issues. When launching a game on Windows, for example, it becomes completely unresponsive, to the point where it seems to have crashed, even though it's still active in Task Manager. When using the Epic Games launcher, a web app, both NVDA and JAWS fall on their faces, as I am unable to navigate the app with either. I've only had this problem on Windows. Sure, VoiceOver saying "busy" can be annoying, but I'd rather have that than have the computer go completely silent with no indication as to what is going on.
My computer is by no means the fastest thing on the market, but it's not that slow either. Linux, which uses Orca, doesn't have this problem. So I guess for some of us, it can be very YMMV.
Worth a blog instead
Dude, how come you didn't just write all this up as an actual blog? lol.
Great though!
if it’s that bad
Well, sir, out of respect: if it's as bad as you say, why are you still using the hardware and software?
Roll back?
Why didn't you just grab your Time Machine backup and roll back to Mojave? The very latest update to Mojave is great too, by the way. I only updated to Catalina to see how well it would even perform on this Mac, and maybe also for iCloud folder sharing. :) Did they do an update to Mojave to add support there as well? Or is it only in the latest OS update? Anybody know or tried?
Tried the new Swift Playgrounds app?
Swift Playgrounds: Learn real coding the fun way.
https://apps.apple.com/us/app/swift-playgrounds/id1496833156?mt=12
Description:
Swift Playgrounds is a revolutionary app for Mac and iPad that makes it fun to learn and experiment with code. You solve interactive puzzles in the guided “Learn to Code” lessons to master the basics of coding, or experiment with a wide range of challenges that let you explore many unique coding experiences.
Swift Playgrounds requires no coding knowledge, so it’s perfect for students just starting out, from twelve to one-hundred-and-twelve. The whole time you are learning Swift, a powerful programming language created by Apple and used by professionals to build many of today’s most popular apps. Code you write works seamlessly as you move between Mac and iPad.
Lessons built-in
• Apple-created lessons guide you through the core concepts of programming by using code to solve puzzles
• See your code run in a beautiful, interactive 3D world that you can rotate and pinch to zoom using the trackpad
• Animations introduce each new coding concept at a high level before you dive into the puzzles
• Choose from three animated characters to carry out the steps of your code
• Glossary and built-in help pages give detailed information about available commands and frameworks
Explore and create
• Challenges offer many new opportunities for creativity by playing with game logic, music, and more
• Interactive coding shows the results of your code instantly, either beside the text or acted out in the live view
• Step through your code to highlight each line as it is run
• Use your own photos and images within a program to make it uniquely yours
• Starting points are a head start to create your own playgrounds that display graphics or chat with your Mac
• Create your own playgrounds from scratch to build something totally unique
• Reset any page to start over, or duplicate and rename any playground to try different ideas
Built for Mac
• Code suggestions let you write entire programs in just a few clicks of the mouse
• See help along side code suggestions to learn about the available commands
• Click and drag a brace to wrap a block of code inside a loop or conditional statement
• Drag and drop snippets of commonly-used code directly into your playground
Real Swift code
• Learn the same powerful Swift programming language used by the pros to create thousands of apps
• Take your Swift coding skills to the next level using Xcode to develop an app you publish on the App Store
• Access powerful frameworks such as SpriteKit for 2D games, SwiftUI for app interfaces, and more
• Use Bluetooth APIs to write programs that control robots and other real-world devices
• Concepts and skills you learn directly apply when writing real apps
Share
• Send your creations to friends and family using Messages, Mail, AirDrop, or other Share Sheet extensions
• Start a playground on your Mac, then open on your iPad using iCloud to continue the project
Kind of sounds like writing in Inform code to me, if you're writing Z-code games.
And I have that on Mac too by the way.
Perhaps because the 16-inch…
Perhaps because the 16-inch MacBook Pro shipped with Catalina, and even if I could get Mojave on it, most of the bugs I'm complaining about were there too.
Apply for Apple
Let me just bluntly say this and it just needs to be said because everybody always has things they want to bitch and complain about over and over and over.
If all of this and that and yada yada yada bothers you soooooo much, why not apply for a job at Apple yourself and possibly be on the dev/accessibility team?
Then you possibly could do the things you would like to do and fix what you see needs fixing.
And frankly I’d love it.
Just saying
Yours Truly
Dr. Stansberry.
Voiceover settings tinkering on Mac
I see none of you have ever really just played around with the Mac VoiceOver settings. You can make so many changes in there for VoiceOver navigation that you can make it look like Windows if you like.
Change the grouping from Standard to something like "Ignore groups", "Announce groups" or "Bookend groups".
Of course, the default is Standard.
Just press Command-4 when you load VoiceOver Utility and see.
I will honestly say, too, that it is their fault they're not as clear or as open about how VoiceOver works with things as, perhaps, the folks on Windows are, and hopefully they will be fixing some of that.
And not everybody's going to just go look at the VO manual, and it probably isn't as updated as it should be. But I don't know.
Duly noted. While I finish…
Duly noted. While I finish my CS degree, please direct yourself to a course on how not to be insufferable.
Criticism while still using products
One can criticize products for their deficiencies while still using them for the... one or two things they do well: email and a BSD/Unix system, using TDSR of course, because VO sucks in the Terminal. Really, though, the only place I like the interaction model is the Mail app, where I can interact with the messages table, interact with a message row, navigate to the subject column, and arrow down to hear only the subjects being spoken, not everything else. That makes reading through email a breeze. But everything else I listed, and all my other problems, make it easier to just use Windows and the Gmail web app for email.
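For what it's worth, the payoff of that drill-in model can be mimicked with a few lines of toy Python. This is a sketch only; the field names are invented for illustration, not anything from Apple's accessibility API:

```python
# Toy mailbox: each row of the messages table has three cells.
inbox = [
    {"sender": "Alice", "subject": "Lunch?",      "date": "Mon"},
    {"sender": "Bob",   "subject": "Report due",  "date": "Tue"},
    {"sender": "Carol", "subject": "Re: tickets", "date": "Wed"},
]

def flat_navigation(rows):
    """Without interaction you stop on every cell in reading order."""
    return [cell for row in rows for cell in row.values()]

def interact_into_column(rows, column):
    """Interact with one column, then arrow down: only that field is spoken."""
    return [row[column] for row in rows]

print(len(flat_navigation(inbox)))            # 9 stops across the whole table
print(interact_into_column(inbox, "subject")) # 3 stops, subjects only
```

Flat navigation visits nine cells for three messages; interacting into the subject column cuts that to three, which is the "breeze" being described.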
Swift Playgrounds
There are two "playground" things. There's Playgrounds in Xcode, which I was probably talking about, where the sidebar didn't speak until Big Sur, and Swift Playgrounds, an app for kids which has you map out a world in your head, if you're blind, and have your byte-sized character go from one place to another. You have to map the world out, plan your code, write your code and hope it works (because Catalyst), or do what I eventually started doing and copy and paste the code into Emacs or something to edit it better, run the code, watch the bot walk and turn and walk, and try again or move on to the next thing. I could do the first few levels, but then teleporters came around, trying to mirror skip logic, and I was just lost. I felt like I wasn't getting anywhere, so I just went back to Automate the Boring Stuff for learning Python, which has you write real code. Like, code that would do something on its own. If others can learn using Swift Playgrounds, then great. I'm glad you're smarter than me. But I have a hard time mapping real spaces in my head, let alone virtual ones converted to tiles, with freaking teleportation things around. Oh, and the text editor sucks because Catalyst.
Seriously?
So, this looks like one of those Linux bros saying "Well if you don't like it? Well if you think it's like, soooo bad? You know what you do? You fix it! Shut up and stop complaining and fix it!"
Do you know what it takes to get into Apple? It takes knowing someone already there who will vouch for you. One can't get into Apple on the strength of their fanboyishness. You have to have great skill, and you have to have something they want. And then do you know what it takes to fix accessibility issues? It takes knowing how to program, in Swift and C++ and Objective-C, probably. It takes knowing the code already there. It takes knowing all of the APIs, all of the functions used in Apple's OS code that they prefer, and the style they like. It then takes understanding enough to fix these issues. Not everyone can, wants to, or should have to code. Not everyone can, wants to, or has the programming skill, soft skills, or connection to work at Apple. And most blind people on this forum and off it should be able to respectfully go to Apple Accessibility, show them what's going on with the Mac, and get results in the form of bugs fixed. Apple is not a government agency, where "Oh, you're an assistive tech instructor thing? Yeah, you're hired! Show up on Monday!" Of course, it's a little more complex, but you see what I mean. Apple doesn't just hire anyone, nor does it hire very specific non-programming types for accessibility work. Shoot, I wish they did, I really do. I'd love to contribute and make the Mac a great system. But I don't have the skills. Should I then just shut up and sit down because I can't program more than a print, input, and maybe an array if I looked it up, in Python? Should I just be silent since I can't fix the problems? No. Let the Linux community have people like that. We cannot fix Apple's code. We cannot talk directly with the Mac accessibility engineers, as far as I know. The only thing we can for sure do is report bugs and ask for improvement, which I've been doing this beta cycle.
Just saying, of course.
How do you do that? That’s…
How do you do that? That's really helpful. And also, add to the list the "Failed to initialise audio device" message when using TeamTalk 5. But then again, the GUI sucks on that too. It would be good if the dev ported it over to the Mac from iOS. Surprised he hasn't done it yet!
Nice!
I couldn’t have said it better myself.
Dude, I can map out a GUI right now for Logic Pro that would be beautiful; all someone needs to do is implement it!
I just hope they're really listening to our voices more, like they say. Or someone says. VoiceOver can be so much more!
Please let’s finally say goodbye to
"busy. busy. busy. busy. busy.
Desktop
busy. busy. busy. busy. busy.
Busy. Finder"
Once and for all!
VoiceOver optimization
Welcome to Macintosh. VoiceOver is running.
This is what my system tells me when it boots, and that's because I'm an old-school Leopard user who never grew up hearing OS X or macOS... Well, I did hear Mac OS X on Snow Leopard, but that was the last truly major update to VO, except for Activities and Vocalizer in Lion (VoiceOver 4), and this is the current version for macOS, just with some extra features bolted on as time has gone on. We are still on VoiceOver 4, as indicated by your Preferences folder in your Library folder in your home directory. If any of the clues are pointing in the right direction (VoiceOver playing the double-tap sound in Catalyst apps), then touch support may be coming to macOS, where you can have your flat review, which I dislike with a passion, especially when dealing with controls such as ribbons in Office. Try navigating those without using NVDA's object navigator, which, if I may remind you all, uses an interaction model in and of itself, but not as well done, as you can lose focus and have weird dialogs floating around or be kicked to the Start button with no explanation. So yeah, NVDA took a page from Apple's book. The reason why there is no interaction in iOS is because there doesn't need to be. If everything on the display can be accessed by direct manipulation, then you shouldn't have to interact with elements, as your finger has a direct relationship to what you're touching. This is the concept that people don't understand with a desktop vs. mobile screen reader: a mobile screen reader doesn't need interaction because there's a physical link between you touching your finger on the screen and VoiceOver announcing what you're touching.
If you don't like the interaction model, then change the VoiceOver settings to your liking and experiment. Use the advanced features, such as the web rotor: its Emacs-like progressive search is extremely powerful and efficient, since I don't have to Tab and Shift-Tab between the combo box of filters and the actual list of elements; the arrow keys are all that's needed to navigate, while typing the first few letters brings only matching elements into focus. Typing "ski" when filtering by form controls on YouTube brings up the buttons that contain "ski" in the name, and one of them might just be Skip Ads... Give that a try and see how surfing the web is with VO. Or don't use the rotor at all; in fact, disable QuickNav entirely and use VO-Cmd-H and VO-Cmd-Shift-H to navigate by heading, the web rotor for whatever else you need, or VO-F to find within the elements themselves. Also, table navigation with VO is a beast, all because of the interaction model.

Does VO need work? Most certainly it does! We've been stuck on VoiceOver 4 for the last nine years; it's about time for a new version number. And if I may forecast, I forecast that iOS VoiceOver may replace VoiceOver on macOS as it runs on Apple Silicon hardware.
Standard VoiceOver Keyboard Set, Modifier Keys Lock
I've always wondered why people never talk about these in great detail, like on a podcast or blog or something.
Do folks feel that QuickNav has a better set of commands built in, making the standard set feel ancient or something?
Also, the VoiceOver+Semicolon command never gets enough praise.
Just some things to think about for a future discussion/thread I suppose.
This might be the start of something good
We've seen macOS pick up a much-needed refresh. Perhaps VoiceOver will get updated ... eventually ... along with all of the other Mac app updates. Perhaps it's time for a petition asking Apple to please, please update VoiceOver so that it operates more smoothly. As the one and only screen reader available to Mac users, there are no options if VoiceOver falls so far behind that it eventually becomes unusable. Sure, VoiceOver is highly customizable, but the iOS variant is much more fluid than its Mac counterpart.
Absolutely, the control…
Absolutely, the Control-Option lock is dead useful, as it gives me all the joys of QuickNav without needing to test whether it is enabled in the first place... Even when running a very early beta build of 10.6 on PowerPC hardware, VoiceOver hadn't been updated with any of the new features at that point in the beta cycle. The issue here is that with macOS needing to support Intel for quite a few years, what will that do for VoiceOver? Will the current version be what we're left with on Intel? Would Intel machines make the jump if VO 5 were released? I'll actually double-check what features made it into the Snow Leopard beta on PPC versus Leopard, as that is an unexplored avenue not very many people have ever gone down before. If history is to repeat itself (as it usually does), then what can we learn from the previous shift from PowerPC to Intel hardware?