In this edition of the AppleVis Extra, Dave Nason and Thomas Domville are joined by Sarah Herrlinger, Director of Global Accessibility Policy and Initiatives at Apple; and Dean Hudson, Accessibility Evangelist at Apple. Topics covered in this podcast include an in-depth look at new accessibility features coming later this year in Apple software for blind and low vision users, as well as a broader look at Apple's approach to making their products accessible to as many people as possible.
Full transcript of podcast
Please note: this transcript was created solely for communication access. It is not a certified legal transcript and is not entirely verbatim.
[music]
Announcer: This is the AppleVis Extra.
Dave Nason: Hello, and welcome to AppleVis Extra. This is episode number 67, coming hot on the heels of episode number 66, which was our round-table about the WWDC keynote on Monday.
Today, myself, Dave Nason, and my colleague, Thomas Domville, also known as Anonymouse, are delighted to be joined by two people from Apple, live from the WWDC conference this week. We have the head of accessibility at Apple, Sarah Herrlinger, and we have one of the accessibility technicians, Dean Hudson.
Thomas, thanks for joining me. We're delighted to be getting the chance to interview these guys today.
Thomas Domville: Definitely! I am so excited to meet with these two. I know that we are going to learn a great deal of things today, and I hope our listeners will, too. It's going to be a lot of fun.
Dave Nason: Yeah, it was a big keynote, wasn't it? We talked about a lot on Monday. People can listen to that podcast for the full details of what we talked about in our immediate kind of aftermath, but there was a lot there.
Thomas Domville: Definitely! A lot to soak up, and even today I'm still soaking things up, and hearing things that we didn't really pick up on during the WWDC day itself.
As we're starting to get our hands on these betas, more and more things are starting to pop up. This makes this even more exciting of a podcast to listen to.
Dave Nason: I think we all know that iOS 12 was a performance update. I think whether you had accessibility needs or not, it wasn't a feature-rich release last year. This year, I think there's a bit more there in the accessibility world as well, and in general when it comes to features. I think we're going to have plenty of questions.
Thomas Domville: Definitely! I think that what I came away with from Monday was a lot of people saying "Is that all?" "Is that it?" I was like, oh my gosh, are you kidding me? There was a lot there. I think for every 1 thing they mentioned, there were probably 20 things they didn't mention. There is so much under the hood. Like you said, this is a vast difference, a stark difference, between last year's iOS 12 and this year's iOS 13.
That is the same for those with accessibility. There are quite a few small changes, and new things that we can expect which I'm very excited to talk about.
Dave Nason: Indeed!
Let's welcome our two guests! We have, all the way from California in the middle of WWDC week … they've taken the time out to talk to us. We have Sarah Herrlinger and Dean Hudson.
Do you guys want to tell us a little about yourselves? Dean, do you want to go first, and tell us who you are, and what you do?
Dean Hudson: Yeah, sure. Thanks for having us. This is a real honor.
I am Dean Hudson. I started here at Apple probably in 2006, when things were starting to roll. I've been on the accessibility engineering team since there were three of us. It has now expanded greatly, but it's been a real fun ride all the way through the advent of iOS, making that accessible, up through Apple TV, Watch, and HomePod. It's been a really fun ride.
For the last few years, I've worked for Sarah as the Accessibility Evangelist at Apple. Really, really fun times.
Sarah Herrlinger: I'm Sarah Herrlinger, and I lead our efforts in the Global Accessibility Policy and Initiatives team. I get to work with Dean which is always a lot of fun.
Our team really focuses on accessibility as a core corporate value for Apple. We look holistically at all the ways that we can infuse accessibility into the Apple ecosystem. Whether that be through products, or services, or stores, or anything that we do, just making sure that every employee at Apple understands what accessibility means to us as a company, and that all of our users know about all of the amazing things that we are working to do, so that they take advantage of those, and get more out of their devices.
Dave Nason: Cool! I guess your job is making sure that accessibility is there on the ground floor of every project. Is that kind of the idea?
Sarah Herrlinger: Yep. Absolutely! Through both Dean and me, we look at all those different areas. We get in, early and often, to all of the different projects here to make sure that everybody thinks about accessibility in what they do.
Thomas Domville: What an exciting job to have! That is like a dream! It's amazing to hear you guys have been there for so long, especially Dean, since 2006, when he was part of a team of three. It just totally blows my mind where we are today and how far things have come with both of you. That's--
Dave Nason: When you think--
Thomas Domville: --amazing.
Dave Nason: --to join one year before the iPhone launched. Everything that's happened since.
Dean Hudson: Yeah.
Well, I should say I am a VoiceOver user, totally blind. It was just very fun.
The thing that you have to keep in mind, and it really takes a lot of character, but you have to be patient. People want things to happen tomorrow, and it just doesn't work that way. In the end, we took some time to develop and get things right, and it has paid off.
We kind of lead the industry now in accessibility, and it's because we start at a ground level as Sarah was saying. Before even any lines of code are written, we get in there with the teams, and get people to think about accessibility early.
Dave Nason: That's cool!
That's such an advantage in what you do, in a sense, because you're both an expert in being a blind customer and an expert in Apple, and what's going on inside the company, I guess.
Sarah Herrlinger: That's exactly why I stole him away from the engineering team, and brought him over to become our tech evangelist because he is so good at being able to go to every team in the company, and really express to them the importance of the work that we do, and get them to really think about not just the blind community, but every community that we support.
[laughter]
Thomas Domville: Educate. That's the key word: educate everyone, and explain how to dive in and do it the right way.
Dave Nason: I've seen in my own work the difference that passion can make. You know what I mean? It's not just dryly telling them these are the features; when they can actually see a human being using those features, and the difference it makes. I'd say that goes a long way when you're speaking to an executive or a project manager or whatever.
Dean Hudson: Yeah. Yeah. No, there were a few times when I would just bring my device to an engineer on the audio team, for example, and say this is wrong, this doesn't work. Can you guys do something about this? They're like, oh my gosh, you've been using this? We should fix this.
[laughter]
Thomas Domville: That's awesome!
Sarah Herrlinger: A lot of years of great work being done.
Dave Nason: Fantastic!
Of course, we're in the middle of a very busy time of year for you guys. We had the keynote on Monday. I would say one of the highlights of the show was the announcement of Voice Control, and that demo that we saw. Do you want to kind of tell us a little bit about it? We saw highlights. There's probably plenty to talk about around Voice Control.
Sarah Herrlinger: Yeah, we're really excited about Voice Control.
One of the things that has been important to us as an accessibility team is to continually look at new user groups that might not otherwise be able to use our technology. How do we keep pushing forward, and making sure that everyone who wants to use an Apple product has the opportunity to do so, and has the tools available to make that simple and easy and fun?
Voice Control is a feature that was built with individuals with extreme physical-motor limitations in mind. It is individuals who wouldn't be able to use their devices unless they were able to use their voice.
What Voice Control does is give them full access to their devices. It is built into both Mac OS and iOS platforms, so for any iOS device or Mac, being able to really control and use your device with just your voice. That would be all elements of navigation, opening apps, opening menus, moving around on the different devices, as well as things like dictation, text editing, and doing those things in a seamless fashion, so moving from one to the next, saying open Pages, dictating text, then saying open Photos, and doing something in your Photos app. Things like that all sort of moving through seamlessly, and not having to kind of move from one to the other in a more stunted way. We wanted it to be something that was really useful and efficient for those users who rely upon their voice.
Dave Nason: I sort of speculated on Monday that maybe it was built on the same framework--if that's the right terminology--as VoiceOver and Switch Control, so that if you designed for one you design for the other. Is that the case, or is it a whole different...
Sarah Herrlinger: It does take advantage of the accessibility API that's built into our software developer kit.
One of the messages that we really try to express to developers this week is how important it is to use that accessibility API, and how, when you do, you get so much from it. With all three of those utilizing it, it's sort of: if you're concerned about one group, hey, you're going to get the other ones for free. We definitely want everyone to use this, and to really be good digital citizens when it comes to accessibility, because this is the foundation for so much of what we do.
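[For illustration: a minimal Swift sketch, not code discussed in the episode, of the labeling Sarah describes. The view controller and button names are hypothetical; the point is that one label set through the UIKit accessibility API serves VoiceOver, Switch Control, and Voice Control alike.]

    import UIKit

    class PlaybackViewController: UIViewController {
        // An image-only toolbar button with no visible text.
        private let playButton = UIButton(type: .system)

        override func viewDidLoad() {
            super.viewDidLoad()
            playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

            // One label feeds all three assistive technologies:
            // VoiceOver speaks "Play", Switch Control announces it while scanning,
            // and Voice Control can act on "Tap Play" or show it under Show Numbers.
            playButton.accessibilityLabel = "Play"

            view.addSubview(playButton)
        }
    }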
Thomas Domville: That's so intriguing; it really gets me thinking.
I'm trying to grasp how that framework works. The elements you see on a given page, are those the same as what VoiceOver sees? Are you able to go into more specifics, like "go right four"?
Sarah Herrlinger: Well, so, to give an example. When you think about how important it is in VoiceOver to label elements on the screen, to label images and buttons and things like that, one of the things that both Switch Control and Voice Control then do is hook onto those individual elements as well. For example, with Voice Control, one of its features is being able to say Show Numbers. Then any of those elements, anything that would be tappable or clickable--depending on whether it's an iOS or Mac device--becomes something that shows up on the screen, so that you could, for example, in the Photos app, say Show Numbers, and then say tap 14, and it is that specific photo that you're trying to get to, which really improves the efficiency for someone using voice. Underneath, it's also using that same framework.
Dean Hudson: I think even Eric gave an example in the keynote the other day--or not the keynote, sorry, the State of the Union--where he tried to click on an element, and it didn't work. He said "see, it didn't work because it doesn't have an accessibility label." It definitely hinges upon the accessibility API underneath.
Dave Nason: That's cool. I was actually thi--
[crosstalk]
Dave Nason: Sorry!
Sarah Herrlinger: It's all the same foundational API, but we also allow even more specific APIs, so developers who want to create even better experiences for Voice Control and such, or for a Voice Control-only experience, they can do that, too. The API is the common base, but it allows very detailed customization to make really great experiences for each one of these types of assistive technology as well.
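[For illustration: a hedged Swift sketch of the kind of Voice Control-specific tuning Sarah alludes to. We are assuming the accessibilityUserInputLabels property that shipped alongside Voice Control; the button and its names are hypothetical.]

    import UIKit

    let sendButton = UIButton(type: .system)

    // What VoiceOver speaks for this control.
    sendButton.accessibilityLabel = "Send message"

    // Shorter names a Voice Control user can say instead,
    // for example "Tap Send" rather than the full spoken label.
    sendButton.accessibilityUserInputLabels = ["Send", "Send message"]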
Thomas Domville: Wow! That is amazing!
In a roundabout way, this is also going to bring up VoiceOver accessibility issues. Like you just mentioned, elements or buttons that are not labeled correctly, and that definitely has to be brought to the forefront with developers if they want to take advantage of this--
Dave Nason: Exactly.
Thomas Domville: --control. What if you had... I know a lot of these elements sometimes just have pictures. How does it know what to say for that particular picture?
Sarah Herrlinger: Well, actually, we have a new feature that we've added in this year that I think will be very helpful in that area. I'm gonna let Dean grab this one 'cause he's--
[laughter]
Dean Hudson: We're trying to wait 'til we get to the features, but you guys have pulled it out of us.
[laughter]
Dean Hudson: One of the features--and I'm going to get loud because talking about features I get excited--that we're introducing for iOS is to auto label buttons. If a developer puts in a hamburger menu or just puts a picture on the button, we will, through machine learning--you probably heard a little bit about that during the keynote--determine what that button might be labeled. Having tested it, it works pretty well. I've used some crazy applications, and it does a pretty good job. That's sort of how, if someone does give a picture, we sort of auto label it, and that's how it would get picked up.
Thomas Domville: That's interesting because you have somewhat of a feature like that now when sometimes it will predict what it thinks it is. That works--
Dean Hudson: In text.
Thomas Domville: This is more enhanced.
Dean Hudson: This is more, yeah, in images.
Thomas Domville: Oh, wow!
Sarah Herrlinger: Yeah.
Dave Nason: Now, it's reading text, isn't it, that's visually on a button?
Sarah Herrlinger: Yeah.
I think one of the things that we always try and do is build on what we've done, but never stop working on any of these features. We have had this available for text. Now, we are using machine learning as it continues to grow and grow, and using that to be able to, as Dean said, try and figure out what that is an image of, and give you that information. It might say button, possibly Home, or whatever it might be, so that, as best we can, we are trying to add that additional information for you, so that you have more context. As always, we try and work with every developer, and tell them the more important thing is that you do actually go through and label all of these, and that's why we have tools like the Accessibility Inspector in Xcode, with its auditing capabilities that give you information along the way as an app designer to know what you can do to be a better accessibility citizen. On top of that, we know that when people don't, we want to try and make sure that we are improving that situation, and really using tools like machine learning to make that better for the community.
Dave Nason: Cool! I kind of have this idea that maybe even people who don't have any motor issues, but who just have their phone on a charging stand at their desk, rather than picking the phone up, might just look over at their phone and use Voice Control. Have you found yourself doing that, Sarah, at all?
Sarah Herrlinger: Just start using Voice Control on its own, even as someone who is not in the community? Yeah, I think Voice Control does have applicability that can go beyond the specific audience that we kind of looked at as the sweet spot. I think one of the things that we will find as time goes by is the number of people who use this in many other circumstances. I've certainly had members of the media and such thus far say things like "This will be great for me as I'm driving in my car", or all different kinds of possible use cases. We look forward to seeing how people use it. It's been interesting even for us to see how people who are not members of the blind community turn on VoiceOver for things here and there. We know that a lot of these types of assistive technologies can be used for other use cases, but first and foremost, we want to make sure we're making the best tools for the communities that rely on them.
Dave Nason: Yeah. Absolutely! I was just thinking, I suppose, in the context that it can really help to drive use of the accessibility API if a larger group of people in the media are talking about it.
Dean Hudson: Uh-huh.
Sarah Herrlinger: Yeah.
Dave Nason: Dean, you alluded to other accessibility features, or other new features. Is there anything else that you want to--
Dean Hudson: Yeah. I'd first like to say WWDC was remarkable this year. We unleashed a ton of features that we're really excited about.
Another piece of that is being at WWDC, we just saw developers one after another very excited about making their apps accessible. I just wanted to put that out there. We did a couple of events--just sort of mingle events--where we had tables set up, so people could see some accessibility features. It was just packed, and you could just feel the excitement when engineers came up and asked about making their app accessible. What do I do? You have these long conversations. Well, you could do this, and try this. It was just very very cool.
I'll start with VoiceOver because that's what I use on the Mac. One of the things that we were very excited to bring to the Mac this year is LibLouis. That gives us more than 80 languages for braille, and that's really really cool. As you guys know, LibLouis is open source, and so it's continually growing, so we're continuing to get more languages.
Also, we've had a lot of requests for VoiceOver and braille to bring over sort of a single-word mode. For iOS users, you'll know this. When you use your braille display, you've probably got maybe an 18- or 12-cell braille display, and iOS only shows 1 item at a time to sort of help you maximize that little space that you have. Well, on the desktop, we brought that option there as well, so that if you have an 80-cell braille display and you turn on this option, you will see one item at a time. You can go back and forth between either of those modes. That was really cool.
We've also improved braille input typing. We know that there are some people in the blindness world that use braille, but they are very very fast typists. We know those folks are out there. We've made it now, so that you can type as fast as you need to to get what you need done, and it'll just work great.
Thomas Domville: Bravo! Bravo!
Dean Hudson: Yeah, yeah, yeah, yeah. We definitely listened to all of you guys, and all of you. We've improved braille support on pages that support ARIA. There were some issues even in Mail that we addressed, so now when you paste text, we don't jump to the top of the email. We keep it right there. There were some issues with Messages. We fixed that, and in FaceTime. Some really really great braille improvements. We think you guys are really going to love it!
Dave Nason: That's cool! I know some braille users are going to be very happy to hear that!
[laughter]
Dean Hudson: Yeah, oh yeah.
Thomas Domville: Definitely!
Dean Hudson: Yeah.
Dave Nason: Has much of that made it to iOS as well, or was that Mac-specific?
[cross-talk]
Dean Hudson: That's all both iOS and Mac.
Dave Nason: Wow, that's cool.
Sarah Herrlinger: Yeah, including LibLouis, which is also on Apple TV, too, so all of our braille-supported platforms we've brought those LibLouis tables to, to expand what you are able to access. Even when you think about things like the fact that on Apple TV you can get captions through your braille display, if those captions are provided in one of the LibLouis languages, that's a great way to be able to get that information, too.
Dean Hudson: Some other things that we've done for VoiceOver on the desktop is custom punctuation.
This is very big if you are a coder. When you're reading email, that's fine, you can have different levels of punctuation. When you're reading code, it's very important to customize your punctuation, so that you see the symbols that you need to see. It will be sort of like activities, sort of built in. You switch between Mail and Xcode, and you'll get the right punctuation level.
Dave Nason: You can kind of say okay, I'm in Xcode, I need to hear the colons and the semicolons--
[cross-talk]
Dean Hudson: Yeah.
Dave Nason: --the brackets.
Dean Hudson: Right. In Mail, I don't necessarily need to hear that.
Sarah Herrlinger: Rather than just having that some, most, all setting, it gives you a lot more granularity in how you can do that customization. This is another one that is also on iOS.
One of the other cool things about it is that through CloudKit, you can sync those preferences from one platform to the other. That which you set up as a custom punctuation on Mac OS will automatically be available to you on iOS.
Thomas Domville: That's beautiful! Any improvement in Xcode especially code-reading capabilities, that's a big improvement.
Dave Nason: The cloud bit. I have to say, as someone who uses both Mac and iOS, that even with things like keyboard shortcuts, you don't have to set them up again. The same with punctuation, just having it sync is--
Thomas Domville: Mmmhmm, mmmhmm.
Sarah Herrlinger: Absolutely! Thomas, I heard you mention Xcode. Dean, I know you have been really excited about Xcode as a coder yourself.
Dean Hudson: Yeah, Xcode is really huge. It's a big big application.
What we wanted to do is focus on where you spend most of your time, and that's in the editor. We've improved, for example, code completion. As you're typing in the name of a function or method, hit Tab, and it autocompletes. You can now access that.
Another point that we improved on is if you set--I'm going to get this wrong, I want to say landmarks, it's not landmarks--where you need to debug code, you set these markers. Those markers are now accessible.
We've also added some rotors now that will allow you to navigate between methods and between scope. If you've had nested if loops, you can now navigate between those. Makes it really easy for you to jump around in your code. Many many fixes around editing. We think that's going to make that experience a lot better.
Thomas Domville: That's beautiful.
Dave Nason: That's cool. We do get a lot of questions on applevis.com about Xcode. It pops up every now and then, as people looking for help with it.
Dean Hudson: We're continuing to work on that.
One of the projects that you guys have probably heard of is Everyone Can Code. We've done that with Swift on the iPad, but we know there's more there. Eventually, you're going to get to some levels where you need to use Xcode. We really want to focus on that to make that a fantastic experience.
Sarah Herrlinger: Yeah.
To stick with VoiceOver, but to jump platforms and go to iOS real quick. A couple of things to bring to your attention, one of which is just that when you go into Settings, you're going to find Accessibility in a different place. It's been upleveled in Settings, so rather than having to drill in by tapping on Settings, then going to General, then going to Accessibility, it's now at that top level of Settings, just below General in the flow. That was really important to us because we wanted to make sure that it becomes that much more discoverable for people, and that they use these features more.
One of the other things we've done with it is we've also built accessibility into the sort of setup flow, what we call Buddy, as you get a new device. While for a VoiceOver user, you may already know that doing the triple-tap on the Side Button will turn VoiceOver on, for some of those other accessibility features that people may well have felt oh, I can't get to this until after I get through setup, we wanted to make sure that those were ready right up front, so that if you need to invert colors or increase your font or things like that, you can find those earlier in the process.
Dean Hudson: Another one is customizable gestures for iOS.
Dave Nason: You got my next question.
[unintelligible]
[laughter]
Dean Hudson: Think about things like Control Center, Home Screen, App Chooser--App Switcher, pardon me. You can now assign those to, say, two-finger quadruple tap.
Thomas Domville: Oh, that's nice. That's going to be a game changer!
Dean Hudson: Yeah, yeah, we think so.
Sarah Herrlinger: Yeah, you can even assign Siri shortcuts to VoiceOver commands.
Dave Nason: Oh, fantastic!
[unintelligible]
Thomas Domville: Yeah, that's going to be amazing right there, customizable VoiceOver gestures. I love that!
Dean Hudson: Yeah. In fact, on both platforms we now have full keyboard access. If you have your iPad now, and you have it connected to a Bluetooth keyboard, even gestures--say the rotate gesture or the two-finger double tap and hold--you can now assign those to a keyboard command. You can perform those actions from your keyboard.
[crosstalk]
Thomas Domville: Oh, wow!
Dave Nason: I read a bit about there being new keyboard shortcuts across the platforms. Could you tell us a little bit more about what's been kind of added there?
Sarah Herrlinger: iPad OS has more keyboard commands in apps. I think that's connected to the full keyboard access that's now available.
Thomas Domville: Okay. I really love the new Siri voices, by the way!
[unintelligible]
Thomas Domville: That was beautiful. I liked hearing that! I'm guessing we will be able to use that as a VoiceOver voice?
Sarah Herrlinger: Mmmhmm.
Dean Hudson: Yes.
Thomas Domville: Awesome! Have we gained any new voices like Eloquence or anything like that?
Sarah Herrlinger: No Eloquence voices, but that... The new Siri voices are also available on the Watch, so that's another one--
[Unintelligible]
Dave Nason: We heard the U.S. one. Are there international new Siri voices, or at the moment is this U.S.?
Sarah Herrlinger: My understanding at this stage is it is starting with U.S., and I think we'll have to see where they go from there.
Dean Hudson: Yeah.
Thomas Domville: Well, moving Accessibility to the root of Settings, that is a big thing. It shows that Apple recognized that this should be up front for everyone. Sighted or not, they're going to come across it, be curious, and jump into it. I'm really excited that you guys finally put that up front, in the main section with the main components.
Dave Nason: That's been a step-by-step process, hasn't it? I remember when it moved from the bottom of the General up towards the top of General, and now it's into the--
[crosstalk]
[laughter]
Sarah Herrlinger: Our evangelism has worked!
[laughter]
[Unintelligible]
Thomas Domville: Good job, Dean, good job!
[laughter]
Dean Hudson: We haven't talked about some of the low-vision features.
Dave Nason: Yeah, I was going to ask that.
Dean Hudson: On the Mac... Do you want to talk about the Hover?
Sarah Herrlinger: Yeah, on Mac OS, we have a couple of great new features we've added.
The one that I'm most excited about as someone who is a glasses wearer and who does struggle with small text is a feature called Hover Text.
It's a new way to make it easy to view text on your Mac display. What you do is hover over any text with your cursor and press down on the Command key, and you get a dedicated window with a large, high-resolution text field showing whatever text is underneath that cursor. You can blow it up to 128-point, and you can choose the font that works best for you or that you prefer. You can also change the color of both the text and the background, and of the cursor that surrounds the text showing up on the screen. There's lots of customization available, so that whatever your vision needs are as a low-vision user, you can sort of customize it to work best for you.
One of the other things that I love is... For a long time, we've had a feature, and that is Say Text Under the Pointer. When you turn that on, you not only get this giant text customized the way you want it to look, but it will also speak that out as it's going over the element as well. Text that would be in a menu or in the Dock that might be smaller than you would want, you now have the opportunity to take any text and just blow it up on the screen.
Dave Nason: That's actually huge because there's some people who are not quite at full screenreader level. They don't need that, but they need that little bit of help sometimes with a bit of speech, and I think that's huge.
Also, I have an application at work which doesn't support screen readers very well in terms of keyboard commands, so I can use it with the mouse by rubbing the mouse over certain sections, and it'll read what's under there.
Sarah Herrlinger: Yeah.
We have another feature called Zoom Display which is for multi-display users. If you're someone who uses two screens, Zoom Display will let you keep one screen zoomed in close while the other one remains at standard resolution. It could be great for everyday work when you are just on your own working on two monitors in an office, but also one of the other applications for it that we've seen thus far is in terms of doing a presentation. Maybe you want your audience to see the screen in that standard resolution, but you want to blow up something on your own device, so that you can zoom in on areas, and get more information as you are presenting out to the world. A really cool way to think about multi-display users, and how low-vision users might use them differently than someone else.
Also, we added Color Filters in the same way that we have them on iOS. These are filters that support things like color blindness, and we have filters that are specifically built for different types of color blindness, but also being able to do just a straight colored tint over the screen. We've received feedback from individuals with Irlen Syndrome and other types of vision challenges where just being able to have the screen tinted to a specific color to do any kind of work on the device has been really helpful. We're excited that that has moved over to the Mac, too.
Dean Hudson: I know you guys had a question about--I'm going to get the name wrong--but a feature that allows a developer to develop their iPad app, but then move that over to the desktop.
Dave Nason: Project Catalyst.
Dean Hudson: Catalyst, yes. Thank you!
The question was will accessibility be intact, and happy to say that yes it will.
Thomas Domville: Oh, wow!
Dean Hudson: If the developer does accessibility work on iOS, that will transfer to Mac OS.
Dave Nason: That's going to open a huge opportunity for a whole range of apps.
Thomas Domville: We were both talking about that, when we saw that demonstration where you were able to click that little checkbox for Mac. We were wondering if the part of the system that analyzes your code would be able to take the accessibility along with it, or improve on it, and point it out to them in certain areas. We had thought about that.
Dave Nason: Will that then, I suppose, automatically change the hint text? For example, it might be double tap to select on the iPad app, and VO-Spacebar to select on the Mac.
Dean Hudson: Yeah, some of those little things we have to work through, but for the most part, they look exactly the same. Some of the sounds we've brought over to the desktop.
Sarah Herrlinger: The nice thing for developers is they can use that iOS accessibility API, and it just ports over to the Mac. The time and effort and energy that someone puts in on one pays forward into the other.
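[For illustration: a minimal sketch, under our own assumptions, of what Sarah describes. The hypothetical iPad view controller below compiles unchanged for a Mac Catalyst build, so its accessibility labels and hints carry over to VoiceOver on the Mac.]

    import UIKit

    final class LibraryViewController: UIViewController {
        private let shareButton = UIButton(type: .system)

        override func viewDidLoad() {
            super.viewDidLoad()
            shareButton.setImage(UIImage(systemName: "square.and.arrow.up"), for: .normal)

            // The same accessibility properties drive VoiceOver on iOS and,
            // through Catalyst, VoiceOver on the Mac -- no separate AppKit work.
            shareButton.accessibilityLabel = "Share"
            shareButton.accessibilityHint = "Shares the selected item"

            #if targetEnvironment(macCatalyst)
            // Room for Mac-specific refinements where needed.
            #endif

            view.addSubview(shareButton)
        }
    }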
Dave Nason: It's a really interesting project.
There was mention onstage that Twitter is suddenly back. There was a lot of talk--
[unintelligible]
[laughter]
Thomas Domville: Yeah, definitely.
Can you say, Dean, whether developers have tools of any kind that can analyze their code, and let them know where they're lacking in accessibility, and where to focus to make improvements?
Dean Hudson: Yes. The Accessibility Inspector is where to go, and we've made, over the last few years, several improvements to it.
One that is really cool is that you can audit an application. The developer can have their application up on their iPad, target that iPad, and change accessibility right there. If they see a button that's not labeled, they can label it there. They can touch it on their iPad, and suddenly it has the label. We highly recommend, at the very least, that the developer runs that audit tool, so at least they know the areas they need to go and fix.
Dave Nason: Have you ever considered--people will ask this on the site sometimes--requirements along those lines, as opposed to recommendations, or is that something that's possible? I know accessibility is such a broad thing, and every app is different, but we kind of wondered that.
Sarah Herrlinger: Yeah, that is one of the things that we... We look at this issue a lot. It's not something that goes unnoticed, but it is a very complex issue.
I think as we look at how many things fall under the term accessibility, as well as the levels of accessibility of something. Even if you look at just VoiceOver, what is the stamp that says seal of approval? We're constantly trying to look at new ways--including things like doing the machine learning automatic label detection--to try and make it easier, and to build these tools to be more comprehensive, and to be simpler and easier for developers, so that they have fewer reasons not to do it. We want everybody to just do it, and make it so that it's not even necessary to have a listing, but mostly we just want to try and do everything we can to make everything as accessible as possible.
One of the other things to note as well in terms of auditing, we also now have a new accessibility audit tool for web content in Safari. That's another area where we've tried to look beyond apps, and into web content as well.
Thomas Domville: Oh, that's nice. I appreciate you being up front because you're right, Sarah, the complexity.
I can't imagine defining what the word accessible means. For a blind person, that's one thing. For low vision, it's one thing; for those with dexterity or motor issues, it's another thing. It's not a cut-and-dried scope that we could just stamp and say, you guys have got to do this. I can't imagine the complexity that has to be behind something like that. Obviously, we can't just say this is VoiceOver-accessible, because then you're singling out all the others that have other accessibility issues.
Dave Nason: Even accessibility is connected to usability, and I might find an app very intuitive and you may not, or...
Dean Hudson: Yeah. I mean, I have plenty of times where someone says "is this accessible", and I say it's accessible for me, I can use it. Someone else may go, I don't like that--
[unintelligible]
[laughter]
Dean Hudson: It's a really really gray area, but it's something that we're striving to make easier as Sarah said, and I think we're going to get there.
Thomas Domville: That's awesome.
Now, one of the things you guys were talking about in the keynote, and I had wondered, the new gestures to do a three-finger pinch to copy and a three-finger spread to paste. I thought, oh, that's so brilliant. I suppose that can be used as a VoiceOver custom gesture?
Dean Hudson: We have accommodated that, yes.
Thomas Domville: Awesome!
Dean Hudson: We have some gestures that you can use to do that, perform those actions.
Sarah Herrlinger: Yeah, I think as with everything, our goal, even for things that would be considered general mainstream elements of the OS, is that we always do try and be thoughtful in how a VoiceOver user could navigate or use it, and also how someone using Switch Control could, or someone using Voice Control. I mean, we look at all of these different elements, and try and be as thoughtful about each as we can.
Dave Nason: That does bring us back, actually, to a related question that I guess I forgot to ask earlier, which was Voice Control and VoiceOver. Can they play together, or are they distinct in terms of features?
Sarah Herrlinger: I would say at this stage, much in the way that VoiceOver was initially built as a feature for the blind community, our goal with Voice Control was to be able to support those with extreme physical-motor limitations. We look at that first and foremost.
If you use headphones with Voice Control and you're a VoiceOver user, you may be able to get functionality out of it. When we do these, we often sort of look at: let's build out one thing, make sure we've got it, and then we continue to iterate from there, and do more. In the same way that the way Zoom and VoiceOver work together has improved over time, and the way other things have happened, I think we want to come out of the gate with something that's really a great feature for the community that needs it most, and then figure out from there how we expand.
Dave Nason: Absolutely! It's got to be one of the most complex features you've built in a long time, I would imagine.
Thomas Domville: No doubt. I'm thinking, too, is that just yesterday somebody revealed how there's a new feature within Accessibility for those on iPad that can use their little mouse. They can actually use that as a cursor pointer.
Sarah Herrlinger: Yeah. We do now have mouse support for iOS. It is a part of Assistive Touch.
Just to give that little bit of background on Assistive Touch. Assistive Touch is another one of our features that we created specifically for individuals with physical-motor limitations, which allows them to be able to use the device when they may have very limited dexterity, but some. For example, if you can only use one finger, and one finger alone, to work the device, then when you start thinking about things like how do you do a four-finger swipe, or a pinch, this is something that was built in to support those users. A logical extension of that is someone who may need--they aren't using their finger itself on the screen, or on their devices, even on computers--but they use something like a joystick or an assistive mouse that allows them to be able to use the device, and navigate in an alternative fashion. Adding in mouse support on iOS is really, first and foremost, meant to make sure that another community that might not otherwise be able to use a product has that opportunity to do so. We're getting feedback that other people are appreciating it as well, and that's fantastic. We really initially look at how we make sure that we continue to widen the users who are able to use our products in their own individual unique ways.
Thomas Domville: Oh, no doubt. I do have clients who will use that mouse, and they will hover over something, and it will speak back to them what they're hovering over, so little things like that that I've seen in the desktop realm... We always try to wish for things on iOS and iPad OS and things like that, so any new features like that are very welcome across all lines of disabilities.
Sarah Herrlinger: Yeah. I think one of the key things with this is what we wanted to do was figure out how to use a pointing device like you would use your finger. Not so much reimagine how an iPad and a mouse would work, but really focus on how you can get that sort of touch functionality, but using a mouse.
Thomas Domville: Now that we're kind of wrapping things up, I am curious if you guys have any other comments or further features that you would like to reveal to our listeners that may not have been discussed at the keynote, or in the mainstream, in terms of iOS, iPad, or the Mac?
Dean Hudson: Well, one we haven't talked about that was not in the keynote, but was in another presentation, is the Apple Card. I know that there's been some concerns about how that would work for people who are blind. I've been using it here, testing it, and it's fantastic.
One of the things that I've experienced with credit cards is you get this bill, paper bill, and I have no idea what that thing says. I can scan it, and even then it doesn't tell me where I'm spending my money. Now, having that all accessible on iOS is amazing. Just thought I'd put that out there.
Sarah Herrlinger: Yeah. I would say, just sort of in the bigger picture, we didn't even get to all of the things, even just for the blind and low-vision communities, that we've done over the course of this set of updates.
To add in one more, just a quick one. Zoom went through a pretty big redesign on TV OS to make it easier for individuals with low vision to be able to control and navigate their devices. I think we could pull out a few more, but in thinking about time, well...
Part of it, I would say, is just go in, and start exploring because I think really in all the nooks and crannies, you're going to find different settings, different new things that are there that are helpful.
We want people to take advantage of it. We want people to give us feedback. To give the plug for the accessibility@apple.com email address, that is our customer-facing email address. We appreciate that we get a lot of great feedback every day from our users on how things are working for them, whether it's asking us questions, reporting bugs, whatever it might be. We would love to get your thoughts on the work that we've been doing, and it helps us to figure out what we keep doing into the future.
Dave Nason: Great stuff! Well, I think that about does it for us today. Thank you guys again for joining us! We really appreciate it on what is a really busy week, I'm sure. Sarah and Dean, thank you so much, and Thomas, thank you for joining me today!
Sarah Herrlinger: Absolutely! Thank you so much for having us!
Dean Hudson: Yeah, thank you!
Dave Nason: Thomas, interesting conversation.
Thomas Domville: Oh, indeed!
I'm sure I'm like everybody else. I was just ready for the next thing, ready for the next thing, ready for the next thing, but yet I'm so focused on what they had to say.
I really love the time that we had to spend with them in details, but as always it's never enough time. I'm so blessed to have these two people that probably are so busy in their life already! We were just so honored and blessed to have at least a half hour with these folks.
In general, for our listeners: for every single thing they discussed here today that is new and revealing to you, there are probably 10 more new things under the hood that we're going to see when iOS 13 comes out. This is the exciting part about this year, Dave. This is not like iOS 12, where we had a few things and that was it! It sounds like we have a lot of little changes coming our way, along with some big changes that were not announced, like--
[crosstalk]
Thomas Domville: Oh, yeah.
Dave Nason: I think we hit the highlights, but there's definitely a lot of little hidden gems hopefully. I think--
[crosstalk]
Thomas Domville: I thought the low-vision people got a huge boost in the Mac area--
Dave Nason: I think that was overdue as well. I think Zoom and some of those users probably had felt a little neglected versus VoiceOver users in recent years. It looks like Apple have really put an effort in this year to make sure that they really caught up to where they want to be.
Thomas Domville: It tells you the significance, because they made a point of it. Not only on the Mac, but as Sarah was saying, and she made a point of it, they revamped it in iOS, which is long overdue. But especially, first and foremost of everything that's new, they came out hard and heavy on the braille stuff, and that was huge.
Dave Nason: I had Scott Davert speaking in my ear the entire time while they were talking about braille.
[laughter]
Thomas Domville: All the millions of questions I'm sure a lot of people have, but just knowing the fact that they're focused on braille this year is, first and foremost... it was way, way overdue, and I'm so ecstatic and excited to hear that.
Dave Nason: Crossing all our fingers and all our toes that the performance is there.
Thomas Domville: Yeah. Speaking of crossing fingers with toes, what did you think about the customizable VoiceOver gestures?
Dave Nason: That's really cool! Really really cool!
Thomas Domville: I mean, I'll have to see how deep--
Dave Nason: See which gestures? Yeah.
Thomas Domville: Yeah. How deep can we get with it?
It sounded like the keyboard is going to be where it's going to really take hold. You can re-do some of the gestures with the keyboard. If it's complicated already for us to do a four-finger double tap on something, oh my gosh, we can now make that easier, but if we can intertwine this with a certain thing that I want to use day-in-day-out, that's a game changer.
Dave Nason: Yeah. Absolutely!
Even like iPhone X and above that don't have the Home button, maybe some people struggle with those new swipes, the new Home gesture and the new App Switcher gesture, so maybe they could replace that with a two-finger double tap or whatever it might be that they find easier to perform.
Thomas Domville: Mmmhmm.
I like the fact that Voice Control, as a whole, was meant for a specific group of people with a disability. On the whole, it ties in with the VoiceOver API, which means that you're going to knock out two birds with one stone, really, because--
Dave Nason: That's why I love the media attention Voice Control is hopefully getting, because it drives the developer to go, I'm going to actually put the effort in to do that, and we'll get VoiceOver support for free alongside that.
Thomas Domville: Exactly. Boy, I'm going to have to have a label on that, so that they can say something or whatever now. I think it works hand in hand.
On top of that, Dean was really getting excited that we now finally have a more accessible means to code with Xcode, where Xcode was so alien and a lot of things just didn't work the way we wanted. They put an emphasis on coding, so those who've always dreamed of being a coder can now have that dream come true. Plus, I love how, when they analyze the code for an iPad app to move over to the Mac, that includes the accessibility with it.
Dave Nason: Mmmhmm. Yeah. Yep. I think that was a question a lot of people would have had...
I think they renamed it. That was Project Marzipan last year.
Thomas Domville: Correct.
Dave Nason: It wasn't an overwhelming success, I think. Even Craig said onstage "look, we learned a lot. That was 1.0, and this is 2.0 now." They've given it a new name, and a new lease on life, I think, hopefully.
[laughter]
Thomas Domville: Well, Catalyst in itself is a whole different separate topic because then we can go on forever because there's a lot of things we want to know. How's this going to work? How's it going to feel? How's it going to smell? Everything about it, Marzipan which is now Catalyst, is going to be very interesting. I can't wait to dive in!
I'm excited that they finally put Accessibility under Settings. I heard that rumor before this cast, and I was excited to hear that they put that upfront and foremost with other important buttons under Settings.
Dave Nason: Yeah, and it's not down at the bottom. She said it was right underneath General, so that's--
Thomas Domville: Right underneath General where you find Display and Brightness.
Dave Nason: Mmmhmm. I think it is positive, and it's good to see that.
As you said, we're looking forward to getting stuck into iOS 13. Hopefully, the whole team will be, as usual, beta testing over the summer.
[Unintelligible]
Thomas Domville: Stay tuned.
Dave Nason: Hopefully.
Thomas Domville: We will have more information for you. Whether it's in terms of podcasts or on the website, come to applevis.com to check in during the summer, and definitely check back in the fall when iOS 13 and everything else gets dropped, along with Catalina, the new Apple TV, and the new iPad OS. Check out AppleVis for all the latest and greatest, what we've found, and what you can expect in terms of accessibility and many other things.
Dave Nason: Thomas, I think that about wraps it up. Thank you again for joining me!
Thomas Domville: It was quite an honor. I enjoyed it so much! I hope you did, too, Dave.
It was an amazing experience to talk to those two, especially Dean. Now that we've introduced Dean, who is blind and has been working there since 2006, his input on the show was awesome, awesome, awesome.
I loved this! Thank you!
Dave Nason: Thank you so much!
My name is Dave Nason. This is the AppleVis Extra. Thanks for listening! Bye-bye!
[music]
Announcer: Thank you for listening to this episode of the AppleVis Extra. To learn more about us, visit our website at www.applevis.com. Follow us on Twitter @Applevis. Like us on Facebook.
Comments
apple accessibility
Well, this podcast was better regarding asking questions about accessibility. If things work out in September, iOS 13 for blind individuals will be Lucky iOS 13. Looking to see all the nice features that will come out this year for us. Thomas, seriously, "Eloquence"? I would have asked about more voices that will be similar to Siri in quality instead of using Vocalizer. It may be the year for accessibility with iOS 13. Nice. Long live the Apple.
Nice Job
Thanks for this excellent episode, and thank you Sarah and Dean for talking about some of the great things we can look forward to with these major updates. I was talking with a sister on the phone this morning, who is a VoiceOver user and like me has manual-dexterity issues. I briefly mentioned the voice control thing to her, and our conversation reminded me that she really needs to come over to AppleVis. So thanks once again for all the great work you and Apple do. It is very much appreciated.
Great to hear from Apple
It's always great to hear people from Apple engaging with our community on Applevis, and while I'd love to have more direct engagement from them, I appreciate it when it happens.
Based on this podcast, there's a lot that I'm looking forward to in iOS 13/iPad OS. Greater customisation of reading punctuation is something I've been wanting for a long time, and custom VoiceOver gestures and keyboard shortcuts will be great too. I take it that if you can assign Siri shortcuts to VoiceOver gestures, and VoiceOver gestures to keyboard commands, you'll be able to assign a Siri shortcut to a keyboard command by first assigning it to a VoiceOver gesture, and then assigning that gesture to a keyboard command? If so, I'm going to be using Siri shortcuts a lot more after this release. I haven't been using it much, because I haven't found many ways in which it can make tasks quicker or easier, but if I can assign keyboard commands that will all change.
Novelty voices
Hi! Someone asked about using the novelty voices on iOS in another post. I would like to say that you have already been able to do that in iOS 12/11, I believe. Just go to the "Speech" settings in VoiceOver to download the extra voices. Also, it would really be awesome if Apple would just stop being so this-way-or-no-way with VoiceOver… and allow scripting for others to make apps more accessible or just enhance VoiceOver itself, like NVDA addons. I'm happy about the 80+ braille tables that will be added, however. Great job all the same.
great job on accessibility!
I would like to say great job on accessibility! Apple is really doing it seriously this time, compared to last year. Yes, the part with Voice Control and VoiceOver playing together will indeed work in due time. That's one feature I'm looking forward to testing once released. And custom gestures are an added feature as well, although I'm fine with using the standard gestures.
Also, having the accessibility option up front is a big plus! Previously I had to go deep into General to find it, but now with this change it'll be a lot easier. Great job, Apple accessibility team!
All in all, great podcast!
It's not what they say, it's what they do that matters
Days after this podcast, Apple released its newest version of Logic Pro X. For blind users with Time Machine, it was a hassle to restore the previous version. For blind users who did not have Time Machine and who were updating software automatically, the newest version of Logic Pro X is a catastrophe. There has been much chatter on the Logic Accessibility Google Groups forum about this catastrophe.
In short, Logic Pro X's newest version is much, much less accessible to the blind. Thus, we see that no matter how pretty the words might sound, what Apple releases can be, and in the case of Logic Pro X is, a disaster for blind users.
Nice podcast, but… Dot
Really??? We're going to do this?
As respectfully as I can, I ask you: what's not accessible? I use the new version just fine. I'd be glad to help you work around the problems you're having. The Inspector has changed some, that's true, but you can still use it. I'm sure there is an update on the way to fix the problems you're having, as I have the same problems myself. Are you in the VOLogic WhatsApp group? You may want to join if not.
https://chat.whatsapp.com/LMXkPB8ksKVChgXlH6B2fz
Mostly good
Hello,
Most of what was said in the podcast was on the positive side. However, VoiceOver on the Mac has fallen way behind the Windows screen readers in functionality. Even the next version of Narrator is going to have scripting capabilities like both JAWS and NVDA. Working on the web is much easier in Windows than it is on the Mac. Safari is a great browser if you can see, but if you are using VoiceOver it's not so good. Focus issues make it difficult to use Safari and VoiceOver to do any real work. I should say, at least that is the way it is for me.
So I'm told
Hi Dominique. I'm glad you replied, and I thank you for offering to help. Happily for me, my Logic Pro blind teacher, Steve Martin, warned me to avoid the newest version of Logic Pro X, so I restored the previous version using Time Machine before I ever tried using the new version.
Afterward, on the Logic Pro Accessibility group on Google Groups, I saw blind Logic users trying to get back to the previous Logic version because of accessibility problems they were having with the newest version.
I'm afraid that's about the full extent of my knowledge when it comes to accessibility problems. You know more than I do. I sure do wish Apple would warn people, though, but maybe they do. I've turned off automatic updates, and I intend to read release notes before downloading future versions of Logic Pro X. Who knows? Maybe I'll use the newest version after all, given what you said about the problems only existing in the Inspector. Unfortunately, though, I do use the Inspector for a variety of tasks, such as changing gain in selected regions, pitch correction, fading in and fading out and crossfades, etc. Are you able to perform these tasks in the Inspector with the newest version of Logic Pro X? Also, which DAW are you using? I've just switched from a Focusrite Clarett 4pre to an Apogee Element 88, and I'm having a few issues with the iOS control app for my iMac Pro with the Element 88.
Thank you again for your feedback. I'm glad the problem isn't as bad as I was told.
Contact me offlist...
Hello! Please FaceTime audio/iMessage me off site and I'd love to chat with you. Anyone else as well who would love (some) help on getting started. My email is probably in my profile, but I have no problem putting it here in my comment.
[email protected]
I use a Behringer UMC202 audio interface.