AppleVis Extra 107: Exploring Apple’s Latest Accessibility Innovations with Sarah Herrlinger

By AppleVis, 30 May, 2025

Member of the AppleVis Editorial Team

In this AppleVis Extra episode, David Nason and Thomas Domville (AnonyMouse) interview Sarah Herrlinger, senior director of Global Accessibility Policy and Initiatives at Apple. They explore Apple’s ongoing dedication to accessibility, spotlighting exciting new features designed to better support users with disabilities. The conversation covers several highlights, including Accessibility Nutrition Labels, Braille Access Mode, Magnifier for Mac, and the role of AI in accessibility enhancements.

Key Highlights:

  1. Accessibility Nutrition Labels

    • A new initiative that provides standardized accessibility info for apps.
    • Developers will showcase features like VoiceOver and captions.
    • Designed to increase awareness and help users easily find accessibility details.
  2. Braille Access Mode

    • Available on iPhone, iPad, Mac, and Apple Vision Pro.
    • Enables quick note-taking, calculations, and BRF file access with Braille displays.
    • Supports live captioning for DeafBlind users to improve communication.
  3. Magnifier for Mac

    • Turns your iPhone into a magnifier for Mac users.
    • Uses a secondary camera to enlarge physical objects.
    • Includes zoom, color filters, brightness controls, and OCR with text-to-speech via Accessibility Reader.
  4. AI and Accessibility

    • AI remains a vital tool in accessibility advancements.
    • Enhances image recognition and descriptive capabilities.
    • Continues to be integrated to improve experiences for visually impaired users.
  5. User Engagement and Feedback

    • Highlights the value of user feedback in shaping accessibility features.
    • Encourages users to send suggestions to [email protected].
  6. Share Accessibility Settings

    • A new feature lets users temporarily transfer their accessibility settings to another device.
    • Makes it easier for family members to help with troubleshooting and tech support.

Listeners are invited to share their thoughts on these features and suggest any other accessibility needs they’d like Apple to consider.

Transcript

Disclaimer: This transcript was generated by AI Note Taker – VoicePen, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.

Dave: Hello there, and welcome to another episode of the AppleVis Extra. My name is David Nason, and I am delighted to be joined once again by Thomas Domville, also known as AnonyMouse, of course. And this is an exciting episode that we, I want to say, annually, semi-annually do, and that is an interview with Apple's Global Head of Accessibility, Sarah Herrlinger. So, looking forward to this one, Thomas.

Thomas: Right. I mean, you're right. That is a mouthful. I had to look that up. Senior Director of Global Accessibility Policy and Initiatives. I'm like, wow, I wonder if that actually fits on her business card in one line. There's no way; they only print so small. But no, you're right. This has been long overdue. It's been, gosh, almost three years since we've talked here. So I can't wait to ask her some of the things that are coming up this year, plus some other questions we have always been wanting to ask her.

Dave: Yeah, so without further ado, Sarah, you're back. Welcome back to the AppleVis podcast. How are you?

Sarah: Thank you so much. It's such a pleasure to be here. I love getting the chance to hang out with you both. I'm doing well. It's been a very, very busy spring. I feel like every year things tend to speed up, not slow down. So lots to do.

Dave: Well, you get two WWDCs, don't you? Because you get GAAD and WWDC.

Sarah: We do.

Sarah: It's a really fun thing that we get to use Global Accessibility Awareness Day to make our announcements and kind of pre-announce for the community in a way that really nothing else comes out about our stuff until WWDC. So we love being able to have our day to shine. And then we also have a lot to talk about at WWDC this year. So super excited about that, too.

Thomas: Well, first of all, thank you for coming on. My gosh, it's been, you know, three years, and I was like, oh my, how time flies when you're having fun. But, you know, this is kind of an interesting trend that you guys are starting to set for the week of Global Accessibility Awareness Day, and that is to introduce what's coming up in accessibility for us to enjoy later in the year. So I was like, this is great.

Thomas: So let's go ahead and dive into that, because a lot of people are raving about a few of the new things that are coming. And I thought maybe we could just highlight some of the big ones that we think might be useful for our users to know about. And let's start with the first one you guys announced, and that is the App Store's Accessibility Nutrition Labels.

Thomas: And I thought, you know, at first I was going to say, how many calories is that and how much saturated fat does that include? And we got a lot of comments saying, well, I didn't know that that was a thing. And so why don't you tell us what exactly is this new label and how is that going to help us?

Sarah: All righty. I got you, too. And one thing I will say up front, that term nutrition label is actually something that we've used in other places. So if you look at privacy in the App Store, it's also and has been for a while referred to as nutrition labels. So that's kind of where that came from. But this is something we're really excited about. We think it's going to have a huge impact. Now, currently, we've noticed that developers today find a variety of ways to share information about the accessibility of their apps with users.

Sarah: But we really wanted to create a consistent way for developers to highlight accessibility features. And we want it to be a way that's super easy for users to find and understand. Accessibility Nutrition Labels are an extension of kind of the longstanding work that Apple has done to provide developers with tools, documentation, and training to create great accessible experiences.

Sarah: We're really excited for these labels to come to the Apple ecosystem, and we expect it'll bring a whole new level of accessibility awareness, both for users and developers alike. So really, this is kind of the codification of being able to talk about different elements of accessibility in an app, to users in a way that's really clear and laid out in grid format, essentially.

Dave: That's great. Did this come about through user feedback? I wonder. I think we even had a conversation on this topic on this very podcast a few years ago, and I'm sure it's popped up in our report card and things as an idea people would like, so...

Sarah: Yeah, well, I think it's one of those things that we've talked about for a while. We've always tried to make sure that, you know, we are giving people the most accessible experience we can. And I think for us, it's really just an extension of how we try and do it within our own apps. And we see the app store as an extension of the larger ecosystem. So we just want to make sure that other developers...

Sarah: hopefully follow in our footsteps of trying to up-level accessibility as something that they not just talk about, but really prioritize in their work.

Thomas: So explain to us what we will see when something like that is up. Does it just say this has been flagged by the developer as supporting VoiceOver, or how detailed does that get?

Sarah: So the way that it will manifest or show up is, a developer, when they're putting their app into the App Store, will now go through a process where it will say, does your app support a number of accessibility elements? And those are VoiceOver and Voice Control, which we know are essential tools for many users when they interact with their devices. There are also some vision features such as larger text, sufficient contrast,

Sarah: dark interface, differentiate without color alone, and reduce motion. And then for any app that has media elements in it, calling out whether that media, those videos, includes captions or audio descriptions. So they're going to be presented with that, if their app does support these things, and within that there will also be information for them on what it means to say supported.

Sarah: So it's not just choose, it's here's what we mean when we say, does it support a dark interface? And then if they still don't understand exactly what that means, we'll also be connecting them to resources so they can understand, here's how you use our APIs and code for this. So we want to hold their hand in the process.

Sarah: But if they say, yes, I do support this feature, then as a user, when you go to that app's page, you are presented with which elements they support. And again, with that, it sort of gives you, you can drill in on that just as to, in a general way, what that means.

Sarah: Like this developer has said that they support a dark interface, which allows somebody to, you know, I'm sorry, I don't have the exact words, but, you know, deliver this in a way where there's a dark background with lighter text, but not lose the color of the photos. It won't invert them or things like that. So it will give you a little bit more of that information about what we mean when we say supported.

Dave: Absolutely. And I think it means the user is getting that information, but also it's raising awareness among developers, I guess, because they're being asked upfront these questions. And if their answer is no, now this might give them a push to look into it.

Sarah: Yeah. I mean, our goal is for everyone to support everything. So wherever we can, we want to make sure that we are giving them that info and then, again, helping hold their hand in the process so that they want to learn more. And then as they do, they will over time be able to change the nos to yeses.

Thomas: Is that something that is optional or is that a requirement for them to use? And will that be based on an honor system or is there some sort of certificate or certification that goes through that?

Sarah: So at the start, since Accessibility Nutrition Labels are brand new, we want to give developers time to compare and evaluate their apps. So at the start, it will be voluntary to do it. But if they do not, if they were to not click on any of them, it will say on their page, this developer has not yet provided any information about the accessibility of their app.

Sarah: So in a sense, going through that process of the yes, no is a part of their process regardless. Whether they choose to say, yes, I support dark mode will be voluntary at the start.

Sarah: Over time, they'll be required to share their accessibility support, but we do want to give them ample time before it's required, so that those who may not have really thought about this in the past as much as we would have liked understand what it means.

Dave: Will there be, either now or at a later date, do you envisage any kind of user validation? If a developer says yes to VoiceOver but we find it's not supported, will there be a mechanism for us to feed that back in?

Sarah: Well, as with all elements of the App Store right now, if a person finds that there is something that a developer is reporting that is not correct, the first step is to report it to the developer directly and tell them this isn't working.

Sarah: You can also do things, again, as you would do right now, and give them a low star rating or write a review that says, this doesn't do what this developer says it does. But you can also then report to Apple. And if we find that a developer is saying they're doing something

that they're not, that is the point where we step in and investigate and make sure that they understand, you know, or take action. I would phrase it that way.

Thomas: I think this is very promising. I absolutely love this, Sarah. I think this is a great start. And over time, I hope this just raises more awareness, because now it's up front rather than just something I've heard of or seen. Now, when you actually have to put things up onto the App Store, you're going to see this. And so this is great news.

Thomas: Now, on to the next feature. I've got to say, this is probably going to be the biggest one that I think a lot of people have been discussing and raving about, and that is Braille Access. Tell us a little bit more about this feature and what people can expect from it.

Sarah: Yeah. So Braille access mode is another thing we are super excited about. So it's designed to be used with a connected Braille display on iPhone, iPad, Mac, and Apple Vision Pro. It's a new way for Braille users with displays to interact with their Apple devices. And with it, when you launch Braille access, you're then presented with the Braille access menu where you can do things like quickly take notes in Braille format.

Sarah: You can use the Nemeth Braille calculator to perform calculations. You can open BRF files directly from Braille Access, hopefully supporting being able to unlock a wide range of books and files that were previously created on Braille note-taking devices or through other means, unlocking a lot more books and files in BRF.

Sarah: So lots of things you can do there. For deafblind users, you can integrate a form of live captions, so you can transcribe live, real-world conversations directly onto Braille displays. And yeah, so it's just lots of different things that we're now doing within this Braille Access mode to support users.

Dave: Sounds great. I'm not a Braille user myself, so I may not fully grasp it yet, but do you have to have a Braille display connected, or is there some kind of mode that is just on your Mac, for example? There were a few questions on the forum about that.

Sarah: Yeah, no, this is definitely built to require a connected Braille display. So its intention is for someone who is a Braille display user. And our idea here is just to kind of create a solution that's deeply integrated with the Apple ecosystem.

Sarah: So that ability to kind of do note taking and have access to the calculator and all of that that is meant to be done directly on your Braille display, but really giving you the advantage of being able to use any refreshable Braille display and use the one that works best for you.

Thomas: Wow. That is huge. You know, kudos to you, Sarah, for learning about Braille users and displays. You probably had no idea what the world was like on these things. A lot of those Braille note-takers do have those applications, but to make it integrated at the OS level on Apple is just tremendous. And I think it has a lot of potential.

Thomas: So I imagine over time that you guys are probably going to expand on that and introduce some more things within the Braille Access area. Have you had thoughts about implementing a lot of the AI features within Braille Access in the future, meaning, I don't know, whether the writing tools would be part of that, and other options like that?

Sarah: Yeah, I think, you know, we're going to continue to implement AI anywhere that we think there is opportunity that works well for our users. I think, you know, we've been huge proponents of AI, artificial intelligence and machine learning, within our accessibility team for a long time now.

Sarah: And so I think as we see more ways that it makes sense, we're always investigating and trying to figure it out. And yeah, I mean, even with that last element, you know, Braille in general, I have to laugh. I've had a Braille display for years because I have always tested, you know, I kind of try and make sure things work. I am not by any means a proficient Braille user.

Sarah: I can proofread my own business cards, I'd put it that way. But it's been important for us as a company to support Braille, and we have done so really since we first kicked off VoiceOver. Gosh, we're at the 20th anniversary of VoiceOver on the Mac this year. So I don't know if we had it in our very first year there, but certainly Braille has always been

very important to us, and we do believe that, particularly for the deafblind community, this access should be something that, as much as we can, we build into the products.

Thomas: Look at you. That's awesome. I love hearing that you've been dabbling with the Braille display, so you know what it feels like and how difficult it can be to learn. But that's awesome.

Dave: Yeah, and I think Braille has been a big winner two years in a row, really, now. Was it last year we had the BSI improvements and the Braille command mode as well on the iOS side? So, yeah, it's brilliant.

Thomas: Yeah, I'm actually curious. What about the BSI? I mean, what about the love for BSI users? Are they not going to be able to access the Braille notes? Is this just for display users only?

Sarah: In this case, yes, it is built specifically for external Braille displays. But duly noted, you know, we always take as much feedback as we can from people and try and see if there are ways to keep building out. I think one of our cardinal sort of rules on things is none of our features are ever, you know, quote unquote finished. We like to always iterate and see where there's room to do more.

Sarah: So I love getting that kind of feedback and I'll make sure that the other folks on the team get it as well.

Thomas: Awesome. And let's not forget our low vision folks out there. Now, this is a pretty slick little tool, and I would love to hear about it from you: we are able to use our iPhone as a magnifier for our Mac. So explain that feature.

Sarah: Yeah. Oh, gosh, this one I'm super excited about as well. So, as you know, we've had magnifier on iOS since I think it was 2016. And it is, you know, kind of a way to do sort of a Jack of all trades of being able to do many things to support the community. But at its heart, it's being able to take things out in the physical world and be able to magnify them up.

Sarah: So, you know, tiny letters on pill bottles and things like that. For Magnifier for the Mac, we really tried to design it in a way that reflects how people use the Mac, and make it a different, you know, a different app that leverages the power of the Mac.

Sarah: So the idea behind it is that you can take a secondary camera, whether that be your iPhone or use the continuity camera or another type of camera, and mount it to your Mac so that it becomes your camera to see out into the world. And with the solution on the Mac,

Sarah: we are really trying to make the most use of the Mac's Canvas and enable you to be able to organize and store materials. So unlike iOS, which is really sort of more a feature for the Go, this is really a document-based app. So what you do is when you have that connected, you're able to launch Magnifier and then see whatever it is you are pointing the camera at. So imagine if you are a college student in a lecture hall

Sarah: that you don't have to sit in the very front row. You can be wherever you are in the classroom. And then same as with the magnifier on iOS, you can zoom in up to like 10 times or you can zoom in on things. But with it, you're able to change brightness, contrast,

Sarah: color filters, and also even perspective. So you don't have to have your camera exactly focused perfectly on the whiteboard in front of you. It kind of figures out how to get you a clean image of it. And then again, you can change color filters or whatever you might need to be able to see it. And then you're able to capture that image and kind of make a slide of it.

Sarah: So imagine if it's your science class, you can, you know, take a picture of whatever is on the whiteboard and then keep all of those files in a folder. So over time, you have all of the things that you captured from your science class or your English class or, you know, philosophy or whatever it might be.

Sarah: And use those throughout the school year, just as anybody else might have been capturing, you know, drawing whatever it was that was on the screen or in some other way capturing that information.

Dave: I wish I'd had this 25 years ago, Sarah, when I was in school. Because I was that kid in the front row kind of still struggling to see the board. That kind of thing, it's brilliant.

Sarah: You're not the first person to say that to me. And in fact, I've actually had one or two people who have said, I want to go back to school now.

Dave: And there are people spending a lot of money on specialist equipment to do this now. So again, it's being able to do that with the laptop and the phone that you have. Yeah.

Sarah: I agree. I think it's, you know, it's another example of how we're trying to do something that is really low lift, but with a lot of value to members of the low vision community who, you know, have these devices and want to be able to use them in a way that works best for them.

Thomas: Now for the million-dollar questions. There are two things here. This is a possible new revenue stream for you guys. I mean, you just said that you can mount the device onto your computer. Hey, that's a perfect thing, a little holder or stand that you could sell, right? And so now people are going to be scrambling to figure out a good stand for that. But the million-dollar question is, you know, a lot of the devices that are similar to this technology that you are introducing here

Thomas: will allow you to not only to snap a picture of the, say the whiteboard, as you were mentioning, maybe the PowerPoints and stuff, but have it be able to read to you. So AI versus, so OCR the screen. So you can get that jotted down to notes because sometimes a lot of written things are probably going to be easier if it could OCR the screen for you. Is that a possibility or a thought that you guys thought about?

Sarah: Yeah, I'll hit both of those. There are a lot of Continuity Camera mounts and iPhone-compatible mounts and stands that are out there in the world. So definitely some great ones for people to find and use. But yes, one of the really cool things about Magnifier is we've also integrated it with a new feature we're adding this year as well, called Accessibility Reader.

Sarah: And Accessibility Reader is a system-wide reading mode that's designed to make text easier to read for users with many types of disabilities. So that could be low vision, it could be using Accessibility Reader as a member of the blind community, or also things like dyslexia. So really built to be something that supports a lot of different types of disability. But it gives you new ways to customize text and focus on content that you want to read.

Sarah: When you think about what we've had in the past with Safari Reader, this is kind of a bigger, better version. It's not just in Safari; it's in any app out there that has text in it. You can launch Accessibility Reader as well from within Magnifier, so you can snap a photo of whatever it is in front of you.

Sarah: And as well, another thing, sorry, with Magnifier, you can use it to read, you know, physical pages. So if you are in a bookstore and you just want to read a page in a book, or again, as a student, if it's your textbook, you could set up your mount over the page of your book and have it read off of that. But it is integrated with all of our support for spoken content.

Sarah: So as it's taking that picture, you can change the font and the color and the spacing of things on that page as well, but then you can also use spoken content to read it out to you.

Dave: Is it fair to say then that Accessibility Reader is kind of that Speak Screen feature, now supercharged?

Sarah: Yes, I would. I mean, in many ways, yes. It's what we've done in the past with spoken content, being able to read, you know, obviously in things like the Books app or in Safari or, you know, any of those types of things. But this is really trying to make this something that's available everywhere there are chunks of text.

Thomas: That's awesome. You know, there are so many more features coming out, and I think we pretty much touched on the highlights, the real big ones that I think are going to be very impactful for our community. Kind of wrapping this down a little bit, I am curious. I've always wanted to ask you this question, and that is: how do you guys as a team determine what new accessibility features will be introduced each year?

Thomas: Is that something that the team comes together and kind of votes on, like, are we going to do these, and can we fit this on a roadmap? Or do you get ideas from outside? I'm really curious.

Sarah: I suppose the shortest answer I can give to that is yes. By that I mean, you know, first and foremost, we always... believe in the disability mantra of nothing about us without us. And our goal is that we build with, not for. That starts with employing people with disabilities inside our own teams, and not just within the accessibility team, but within teams across Apple.

Sarah: So we get a lot of feedback from people all over within our own buildings, in the Apple world, who are saying, gosh, I just wish my device did this. I mean, when you think about something like People Detection, that came first from one of our engineers on the accessibility team, who just wanted to know when a line moved as he was out and about in the world as a member of the blind community.

Sarah: And it, you know, grew to be so much more. So, you know, we get some from our own ranks. We get some from the accessibility@apple.com email address. And, you know, it'd be wrong of me not to make a plug for that at some point in this conversation, as always. So when people write to us there, even it can be that they're saying, gosh, based on my specific disability or disabilities,

Sarah: I can do this and this, but then I hit up against a roadblock on something. And so I just wish my device would do this. And so we think about that and try and take in that feedback. I mean, anything that gets written to us there, gets processed and sent to the team. So we are constantly mining within that for feedback on whether it be on new ideas or bugs or whatever it might be.

Sarah: But we do that. We go to conferences. We talk to, you know, do sessions at conferences and talk to people at those to try and make sure we're understanding what the needs are. And also, we're just looking at whatever is the latest and greatest within Apple in general. So if a team says, hey, we're building a faster processor into this device, or we're improving the camera based on X, Y, and Z, our teams are

Sarah: constantly in conversations with those teams to say, okay, well, what does that mean and how might that benefit and allow us to do something we've wanted to do but we couldn't do until now because of, you know, whatever might be the issue. So, yeah, we're just, you know, constantly looking at what's the next thing that makes sense.

Dave: Speaking of that, of cameras and evolving technology, I suppose you guys were pretty ahead of the game, I think it's fair to say, with things like screen recognition, you know, which is utilizing machine learning, I guess, what we now call AI technology. Now what's become huge for us in our community over the last couple of years is image recognition through LLMs, through large language models.

Dave: And obviously, with Apple, you have the built-in, you know, if you tap on an image, it'll give you a very basic image description, which again is from machine learning. Now with things like Be My AI, there's been huge development. What do you see as the future in terms of Apple's development of that technology and how it can help the blind and visually impaired community?

Sarah: Yeah, again, we're always looking at what's the right thing at the right time. And when we feel like we have the solutions that we can implement, we do it. So with AI, it's funny. Someone was telling me recently that this is actually the fifth boom of AI in the world, and the first one happened in the 1950s.

Sarah: It's funny that it's now a big thing in this latest iteration, but we're always trying to look at how we can do more, and I think we'll see. Certainly, one of the things we didn't talk about, but with

Sarah: what we're doing with live recognition on Apple Vision Pro and bringing it there, giving you more information about your surroundings for things like people detection or furniture detection or text detection so you can read your mail with Vision Pro on. So there's certainly ways that we're already looking to implement this more and more. But this element of AI, I would say, is in its infancy. And so lots of room to see where we can go with it.

Dave: I know you won't be able to comment on this, but if you could give us a pair of glasses with the cameras.

Thomas: Put her on the spot. Put her on the spot.

Sarah: I'm sorry, you know my answer to that one. As you know, we always like to keep our surprises.

Thomas: There you go. That's a good answer. That's good PR. The marketing people will also appreciate that answer, right?

Dave: Exactly so.

Thomas: You know, that is a hot topic. I mean, you have got to be hearing about it left and right. I know you are, and you probably are getting tons of emails about this. And yeah, right now it's a big boom for AI, and we have had a lot of success and a lot of great stories coming out of these AIs being able to have things described, more so than just objects.

Thomas: So, you know, this is like an explosion of overwhelming detail that we're getting back, and not only that, but we're also getting video description and things like that. So we can't wait to see if Apple will be thinking of something like that for us down the line as well. So the one thing you mentioned, Sarah, that I really love is that you do take input from our blind community.

Thomas: So do you encourage us to continue to send emails to the accessibility team with features and thoughts that you could include in next year's iOS? Would that typically work?

Sarah: Yeah, for really any of our operating systems or devices, that is our, you know, easiest direct way to get feedback to our developers and our engineers. So whether it be giving that feedback of, I wish it did this, or, here's a roadblock I'm currently hitting, or whatever it might be, or even just to ask us questions. You know, a lot of

Sarah: What we get into that account is also, I'm brand new to needing an accessibility feature, can you help me figure it out, can you provide me resources, or things like that. And we get tons of feedback that comes into that every day. But there is a team that solely works on supporting that account and responding back to people. So we do get back on everything we get in there.

Dave: That's great, Sarah. I wonder, you know, when we look at our own website and the report card that we do at the beginning of the year, that kind of thing, it feels like the operating system that people are maybe struggling with the most still is the Mac, versus iOS, when it comes to VoiceOver, certainly.

Dave: Is that something that you sense as well? Not that it's in a bad place, but is it maybe the place where there's the most to do in terms of bringing it up to where customers actually want it from a VoiceOver perspective?

Sarah: Well, I think, you know, we do get feedback and we appreciate the feedback. And I think also with it, there are a lot of different people who configure their devices in different ways. So I think giving us as much feedback as you can on where people find problems is super important.

Sarah: I know sometimes it feels like you may be writing in and someone says, oh, I need your log files or can you record a video of this or can you give us more information on, you know, which version of the operating system you're using, which model of device it is. Is it an M1 device? Is it an M3 device?

Sarah: All those different things that feel like we're asking a lot, but it really is based on our trying to figure out exactly what is happening for the user, so that we can then figure out what's going on and really pinpoint things. So I think we just want to make sure we're getting as much feedback as we can so we can help make the products better.

Thomas: Well, thank you, Sarah, for taking time out of your day to do this interview with us. This has been wonderful, educational, and insightful. Is there one last thing, or one more thing, that you would like to add? Is there anything that was not announced ahead of Global Accessibility Awareness Day that we could possibly see at the upcoming WWDC when it comes to accessibility?

Sarah: You know, one thing that I don't think we touched on in much depth, but that I think is rather cool, is being able to share accessibility settings. The idea behind this is being able to temporarily transfer your settings from your iPhone or iPad to someone else's. And what I really love about this is, for years I've

Sarah: always heard people talk about how they appreciate the fact that their family IT department may be someone who's a member of the blind community, or somebody who is even a quadriplegic, or whatever it might be, but how that is the person who is the most knowledgeable about tech. And in some cases, a family member is coming to them and saying,

Sarah: "Gosh, I can't get such and such to work on my device," and you can say, "All right, here, give it to me." You accept it the same way you do with AirDrop, or sharing contacts by moving the devices close to each other. And then you can get on there and go, "Oh, I figured out what you did. You flipped this toggle," or whatever it was, and make those changes for them. And then when the session ends, it reverts back to their setup, and yours stays on yours.

Sarah: But it's just a great option to be able to use someone else's device in the way that you need to do it.

Dave: As tech support for my family, I'm very appreciative of this one.

Sarah: Yeah. So I think that's going to be fun for people as well.

Thomas: Exactly. And I love how that works. You just put the devices close to each other and they'll detect it, so there's no sign-in or anything. That's going to make it very easy. I love that. Well, awesome. Any other questions, Dave, that you have for Ms. Sarah before we let her go?

Dave: Okay. I just want to say, yeah, huge thanks. You've been very generous with your time. And, yeah, we really always appreciate it when you come and join us on the podcast.

Sarah: Absolutely. Well, thank you very much for the invitation. As I said, it's always a pleasure for me to get to spend time with you guys and chat again.

Podcast File

AppleVisPodcast1668_0.mp3 (34.78 MB)

Tags

Interview
News


Comments

vanity vanity everything is vanity.

By roman on Wednesday, June 4, 2025 - 14:04

I didn’t like the podcast, because everything Sarah, Tom, and Dave discussed was already well known. It felt like Sarah just joined up with the AppleVis crowd to do them a favor and feed us nonsense. All she did was go on about how great Apple is and what they’re supposedly doing — which, in reality, is nothing!

While I personally have no trouble using Apple products, I do empathize with Oliver and others who are struggling. I’ll say this: they absolutely should not — and I mean not — have done the podcast with her as the mouthpiece. She said absolutely nothing that gave me any sense of hope or excitement.

Oliver

By Holger Fiallo on Wednesday, June 4, 2025 - 14:04

iPad 9 and Slim Folio keyboard. I use the keyboard for typing, but the hotkeys do not work well. The keyboard has its own hotkeys. I use both the touch screen and the keyboard. If Apple made the iPad work more like a laptop, it would be perfect and I would be in heaven. We'll see, but I'm not holding my breath.

Just Listened to This...

By Ekaj on Wednesday, June 4, 2025 - 20:04

Subject pretty much says it. I really enjoyed this interview with Sarah. One of the features I'm looking forward to is Braille Access Mode. As I write this my eReader is connected to the Mac and I've some time to kill so more practice is in order. I thought I read something about NLS including a note-taking feature in the next update to these eReaders. It seems the person got a bit mixed up there, as it's Apple and not NLS who is implementing these features. In any case, this and the other new stuff will be wonderful.

My thoughts. Don't come for me

By Winter Roses on Thursday, June 5, 2025 - 10:04

I think Apple is about as good as it’s gonna get for now. I’m not a braille user, but I never quite figured out the intricacies of braille screen input. Honestly, for anything braille-related, I’d prefer a physical device. Even if it’s only for typing, doing it on a touchscreen has never worked well for me. I don’t know—no matter how many years I’ve spent using the iPhone, braille screen input is one aspect I’ve never been able to get the hang of. I stick to typing with the onscreen keyboard, and if I’m in a rush, I’ll use dictation for short messages. It’s funny, because dictation used to be amazing. Back in the iOS 7 days, I thought it was working well. I used to get like 90 to 95% accuracy when I dictated messages. But ever since iOS 8, something changed. It’s never improved since then, at least for me. These days, I rarely use it unless it’s absolutely necessary.

When I was younger and had more free time, I used to send emails to Apple’s accessibility team. I don’t live in the US, Canada, or the UK, so I don’t have access to the accessibility phone line, and I’m not sure if the feedback app even allows people like me to request a callback. A lot of the time, you’re still required to be in one of those supported countries to get help. So for me, email was the only way I could reach them. I’d send detailed feedback too — not just “this isn’t working.” I’d write out what was happening, what I expected to happen, and I’d even offer ideas on how to fix it. Obviously I’m not a developer, I don’t know how to code or build software, but I’ve always respected that technology takes time. I never emailed them saying, “GarageBand is broken. Fix it. Fix it now.” I made screen recordings showing exactly what was going wrong. And still, the reply I got was usually, “We’ve never heard of that issue before.” Or, “Thanks for sending us this feedback. We're working on it,” and then, crickets! Like when I explained that you can’t trim videos properly in the Photos app, or that you can’t stretch a photo across the duration of audio in iMovie.
Now, I’m not trying to bash Apple or say they don’t care about accessibility. Obviously I’m grateful for what we do have. But better is subjective. If there’s a product out there that works better for you, go use it if possible. Laws are only as good as the paper they're printed on. There are many cell phone manufacturers out there, and not many of them are accessible, especially out of the box, even if the law says that they should be. Apple can survive without my money. I wonder if I would be going after them if they never made their products accessible in the first place. You don’t owe loyalty to one brand. Yes, Apple’s one of the best in terms of accessibility, but that doesn’t mean there’s no room for improvement. It’s okay to say, “Hey, I’m thankful for this, but here’s something that could be better.”

I don’t use a Mac, but I hope the accessibility team keeps improving over there for the people who do. I remember seeing someone post that Apple was working on something for the Mac and they were looking for testers—something involving that blind musician, Matthew Whitaker, I think? He said he had some kind of connection with Apple and was asking for volunteers. Does anyone know if anything ever came of that? Or did it fizzle out and disappear?

Anyway, I don’t use braille on my iPhone in any form, so I can’t speak to that. I only use my iPhone, and for the most part, it gets the job done. But back when I used to send feedback, I put effort into those emails—structured them well, gave examples, attached detailed screen recordings. Still, every time I’d speak up, I’d get treated like I was being negative or ungrateful. I’m not saying every feature has to benefit me personally. I get it—some features aren’t for everyone. But when you try to respectfully point something out, and people start dogpiling you, it’s exhausting. You know what else I try to live by? If I see someone say something I disagree with online, and I scroll down and see that ten other people already said exactly what I was gonna say? I don’t say anything. There’s no need to pile on. Especially in the blind community. Nobody needs to hear how wrong they are a hundred times.

To the previous poster

By Jonathan Candler on Thursday, June 5, 2025 - 14:04

100 on this. In fact, multiple times I've sent feedback to Apple regarding issues I had — multiple issues at that, even with screen recordings and logs — and I've always gotten, "Oh, we've never heard of said issues," or "we're not getting that on our products." Which tells me that they don't test on various devices and configurations. Again, it's not our job to have to send system logs. Apple has great engineers; I expect them to do better when it comes to this kind of stuff. It's all useless at this point, lol. What's the point if all I get is that, instead of, oh, I don't know, something like, "We'll see if we can replicate the problems you're experiencing and find a solution"? That would be the best approach here, instead of giving us the run-around. Frankly, I'm disappointed, and it didn't use to be like this!

Re: Mac Frustrations

By mr grieves on Thursday, June 5, 2025 - 16:04

Firstly, it's weird we can't subscribe to podcast threads in here.

Anyway, the Mac. My opinion of Applevis is that it is a safe place to talk about the reality of using Apple products. If someone is coming on here and just moaning because they hate Apple as a company then that doesn't make sense. But if someone is having genuine issues with an Apple product then I don't understand why this isn't a good place to voice that. It's one of the things I personally love about this site.

I use my Mac all day long for my job. But it is my machine, I spent an eye watering amount on it.

The problem isn't that I can't get most things done. The problem is the amount of time, effort and cognitive load that using the Mac requires which is down to how clumsy and bug ridden VoiceOver is.

The problem is that every major version of the Mac introduces more bugs than it solves. Towards the end of Sonoma, Chrome was actually becoming quite a decent option, but since Sequoia it is often borderline unusable.

A lot of these things depend on what you are using it for. If you are using the Mac for leisure then it might not feel so bad. If you are under pressure to get work finished and the focus is jumping around the place, or you are getting applications not responding, or something that was working yesterday is just randomly not working today for no good reason, then this is not acceptable and not something I would have to put up with if I could see the screen.

I don't hate Apple, far from it. And I don't hate everything about the Mac. If it was totally unusable garbage I would just move on and be done with it and the word Mac would never surface on here.

The problem is that stupid little bugs are wasting my time all of the time. I don't expect to be as efficient now I am blind as I was when I could see, I had too much of my life with sight to expect that. But I don't expect to be lied to about what's on screen. I don't expect to have to repeat the same action over and over again trying to find the magic trick I need to perform on this occasion to make it work, knowing that the next time I do the exact same thing it will be different. I shouldn't have to have several browsers lined up so I can flip between them as one decides that today it won't put focus in a combo box, or won't read the text I've typed properly.

If I wasn't using the Mac as my job and wasn't having to deal with these things under time pressures then I would likely be a lot more forgiving.

All these things are personal. If some people love their Macs, that's great. But there are plenty of us trying to be productive and the Mac is seemingly doing everything in its power to prevent this. So it is also reasonable for us to expect more and voice those concerns.

There is probably a wider discussion about what our expectations should be as blind users. Maybe I get frustrated because I have similar expectations as when I could see. If our success criterion is "I can get something done eventually," then the Mac is fine. If it is to be efficient and to feel that the software is working for us, not against us, then I feel the Mac lets me down. It is putting me at a disadvantage compared to my sighted colleagues. And to be honest, as I lost my sight relatively recently, I am already at a disadvantage because I am nowhere near as efficient personally as I was when I could see anyway. I can accept it when the problem is with me, less so when it's not.

Anyway personally I think it is good that we are all sharing our opinions.

I think you touch on it well…

By Oliver on Thursday, June 5, 2025 - 18:04

I think you touch on it well there, it's extra load.

Sighted users know something can do a task; until we try, we don't know. There are many fringe uses for us beyond emailing, writing some short documents and, well, web browsing. I was going to say sticking within those rather limited boundaries is okay... But it isn't really; Mail is difficult to use too, or inconsistent at least.

On top of being slower, on top of having to find other ways of doing relatively simple tasks because certain apps don't work the way we need, or because we've not yet learned to use an app (which takes longer), or an app breaks, we're also being asked to chase down bugs without any assurance they will be fixed.

They have been better, but I don't have the time or the energy to work for Apple for free, fixing their errors. We're not beta testers; we're customers.

Fine, pay us, give us discounts, make it worthwhile beyond a vague suggestion that it will be looked at, even if we've reported the issue many times before and it's not been rectified. But I'm a busy person; I'm not going to work for free on something I've paid for.

Saying that, before Denis chirps up: I do report bugs, and I do take part in the betas... But there will be times when all I can do is give a brief overview of the issue, the platform, and how to recreate it. When they, like today, ask for information I've already put in my original report, it erodes confidence.

Fire them all, replace them with AI... Then fire us all and replace us with AI. The world will be a better place.

I guess that's my thing…

By Jonathan Candler on Thursday, June 5, 2025 - 23:09

I guess that's my thing, right: while sighted people are able to blaze through screens, we're actively, for lack of a better word, bogged down because of the way VO behaves on the Mac. If I wanna get things done, I expect the Mac to follow me just as fast as my fingers are able to fly through screens. Not be bogged down by a lot of "application name is not responding." The fact that this is still a thing in 2025, in places where it happens for no reason at all, is no excuse! Mail is slow, Safari is slow, everything is slow, and if people are still having these issues on a brand new Mac, we got a lot of problems, thank you very much! Again, some people do not understand, so I'll say it very slow for the people in the very, back. I, expect, VO, to, be, just, as, good, just, as, a, sighted, person, would, be, able, to, use, their, Mac! I will also, say, this! If, sighted, people, were, having, some, of, the, same, laggy, issues, they'd, be, fixed, right, away! I don't understand why some people are having issues comprehending that!

Jonathan Candler

By Holger Fiallo on Friday, June 6, 2025 - 13:10

It's supposed to have the best chips and fastest CPUs, yet VO is not working well? Just asking, since I only use an iPhone.

The expectation of expecting screen readers to be as fast as

By Igna Triay on Friday, June 6, 2025 - 14:02

The expectation that screen reader users — whether it's VoiceOver, JAWS, NVDA, etc. — can be as fast as sighted people is, to be blunt, false. It's not realistic. No matter what screen reader you use, and no matter how fast it is, a sighted person will always be faster. Sight works faster than sound. I'm not saying you cannot be fast using a screen reader; you can. But you will never be as fast as a sighted person. As an example, a sighted person might take a few seconds to read the screen, scan the different elements, and click on the button they want. That takes a few seconds at most. A screen reader user, on the other hand, has to navigate the screen until they find the button and then press it, and given that one has to listen to what the screen reader says, that will take longer. Even if you're fast, having to listen to things introduces a slight delay — yes, even if you only listen to things partially. I'm not saying a person using a screen reader can't be fast; they can be. But to expect to be as fast as sighted people... that will never be the case; we'll always be a bit slower. By much? No — or rather, that depends on how proficient one is. But as fast as sighted people? That's a false expectation. Of course, there are ways to make things faster, e.g., mapping several actions to one keyboard shortcut, but that's another thing entirely.
Think of it like this: say there's a race to see who can turn on FileVault faster, a sighted person or a blind person using VoiceOver. The sighted person could do it in maybe, at the most, 5 seconds. The blind person, even doing it the fastest way possible — opening System Settings, pressing P to get to Privacy and Security, then using Item Chooser to go to FileVault — likely takes more than 5 seconds, by which time the sighted person is more than likely already done.
@Oliver... Why should we get paid by Apple when they don't pay anyone else for beta testing? If things were as you say, paying us or giving discounts for beta testing, they'd have to do it equally for every beta tester, sighted or not.

Re: efficiency

By mr grieves on Friday, June 6, 2025 - 14:30

That's not the issue here at all.

One thing is how efficient I can personally be at interpreting audio or braille or whatever vs being able to see the screen. I'm not trying to blame Apple for the fact that I am slower now than I was then for things within my own control.

What we are not accepting is the additional artificial barriers that we have to negotiate when using the Mac. For example, the other day I was editing something in Text Edit. VoiceOver announced one line. I moved up and it announced the same line. And it appeared that the same line was in there twice. And a line I expected was not there at all. But when I selected the text, lo and behold the text that was there was not what VoiceOver was telling me. Using Apple Books I get this sort of behaviour all the time. Or when focus is shooting off like a Catherine Wheel as you try to get round a web page. And so on and so on.

These things I have no control over. They make me slow for no fault of my own.

I admit I am starting from a place of personal frustration at my own capabilities but I think I have a right to a consistent, speedy and accurate method of controlling my computer regardless of the speed it personally takes me to process the information.
