So, this is admittedly more of a philosophy-of-blindness/conversation-starter kind of post than a technical question, so apologies in advance if it doesn’t quite fit the character of the forum.
My question: does anyone else feel more and more like Apple accessibility is taking a backseat to what sighted people want, instead of making both our experiences as close to equivalent as possible? What I mean is that, especially with the latest phones/iOS, accessibility seems more and more like an add-on whose processes are far less convenient for us than they are for sighted users. I’m thinking specifically of Face ID, the absence of the Home button, and the increased complexity of gestures for us compared to having a fingerprint scanner.
A certain Mr. Jamie Teh of Mozilla/Firefox accessibility fame has a phrase I like: “delightfully accessible.” I take it to mean that something is not just accessible, but also easy and convenient enough to use that people actually like using it instead of doing so begrudgingly. Thoughts?
I don’t find either Face ID or the gestures more difficult than a Home button, and I think it’s likely you’ll feel similarly after a few days to weeks. There is definitely a learning curve, and I remember thinking “this sucks,” but it actually doesn’t once you’ve worked out how to make it work for you. Disabling Raise to Wake helps a lot in feeling more in control of what the phone is doing. I’d try not to get downhearted; Face ID isn’t an inaccessible technology once you’re familiar with how it works.
So I posted something like…
So I posted something like this on a Hacker News posting of Apple's announcement yesterday, and got this reply.
latexr, 1 day ago:
It’s not just accessibility, everything about their operating systems is crawling with bugs piling up on each other. They have bugs in Passwords, bugs in Shortcuts, bugs in permissions, bugs in Clock… I can no longer even trust that setting an alarm will be done for the correct time.
thfuran, 23 hours ago:
It seems like just about the whole industry is massively over-prioritizing feature delivery over bug fixing and UX polish.
So it looks like there are bugs that affect sighted users as well, and the industry as a whole is paying less attention to them. But when they hit people with disabilities, they hit *harder* than they would for people without.
The bugs are definitely a problem, aren’t they?
Yep, I agree the bugs are a major problem. If you can’t see your phone, it really matters that it behaves predictably, because you can’t work out what’s gone wrong as easily as someone who can see.
I find that accessibility…
I find that accessibility almost always takes a back seat, especially when companies initially come out so strong. However, the difficulty of using Face ID over a fingerprint scanner is relative to the user's own abilities and experience. When I first started using an iPhone X and encountered Face ID and swipe-to-unlock, I was pretty baffled, frustrated, and annoyed. I literally had to go to Google and figure out how to swipe to unlock just to set up that phone. But I consider myself reasonably tech-savvy, good at targeting my questions to search engines, and decent at finding useful resources (like this site). So again, I can see where you're coming from in a broad sense: accessibility is always a secondary objective. But when it comes to fingers versus Face ID, I would disagree that the experience for people with vision loss is that diminished compared to those who only use Touch ID. To encompass all manner of accessibility, though, I think a true smart device would be able to recognize multiple biometrics rather than relying solely on one. The more necessary redundancies, the more accessible. Unfortunately, with that luxury comes more expense, more points of compromise, and potentially other things I haven't considered.
Just a comment on the specific things the poster mentioned.
I actually find Face ID much more convenient than the fingerprint scanner and the six-digit passcode, which I know iOS falls back to occasionally, or when Face ID doesn't work for you.
I find very little wrong with iOS. Admittedly I'm not a power user, I don't use e-mail or a braille display, but what I use it for works pretty much 100% of the time in the way I'd expect it to.
a11y, UX, and the impact of bugs
@Devin Prater put it very succinctly. Bugs are everywhere, and they hit us a lot harder because of the position we're in. One thing I've learned is that when an interface is bad for us, it's often bad for everyone. When I complain about yet another inaccessible website, it's often followed by a fully-abled person complaining that the interface is a mess for them too. The pessimist in me says they're just practicing some one-upmanship, but they do have a point. Tech is filled with awful interfaces that border on user-hostile. There's a reason that r/assholedesign exists on reddit. Yet another reason why accessibility and inclusive design helps everyone, not just people like us.
Bringing this back to Apple, there are so many bugs in Mac OS that I've lost track. I routinely hear people complaining about bugs in Apple's software. I routinely experience bugs and quirks that make Mac a pain to use. It's made all the more frustrating by their "it just works" message. This is what happens when companies prioritize shipping new features over ensuring stability and good UX.
I think another part of the problem is that we now expect more out of our devices. Other companies have started to get up and walk when it comes to disability inclusion and accessible design. Not quickly, but fast enough to make Apple's actions feel more akin to a crawl than the industry-leading position they used to hold. Games are getting accessibility so more people can play. I regularly play games like Stardew Valley, and I recently picked up The Last of Us, which doesn't even need mods to be playable. Google finally introduced competent Braille support in the form of Braille through TalkBack, instead of that heaping pile of bugs they called BrailleBack. I myself am a Salesforce admin, thanks to the a11y work they've done to make their CRM available to screen reader users, and with the help of an organization that specifically helps get disabled people certified: Blind Institute of Technology, for those interested. Charities exist promoting accessibility and inclusion in tabletop RPGs. More people are suing and pushing back against companies that don't make their services accessible: vision, hearing, motor, cognitive, and the like.
My point is, Apple needs to step up their accessibility game, but they're struggling to keep their abled customer base happy as is. We're still second in line for features. It's not fair and it's not okay, but it is what it is. When bugs happen, we're often the hardest hit.
Face ID is definitely a skill issue though. We can absolutely use it blind. It's not as good as Touch ID, but it does work.
Look at Google and Android
Look at other operating systems. Android is getting better; Chrome OS is getting better. Some smart TVs now have a screen reader.
I wasn't aware sighted people preferred Face ID over the fingerprint reader. It was more like Apple told sighted people they preferred Face ID.
I've said before, I switched to the number code with handwriting because I keep scraping off my fingerprints with hand tools. The fingerprint on a home button is convenient because you can activate, unlock and arrive on the home screen in a single action. Not so with the Face ID or the passcode, but I fear the good old days are gone.
* It's exhausting trying to edit down my overactive sarcasm. I had a lot to say about websites and other topics that got chopped in the reread.
I think I’d have enjoyed the sarcasm.
Not much genuine sarcasm around on the web. The American influence has sharp teeth, and one thing Americans don’t seem to do well is sarcasm.
Accessibility became a massive afterthought with the introduction of iOS 16, in which VoiceOver began randomly toggling itself off even when a device still had loads of space. When I first reported the bug last September, the accessibility rep blamed operator error, as I was the first to report it. Regardless of how that person was dealt with when I went over his head, the bug still exists to this day, several versions later, and even though I'm also a beta tester, I've not been paid any attention when I report that it persists. I'm put in mind of a hypothetical pair of conversations, one between Apple and a blind user, the other between Apple and a sighted user. Blind user: "VoiceOver won't focus properly." Apple: "We'll look into it." Sighted user: "I can't view my photos properly." Apple: "We'll have it fixed next week."
Face ID is ridiculous! Yes, it's usable, but for me to use it, I have to disable the attention feature because I don't open my eyes which defeats a lot of the advantages. I don't know why we can't have Touch and Face ID, and simply choose the one we prefer. So much for diversity of user choice, eh? Well, you can still get Touch ID, but only with the SE phones which would be the only ones I'd buy anyway, as they're the cheapest.
I completely agree, the bugs keep piling up. Blind people aren't as important as other groups. We can hope significant time has been put into fixing bugs with iOS 17 and macOS 14, but I highly doubt it.
Oh, you should read some of the things I don't post...
I was going to post something about web designers being required to cram in more and more code to download junk from third-party advertisers on slow, cut-rate servers, rather than worry about labeling buttons and such. And how much of that webpage slowness comes from internet speeds being throttled down to accommodate...?
But the moment has passed, and I really didn't know what I was talking about anyway.
I figure, though, Apple will eventually replace Face ID with something worse, and we will all have a chance to gripe another day!
One thing for certain.
We’ll definitely have plenty of chances to gripe. Just one of the wonderful opportunities being blind presents.
We still had rotary phones when I was young. They were terrible. Now we have screen dialed phones, which can be almost as terrible in certain situations.
* Looks like the post I was responding to was removed.
bugs, bugs, bugs
I agree, iOS works fairly well.
I hear a lot of complaints about bugs. What bugs are people experiencing? If you have issues, please go into a little detail. Constructive feedback is always good.
Accessibility is definitely not an "afterthought," not at Apple. Look at all the other companies where initial versions of a system don't work well with accessibility features. With Apple, you can be sure that when you buy a new product, all new features are completely accessible and usable, especially for VoiceOver users. Dynamic Island on the iPhone 14 Pro? Very well implemented accessibility. Replacing the Home button with new gestures? People had a lot of fear about that, but Apple implemented accessibility very well in this new concept.
Moreover, almost every year Apple introduces new accessibility features, which they would not do if accessibility were an "afterthought" like you said.
Edit: It's very bad that AppleVis seems flooded by destructive posts which only say "Bugs, not accessible, Apple is crap, this wouldn't have happened under holy Steve," etc. A constructive exchange seems no longer possible on this platform, which I find very deplorable.
Edit II: Many of the posts, not only on this thread but all over this site, remind me of older people who don't want to learn new concepts. Technology evolves all the time. Otherwise, we would still have dialing wheels, no smartphones, nothing. Everything unfamiliar seems to be immediately branded inaccessible, like the missing Home button.
Okay, well then, why did we have to wait until iOS 3 for VoiceOver?
True, an entire operating system. Apple needed three years to do it! Android needed until version 1.6. Windows 2000 already had Narrator. Chrome OS is improving accessibility every day. Even devices like the Fire TV and Google TV now have accessibility.
As complexity increases,…
As complexity increases, the opportunity for divergence, i.e. bugs, also grows. Remember, the first iPhone was just a phone, an iPod, and a movie player.
The trouble is, you can't sell bug squashing as a feature. Potential buyers will not be impressed by being told there were bugs in the first place, hence Apple and the rest still have to push features, adding to the complexity, and so the cycle continues.
Fingers crossed that the seemingly limited number of additional blind and low vision features announced earlier this week does indeed mean they have spent more resources on addressing the issues laid out in AppleVis's superb report card.
Change is inevitable
It seems to me that any time a change is made and people don't like it, accessibility is the first thing to have the finger pointed at it, when actually it's more likely to be unfamiliarity.
Too many people have the "I can't" attitude, rather than giving it a go before whining.
However, I believe it's the same on pretty much any platform where changes are made. If it's a change you don't like, it becomes an accessibility issue, when it may simply be a you-don't-like-the-changes issue.
I've used Android and really didn't like it at all, not because it was inaccessible; I just didn't like it. For me it was too clunky, and I preferred Apple's implementation.
What, exactly, is accessibility anyway?
Sure, something may be accessible, but I ask you: is it just as, or nearly as, convenient, efficient, and easy to use as it would be for sighted people? Should our definition of accessibility include only the most basic level of access, or should it also include the kind of usability I just described?
iPhone 14 Pro And General Persistent Accessibility Bugginess
I, too, have been experiencing this crashing on my newly-purchased iPhone 14 Pro. Seems this has been going on for a while now (as have many other bugs, including VoiceOver voices randomly switching to defaults). Wondering when Apple’s so-called Accessibility Team is ever going to get around to fixing any of this…?
This is what I think
The thing is that most blind people want everything to be served on a silver platter, rather than learning or making an effort. So when they find that they can't do something by just tapping two or three times, they immediately say that the app or system is not accessible.
It is true that iPhones no longer have the Home button, but activating VoiceOver is just as easy as pressing the side button three times. And yes, perhaps the gestures are a bit more complex than before, but I think it's just a matter of learning them. Once you master them, you won't regret it.
Re: What, exactly, is accessibility anyway?
No. It is not going to be just as easy, in a lot of cases, as for a sighted person, if it is considered accessible. In many cases accessible will mean it is possible for a blind person. This means you are going to have to do more if you are competing with a sighted person for something in plenty of cases. *Insert motivational jargon here.*
On the other hand, it probably should not be like when I first went to a blind school long ago, and a particular blind teacher, there weren't many, constantly told blind students you have to be twice as fast at doing something as a sighted person, work ten times harder than the sighted people, that teacher did neither by the way, and never, never, never make a mistake around the sighted people because they're watching all the time, and what ever they think about you, they think about all blind people.
Who, really, has the energy for all that? I was a low energy youth, so...
I’m not sure thats helpful.
I object to the silver platter argument. Being blind makes lots of things very difficult. Going from a tactile button to visually lining up your face, combined with a swiping gesture to points that aren't defined, removes the tactile elements and replaces them with new ways of doing things that might not be intuitive for someone who doesn't understand the world visually. I'd rather just reassure people that it is possible, and in fact becomes just as good as Touch ID, instead of shaming people for struggling to adapt to a more visual way of interacting when they don't have vision. It's much kinder. Some day any one of us might need a kinder and more understanding place to be, so I'd much rather be that for others instead of judging and shaming.
Length of Time to Master Gestures
Many keep saying that it only takes time to master the new gestures. I respond: if I were sighted, would it take me an equal amount of time to master the sighted gestures? If not, then it's not equal.
How many of you are professional software developers?
I for one am not one of those smart people but have been in the room with some of the most brilliant computer scientists on the planet that have dedicated their professional careers to accessibility. This stuff is not easy. If you are convinced that Apple is intentionally ignoring accessibility for nefarious reasons, well, I guess there is just no convincing you otherwise. Accessibility is very, very hard to get right. Does anybody really believe that Apple wants to have all this bad publicity about VoiceOver bugs? Do you really, really think they are telling their accessibility folks to stop working on projects? Software development, project management, and product releases are tricky and unpredictable. Things do not always go as planned. With so many devices with varying hardware specs and varying operating systems, the ecosystem is bound to have bugs from time to time. Finally, let's not forget that Apple and Google and Amazon and Microsoft and the rest are for profit companies that are trying to make money. As consumers, we have the power to purchase any piece of technology that suits us. Vote with your wallets.
If you are a professional software developer, please enlighten us on just how difficult it is to build accessible software.
Re: Length of Time to Master Gestures
Do the sighted have more than a few gestures? I know they do the pinch thing and point-and-drag, and the main gesture is tapping. That's kind of like point and grunt.
The sighted have graphical icons of all sorts for the point and grunt gesture, though. So I suppose they have to learn all those modern hieroglyphics, like a squiggly hamburger that brings up a bunch of other symbols.
Plus, they have to read tiny print at a glance when there isn't an icon. I recall trying to help a sighted person new to iOS download an app from the App Store. She kept tapping the iTunes Store instead of the App Store. Finally, I got frustrated and turned on VoiceOver.
Biggest reason accessibility gets broken.
I suspect the biggest problem with keeping accessibility working is the general rush to production. I'm sure many of the problems we deal with like broken voices, crashes, etc, are caused by changes made in the operating system outside of Voiceover. Between the rush to release new software with new features, and maybe a lack of funding, these things will continue to happen. Feature creep seems to mess up many things these days, just look at the latest versions of other blindness products and how they often start out buggy in an effort to gain some market share over their competitors.
Accessibility, gestures, Face ID: a few things
I guess I'm one of the few who doesn't have problems with Face ID. I'm totally blind and even turned on the Require Attention feature, just to experiment. It was maybe a split second slower, but still manageable. I generally keep the feature on, though I'm not sure why. I had more problems with Touch ID not recognizing my fingerprint, having to set it up again or change fingers. I liked it when it worked, which was about 50 percent of the time.
Sighted people can have difficulty learning gestures. My mom still can't figure out how to swipe apps closed using the App Switcher.
Having said all this, I do think Apple is focusing less on accessibility. We have long-standing bugs (like the VO focus issue) that haven't been fixed in years. We still can't reliably make a call from the Recents log, because VO will likely lose focus at the last possible second. The continuous VoiceOver crashes are becoming more and more obnoxious.
Why are cell phones different from cars?
It's not just accessibility. Major issues persist release after release, year after year.
Example: in one out of four phone calls on my iPhone SE 2020, the person on the other end of the call can't hear me. It happens whether the person has an iPhone, an Android, or a landline. It happens whether they have Verizon, AT&T, or T-Mobile. Toggling mute on and off resolves the issue, but it's rare that I can do that before the other person hangs up. A web search provides no answers, but confirms I'm not the only one encountering the problem. I wouldn't be surprised if people in this forum have experienced it, too.
If cars wouldn't start one out of four times, we would have rioting in the streets.
Many posts above seem to imply that the issues we're all seeing are a logical consequence of the fact that smartphones are complicated and feature-rich, and Apple needs to release on tight marketing-driven schedules. So we just need to live with these issues. But if automobile manufacturers can figure out a way to release complex and reliable products on a tight marketing schedule, then cell phone manufacturers, cell service providers, and app developers ought to be able to do the same.
Come to think of it...
I do have the issue of people not being able to hear me. I've blamed my carrier, my location... any number of things. But your comment has made me wonder. Usually when it happens, I hang up, try again, and all is good. I never mentioned it because it's so random that I assumed it was one of the things I mentioned above. I can't even pin-point when it started... I'm thinking iOS 16, but I'm not positive.
Another issue I've had since literally Day 1 of iOS 16, is that my phone occasionally can't connect to my Wifi even though every other device in the house can. It's also done this at family members' homes. It has no connection for maybe a minute, and then everything goes back to normal. I know many people had this bug early in the release, but I haven't heard of it being mentioned lately. I had it on my 12 Pro the day I updated to 16, and it followed me to my 14.
A sighted person has fewer gestures to master. They can see and tap on a large number of things in seconds, whereas we have to scroll through element by element.
Both ways of doing things are very different, so I don't think it's a fair comparison.
As @mark has said, accessibility is hard, and while I might bitch about VoiceOver from time to time, I'm definitely not going to say Apple is the worst thing out there. Neither is Android, and that's a great thing for everyone.
Nothing is going to be bug free, that's just not possible yet.
Brad said: "A sighted person ... can see and tap on a large amount of things in seconds where as we have to scroll through element by element."
My response: A cleaner, simpler design would be easier for everyone to navigate and use. Most apps and websites fill the screen with infrequently used options and controls. While this negatively impacts usability for those of us with vision impairments, even my sighted spouse routinely has problems finding controls on her crowded iPhone screen. This poor user interface design is not driven by ergonomics research. It's driven by marketing personnel determined to stick every new feature in our face, whether it's useful or not.
Accessible design is good design.
Getting around Face ID
I bought an iPad Pro a little over a year ago. I'm a keyboard user from way way back, so I paired the iPad with my Logitech K380 Bluetooth keyboard. Being blind, I had no real reason to access the screen, so I bought a cheap case to protect the iPad, thinking I would keep the case closed all the time.
The only problem was that pesky Face ID. I thought it would be pretty ridiculous if, each time I wanted to unlock my iPad, I had to open the case, lift the iPad and point the camera at my face, then close the case again.
My workaround: I never enabled Face ID. I set a PIN instead. Now, my iPad stays protected in its case all the time. To wake it, I simply hit backspace on the keyboard and enter my PIN. Very accessible, in my opinion.
I hope this helps.
I get your point, though. And while we're talking about Face ID, we might also consider the fact that there is no screenless option for any smartphone, regardless of brand. All I need anymore is a device that pairs with my keyboard and has speakers or pairs with my headset. Unfortunately, each new generation of device comes with a next-gen display and a corresponding increase in price, and those of us in the blind and vision impaired community must pay for a screen that we really don't need.
Paul, there's no way we're going to get a smartphone with no screen.
Non disabled people will always come first in this market and I'm completely fine with that.
Another thing, no one is making you upgrade to the latest phone, if the features are just visual and you don't feel like you need them; don't upgrade.
something for you to consider
If accessibility were taking such a back seat, VoiceOver would not even exist, nor would there be a special phone line for those with special needs, nor a team of technicians who deal specifically with accessibility when building operating systems. In short, you are just plain wrong.
Apple adds accessibility due to the fact that anyone who does business with the government has to. If they didn't have to, forget about it.
I know Apple isn't trying to…
I know Apple isn't trying to create bugs, but focusing on that Dot Pad thing left lots of Braille bugs unfixed. Focusing on the new AI features has also meant less time for fixing bugs.
Haha, helping sighties with their phones.
Someone mentioned off-hand earlier about helping sighted people with their phones. Ohhh, gods, don’t get me started! 🤦🏼♀️ “It’s that little blabbity-blah-blah shaped diddle-daddle over there…”
I'm partially sighted myself, so I thought I’d try Attention Mode. I couldn’t get it to work, so I turned it off. So I was wondering how the earlier poster, who is totally blind, was able to get it to work. I’d like more info on this, please, as I believe it makes the phone more secure!
Accessibility wouldn’t exist if…etc.
To the poster who stated that all these accessibility features wouldn’t exist if Apple didn’t care, I say, how soon we forget that we had to harp and complain and fight and, I believe, even threaten to sue to get them in the first place. How soon so many of us forget…
So "you're just plain wrong" was the quote that stood out to me the most. However, I actually had an Uber ride with a senior Apple engineer who said just about flat out that they did not have any person with a disability of any type working under them. Now, could I question their credibility? Of course I could. But all the "accessibility lines" are just for show. I've tested this time and again. If you really push hard on "why is the alt text not spoken?" or "how come the radio buttons are not announced as selected when the arrow shows they've been pressed?", most of them, Joe Q. Public VoiceOver Genius or whatever their fancy title is, simply have a crash-course, cover-your-ass accessibility training at hand. Now, was I impressed when I bought my iPod years back and the dude knew VoiceOver? Yes. However, Apple has so many fingers in so many pies that they are barely pushing VoiceOver beyond "hey, it speaks; QA, send it out!" I was particularly disheartened when, after I think the Series 2 Apple Watch came out, it took them a few months to realize: we should probably not ask wheelchair users to stand. Now come on, give me a break. Wouldn't someone who is deaf or blind or whatever have clearly said, "excuse me, this isn't right"? No, because Apple has little to almost zero disabled persons among its internal developers. I wish them luck, but unless we really go to stores with our issues time and again, day after day, even if it's just one problem no one has ever had, they will not actually take notice. I'm sure Apple has some disabled persons working, maybe in sales, but remember when Apple sold Braille displays in their stores? I do. So we need to honestly, steadfastly, and politely but with force ask them to include us. If that means putting beta software on your primary device, go do it. All it means is you don't have a phone for a bit. Have a Mac, Google Home, or iPad? You can use that.
I'll keep pushing for more accessibility.
I help a friend and her family with issues with their phones. They are sighted but have no idea about iOS. Regarding accessibility, you are aware that Apple did not do it, and probably does not do it, for reasons other than not wanting to look bad or lose money with the government. There is a law requiring accessibility in computers sold to the government, Section 508, I believe, unless they took it out. When I was sighted, I didn't know anything about blindness unless it was in a movie or Stevie Wonder, so I knew nothing about the white cane or Braille, and to be honest, I probably wouldn't even have cared.
Accessibility out of the goodness of their corporate hearts
It would seem that a few folks here don't know that the NFB and other blind organizations back in 2007 or so sued the snot out of Apple for marketing their smart phones so aggressively while doing absolutely nothing to make them accessible to any of us. So, yeah, they're not doing this because they care. I'm happy they became a leader in smart phone accessibility after that but I certainly would appreciate it if even a tenth of the testing and care that goes into updates for sighted users was put into them for us. Especially now when smart phones are even more critical than they were back in the 00s.
preference and other things
The NFB sued Apple, but not Microsoft, whose accessibility was worse. When Windows 10 came out, all the screen reader manufacturers warned people not to install it because the built-in web browser was totally inaccessible, yet Microsoft claimed to be the leader in accessibility. Yet the NFB did not sue. Go figure, on both counts. Now, to Misty Dawn: all of your complaints about the problems of being blind, just because we have to use different gestures than sighted people? That's reality. Your attitude hasn't changed since many people muted you on at least one social media platform. You need to accept the reality that we have to do things differently than sighted people. It's just the way things are. Stop complaining and try to enjoy life. I don't mind having to double tap instead of just touching an icon. I have to do it, I am glad that I can do it, so I do it. That's just the way it is. When I received a phone for use on the job, it was an iPhone. No buttons for typing. My first thought was, "Hmm. An iPhone. A touch screen. Other blind people are using it, so I must be able to as well. Let's ask the other blind users at my workplace how they do it, and work with it over the weekend." I had to deal with the steep learning curve, and I did so without complaining. Your way of thinking is that we should be able to use it right out of the box just like people who don't use VoiceOver can, and that is not realistic. It is just one of those difficulties we face as blind people. You need a lot less of the "poor blind me" attitude. Or go out and buy some cheese to go with your whine.
Re: "Plain wrong"
You have mentioned a few times on this forum that you've been told that Apple doesn't have any disabled engineers and that there are no people with disabilities involved in their Accessibility Team. I want to clarify that this information is categorically false, and it's puzzling why people at Apple would convey such information to you.
Apple has featured several engineers with disabilities in their own presentations, whilst some have also taken part in conferences, interviews, and podcasts. I've gathered some links below that provide examples of these individuals, and you can probably find more with just a quick search.
Apple has strict guidelines covering staff members who can participate in such events. They limit it to individuals who are considered "trusted" voices by the company, having undergone internal training to ensure they convey the right message. Therefore, the above examples probably only hint at the number of disabled people working at Apple. However, these examples clearly contradict what you've been told.
I want to stress that the above isn't intended to excuse Apple for the accessibility issues we all encounter. It's simply meant to correct any claims that these problems stem from a lack of disabled engineers or individuals capable of testing accessibility features. In fact, it would be surprising if disabled engineers at Apple didn't strive to ensure the accessibility features they personally use and rely on work as effectively as possible.
Returning to the main topic of this thread, I'm certain that numbers play a significant role when Apple managers allocate limited engineering time. They likely consider factors such as problem severity, the number of affected users, and the volume of problem reports. Unfortunately, the latter two factors often work against us when it comes to accessibility-related issues receiving the priority attention we would hope for.
Personally, I also share the opinion that Apple's release schedule is a major issue. Rather than releasing software when the team deems it stable and ready, it is typically tied to a predetermined calendar date.
Take the upcoming iOS 17 as an example. It will be announced on June 5, with the first beta released to developers on the same day. However, the first public beta might not be available until late June or early July, resulting in approximately four weeks of lost testing and reporting time. The code base for the final release version will be essentially locked to further changes around mid-August. At this point, only fixes for security vulnerabilities and critical bugs are likely to make it into the initial release of iOS 17.0. All other development work will be reserved for subsequent releases, such as 17.0.1 or 17.1.
If this scenario is accurate, there is only a window of around six weeks in which public testers have a realistic opportunity to find and report bugs that can be investigated and resolved in time for the 17.0 release. It's worth noting that iOS reportedly consists of approximately 12 million lines of code, so fixing a bug within that timeframe is going to be challenging. Sadly, it's a challenge of Apple's own making.
Crashing issues make me go to Android, and I may not be looking back
Literally! If it wasn’t for the third-party crashes that I get like 400 times a day, and if it wasn’t for the fact that I need to give this phone to my mum soon, hell yes I would stay with Apple. But my 2017 MacBook, my iPhone, and my other Apple devices are really starting to fail badly.
I know Android is bad when it comes to accessibility, or so some people say, but I have three Android phones: a Galaxy S21 Ultra, a Google Pixel 7, and a Galaxy A04s.
Let me clarify some things.
The reason why everyone’s trash-talking Android is that they’ve only played with Android for a few minutes. They’ve never actually taken the time to master the gestures.
I don’t know about you, but macOS accessibility has come crashing down to earth ever since Tim Cook took the job. Same with iOS. Steve Jobs was on a roll when it came to accessibility, but as soon as he left and Kwãkù took charge, that’s when everyone started having more problems.
Steve Jobs, from his grave: "What the hell is this BS? Why are so many Apple users complaining about my operating system having these really bad issues that are causing people to switch to Windows and Android? This is preposterous!"
You know, Apple every chance…
You know, Apple, every chance they get, tells us to test, test, test their beta software. Like, I know they have at least one blind person working for them. Two, I think, actually. We heard one on one of the AppleVis podcasts. But my thing is, do they not have people in the company doing quality assurance? Do they need us to beta test their stuff for them? If they *need* this, why aren't we being paid for our valuable time, feedback, and the bugs we have to deal with that slip through, all the time, into the official release?
I have an Android phone. It's a Samsung S20 FE 5G. And honestly, if it weren't for apps like Seeing AI and games like Mortal Kombat being accessible on the iPhone, and good enough Braille support with the NLS EReader, I'd be all over Android. But no, Google is just now starting to wake up and stretch their accessibility legs a bit with Android accessibility and TalkBack.
Oh, hahaha, the NLS EReader, when it finally works with TalkBack 14, will only work over USB. Why? Because Android doesn't support HID Braille, and iOS does. Don't believe me? Try a Brailliant. So yeah, Google is a mess.
I honestly don't know how this works. I've been told my eyes tend to move a lot, and so I'm assuming they just move in a way they need to at the right time. I just practiced unlocking my phone 3 times with the attention feature still on, and it unlocked within a second or 2. I just point the phone right at eye level, and it works more often than it doesn't. I'm really sorry I can't explain in any more detail.