Hello, friends. I’m a graduate student working on an accessible app for my thesis project. Basically the app is a virtual docent for museums, especially interactive ones. I’d appreciate the chance to learn from you all about best practices in app development for low-vision accessibility, and specifically what you would like to see in an app that could help you explore a museum. If you’d like to learn more about the app, I will describe it below, and below that I’ll tell you what help I need. Hope that gives a roadmap to this long post.
So, the app is currently a sighted-only prototype that my programmers and I will be adapting for you all. It features a sort of web of concepts related to each museum exhibit. The concepts are connected and can be explored by following the trails that link them. Say you are in the museum where we've deployed it and are learning about healthy vs. polluted rivers. Once you've reviewed an overview of the topic, the app provides related topics you can pick from, with an indication of how closely related each one is: maybe the animals that make a stream healthy, or how heavy metal pollution in streams can be cleaned up.
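For the technically inclined, here is a rough sketch of the kind of data structure I mean: a graph of concepts with weighted edges. None of these names (Concept, ConceptWeb, the 0-to-1 weight) come from our actual prototype; they are just for illustration.

```swift
/// A minimal sketch of the concept web as a weighted graph.
/// All names here are illustrative, not from the prototype.
struct Concept {
    let id: String
    let title: String
    let overview: String
    // Edges to related concepts, each with a 0...1 relatedness weight.
    var related: [(conceptID: String, weight: Double)] = []
}

struct ConceptWeb {
    private var concepts: [String: Concept] = [:]

    mutating func add(_ concept: Concept) {
        concepts[concept.id] = concept
    }

    /// Related topics for an exhibit, most closely related first.
    func relatedTopics(for id: String) -> [(Concept, Double)] {
        guard let node = concepts[id] else { return [] }
        return node.related
            .compactMap { edge in concepts[edge.conceptID].map { ($0, edge.weight) } }
            .sorted { $0.1 > $1.1 }
    }
}
```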
Trouble is, the developers have put color-coded buttons on a soundless controller, which isn't helpful for you all. They also wanted to code related ideas by distance, but I worry that would make the ideas difficult to find if you can't see them. These are some of the problems I need to solve. I will also need testers for the app as I work to develop it. If you're in Montana, I'd be happy to help you tour the museum where we're testing the new version. I'll also be in the Vancouver, BC area next year once we're ready to test the more developed version of the app.
Thanks for reading. I look forward to hearing from you all.
Comments
I can help
I am a completely blind person. I often visit museums, and I have a blog detailing my experiences. I don't want to advertise my blog here; you can find it in my profile, or you can contact me privately. I have many ideas, but I haven't found the right solution yet. I'd be happy to share my experiences.
Suggestion
Hi,
One possible solution I can think of is to indicate where the related exhibits are by using audio beacons. There are new types of technology that allow users to tell where places or points of interest are, using small devices that could sit in the same room as the exhibits. All of these could be displayed on a map, and the user could then find their way to the exhibits with their smartphone. I hope the app is successful, and I will gladly be one of its beta testers.
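If those beacons were Bluetooth (iBeacon-style) transmitters, the phone-side ranging could look roughly like the sketch below. The UUID and the ExhibitBeaconFinder class are placeholders, not a real deployment.

```swift
import CoreLocation

/// A minimal sketch, assuming iBeacon-style transmitters near each exhibit.
final class ExhibitBeaconFinder: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()
    // Placeholder UUID; a real deployment would use the museum's own.
    private let constraint = CLBeaconIdentityConstraint(
        uuid: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!)

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(satisfying: constraint)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRange beacons: [CLBeacon],
                         satisfying constraint: CLBeaconIdentityConstraint) {
        // `proximity` gives a coarse distance (immediate / near / far) that
        // could drive spoken directions or a spatial-audio cue.
        for beacon in beacons {
            print("Exhibit beacon \(beacon.major).\(beacon.minor): proximity \(beacon.proximity.rawValue)")
        }
    }
}
```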
Thanks for your input
Thank you to those who have responded so far. Please feel free to contact me at [email protected] if you'd like to talk more. And please keep the responses coming! They are very helpful.
I'll assume you're starting
I'll assume you're starting from scratch in terms of understanding accessibility--thanks for posting here and seeking input.
#1. Your developers and testers should launch VoiceOver from Settings > Accessibility > VoiceOver and attend to what it says about how gestures are altered while it's running. Running the screen reader will let you confirm that speakable text is associated with screen elements.
#2. Apple has well-documented UIAccessibility APIs with best practices and how to code them. Mainly, just make sure there is clear text associated with each element. If it's a color-coded button bar, then each button should have a non-visible text label indicating its color (if that's semantically relevant) and its function.
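As a sketch of what that looks like in code (the button name and label text below are invented examples, not from the actual app):

```swift
import UIKit

// A color-coded button like the ones in the prototype; name and color are examples.
let healthyStreamButton = UIButton(type: .system)
healthyStreamButton.backgroundColor = .systemGreen

// accessibilityLabel is what VoiceOver speaks in place of the color coding.
healthyStreamButton.accessibilityLabel = "Healthy streams"
// accessibilityHint tells the user what activating the button will do.
healthyStreamButton.accessibilityHint = "Opens the overview of animals that make a stream healthy"
```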
#3. Please try not to get too jiggy with graphical layouts. Try to stick with Apple standard controls.
#4. I can't think of a way offhand to render screen distances as audible text, except that blind people can track their finger just like a sighted person, so we may be able to get a sense of distance between icons that way, if the screen layout is simple enough. For a kick, turn on VoiceOver and go into Apple Maps: theoretically, a blind person can conduct a virtual, Disserto-esque walk through the city by moving their finger, and landmarks and streets are reported. I've never had luck with it, though. It's probably the best model for what you're trying to do, however. I've no idea how it's coded.
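One rough way to approach that in code is to make each icon its own accessibility element and fold the relatedness into what VoiceOver announces, so tracking a finger across the screen reads out both the topic and how close it is. In the sketch below, ConceptNodeView and the relatedness thresholds are invented for illustration:

```swift
import UIKit

/// Hypothetical view for one node in the concept web.
final class ConceptNodeView: UIView {
    init(title: String, relatedness: Double, frame: CGRect) {
        super.init(frame: frame)
        isAccessibilityElement = true
        accessibilityTraits = .button
        accessibilityLabel = title
        // VoiceOver reads the value after the label, so a user dragging a
        // finger across the web hears e.g. "Stream insects, closely related".
        switch relatedness {
        case 0.75...:
            accessibilityValue = "closely related"
        case 0.4..<0.75:
            accessibilityValue = "related"
        default:
            accessibilityValue = "loosely related"
        }
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }
}
```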
Sounds like a very cool curatorial "Web" concept for your app. Very ambitious from an app standpoint. Best of luck!
My Idea
What bothers me is all the "do not touch" signs, and they never ever have something you can read or touch to find out what it is you can't touch in the first place. Sometimes you will get lucky and they have audio-guided tours, but even then someone sighted has to tell you what number to type in. It'd be nice if the app could tell you what's behind the glass, or what it is that everyone else is looking at. One way this could be done is if other blind visitors with someone sighted, or the museum staff, could add descriptions of the exhibits within the app, so it's kept like a database or something. If that isn't possible, maybe there's a way to help you find the printed plaques that describe what the item is, then use text recognition to read them. I really think the issue is figuring out what the exhibit or painting is, more than finding your way around the museum.
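For the text-recognition part, here is a minimal sketch of how that might work on iOS with Apple's Vision framework, assuming the app can photograph a plaque with the camera; the readPlaque helper is a made-up name for illustration:

```swift
import UIKit
import Vision

/// Hypothetical helper: recognizes the printed text on a photographed plaque.
func readPlaque(in image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else { return }
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Join the best candidate for each detected line of text,
        // ready to be spoken by VoiceOver or speech synthesis.
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        completion(text)
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```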