Hello, friends. I’m a graduate student working on an accessible app for my thesis project. Basically, the app is a virtual docent for museums, especially interactive ones. I’d appreciate the chance to learn from you all about best practices in low-vision accessibility for app development, and specifically about what you would like to see in an app that could help you explore a museum. If you’d like to learn more, I’ll describe the app below, and after that I’ll explain what help I need. Hopefully that gives you a roadmap to this long post.
So, the app is currently a sighted-only prototype that my programmers and I will be adapting for you all. It features a sort of web of concepts related to each museum exhibit; the ideas are connected and can be explored by following the trails that link them. Say you’re in the museum where we’ve deployed it and are learning about healthy vs. polluted rivers. Once you’ve reviewed an overview of the topic, the app offers related topics you can pick from, with an indication of how closely related each one is: perhaps the animals that make a stream healthy, or how heavy-metal pollution in streams can be cleaned up.
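For any developers reading along, one way to think about the concept web described above is as a weighted graph. Here is a minimal sketch in Python; the topic names and relatedness scores are made up for illustration, not taken from the app’s real data:

```python
# A minimal sketch of a concept web as a weighted graph.
# Each topic maps to its related topics with a relatedness
# score in [0, 1]; higher means more closely related.
# All topic names and scores below are illustrative placeholders.
concept_web = {
    "healthy vs. polluted rivers": {
        "animals that indicate a healthy stream": 0.9,
        "cleaning up heavy-metal pollution": 0.7,
        "watershed geography": 0.4,
    },
    "cleaning up heavy-metal pollution": {
        "healthy vs. polluted rivers": 0.7,
        "mining history of the region": 0.6,
    },
}

def related_topics(web, topic):
    """Return a topic's neighbors, most closely related first."""
    links = web.get(topic, {})
    return sorted(links.items(), key=lambda kv: kv[1], reverse=True)

# Instead of showing relatedness as color or on-screen distance,
# an accessible interface could read this ordered list aloud,
# announcing each topic and its relatedness in turn.
for name, score in related_topics(concept_web, "healthy vs. polluted rivers"):
    print(f"{name} (relatedness {score:.1f})")
```

The point of the sorted list is that order itself can carry the “how closely related” information for a screen reader, without relying on any visual cue.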
The trouble is that the developers have put color-coded buttons on a soundless controller, which isn’t helpful for you all. They also wanted to encode how closely related ideas are by their distance on the screen, but I wonder whether that would make ideas hard to find if you can’t see them. These are some of the problems I need to solve. I’ll also need testers for the app as I work to develop it. If you’re in Montana, I’d be happy to help you tour the museum where we’re testing the new version. I’ll also be in the Vancouver, BC area next year once we’re ready to test the more developed version of the app.
Thanks for reading. I look forward to hearing from you all.