Hello r/Blind community,
I've been working on an accessible online fashion webshop for visually impaired users, using OpenAI's models for intuitive voice-command navigation and detailed audio descriptions. It's designed to offer a seamless shopping experience, but it's still a prototype, so you can't make purchases yet.
- Navigation: To give a command, simply hold the spacebar, say what you want into the microphone, and let go of the spacebar. Commands can be anything: filter products, go to a certain product page, add a product to the cart, ask for an explanation of the website, navigate to the home page, etc. Voice commands don't have to follow any fixed format; just say what you want and the system will figure out which functions to call.
- Searching: Simply describe what you're looking for, like "find blue jeans."
- Going to a specific product page: After you search, each result is read out with a number. Just say that number and the site will navigate to the right page, for example: "Show me product 8 in more detail."
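For anyone curious how the "figure out which functions to call" part can work: one common pattern is to transcribe the recording, hand the text to the model along with a list of tool schemas, and then dispatch whichever function the model picks. Here's a rough sketch of that dispatch step; the function names and schemas below are illustrative assumptions, not the actual site's code.

```python
# Illustrative sketch: tool schemas in the OpenAI function-calling style,
# plus a local dispatcher. Names like "search_products" are made up here.

TOOLS = [
    {"type": "function",
     "function": {"name": "search_products",
                  "description": "Search the catalog, e.g. 'blue jeans'",
                  "parameters": {"type": "object",
                                 "properties": {"query": {"type": "string"}},
                                 "required": ["query"]}}},
    {"type": "function",
     "function": {"name": "show_product",
                  "description": "Open the product page for a numbered search result",
                  "parameters": {"type": "object",
                                 "properties": {"number": {"type": "integer"}},
                                 "required": ["number"]}}},
]

def dispatch(tool_call):
    """Route a model-chosen tool call to the matching site action."""
    handlers = {
        "search_products": lambda args: f"searching for {args['query']}",
        "show_product": lambda args: f"opening product {args['number']}",
    }
    return handlers[tool_call["name"]](tool_call["arguments"])

# In the real flow the model would pick the call from the transcript, e.g.
# via client.chat.completions.create(..., tools=TOOLS); here we simulate
# the model having chosen show_product for "Show me product 8 in more detail".
print(dispatch({"name": "show_product", "arguments": {"number": 8}}))
# → opening product 8
```

The nice part of this pattern is that the spoken command never has to match a fixed grammar; the model maps free-form speech onto whichever schema fits.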
Just click the link and you're ready to search for products by holding down the spacebar and making a voice command.
Just wanted to share. Let me know if you guys have any comments on it.
Kind regards, Julian