EchoVision by Agiga.ai

By kevinchao89, 4 October, 2024

I first saw EchoVision by Agiga at the Vista Center Ignite Accessibility Entrepreneur Pitch Competition last month. In the noisy exhibit hall, it was hard for the AI to hear me and hard for me to hear the descriptions and the Be My Eyes call. After my interview with Blind Abilities, Huasongcm, CTO at Agiga, reached out to Jeff from Blind Abilities and wanted to chat. Jeff asked if I would like to be on the call, and I agreed, but I wanted to first examine the glasses at my rooftop farm, which is quieter and gave me more time with them.

I was even more impressed than I was initially: the scene descriptions via voice and the Be My Eyes volunteer call were great. Here's what I found:

User Experience

  • Agiga added follow-up questions to EchoVision, so it's possible to get more specific context and details about a scene. This was a feature request I made last month, and it's already live in the prototype.
  • A one-finger tap on the right side, toward the front of the glasses, gives a scene description without asking, which is what I wish Meta Ray-Bans would do. Scene descriptions carry a good amount of context and detail, and they are made for blind people.
  • It's responsive, latency is low, the voice sounds fairly natural and human, and the speech rate can be increased; I tested it at 1.5x.
  • I was able to call Be My Eyes and Aira, and got a human description of a card (no sensitive details read aloud) and of a scene.
  • Holding down with two fingers on the right side cancels Be My Eyes, or hangs up and returns to the main menu. I also heard I could use Google Lookout by saying "Hello Agiga, Lookout," which enters an explore mode that reads text and identifies objects in real time. It identified different objects, including stairs and a plant, and read the text on a credit/debit card and a lottery ticket.
  • I tried to have Lookout identify US currency, but it wanted to read the text on the bill instead. No worries though: a quick tap on the right side gave a scene description that told me I was holding a $20.
  • The form factor is wrap-around sunglasses, which block sun for those of us with peripheral eyesight and light sensitivity. The fit is also more secure: the frame partially wraps around the back of my head and doesn't slide down my nose like Meta Ray-Bans do.
  • I feel it has the features and functionality of the Envision AI Glasses with an even better form factor than Meta Ray-Bans. I look forward to pre-ordering on October 11!

Feature Requests

  • I tried to have it describe people, but it said it couldn't describe faces; yet when Huasongc tested it at home with his wife, it could. Huasongc thought Be My AI could do this, so we tried it, and it can, so they'll work on people descriptions.
  • I also tried something no AI has been able to do for me, which was to ask about the next bus/train departures via Transit. Agiga is going to see if TransitApp has an API.

Comments

By Ash Rein on Monday, June 9, 2025 - 00:21

I genuinely didn't like the Meta glasses at all. They just didn't fit well, and I wasn't getting any worthwhile descriptions from them. Similarly, Be My Eyes just wasn't working well. One factor to consider is that we are transferring a lot of video through Bluetooth, and of course it's gonna be very pixelated. I am hoping that Bluetooth 6.0 fixes that going forward. Until then, I will look forward to the EchoVision glasses, and hopefully there will be some positive things to report.

By Brian on Monday, June 9, 2025 - 21:07

If anyone is capable of giving a thorough and detailed critical review of a product, I believe it is you. Hope you don't take that as an insult, as it was not meant to be one.

By Ash Rein on Wednesday, June 11, 2025 - 17:32

I appreciate the compliment. I'm genuinely looking forward to this. It's either gonna be amazing or it's going to be a disaster. I'm leaning toward amazing.