Ray-Ban Meta smartglasses are getting Live AI in the next update!

By Orlando, 18 December 2024

Greetings everyone,
Ray-Ban Meta smartglasses are getting Live AI in the next software update.
Here is an article that explains what additions are coming to the Ray-Ban Meta smartglasses.
https://www.meta.com/blog/quest/ray-ban-meta-v11-software-update-live-ai-translation-shazam/

Comments

By Justin Harris on Tuesday, December 24, 2024 - 21:11

These are quickly starting to sound like more than just a cool toy or something I might only use once every few weeks. Interested to see what else gets added in the next few months.

By Levi Gobin on Thursday, March 6, 2025 - 21:11

Member of the AppleVis Editorial Team

There is a link in the blog post shared above where you can enroll. All you need is your glasses' serial number, which you can find under Device Settings > General > About.

By PaulMartz on Thursday, March 6, 2025 - 21:11

Has this already rolled out and I just can't find it? The article, which was published in December, mentions Shazam, and Shazam has already rolled out.

By TheBlindGuy07 on Saturday, March 8, 2025 - 21:11

I tried this for 10 seconds on a friend's device and it seems quite useful. Generally speaking, the image (non-facial) description is nice simply because the cameras are well placed for that exact job, and the aesthetics of the device are quite cool as well.

By Oliver on Saturday, March 8, 2025 - 21:11

It is US only. I've not used mine for months, as there is no visual AI at all. They might have some use for those who use Be My Eyes or get help via WhatsApp video chat, but for the most part the UK experience isn't that interesting. I can do more with my phone. It's slightly more awkward in terms of form factor, but significantly more capable.

By Ash Rein on Saturday, March 8, 2025 - 21:11

I had the glasses for about a week. When I called Be My Eyes, they said it was very hard to tell what I was looking at because the video was way too grainy.

By Brian on Saturday, March 8, 2025 - 21:11

Interestingly enough, I get the most use out of my Metas when I'm in my apartment building's laundry room. They are great for helping me find dropped socks on the floor, as well as the occasional dropped quarter. And of course, now that Audible is connected to them, I can listen to a book while I'm waiting for the laundry to finish.
I lead an exciting life. This is true. 🥸

By PaulMartz on Thursday, March 13, 2025 - 21:11

What prompt do you use to get your Meta glasses to help you find things? Or are you actually placing a BME call when you drop a sock?

By Levi Gobin on Tuesday, March 18, 2025 - 21:11

Member of the AppleVis Editorial Team

I have not tested it yet, but I will give you my feedback once I have. If you have enrolled in the early access program, you will be notified once you have access via email and the Activity tab. You can then go to Device Settings > Early Access > Live AI. Once you have access, saying "Hey Meta, Live AI" will start a session. You can find a video showing how it works.

By That Blind Canuck on Tuesday, March 18, 2025 - 21:11

As others have posted, you have to register using Meta's Ray-Ban Early Access program by simply providing the serial number of your glasses.

I didn't get any emails, but a day or two later I opened the Meta View app, and under Device Settings there was an option for Early Access. In there, you have to enable a toggle for sharing diagnostics, then select the option to join. A little while later, the Live AI and Live Translation features appeared in the app under the Early Access section.

Live Translation seems to work pretty nicely, but it currently only supports English, French, Spanish, and Italian. Here's hoping more languages are added later.

As for Live AI, it's not quite what I was hoping for. Don't get me wrong, it can be useful for some, but since the glasses are continuously looking through the camera, I thought you could tell them to let you know on their own when obstacles were in your path while walking, kind of like a demo of the Celeste glasses I heard on the Double Tap Canada podcast.

It's true that, once you turn on Live AI, it is continuously looking and you don't have to use the "Hey Meta" wake word, but you still need to interact with it. Plus, when using Live AI, the battery will only last 30 to 60 minutes. That isn't too bad, considering the size of the battery and what Live AI is doing, but it's still something to keep in mind.

HTH!

By Travis Roth on Tuesday, March 18, 2025 - 21:11

I tried this with Live AI. It kind of works. I said "Hey Meta, start Live AI." Then:
Me: "Is there a pocket knife on the floor?"
Meta: "No."
Me: "Is there something on the floor?"
Meta: "There is a red object on the floor, but it is not a pocket knife."
Me: "Where is the red object?"
Meta: "It is at the right edge of the room."
Narrator: Likely the right edge of the image; it was not at the wall.
Narrator: I placed my hand on the floor.
Me: "Where is the red object from my hand?"
Meta: "It is in front of your hand."
Narrator: This was mostly correct; it was also slightly to the right.

So it does seem to have potential. I'll have to try it sometime when I don't know where the object is.

By PaulMartz on Tuesday, March 18, 2025 - 21:11

I can relate to the red object pocket knife confusion. Sometimes, talking to Meta is a little like interrogating a Klingon.

Try telling it you're blind, you dropped something, and you need help finding it. That might result in better feedback.

By Ash Rein on Tuesday, March 18, 2025 - 21:11

I get that people are very excited about this. I was also very excited. However, it's really important to remember that the cameras on these glasses are not great. They are 12 megapixels, and the video is very grainy. These are not really built for the type of recognition that blind people are looking for. It's gonna take at least another year or two before we get any glasses that are gonna be really worthwhile.

I’m not saying that it’s not worth trying these out. But it is very important to be as objective as possible. This comes off more as a beta than anything else. It’s gonna get a lot of things wrong.

By peter on Tuesday, March 18, 2025 - 21:11

I thought I heard somewhere that the camera has a wide-angle lens. If so, that probably isn't the best for getting directions and the relative locations and sizes of objects. It might give a somewhat distorted picture.

--Pete