Today Be My Eyes launches on Meta smart glasses

By Brian, 13 November, 2024


This morning I received an email from the Be My Eyes team announcing that Be My Eyes will start rolling out on Meta smart glasses. While perusing the email, I found a link, which is at the bottom of this post, explaining more about the service and how it will integrate with Meta. I, for one, am super stoked about this!
Read more about it from the link below. Enjoy! 😊

https://www.bemyeyes.com/ray-ban-meta

Comments

By Brian on Tuesday, November 26, 2024 - 03:00

The video feature in ChatGPT 4o has never come out, or if it has, it's only a rudimentary version of what that demo suggested.

By Tara on Tuesday, November 26, 2024 - 03:00

Hi Brian,
Advanced voice mode is out, but without the video, even though they previously said it would have video capabilities. Advanced voice mode is a bit of a disappointment because you can't browse the internet with it. I use the text chat feature more nowadays anyway.

By Gokul on Tuesday, November 26, 2024 - 03:00

It's pointless, except for gimmicks and tricks. If it were integrated into a smart speaker or something, it might've been useful...
But I remember hearing about a limited number of people getting access to the vision capabilities of ChatGPT; maybe a rudimentary version of what was demonstrated.
Don't know if I've been hallucinating...

By Dave Nason on Tuesday, November 26, 2024 - 03:00

Member of the AppleVis Editorial Team

Going back to the question of other Be My Eyes features that could be added.
As well as Be My AI which would be class, I definitely think they could and should add Service Directory calls and calls to your personal Circles.
I bought my Meta Ray-Bans today 🎉
Dave

By PaulMartz on Tuesday, November 26, 2024 - 03:00

I placed an order for mine today. Lenscrafters said it might take up to two weeks. Waiting. Waiting.

By Brian on Tuesday, November 26, 2024 - 03:00

Honestly, I think OpenAI jumped the gun when they released that video to the world showing off ChatGPT 4o's video capabilities. I don't even know what advanced voice mode is, but the comment above about a smart speaker makes me think of something like Alexa, though perhaps more robust?
Dave, congrats on grabbing a pair of Meta Ray-Bans! They are wicked cool. Promise. 😃

By Brian on Tuesday, November 26, 2024 - 03:00

Also, congrats on grabbing yourself a pair of Meta Ray-Bans! I promise, you'll have fun with those.

By Tara on Tuesday, November 26, 2024 - 03:00

Hi Brian,
Advanced voice mode was supposed to be the video feature, plus better voices, voices that could supposedly sing and all sorts. Well, the singing definitely isn't happening here. I've asked it to sing several times, and it said it can't do that. It can imitate different accents though. I asked it to speak English in a Scottish accent, then a French accent, and it did that. But without the video capabilities, and the ability to browse the internet for stuff, it's really disappointing. If I want to talk to it with internet access, I just use one of the old voices. As for getting Be My AI on the Meta glasses, reading between the lines of Mike Buckley's latest interview, we're more likely to get a better version of Llama for the AI than Be My AI being integrated. And I agree with the comments above, the service directory should be on the glasses too!

By Brian on Tuesday, November 26, 2024 - 03:00

I agree 100% with regards to service directory access on the Metas. As for advanced voice mode and OpenAI, no idea when, or if, that is ever going to come out. 🙁

By Brian on Tuesday, December 3, 2024 - 03:00

I started this thread nearly two weeks ago. This evening, I was finally able to add Be My Eyes to my Meta View application. So now I can finally use Be My Eyes with my Meta Ray-Bans! 🫣

By PaulMartz on Tuesday, December 3, 2024 - 03:00

Lenscrafters called and let me know my Meta Ray-Bans are ready for pick up. Wouldn't you know it, we've got a blizzard in Colorado today. Fingers crossed, the roads will be clear by this afternoon.

By Dave Nason on Tuesday, December 3, 2024 - 03:00

@Brian great to hear you got Be My Eyes working. I haven’t had much reason to use it as yet, but I know when I do, it’ll be superior to using the phone. Let us know how you get on when you try it.
@Paul also great to hear. Hope you’re able to get your hands on them soon!

By peter on Tuesday, December 3, 2024 - 03:00

@Paul,

Your message made me smile because people don't realize how different the weather in Colorado can be over such a short distance.

My wife and I live at the base of the mountains in Golden, Colorado and much of the snow that fell on concrete this morning is gone. Last week we got about 2 inches of snow here while our daughter, who lives about 40 minutes higher up the mountain, got about 20 inches of snow!

But, as they say in Colorado, if you don't like the weather, wait an hour. The sun is supposed to be bright and shiny tomorrow and should quickly get rid of some of that snow.

Good luck getting your glasses. I'm sure you'll enjoy them. Let us know how it works out.

--Pete

By PaulMartz on Tuesday, December 3, 2024 - 03:00

Hey Pete. I'm in Erie, not that far from you. Our storm looked pretty ugly for a while, but it wasn't enough to prevent me from picking up my glasses. I've spent the last couple of hours playing around with them. Quite impressive. I did a test call with Be My Eyes, and also tested a Messenger call with my sighted spouse. I haven't begun to explore all the features yet. Pretty stinking cool.

By Brian on Tuesday, December 3, 2024 - 03:00

A little context here. I typically use the Alexa iOS application to play Kindle books, because I like the assistive reader on the Alexa app and devices better than the one on the Kindle app.
Now, having settled that, I finally got Meta AI to successfully tell me what I was listening to with my Meta smart glasses. This is supposed to work, kind of sorta, like Shazam on iOS, where you say something like, "Hey Meta, what am I listening to?" and it tells you the name of the song, etc. Well, last night I was actually listening to a Kindle book (see above), and just for the fun of it, I asked Meta what I was listening to, and it actually told me I was listening to something on Alexa for iOS.

Color me surprised! 🤯

By PaulMartz on Tuesday, December 3, 2024 - 03:00

My report, based on using these glasses for less than a day, but several hours of testing.

Setup was fairly easy, though VoiceOver on my iPhone crashed while the glasses updated their firmware. Well, maybe crash isn't the right word, but VoiceOver stopped talking, and I had to toggle it off and on to restore functionality.

While connecting Meta View with the Be My Eyes app was trivial, for some reason connecting the FB Messenger app took more effort. I would’ve expected it to take me through the connection steps the first time I tried to make a Messenger call. Instead, I had to go into Meta View settings and manually connect the Messenger app. Afterwards, it worked fine, and I had a successful Messenger video call with my spouse in the next room.

The glasses do 90% of what I use Siri for. Weather. Math. Calendar math ("what's three weeks from Monday?"), sending dictated text messages, and initiating phone calls. One noteworthy missing feature is family relations. Even though I've shared full contacts with Meta View, I can't say "Hey Meta, call my daughter." I bet this is because iOS stores those relations in Siri, which Meta View can't access.

On topic for this thread, I did a Be My Eyes test call and found it to be incredibly easy. This beats the pants off having to hold my phone with one hand or get out my iPhone stand with the flexible arm. And the volunteer told me something that's worth repeating: they mentioned that we should feel free to call anytime. They enjoy helping us.

There are some key differences between Be My AI and Meta-described images. Firstly, when it comes to the user interface, there's no comparison. It's "Hey Meta, what am I looking at?" versus opening the BME app and taking a photo. Meta gives you a prompt answer; Be My AI has the noticeably longer delay we're all familiar with. Asking Meta a follow-up is trivial, but Be My AI requires manipulating the dictation interface and pressing send. If Meta thinks it needs a second image, it takes one; this is a manual step in Be My AI.

Secondly, some time back, I bemoaned the overly lengthy descriptions that Be My AI provides. In particular, if I’m looking at a shirt, then extra information about knick knacks on the shelf and sunlight on the tiled floor in the background are unnecessary and time-consuming. Meta’s descriptions are more concise and to the point. If I hold up a t-shirt and ask for a description, Meta tells me it’s a t-shirt, the color, and whatever logo it might have. No extraneous fluff.

Meta glasses and AirPods work together. There are no physical conflicts. Functionality-wise, the AirPods simply take over Bluetooth, so while I'm using both the glasses and AirPods, Apple Music comes through the AirPods. I turned on text message announcements in Meta View, and it gave me a warning that this might result in duplicate announcements. Sure enough, when a text came in, my glasses announced it, and then my AirPods did a couple of seconds later.

I've worn them for several hours. They're not heavy. They’re a bit tight, but not uncomfortably so. I have the mediums, but before I placed my order, I tried the large, and they were too loose. I expect the medium size will loosen a bit over time.

I’m not thrilled with the LED indicator on the glasses case, so performing a manual Bluetooth pairing is a bit of a challenge. But I can ask “Hey Meta, what’s the battery?” to hear the glasses battery charge.

This was a good purchase. I’m having a ton of fun with these. Wow. What would our grandparents have thought if they could see technology like this? Happy US Thanksgiving, all.

By mr grieves on Tuesday, December 3, 2024 - 03:00

You should be able to ask the glasses to call your daughter by name. It certainly works that way for me. (Not that I have tried to call your daughter too many times :)) Although I am using WhatsApp so maybe that helps.

You can also get the battery level of the case in the Meta View app: go to Settings, then find your glasses, and it gives battery levels for both there. For some reason it doesn't tell you the case battery on the home screen.

Glad you are having a good time with them. I am so impressed by these glasses. Although it sounds like the sneaky backdoor we have been using to get the image descriptions working in the UK is being closed which is really upsetting.

BME integration sounds fantastic, although I've not used it myself yet.

By mr grieves on Tuesday, December 3, 2024 - 03:00

It's quite possible that Meta View can see what you are listening to on the phone, so I'm still a little unconvinced that it is doing a Shazam and identifying the song by hearing the notes.

At any rate, it is a nice feature.

By PaulMartz on Tuesday, December 3, 2024 - 03:00

Yes, please stop calling my daughter. LOL. I can initiate a call using the names of relatives, but Siri has me so trained to use their relationship that this will be a tough habit to break.

I tried asking Meta what song I was listening to (playing on my non-smart music system, aka a "stereo" as we used to call it). Meta responded with the name of a podcast that I had been listening to an hour earlier and paused before it was finished. Definitely not Shazam.

These glasses are mind-blowing. I turned off my phone's screen curtain and asked Meta to translate to French. Voila! Took me right back to high school French class.

By Brian on Tuesday, December 3, 2024 - 03:00

Regarding the notification LED, you may want to consider putting it on the lowest setting. As I understand it, it flashes in your eyes, and if you're anything like me, it will cause headaches.
Just a heads up.

By PaulMartz on Tuesday, December 3, 2024 - 03:00

I haven't noticed the notification LED. Is that somewhere inside the glasses frames, directed at my eyes? I might be too blind to see it.

Question on playing music. I have a ton of music in my library, but don't have an Apple Music or Amazon Music subscription. I can launch my Music app on my phone and play, and music comes through my Meta glasses. I can pause and resume with tap gestures. Great.

But, if I try telling Meta to play music, it tells me I need an Apple Music subscription.

Is there some way I can configure Meta View to play music through the Music app without requiring an Apple Music subscription? Or am I just stuck launching the Music app manually?

By PaulMartz on Tuesday, December 3, 2024 - 03:00

Maybe I was phrasing my request wrong, but it works now. If I ask for a specific song or album, the Ray-Bans play it, without telling me I need an Apple Music subscription. Shrug.

By peter on Tuesday, December 3, 2024 - 03:00

If you check out the Settings / Shortcuts dialog with Be My Eyes, some of the shortcuts (at least on the beta) are:

Describe quickly with Be My Eyes
Describe fully with Be My Eyes

I haven't tried these, but I would guess that one gives a longer description than the other. So this might be a way of getting a briefer description.

Not sure if these shortcuts are yet available in the public release, but if not, they should be soon.

Hope that helps.

--Pete

By Brian on Tuesday, December 3, 2024 - 03:00

As I understand it, the notification LED is on the inside of the frames, and I believe it is designed for your peripheral vision to catch it. I'm not sure exactly where on the frame it is located, but if you are light sensitive, it will give you a headache.
Just a heads up.

By Gokul on Tuesday, December 3, 2024 - 03:00

Meta AI isn't available here where I come from either, and I also have been using the VPN method to access it. I too have found that it hasn't been working for the last couple of days. I thought it was just me, so I tried everything, including totally deleting the Meta View app and reinstalling it. Nothing has worked so far. I don't know how Meta figures out my location when I'm using a VPN, but anyway, this is a little frustrating. Please do update if anyone's got a workaround.

By Gokul on Tuesday, December 3, 2024 - 03:00

@Paul, a little clarification: in my experience so far, I don't think Meta AI has the feature where it can compare multiple images, unlike BME. If Meta thinks it needs to take a new image, it does, but then the description, or the answers to your questions, are all with respect to the new image, not based on a comparison or analysis of all the images in your current chat. And even if you give pointed prompts asking it for info on something in the new image with respect to a previous one, it tells you it doesn't know about that previous image.

By Tara on Tuesday, December 3, 2024 - 03:00

@Pete, maybe these shortcuts will only work on the iPhone and not the glasses? They are Siri shortcuts, after all, unless I'm missing something. I can only see one shortcut under the Be My Eyes settings under 'Suggested shortcuts', 'Call a volunteer', and that's all.
@Gokul, Meta seem to be clamping down on using VPNs to access stuff. According to section 5.8 of these terms, you can't disguise your location to access certain functionality. That's probably why it's not working for you anymore.
https://www.meta.com/gb/legal/supplemental-terms-of-service-updated/
This is for the UK, but I imagine these will apply elsewhere too. Meta can't figure out your location if you're using a VPN, but they can figure out whether you're using a VPN or not. Check out this page about VPN detection.
https://fingerprint.com/blog/vpn-detection-how-it-works/#:~:text=VPNs%20can%20be%20detected%20through,present%20in%20the%20network%20traffic.
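For a rough sense of how the IP-range side of VPN detection works, here's a minimal, hypothetical sketch. The CIDR ranges below are illustrative placeholders, not real provider ranges, and Meta's actual checks (per the linked page) also rely on traffic characteristics, not just IP lists.

```python
import ipaddress

# Hypothetical sample of CIDR blocks attributed to VPN/datacenter providers.
# Real detection services maintain large, frequently updated lists;
# these are reserved documentation ranges used purely for illustration.
KNOWN_VPN_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # stand-in for a VPN provider
    ipaddress.ip_network("198.51.100.0/24"),  # stand-in for a datacenter
]

def looks_like_vpn(ip_string: str) -> bool:
    """Return True if the address falls inside any known VPN/datacenter range."""
    addr = ipaddress.ip_address(ip_string)
    return any(addr in net for net in KNOWN_VPN_RANGES)

print(looks_like_vpn("203.0.113.42"))  # True: inside a listed range
print(looks_like_vpn("192.0.2.1"))     # False: not in any listed range
```

The upshot is that the service never needs to learn your true location; it only needs to recognize that the address you connect from belongs to a VPN provider.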

By peter on Tuesday, December 3, 2024 - 03:00

@Tara
I guess it is quite possible that Be My Eyes Shortcuts won't work with the Meta glasses. I didn't think of that.

For example, I don't think the Meta glasses let the user use the AI description feature of Be My Eyes. That is too bad since, as people have noted, the responses provided by the Be My Eyes AI are tuned for blind individuals, which the Meta responses are not.

Maybe some day these systems will work together more seamlessly and/or be tuned for use by the blind.

--Pete

By Dave Nason on Tuesday, December 3, 2024 - 03:00

The only aspect of Be My Eyes that currently works on the glasses is the call a volunteer feature.
That’s why so many of us are disappointed not to have full access to Meta AI.

By PaulMartz on Friday, December 13, 2024 - 03:00

If I'm wearing my Meta glasses, shouldn't I also be able to use the Be My Eyes app on my phone? It seems like I have to remove the glasses in order to use Be My Eyes with the Be My AI feature.

Similarly, if I have the Be My Eyes app open, not even as the active app, but just loaded and available through the App Switcher, then attempts to use Be My Eyes through the glasses will fail.

I had a call go through, and the volunteer on the other end told me they received an error message telling them I had to do something with my app or my camera. I don't have the actual text of course because it was displayed to the volunteer, not to me. That, in itself, seems rather odd. If I needed to do something on my end, I should've received the error, not the volunteer.

The summary: If I want to use BME through my glasses, I better kill it in the App Switcher first. And if I want to use it from my phone, I better remove and stow my glasses first.

By Brian on Friday, December 13, 2024 - 03:00

That is all kinds of crazy. I am glad somebody figured this out, however. 🤯

By mr grieves on Friday, December 13, 2024 - 03:00

I hadn’t tried before but I was wearing my glasses and opened the Be My Eyes app and I got an error to do with the camera, and it won’t let me take a photo for Be My AI. I opened Speakerboo which does a similar thing and it works fine.

The link in the original article suggests that the two apps should work well together - you can be on a call with the glasses then flip to the camera on your phone for example. I’m guessing this is just teething troubles. Have you reported it with BME?