I'm not good at writing reviews, so I'm not going to even try. I just got these delivered an hour ago and am about to dive in. I wanted to start a new thread where I could comment, and hopefully others will too. Questions are welcome.
They cost £329 in the UK. They arrived two days early.
I'm living alone right now, so I will be attempting the unboxing and setup with just my iPhone and Be My Eyes/AI.
Comments
It's not great at reading…
It's not great at reading text. It summarises. I think speed is rated as more important than detail. If you're in the Meta app and shake your phone, you can send in a report to ask for a blind-friendly mode for such things. Most people are using this to caption scenes, so unless we ask, it won't produce verbose descriptions.
It'll get better too. I think short answers are good enough for when we're out and about; we want information quickly, not a long description of a bus thundering toward us.
Ollie
I should have both. Suppose you have a letter you want to check and you want to know what it says ASAP.
It will read all content if asked.
When I want all the text on a page read, I say "Look and read all the content on this page." or something like that, and it usually does. I really appreciate the ability to also summarize.
Getting it to read the instructions on a food package was a little tricky. I first had the package sideways, and it wouldn't read. When I got the text upright, and asked it how to cook this product, it told me to follow the directions. I had to be really specific and ask it to read me the directions on how to cook this product. When I did that, it worked fine.
It's going to be a matter of it learning what I like and me learning how to ask what I want to know in a way that works for the AI.
But the speed. The speed is amazing.
Nope, no full reading here…
Nope, no full reading here. The visual description is usually not 100% accurate either. It'll get better; it's just not the tool we need yet.
There is no custom prompt, and the fact that I have to do some linguistic gymnastics to pin down even a partial read is a problem. Seeing AI gets it far quicker and more accurately, but, to be fair, that is using simple OCR, which is all on-device.
I'm just saying this to temper people's expectations. They are a V2 product, after all. The speakers get drowned out in noisy environments; Meta AI, as it stands, is far behind ChatGPT; and the app has bugs, the main one being the inability to change the speed of the voice. But it's not bad. I can take videos when out and about and get good sound. I can WhatsApp family and friends, as long as I have a good enough signal, and video call them, which is epic. It does a lot of things okay; the WhatsApp video calling is the win for me though.
Big picture!
From my extensive (30-minute) testing this morning, I have concluded that I am getting what I call the big picture. I held up a mug:
It said it was a mug with black liquid. Then it said the mug was glass, with some artwork. Then it described the artwork. When I say 'then', I mean after another question.
I looked at my medicine container and it kept telling me it was a list of drugs, doses and times for me, but I wasn't able to get it to read the details. It did tell me I have a fire in this room, and when asked, it told me it was on, which it is!
It seems very fast and accurate at giving me the big picture, what I am looking at, but not with any level of detail. As someone pointed out, it is probably tuned for captioning 'Grams, which is what these glasses are designed for, after all.
These aren't 'eyes' yet, but given they still connect to the iPhone, I think they will do 80% of the looking I need to do, and I will take my phone out for the rest. Maybe the Envision glasses do closer to 100%, but in the UK, the Envision glasses are nine times the cost!
Intrigued by these glasses
I'll be buying these soon. I noticed last night that, even using Be My AI, I scanned a sheet of paper with text on it and was given a summary. I had to ask it to read the whole page, and then I got everything. So this seems common.
I'm also going to try them…
I'm also going to try them.
I'm going to compare them side by side with the Seleste glasses.
AI
Maybe they will allow third-party apps so people can add Seeing AI to get documents or other things read. How does it do with money and coins?
AI
I initially thought the AI was just about asking questions from the camera, but it does seem fairly general purpose, and up to date.
I asked it what the scores were in the English Premier League. It gave me Arsenal's score and said they had won with a last-minute own goal. I then asked what the other scores were and it reeled them all off. I asked what games were in the Premier League tonight and it trotted off a load of teams I'd never heard of, presumably some other country's Premier League. So I asked for the English ones and it said there weren't any, which is probably correct.
I think it doesn't always know what information is the most relevant, but I suspect you can probably get there with a bit of probing in the right way. And, as has been said, it is pretty fast.
You can get a lot from follow-up questions
I just used it on a bottle of wash in my bathroom. It told me what it was and what it was for, then I asked it for more, and then more again. By the end I knew things about this product I never knew, and I've been using it for years.
I am super impressed. Did I mention that the speed is incredible?
Just ordered mine!!!
OMG, it says they won't be here until May 10th! It's going to be a long few weeks!!
Ray-Ban Meta
How much are they?
Cost
$329. I got mine on Amazon so I could then get the rewards points.
Brooke
Thanks, but sadly I could not get them even if I could afford them. My left ear is not well developed, so it would not be able to hold the left side of the glasses. I would need what are called wraparound glasses.
Okay, I'm glad to know, well…
Okay, I'm glad to know, well, sad to know, that it's not just me getting the summary. I think there is a need to get familiar with how it works. It's almost like a decision tree: you get the brief summary of the image, then you have to drill down into what it's said. I'll keep playing, as this might be what is needed to get a full read, but by then, though Meta AI is fast at responding, it will have taken a long time to get there.
What I'd really like is to be able to hold up my phone and flick through photographs like a normie and be told what is on the screen.
I think it's a really cool product and am excited to hear how people are hacking it for our specific use case.
And, Holger Fiallo, there is always a way. I'm wondering if a kind of bungee cord around the back of your head, and maybe over the top, very much like they have on the Apple Vision Pro, might provide enough stability to use them. Designing solutions is one of my hobbies, 3D printing etc. I'd not want you to pass on something because it's not quite right for you. We've all had enough of that over the years.
One problem for me, meta
The glasses sound pretty awesome and I could imagine getting them; the only problem I am having is that Meta is involved. And I trust that company about as far as I can throw my riding-lesson horse, which is a 1,900 lb Clydesdale mix.
Still curious though: are there demos of the AI capabilities in action?
I know what you mean...
It made me feel dirty at first. They do all their usual tricks; the app demands location services be on all the time, not just when you are using it. iOS tells me it checks about twice a day, which isn't too bad, but is still too often.
At the moment it is a price worth paying, until something better comes along. Because these aren't accessibility features, there are probably loads of demos on YouTube; I know I've seen a few.
everything tracks you these…
everything tracks you these days. No avoiding it if you want to own smart devices.
Meta
That's what kept me from ordering these literally the first day I read about them. Anything connected to Meta makes me uncomfortable. But my curiosity won out in the end.
At CrazyEyez
Sorry, but that doesn't fly with me.
There is a difference between getting tracked by cookies, trackers, iframes and so on, and directly purchasing a product from a company known for selling your data off to data brokers and creating ad profiles on you. One is something you encounter and have to battle against to get rid of; the other is a conscious decision. I'd rather wait until Apple comes around with their iteration of a smart-glasses setup.
I don't know about the Seleste glasses. I have seen too many startups with great ideas fail over the last couple of years; what tells us that the company behind Seleste won't just collapse in the next couple of weeks and these glasses turn into electronic paperweights?
Datawolf
Apple tends to release tech later than everyone else, but when they do, it is much better. They take their time to get it right, most of the time. Same here. I am sure that their glasses will be much better; probably the AI info will be processed on the phone, since they have acquired several companies that process info on-device, avoiding sending it out to the cloud for review. Picture it as Sofia tended to say: you get a nice picture of someone, your phone will give a good description, and it will also use an external AI for other things.
Apple variation...
It would also probably be way over-priced, like the Envision glasses. Cool concept, but they will never be in my price range.
Coming sooon!!
So excited to finally try these out! I get them tomorrow!
Demo of AI features
The Double Tap podcast yesterday had a really nice demo of the AI features plus the video calling. It's in the second half of the show if you are interested. https://doubletaponair.com/apple-id-issues-ipads-for-blind-people-a-meta-ray-ban-review/
@Stephen
Enjoy them!! Mine will be here sometime between May 6th and 8th. It was originally 10th through 12th, and I'm hoping it moves up even more. I don't wait well for tech, Lol!
AI is now in the UK!
Apparently, the AI feature is now rolling out to the UK. It might be a slow roll-out, so I'm not sure if it is available to everyone yet. I cheated with a VPN so can't check myself, but this is such great news.
That will be a relief!
I keep getting weird U.S. ads in podcasts!
AI
I had a bit more of a play with this and firstly, I agree with everyone being so excited about it.
It was able to describe my wife's clothes, including the colours and pattern of her dress. It was able to read the ingredients of a recipe and then also the directions to follow. It helped me identify a bottle and what was in it. It told me the best before date on the milk carton.
I think it doesn't like to say too much. I asked it about the shelf of condiments, which has herbs and spices, and it only wanted to tell me about some of them.
I asked it to tell me where the Thai green curry powder was (my wife ensured its label was visible, so a bit of a cheat). It told me top shelf, second on the left, but it was actually one shelf down and in the middle. I asked it for the ingredients, and it told me the kind of thing that tends to go in Thai green curry powder, not what was in this one.
It managed to find my wife's secret stash of cookies that I had no idea existed.
I think for text, if you have something fairly simple, it should be fine. Maybe the magazine had too many distracting visual elements on it. I'll have another try at signposts when I get a chance.
But the fact I have this sort of thing sat on my face giving me all this information is incredible. If you had told me I would have this a couple of years ago I would have thought you had got carried away reading science fiction.
But as always, just be really careful with the information it gives you.
mr grieves
Cookies? Now that you know, do not eat them. If you do and tell her the glasses told you, she might hide them.
Poor Mrs-G
AI giving away her secrets!
Mind blown.
OK, so as some of you folks may know, I have a couple of pairs of smart glasses. I had the Seleste, and I also now have the Ray-Ban Metas. The Metas are everything I've ever been looking for in a pair of smart glasses. Response time is super fast; it does give you short descriptions, but if you want more detail, just ask it. I prefer short descriptions because sometimes I don't need every single detail.

Text reads great. I even tried holding my finger over a button on a microwave and asked it to tell me what button my finger was on. It worked phenomenally. I ended up buying a second pair as a spare. It also works fine when it comes to reading my thermostat from across the room.

I also love that you can do multiple things with it, not just descriptions: making and receiving phone calls, listening to music, taking pictures and live streaming, all of which are phenomenal. Not gonna lie though, it makes me worried for Envision and Seleste. Meta and Ray-Ban are behemoths in their respective fields.
Re: Cookies
Cookies? What cookies? I don't even know how those crumbs got there.
Hey Stephen
I know what you mean; the sheer economies of scale involved are an unclimbable mountain. Everything about the Meta glasses is amazing, even down to the range of styles, colours and sizes. Meta is all-in on Llama too, with Llama 4 already in training.
I wonder if Meta could be persuaded to add an additional Accessibility feature to the glasses? But there is almost nothing they don't already do.
I am getting my second pair next week!
Second pair?
Why are you getting another pair already? I'm over the moon excited about where these are going, but one pair will do for me. Are you going to try the Skyla frames? I'd be really interested to know what they're like.
Second pair
The reason why I'm getting a second pair is so that I can swap them out when the glasses' battery gets low.
Hey Stephen. Would you mind…
Hey Stephen.
Would you mind explaining how you got yours to read text?
Mine will not go past the summary.
The product recognition isn't the most accurate either.
I had some deodorant, and it got the brand wrong twice, giving a different response each time.
The colour recognition seems okay.
The speed is great.
I'm still excited to try the Seleste glasses to compare the two devices.
Good times are ahead.
Rayban Meta pro tip.
I have been using these glasses for about 5 weeks now. All I can say is: awesome! No, they're not perfect. But which product is? It's been said before, and I would agree: this is the worst they will ever be.
I took a trip to Eastern Europe a couple of weeks ago. I was hoping for easy sign translation, etc. Nope! And the Look and Tell features were not available. Well, actually, two days before returning to the States, they appeared to get an update that enabled Look and Tell along with other AI functions. Nice!
But OK, in case you may not know, there are a couple of very useful apps for the phone that really take AI interaction to the next level. And in tandem with the Meta glasses, they shine!
I primarily use Pi: Personal Assistant, which is a completely free and very capable AI platform that allows for two-way voice communication, and in general is excellent. The other is the well-known ChatGPT app. It too has a very useful free mode that provides two-way voice conversation.
So the pro tip is: in the Shortcuts app, create shortcuts to either or both apps' conversation mode. Then simply invoke the respective AI chatbot with a "Hey Siri" command.
You may need to unlock your device or use Face ID. But from there on you can have a full conversation and get the information you may be looking for. And all this from the convenience of your glasses. How cool is that?
PI works well at finding real-time location information.
Let me know what you think. I'm thinking of doing a YouTube tutorial. Give it a try and report back!
Do those apps use the…
Do those apps use the cameras in the glasses, or are these apps just something to talk to and ask questions?
Second pair, Skyla
I ordered the Wayfarers, because I'd at least heard of them. I got them in 'denim', but they are still a bit too bold for me. I think cat-eye frames are more me, and they have all the clichés covered: they are available in cinnamon and pink!
Reading text
I think this needs clearing up, as there has been some slight deviation from the truth here.
Meta glasses won't read out documents or labels verbatim; they will always summarise no matter how you phrase it. What they will do is allow you to interrogate the image, e.g. what is this brand, what are the instructions, etc.
I say this as I don't want others, like me, going a bit nuts trying to work out the magic words, a spell if you will, to make them read a page of text. They won't. They're not designed to do so.
As others have pointed out, these are not designed for blind users. If you are in a wood and ask what meta sees, it won't tell you you're in a wood as the designers are assuming that we already know where we are as we can see it.
Meta AI through the Ray-Bans is good on specifics, small nuggets of information, but, as yet, does not read out documents in the way we need.
I also encountered another issue: trying to read my iPhone screen, it said there were notifications but would not read them out due to privacy concerns, which makes sense on a wider scale, but makes the glasses pretty frustrating for our usage.
I can only hope that Meta, in time, allows for a 'blind mode' with a slightly different system prompt from the one used by those wearing them for extreme sports, or who simply want to caption photographs or identify a plant.
HTH
other languages?
Hello,
Does Meta AI also speak other languages? If you take a picture of a text in another language, does it read it correctly?
Regards
@Ollie
Well, yes I sort of agree. But what is the use case for a sighted person to ask the glasses to describe what's in front of them? I get this feature being useful for "what kind of plant is this?" or other very specific questions, but a general "where the hell am I?" is probably not a question a sighted person is even going to ask.
I have noticed that if I say "look and tell me what you see in a lot of detail", you get a bit more information. Maybe we still disagree on the definition of "a lot of", but when I tried that it did describe the trees and give a bit of a feeling of the ambience.
You might be right that getting it to read an entire document isn't going to happen right now, and we aren't going to be able to sit with a book and have it read it all to us. But for bits and pieces I can see them being really useful.
With any AI gadget there are always big caveats - there is quite a large BS factor to consider for starters. I think right now both developers and us are just testing the waters to see what can be achieved. I hope Meta picks up on how much we all love these glasses and what potential they have in our world.
Getting mine on Tuesday.
Hey all,
I did end up getting the shipping email from Amazon US, saying that my Meta Wayfarer smart glasses will arrive Tuesday.
I'm excited to put them to the test around my room, and maybe around my house as well.
I figured it would still be worth getting them?
A few thoughts
Hey everyone. You probably don't know me, but I am a totally blind journalist and accessibility advocate. I have been using these glasses since January; I had access to the preview program for AI and have been testing features within early access, and now with the public version.
I have to say that I am impressed with the updates I got during these months. The fact that I can activate AI functionality in Norway using DNS when setting up the glasses, the fact that I can stream calls directly through WhatsApp and Facebook Messenger, complete with video, and the fact that they included Apple Music functionality, not only Spotify, make me really happy for now.
Of course, I'm hoping that the rumored glasses from Apple are in development and we will get something from them soon, but until then, these should do the trick for me when it comes to identifying objects, describing areas and reading text.
Always here if you have any kind of questions.
Chatgpt on meta glasses via whatsapp
I found this on the web. No idea if it works:
https://jovanovski.medium.com/part-2-getting-chatgpt-working-on-meta-smart-glasses-82e74c9a6e1e
Re ChatGPT
Oh wow, I had no idea this sort of thing was even possible. I'm guessing this could open up the glasses to do all sorts of things if you had the time and inclination. I've not gone into detail, but I presume the webhook needs to be reachable over the internet, not just local Wi-Fi.
I suspect I am too lazy to go all the way through with this, but I am very tempted to have a go. Probably with a Raspberry Pi on the internet you could unlock your whole smart home if you so desired. (I've not looked into the security of this yet, though.) I do have a Pi but have never been bothered to try to get it available on the internet.
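For anyone curious what the linked article's approach roughly involves: WhatsApp delivers your message (sent via the glasses) to a webhook server you control, your server forwards the text to whatever backend you like, and the reply comes back as a WhatsApp message. This is only a sketch of that relay logic under my own assumptions, not the article's actual code; `ask_backend` is a hypothetical stand-in for a real API call:

```python
# Sketch of a webhook relay: WhatsApp-style payload in, reply out.
# `ask_backend` is a placeholder; swap in a real ChatGPT (or other) call.

def ask_backend(text: str) -> str:
    # Placeholder backend: echo the request back.
    return f"echo: {text}"

def handle_webhook(payload: dict) -> dict:
    """Pull the message text out of an incoming event and build a reply."""
    text = payload.get("message", {}).get("text", "")
    if not text:
        return {"reply": "Sorry, I didn't catch that."}
    return {"reply": ask_backend(text)}

# Example event, loosely shaped like a messaging webhook payload.
event = {"message": {"text": "unlock the front door"}}
print(handle_webhook(event)["reply"])
```

In a real setup this function would sit behind a small web server reachable from the internet (hence the Raspberry Pi idea), with the actual payload shape dictated by whatever bot platform you hook up.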
Might have a go this weekend…
Might have a go this weekend. It's something I speculated about but didn't have the direct knowhow, so thanks for the link.
The biggest downside to this, I think, will be speed. It's awkward.
re: Might have a go this weekend…
Hello,
Let me know if it works
Location Services and other observations
I had a good chance to put the AI through its paces today.
At one point I was sitting outside a cafe. I asked Meta to tell me the menu of the cafe I was at. It told me that I could enable location services in the Meta View app settings to get information like this. So I asked where I was, and it said that according to location services in the Meta View app, I was in Towcester. Now, I wasn't really in the town, but it was the nearest one. I tried this again later, and again it just told me the nearest town.
I then asked what direction I was facing and it said North. So I kept my body still and moved my head to the left and asked again and it said West. So it looks like the glasses have head tracking in them which I wasn't aware of before. The second attempt took two goes before it worked - in the middle it said I had to enable location services in the app even though it understood my question exactly. I asked the same thing again and it told me.
I can't find any specific location options in the app but obviously it knows where I am. A bit later my wife disappeared into an antiques shop and left me outside with the dog. I asked what shop I was outside and it said Tesco Express. As far as I am aware there wasn't one of those for miles.
The where-am-I/what-direction questions seem new. I tried them last weekend and they didn't work, although I was on VPN and in the middle of a forest; it said it couldn't do that on these glasses. If they can tweak it to give a bit more accuracy, that could be useful.
When my wife brought over the printed menu, I asked the glasses to read it. It said that it was a menu and had breakfast, lunch and dinner options. I asked it to tell me what was on the lunch menu. It gave me some funny answer about "the best lunch options in Towcester include the Ship Inn, etc. etc."
A few other things I tried - there was a bench and it told me it was ornate and had some text on it. I asked it to read the text. It said it was Latin then told me it roughly translated to something or other. I had heard it could translate some things, but was surprised it managed Latin.
I also tried asking about a few buildings I was looking at. It told me things like "this is a gothic temple from the 18th century" and I could then find out what year it was built and by whom. (Assuming it was telling the truth.) Or it said "this is Stowe House" and could then go on and describe how many storeys it was, the Corinthian columns outside and so on.
So I maybe take back my assertion that a sighted person would be asking what they are looking at, because I guess this sort of makes sense.
I think the detail it gives in general is a bit substandard compared to Be My AI, or even a sighted person. But it is quite smart when it gives me some extra information that you wouldn't know just from sight. And as always with AI, follow-up questions are amazing.
Another thing I tried was asking what the score was in the Arsenal v Bournemouth match. It told me 1-0 to Arsenal and who had scored from the penalty spot. I asked how long the game had been going on, and it told me "it hasn't started - it kicks off at 3pm". About an hour later I asked for the score and it told me it was 0-0.
The other thing I hadn't realised was that it keeps track of the conversations in the AI tab, so I can refer back with VoiceOver. And in here are the images I was asking about too. I presume these are stored in the cloud and not on the glasses as they are not in gallery.
Anyway, AI definitely enhanced the day for me even if it is still a bit hit and miss at times.
The other thing that was quite good was that I could take the occasional photo to keep the wife happy. But a few times it told me that my hand was over the camera, and I'm pretty sure this wasn't the case. One time I asked the M-guy to take the photo and he told me that, but my arms were by my side. Unless it was some shadow from the hat I was wearing that confused it.
Re: Location Services and other observations
Wow!
I cannot wait to get mine now!
OMG, this is going to be so exciting to test around my house, etc.
According to Amazon US, mine left Georgia on Friday, so we will see when they get here, Amazon says Tuesday.