You know, I think one thing influencers in this space are failing to mention is that this tech is significantly less expensive than something like the OrCam or Envision glasses. Additionally, they have nearly the same functionality, but with the added benefit of having a massive tech company behind them and the coveted open-source model for development by third parties... something that none of the other "accessible" $3,000 assistive glasses have going for them. I have these and have asked all kinds of obscure things of them, for school, for work, whatever. They do well. The really big upgrade will be when the camera is upgraded to 4K or higher.
Totally this! I worked in mobile tech for a decade or so, and have kept up to date with assistive technology since my vision has deteriorated.
The price being asked by OrCam and Envision is breathtaking, in my opinion.
These, on balance, aren't even the most expensive glasses on the Ray-Ban site!
If that means I have to learn the specific language for issuing commands to my glasses, it seems like a no-brainer to me.
Additionally, we have to bear in mind that the software is constantly being improved. My two cents, for what they're worth, is that these are the most exciting stepping stone on the way to fully AI-enabled AR glasses.
This is exactly what I have been saying.
Look, I'm sorry, but for what it's worth, the OrCam is quite literally the worst product ever made. It's a tiny chip mounted on your glasses; these, at a fraction of the cost, have much better AI and many more features on the side, and not just for someone with a visual impairment.
They sure do more than Seeing AI. You can't get Seeing AI to just read out the amount due and the date on a bill, for instance. You only reviewed the stuff you can do with the camera. There is so much more you can do with just the AI, and there's no typing and the answer comes out as speech. Be My Eyes is the first app to collaborate with Meta on the glasses. This is very new and came out after your video, so no disrespect to you. Also, you no longer have to say "Hey Meta, look"; just "Hey Meta" and then, for instance, "What's in front of me?" and it knows it has to use the camera. This changed about 10 days ago.
@brendonselley441 As a vision tech specialist of over 20 years, this is the first wearable product that I think is a real game changer, and it just keeps getting better. I listened to an interview with someone from Be My Eyes who worked on the collaboration between them and Meta. Meta never knew how useful this product would be for blind people, but now they do; he said Meta software engineers are clamouring to work on making it better for blind people.
Excellent detailed video. It is so nice to have someone like you to review all of these products for the VIP community. Thank you Sam for everything you do for us.
Well done Sam, nice video.
I also think that the garbage truck wanted to be in the video too, lol.
Thank you for this. I just learned that Be My Eyes has been added to these glasses, which I think is a huge benefit. I've been looking for AI assistive glasses that I can afford. I'm certainly going to look closely at these glasses.
I had the button problem with iOS 17.7, and after updating to 18.0 the button problem is gone. In the Blind Ham Radio community these glasses are the hottest thing going. I sent my first pair back because I ordered the sunglasses lenses and I wanted the transition lenses. Now that Be My Eyes has announced a deal with Meta to work on the glasses, they are going to get better. Also, with Llama 3.2 as the AI part of the glasses, I can see more improvements coming.
I'm surprised you didn't mention the open-ear audio, which to me is one of the main selling points of the glasses. Also, the possibility of taking pictures and videos with the confidence that the camera is more likely to capture what I'm facing, with my neck giving it more stability. Due to the positioning of the microphones, it makes very decent voice recordings when paired with apps such as Just Press Record or the Voice Memos app in iOS 18, as well as during calls. Currently, the camera is not available to 3rd-party apps, so it only works with Meta apps such as WhatsApp or Messenger. By the way, it is possible to make video calls using WhatsApp or Messenger and give the other person access to the video feed coming from the glasses. Definitely not a life-changing device, but for the price point, I think they can bring great value to someone who is blind or visually impaired, even with the current limitations of Meta AI.
So happy to see more brands are coming out with smart glasses that can see for us!
Sam, did you test the glasses to see how well they read print books? Anyone else out there try this as well?
Thanks!
Thank you!
I really like the way it summarizes and then lets you ask about the detail you actually want. VERY nice. I also like that it can answer the phone. I find it impossible to see my phone outside, and answering a call is problematic because my phone is not consistent in how the display works to answer a call. When I need something like this the most is when I am walking my dog on her leash and I am using my cane. Hands-free would be very nice for that.
What I really need most is a small remote with actual buttons I can use for basic phone operation when I am outside and can't see the screen at all.
It is very interesting to have you point out how this would work for people who are blind.
I feel right now the Ray-Ban glasses are more suited to people who can see and who use the features as a different kind of assistance.
It is cool that the glasses do work a little to help blind people find/see what is around them, but they do make mistakes or misinterpret surrounding objects.
Thanks for this video, take care.
Thanks for the detailed review!
As an avid cruiser, this is perfect. I've been wanting to dive into them and see what they are all about, but was honestly not into filtering through a ton of videos, so I'm way excited to watch this. Thanks!
That was quite a detailed review. Subscribed on my first visit and view.
Thanks, your videos are helpful to me.
Thanks, Sam! These are a step in the right direction. I’m anxious to see these improve over time with greater accuracy, adaptability, and integration of other apps. I just wish I could look as cool as you do wearing them.
Thanks for all the great videos! Do you have a video showing all the devices that you really like? Or can you say what they are here? Trying to help someone who has fairly advanced wet and dry AMD. Thanks again!
Not sure when this was recorded, but I got my Ray-Ban Meta glasses on Thursday. I didn't run into unlabeled buttons where you did. Matter of fact, the only unlabeled button I found was in the place where you can delete media from the glasses. The button that brings up those options wasn't labeled, but that's the only one I've found.
I can’t say I ran into any unlabelled buttons either.
This was a great video and you do a great job describing it.
Thank you for this review. I have Stargardt's and have been looking into wearable tech for assistance. I was curious about the Meta sunglasses. This was helpful.
Very detailed information. The only thing I could really see on my part is, working on the farm and being dusty, I would want to take them off regularly, which would cause me either to misplace them or damage them. This is where using the phone is so much more handy: simply pull it out of my pocket and do what I need to do. As you said, it has its benefits, but I feel like the Be My Eyes app is much more effective for my use! Thanks for the sneak peek! 😉
Great Review Sam, as usual. 😊
I was thinking about the issue you were talking about with the transition lenses, and thought that if you had just regular lenses and got those flip-down sunglasses that attach to the frame (dark enough, of course), you could still use them inside, but when you go out, flip down the sunglass part so you don't have to worry about how bright it is.
I have light sensitivity too with my Aniridia.
Have a wonderful day! 😊
What an informative video, my friend! Having heard a lot of the buzz about these glasses, I knew I would have to test them before making the investment. You have done that for me, which is great because I’m not sure where I could actually test them in my area.
Many thanks for all the info on the Meta Ray-Ban glasses.
I would also note that they had a new update the other day, and it makes the AI better with more capabilities. If you were convinced by what you saw here, I think it's a great purchase, because it will only get better with time. And since your phone/their AI is what drives it, you could probably continue to use these glasses indefinitely and only have to upgrade when you decide you want a better camera/audio, rather than having to upgrade to get better AI.
I agree with most of your observations, that is to say, most of the pluses, most of the minuses. I would simply add that they have the potential to make you look and feel a little less vulnerable. Your mobile phone can often remain in your backpack when out and about.
The Meta Ray-Bans have a great deal of potential; Be My Eyes and Aira access immediately spring to mind, as does the suggestion that we're going to have more advanced AI in the not-too-distant future.
Thanks once again, Jim
Very informative. Thank you Sam!😎
Awesome video and the technology
Great video! Thank you!
Hey Sam, great review as always. I do see these as having a future, but like with any first-release product, I can never be the early adopter; best to wait for version two or even version three, and they'll come soon. Now, I'm not sure if you got the email, but I did receive one a day or two ago from Be My Eyes announcing that they have formed a partnership with the Ray-Ban glasses, so I think that will make a huge difference, especially once they get all of that fully integrated. Keep up the great work, Sam; I really enjoy listening to your reviews.
I can't wait for Meta to add the Be My Eyes feature where we can call for a volunteer to look through the camera of the glasses 👌💯, and I can't wait for the constant-flow feature where you can keep talking without asking Meta every time you want to ask something (similar to the new ChatGPT voice mode).
I completely agree! From interviews with Zuckerberg, it seems like they plan a future where there are glasses like these, more expensive AR headsets like Orion, and bulkier headsets like Quest, all powered by AI to complement or even replace smartphones.
I think this is great, because that means the glasses you buy today won't be completely obsolete until many years in the future, and the AI will get better and better over time, making them even better and more useful.
One thing I saw in their presentation the other day that I loved was the context-based AI capabilities. The woman looked through dresses in her closet, and it picked one that matched her theme/color. She walked away and asked it for a hat, and it pointed one out that was in her closet, so it means it remembers what it sees and can answer questions about things it's not looking at. These features are real and will be added soon, and that's just the beginning! Very exciting future.
That would be a great feature, especially for emergencies, or just a buddy on the phone helping you with something.
This integration was completed. Tested with Be My Eyes volunteer and it worked perfectly 👍
Hope there's a function warning that people/cars/bikes/… are approaching from the side!!!! And map GPS apps that tell you things like how many more meters to walk straight, turn left here, and so on. Hope one day it will come.
Just signed up for the app "see for me" because of these videos; a dope app that should be helpful.
Thank you for your help. Can you take photos with the glasses and have the photos instantly appear on your iPhone screen to view them?
Honest and helpful as a review. Thanks Sam. Like, share and toss this guy a subscription.
Sam, I've been waiting for this review to drop. I have these.
My mom has macular degeneration. Would this help her read expiration dates and the buttons/screens on a stove or microwave?
Great review Sam.
For safety reasons, I can’t use my phone while walking around in some big cities. This could be a great option!
Re: app access to the camera being proprietary (at this time). The current Aira access effectively shunts the user over to a WhatsApp video call, an admitted workaround to the limitation of this proprietary access. Be My Eyes is apparently being allowed this proprietary access. Hopefully, Meta will take this as a first step toward providing more generalized/open access to the camera on the glasses.
So can I access Aira through WhatsApp?
Enjoyed the information on the Meta smart glasses. I looked at them but couldn't get any that fit my face right. I was on the fence, but now I know for sure I will not be purchasing them. Your information is always helpful.
Since this is coming from meta, how long will it be before we get sponsored replies to our queries?
Love my smart glasses
I'd be interested to use this for the London Underground system. Do you think this would be able to help with navigation by reading signs etc.?
I just got the glasses today. I held up a box of tablets (PS: I knew what they were) but wanted to see if the glasses would identify them. It read the name a bit strangely, but then our room was quite dark, so I think it did a pretty good job. Much quicker than getting the phone out, opening an app, holding the phone above the box and trying to read it with the phone. And the phone is still an option if we can't get the information, but these are great to use hands-free. It's a bit sad that the UK cannot use the look-and-tell feature. I hope that feature remains available in our countries. I am from Australia. I hope the UK gets it as well, as I don't quite understand why they can't access it.
I've just bought some in the UK and am waiting for them to arrive. I'm hoping they will make it easier to make videos when I'm sewing, but overall, as I use a walking stick in one hand and a long cane in the other, I'm hoping I won't have to juggle my phone and cane every time I need to read street names or find the reception desk at the hospital.
Sam, I love your channel, but this was an unduly critical approach to these fairly inexpensive and very nice-looking sunglasses. I think you failed to capture the potential of these smart glasses… They will soon be working with Be My Eyes, and a future firmware upgrade, according to Meta itself, will include instant real-time language translation and dramatically improved AI… It's a very good start and the future is almost limitless, yet your cynicism pretty much ignores that bigger picture. 20:33
Something to keep in mind is that product features are constantly changing, and also, this video may have been made a while ago but, for various reasons that only Sam knows, wasn't posted until today. It's not uncommon for content creators to pre-film and schedule videos to be posted. It can take a lot of work to edit and put a video together once it's been filmed, so we're not always going to get a review of the MOST RECENT, up-to-date features of the product.
This video may have been recorded prior to the announcement making it public that Be My Eyes and Meta are working together. I don't know Sam and how his filming process works; if Sam wants that known, he will share it with us. But I do know that creating these videos is a process and a lot of work to do well, work that requires time. I wouldn't have known this several years ago, before I became friends with a couple of well-known content creators and got involved in creating a mini-series of videos myself.
At this point I do agree with a lot of his points, though. I was debating between the Meta Ray-Ban smart glasses and the Envision glasses. I ended up getting a coupon deal for Envision. I'm extremely happy with them. Because I have memory problems, how picky the Meta commands are would make me extremely frustrated. Instead, I just have to tap on my Envision glasses to take the picture and then it gives me a very, very detailed description. I then just hold down a little button and I can ask any question, such as: what facial expression is he making? Or can you describe the model of the car? Or read the appetizers. I can also program them to recognize my mom, dad, nieces and nephews and best friends. I've even done it for my waitress at the restaurant so I can tell when she's approaching.
@@MonikaNelisDupont That's fantastic that you have the Envision glasses. I've played around with them and they are super neat; I'm glad you were able to get a coupon and get them. I definitely think they've got a lot of features that the Meta glasses don't currently have. I'm glad to see that there are options and that we are getting access to information we didn't have before from these technologies. The Meta glasses have tons of potential and like every product, it will be interesting to watch their growth and development. I don't have it in my budget to get anything right now. But maybe in a few more generations...
That's really cool! Do you receive an audio notification when a person is approaching?
Great video, Sam. By any chance do we have a demonstration of these glasses outdoors, as if on the go, trying to see which bus it is, things of that sort?
Thanks for the info
It would be great to see an item-by-item comparison of Meta AI versus Envision with its AI, because they seem to be very similar, but each has its own strengths and weaknesses. I also recently heard that Meta AI has teamed up with Be My Eyes, so that if you use the Be My Eyes app, you can use it with the AI and that camera.
Sam, is it possible to pre-prompt to get a more detailed description, rather than continuing to have to ask follow-up questions to get the info you want? For example “Look at what’s in front of me and give me an extended description”?
Very Good Video
Another great video, thank you so much. That was a good heads up about the transition lenses needing direct UV to change. I am wondering how much cellular data these glasses use? And when you used these glasses, was data interruption ever a problem for you? This could be the problem that most affects how they would work for me. Thanks again, TT
They work for me!
Great review. Wondering if you'll review and compare these with the Seleste glasses out of Canada.
What do you think about the E-Scoop glasses for central vision loss?
Thank you so much for your review. I have had these glasses for a month or so, and I would like to mention that they are not durable. I had a collision with a pillar and I broke them. It was not a very hard collision; I was going quite slowly, but I was reading chat, and my white cane was not in use 😀
I also hear Be My Eyes and Envision are partnering with them as well. 21:05
Excellent 😊
I found the transitions to be way more useful because I have some vision indoors. A better way to tint over them is to order clip-on flip-downs from NOIR.
FYI, Meta does now have clear glasses options. I've seen a few creators with the clear versions.
Personally, I want the glasses that Meta showed at Meta Connect, the Orion, but only 1,000 dev units were made.
Does anyone know if there are plans for these to be able to shake hands with the Seeing AI app on a smartphone? If so, I'm wondering if that would expand usage.
I would say the glasses like most devices can be used in various ways. Some useful and some needing some work as you were saying.
For example, I've used it just to look around my environment. Walking along the street and asking the glasses to tell me what's in shop windows. I can casually "window shop" and see prices of items in the window.
Also, I can ask the glasses to read the bus stop noticeboard to tell me which bus is arriving next.
Being able to use them hands-free whilst using my cane when walking freely, and then asking for information in the environment, can be useful on a casual basis.
For more information about the environment, I just ask the glasses to describe in detail and then I use the word "continue" to make the glasses step onto the next bit of information.
The annoying thing, as you say, is that it obviously has the information, and all it would take is for Meta to have an accessibility setting to continue reading through a description rather than assume the user has vision. They could even just check that VoiceOver is running or present on the phone to automatically switch to a blind person's mode.
Now that "Be My Eyes" is on the glasses, the Llama AI engine should become more sensitive to blind people's needs. I know at the moment the service is only linking volunteers through the camera, but the company has been clear that this is only the first step, and so hopefully, as the Llama language model improves, so will our ability to interrogate it.
Wanted to watch this video because I wanted your take on it. But I'm sure you know by now, at the Meta event this week they also announced the partnership with Be My Eyes. So, essentially, you can now use this with Aira and Be My Eyes, although, as you said, it is still in beta testing. They seem to be moving quickly to add people, though.
As a person who has had Aira for many years and used all the iterations of the glasses, I am looking forward to at some point getting these glasses and hopefully going hands-free once again with Aira. And of course, Be My Eyes as well.
Seems like these work best for the low vision community rather than the blind community. If they can help me read a menu or navigate an airport, that would be incredibly helpful.
Great & informative review. Thanks for the observation about the efficacy of the transition lenses; I hadn't heard that yet. One major showstopper for me is that they can only accommodate a limited range of prescription lenses. I am aphakic (no lens), so my prescription is rather strong and not able to be accommodated by the Meta Ray-Ban glasses. If I were able to get them, I'd wonder about the fit of the frames. I'd be wearing them all the time, so fit would be important. The frames are molded. The customary way of adjusting the fit of molded frames is to heat them up and bend them. That is not possible on these glasses due to the electronics in the frames and arms. While the camera is impressive for the form factor, bear in mind that the camera quality and features are much less than those found in most mid to high range cell phones. The camera has fixed focus and aperture/exposure.
I have the Meta sunglasses, and I agree, they are nice but not a replacement for our phone apps. I hope they update the program on them; I'm a beta tester for Ray-Ban.
Is there a solution to have an AI connected all the time, giving a live update of the surroundings without necessarily asking questions all the time? I'm thinking of a use case where the person is walking and the glasses might scan the area and warn about dangers or obstacles in the way by default. Thank you.
I would like it to be able to let me know how close things like a door are to me and not get it wrong, but I have found that no assistive technology is right all the time. It would be better if it described things in more detail for places you go, but sometimes you need just one thing described, and it works for that. How is it at describing faces and facial expressions?
Hi, this is the first time I've seen your videos! This is very useful. Do you know if the glasses can read languages other than English? Could you share an application that reads texts? I am a visually impaired person from Bulgaria and I speak English, German and Bulgarian.
I've been waiting for this kind of review for some time now. Can you provide the link to the discussion/comments about the feature where other apps can use the smart glasses' camera instead of the phone camera? I want to dig a little deeper and try it out.
Unfortunately I don't have that; I saw it in a webinar once. However, since I recorded this video, several companies have come out saying that their apps will be supported on the glasses. Be My Eyes is one of them.
@@theblindlife thanks anyway
Hi Sam great review. Can you tell me if it will work with the OKO app? It would be great if so. My assumption is that the app will default to the iPhone Camera. Thanks
Transition lenses would help with indoor use. However, I wonder how people who use prescription glasses would use them.
I am wondering if the Meta glasses are seen by iOS as a Bluetooth earbud, like my AirPods. Can I use Notes to record memos? Can I use Siri? Can I listen to Audible or SoundCloud? Or are the glasses tightly linked to the Meta app, so I can only do what the Meta app allows? Thanks
I get the feeling I should wait for Apple Rayban glasses...
Great video! Would these work well for reading a book aloud, or would the text be too small?
What if you say “describe the scene while naming the colors of the objects”
Hey Sam, just to let you know, (I’m not sure if you saw this), Be My Eyes is also partnering with Meta.
I did, great news!
On your last video, I was trying to give you a heads up that at the Meta Connect 2024 event they mentioned they're going to be bringing Be My Eyes to the Meta Ray-Ban sunglasses. I didn't use those words because I forgot the name of the app, but I wanted to let you know before you released this video, and I guess you missed the comment.
I bought mine today… and the update now allows 3 minutes, but the default is 60 sec.
I would like your opinion about purchasing these glasses. My sister is slowly losing her vision, and only has vision in one eye. Would these be helpful for her when she is trying to read her mail, etc? I know they would help her with what is around her. Thank you for your response.
I also wonder how good the camera is on the glasses. For me, working on my farm and taking pictures of my livestock at a distance, even between my iPhone X and my iPhone 15 I can tell a big difference in the clarity, meaning the information is read back in much more detail with Be My Eyes.
For some reason my Ray-Ban does not even have this feature. Is there any specific command I need to ask it to do this? I got mine from Amazon, but it can't announce what I'm looking at. It keeps saying it is just a voice assistant and can't help with this request. Is there something I am doing wrong?
Hi Sam, hope you and your family had a wonderful Christmas, and hope you have a happy new year and all the very best in 2025. I just had a question for you about the Meta Wayfarer smart glasses. I live in Canada; am I better to purchase them in Canada? My aunt and uncle live out in the States and thought it might be cheaper for them to purchase them for me out there; they thought they would save some money. They live in the San Francisco area and I am in Ontario, Canada, and I'm talking about the Meta Wayfarer smart glasses. I wasn't sure if I was better off just purchasing them in Canada.
Really helpful video. From what I can see, they do near enough exactly the same as the Be My Eyes app, which I have started to use because of a strict diet I have to keep to. Be My Eyes allows you to take a picture of the item and it will tell you what it is, just like the glasses do, but you then have a question button, which you can use to gain more information; for example, if you need to know the fat content of your meal, ask the question and it will tell you. I do plan to use this to read a menu, and if it can read the menu like the Ray-Ban glasses can, then, as pointed out by Sam, the only benefit of the glasses would be that they're wearable.
It would be nice to see a demonstration of how/if they can help one navigate. For example, can it read street signs and tell you when it is safe to cross the street or help you avoid objects like a pole or trash can?
I have these glasses, and they are pretty awesome… But I don't think they would solve the problems you're posing… But Meta indicates they will have a new approach to AI called "always on AI", which will effectively be like having a human assistant to tell you when the light changes or when obstacles are in front of you, that sort of thing. The AI is really not there yet, but it is still quite amazing, and the future is limitless since it is learning and improving at a geometric rate of progression.
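Just to make the "always on" idea concrete, here is a rough sketch of the kind of loop such a feature might run. None of these functions are real Meta APIs; capture_frame, describe_hazards and speak are hypothetical stand-ins for whatever camera, vision-model and speech hooks the glasses might eventually expose.

import time


def capture_frame() -> bytes:
    """Hypothetical placeholder: grab the latest camera frame from the glasses."""
    return b""


def describe_hazards(frame: bytes) -> str:
    """Hypothetical placeholder: ask a vision model to name obstacles in the frame."""
    return ""


def speak(message: str) -> None:
    """Hypothetical placeholder: read an alert aloud through the open-ear speakers."""
    print(message)


def monitor_surroundings(interval_seconds: float = 2.0) -> None:
    """Poll the camera and only announce hazards that have changed."""
    last_alert = ""
    while True:
        hazards = describe_hazards(capture_frame())
        if hazards and hazards != last_alert:
            speak(hazards)
            last_alert = hazards
        time.sleep(interval_seconds)

The interesting design choice is the "only speak when something changes" check; without it, an always-on assistant would talk constantly, which would be exhausting.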
It would be cool if they could recognize people you know by name. Like if they are across the street
Thanks
My son was so impressed that he decided to buy the glasses for both of his parents, that is, his mum and myself, his father, because we are both visually impaired. And I could hardly convince him to only buy the glasses for his mother to start with, to see first how they work. We received the glasses today, and guess what: the AI features are not available in Europe due to legal restrictions related to privacy considerations! Now we have to return them to the sender.
So, beware! All this is nice, but not for people who live in Europe.
I do wear sunglasses inside, so that's a part I like about them, and for me the lenses can be darker if that is how they come. They just need to work, stay connected when I use them, and fit my head, because some sunglasses are too tight.
Hi Sam, thanks for this detailed review of the Meta glasses.
This is kind of what bothers me - having to go through Meta’s proprietary app and not being able to use the glasses’ camera everywhere without requiring dedicated API integration.
By the way, can the surround microphones be used with any recording app, like Voice Memos, or Ferrite? Or are they only compatible with videos through MetaView?
Anyway, I’m excited to follow the evolution of such smart glasses, which I find very promising. The OCR also seemed really accurate! Thanks for your content, I’ll continue following you with pleasure!
I believe the microphones will work with any app. Since recording this video, several apps have announced that they will be working on the smart glasses, apps like Be My Eyes. So we won't be stuck with just Meta's proprietary app.
The lenses come in different colors, as well as transparent and clear.
Also, I heard Be My Eyes is going to be added.
great!
Do these glasses work with Bluetooth hearing aids?
I am totally blind too, and I heard about the smart glasses from my blind friends. They said they can read things to you, so I want to know: can they read labels to me on a package? For example, if I have a can of food and I say, "Hey Meta, what does this label say?", will it read the label to me? Can the smart glasses read any label on any object? That's the main reason I want them.
Fantastic, excellent review! Earlier this week it was announced that Be My Eyes has partnered with Meta and Ray-Ban to use these glasses, and that the AI has been significantly enhanced. Have you had a chance to check out these new features yet? If so, what did you think of them? As always, thank you for your help in supporting the blind community.
The killer app for me is being able to buy a book from a bookstore (remember those?), saunter into a seaside cafe and spend the afternoon reading a printed book. Perhaps the software can detect page turns and continue reading as you flip pages. Same would go for instruction manuals that come with appliances. The software could link the printed book and the appliance and guide you on how to use it. Someone will hack these and turn it into something amazing for our community that will be a game changer.
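For what it's worth, the page-turn idea wouldn't even need anything exotic. A toy sketch of one possible approach (ocr_frame is a made-up placeholder, not a current feature of the glasses) is to OCR each frame and treat a big drop in text overlap as a new page.

from difflib import SequenceMatcher


def ocr_frame() -> str:
    """Hypothetical placeholder: return whatever text the camera currently sees."""
    return ""


def is_new_page(previous_text: str, current_text: str, threshold: float = 0.4) -> bool:
    """Assume a page turn when the two texts barely overlap."""
    similarity = SequenceMatcher(None, previous_text, current_text).ratio()
    return similarity < threshold

The reader would then simply resume speaking whenever is_new_page comes back true for the latest frame.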
What is a mobile screen reader?
Hey there :D I wonder if anyone knows, do those glasses have night vision integrated? If not, are there any smart glasses on the market that can improve night vision? :D
The glasses don't have any digital video component at all; they are just simple sunglasses that talk to you.
Hey Sam. Would you mind sharing your thoughts on Apple Intelligence and how it impacts the blind community?
Can you use ear buds so people around you aren't hearing everything?
Be My Eyes should partner with a glasses company and bring Be My AI to something like this, dedicated to assistive tech. Would be cool.
You've got your wish: Be My Eyes just announced they will be compatible with the Meta Ray-Ban smart glasses as well.
Can I get them in India? If so, please let me know the marketing agency. If they are available here, what about service centres in India? Will they work without an Internet/Wi-Fi connection?
Are these audio glasses? We can't read, but can we listen by audio to what is written in a book? Which type of glasses are they? 😊
What's the best app on an iPhone that a person with RP who's visually impaired could use to help them with their daily life?