The Ray-Ban Conspiracy
Sep. 2nd, 2025 03:07 pm
LJI Week 8: Infrastructure
A few weeks ago, my wife, Lizbeth, got very excited about a device she saw a coworker using.
"They're Ray-Ban smart glasses from Meta," she gushed. "You can take pictures with them, listen to music, and ask the AI questions."
Admittedly, I tend to have an automatic negative reaction whenever Meta/Facebook is mentioned, so my response was lukewarm at best.
"Uh, okay. I suppose you want a pair of these now?"
"I was actually thinking about them for you," the much maligned Lizbeth patiently responded. "You can use them to describe your surroundings, read print on all sorts of objects, and get help through a live person using the Be My Eyes app."
"But," the blind man objected, "I can already do all of that on my phone."
"But," Lizbeth insisted, "wouldn't it be cool to have a pair of glasses do all of that for you instead of juggling a phone?"
I made a noncommittal noise.
It wasn't just the Meta-disgust, although that was part of it. In general, assistive technology tends to lag behind everyday tech, but there are a few niche cases where it leaps ahead, and smart glasses are one of them. For several years, I've been using an app on my phone called Envision AI, and it does pretty much all the things Lizbeth told me about the Ray-Ban glasses. The company also sells its own companion glasses matched to the app, at a staggering cost of thousands of dollars. So, I was familiar with the concept and appalled by the price! In comparison, the Ray-Bans were only a few hundred bucks, but I had already trained my mind to say "NO!" whenever someone mentioned smart glasses.
Just before we departed on vacation last week, Lizbeth decided that she did, in fact, want a pair of Ray-Bans for herself. (So perhaps the above malignment was justified after all?)
"How do they sound?" I asked after she had unpacked and set them up.
"Really good," she answered. "Here, try them out."
The Ray-Bans' speakers are built into the arms of the glasses that hook over your ears, and given their small size, my expectations were pretty low.
"Wow!" I exclaimed after a few seconds of listening, "they do sound good."
Isn't it funny how your life experiences sometimes cluster around one topic? I subscribe by e-mail to an online magazine called AccessWorld, and right as Lizbeth's Ray-Bans arrived, they published an article, "A Review of the Ray-Ban Meta AI Glasses for People With Low Vision." Even though the author concluded by saying, "The Ray-Ban Meta glasses are an accessibility tool only by coincidence," he also mentioned that he sometimes felt as though the results he got, especially in the case of reading menus, were magical.
Lizbeth was super-eager for me to try this out while we were on vacation, and she finally convinced me to take her glasses for a spin while we were sitting in one restaurant. I accepted the Ray-Bans, slipped them on over my ears, and received the acknowledgement chirp, which I assume translated to something like, "Head detected!" I said, "Hey Meta, read this," while holding up the menu, and after a brief pause it did. It was a pretty dark environment, and Lizbeth later told me that our table companions were using their cell phone flashlights to read their own menus, but the OCR (optical character recognition) was excellent.
I'm sure you've already figured out where this is going, and are wondering just how many more paragraphs you'll have to read before I finally admit to purchasing Ray-Bans of my very own.
Honestly, even with all the above benefits, I probably wouldn't have bought a pair, except for one thing I haven't mentioned yet. Initially, I thought the glasses only provided sound when the user asked them to play music. Instead, when they're paired with your phone, the Ray-Bans play all the sounds your phone generates, just like Bluetooth headphones. My Eureka moment was, "Hey, that includes screen reader audio!" So, if I'm out and about and want to quickly check the time, play a game, or refer back to an e-mail or text message, I don't have to hold the phone up to my ear or scrabble in my pocket for my earbuds, assuming I remembered to take them with me. In fact, since the glasses can automatically read incoming text messages aloud and let you respond by voice, half the time I don't even have to take the phone out of my pocket.
Once I had confirmed this functionality by using Lizbeth's glasses as guinea pigs, I didn't even wait until we got back from vacation to order a pair of Ray-Bans. Nope, those puppies were waiting for me when I arrived home!
So far, I'm very pleased with my purchase. Sure, I've discovered a few limitations:
- You can't currently set alarms by voice, only timers.
- YouTube Music isn't a recognized Meta app to play music from, so you have to initiate that from your phone.
- The speech-to-text for sending text messages isn't as advanced as Google's implementation, so if I say "That's great!" and dictate the punctuation, the recipient sees, "That's great exclamation mark."
- And, even though the AI claims it can send group texts, when I dictate multiple recipient names, it generally picks just one.
Still, that's not bad for a product with coincidental accessibility. And, remember the outrageously priced Envision AI glasses I mentioned above? One of the reasons assistive technology tends to carry high prices is the limited market, and hence limited sales, it serves. But when you're Meta (shudder), the market is much larger, which drives the cost down significantly. Plus, they have the infrastructure to make everything work, and that's not a trivial concern.
It's just ... a blind guy wearing Ray-Bans? The stereotype's killing me!
Dan