Steve Kelley

In the small group of certified vision rehabilitation therapists (CVRTs) to which I belong, it is not uncommon for us to use the term “magic glasses.” These are the glasses some of our clients hope to find that will improve or restore their vision. It makes sense: for many of them, a new prescription once brought things back into focus. Vision loss from macular degeneration, glaucoma, diabetes, and other conditions can usually be managed with proper care, but vision is not typically restored to what it once was. Still, the hope for magic glasses persists.

Take a pair of glasses like the Ray-Ban Meta glasses, add artificial intelligence (AI), and it’s easy to see why they raise hopes that these might finally be the magic glasses those of us with acquired low vision have been waiting for. In my day job, I’ve received a lot of calls about the Meta glasses: where to buy them, and what they will do.

Let’s begin with a reality check. The Ray-Ban Meta glasses were designed for social media users (recall that Meta is the new name of the company that owns Facebook) to add pictures and videos to their social media accounts using a stylish pair of glasses equipped with a camera and speakers. Much of this is accomplished with a digital assistant, similar to Siri or Alexa, summoned through the built-in microphone with the wake phrase “Hey Meta.” So, it’s important to note that the Meta glasses were not designed for low vision or blind users. The fact that some of the features happen to be handy for people with vision loss is purely coincidental, a fortunate bit of synchronicity.

This also explains why, compared to other smart glasses designed specifically for the visually impaired, they are reasonably priced, starting at less than $300, and often available locally at places like Target or LensCrafters. Another bonus of not being designed strictly as assistive technology is that they are stylish and customizable. The frames are well built, look good, and can accept prescription lenses with a wide variety of glare filters, so in addition to the tech features, these glasses can have very functional lenses built right in.

How Do the Meta Glasses Work?

One feature that attracts a lot of interest is the ability to talk to the glasses and get information with voice prompts, the way you might use a digital assistant like Alexa or Siri. While this is true, it’s important to note that the glasses must be connected to a smartphone or tablet. The Meta glasses themselves are simply a hands-free place to put a camera, speakers, and a microphone; all the AI processing happens through the smart device in your pocket, backpack, or purse.

As a result, using the Meta glasses begins with downloading the free Meta AI app from the Apple App Store or Google Play Store, installing it on your phone or tablet, and then connecting the glasses to your device via Bluetooth.

The Meta glasses have a built-in rechargeable battery and a clever charging method: the case for the glasses doubles as a charging station. The case itself charges with a USB-C cable, and whenever the glasses are placed in the case, they recharge automatically. Once fully charged, the case can recharge the glasses multiple times before it needs to be plugged in again. A full charge runs the Meta glasses for about four hours of moderate use, or about five hours of continuous audio playback, such as streaming music or podcasts.

The AI Magic

Many of us are already familiar with AI. We use it whenever we ask Siri, Alexa, or Google Assistant a question. Apps like Seeing AI, Google Lookout, and Be My Eyes AI all use artificial intelligence to identify objects or colors, recognize text, and describe the environment. The Meta glasses use Meta AI to process voice prompts and images from the camera, which is located at the top left of the frames.

Once the glasses are paired and connected by Bluetooth, users can issue voice prompts: for example, “Hey Meta, what time is it?” or “Hey Meta, what’s the weather today?” More importantly for those with vision loss, the glasses can be prompted to “look and describe” or “look and read”: for example, “Hey Meta, look and describe what’s in front of me,” or “Hey Meta, look and read what’s in my hand.” In each case, the glasses respond with spoken information. Follow-up questions can also be asked; if Meta says there’s a car in front of you, you might then ask, “What color is the car?” or “What model car is it?”

Meta does have a strong impulse to summarize text it reads. While this can be handy when sorting mail, if the goal is to read an entire document, the prompt “Hey Meta, look and read every single word” usually does the trick.

Summarization is especially helpful for reading restaurant menus. Rather than listening to an entire menu from start to finish, Meta can quickly scan for certain items or summarize sections such as sandwiches, desserts, or price ranges—depending on your follow-up questions. For this writer, the ability to quickly search a menu truly conjured up the notion of “magic glasses.”

Hallucinations or Misbehavior

Experts on AI will tell you that it sometimes produces what are called “hallucinations,” answers that sound confident but are simply wrong. This is often said with a chuckle, as if describing the misbehavior of an impulsive child. The Meta glasses are not immune.

Users can often point to examples where the AI insisted on summarizing instead of reading in full, made something up, or gave less-than-accurate information. For instance, when asked if there was a lamp in the room, Meta replied, “Yes, there’s a lamp on the table next to the bed.” It later said the lamp was several feet away. When asked again, it responded, “The lamp is slightly to the left.” In fact, the lamp was to the right of center.

All things considered, the fact that the objects were described at all, with approximate distance, was still useful. But the key point is that Meta AI is not yet able to provide reliable navigation directions or replace tools like a white cane for orientation and mobility. Processing is fairly quick, but neither fast nor accurate enough for something like a street crossing.

A Brief Tour

As mentioned earlier, the Meta glasses are equipped with the features you’d expect for creating or consuming media content. A camera sits above the left lens. Inside the left arm, next to the hinge, is the on/off switch: push it forward for on, back for off. Inside both arms, near the ear, are speakers with surprisingly good sound quality. On the outside of the right arm is a touchpad that performs several functions, such as swiping forward or back to adjust volume and tapping to answer phone calls. On the top front of the right arm is a small button that takes a picture when pressed; press and hold to record a video, and press again to stop. Lastly, above the right lens is a small light that signals when a picture or video is being captured, which is helpful for those with vision to know when they’re on camera.

Integrated Apps

Because Meta AI is a product of Meta, it interfaces primarily with apps in the Meta ecosystem, such as WhatsApp and Messenger, along with a few partner apps such as Spotify. Notable exceptions, and perhaps concessions to users with vision loss, are Be My Eyes and Aira. Both can be connected through the Meta AI app and used hands-free with the glasses. The one limitation is that Be My Eyes AI does not function with the glasses; only calls to volunteers are supported.

Final Thoughts

Ray-Ban is best known as a maker of sunglasses, and I found that the Ray-Ban Meta glasses used for this review had an excellent polarized gray gradient tint, which made them terrific sunglasses even without the smart features. Most of the time, though, I wanted to use the AI functions indoors for reading. For many users, a pair with clear lenses or a light tint for indoor glare would be ideal, perhaps with clip-on glare filters for outdoor use that could be flipped up when taking a picture.

The camera on the left side took some getting used to. For photos, I often ended up cutting off part of the scene on the right. I also forgot to take off my hat on several occasions, leaving the brim in the shot.

I confess there were times, usually when using the glasses for reading, that I too thought of them as magical. The exception was the on/off switch, which always seemed difficult to find and often required a fingernail to operate. And for anyone who assumes the AI can handle navigation or provide instant descriptions for tasks like street crossings, the limitations will be disappointing.

In addition, some people I spoke with assumed that because the glasses respond to voice prompts, they eliminate the need to learn how to use a smartphone. That’s not the case. Once the Meta AI app is downloaded and the glasses are paired via Bluetooth, it is possible to interact with them conversationally, but a basic level of comfort with a smartphone or tablet is still required.

The Ray-Ban Meta glasses are an accessibility tool only by coincidence. That makes them more affordable, but it also means they were not designed around the needs of users with reduced vision. Still, for the price and convenience, they offer a great deal in object recognition, environmental description, and reading with optical character recognition (OCR).

Ray-Ban Meta AI glasses start at $299. The Meta AI app is a free download from the App Store or Google Play Store and runs on iOS 14.2 and above or Android 10 and above. Meta does offer accessibility support by phone at 855-592-2237.
