At the India AI Impact Summit, Facebook-parent company Meta showcased and demoed its latest smart glasses, including the Meta Ray-Ban Display. Before we tell you how they work and what we think about them, let's recap what they are and how they differ from the ones you get in India. Meta introduced the Ray-Ban Display at its Connect conference in September for $799. They come with a built-in screen and gesture controls. They have not launched in India yet, but The Times of India got a demo at Meta's booth, and here's what we think about them.
Meta Ray-Ban Display design
At first glance, they look like ordinary Ray-Ban Wayfarers — the kind you'd see on anyone walking down the street. There are two cameras (12MP ultra-wide, with 3x zoom) on the end pieces, and a wrist strap that reads your hand movements to control what you see on a tiny heads-up display. The glasses also pack speakers, microphones and Meta's AI assistant, which works to make sense of everything you're looking at. It's augmented reality, but subtle — almost invisible if you are walking down a street in broad daylight.
Let us start with the display itself. Tucked into the lens is a micro-projector that beams text, icons and other information directly into your field of vision. This means you don't have to take out your phone to see important notifications.
It's not the immersive, full-colour overlay you may imagine from sci-fi movies, but a small readout in the corner of your eye that doesn't obstruct your view.
Functionality includes navigation, such as the turn-by-turn directions you see in Google Maps on your phone — all the data appearing right before your eyes. It's futuristic, and it took us some time to get used to the gestures and to tell the display apart from the real world.
Gesture controls: A mixed bag
Meta has built in gesture controls, which sound great in theory but felt slightly odd in real-life use. You can swipe along the temple of the glasses to scroll through messages or notifications, tap to select, and even pinch the air to zoom in on certain content. A representative at the booth said the gesture recognition improves with use.
Voice commands worked fine with "Hey Meta" as the wake word. You can ask the AI to make a WhatsApp call to anyone you like, and hear the voice clearly through the built-in speakers in the temples.
The speakers use directional sound technology to channel audio straight into your ears, and the microphones capture your voice even in a noisy environment like the booth. It's not quite as immersive as earbuds, but for calls and navigation prompts, it's more than adequate.
We asked a few questions like, "What am I seeing?", and it replied that the setup looked like an exhibition. You can look at almost anything — say, a piece of paper — and ask Meta AI to explain the meaning of what you're reading. It took some time but returned a correct answer.
Battery life
Meta claims the glasses last about 6 hours on a single charge, and a fully charged case provides up to 24 hours of additional use.
Verdict
The Meta Ray-Ban Display glasses are impressive, though they come with privacy concerns. They offer a useful glimpse into what hands-free computing could look like. They're not perfect — the gesture controls need refinement and practice, and the technology still feels like it's in the "early adopter" phase. Can they replace phones? No. Are they a good tech device to have? Yes.