For years, tech companies have been chasing an elusive dream: glasses with built-in displays that help us communicate, understand the world, and break free from our reliance on phone screens.
This week, Meta unveiled its latest attempt at its Connect developer conference: Meta Ray-Ban Display, a pair of smart glasses with an integrated screen that can show directions and texts, query Meta’s AI assistant, stream music, take photos, and even provide live captions to make conversations easier to follow.
The device is notable for two reasons: Unlike earlier prototypes, Meta’s glasses look like actual glasses. And while competitors have built intriguing developer kits without mass-production plans, Meta is actually ready to ship. Consumers will be able to buy Ray-Ban Display glasses for $800 later this month.
For an industry that has been working on a future that always seemed just out of reach, that’s a big deal—so much so that Meta CEO Mark Zuckerberg seemed compelled to assure the audience during his Connect keynote that this was actually happening.
“This isn’t a prototype,” Zuckerberg said. “This is here, it’s ready to go.”
The first display-equipped glasses that look like glasses
Ray-Ban Display features a monocular display, which projects information into only one lens. That rules out AR games or 3D graphics but dramatically cuts power consumption.
That efficiency enabled Meta to use a smaller battery, keeping the design close to standard eyewear. The frame retains Ray-Ban’s iconic look. The temples are only slightly thicker than the company’s camera-equipped glasses, and the whole device weighs just 69 grams.
Upon close inspection, you can see that the right lens features a waveguide—diagonal lines that refract the light of a tiny projector integrated into the right temple to show information when the display is on. To outside observers, though, it’s not obvious whether the display is on or off—a big difference from earlier AR devices that leaked light. “It’s really optimized for private viewing,” says Meta AR devices VP Ming Hua.
All of this means there is far less stigma attached to wearing the glasses, which can be a decisive factor in whether such a device succeeds or fails. Case in point: Google Glass, a pioneering smart glasses product first introduced over a decade ago, looked a lot more like something from a science fiction movie than a regular pair of glasses.
“People are making Google Glass comparisons to the Meta Ray-Ban Display glasses,” says Meta CTO Andrew Bosworth. “It has a display, it’s monocular. But they’re so different because [ours] look good. [Google Glass] didn’t look good, and that matters a ton.”
“They nailed the form factor,” agrees Moor Insights & Strategy analyst Anshel Sag. “It’s really a well integrated product.”
What it’s like to wear Meta’s new Display glasses
At Connect, I used the glasses to get directions, exchange WhatsApp messages, and query Meta’s AI assistant. Whether indoors or under the California sun, the display was legible. Text fades after a few seconds and can be recalled when needed.
Meta positioned the display slightly below the typical line of sight, which makes it especially useful for certain applications. The glasses can transcribe speech in real time, helping with conversations in noisy environments or assisting people who are hard of hearing. They can also translate speech instantly, and Meta is developing a teleprompter app aimed at supporting public speakers.
The glasses also support video calls, a digital viewfinder for photos, and other apps. But perhaps the most striking feature isn’t on the lenses at all: Meta’s Ray-Ban Display glasses come with a wristband that resembles a screenless fitness tracker. Inside are sensors that detect neural signals from tiny hand movements, allowing the glasses to be controlled with subtle gestures. Opening an app or summoning Meta’s AI assistant can be as simple as tapping your thumb and index finger together, while dismissing notifications takes just a quick swipe of the thumb.
The Neural Band, as it’s officially called, can also handle more complex gestures. To raise the volume of a Spotify stream, for instance, you rotate your thumb and index finger clockwise, like twisting the knob on an old radio. The device can even be trained to recognize handwriting, letting you reply to text messages by “scribbling” letters with your fingers. None of this requires line of sight—you can keep your hand behind your back or tucked in a pocket, and it still works. The effect feels remarkably close to magic.
Part of a multi-pronged approach
Meta isn’t the only company betting on glasses as the next major computing platform. Google has shown off prototypes for its own display glasses, and Amazon is reportedly working on a device like this as well. However, Meta has a key advantage over its would-be competitors: a yearslong partnership with glasses maker EssilorLuxottica.
The two companies introduced their first pair of camera-equipped smart glasses in 2021. Even without a display, the glasses became an unexpected hit: Consumers have purchased more than two million Ray-Ban–branded Meta glasses to date, according to EssilorLuxottica. Meta has continued refining the line and unveiled several new models at Connect this week.
In 2024, Meta previewed a bulkier but far more capable prototype for AR glasses, code-named Orion. Those glasses featured displays in both lenses, allowing 3D objects to be overlaid on the real world. It may take years to shrink Orion’s technology into a consumer-friendly form factor, but Meta has already folded some of that research into its current products. The neural wristband shipping with the Ray-Ban Display glasses, for instance, is essentially the same one developed for the Orion prototype.
The new Ray-Ban Display glasses also feature a custom-designed battery. Meta advertises up to six hours of “mixed use” battery life, which includes activities like listening to music when the display is off. Used continuously for internet access, however, the battery would likely last only about an hour, according to company representatives. The same battery technology is also being integrated into lower-cost Ray-Ban smart glasses that don’t include displays.
Catching up with Meta won’t be easy
One potential Achilles’ heel for Meta is its relationship with third-party developers. So far, Meta has partnered with only a few high-profile companies like Spotify to integrate their services into its wearables. At Connect, it announced plans for a developer program, but it may tightly control which apps it allows onto these devices.
There are good reasons for caution. Last year, two Harvard students used Meta’s camera glasses to power a facial recognition engine capable of identifying strangers on the street. While the glasses themselves don’t support that feature, Meta may choose to block such applications—though others likely won’t. The students have since dropped out and launched their own smart glasses startup.
A more permissive approach toward third-party apps could be one way for others to compete with Meta and its Ray-Ban partnership. “Catching up is not going to be easy,” says Sag, the analyst with Moor Insights & Strategy. “The biggest way someone could compete with them is by third-party developer access and making those glasses more useful, even if they’re not more fashionable.”