The Rise of AI Glasses and the Meta/Ray-Ban Partnership

Written by Joseph Nordqvist

Published: 21:10, September 15, 2025

AI glasses – also known as smart glasses – are making the leap from tech demo to real-world tool. These wearable devices integrate cameras, microphones, speakers, and sometimes displays into eyewear to provide you with hands-free information and assistance. In the past, products like Google Glass (launched in 2013) showed the potential of augmented reality (AR) headsets, but they struggled with bulkiness, limited use cases, and privacy concerns. Snapchat’s Spectacles (first released in 2016) found some success as camera glasses for social media, but they didn’t have true displays or widespread adoption.

Now, a new wave of AI-powered smart glasses is emerging, led by a partnership between Meta (Facebook’s parent company) and Ray-Ban. This partnership marries cutting-edge tech with fashion-friendly design, and it’s quickly bringing smart glasses into the mainstream.

Meta and Ray-Ban: A Tech-Fashion Partnership

Meta (formerly Facebook) and EssilorLuxottica (Ray-Ban’s parent company) joined forces in 2019 to create stylish smart eyewear. Their goal was to make high-tech glasses that look and feel like normal Ray-Bans, so you’d actually want to wear them in public.

The first result of this collaboration was Ray-Ban Stories, launched in September 2021. These first-generation glasses looked like classic Ray-Bans but packed in dual 5-megapixel cameras, open-ear speakers, and a microphone. They let you take photos and 30-second videos with a tap or voice command, listen to music, and take phone calls – all without pulling out your phone. Ray-Ban Stories didn’t have any kind of visual display or AR overlay; they were about capturing moments and listening to audio while keeping your hands free. Users could say “Hey Facebook” (now “Hey Meta”) to issue voice commands via the built-in Facebook Assistant.

Privacy by design was a concern from the start. To address this, Ray-Ban Stories included a hard-wired LED that lights up when the camera is in use, so people around you know when you’re recording. Even so, some critics felt the recording indicator was too small and expressed mistrust given Facebook’s privacy track record. Despite those concerns, the partnership proved that there’s a demand for smart glasses that look cool.


Meta and Ray-Ban offered the frames in iconic Ray-Ban styles (Wayfarer, Round, and Meteor) with various colors and lenses, including prescription options. At a starting price of $299, they positioned the glasses as a fashionable tech accessory rather than a geeky gadget.

Building on early success, Meta and Ray-Ban released a second generation of smart glasses in fall 2023. Simply called the Ray-Ban Meta smart glasses, this new lineup improved on the original in many ways.

They upgraded the camera to a 12 MP wide-angle lens for sharper photos and 1080p videos, boosted the audio with better speakers and a 5-microphone array for clearer calls and sound recording, and made the glasses lighter and more comfortable for all-day wear. You can even live-stream directly from the glasses to Facebook or Instagram and hear or read comments in real time via the companion phone app.

Crucially, this generation introduced Meta AI into the glasses – your built-in smart assistant. By saying “Hey Meta,” you can ask questions, get translations, control music, or have messages read out to you using Meta’s advanced AI assistant (powered by the same Llama 2 large language model behind Meta’s chatbots).

Initially, Meta AI on the glasses launched in beta in the US and Canada, helping you get information or inspiration hands-free. This means you can ask your glasses things like "Hey Meta, how's the weather?" or "Translate this sign for me," and get an answer through the built-in speakers. Essentially, the second-gen Ray-Ban Meta glasses gave you an AI voice assistant on your face, without any visible screen.

Meta remained committed to the vision, and in September 2024 the two companies extended their partnership into the 2030s, signing a deal to keep collaborating on "multi-generational smart eyewear products" for years to come. There are already two generations of Ray-Ban smart glasses (the 2021 and 2023 models), and more are on the roadmap. Behind the scenes, Meta even invested about €3 billion for a stake in EssilorLuxottica (roughly 3% ownership, with plans to possibly increase to 5%).

For Meta, this investment secures its partnership and gives it more influence over design decisions – for example, convincing Ray-Ban to embrace higher-tech features even if it means slightly thicker frames. For EssilorLuxottica, it means having a tech giant’s resources to push eyewear into a new era. Both companies see smart glasses as a long-term play that could eventually replace some functions of your phone.

New Features on the Horizon: HUD and More AI

The Meta/Ray-Ban partnership’s next big leap is adding a heads-up display (HUD) to the glasses. For the first time, you won’t just hear or capture information – you’ll see data in your field of view. On September 15, 2025, just days before Meta’s annual Connect conference, a video leaked on X showing a new version of its Ray-Ban branded AI glasses.

These glasses appear to have a built-in display over the right lens, creating a monocular HUD for the wearer. This is not full augmented reality (it doesn’t project 3D objects into the world around you), but it can overlay simple graphics and text like notifications and directions in your line of sight.

What can this HUD do? The leaked video shows a user getting turn-by-turn walking directions right on the glasses, so you can navigate a city street without looking at your phone. It also shows incoming messages popping up, real-time translations of text in the world, and other context-aware alerts displayed through the lens. And of course, Meta’s voice-controlled AI assistant is front and center – you can speak to the glasses to ask for info, and see the AI’s responses or search results in your view. Essentially, these glasses aim to be your smartphone, camera, and AI assistant all-in-one, available at a glance.

One novel piece of tech in this upcoming model is a neural wristband that pairs with the glasses. The video shows a user wearing a sleek black wristband while using the HUD glasses. This is Meta’s sEMG wristband, a wearable that reads electrical signals from your arm muscles to detect finger movements. In the demo, the user scribbles letters on a notepad with their finger, and the wristband translates those motions into text input on the glasses. In other words, you can reply to a message or perform gestures without touching the glasses or pulling out your phone, just by subtly moving your fingers.

This is an evolution of tech Meta has been developing for years (it acquired the startup CTRL-Labs in 2019 to get this wristband technology). For you, it means a new way to interact with glasses hands-free – almost like telepathic input, since tiny neural signals in your arm can trigger actions on the device.
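To make the idea concrete, here is a deliberately simplified sketch of how an sEMG wristband could map muscle-signal readings to discrete gestures. This is not Meta's actual pipeline – the channel counts, feature choice, gesture labels, and template values are all illustrative assumptions – but it shows the general shape of the problem: extract a compact feature from a window of raw signals, then match it against calibrated gesture profiles.

```python
# Conceptual sketch (illustrative, not Meta's implementation): classifying
# a window of multi-channel sEMG samples into a gesture label.

def extract_features(samples):
    """Mean absolute value per channel -- a simple, common sEMG feature."""
    n = len(samples)
    channels = len(samples[0])
    return [sum(abs(s[c]) for s in samples) / n for c in range(channels)]

def classify(features, templates):
    """Nearest-template matching: pick the gesture whose stored per-channel
    activation profile is closest (Euclidean distance) to the observed one."""
    best, best_dist = None, float("inf")
    for gesture, profile in templates.items():
        dist = sum((f - p) ** 2 for f, p in zip(features, profile)) ** 0.5
        if dist < best_dist:
            best, best_dist = gesture, dist
    return best

# Hypothetical templates: per-channel activation profiles that would be
# recorded during a calibration phase (values are made up).
TEMPLATES = {
    "pinch": [0.8, 0.1, 0.1],
    "swipe": [0.1, 0.7, 0.2],
    "write": [0.3, 0.3, 0.9],
}

# A short window of raw samples across three hypothetical channels.
window = [[0.75, 0.12, 0.08], [0.85, 0.09, 0.11]]
print(classify(extract_features(window), TEMPLATES))  # prints "pinch"
```

A production system would replace the template matcher with a trained neural model and add per-user calibration, but the flow – raw signals, feature extraction, classification, then an action on the glasses – is the same.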

Why It Matters and What’s Next

You might be wondering, why all the buzz about AI glasses now? In short, the technology has matured and the use cases are clearer. By combining cameras, voice interface, and AI, smart glasses can do things your phone can’t do as conveniently. Imagine you’re a tourist walking in a foreign city – with AI glasses, you can get walking directions in your view, have street signs translated instantly, and capture photos of your journey without ever stopping to pull out a device. If you’re cooking in your kitchen, you could follow a recipe shown step-by-step in a corner of your vision. If you’re on a bike, you could see your speed and incoming text alerts without taking your eyes off the road. All of this keeps you “heads-up” and present in the real world, rather than staring down at a screen.

From a business perspective, major tech companies see smart glasses as the next big computing platform – what comes after the smartphone. Meta’s CEO explicitly said that glasses could become “the next major technology platform”, reshaping how we connect and get information. That’s why Meta, Google, Apple, Amazon, and others are investing heavily here. (For instance, Apple’s first step in AR is the Vision Pro headset, and Google has been quietly exploring its own AI glasses project.)

By focusing on everyday eyewear with Ray-Ban, Meta is trying to be the first to crack the mass-market formula. The Ray-Ban partnership in particular is a strategic advantage. It brings style credibility, access to thousands of retail outlets, and experience in eyewear manufacturing at scale. People are far more likely to wear a device that looks “normal” – and Ray-Ban knows how to make glasses people like.

For business decision-makers, the rise of AI glasses opens up new opportunities. As these devices become mainstream, companies can leverage them for hands-free productivity and training. Imagine your field technicians getting overlay instructions while fixing equipment, or surgeons viewing patient vitals on their eyewear during an operation. Even in retail or hospitality, smart glasses could assist workers by providing customer data or real-time translations. We’re not fully there yet, but the trajectory is clear: the line between the digital and physical world is blurring. Tech embedded in glasses means instant, context-aware data wherever you look.

At the same time, challenges remain. Privacy and security will continue to be hot topics – both for users (are the glasses securely handling your data?) and bystanders (are they being recorded?). Companies adopting AI glasses will need clear policies to address these concerns. Battery life is another practical limit; today’s Ray-Ban glasses last a few hours of active use, and adding a display will consume more power. And of course, there’s the question of social acceptance: will people feel comfortable interacting with someone wearing AI-enabled glasses? These are issues that the industry will navigate as the technology improves.

One thing is certain: AI glasses are no longer sci-fi. They are here today, on the faces of real customers, and evolving fast. Meta and Ray-Ban’s partnership has been at the forefront of this rise, proving that blending tech with fashion is the key to adoption. By continuously upping the capabilities – from capturing memories, to voice AI assistance, and now to visual HUD overlays – they are turning glasses into personal tech companions. If you value staying connected without being glued to a smartphone screen, keep an eye on this space (or rather, keep an eye through it). The humble eyeglasses are transforming into a powerful computing device that you might one day consider as essential as your phone.
