Will Google finally crack the market with its new AI-powered smart glasses?
Will Google finally see success with two new smart products being released in 2026?
It has been over a decade since Google Glass smart glasses were announced in 2013, followed by their swift withdrawal – in part because of low adoption. Their subsequent (and lesser known) second iteration was released in 2017 and aimed at the workplace. They were withdrawn in 2023.
In December 2025, Google made a new promise for smart glasses – with two new products to be released in 2026. But why have Google smart glasses struggled where others are succeeding? And will Google see success the third time around?
What is clear from developments in wearable tech over the last decade is that successful products are being built into things that people already like to wear: watches, rings, bracelets and glasses.
These are accessories that have emerged over centuries and are now accepted as normal in society.
Some of the most recent academic research is taking this approach, building sensors into jewellery that people would actually want to wear. Research has developed a scale to measure the social acceptability of wearable technology (the WEAR scale, or Wearable Acceptability Range), which includes questions like: “I think my peers would find this device acceptable to wear.”
Noreen Kelly, from Iowa State University, and colleagues showed that at its core, this scale measured two things: that the device helped people reach a goal (that made it worth wearing), and that it did not create social anxiety about privacy and being seen as rude.

This latter issue was highlighted most prominently by the term that emerged for Google Glass users: Glassholes. Although many studies have considered the potential benefits of smart glasses, from mental health to use in surgery, privacy concerns and other issues are ongoing for newer smart glasses.
All that said, “look and feel” keeps coming up as the most common concern for potential buyers. The most successful products have been designed to be desirable accessories first and smart devices second – typically, in fact, by designer brands.
A fine spectacle
After Google Glass, Snapchat released smart glasses called “Spectacles”, which had cameras built in, focused on fashion and were more easily accepted into society. The most prominent smart glasses today were released by Meta (Facebook’s parent company) in collaboration with designer brands like Ray-Ban and Oakley. Most of these products include front-facing cameras and conversational voice support from Meta AI.
So what should we expect from Google’s smart glasses in 2026? Google has promised two products: one that is audio only, and one that has “screens” shown on the lenses (like Google Glass).
The biggest change (based on the promotional videos) appears to be in form factor: from the futuristic, if somewhat unsettling and unfamiliar, design of Google Glass to something that looks much more like ordinary glasses.
Google’s announcement also emphasised the addition of AI (indeed, the company announced them as “AI Glasses” rather than smart glasses). The two types of product – audio-only AI glasses, and AI glasses with projections in the field of view – are not especially novel, however, even when combined with AI.

Meta’s Ray-Ban products are available in both modes and include voice interaction with Meta’s own AI. These have been more successful than, for example, the recent Humane AI Pin, which included front-facing cameras, other sensors, and voice support from an AI agent – the closest thing we’ve had so far to the Star Trek lapel communicator.
Direction of travel
The main directions of innovation here are likely to be, first, reducing the bulkiness of smart glasses, which have necessarily been chunky to house their electronics while still looking like normally proportioned glasses.
“Building glasses you’ll want to wear” is how Google phrases it, so we may see innovation from the company that simply improves the aesthetics of smart glasses, and it is also working with popular brand partners. Google additionally announced wired XR (extended reality) glasses, which are significantly smaller than the virtual reality headsets currently on the market.
Second, we can expect deeper integration with other Google products and services – Google has many more widely used products than Meta, including Google Search, Google Maps and Gmail. Its promotional material shows examples of Google Maps directions appearing in view in the AI glasses while walking through the streets.
Finally, and perhaps the biggest area of opportunity, is the inclusion of additional sensors, possibly integrating with Google’s other wearable health products – an area where many of its current ventures sit, including the introduction of its own smart rings.
Much research has focused on what can be sensed from common touchpoints on the head, including heart rate, body temperature, galvanic skin response (skin moistness, which changes with stress, for example) and even brain activity through EEG. With current advances in consumer neurotechnology, we could easily see smart glasses that use EEG to track brain data within the next few years.