Google’s Android XR platform, designed for the next generation of smart glasses and mixed-reality headsets, has been in development for some time.
While we’ve seen conceptual videos showing the potential of the technology, it wasn’t until TED2025 that we got a real glimpse of how it functions in a live demo.
During the conference, Google’s Shahram Izadi, with assistance from Nishtha Bhatia, demonstrated Android XR running on a pair of prototype smart glasses.
The live presentation went beyond the idealized use cases of earlier concept videos, showing practical, real-world touches such as seamless integration with a phone and support for prescription lenses.
The demonstration began with a simple task: Gemini, Google’s AI platform, was used to generate a haiku on demand.
But things quickly escalated. Nishtha asked the glasses-integrated Gemini for the title of a book on a shelf behind her, and within seconds it answered correctly, demonstrating the system’s real-time awareness of its surroundings.
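That book-title moment boils down to a multimodal query: a camera frame plus a natural-language question. Google hasn’t released an Android XR SDK for these glasses, but developers can approximate that kind of request today with the public Gemini API. The Python sketch below uses the google-generativeai package; the model name, file name, and prompts are illustrative placeholders, not anything from the demo.

```python
# Rough approximation of a "what's on that shelf?" query using the
# public Gemini API -- not the Android XR stack shown on stage.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_API_KEY")            # assumes you have a Gemini API key
model = genai.GenerativeModel("gemini-1.5-flash")  # any multimodal Gemini model works

# On the glasses this frame would come from the camera; here it's a saved image.
frame = Image.open("shelf_frame.jpg")

# Visual understanding: ask about something in the frame.
answer = model.generate_content(
    [frame, "What is the title of the white book on the shelf behind me?"]
)
print(answer.text)

# The same pattern covers translation-style requests.
translation = model.generate_content(
    [frame, "Translate any visible signage in this image into Farsi."]
)
print(translation.text)
```

The point of the sketch is the shape of the interaction, not the plumbing: a single image-plus-text request returns a grounded answer, which is presumably what the glasses do continuously with live camera frames.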
The demo also highlighted several key features:
- Visual Understanding: The glasses explained the contents of a diagram.
- Real-Time Translation: A sign was translated into English and then into Farsi (Persian) instantly.
- Seamless Multilingual Interaction: Nishtha spoke to Gemini in Hindi, and the AI responded accurately without any manual changes to the settings.
- Contextual Actions: When looking at a music album, Gemini identified it and offered to play a song.
- Navigation: A heads-up 3D map overlaid turn-by-turn directions in the wearer’s view, showing how the glasses could guide someone in real time.
These examples demonstrated the potential for a genuinely helpful, AI-driven assistant integrated directly into the user’s field of view. Though the hardware is still a prototype, the live demo provided the clearest picture yet of Google’s vision for ambient computing through eyewear.
The features, especially real-time awareness and translation, showcase how far the technology has come and offer an exciting glimpse into the future of smart glasses.