Google’s vision for AI-powered virtual assistants is taking shape with Project Astra, a multimodal assistant designed to integrate seamlessly into daily life.
While Astra remains in the prototype phase, recent developments suggest Google may pair it with smart glasses, moving closer to a hands-free, always-on AI experience.
During a press briefing ahead of the Gemini 2.0 launch, Bibo Xu from Google DeepMind revealed that a small group of testers would trial Astra using prototype glasses.
These testers are part of Google’s Trusted Tester program, known for getting early access to experimental tech. While Xu stopped short of confirming a public release, she hinted that more news about the glasses would be coming soon.
Why Smart Glasses?
Smart glasses are a natural fit for Astra’s capabilities. In a demo video, Astra was shown helping users with tasks like recalling security codes, checking the weather, and determining bus routes.
While these tasks are possible on a phone, combining audio, video, and a display in wearable glasses promises a more intuitive, hands-free interaction.
Google’s History with Smart Glasses
Google has a long history with wearable tech, from Google Glass to its Project Iris translator glasses. Now, with Astra, the company appears to be doubling down on smart glasses as the ideal form factor for its AI ambitions.
The Bigger Picture
Though still in early stages, Astra on smart glasses signals Google’s serious intent to redefine how users interact with AI.
While competitors such as Meta are exploring similar wearables, Google’s tight integration of its own assistant could deliver a more practical and immersive AI experience.
Whether these smart glasses will move beyond prototypes remains to be seen, but one thing is clear: Google is betting big on wearables as the future of AI.