Google’s Android XR is expanding its input methods by introducing a “Gesture navigation” system that incorporates the classic 3-button layout familiar to Android users.
When users raise their hand, a floating diamond container with a circular interface appears. A pinch gesture (thumb and forefinger) reveals a pill-shaped interface featuring the “Back” triangle, “Launcher” circle, and “Recents” square. Users can slide to the desired button and release to execute the action.
While this system mimics traditional 3-button navigation, it replaces physical button presses, which are impractical on a headset, with quick hand gestures.
Expanded Input and App Experiences
Android XR supports hand tracking, eye tracking, keyboard, mouse, and controller input, as well as voice commands through Gemini. An Auto Detect feature switches seamlessly between input methods.
Apps in Android XR can operate in two modes (see the sketch after this list):
- Home Space: Allows multitasking with a flat 2D UI where apps coexist.
- Full Space: Focuses on a single app or feature, such as 3D model viewing or immersive Google TV experiences.
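To make the distinction concrete, here is a minimal Compose sketch of an app asking the system to move between the two modes. It assumes the Jetpack SceneCore / Compose for XR developer-preview names (LocalSession, requestFullSpaceMode, requestHomeSpaceMode), which could change before a stable release.

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.material3.Button
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSession

// Sketch: let the user move this app between Home Space (multitasking,
// flat 2D panels alongside other apps) and Full Space (a single immersive app).
@Composable
fun SpaceModeSwitcher() {
    // The SceneCore session backing this activity; null outside an XR context.
    val session = LocalSession.current

    Column {
        Button(onClick = { session?.requestFullSpaceMode() }) {
            Text("Go immersive (Full Space)")
        }
        Button(onClick = { session?.requestHomeSpaceMode() }) {
            Text("Return to Home Space")
        }
    }
}
```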
Developers can create custom environments, such as a virtual home theatre, and apply spatial elevation to lift individual components toward the user for easier interaction. Navigation and action bars can also be pulled out into floating “orbiters” for more space-efficient designs, as in the sketch below.
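As a rough illustration of orbiters, this sketch floats a navigation bar beneath a spatial panel. It assumes the Compose for XR developer-preview API surface (Subspace, SpatialPanel, Orbiter, OrbiterEdge, EdgeOffset, SubspaceModifier); exact packages and signatures may differ in later releases.

```kotlin
import androidx.compose.material3.NavigationBar
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Alignment
import androidx.compose.ui.unit.dp
import androidx.xr.compose.spatial.EdgeOffset
import androidx.xr.compose.spatial.Orbiter
import androidx.xr.compose.spatial.OrbiterEdge
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.Subspace
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.width

// Sketch: main content lives in a spatial panel, while the navigation bar
// is pulled out into an orbiter floating just below the panel's bottom edge.
@Composable
fun OrbitingNavBarDemo() {
    Subspace {
        SpatialPanel(SubspaceModifier.width(1024.dp).height(640.dp)) {
            Text("Main app content goes here")

            // The orbiter renders outside the panel, keeping the panel
            // itself free for content.
            Orbiter(
                position = OrbiterEdge.Bottom,
                offset = EdgeOffset.inner(20.dp),
                alignment = Alignment.CenterHorizontally
            ) {
                NavigationBar { /* navigation items */ }
            }
        }
    }
}
```

Because the bar is anchored to the panel’s edge rather than drawn inside it, the space a docked bar would normally occupy stays available for content.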
Optimized User Interface
To prevent user fatigue, Android XR centers primary content within a 41-degree horizontal field of view and automatically adjusts UI size based on proximity. This ensures elements remain easy to interact with, regardless of their position.
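For a sense of scale, the 41-degree guideline is plain trigonometry: the width a horizontal field of view spans grows with viewing distance as 2 · d · tan(FOV/2). The helper below is illustrative geometry only, not an Android XR API.

```kotlin
import kotlin.math.tan

// Width subtended by a horizontal field of view at a given viewing distance:
// width = 2 * distance * tan(fov / 2).
fun comfortableWidthMeters(distanceMeters: Double, fovDegrees: Double = 41.0): Double {
    val halfAngleRadians = Math.toRadians(fovDegrees / 2.0)
    return 2.0 * distanceMeters * tan(halfAngleRadians)
}

fun main() {
    // A panel roughly 1.1 m wide fills the recommended 41° field at 1.5 m away.
    println(comfortableWidthMeters(1.5))  // ~1.12
}
```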
Android XR’s combination of familiar gestures and innovative features aims to redefine navigation and multitasking in extended reality.