Apple has unveiled new accessibility features for its Vision Pro headset that could effectively turn the device into a visual aid. The updates, expected to arrive in visionOS later this year, aim to help users navigate both the real world and digital environments using machine learning and the headset's main camera.
One of the key features is magnification, which lets users zoom in on real-world objects and virtual content alike. In a demonstration, Apple showed a Vision Pro wearer reading a recipe book up close and then transitioning seamlessly to the Reminders app, which also appeared magnified for easier reading.
The feature mimics a familiar smartphone function but works hands-free, which is particularly helpful for users who need both hands for a task.
Alongside magnification, Apple's VoiceOver screen reader will be enhanced in visionOS. On the headset, it will be able to describe surroundings, find objects, read documents, and provide other visual assistance in real time, all without requiring the user's hands.
Apple will also release an API that gives approved developers access to the Vision Pro's camera for building accessibility apps. One potential use case is live, person-to-person assistance through apps like Be My Eyes, which connects blind and low-vision users with sighted helpers who can interpret their surroundings.
This opens the door to real-time, hands-free interaction with the environment, an especially valuable capability for users with visual impairments.
Looking ahead, these features could extend to other Apple devices, such as camera-equipped AirPods or smart glasses, building a more connected and accessible ecosystem for users.
In addition to these updates, Apple is introducing a brain-computer interface (BCI) protocol that works with Switch Control on visionOS, iOS, and iPadOS. Switch Control already supports alternative input methods, such as controlling a device through head movements tracked by an iPhone camera, and the new protocol adds BCIs as another class of input.
Apple developed the new BCI standard with Synchron, a company that makes brain implants. While the current technology doesn't yet support cursor-style pointer movement, it lays the groundwork for future advances in brain-interface controls.
These updates highlight Apple’s commitment to enhancing accessibility and creating more inclusive experiences across its products.