Electronics & Gadgets

Apple Will Let the Vision Pro ‘See’ for You With New Accessibility Features

Apple Vision Pro Headset

Apple has unveiled new accessibility features for its Vision Pro headset that could turn the device into a visual proxy for its wearer. The updates, expected to launch in visionOS later this year, aim to help users navigate both the physical world and digital environments using machine learning and the device's main camera.

One of the key features is magnification, which lets users zoom in on real-world objects and virtual items alike. In a demonstration, Apple showed a Vision Pro wearer reading a recipe book up close, then transitioning seamlessly to the Reminders app, which is also magnified for easier reading.

This mirrors a familiar smartphone feature but works hands-free, which is particularly helpful when both hands are occupied with a task.

Alongside magnification, Apple's VoiceOver accessibility feature will be enhanced in visionOS. The headset will be able to describe surroundings, find objects, read documents, and provide other visual assistance in real time, all without requiring the user's hands.

Apple will also release an API for approved developers, allowing them to create accessibility apps for the Vision Pro that use its camera. A potential use case for this API could be live, person-to-person assistance through apps like Be My Eyes, which helps users interpret their surroundings visually.

This feature opens up possibilities for real-time, hands-free interaction with the environment, an especially valuable tool for users with visual impairments.

Looking ahead, these features could extend to other Apple devices, such as camera-equipped AirPods or smart glasses, building a more connected and accessible ecosystem for users.

In addition to these updates, Apple is also introducing a brain-computer interface (BCI) protocol that supports Switch Control on visionOS, iOS, and iPadOS. This protocol will enable alternative input methods, such as controlling devices through head movements tracked by an iPhone camera.

Apple worked with Synchron, a brain-implant company, to develop the new BCI standard. Although the current technology doesn't support mouse-like cursor movement, it lays the groundwork for future advances in brain-interface controls.

These updates highlight Apple’s commitment to enhancing accessibility and creating more inclusive experiences across its products.

Written by
Sazid Kabir


