Today, Apple revealed new accessibility enhancements set to debut later this year in its upcoming software updates, including iOS 18, iPadOS 18, and visionOS 2. Once more, the Cupertino tech giant has demonstrated its commitment to accessibility, announcing a range of useful tools aimed at assisting users with disabilities.
Apple previewed numerous features for the iPhone and iPad, but the standout among them is Eye Tracking: one of the most remarkable capabilities of Apple Vision Pro is making its way to iPhone and iPad.
Powered by artificial intelligence, Eye Tracking will enable iPhone and iPad users to navigate their devices with their eyes alone, much as they can on Vision Pro. Designed for users with physical disabilities, the feature gives them full control of their device through eye movements.
The feature uses the front-facing camera to identify the element the user is focusing on. Users can simply gaze at a button to highlight it, then hold their gaze for a brief moment to select it. Because Eye Tracking relies on on-device machine learning, all the data used to set it up and control it stays securely on the device and is never shared, not even with Apple.
Notably, Eye Tracking works across iOS and iPadOS apps and requires no additional hardware or accessories. With Dwell Control, users can activate any element and access additional functions such as swipes, taps, physical button presses, and other gestures, all controlled with their eyes alone.
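Apple has not published a third-party API for Eye Tracking, but the dwell mechanic itself is simple to reason about. Here is a minimal, hypothetical Swift sketch of dwell-based selection, assuming some upstream source (such as the front camera) already delivers gaze points in screen coordinates; the DwellSelector type and everything in it are illustrative, not Apple's API.

```swift
import UIKit

/// Hypothetical sketch: selects a UI element once the user's gaze has
/// rested on it long enough. Assumes gaze points arrive from elsewhere.
final class DwellSelector {
    private let dwellDuration: TimeInterval = 1.0  // hold gaze this long to "tap"
    private var currentTarget: UIView?
    private var dwellStart: Date?

    /// Feed each new gaze point (in window coordinates) into the selector.
    func update(gazePoint: CGPoint, in window: UIWindow) {
        let target = window.hitTest(gazePoint, with: nil)

        if target !== currentTarget {
            // Gaze moved to a different element: restart the dwell timer.
            currentTarget = target
            dwellStart = Date()
            return
        }

        // Gaze has rested on the same control long enough: trigger a tap.
        if let start = dwellStart,
           Date().timeIntervalSince(start) >= dwellDuration,
           let control = target as? UIControl {
            control.sendActions(for: .touchUpInside)
            dwellStart = nil  // require the gaze to move away before re-triggering
        }
    }
}
```

A production version would also need gaze smoothing and visible highlighting of the focused element, both of which Apple handles at the system level.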
We believe deeply in the transformative power of innovation to enrich lives. That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.
Tim Cook, Apple CEO
In addition to Eye Tracking, Apple introduces Music Haptics, enabling users who are deaf or hard of hearing to experience music on their iPhone alongside everyone else. With Music Haptics enabled, the Taptic Engine within the iPhone generates taps, textures, and nuanced vibrations synchronized with the audio of the music, enhancing the overall listening experience.
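Apple's own Music Haptics pipeline is not public, but the Taptic Engine it drives is programmable through the Core Haptics framework. The sketch below shows the general idea of synchronizing haptic taps with music, assuming beat timestamps and intensities have already been extracted from the track; that analysis step is the hypothetical part here.

```swift
import CoreHaptics

/// Sketch: play one sharp haptic transient per beat of a track.
/// The beat data is assumed to come from some upstream audio analysis.
func playBeatHaptics(beats: [(time: TimeInterval, intensity: Float)]) throws {
    // Not all devices have a Taptic Engine (e.g. most iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One transient tap per beat, scheduled at the beat's timestamp.
    let events = beats.map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: beat.intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: beat.time)
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Apple's system-level implementation goes further, weaving continuous textures and vibrations into the audio rather than just discrete taps.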
Additionally, Apple introduces Vocal Shortcuts, which lets iPhone and iPad users assign custom phrases that Siri can recognize to launch shortcuts and complete complex tasks. Apple also unveils Vehicle Motion Cues, a feature designed to reduce motion sickness for passengers in a moving vehicle: animated dots on the edges of the screen represent changes in the vehicle's motion, easing the sensory conflict between what a passenger sees and feels without distracting from the content on screen.
The CarPlay update coming later this year also brings a host of new accessibility features. Voice Control enables hands-free navigation of CarPlay and control of its apps through voice commands. For drivers or passengers who are deaf or hard of hearing, Sound Recognition can be enabled to display alerts for car horns and sirens. And for users who are colorblind, Color Filters make the CarPlay interface visually easier to use, alongside additional visual accessibility options such as Bold Text and Large Text.
Beyond iOS and iPadOS, Apple also previewed upcoming accessibility features for visionOS. These include system-wide Live Captions, Reduce Transparency, Dim Flashing Lights, Smart Invert, and various other enhancements.
The first preview of iOS 18, iPadOS 18, and visionOS 2 will arrive at Apple's WWDC 2024 keynote, which kicks off on June 10.