Apple’s new Music Haptics feature will make it possible to ‘feel’ the music on your iPhone

Apple Music Haptics.

Apple has just announced a slew of new accessibility features for several of its products, and one of them could change the way that deaf and hard of hearing people experience music on the iPhone.

Known as Music Haptics, the feature uses the iPhone’s Taptic Engine to play “taps, textures, and refined vibrations to the audio of the music,” giving people an opportunity to enjoy their favourite songs in a different way.

Naturally, this will work in Apple Music, but Apple is also releasing an API for developers so that it can be implemented in their own apps. Presumably, this means that those who create music-making software will be able to add it to their products, too.
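Apple hasn’t published details of the Music Haptics API yet, but for a rough sense of what haptic playback looks like in code, here’s a minimal sketch using the existing Core Haptics framework, which already lets apps drive the Taptic Engine with the kind of transient “taps” the feature describes. The class name and parameter values here are illustrative only, not Apple’s new API.

```swift
import CoreHaptics

// Illustrative sketch only: Music Haptics' developer API hasn't been
// detailed yet, so this uses Apple's existing Core Haptics framework.
final class TapPlayer {
    private var engine: CHHapticEngine?

    init() throws {
        // The Taptic Engine isn't present on all hardware, so check first.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays a single sharp "tap" at the given intensity (0...1),
    /// the sort of event an app might fire in time with the music.
    func tap(intensity: Float) throws {
        let event = CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: 0
        )
        let pattern = try CHHapticPattern(events: [event], parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}
```

In practice, an app would schedule many such events (or continuous “texture” events) against the song’s timeline, which is presumably the kind of work the new API will handle automatically.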

Elsewhere, perhaps the most eye-catching - quite literally - of the new accessibility features is Eye Tracking for iPad and iPhone, which will enable those with physical disabilities to operate their device using only their eyes. Powered by AI and on-device machine learning, the feature uses the front-facing camera to track eye movement across the screen, while Dwell Control activates each element when the user’s gaze rests on it. Eye control of physical buttons, swipes and other gestures is also in the offing.

Other accessibility additions include Vocal Shortcuts, Vehicle Motion Cues to help reduce motion sickness, and Voice Control for CarPlay. There are also specific features for Apple Vision Pro.

“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO. “That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”

Find out more about the new accessibility features in the Apple Newsroom. They’ll be rolling out later this year.