Apple has announced a major upgrade to Personal Voice, its voice-cloning feature, coming in iOS 19, which will let users create a digital version of their own voice in under a minute using just 10 spoken phrases.
First introduced in iOS 17 as an accessibility tool for people at risk of losing their ability to speak, the feature is part of a broader set of updates revealed ahead of Global Accessibility Awareness Day, which takes place on 15 May.
The tech giant says voices will sound even more natural and will take significantly less time to set up, thanks to improvements in on-device machine learning. Previously, users had to record around 15 minutes of speech to create a personalised voice on their iPhone.
Apple will also be introducing something it calls Accessibility Nutrition Labels. These will be displayed alongside apps on the App Store, giving users a clearer picture of which accessibility features an app supports, including Apple’s VoiceOver screen reader, voice control and captions.
Real-time live captions will also be coming to the Apple Watch for the first time. When paired with Live Listen, a feature that turns your iPhone into a remote mic, the captions will show up right on the Watch screen, letting you read what’s being said around you without having to pull out your phone.
Another neat update coming in iOS 19 is to sound recognition in CarPlay. As well as alerting you to things like sirens and horns, it’ll now ping you if it hears a baby crying in the back seat. CarPlay is also getting a large text mode to make things easier to read on screen.
There are lots of updates coming to iOS 19 for users with low vision or reading difficulties. Apple will be rolling out a new feature called Accessibility Reader — a simplified reading mode that works across iPhones, iPads, Macs and the Apple Vision Pro. It will let users tweak fonts, colours and spacing, with spoken content support built in.
Additionally, the Magnifier app will be coming to macOS for the first time. Users will be able to connect it to a webcam or an iPhone camera to zoom in on physical objects, documents or whiteboards.
Braille Access is another new feature coming to iOS 19 for blind and visually impaired users, essentially turning Apple devices into fully fledged braille note-takers. It can also transcribe real-time conversations into braille on a connected braille display.
Other accessibility updates coming to iOS 19 include a new feature that will notify users when someone nearby says their name – a small but potentially helpful tool for people who are deaf or hard of hearing. There will also be new ways to personalise background sounds such as ocean waves or rain, which can help people focus or drown out distractions. Users will also be able to fine-tune music played through haptics, with options to feel just the vocals or the entire track. There’s also a simplified version of the Apple TV app rolling out, designed to be easier to navigate.
All of these features are due to roll out later this year with iOS 19, iPadOS 19 and the corresponding updates to macOS, watchOS and visionOS.