What New Features Are Coming To iOS 17?

iOS 17 has finally been released to download and, if you own an iPhone, there’s a lot to dig into. “NameDrop” lets you exchange contact info just by bumping devices. “Check In” makes it easy to get live updates on whether or not your friends made it home safely. FaceTime now lets callers leave a video voicemail message if their call goes unanswered (that feature doesn’t have a snazzy Apple marketing name). There’s also “Personal Voice” — a new iOS 17 feature that uses AI to create a copy of your voice and store it on your phone.

iOS 17 AI voice clone on iPhone

Personal Voice: The iOS 17 Feature That Clones Your Voice With AI

How is this possible? Apple’s setup process asks you to read 150 phrases aloud. With the help of AI, those recordings are enough for your phone to build a synthetic version of your voice that can say things you never recorded. Personal Voice can then speak for you on calls (via iOS 17’s companion Live Speech feature, which reads out whatever you type) or in assistive communication apps.
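That last part extends to developers, too: a third-party app can synthesize speech with your Personal Voice through Apple’s AVFoundation speech APIs, but only after you explicitly grant that app access. Below is a rough Swift sketch of how the flow looks on iOS 17; the authorization call and the .isPersonalVoice voice trait come from Apple’s published speech-synthesis framework as we understand it, so treat this as an illustration rather than drop-in code.

```swift
import AVFoundation

// Ask the user for permission to use their Personal Voice in this app (iOS 17+).
// They must have already created a Personal Voice in Settings and allowed apps
// to request access to it.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else {
        print("Personal Voice not available: \(status)")
        return
    }

    // Personal voices are listed alongside the built-in system voices;
    // the .isPersonalVoice trait is what sets them apart.
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }

    guard let myVoice = personalVoices.first else {
        print("No Personal Voice found on this device.")
        return
    }

    // Speak a phrase the user never recorded, in their synthesized voice.
    // (In a real app, keep a strong reference to the synthesizer so the
    // speech isn't cut off when it goes out of scope.)
    let utterance = AVSpeechUtterance(string: "I'll be home in twenty minutes.")
    utterance.voice = myVoice
    AVSpeechSynthesizer().speak(utterance)
}
```

The detail worth holding onto for the privacy discussion below: an authorized app never sees the recordings you made during setup or the voice model itself. It can only ask the system to generate speech with the finished voice, which is why that per-app permission prompt matters.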

That recording session is reminiscent of how virtual assistant voices are created. Voice actress Susan Bennett recorded a range of words and phrases for four months to create the voice of Siri. The range was wide enough that Apple could later make her say phrases she hadn’t even recorded. Fortunately for iPhone owners, Personal Voice requires far less time to set up: Apple says about 15 minutes of reading.

How Do I Enable Personal Voice in iOS 17?

Once you’ve updated your iPhone to iOS 17, you can access the Personal Voice option in the Settings app under “Accessibility.”

After you tap “Accessibility,” scroll down to the “SPEECH” section, select “Personal Voice,” and then tap “Create a Personal Voice” to start recording.

iOS 17's Personal Voice feature settings

Is Your Voice Data Safe On iOS 17? What Could Go Wrong?

According to Apple, the Personal Voice feature in iOS 17 uses on-device processing to create a copy of your voice. Which makes it safe, right? We asked our resident privacy expert Misha Rykov of *Privacy Not Included fame. “If the voice sits on-device only, I feel neutral about it,” says Misha. “This is far less creepy than lots of other technologies that collect voice snippets and process them on the cloud and maybe sell them.”

The clone is created on-device, but Apple does offer the option to sync your voice data across devices through your iCloud account. This means a copy of your voice could touch the cloud in some way. The iCloud “Share Across Devices” option is set to “On” by default (shown in the photo above), so you may want to turn it off before creating your voice twin if you’re worried about that info being stored in the cloud.

Apple is known for being the big tech company focused on user privacy, but that doesn’t make its record spotless. Misha points to times that hackers tricked Apple into handing over user data or compromised users’ iPhones. Apple aside, voice-based scams are on the rise. The Federal Trade Commission in the U.S. warns folks to watch out for voice-clone scam calls, in which scammers imitate the voice of a loved one. If you get a call from a panicked family member asking for a sudden wire transfer, gift card or cryptocurrency, there’s a chance they may not be your relative at all. (The FTC recommends hanging up and calling the relative yourself, just to be sure.)

So, if you do create a voice clone, is it safe living on your phone? Keeping your voice data off Apple’s cloud and protecting your phone with a passcode are two good safeguards, but they don’t make your vocal clone impervious to hacks. Misha, for one, remains wary of the feature for most users. “All that said, I would never use this feature unless I have a medical condition that will make my voice too weak to be used.”


Written By: Xavier Harding

Edited By: Audrey Hingle, Innocent Nwani, Kevin Zawacki, Xavier Harding

Art By: Shannon Zepeda

