Apple's next system update will let you "AI yourself"


On Tuesday, local time, Apple showed off a set of accessibility features that could be introduced in a major system update later this year.


iOS 17 is expected to include a feature called "Personal Voice" that lets users synthesize their own voice on iPhone and iPad for in-person conversations, phone calls, and FaceTime video and audio calls.



Apple says the feature is aimed at users who are at risk of losing their ability to speak. Users can also save commonly used phrases so they can quickly chime in when communicating with others.


Users can reportedly create their Personal Voice by recording 15 minutes of audio on the device. Apple said the feature will use on-device machine learning to keep the data private.


In addition to Personal Voice, Apple is bringing other assistive features to its iOS devices.


One example is "Assistive Access," which helps users with cognitive disabilities and their caregivers use iOS devices more easily.


Apple also announced a Magnifier feature called "Point and Speak" that helps users who are blind or have low vision by reading aloud the text on objects they point to.


Apple typically releases beta versions of its software at the Worldwide Developers Conference (WWDC), which means these features become available first to developers and to members of the public who opt in. Apple usually keeps the software in beta through the summer, then officially rolls the new features out to the public in the fall, when the new iPhone goes on sale.


Apple's WWDC 2023 conference will open on June 5, and analysts expect the company to unveil its first VR headset as well as other hardware and software.

