JAKARTA – Apple has announced two new accessibility features coming to iOS. Although the company did not specify an operating system version, the features are expected to arrive in iOS 18 by the end of the year. The two features are Eye Tracking and Music Haptics. Eye Tracking will also come to iPadOS, and as with iOS, Apple has not said which iPadOS version will receive it.

Eye Tracking is an Artificial Intelligence (AI)-based feature that lets users navigate their device with eye movements. It relies on the front camera of the iPhone and iPad and is designed for people with physical disabilities. "Eye Tracking uses the front camera to set up and calibrate in seconds," Apple said. "With on-device machine learning, all the data used to set up and control this feature is kept securely on the device."

Apple added that the feature requires no additional hardware or accessories. Simply by updating the operating system (OS), iPhone and iPad users will be able to activate buttons, swipe through screens, and perform other actions using only their eyes.
Like Eye Tracking, Music Haptics is designed to help people with disabilities, in this case users who are deaf or hard of hearing. When the feature is activated on an iPhone, they can experience songs through the sense of touch: Music Haptics plays subtle taps, vibrations, and textures in sync with the music's audio, so that those who cannot hear can still enjoy music. The feature will work across millions of songs in the Apple Music catalog. Beyond Apple Music, the company will also offer the feature as an API for developers, in the hope that other iOS apps can adopt it and make music more accessible to deaf users.