JAKARTA: Apple has introduced a new child safety feature in the iMessage app. With this feature, underage users can file a report if they receive nude photos or videos.
According to a report from The Guardian, the new feature is part of Apple's Communication Safety. The company uses an on-device scanning system to detect nude photos or videos received via Messages or AirDrop.
Photos and videos containing nudity are blurred automatically. In addition, Apple displays a pop-up with several options, such as messaging an adult, asking for help, or blocking the contact.
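Apple has not published the internals of this scanning pipeline, but iOS does expose the same kind of on-device nudity detection to third-party apps through the public SensitiveContentAnalysis framework (iOS 17 and later). The sketch below is illustrative only: shouldBlur is a hypothetical helper, and it shows the general technique rather than Apple's own implementation.

```swift
import SensitiveContentAnalysis

// Hypothetical helper: decides whether an incoming image should be
// blurred, using Apple's public on-device sensitivity analyzer.
// Requires the com.apple.developer.sensitivecontentanalysis.client
// entitlement and iOS 17 or later.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The analyzer only runs if sensitive content warnings have been
    // enabled on the device (by the user, or by a parent via Screen Time).
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis happens entirely on-device; the image is not
        // uploaded anywhere.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // On failure this sketch defaults to not blurring; a real
        // messaging app might choose the opposite default.
        return false
    }
}
```

A messaging client could call this before rendering a received attachment and, if it returns true, show a blurred placeholder with the kind of intervention options described above.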
The reporting feature is currently being tested in Australia through the iOS 18.2 beta. When a user reports nude photos or videos they have received, Apple obtains contact information for both the person filing the report and the person being reported.
Once Apple receives a report, the company will review it and take action. Apple could restrict the reported account from sending messages on iMessage, or refer the matter to law enforcement.
The feature is reportedly set to expand globally, although it is not yet known when. It may roll out to all countries next year, once iOS 18.2 ships as a stable release.
This is not the first child safety feature Apple has worked on. In 2021, the company announced a feature that would scan iCloud Photos libraries for child sexual abuse material; if a harmful photo were found, parents would receive a report from Apple.