
Apple explains child pornography detection: it only checks photos

by Joseph Richard

Apple revealed that turning off iCloud Photos synchronization is enough to disable the detection mechanism. Only child pornography in the form of photographs, not videos, will be detected.

Control of sexually explicit media in iMessage will be available exclusively to parents of underage children within a family group, and it will be an opt-in feature.

Apple has published more details about its announced content controls in the fight against child pornography. It emphasized that photos will be checked only when they are uploaded to iCloud, and that media sent through iMessage will be screened only for underage users, and only when a parent has activated the feature for them, as reported by 9to5Mac.

First of all, we learned that the child pornography detection will work exclusively with photos: only still images, not videos, will be checked. However, Apple does not rule out expanding these mechanisms in the future.

As mentioned, the company also emphasized that only photos uploaded to iCloud will be checked, not all media stored locally on the device. It claims that if the user turns off iCloud Photos synchronization, no detection software will run on the device at all.

Other interesting details come from a new document published by the company, which 9to5Mac highlighted. It covers how the separate detection of sexually explicit content in the Messages app (iMessage) works.

Although Apple announced this feature to the public at the same time as the iCloud child pornography detection, the two are distinct technologies that work differently. When uploading to iCloud, devices look for child pornography by matching against a central database of known material; in iMessage, they flag any sexually explicit content.
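The iCloud-side check is, at its core, a lookup against a database of fingerprints of known images. Apple's actual system uses a perceptual hash (NeuralHash) with privacy-preserving matching, none of which is reproduced here; as a simplified illustration of the general database-matching idea only, a plain cryptographic-hash lookup might look like this (the hash set and function names are hypothetical):

```python
import hashlib

# Hypothetical set of digests of known flagged images (hex strings).
# Apple's real system uses perceptual hashes rather than SHA-256,
# so that visually similar images still match; this is a simplification.
KNOWN_HASHES = {
    "2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824",
}

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_database(data: bytes) -> bool:
    """Check an image's digest against the known-hash set."""
    return digest(data) in KNOWN_HASHES
```

Unlike this exact-match sketch, a perceptual hash tolerates small changes to an image (resizing, recompression), and Apple's design additionally requires a threshold number of matches before anything is flagged for review.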

Most importantly, the checking and blurring of sexual content in iMessage will only work for underage users. The feature will be an optional part of the Family Sharing package. An adult will first need to identify the child's Apple ID account as belonging to their minor child, and their own account as the parent.

Apple has also set an age limit here. Parents will receive notifications about sexually explicit images in their child's iMessage only for children up to the age of 12. Before sending such an image, the child is explicitly warned that their parent will immediately be notified.


