Apple Announces It Will Scan Users' iPhone and iCloud Photos for Signs of Child Abuse


Apple recently made an announcement that got more than a little attention in the photography press, but one that is nonetheless a long time coming when you survey the general tech landscape on this issue.

Apple Center
Photo by zhang kaiyv

Apple has announced a slew of new child protections for its devices, in line with the general movement in tech toward limiting child abuse and children's exposure to adult content and contact. None of that is really controversial; rather, the method is what is making headlines.

In Apple’s own words:

“Another important concern is the spread of Child Sexual Abuse Material (CSAM) online. CSAM refers to content that depicts sexually explicit activities involving a child.

To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images stored in iCloud Photos. This will enable Apple to report these instances to the National Center for Missing and Exploited Children (NCMEC). NCMEC acts as a comprehensive reporting center for CSAM and works in collaboration with law enforcement agencies across the United States.”

And even though Apple goes on to explain that user privacy will be kept in mind during this process, and outlines specifically how that right is protected, some people just aren't down with the idea of Apple reviewing their photos, even if only at the algorithmic level.
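In plain terms, the system isn't "looking at" photos the way a person would; it compares a fingerprint of each image against fingerprints of already-known CSAM supplied by NCMEC. The sketch below is purely our own illustration of that matching idea: Apple's real system uses a perceptual hash called NeuralHash and on-device blinded matching, whereas this toy version uses an ordinary file hash and made-up names like KNOWN_HASHES and is_known_match.

```python
# Illustrative sketch only, not Apple's NeuralHash system.
# An ordinary file hash stands in for the perceptual fingerprint.
import hashlib

# Hypothetical database of fingerprints of known CSAM images (e.g. from NCMEC).
KNOWN_HASHES = {
    "placeholder_fingerprint_1",
    "placeholder_fingerprint_2",
}

def fingerprint(path: str) -> str:
    """Return a hex digest of the file's bytes (stand-in for a perceptual hash)."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def is_known_match(path: str) -> bool:
    """True only when a photo's fingerprint is already in the known database.
    A brand-new photo (your kids at the beach, a client shoot) never matches,
    because its fingerprint isn't on the list."""
    return fingerprint(path) in KNOWN_HASHES
```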

The company also says the chance of a failure or an incorrect match is vanishingly small.

“Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account,” Apple writes. 
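That "threshold secret sharing" phrase has a concrete meaning in cryptography. The best-known version is Shamir's scheme, in which a secret is split into shares and the original value is mathematically unrecoverable until some minimum number of shares is combined. The snippet below is a bare-bones Shamir sketch we put together to illustrate that single property; Apple's actual protocol wraps the idea inside "safety vouchers" and other machinery, and every name and number here is our own illustration rather than anything from Apple.

```python
# A bare-bones illustration of the threshold idea (Shamir's secret sharing).
# This is NOT Apple's implementation; it only shows why nothing can be read
# until enough shares (matching photos, in Apple's design) accumulate.
import random

PRIME = 2**127 - 1  # a large prime; all arithmetic happens modulo this value

def make_shares(secret: int, threshold: int, count: int):
    """Split `secret` into `count` shares; any `threshold` of them recover it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def poly(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, poly(x)) for x in range(1, count + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0. With at least `threshold` shares this
    returns the secret; with fewer, it returns a meaningless value instead."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

shares = make_shares(secret=42, threshold=3, count=5)
print(recover(shares[:2]))  # below the threshold: gibberish, not 42
print(recover(shares[:3]))  # at or above the threshold: 42
```

The takeaway is that a single match, or even a couple, hands Apple nothing readable; only once an account crosses the threshold Apple describes can the flagged content be decrypted and reviewed.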

Other safety measures include a warning when sending or receiving sexually explicit content via text message, with the option for parents to be alerted when such activity takes place, as well as protections in Siri and Search that warn users that the queried topic is "harmful and problematic" before providing the user with "resources from partners to get help with this issue."

You can read Apple's post about the issue in full right here.

As we said, this is a general trend across the tech industry, and we have covered Facebook's efforts as far as Messenger and Instagram are concerned. Then again, we have also covered that company's seemingly tone-deaf desire to launch a Messenger for Kids app, so there isn't a uniform approach out there, but there is nonetheless a need for something to be done.

What do you think of Apple’s new “Expanded Protections for Children”? Let us know your thoughts in the comments below.

Don’t forget to check out our other photography news at this link right here.

[Apple]

About Author

Kehl has been our staff photography news writer since 2017 and has over a decade of experience in online media and publishing. You can get to know him better here and follow him on Insta.
