Apple designs tools to flag child sexual abuse material

Image: Apple.

Apple has revealed a series of new tools for the iPhone aimed at catching child sexual abuse material.

The tools, which will be rolled out later this year, are designed to identify images of child sexual abuse uploaded to Apple’s iCloud storage system.

Apple’s tools rely on image hashing, in which software boils a photograph down to a set of numbers. The iPhone’s operating system will store a database of hashes of known child sexual abuse material and check the hashes of a user’s images against that database for matches.
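
To illustrate the general idea of hash matching, here is a minimal, hypothetical Swift sketch using SHA-256 as a stand-in for the hash function. This is not Apple’s implementation; the names and the sample database entry below are invented for illustration:

```swift
import CryptoKit
import Foundation

// Hypothetical database of hashes of known material. The entry below is
// the SHA-256 of empty data, included only so this demo finds a match.
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
]

// Boil an image's bytes down to a fixed set of numbers (a hex digest).
func hexDigest(of data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// Check an image's hash against the database of known hashes.
func matchesKnownMaterial(_ imageData: Data) -> Bool {
    knownHashes.contains(hexDigest(of: imageData))
}

let photo = Data() // placeholder for an uploaded photo's bytes
print(matchesKnownMaterial(photo)) // prints "true" for this placeholder
```

One design note: an exact cryptographic hash such as SHA-256 changes completely if a single pixel changes, so systems like Apple’s use a perceptual hash instead, which lets resized or re-encoded copies of the same image still match.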

Apple said that once a set number of matches is reached, the photos would be shown to an Apple employee to confirm that they are images of child sexual abuse. If they are, they would be forwarded to the National Center for Missing & Exploited Children and the user’s iCloud account would be locked.
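
That threshold step can be pictured as a simple gate. The sketch below is hypothetical; the type, names, and the value 30 are invented for illustration, as Apple has not published these details here:

```swift
// Hypothetical gate: nothing is escalated for human review until a set
// number of hash matches has accumulated on an account.
struct MatchGate {
    let threshold: Int   // number of matches required before escalation
    var matchCount = 0   // matches seen so far

    // Record one hash match; return true only once the threshold is reached.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= threshold
    }
}

var gate = MatchGate(threshold: 30) // illustrative value only
for _ in 1...30 {
    if gate.recordMatch() {
        print("Threshold reached: escalate to human review")
    }
}
```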

“If you’re storing a collection of C.S.A.M. [child sexual abuse] material, yes, this is bad for you,” said Erik Neuenschwander, Apple’s privacy chief. “But for the rest of you, this is no different.”

Parents will also have access to a feature that flags when their children send or receive nude photos in text messages.

The company says the tool would run on the child’s device, analysing every photo sent or received in a text message to check whether it contains nudity. A nude photo sent to a child would be blurred, and the child would have to choose whether to view it. If a child under 13 sends or views a nude image, their parents would be notified.

Apple stressed that it would never see or find out about nude images exchanged in a child’s messages – the notifications would only be sent to parents.

Privacy campaigners have raised concerns over the implications of the new features. In particular, concern has focused on Apple’s ability to flag certain content while still maintaining encryption.

Apple has previously refused certain data requests from law enforcement and governments, arguing that encryption prevents it from retrieving the data.


Story source: The New York Times

 