Apple criticised for system that detects child abuse

Image caption: Apple announced the new technology on Thursday (Apple logo; image source: Reuters)

Apple is facing criticism over a new system that finds child sexual abuse material (CSAM) on US users' devices.

The technology will search for matches of known CSAM before an image is uploaded to iCloud Photos.
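As a rough illustration of the matching step described above - not Apple's actual implementation, which uses a perceptual "NeuralHash" and cryptographic protections rather than a plain hash lookup - the on-device check can be thought of as comparing a fingerprint of each photo against a database of fingerprints of known material:

```python
import hashlib

# Hypothetical database of fingerprints of known CSAM supplied by child-safety
# organisations; the values below are placeholders for illustration only.
KNOWN_FINGERPRINTS = {"placeholder-fingerprint-1", "placeholder-fingerprint-2"}

def matches_known_material(image_bytes: bytes) -> bool:
    """Fingerprint the image and check it against the known database
    before the photo is uploaded to iCloud Photos."""
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in KNOWN_FINGERPRINTS
```

Crucially, a check like this only identifies exact or near-exact copies of images already known to authorities; it does not analyse what is in a user's own photos.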

But there are concerns that the technology could be expanded and used by authoritarian governments to spy on their own citizens.

WhatsApp head Will Cathcart called Apple's move "very concerning".

Apple said that new versions of iOS and iPadOS - due to be released later this year - will have "new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy".

The system will report a match, which will then be reviewed by a human. Apple can then disable the user's account and report it to law enforcement.

The company says that the new technology offers "significant" privacy benefits over existing techniques - as Apple only learns about users' photos if they have a collection of known child sex abuse material in their iCloud account.
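To illustrate the "collection" condition Apple describes, the review step would only be triggered once an account accumulates multiple matches. The threshold value below is purely an assumption for illustration; the article does not state a figure.

```python
# Hypothetical threshold: the article only says a "collection" of matches is
# needed before Apple learns anything about an account; the exact number here
# is an assumption for illustration.
REVIEW_THRESHOLD = 30

def should_flag_for_human_review(match_count: int) -> bool:
    """Surface an account for manual review only once the number of matched
    images crosses the threshold; isolated matches reveal nothing to Apple."""
    return match_count >= REVIEW_THRESHOLD
```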

But WhatsApp's Mr Cathcart says the system "could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions on what is acceptable".


He argues that WhatsApp's system to tackle child sexual abuse material has reported more than 400,000 cases to the US National Center for Missing and Exploited Children without breaking encryption.

The Electronic Frontier Foundation, a digital rights group, has also criticised the move, labelling it "a fully-built system just waiting for external pressure to make the slightest change".

But some politicians have welcomed Apple's development.

Sajid Javid, UK Health Secretary, said it was time for others, especially Facebook, to follow suit.

US Senator Richard Blumenthal also praised Apple's move, calling it a "welcome, innovative and bold step".

"This shows that we can protect children and our fundamental privacy rights, external," he added.

Facebook and Apple don't like each other.

That dislike has come to a head in recent months over privacy.

Apple's Tim Cook has consistently beaten the drum of "privacy first".

He has not so subtly criticised Facebook's business model - that it essentially sells people's data to advertisers.

A recent feature of Apple's new iOS update asked users whether they wanted to be tracked around the internet when they downloaded a new app.

Facebook hated the move, and warned shareholders it could hurt its profits.

So it's not entirely surprising that Facebook-owned WhatsApp has come out so emphatically against Apple's new move.

Looking at it cynically, Apple's announcement is a chance for Facebook to tell the world that Apple isn't as keen on privacy as it likes to say.

But the WhatsApp chief isn't alone in his criticism. There are some very real concerns that this technology - in the wrong hands - could be used by governments to spy on their citizens.

Facebook has said in no uncertain terms that it thinks this vision of online safety is dangerous and should be canned.

Not for the first time, the two companies have illustrated totally different philosophical positions on one of the defining issues of our age - privacy.

You may be interested in watching: Panorama - How safe is TikTok for young users?