Apple has spent years making privacy a unique selling point, backed by privacy-first features and slick marketing campaigns. But that carefully curated, privacy-conscious image is at risk of unravelling following Apple’s controversial plan to scan customers’ iPhones and iPads for child sexual abuse material, or CSAM.

The technology, called NeuralHash, uses machine learning to scan images on Apple phones and tablets before they are uploaded to iCloud. It then compares perceptual image fingerprints, known as hashes, against a database of known child sexual abuse imagery provided by the National Center for Missing and Exploited Children (NCMEC).
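
Apple’s published design blinds the database and uses private set intersection, so the device itself never learns whether a match occurred. Setting that cryptography aside, the core lookup reduces to comparing a device-computed fingerprint against a set of known hashes. The Swift sketch below is illustrative only: the perceptualHash stand-in and the hash values are invented, not Apple’s.

```swift
import Foundation

// Hypothetical stand-in for NeuralHash. The real algorithm derives a
// perceptual fingerprint from a neural-network embedding, so visually
// similar images produce the same hash; this placeholder is not that.
func perceptualHash(of imageData: Data) -> String {
    String(format: "%016llx", UInt64(truncatingIfNeeded: imageData.hashValue))
}

// Database of known hashes as supplied by NCMEC (values invented here).
let knownHashes: Set<String> = ["deadbeef00000001", "deadbeef00000002"]

// The lookup performed before an image is uploaded to iCloud.
func matchesKnownHash(_ imageData: Data) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}
```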

If there are enough matches to cross the CSAM scanning tool’s threshold, Apple will conduct a human review and, if it confirms the material, block the cloud upload, shut down the account and alert law enforcement.
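
In outline, nothing happens on a single match; action is taken only once a running count of matched images crosses the threshold. A minimal sketch of that logic, with the threshold value and matcher invented for illustration:

```swift
import Foundation

// Stand-in for the hash lookup sketched above; hard-wired to false so
// this example is self-contained.
func matchesKnownHash(_ imageData: Data) -> Bool { false }

// Apple has not published the real review threshold; 30 is invented
// purely for illustration.
let reviewThreshold = 30
var matchedImageCount = 0

// In Apple’s actual design the running count travels in encrypted
// “safety vouchers” that only become readable server-side once the
// threshold is met; this sketch collapses that into plain logic.
func process(_ imageData: Data) {
    if matchesKnownHash(imageData) {
        matchedImageCount += 1
    }
    if matchedImageCount >= reviewThreshold {
        print("Threshold crossed: block upload, escalate for human review")
    }
}
```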

While the likes of Facebook, Google and Microsoft use similar image-scanning technology in the cloud, it is the scanning of images on the device that sets Apple’s technology apart – and has drawn criticism from privacy advocates.

Chief among the concerns is the risk of mission creep by governments to scan iPhones for other types of content. Privacy advocates fear authoritarian states could force Apple to use the technology to detect political imagery or identify oppressed minorities.

“Such technology could be abused if placed in government hands, leading to [it being used] to detect images containing other types of content, such as photos taken at demonstrations and other types of gatherings,” said Chris Hauk, consumer privacy champion at Pixel Privacy. “This could lead to the government clamping down on users’ freedom of expression and used to suppress ‘unapproved’ opinions and activism.”

“Scanning for known indecent images on cloud-based accounts has been a staple tool used by law enforcement for years to locate offenders,” said Jake Moore, former head of digital forensics at Dorset Police and now cybersecurity specialist at internet security firm ESET. “However, Apple has taken this one step further by finding a way to scan devices before such images hit the cloud.”

An open letter signed by more than 7,000 privacy experts, cryptographers and researchers warned:

“Apple’s current path threatens to undermine decades of work by technologists, academics and policy advocates towards strong privacy-preserving measures being the norm across a majority of consumer electronic devices and use cases.

“We ask that Apple reconsider its technology rollout, lest it undo that important work.”

Apple has also announced a separate feature that parents can enable on their child’s iPhone to warn when an incoming message contains sexually explicit imagery. It will tell children under 13 that opening the message will send an alert to their parents.
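
As described, the decision flow is simple: flag the image on-device, warn the child, and notify parents only when a child under 13 chooses to view it anyway. A minimal sketch of that flow, with every name invented (Apple has not published this as an API):

```swift
// Illustrative only: Apple has not published this logic as an API.
struct ChildAccount {
    let age: Int
    let parentalAlertsEnabled: Bool  // the opt-in that parents control
}

// Called when an incoming image is flagged as sexually explicit by the
// on-device classifier (the classifier itself is not sketched here).
func handleFlaggedImage(for child: ChildAccount, childChoosesToView: Bool) {
    guard child.parentalAlertsEnabled else { return }
    print("Warning shown: this image may be sensitive")
    if childChoosesToView && child.age < 13 {
        print("Alert sent to parents")
    }
}
```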

Critics warn that it provides a tool for parents to carry out surveillance on children, while supporters say it can help shield children from graphic content.

Apple’s response

In the wake of the criticism, Apple defended its new system and said it will “refuse” any demands from governments to expand the hash list to detect non-CSAM images.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands,” Apple said.

Another concern is that malicious actors could game the system by crafting innocuous images whose hashes collide with known CSAM hashes.

The Cupertino-headquartered tech titan said its process is designed to prevent that from happening. It added that its safeguards mean there is a “less than one in one trillion chance per year of incorrectly flagging a given account”, although it is unclear how this figure could be independently verified.
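
The one-in-one-trillion claim concerns account-level false positives: requiring several independent matches before review makes a flag far rarer than any single image’s false-match rate. As a back-of-envelope illustration only, with every parameter invented rather than Apple’s, the chance of an innocent account accumulating at least k false matches follows a binomial tail:

```swift
import Foundation

// Back-of-envelope only: probability that an account holding n photos,
// each falsely matching with independent probability p, accumulates at
// least k false matches. All parameter values below are invented.
func logChoose(_ n: Int, _ k: Int) -> Double {
    lgamma(Double(n + 1)) - lgamma(Double(k + 1)) - lgamma(Double(n - k + 1))
}

func probAtLeast(_ k: Int, outOf n: Int, p: Double) -> Double {
    (k...n).reduce(0.0) { total, i in
        total + exp(logChoose(n, i) + Double(i) * log(p) + Double(n - i) * log(1 - p))
    }
}

// e.g. 10,000 photos, a one-in-a-million per-image false-match rate and
// a threshold of 10 matches: roughly 3e-27, i.e. vanishingly small.
print(probAtLeast(10, outOf: 10_000, p: 1e-6))
```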

Privacy protector no more?

Despite Apple standing firm, there is a sense that cracks have appeared in its privacy-first veneer.

Under CEO Tim Cook’s leadership, Cupertino has sought to position itself as the most privacy-focused Big Tech company. That positioning has so far proved successful, with Apple now the most valuable company in the world.

As a GlobalData thematic report on data privacy notes: “Apple generally sell[s] premium-priced hardware directly to customers, so it does not need to collect much data on them. It does not rely on the same level of data harvesting as Google and Facebook, and can therefore offer more secure services requiring stricter permissions to third parties handling customers’ personal data.”

Apple has also largely avoided the privacy scandals that have blighted the likes of Facebook and Google.

In Apple’s most recent product launch, nearly every feature was centred on privacy, from the ability to create temporary anonymous email addresses in iCloud to disabling tracking pixels in its Mail app.

In 2015 Apple added support for ad-blockers in its Safari browser. More recently it gave users the ability to block ad tracking by third-party apps, much to the displeasure of Facebook and other adtech firms.

And in 2016 the iMac maker famously refused requests from the FBI to unlock a terrorist’s iPhone – such was its belief in privacy for its customers. The company published an open letter explaining this decision, in which it said: “we believe the contents of your iPhone are none of our business”.

The company repeated this message in 2019 in the form of a billboard outside the Consumer Electronics Show in Las Vegas that read: “What happens on your iPhone, stays on your iPhone”.

It warned in 2016 that removing encryption features for law enforcement would set a “dangerous precedent” – precisely the concern shared by those opposed to the CSAM detection tool.

Developments such as these have helped rank Apple at number two for data privacy on GlobalData’s thematic scorecard.

Apple CSAM scanning “balances individual privacy and child safety”

However, Apple’s CSAM scanning plans, first revealed by cryptographer Matthew Green on 6 August, have been well received by governments and child welfare campaigners.

“The idea that child safety is the trojan horse for privacy erosion is a trope that privacy advocates expound,” said Rachel O’Connell, the founder of TrustElevate, who set up the first UK internet safety centre in 2000. “This creates a false dichotomy and shifts the focus away from the children and young people at the front line of dealing with adults with a sexual interest in children, who often engage in grooming children and soliciting them to produce child sexual abuse material.”

O’Connell said that while government misuse for increased surveillance “is a real concern”, “conflating that issue as a justification for not keeping children safe online is unfounded and does not stand up to any form of logical scrutiny.”

Paul Bischoff, privacy advocate at Comparitech, said Apple had taken an approach that “balances individual privacy and child safety”.

However, he said it was important that the scanning tech is “strictly limited in scope to protecting children and not used to scan users’ phones for other photos”.

ESET’s Moore said while there was a concern that Apple’s tech would “drive CSAM further underground”, it is “likely to catch those at the early stages of delving into indecent material and hopefully bring those to justice before their problem gets out of control”.

He added: “It sends a clear message to those thinking of storing such material on their device.”

Rolling out Apple’s CSAM solution

Apple plans to roll out the technology in the US later this year before launching it in other countries, pending regulatory approval.

If Apple decides to push on with the launch of CSAM scanning in its current form, it will need to ensure it sticks to the promises it has made. Any deviation will undermine the business advantage the iPhone maker has built on its privacy-orientated features.

“Apple’s announcement is a significant one for the safety tech sector and the ongoing challenge of protecting children from harmful online content,” said Ian Stevenson, CEO of Cyan and chair of the Online Safety Technology Industry Association.

“While it remains to be seen whether Apple’s approach strikes the right balance between users’ rights and online safety, it is encouraging that such a major global organisation recognises how its products are being used to cause harm, and that they are exploring innovative and constructive ways to deal with the issue.”