Following Apple’s announcement of new child safety features in the upcoming iOS and iPadOS release, more than 8,000 people and organizations have signed an open letter against the technology. Critics fear that Apple’s approach will weaken users’ control over their data or even open a backdoor for further surveillance. Albeit well-intentioned, Apple’s unilateral initiative sidesteps a serious debate on how to keep children safe online without infringing users’ data privacy and freedoms.

Apple’s new feature might pose data privacy risks

Apple’s new machine learning-based tool, neuralMatch, will search every US user’s iPhone and iPad that backs up to iCloud for matches against known child sexual abuse material. The feature does not just scan images stored in the cloud; it also scans users’ devices without their consent. A separate feature, which parents might be able to deactivate, monitors a child’s messages for images containing nudity and notifies the parents when a child receives such images.
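To make the matching step concrete, the minimal Swift sketch below flags an image when its fingerprint appears in a database of fingerprints of known abusive material. Everything in it is illustrative rather than Apple's actual method: the reported system relies on a perceptual hash (NeuralHash) and cryptographic blinding, whereas this sketch substitutes a plain SHA-256 digest and a hypothetical `knownFingerprints` set simply to show the matching logic.

```swift
import CryptoKit
import Foundation

// Hypothetical database of hex-encoded fingerprints of known abusive images.
// In this sketch the entries are placeholders; no real data is involved.
let knownFingerprints: Set<String> = [
    "placeholder-fingerprint-1",
    "placeholder-fingerprint-2",
]

// Compute a fingerprint for raw image bytes. A cryptographic digest is used
// here only for illustration; a real matching system would use a perceptual
// hash so that visually identical images produce the same fingerprint.
func fingerprint(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// Flag the image if its fingerprint matches the known-material database.
func shouldFlag(_ imageData: Data) -> Bool {
    knownFingerprints.contains(fingerprint(of: imageData))
}
```

The point of the sketch is that matching happens against a fixed database of known material, which is also why critics worry about what else could be added to that database.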

The changes stand in stark contrast to Apple’s previous advocacy on data privacy and security. In the past, Apple has rejected US government requests to help break into the iPhones of suspected terrorists and denounced the idea of building backdoors into its devices that would make them inherently less secure. Like other tech companies, it has also been a strong advocate of end-to-end encryption, a communication system in which only the communicating users can read messages and no third party can decipher the information transmitted or stored.

In response to its critics, Apple published a document explaining that the new feature will not scan people’s entire private photo libraries on their devices and does not operate for users who have iCloud Photos disabled.

Authoritarian governments could exploit this feature

In trying to fight child sexual abuse, Apple is supporting a noble cause. However, cybersecurity experts and data privacy campaigners warn that this content-flagging system sets a precedent that authoritarian states such as China could replicate. Non-democratic governments could pressure Apple to apply the technology to other categories of images deemed illegal, for example, politically unfavorable material.

Should a private company set the rules on the use of AI and online content?

Apple’s plans to scan users’ encrypted messages sent and received via iMessage also pose challenges. An AI-based tool will attempt to automatically identify sexually explicit images, enabling parents to turn on automatic filters for their children’s inboxes. The system will not result in sexually explicit images being sent to Apple or reported to the authorities, but parents will be notified if their child decides to send or receive sexually explicit photos.
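As a rough illustration of the gating logic described above, and not Apple's implementation, the Swift sketch below assumes a hypothetical on-device classifier that returns an "explicit" confidence score. The image is blurred, and the parent optionally notified, only when the parental setting is enabled and the score crosses an assumed threshold; nothing leaves the device in this sketch.

```swift
// Illustrative only: a hypothetical policy object, not Apple's API.
struct MessageImagePolicy {
    var parentalFilterEnabled: Bool   // set by the parent
    var notifyParents: Bool           // optional parental notification

    // Assumed confidence threshold for the on-device classifier.
    let explicitThreshold = 0.9

    enum Action {
        case deliver                   // show the image normally
        case blurAndWarn               // blur the image and warn the child
        case blurWarnAndNotifyParent   // additionally notify the parent
    }

    // Decide what to do with an incoming image given the classifier's score.
    func action(forExplicitScore score: Double) -> Action {
        guard parentalFilterEnabled, score >= explicitThreshold else {
            return .deliver
        }
        return notifyParents ? .blurWarnAndNotifyParent : .blurAndWarn
    }
}

// Example: with the filter and notifications enabled, a high-confidence image
// is blurred and the parent is notified; nothing is sent to Apple.
let policy = MessageImagePolicy(parentalFilterEnabled: true, notifyParents: true)
print(policy.action(forExplicitScore: 0.95)) // blurWarnAndNotifyParent
```

The design choice worth noting is that all decisions in this sketch stay local to the device, which matches Apple's claim that the iMessage feature does not report content to the company or the authorities.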

Child sexual abuse is a terrible crime, but Apple’s move sets a significant precedent by allowing law enforcement agencies, in certain circumstances, to gain access to users’ devices. Recent revelations that the Pegasus spyware was used to target thousands of people showed that many governments, especially non-democratic ones, are willing to use backdoor mechanisms to spy on their citizens.

The backlash against Apple’s move also raises the question of whether a private company should set rules on online content and data privacy that regulators across the world still struggle to agree on. In the absence of consolidated rules and principles on how such technologies should be used, the controversy around Apple’s initiative risks overshadowing the actual impact the new changes will bring about.