
Apple Is Scanning Every Single iPhone for Images of Child Sexual Abuse



Apple has revealed plans to scan all iPhones in the United States for images of child sexual abuse, according to a blog post on Apple’s official website. The move drew immediate praise from child protection groups, but it also raises serious concerns about entrusting private information to systems that are not meaningfully subject to public consent (smartphones are, after all, essential in modern society), potentially legitimizing a new sphere of surveillance of ordinary citizens.

Apple’s new tool will scan images and encrypted messages for signs of child sexual abuse

Called “neuralMatch,” the new tool will scan every image before it’s uploaded to iCloud, and, if it finds a suggestive match, a human reviewer will examine it. If the reviewer decides the image qualifies as child pornography, the company will disable the user’s account and send a notification to the National Center for Missing and Exploited Children, according to an initial report from NPR. Crucially, Apple will also scan encrypted messages (presumably those stored on one’s phone or sent from it) for signs of sexually explicit content as a preventative measure against child abuse, and, understandably, this has alarmed staunch privacy advocates.
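Based on the report, the overall flow can be sketched roughly as follows. To be clear, this is a minimal illustration, not Apple’s implementation: every name here (fingerprint, KNOWN_FINGERPRINTS, notify_ncmec, and so on) is our own stand-in, the real system reportedly uses a perceptual hash called NeuralHash rather than a cryptographic one, and human review sits between any match and any report.

```python
import hashlib

# Illustrative database of fingerprints of known abuse imagery. In the
# reported system this comes from NCMEC's database; here it is a toy set.
KNOWN_FINGERPRINTS: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in only: a cryptographic hash keeps this sketch self-contained,
    # but unlike a perceptual hash it would NOT match re-encoded copies.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(image_bytes: bytes, account: str) -> bool:
    """Return True if the image may proceed to the cloud upload."""
    if fingerprint(image_bytes) in KNOWN_FINGERPRINTS:
        if human_review(image_bytes):   # a person confirms the match first
            disable_account(account)    # the account is shut down
            notify_ncmec(account)       # a report goes to the center
        return False
    return True

# Hypothetical side effects, stubbed so the sketch runs as written.
def human_review(image_bytes: bytes) -> bool:
    return False

def disable_account(account: str) -> None:
    print(f"account disabled: {account}")

def notify_ncmec(account: str) -> None:
    print(f"NCMEC notified about: {account}")
```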

Notably, the new detection tool will only flag images that the company’s database already has stored away as “known” child pornography. Parents who take pictures of their children in, say, a bathtub are probably not in any danger. But researchers warn that the image-matching tool, which does not literally “see” images but approximates what is or isn’t illegal from mathematical “fingerprints,” might open the door to insidious uses, whether by Apple, the government, or any associated party. For example, Matthew Green, a top cryptography researcher at Johns Hopkins University, has said that Apple’s new neuralMatch system isn’t foolproof: despite its noble intentions, it might easily be used to frame innocent iPhone users.
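To make “mathematical fingerprints” concrete, here is a toy perceptual hash, the classic “average hash.” It is not NeuralHash, just a textbook stand-in, but it shows the two properties that matter: an image reduces to a short bit string without the image itself being stored, and matching is approximate (by Hamming distance) rather than exact.

```python
from PIL import Image  # pip install Pillow

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Toy 64-bit perceptual fingerprint: the classic 'average hash'."""
    small = img.convert("L").resize((size, size))  # tiny grayscale thumbnail
    pixels = list(small.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:                 # one bit per pixel: above/below average
        bits = (bits << 1) | (p > avg)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def matches(candidate: int, known: int, threshold: int = 5) -> bool:
    # Approximate matching: anything within a few bits of a database entry
    # counts as a hit, so the fingerprint survives re-compression or
    # resizing; it is the same looseness attackers probe for collisions.
    return hamming(candidate, known) <= threshold

# Example: compare a photo against one database entry.
# print(matches(average_hash(Image.open("photo.jpg")), some_known_fingerprint))
```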

Such an attack could work by sending a target a seemingly innocent image crafted to trigger the tool’s matching function, flagging a harmless user as a sexual abuser of children (a toy version of the idea is sketched below). “Researchers have been able to do this pretty easily,” Green said of how simple it is to trick systems like Apple’s neuralMatch, in the NPR report. In case it isn’t obvious, “child sexual abuser” and “sexual predator” are extremely stigmatizing accusations that even the most well-behaved citizens could spend a lifetime trying to shake off, not only in their publicly shared records and social media, but in the court of public opinion. To be clear: it should go without saying that catching predators is important.

But at what cost?
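To see why Green’s warning is plausible, here is a deliberately naive framing attack against the toy average hash above (it assumes average_hash from the previous sketch is in scope). Because aHash is simple thresholding, an image with any chosen fingerprint can be constructed directly; the published collisions against the real NeuralHash instead used gradient descent on the underlying network and hid the perturbation inside a normal-looking photo, but the end result is the same kind of harmless-looking file that matches a database entry.

```python
from PIL import Image

def forge_image_with_hash(target: int, size: int = 8) -> Image.Image:
    """Build an image whose average-hash equals `target`.

    Trivial here because aHash is simple thresholding (this assumes the
    target has at least one 0 bit, so the average falls between the two
    pixel levels used). A real attack would hide the pattern inside an
    ordinary-looking photo rather than shipping a bare 8x8 tile.
    """
    n = size * size
    pixels = [200 if (target >> (n - 1 - i)) & 1 else 50  # bright = bit 1
              for i in range(n)]
    img = Image.new("L", (size, size))
    img.putdata(pixels)
    return img

# The forged image collides exactly with whatever fingerprint we chose,
# so a matcher built on this hash would flag it as "known" material.
target = 0xDEADBEEFCAFEF00D
assert average_hash(forge_image_with_hash(target)) == target
```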


Apple is under increased pressure to enable mass surveillance

By assuming the right to monitor all photos on an iPhone, not just the ones that match the “fingerprints” of known abuse material, Apple puts a great deal of control in its own hands, which seems to contradict the firm’s own past assertions about how it should interact with law enforcement. Notably, further abuses of power could involve Apple enabling government surveillance of dissidents or protestors, regardless of political persuasion. “What happens when the Chinese Government says, ‘Here is a list of files that we want you to scan for,’” asked Green in the report, rhetorically. “Does Apple say no? I hope they say no, but their technology won’t say no.”

And he has a point. For years, Apple has faced increasing governmental pressure to enable higher levels of surveillance of encrypted data. This has placed the company in a tenuous position: balancing a legal imperative to crack down on the abuse and exploitation of children while maintaining its image of being resolutely committed to protecting user privacy. The Electronic Frontier Foundation, an online civil liberties organization, called Apple’s latest move “a shocking about-face for users who have relied on the company’s leadership in privacy and security,” according to the NPR report. While we can’t say we’re living in a cyberpunk dystopia, big tech today is beginning to exhibit some of the basic markers of invasive surveillance of ordinary citizens.
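Green’s “their technology won’t say no” is an observation about the structure of the code, not about Apple’s intent. A matcher like the hypothetical one below (reusing the fingerprint stand-in from the first sketch) has no notion of what its fingerprint list represents; swap in a different database and the very same routine scans for dissident material.

```python
def scan_library(images: list[bytes], database: set[str]) -> list[int]:
    """Return the indices of images whose fingerprints appear in `database`.

    Note what is absent: nothing here knows whether `database` holds
    abuse-imagery hashes, leaked-document hashes, or photos from a
    protest. The policy lives entirely in the fingerprint list shipped
    to the device, and whoever controls that list controls what gets
    flagged.
    """
    return [i for i, img in enumerate(images)
            if fingerprint(img) in database]
```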


This was a breaking story and was regularly updated as new information became available.
