
Apple to roll out child abuse photo checking system on country-by-country basis

Apple Inc. will roll out a system for checking photos for child abuse imagery on a country-by-country basis, depending on local laws, the company said on Friday.

The company said it would implement a system that screens photos for such images before they are uploaded to iCloud storage.

Child safety groups have praised Apple for the move, which sees the company join Facebook, Microsoft and Alphabet's Google in taking safety measures to protect children.

Other technology companies check photos after they are uploaded to their servers; Apple's system, by contrast, performs the check on the device before upload.

Apple's decision to check photos on the iPhone itself has raised concerns that the company is probing into its users' devices in a way that could be exploited by governments.

The company said safeguards in its system, such as "safety vouchers" passed from the iPhone to Apple's servers that do not contain useful data on their own, will protect it from government pressure to identify material other than child abuse images.

According to Reuters, Apple also has a human review process that acts as a backstop against government abuse: the company will not pass reports from its photo-checking system to law enforcement if the review finds no child abuse imagery.

Apple said it would expand the service based on the laws of each country where it operates. Regulators are increasingly demanding that tech companies do more to take down illegal content.

Some of the resulting laws, including in Britain, could be used to force tech companies to act against their users in secret. Some security experts say Apple is making a mistake by signalling its willingness to reach into customers' phones.

Facebook's WhatsApp, the world's largest fully encrypted messaging service, is also under pressure from governments that want to see what people are saying, and it fears that pressure will now increase.

WhatsApp chief Will Cathcart criticized Apple's move in a tweet, writing: "We've had personal computers for decades, and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It's not how technology built in free countries works."

Apple's experts argued that the company is not really reaching into people's phones because the checks its devices perform must clear multiple hurdles. For example, banned material is flagged by watchdog groups, and the resulting identifiers are bundled into Apple's operating systems worldwide, making them harder to manipulate.
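To make that description concrete, below is a minimal, hypothetical sketch in Swift of the general idea: a device compares a fingerprint of each photo against a bundled set of known-image identifiers before upload and produces a per-photo "safety voucher" recording the result. Apple's actual system reportedly uses a perceptual hash and cryptographic threshold techniques that are not shown here; the SHA-256 digest, the `bannedHashes` values and the `SafetyVoucher` type are illustrative stand-ins, not Apple's API.

```swift
import Foundation
import CryptoKit

// Illustrative only: a cryptographic hash (SHA-256) stands in for the
// perceptual hash Apple is reported to use, so the example stays runnable.

// Identifiers of known prohibited images, bundled with the OS
// (hypothetical placeholder values).
let bannedHashes: Set<String> = [
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"
]

// Compute a hex digest for an image about to be uploaded.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

// A stand-in for a "safety voucher": a per-photo record noting whether
// the on-device check matched, without exposing the photo itself.
struct SafetyVoucher {
    let photoDigest: String
    let matchedKnownImage: Bool
}

func makeVoucher(for imageData: Data) -> SafetyVoucher {
    let d = digest(of: imageData)
    return SafetyVoucher(photoDigest: d,
                         matchedKnownImage: bannedHashes.contains(d))
}

// Example: check a photo before it would be uploaded to cloud storage.
let photo = Data("example photo bytes".utf8)
let voucher = makeVoucher(for: photo)
print("Matched known image:", voucher.matchedKnownImage)
```

Because the identifier list ships inside the operating system rather than being fetched per user or per country, the same set is checked everywhere, which is the property Apple's experts say makes the system harder for any single government to manipulate.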
