December 22, 2024
Apple “clearly underreporting” child sexual abuse, watchdogs say



After years of controversy over its plan to scan iCloud for child sexual abuse materials (CSAM), Apple abandoned that plan last year. Now, child safety experts have accused the tech giant not only of failing to flag CSAM exchanged and stored on its services—including iCloud, iMessage, and FaceTime—but also of allegedly failing to report all the CSAM that is flagged.

The United Kingdom’s National Society for the Prevention of Cruelty to Children (NSPCC) shared UK police data with The Guardian showing that Apple significantly undercounts how often CSAM turns up on its services worldwide.

According to the NSPCC, police investigated more CSAM cases in the UK alone than Apple reported worldwide for an entire year. Between April 2022 and March 2023 in England and Wales, the NSPCC found, Apple was implicated in 337 recorded offenses involving child abuse images. But in 2023, Apple reported only 267 cases of CSAM to the National Center for Missing and Exploited Children (NCMEC), a figure that is supposed to represent all the CSAM found on its platforms worldwide, The Guardian reported.

Major US tech companies are required to report CSAM to NCMEC when it is discovered, but while Apple reports a few hundred cases of CSAM each year, big tech peers like Meta and Google report millions, NCMEC’s data shows. Experts told The Guardian that there is ongoing concern that Apple is clearly undercounting CSAM on its platforms.

Richard Collard, the NSPCC’s head of online child safety policy, told The Guardian that he believes Apple’s child safety efforts need significant improvement.

“There is a concerning discrepancy between the number of child abuse image crimes taking place on Apple’s services in the UK and the comparatively small number of global reports of abuse content that they make to authorities,” Collard told The Guardian. “Apple is clearly behind many of its peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the rollout of the UK’s Online Safety Act.”

Outside the UK, other child safety experts shared Collard’s concerns. Sarah Gardner, CEO of the Los Angeles-based child protection organization Heat Initiative, told The Guardian that she considers Apple’s platforms a “black hole” obscuring CSAM. And she expects that Apple’s push to bring AI to its platforms will exacerbate the problem, making it easier to spread AI-generated CSAM in an environment where sex offenders can expect less enforcement.

“Apple does not detect CSAM in the majority of its environments at all,” Gardner told The Guardian.

Gardner agreed with Collard that Apple is “clearly underreporting” and has not invested in the trust and safety teams needed to handle the problem as it rushes to bring advanced AI features to its platforms. Last month, Apple integrated ChatGPT into Siri, iOS, and macOS, likely setting the stage for ever-expanding AI features in future Apple devices.

“The company is moving into an area that we know can be very dangerous and harmful to children without a track record of being able to deal with it,” Gardner told The Guardian.

So far, Apple has not commented on the NSPCC report. Last September, Apple responded to the Heat Initiative’s demands that it detect more CSAM by saying that, rather than focusing on scanning for illegal content, it aims to connect vulnerable or victimized users directly with local resources and law enforcement that can help them in their communities.
