Groups call on Apple to drop plans to scan children’s messages
A coalition of 90 groups around the world is calling on Apple to drop its plans to scan its products to detect images of child sexual abuse stored in iCloud.
In a letter published on the Center for Democracy and Technology website, the groups said the feature, which checks photos against hashes of known child sexual abuse material (CSAM), may jeopardize the privacy and security of Apple users worldwide.
“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” the letter said.
Apple’s plan, announced earlier this month, involves a software update that scans images on Apple products for child sexual abuse material. Any such material found would then be reported to the National Center for Missing and Exploited Children (NCMEC).
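For readers unfamiliar with hash matching, the core idea is to compare a fingerprint of each photo against a database of fingerprints of already-identified abuse images, rather than inspecting the photo's content directly. The Python sketch below illustrates that idea only, under simplifying assumptions: it uses an exact cryptographic hash and hypothetical names (KNOWN_IMAGE_HASHES, flag_matches), whereas Apple's announced system relies on a perceptual hash it calls NeuralHash and on-device matching, which this does not replicate.

```python
import hashlib
from pathlib import Path

# Hypothetical database of hashes of known abuse images, of the kind
# maintained by organizations such as NCMEC. Real deployments distribute
# perceptual hashes, not SHA-256 digests; the value below is a placeholder.
KNOWN_IMAGE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def image_hash(path: Path) -> str:
    """Return a hex digest of the file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose hashes appear in the known-image database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if image_hash(photo) in KNOWN_IMAGE_HASHES
    ]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        # A real system would escalate matches for human review and reporting.
        print(f"match found: {match}")
```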
The advocacy groups say this could create new problems down the road.
“The scan and alert feature in Messages could result in alerts that threaten the safety and wellbeing of some young people, and LGBTQ+ youths with unsympathetic parents are particularly at risk,” they wrote in the letter.
“Once the CSAM hash scanning for photos is built into Apple products, the company will face enormous pressure, and possibly legal requirements, from governments around the world to scan for all sorts of images that the governments find objectionable,” they added.
A separate letter published earlier this month also warned of serious consequences from Apple’s plan. It was signed by almost three dozen organizations and over 6,600 individuals including cryptographers and researchers, as well as privacy, security and legal experts.