The chorus of tech voices expressing serious concerns over Apple’s measures to scan U.S. iPhones for images of child sexual abuse grows louder by the day. Over the weekend alone, an open letter received more than 4,000 signatures online. The open letter, titled “The Apple Privacy Letter,” asked the tech company to “reconsider its technology rollout,” lest it undo “decades of work by technologists, academics and policy advocates” on privacy-preserving measures.
Apple’s plan, which was announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them to a database that contains hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device, while still allowing it to report users to the authorities if they are found to be sharing child abuse imagery. Another prong of Apple’s Child Safety strategy involves optionally warning parents if their child under 13 years old sends or views photos containing sexually explicit content. An internal memo at Apple acknowledged that people would be “worried about the implications” of the systems. In a news release on Thursday, Apple explained the technical process behind how the system will work. The measures are not limited to your phone’s photo gallery; they also extend to images shared in chats.
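For illustration only, the sketch below shows the general idea of matching image hashes against a database of known hashes. Apple’s actual system relies on a perceptual “NeuralHash” and cryptographic private set intersection rather than the plain SHA-256 digest and placeholder hash list used here; the function names and hash values are hypothetical.

```swift
import CryptoKit
import Foundation

// Hypothetical database of known image hashes (hex strings) — placeholder values,
// standing in for the hashes of known CSAM images Apple says it receives from
// child-safety organizations.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

// Compute a SHA-256 digest of the raw image bytes. Apple's real system uses a
// perceptual hash that tolerates resizing and re-encoding; an exact cryptographic
// hash is used here only to keep the sketch self-contained.
func imageHash(_ imageData: Data) -> String {
    let digest = SHA256.hash(data: imageData)
    return digest.map { String(format: "%02x", $0) }.joined()
}

// Flag an image if its hash appears in the known-hash database — conceptually,
// the on-device check performed before an image is uploaded to iCloud.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}
```

In the real system, a match alone does not trigger a report; Apple has said a threshold number of matches must accumulate before its review process is invoked.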
WhatsApp head Will Cathcart said in a Twitter thread that his company would not be adopting Apple’s Child Safety measures, calling Apple’s approach “very concerning.” Cathcart said WhatsApp’s system for fighting child exploitation, which relies in part on user reports, preserves encryption much as Apple’s does, and led the company to report over 400,000 cases to the National Center for Missing and Exploited Children in 2020. Apple is also working with the Center on its CSAM detection efforts.
In his Twitter thread, Cathcart explains his belief that Apple has “built software that can scan all the private photos on your phone.” He added that Apple has taken the wrong path in trying to improve its response to child sexual abuse material, or CSAM. The dispute builds on an existing fight between the two companies over privacy: WhatsApp and Facebook have openly criticized Apple’s privacy changes as harmful to small businesses through newspaper ads, and Apple has fired back, saying that the change “simply requires” that users be given a choice on whether to be tracked.
The list of people and organizations raising concerns about Apple’s policy is long; it includes Edward Snowden, the Electronic Frontier Foundation, university professors, and more. You can find some of those reactions collated here, which gives a wider overview of the criticism levied against the tech giant’s new policy.
Matthew Green, an associate professor at Johns Hopkins University, has also taken to Twitter, approaching the issue from a different angle: his tweets discuss Apple’s plans and how the system could be abused by governments and malicious individuals. The EFF has likewise blasted Apple’s intended measures in a press release, more or less describing them as “a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor.” The press release goes into detail on how the EFF believes Apple’s Child Safety measures could be abused by governments and how they could erode users’ trust in their privacy.