Tech giant Apple has unveiled plans to scan U.S. iPhones for images of child sexual abuse, earning accolades from several child protection groups, while some security researchers warn that the system could be misused by governments for unlawful surveillance.
Apple plans to use a tool called “neuralMatch” to scan images for child sexual abuse material before they are uploaded to iCloud. If a suspected match is reviewed by a human and confirmed, the user’s account will be disabled and the National Center for Missing and Exploited Children will be notified. The company also plans to scan users’ encrypted messages for sexually explicit content as a child safety measure.
Apple has assured users that the system can only detect images that are already in the center’s database of known child sexual abuse material.
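To make the described flow concrete, here is a minimal, purely illustrative sketch in Python. The hash function, database contents, and function names are hypothetical stand-ins, not Apple’s actual NeuralMatch/NeuralHash pipeline, which uses a proprietary perceptual-hashing scheme rather than a plain cryptographic hash.

```python
import hashlib

# Hypothetical database of fingerprints of known abuse images
# (placeholder strings, not real data).
KNOWN_HASHES = {
    "placeholder-digest-1",
    "placeholder-digest-2",
}

def image_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: a real system hashes visual features so
    # that resized or re-encoded copies of the same picture still match.
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_before_upload(image_bytes: bytes) -> bool:
    """Return True if the fingerprint matches the known database, in which
    case the image would be queued for human review rather than reported
    automatically."""
    return image_fingerprint(image_bytes) in KNOWN_HASHES

# Example: a photo whose fingerprint is not in the database is uploaded untouched.
print(should_flag_before_upload(b"holiday-photo-bytes"))  # False
```

The key design point the article describes is that matching happens only against a fixed database of already-known images, with a human review step before any account is disabled or reported.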
Matthew Green, a leading cryptography researcher at Johns Hopkins University, warned that the system could be abused. For example, someone could be framed by being sent seemingly harmless pictures engineered to trigger child pornography matches, tricking Apple’s algorithm into alerting law enforcement.
“What happens when the Chinese government says, ‘Here is a list of files that we want you to scan for’?” Green asked, listing other implications of implementing the plan. He expressed concern that the company would be unable to say no.
Privacy advocates’ concerns are not far-fetched: Apple and other companies that have embraced end-to-end encryption have for years resisted government pressure to allow greater surveillance of encrypted data. The new measures would require Apple to strike a delicate balance between protecting children and preserving the integrity of its privacy commitments. The company said it would release the changes as part of software updates for iPhones, Macs and Apple Watches.
Apple said the app will use on-device machine learning to identify and blur sexually explicit images on children’s phones, and to alert the parents of younger children whose devices have been enrolled. It would also “intervene” when users search for topics related to child sexual abuse.
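As a rough illustration of that on-device flow, the sketch below is hypothetical: the classifier, threshold, and enrollment flag are assumptions for clarity, not Apple’s implementation.

```python
from dataclasses import dataclass

@dataclass
class ScanResult:
    is_explicit: bool
    blurred: bool
    parent_notified: bool

def explicit_score(image_bytes: bytes) -> float:
    # Placeholder for an on-device ML classifier returning a probability score.
    return 0.0

def handle_incoming_image(image_bytes: bytes, parent_enrolled: bool,
                          threshold: float = 0.9) -> ScanResult:
    score = explicit_score(image_bytes)
    if score < threshold:
        # Nothing happens; the image is shown normally.
        return ScanResult(False, False, False)
    # Above threshold: blur the preview on the child's device and, if the
    # parents have enrolled the phone, send them an alert.
    return ScanResult(True, True, parent_enrolled)

print(handle_incoming_image(b"incoming-image-bytes", parent_enrolled=True))
```

The point of running this on the device, as the article notes, is that the image itself never leaves the phone; only the blur and the parental alert are triggered locally.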
Hany Farid, a researcher at the University of California, Berkeley, argues that many other programs designed to protect users raise similar concerns. For example, he says, “WhatsApp provides users with end-to-end encryption to protect their privacy, but also employs a system for detecting malware and warning users not to click on harmful links.”
Julia Cordua, the CEO of Thorn, a non-profit that uses technology to help protect children from sexual abuse, said that Apple’s technology balances “the need for privacy with digital safety for children.”
However, the Center for Democracy and Technology in Washington has called on Apple to abandon the changes, arguing that they would effectively end Apple’s assurance of “end-to-end encryption.”
Apple responded that the new features would not jeopardize the security of private communications.