Apple has put on hold the Child Sexual Abuse Material (CSAM) detection technology it announced last month, citing feedback from policy groups and customers. That feedback was largely negative, and petitions opposing the plan gathered about 25,000 consumer signatures, according to the Electronic Frontier Foundation.
The American Civil Liberties Union and about 100 policy and rights groups had called on the tech company to immediately halt its plans to roll out the CSAM technology.
Taking this into account, Apple announced the suspension of the CSAM rollout on Friday morning in a statement made available to TechCrunch. It stated:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple's NeuralHash technology was intended to identify known child sexual abuse material on a user's device without Apple possessing the image or knowing its contents. Because a user's photos stored in iCloud are end-to-end encrypted so that even Apple cannot access the data, NeuralHash would instead scan for known CSAM on the user's device. According to Apple, this approach is more privacy-friendly than the blanket scanning currently used by cloud providers.
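To make the idea concrete, here is a rough Python sketch of how on-device hash matching can work in general. It deliberately uses a very simple "average hash" rather than Apple's proprietary NeuralHash, and the function names and the set of known hashes are hypothetical; the point is only that hashes, not images, are compared.

```python
# Illustrative sketch only: NeuralHash is a proprietary neural perceptual
# hash, not the simple "average hash" shown here. The function names and
# the known-hash set are hypothetical.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale, then set one bit per pixel
    depending on whether it is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Hypothetical database of hashes of known material (in practice such a
# list is curated by child-safety organizations, not hard-coded).
KNOWN_HASHES = {0x0123456789ABCDEF}

def matches_known_material(path: str) -> bool:
    # Only the hash is compared; the image itself never leaves the device.
    return average_hash(path) in KNOWN_HASHES
```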
That was not enough to convince security experts and privacy advocates, who expressed serious concerns that the system could be abused by well-resourced actors, such as governments and powerful organizations, to falsely implicate innocent people. They also warned that the system could be manipulated by governments to detect other material that authoritarian states find objectionable.
Within the first weeks of the announcement, researchers reported that they were able to create “hash collisions” against NeuralHash, effectively tricking the system into treating two entirely different images as the same.
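A toy example shows why collisions are possible under a coarse perceptual hash. This continues the simplified average-hash sketch above; it is not NeuralHash or the researchers' actual attack. Two visibly different images downscale to the same coarse pattern and therefore produce identical hashes.

```python
# Toy collision demo for the simplified average hash above (not Apple's
# NeuralHash). The fine detail that distinguishes the two images is
# discarded by the downscale-and-threshold step, so the hashes match.
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    small = img.convert("L").resize((size, size))
    pixels = list(small.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Image A: plain half-black, half-white.
a = Image.new("L", (256, 256), 0)
a.paste(255, (128, 0, 256, 256))

# Image B: same halves but with different shades and an extra stripe;
# the detail averages away at 8x8 resolution.
b = Image.new("L", (256, 256), 40)
b.paste(215, (128, 0, 256, 256))
b.paste(60, (0, 100, 128, 110))   # stripe visible only in B

print(hex(average_hash(a)))
print(hex(average_hash(b)))
print("collision:", average_hash(a) == average_hash(b))
```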
iOS 15 is expected to roll out in the next few weeks.