Soon after the Center for Democracy and Technology (CDT), a US-based non-profit, urged Apple Inc. to reverse its decision to scan children’s messages for nudity and adults’ phones for images of child sex abuse, the smartphone maker decided to withdraw its abuse prevention policy, saying it would take more time to collect feedback, Reuters reported on Friday.
“Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company said in a statement issued to Reuters.
Matthew Green, a cybersecurity expert at Johns Hopkins University, said that Apple’s decision was “promising”, even though he had criticized the smartphone maker’s child safety policy.
Voicing their concern, as many as 90 policy and rights groups worldwide had published a joint letter criticizing Apple’s move. It may seem bizarre that these critics campaigned to stop Apple from implementing child safety features, but there is another side to it: privacy. Even though the feature promises to prevent child sex abuse, it could breach the privacy of scores of people across the world.
Flavio Wagner, president of the independent Brazil chapter of the Internet Society, said last week, “Our main concern is the consequence of this mechanism, how this could be extended to other situations and other companies.”

“This represents a serious weakening of encryption,” he added, as quoted by Reuters.