Apple Inc said on Friday it would take more time to collect input and make improvements before releasing its child safety features, a month after announcing a system to check iPhones for images of child sexual abuse.
More than 90 policy and rights groups around the world told Apple last month it should abandon plans for scanning children’s messages for nudity and the phones of adults for images of child sex abuse.
Critics of the plan have said that the feature could be exploited by repressive governments looking to find other material for censorship or arrests.
"Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company said in a statement on Friday.