Apple will soon bring a tool to scan photos on iPhone to prevent child abuse

The tool will roll out worldwide, starting in the US


Apple is best known for its privacy features. Now the company is developing a new tool that will scan photos on iPhone and in iCloud. Apple frames this scanning as an added safety feature: it will check photos for child sexual abuse material (CSAM), including media related to child pornography.

Apple will use neural matching to identify illegal images. If the system flags such an image, it notifies a team of human reviewers, who verify the material and contact law enforcement. This screening happens just before users upload pictures from their iPhones to iCloud. Apple already maintains a separate Child Safety page describing the steps it takes to prevent the spread of CSAM.

Apple, in cooperation with child safety experts, has developed this feature across three areas. First are communication tools that give parents more control in helping their children navigate online communication. Second, the Messages app will use on-device machine learning to warn about sensitive content. These updates will arrive in the next versions of iOS and iPadOS later this year.

For pictures already stored in iCloud, a separate tool will be used. It will scan photos and, if it finds sexually explicit material, report it to the National Center for Missing and Exploited Children (NCMEC). Apple says the system still protects user privacy: an on-device matching tool compares images against a database of known CSAM image hashes, and that database is itself transformed into an unreadable set of hashes stored securely on the user's device.
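The matching step described above can be illustrated with a toy sketch. Note the caveats: Apple's system uses a perceptual fingerprint ("NeuralHash") that tolerates resizing and re-encoding, whereas this sketch uses a plain cryptographic hash purely to show the lookup idea; the image bytes and the `matches_known_hash` helper are hypothetical.

```python
import hashlib

# Hypothetical database of known-image fingerprints (hex digests).
# A real CSAM system uses perceptual hashes, not SHA-256; this is
# only to illustrate matching uploads against a known-hash set.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes, known_hashes: set) -> bool:
    """Return True if the image's fingerprint appears in the known set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# A matching image is flagged; an unrelated photo is not.
print(matches_known_hash(b"example-flagged-image-bytes", KNOWN_HASHES))  # True
print(matches_known_hash(b"holiday-photo-bytes", KNOWN_HASHES))          # False
```

Because only hashes are compared, the device never needs a readable copy of the flagged images themselves, which is the privacy property Apple emphasizes.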

Most child safety groups praised the move, which sees Apple join companies like Facebook, Google, and Microsoft. WhatsApp, on the other hand, has said it will never implement such a system. WhatsApp's CEO tweeted that Apple's approach is wrong and a setback for people's privacy all over the world.

That's it for now. Thank you for reading, and do share the article if you found it informative. Keep an eye on this space for more relevant updates. Stay safe, and we hope to see you around.
