
Apple Plans To Scan iPhones For Images That Promote Child Abuse

Image courtesy CNET

When it comes to compliance with authorities, Apple has often found itself at odds with law enforcement, largely because of strict privacy policies that prevent even the company itself from inspecting content on users’ devices. That is about to change slightly: Apple plans to start scanning photos stored on iPhones and in iCloud for child abuse imagery.

As first reported by the Financial Times, the new system is designed to help law enforcement in criminal investigations. However, it could also open the door to increased legal and government demands for user data.

Dubbed neuralMatch, the new system will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified.” The software was trained on 200,000 images from the USA’s National Center for Missing & Exploited Children, and it will initially roll out in the U.S., where it will compare photos against a database of known images of child sexual abuse.

The firm already checks files stored on iCloud against known child abuse imagery, but neuralMatch would go a step further by scanning photos stored locally on the device.
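At a high level, a system like this does not “look at” photos the way a person would; it compares a fingerprint (hash) of each photo against a database of fingerprints of known abuse images, and only matches are escalated to human reviewers. The sketch below illustrates just that matching step in Python. The directory name, the placeholder hash set, and the use of a plain SHA-256 digest are all illustrative assumptions; Apple’s actual system reportedly uses a neural, perceptual hash designed to survive resizing and recompression, which an exact digest like SHA-256 does not.

```python
import hashlib
from pathlib import Path

# Hypothetical database of fingerprints of known illegal images (e.g.
# supplied by NCMEC). A real deployment would hold perceptual hashes;
# an exact SHA-256 digest is used here only to show the matching step.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}


def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_for_review(photo_dir: Path) -> list[Path]:
    """Collect photos whose fingerprint matches the known-image database.

    In a system like neuralMatch, matches are queued for human review
    rather than acted on automatically.
    """
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]


if __name__ == "__main__":
    # "photos" is an assumed directory name for this sketch.
    for match in flag_for_review(Path("photos")):
        print(f"queued for human review: {match}")
```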

This development follows years in which Apple famously stood up to federal authorities, insisting it would not weaken its own privacy protections. That includes refusing the FBI’s 2016 demand that the iPhone-maker build a backdoor into iOS to access an iPhone belonging to a suspect.

So, this will probably rebuild some bridges between the FBI and Apple… maybe.
