
Apple announces it will roll out photo check to prevent child abuse

While child safety groups have praised the move, others have expressed concern that it opens a backdoor into digital privacy for governments and other groups


In a move that drew concern from privacy advocates and praise from child safety groups, Apple announced that its iPhones and iPads will start detecting images containing child sexual abuse and reporting them as they are uploaded to its online storage in the United States.

The system will soon be launched in other countries, in accordance with local laws.

In a website post, Apple said: “At Apple, our goal is to create technology that empowers people and enriches their lives – while helping them stay safe. We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM).

“iOS and iPadOS will use new applications of cryptography to help limit the spread of CSAM online, while designing for user privacy. CSAM detection will help Apple provide valuable information to law enforcement on collections of CSAM in iCloud Photos.”
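Apple's published description is of matching images against a database of hashes of known abuse material before upload, rather than analysing what a photo depicts. The sketch below illustrates only that hash-matching idea; the function names, the SHA-256 stand-in, and the simple set lookup are illustrative assumptions, not Apple's actual system, which uses a perceptual hash (NeuralHash) and cryptographic threshold techniques rather than exact byte matching:

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Illustrative stand-in: a cryptographic hash of the raw bytes.
    # Apple's real system uses NeuralHash, a perceptual hash that also
    # matches visually similar (e.g. resized or recompressed) images.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database of hashes of known abuse images, of the kind
# supplied to platforms by child-safety organisations.
known_hashes = {image_hash(b"known-bad-image")}

def should_flag(image_bytes: bytes) -> bool:
    """Return True if the photo matches the known-hash database."""
    return image_hash(image_bytes) in known_hashes

print(should_flag(b"known-bad-image"))  # → True (matches the database)
print(should_flag(b"family-photo"))     # → False (no match)
```

The key design point this illustrates is that an ordinary photo that is not already in the database produces no match, which is the basis of Apple's claim that the check can run "while designing for user privacy".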

Apple is also introducing new communication tools which will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content, while keeping private communications unreadable.

Updates are also being made to Siri and Search to provide parents and children with expanded information and help if they encounter unsafe situations. Siri and Search will also intervene when users try to search for CSAM-related topics.

“When receiving this type of content, the photo will be blurred and the child will be warned,” Apple said. “As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it.”

Similar precautions are triggered if a child tries to send a sexually explicit photo.
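The Messages flow described above can be sketched as a simple decision: an on-device classifier flags a photo, which triggers the blur, the warning to the child, and (optionally) the parental notification. Apple has not published its model, so the classifier, threshold, and function names below are purely illustrative assumptions:

```python
from dataclasses import dataclass

# Hypothetical sensitivity threshold; Apple's actual model and
# scoring are not public.
SENSITIVE_THRESHOLD = 0.9

def classify_sensitivity(image_bytes: bytes) -> float:
    # Stand-in for an on-device machine-learning model: a real
    # implementation would score the image content locally, so the
    # photo never leaves the device unencrypted.
    return 0.95 if b"explicit" in image_bytes else 0.1

@dataclass
class MessageDecision:
    blur: bool
    warn_child: bool
    notify_parents: bool

def handle_incoming_photo(image_bytes: bytes,
                          parental_alerts: bool) -> MessageDecision:
    """Decide how Messages handles a received photo for a child account."""
    sensitive = classify_sensitivity(image_bytes) >= SENSITIVE_THRESHOLD
    return MessageDecision(
        blur=sensitive,
        warn_child=sensitive,
        # Parents are notified only if the safeguard is enabled
        # and the content was flagged.
        notify_parents=sensitive and parental_alerts,
    )
```

Because the classification happens on the device, this design is consistent with Apple's statement that private communications remain unreadable to the company.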

All these features are coming later this year in updates to iOS 15, iPadOS 15, watchOS 8, and macOS Monterey.

Digital rights organisations say the tweaks to Apple’s operating systems create a potential ‘backdoor’ into gadgets that could be exploited by governments or other groups.

Among others, WhatsApp chief Will Cathcart was critical of Apple’s move. WhatsApp, the world’s largest fully encrypted messaging service, is also under pressure from governments that want to see what people are saying, and it fears that pressure will now increase.

Cathcart tweeted: “We’ve had personal computers for decades, and there has never been a mandate to scan the private content of all desktops, laptops or phones globally for unlawful content. It’s not how technology built in free countries works.”