Simfin

Online safety and digital citizenship specialist


06 August 2021

Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.

Before an image is uploaded to iCloud Photos, the technology will search for matches against already known CSAM.

Apple said that if a match is found, a human reviewer will assess the image and report the user to law enforcement.
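The matching described above works by comparing image fingerprints against a database of hashes of known material, rather than by inspecting image content directly. Below is a minimal illustrative sketch of that idea. It is not Apple's implementation: Apple's system uses a perceptual hash it calls NeuralHash, which can match visually similar images, while the sketch uses an ordinary cryptographic hash (SHA-256) that only matches byte-identical files. The database contents and function names here are hypothetical.

```python
import hashlib

# Hypothetical database of fingerprints of known material. In the real
# system these would be perceptual hashes supplied by child-safety
# organisations; here it holds one SHA-256 digest for illustration
# (the digest of the empty byte string).
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest serving as the image's fingerprint.

    SHA-256 stands in for a perceptual hash purely for illustration:
    it only detects exact byte-for-byte copies, not edited images.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def should_flag_for_review(image_bytes: bytes) -> bool:
    """Check an image against the known-hash list before upload.

    A True result would route the image to human review, per the
    process described in the article.
    """
    return fingerprint(image_bytes) in KNOWN_HASHES
```

The key design point is that only fingerprints are compared, so the matching step never needs to interpret the image itself; the human-review stage exists to catch hash collisions or errors before any report is made.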
