Tagged with CSAM


06 January 2026

A spokesperson for the regulator said it was also investigating concerns that Grok has been producing "undressed images" of people.

The BBC has seen several examples on the social media platform X of people asking the chatbot to alter real images of women without their consent, making them appear in bikinis or placing them in sexual situations.


06 August 2021

Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.

Before an image is stored in iCloud Photos, the technology searches for matches against a database of already known CSAM.

Apple said that if a match is found, a human reviewer will then assess it and report the user to law enforcement.
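The matching step described above, checking each image against a database of known material before upload, can be sketched roughly as follows. This is a simplified illustration, not Apple's implementation: Apple's system computes a perceptual hash (NeuralHash) on-device, which tolerates resizing and re-encoding, whereas this sketch uses a plain SHA-256 digest and a cleartext hash set purely to show the lookup structure.

```python
import hashlib

# Hypothetical database of digests of known images. In Apple's
# system this is an encrypted database of perceptual hashes, not
# a cleartext set of SHA-256 digests as shown here.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"foo", standing in for a known image.
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_digest(data: bytes) -> str:
    """Return a hex digest identifying the image bytes exactly."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Flag the image for human review if its digest is in the database."""
    return image_digest(data) in KNOWN_HASHES

print(matches_known(b"foo"))  # True: digest is in the database
print(matches_known(b"bar"))  # False: unknown content
```

A cryptographic hash like SHA-256 only matches byte-identical files; the point of a perceptual hash in the real system is that slightly modified copies of a known image still produce a matching fingerprint.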
