Simfin

Online safety and digital citizenship specialist

Tagged with CSAM

19 February 2026

Tech platforms would have 48 hours to remove intimate images that have been shared without consent, under a proposed UK law.

The government said tackling intimate image abuse should be treated with the same severity as child sexual abuse material (CSAM) and terrorist content.

Companies that fail to abide by the rules could be fined up to 10% of their global sales or have their services blocked in the UK.

06 January 2026

A spokesperson for the regulator said it was also investigating concerns that Grok has been producing "undressed images" of people.

The BBC has seen several examples on the social media platform X of people asking the chatbot to alter real images of women so they appear in bikinis without their consent, or to depict them in sexual situations.

06 August 2021

Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.

Before an image is stored in iCloud Photos, the technology will search for matches against already known CSAM.

Apple said that if a match is found, a human reviewer will assess it and report the user to law enforcement.
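
A minimal sketch of what this kind of on-device matching looks like, assuming a simple hash-set lookup (Apple's actual system uses NeuralHash, a perceptual hash, combined with cryptographic private set intersection; the hash function, database, and review hook below are illustrative placeholders, not Apple's implementation):

```python
import hashlib
from pathlib import Path

# Placeholder database of hashes of known prohibited images. In a
# real deployment these would be supplied by child-safety bodies.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(path: Path) -> str:
    """Hash an image file. SHA-256 is used here for simplicity;
    production systems use perceptual hashes that survive
    resizing and re-encoding."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def flag_for_review(path: Path) -> None:
    """In the flow Apple described, a match is escalated to a
    human reviewer before any report to law enforcement."""
    print(f"Match found; queued for human review: {path}")

def check_before_upload(path: Path) -> bool:
    """Return True if the image may be uploaded to cloud storage;
    flag it for human review if it matches a known hash."""
    if image_hash(path) in KNOWN_HASHES:
        flag_for_review(path)
        return False
    return True
```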
