The UK will be the first country in the world to make it illegal to possess, create or distribute AI tools designed to create child sexual abuse material (CSAM), with a punishment of up to five years in prison.
Paedophiles are using artificial intelligence (AI) technology to create and sell life-like child sexual abuse material, the BBC has found.
Some are accessing the images by paying subscriptions to accounts on mainstream content-sharing sites such as Patreon.
The National Police Chiefs' Council said it was "outrageous" that some platforms were making "huge profits" while not taking "moral responsibility".
The makers of the abuse images are using AI software called Stable Diffusion, which was intended to generate images for use in art or graphic design.
People trying to view sexual images of children online will trigger a first-of-its-kind chatbot, which has launched to help potential offenders stop their behaviour.
Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.
Before an image is stored in iCloud Photos, the technology will search for matches against already known CSAM.
Apple said that if a match is found, a human reviewer will assess the image and report the user to law enforcement.
Criminals and paedophiles are trying to groom and exploit young siblings as part of an emerging trend of online sexual abuse, experts have warned.
The Internet Watch Foundation said victims ranged in age from three to 16, with some groomed to copy adult pornography.
The majority of child sexual abuse gangs are made up of white men under the age of 30, an official paper has said.