What’s behind the rise of self-generated
indecent images of children online?
A report from the APPG (All Party Parliamentary Group) on social media
Facial recognition technology (FRT) may need to be regulated in much the same way as some ethically sensitive medical techniques to ensure there are sufficient safeguards in place to protect people's privacy and freedoms.
A sweeping set of regulations governing how online services should treat children’s data has been welcomed by campaigners as it comes into effect.
The Age Appropriate Design Code – which was written into law as part of the Data Protection Act 2018, the legislation that also implemented GDPR in the UK – requires websites and apps, from Thursday, to take the “best interests” of their child users into account or face fines of up to 4% of annual global turnover.
There is a clear summary of the code here.
Now, with the Taliban back in power, each digital breadcrumb could be a reason to be punished or killed.
There are several different ways the Taliban could find out information about you: through information stored locally on your device; through your contacts (messages you have exchanged with them may still be on their devices); through the cloud services you use; and through the data moving between those places, which can be intercepted.
App and web-based ordering has become commonplace during the pandemic.
But the Information Commissioner's Office told the BBC that customers should be aware they had a choice over whether to share information.
Venues should only ask for data that is "relevant and necessary", the ICO said.
Apple has announced details of a system to find child sexual abuse material (CSAM) on US customers' devices.
Before an image is stored in iCloud Photos, the technology searches for matches against already-known CSAM. Apple said that if a match is found, a human reviewer will assess it and report the user to law enforcement.
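As a rough illustration of the matching step only – not Apple’s actual implementation, which uses a perceptual “NeuralHash” and cryptographic private set intersection rather than plain file hashes – the check amounts to comparing an image’s fingerprint against a list of fingerprints of already-known material. The hash value and function names below are placeholders.

```python
import hashlib
from pathlib import Path

# Placeholder set of fingerprints of already-known CSAM, as supplied by
# child-safety organisations. Apple's real system distributes blinded
# perceptual NeuralHash values, not plain SHA-256 digests like this one.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def fingerprint(path: Path) -> str:
    """Hash the file's bytes (a stand-in for a perceptual image hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_before_upload(path: Path) -> bool:
    """Return True if the image matches a known fingerprint and should be
    routed to human review rather than uploaded without further checks."""
    return fingerprint(path) in KNOWN_HASHES
```

In the scheme Apple announced, the result of the match is not revealed on the device itself; only once a threshold number of matches has accumulated can Apple read the flagged material, at which point the human review described above takes place.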