Instagram has launched new technology to recognise self-harm and suicide content on its app in the UK and Europe.
The new tools can identify both images and words that break its rules on harmful posts.
Molly Russell, 14, took her own life in 2017. When her family looked into her Instagram account they found distressing material about depression and suicide.
Molly's father Ian says he believes Instagram is partly responsible for his daughter's death.
A powerful and upsetting interactive site displays 600 of the messages that led a young girl to take her own life.
Hannah Smith was bombarded with vile messages telling her to kill herself. Posts on the website Ask.fm told her to drink bleach, called her a slut and encouraged her to take her own life. Last summer, she was found hanged in her room.
But the messages were not from internet trolls: 98% of them had come from the same IP address as Hannah's own, with only about four posts that had not.