How did a single social media post, taken down within hours for being false, still end up being viewed millions of times and presented as credible evidence about the Southport attack?
What connects a dad living in Lahore, Pakistan, an amateur hockey player from Nova Scotia - and a man named Kevin from Houston, Texas?
They’re all linked to Channel3Now - a website whose story giving a false name for the 17-year-old charged over the Southport attack was widely quoted in viral posts on X. Channel3Now also wrongly suggested the attacker was an asylum seeker who arrived in the UK by boat last year.
It’s become a familiar pattern: a violent, terrifying attack unfolds, innocent people are killed, and social media is set alight with unfounded - and often incorrect - accusations about the assailant's identity and motivation.
Think back to the stabbing attacks in Sydney earlier this year, falsely blamed on a Jewish student, or even the assassination attempt on Donald Trump in July.
It’s the same with Monday’s attack on a children’s holiday dance and yoga session in Southport, England.
For 14 hours over the weekend, Sydney university student Ben Cohen was one of the most reviled men on the internet after he was falsely accused of being the knifeman who went on a stabbing rampage in a Sydney shopping centre, killing six people.
The ABC has pieced together how antisemitic and pro-Kremlin accounts turned Mr Cohen into an internet villain.