Facebook, Instagram and TikTok directing harmful content towards teens within 24 hours of them joining
FACEBOOK, Instagram and TikTok are directing harmful content towards young teens within 24 hours of accounts going live, a disturbing investigation claims.
The tech giants are recommending material that breaks their own guidelines, including X-rated images, pro-suicide messages and celebrations of anorexia. They also allow porn to be sent directly to youngsters, the probe points out.
Tech giants are directing harmful content towards teens within 24 hours of them joining platforms, an investigation reveals (Picture: Getty)
A dummy teen avatar set up by the 5Rights investigation that was sent porn (Picture: 5Rights)
One dummy account saw adult material promoted alongside a Home Office campaign warning about “being pressured to send nudes”.
The investigation, by online safety campaign group the 5Rights Foundation, also found horrifying pictures of sliced arms and razor blades had been sent to a 13-year-old girl on Instagram.
That was despite bosses at the social media platform promising to outlaw such material in the wake of the death of Instagram user Molly Russell, who took her own life in 2017, aged 14, after viewing disturbing material online.
Last night her father Ian, 58, said the findings brought back the horror of his daughter’s death.
He told The Sun: “This research is so powerful. It takes me back to those horrific discoveries that were made about Molly’s online life after her death.” In the final six months of her life, Molly used her Instagram account more than 120 times a day, including looking at graphic images of self-harm and suicide.
Following an inquest last February, Instagram said it “does not allow content that promotes or glorifies self-harm or suicide and will remove content of this kind”.
Yet research by 5Rights suggests content is being pushed towards youngsters with little apparent concern for the risks to their wellbeing. Now Ian is calling for action.
He said: “Four school-age children die by suicide every week in the UK, according to the Office for National Statistics. I believe many of those will be connected to what they have seen online.
“Every week you delay, you are risking young lives. It is of the utmost urgency to bring this legislation as quickly as is sensible.
“Why is it still there three and a half years after Molly’s death? Why is it taking so long to do anything about it?”
For 5Rights’ investigation, dummy avatars of teens were set up on major social media platforms.
These accounts were soon deluged with disturbing material.
All ten avatars received direct messages from accounts they did not follow, including adult strangers and adverts for porn.
5Rights claims the tech giants are preoccupied with pushing content to users as quickly as possible and give too little thought to the risks.
Ian, from North London, says social media sites “helped” kill his daughter by allowing Molly to access such dreadful content.
‘NO SAFETY STANDARDS’
He said: “This investigation highlights what we found in Molly’s case, which is that material is suggested to you.
“The platforms, as well as using the algorithms behind the scenes, were messaging Molly on their platforms saying ‘you like #sad/depression, why not follow #lonely/helplessness as well?’
“In one case on Molly’s social media, the platform went as far as emailing her suggestions for other suicide and self-harm content.”
Ian says there were no warning signs that Molly was depressed or viewing self-harm sites.
He explains: “There were no obvious signs of mental ill-health in Molly.
“She was extraordinary at hiding what was going on in her head.”
The draft Online Safety Bill aims to make Britain “the safest place to be online in the world”.
But it is unlikely to pass until late 2022 and might not come into force for several years.
Critics claim it could restrict free speech online, but Ian said: “A lot of people are being alarmist about what is needed to make the internet safe.
“All we want is the normal safety measures that apply to our everyday lives mirrored online.”
5Rights carried out anonymous interviews with senior figures at top tech companies.
One product manager is said to have told the group: “There are no safety standards. There is no ethics board in the digital space.”
It reported a strategy director as saying: “Companies make their money from attention. Reducing attention will reduce revenue.”
Baroness Kidron, who chairs 5Rights, wants the Online Safety Bill to be tougher, so harmful material cannot be sent to under-18s.
She said: “Perhaps the most startling image of the report is a screenshot in which it is clearly visible that a ‘child’ avatar is being ‘targeted’ with adverts for Nintendo Switch, a sweet shop and teen tampons — and, at the same time, pro-suicide material. How is that right? How is that legal?”
Research shows that 98 per cent of children aged over ten use the internet, while a third of 12 to 15-year-olds have seen “worrying or nasty content”.
Ian said: “These are global companies that count their profits in the billions. You ask yourself, ‘Why are they failing our children?’”
A spokesperson for TikTok said: “We removed 62 million videos in the first quarter of 2021 for violating our community guidelines, 82 per cent of which were removed before they received a single view. TikTok has taken industry-leading steps to promote a safe and age-appropriate experience for teens.”
Facebook had not responded last night.
Another fake teen account was sent pro-anorexia content (Picture: 5Rights)
A profile set up for the purposes of the probe was sent pro-suicide material (Picture: 5Rights)
Molly Russell, 14, took her own life after viewing self-harm content (Picture: PA)