Social media firms could face UK ban unless they remove harmful content

UK health secretary Matt Hancock appeared on the BBC warning social media companies that they could face a ban unless they actively remove harmful content from their platforms.

In an interview on the BBC’s Andrew Marr show, Mr Hancock said: “If we think they need to do things they are refusing to do, then we can and we must legislate.”

Hancock was asked if the government would ever go as far as banning online platforms that fail to remove harmful content.

“Ultimately parliament does have that sanction, yes,” he said. “It’s not where I’d like to end up, in terms of banning them, of course, because there’s a great positive to social media too.”

“But we run our country through parliament and we will and we must act if we have to.”

Hancock has urged social media firms to remove content from their platforms that promotes suicide or self-harm.

In a letter to Twitter, Snapchat, Pinterest, Apple, Google and Facebook, which owns Instagram, Hancock said:

“It is appalling how easy it still is to access this content online and I am in no doubt about the harm this material can cause, especially for young people.

“It is time for internet and social media providers to step up and purge this content once and for all.”

Hancock added that the British government is working on a white paper on the issue.

“I want to make the UK the safest place to be online for everyone – and ensure that no other family has to endure the torment that Molly’s parents have had to go through,” he said.

In 2017, a 14-year-old British girl called Molly Russell took her own life after viewing disturbing content on Pinterest and Instagram. Molly’s father, Ian, believes that the platforms played a part in his daughter’s death.

Mr Russell, from Harrow, north London, told The Sunday Times:

“The more I looked, the more there was that chill horror that I was getting a glimpse into something that had such profound effects on my lovely daughter.”

“Pinterest has a huge amount to answer for,” he added.

Mr Russell said: “It is clear to us that, despite what the social media companies tell the public about their policies of removing disturbing content, such content is still available for young people to find easily – and by finding it, they have more and more of it pushed on them by algorithms.”

“It is time for tech companies to stand up and take more responsibility for the content available to their young users.”

Facebook said it was “deeply sorry”, adding that graphic content which sensationalises self-harm and suicide “has no place on our platform”.

Facebook executive Steve Hatch said: “The first thing I’d like to say is just what a difficult story it was to read and I, like anyone, was deeply upset.

“I’m deeply sorry for how this must have been such a devastating event for their family.”

He added:

“If people are posting in order to seek help and in order to seek support from communities, the experts in this area tell us that is a valuable thing for them to do. It can help with recovery, it can help with support.

“If it’s there to sensationalise and glamourise, of course it has no place on our platform, it shouldn’t be on our platform. And if we need to work harder to make sure it isn’t on our platform then we certainly will.”