“This Government is behaving like a cowboy builder who delays the work again and again, and leaves the job only half done.”
Campaigners and MPs have responded with anger after the Government quietly announced another delay to its flagship legislation designed to reduce ‘online harms’.
The UK Government is currently developing proposals for a law which would impose a duty of care on tech companies to tackle a rising tide of abuse, self-harm material and child exploitation online.
The legislation is expected to give an online harms regulator – as yet unappointed – the power to investigate and sanction tech companies for failing to meet new standards on removing harmful content.
But the Government appears to have watered down the plans, and has now repeatedly delayed them since a consultation closed last year. Despite launching the White Paper in April 2019, the Government has failed to publish its full response to that consultation, blaming the pandemic. Ministers have also failed to publish the voluntary ‘interim’ codes of practice for tech giants.
Moreover, MPs are concerned that the proposed legislation will ignore the problems of misinformation and disinformation, and will lack sanctions for tech companies that fail in their duty of care.
Ministers have previously insisted the plans will not be watered down and are on their way. A report in the Telegraph on 7th October suggested the Government would announce the plans ‘within weeks’, with a new online harms regulator given powers to shut down social media sites found guilty of serious breaches of their statutory duty of care, by requiring internet service providers to block access to them.
Minister Caroline Dinenage also suggested senior managers of tech companies would be held personally liable for breaches of the new rules, and subject to sanctions including fines.
But just a week later, MPs learnt the legislation will be delayed until at least next year. The Chair of the Lords Democracy and Digital Technologies Committee, Lord Puttnam, has suggested this means the Online Harms Bill will not come into effect until 2023 or 2024.
Children’s welfare groups such as the NSPCC have been pushing for action on online harms for years now. Calls for action grew after Molly Russell, aged 14, took her own life in 2017. Russell had viewed unregulated self-harm and suicide content on Instagram. Her family have been at the forefront of demands for regulation of social media giants when it comes to removing harmful content.
The Government itself has noted that two thirds of adults in the UK are concerned about content online, and close to half say that they have seen hateful content in the past year. In a recent report, the Government pointed to a large survey of young people who had been cyberbullied, which showed that 37% had developed depression and 26% had suicidal thoughts.
Commenting on the delay, Nicola Aitken, Policy Manager at fact-checking group Full Fact, told Left Foot Forward: “The time has come for legislative measures to be brought before Parliament. Bad information ruins lives—we’ve seen that first-hand during the coronavirus pandemic. Delaying the introduction of legislation risks yet more harm when people need protection.
“We are still waiting a year later to see the full response to the Online Harms consultation from the Government. This means we have insufficient information on how the proposed powers of a regulator will function, which makes it difficult to assess whether the Online Harms proposals would even have any material impact on tackling misinformation.”
Nathan Sparkes, Director of Policy at press campaigners Hacked Off, added: “The Government has already significantly watered down its promises on online harms regulation. Since first publishing the White Paper in April 2019 it’s offered an exemption to the comment sections of national newspapers, despite abundant evidence of racism, fake news and anti-Semitism appearing on those platforms. An approach which exempts platforms which are hosting hateful comments in this way can only ever be a partial solution, and is bound to fail.
“Repeated delays to the timeline for introducing the legislation at all suggest the Government are not taking the problem of online harms nearly seriously enough. This Government is behaving like a cowboy builder who delays the work again and again, and leaves the job only half done.”
Matthew McGregor, Campaigns Director at anti-racism group HOPE not hate, is concerned that delays to the bill mean hate speech will continue unabated online. “There are serious changes needed to ensure that government and regulators have the tools they need to counter hate speech, far right organising, and the appalling impact of bullying, abuse and harassment. This delay seems to be the result of indecision inside government – it’s just not good enough,” he told this site.
The chair of the Commons Digital, Culture, Media and Sport Committee, Tory MP Julian Knight, described further delays as ‘unjustifiable’. “The Government has accepted the evidence this committee presented to it about the unstoppable spread of online misinformation during the pandemic and the harms involved. However, instead of acting with urgency, we’re now being told we have to wait until next year to see the legislation to tackle online harms being published,” Mr Knight said.
He added: “We’re also disappointed that the Government has failed to take this opportunity to identify the body that will be carrying out the crucial role of online harms regulator. We warned in our Report that a continued delay would bring into question the seriousness of intention in this area but once again we’re told we have to wait.”
Some groups, including Big Brother Watch, have however warned the proposed legislation would cause ‘state-sponsored censorship’ by seeking to block ‘harmful’ speech before it is published: “Expect state-sponsored upload filters, recommendation systems & mass surveillance.” The Open Rights Group and Index on Censorship have also warned that the proposed framework could threaten freedom of expression.
But children’s charities and anti-hate speech groups say action must be taken soon given the scale of unregulated hate speech and abuse online.
At the Independent Inquiry into Child Sexual Abuse (IICSA), police organisations stated that there were at least 100,000 people in the UK accessing abuse images at any one time.
Individuals can report harmful content online to the Internet Watch Foundation.
Josiah Mortimer is co-editor of Left Foot Forward.