The watchdog Ofcom is set to be given new powers to tackle “corrosive and abhorrent harms” on social media, the UK government has announced.
The government confirmed on Wednesday it was “minded” to hand the broadcast and telecoms regulator responsibility for regulating illegal and harmful content online. It has been dubbed Britain’s “first internet watchdog.”
A new law is also on the cards enshrining a “duty of care” on firms to have processes in place that protect children and vulnerable users.
But a new report says the government will stop short of forcing platforms to remove legal content even if it could cause harm, citing the need to protect free expression.
Instead firms which facilitate the sharing of user-generated content will be ordered to draw up and “consistently and transparently” enforce their own standards of acceptable behaviour.
Nicky Morgan, the UK secretary of state for digital, culture, media and sport, and home secretary Priti Patel made the announcement in a report published on Wednesday.
“We will not prevent adults from accessing or posting legal content, nor require companies to remove specific pieces of legal content,” the ministers wrote in a foreword. They said users should also be given the chance to appeal when their content is removed, and platforms must be more transparent over removal decisions.
Half of the UK public have seen hateful content online in the past year, according to the report. The government hopes the measures will help tackle not only cyber-bullying but also the darker sides of the internet.
“Terrorist propaganda and vile online child sexual abuse destroy lives and tear families and communities apart,” the ministers said.
But a decision has been delayed on the more controversial question of how the reforms would be enforced. Last year's consultation had considered forcing platforms to withdraw any services that facilitate extremely serious harms, such as use by terrorists.
The consultation had also mooted making business leaders liable for major breaches of the new duty of care, with the risk of fines or even criminal charges.
The latest report said it was “essential that company executives are sufficiently incentivised to take online safety seriously,” but said the government’s enforcement plans would be unveiled in the spring.
Andy Burrows, the National Society for the Prevention of Cruelty to Children’s (NSPCC) head of child safety online policy, said: “Any regulator will only succeed if it has the power to hit rogue companies hard in the pocket and hold named directors criminally accountable for putting children at risk on their sites.”
But Labour’s shadow digital minister Tracy Brabin said it was “shameful” that the government had still published only a partial response to the consultation rather than legislating on the issues.
“Today’s proposals are long overdue, and nothing short of legislation will reassure families that their loved ones are safe online,” she said.
But respondents to the consultation also warned that excessive enforcement could lead cautious firms to over-block user-generated content in order to avoid penalties.
The report details the government’s follow-up plans to its white paper on online harms last April, which promised to make companies more responsible for their users’ safety online.
The announcement marks a significant change for Ofcom, which has regulated broadcasting, telecoms and radio since it was founded in 2003. But the report said Ofcom’s experience and track record made it the “best fit” for the job.
The regulator itself unveiled a new chief executive on Wednesday, announcing civil servant Dame Melanie Dawes will take over from March.