Stronger regulation of social media platforms is needed because their culture “needs to change”, MPs and peers have been told.
Online safety campaigner Ian Russell, whose daughter Molly took her own life after viewing harmful content, said the Government’s draft Online Safety Bill was an essential tool to help stop the spread of abuse.
Giving evidence to a joint committee of MPs and peers examining the proposed Bill, Mr Russell said stricter rules around protecting people from harmful content, and harsher punishments for sites that fail to do so, were the only way to force change.
“It seems only when either news stories break in a particularly public way or when perhaps regulations change that platforms respond,” he said.
“From our view, the corporate culture at these platforms needs to change. They need to be proactive rather than reactive.
“After all, they have the resources and skills to do this, but it’s so often done as an afterthought.
“They should live up to their words about taking online safety seriously.”
Mr Russell was joined before the committee by a number of other online safety campaigners and experts to give evidence on how to improve the draft Online Safety Bill, which was published by the Government earlier this year.
The draft Bill proposes new rules for online platforms to ensure they protect their users from encountering harm, binding them to a new duty of care – overseen by Ofcom as the sector’s new regulator – with large fines and site blocking among the potential punishments for breaking the rules.
Echoing Mr Russell’s concerns about the corporate culture at many major tech firms, Izzy Wick, director of UK policy at children’s online safety group 5Rights Foundation, said the harms children in particular faced were down to “systemic” issues with the platforms.
“The scale at which children experience harm online is because of the systems and processes that are designed to extend engagement, to maximise reach and to maximise activity at any cost, including the safety and well-being of children,” she said, highlighting how just a few misplaced clicks by a teenager on social media could soon see them served with a range of harmful content.
“No environment can be 100% risk-free, but we want to make sure that rather than creating a walled garden for children we are doing everything we can and taking every available opportunity to prevent risk, and if the Bill is going to deliver for children it needs to focus on those systems and processes that create risk,” she added.
During the hearing, concerns were raised over the proposed rule that would protect content classed as “democratically important” – meaning content intended to contribute to political debate by promoting or opposing government policy.
When asked by the committee if they were worried this could be exploited as a loophole, disinformation expert Nina Jankowicz, director of external engagement at Alethea Group, said she was concerned some people could attempt to portray themselves as political or citizen journalists to keep their content online.
She said she had been targeted with abuse by “citizen bloggers” and people who “like to think of themselves as journalists” and said she was concerned online trolls would use this approach to claim their freedom of expression was being “quashed” if their content was taken down.
Matt Harrison, public and parliamentary affairs manager for the Royal Mencap Society, said he believed it was a “dangerous loophole” given the prominence and importance of social media in political debate.
He added that he had “struggled” to think of a scenario where a harmful or discriminatory remark against a disabled person would be in the public interest or of “democratic value”.
“So I think it is a loophole and I think there’s a couple of phrases within the legislation which are very ambiguous and open to interpretation from various angles and actually could undermine the intentions of the Bill,” he said.
But Mr Russell sounded a hopeful note, suggesting that properly implementing the Bill would help make the internet safer for everyone.
“I think the online world, after a period of self-regulation which patently hasn’t worked, is more dangerous, and I think that’s a problem for the online world because it needs to do good, it’s here for us to use to do good,” he said.
“And so the online world needs to be a better reflection of the offline world in which the dangers are controlled and the platforms, using digital technology, make it safe, particularly for young and vulnerable people.
“And in my mind, it’ll be a return to the world of the internet that I used to use 10 years ago when it seemed to be a much safer place than it is now – the algorithms of the platforms seem to have propelled it towards a much darker, dangerous place.”