Under rules proposed by the UK government, top tech companies including Facebook (FB) and Twitter (TWTR) will need to remove and limit the spread of illegal content, such as child sexual abuse and terrorist material, or face fines of up to £18m ($24m) or 10% of annual global turnover, whichever is higher.
The government plans to bring the laws forward in an Online Safety Bill next year.
The laws “will safeguard people’s rights online and empower adult users to keep themselves safe while preventing companies arbitrarily removing content.”
“It will defend freedom of expression and the invaluable role of a free press, while driving a new wave of digital growth by building trust in technology businesses,” the Department for Digital, Culture, Media and Sport said in a statement.
Ofcom is confirmed as the regulator with the power to fine companies failing in their duty of care. It will have the power to block non-compliant services from being accessed in the UK.
Social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games will all be impacted by the rules.
The most popular social media sites, likely to include Facebook, TikTok, Instagram and Twitter according to the department, will need to “go further” by enforcing clear terms and conditions which explicitly state how they will handle content which is legal but could cause significant physical or psychological harm to adults.
This includes dangerous disinformation and misinformation about coronavirus vaccines, and will help bridge the gap between what companies say they do and what happens in practice, the government said.
The legislation includes provisions to impose criminal sanctions on senior managers. The government will not hesitate to bring these powers into force should companies fail to take the new rules seriously, it said.
Digital secretary Oliver Dowden said: “I’m unashamedly pro tech but that can’t mean a tech free for all,” adding that the proposed framework “will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”
Earlier in the year, Lord David Puttnam had said the bill may not come into effect until 2023 or 2024, after a government minister said they could not commit to bringing it to parliament in 2021.
The bill was unveiled in 2019 after the case of teenager Molly Russell came to light. She killed herself in 2017, allegedly after viewing online images of self-harm.
The government has said it is also looking into whether the promotion of self-harm should be made illegal.
Rocio Concha, director of policy and advocacy at consumer group Which?, believes the bill is a “missed opportunity to force online platforms such as Facebook and Google to address their shortcomings and prevent fraudsters from operating on their sites.”
However, she praised the government for “giving tech giants more responsibility to protect users.”
Last week, the UK’s Competition and Markets Authority said it had issued advice to the UK government on a new regime for digital markets, which will “proactively shape the behaviour of the most powerful tech firms” including Google (GOOG) and Facebook.