Tech Lobby Asks for EU Liability Cover to Tackle Hate Speech

(Bloomberg) -- Big technology platforms are calling on the European Union to protect them from legal liability when they remove hate speech and illegal content, as government scrutiny of how platforms manage user posts grows worldwide.

A safeguard protecting companies that actively manage user posts would result in “better quality content moderation” by incentivizing platforms to remove bad content while protecting free expression, Edima, an association representing Facebook Inc., ByteDance Ltd.-owned TikTok, Alphabet Inc.’s Google and others, said in a paper Monday.

Current EU rules protect platforms from liability for what’s posted on their sites, unless they have “actual knowledge” of its presence -- for instance, if a user flags it as harmful. Once platforms are made aware of illegal content, they’re obliged to act fast to remove it.

Tech firms fear that by removing content voluntarily, such as with algorithms or other systems to detect infringements, they could be deemed to have actual knowledge and be liable for hosting the bad posts. That’s becoming more of a concern as the European Commission, the bloc’s executive body, prepares to overhaul the longstanding rules to give platforms greater responsibility for content spread on their sites, covering everything from hate speech and terrorist propaganda to unsafe toys.

“All of our members take their responsibility very seriously and want to do more to tackle illegal content and activity online,” said Siada El Ramly, director general of Edima. “A European legal safeguard for service providers would give them the leeway to use their resources and technology in creative ways in order to do so.”

Europe isn’t alone in increasing scrutiny of tech firms’ legal protections. A U.S. Senate panel has called the chief executive officers of Facebook and Twitter Inc. to testify about their content policies next month. President Donald Trump and other conservatives claim current liability laws for tech platforms enable the companies to silence their views.

In recent years, platforms have also come under intense scrutiny for failing to do enough to monitor activity such as hate speech that’s blamed for inciting violence in places like Myanmar, or for letting Russians spread disinformation to influence the 2016 U.S. presidential election and the U.K.’s Brexit vote.

Read More: U.S., EU Part Ways in Regulating User Content on Social Media

Still, tech companies have been wary of shouldering too much legal responsibility for posts, which they say could harm freedom of speech by pushing firms to block more content than necessary in order to avoid sanctions.

Edima said it’s sending its proposed amendments to officials in the European Commission, Parliament and Council. The amendments say providers should still be held accountable for inaction if they receive a substantiated notification of a specific illegality.

New Rules

The EU doesn’t plan to remove the liability protections altogether, but could hit companies with fines if they fail to do enough. The commission is also considering a provision clarifying that measures to actively search for problematic content don’t take tech companies “outside the scope of the liability exemptions,” according to a draft of upcoming policy obtained by Bloomberg. A representative for the commission declined to comment on the draft.

As part of the regulatory overhaul, platforms could also face obligations to maintain a notification system for users to flag illegal content, to report regularly on content removal rates and to collect identification information from business users.

“Very large platforms” may face additional requirements, including providing more transparency around their content moderation, amplification of certain content and online advertising services. In addition, the EU is planning to set up a new board in charge of supporting national authorities to monitor compliance, according to the draft.

The EU proposals, which will also include a new regulation to curb the power of large platforms, are due to be unveiled in early December, but could still be delayed. Once they are put forward, the Commission, Parliament and Council will need to agree on a final version of the text before it becomes law.

(Updates with context on liability rules throughout.)

©2020 Bloomberg L.P.