Facebook and Twitter say Australia wants to give regulator too much power in bid to combat online bullying

Josh Taylor

Photograph: Hero Images/Alamy

Facebook and Twitter have raised concerns that controversial new legislation aimed at curbing bullying and governing online activity will give Australia’s eSafety commissioner too much power over online speech with little oversight.

The online safety bill, introduced into parliament last week, would give the eSafety commissioner a broad range of powers to target bullying and harassment online. It extends existing powers protecting children from online bullying to cover adults as well, and adds powers over abhorrent violent and terrorist material and adult content, both online and on social media in Australia.

Under the proposed legislation, the eSafety commissioner would gain powers to conduct investigations into bullying, image-based abuse and other areas within their remit, including the power to summon a person to appear before the commissioner, produce documents or answer questions relevant to the investigation.

Twitter has said it is concerned about conferring “quasi-judicial and law enforcement powers” on the eSafety commissioner without any guidelines or guardrails around which cases would be considered serious enough to use these powers, other than the broad “serious harm” definition in the legislation.


“In a law enforcement setting, such investigative powers are typically accompanied through a well-established legal process, like obtaining a warrant,” Twitter said in its submission to the Senate committee reviewing the legislation.

“We believe that the process as currently drafted lacks appropriate oversight and due process (ie independent judicial authorisation and the establishment of probable cause).”

Facebook said in its submission the legislation “grants a single regulator a considerable level of discretion and power over speech online” without clear checks and balances.

The company said it was concerned the commissioner could issue takedown requests to social media for content deemed to be “offensive”. It said the definition of offensive was subjective – pointing to past controversy over similar clauses in section 18C of the Racial Discrimination Act.

“The risk is that an eSafety commissioner could order the removal of ‘offensive’ content with public interest value (such as, posts from whistleblowers that contain allegations similar to stories considered by the Royal Commission into Institutional Responses to Child Abuse),” Facebook said.

“There is also the very real risk that this low threshold, as it has in other legislation, would capture political speech: the heat of political debate may result in legitimate political comments that could be considered offensive. Because the speech of political officials is within scope of the legislation, a regulator will have the discretion to potentially police what politicians say to each other.”

Facebook also raised concerns that the new scheme proposed in the legislation would extend takedowns beyond social media posts into private messaging, even though law enforcement already has powers to prosecute people for harassing others over messaging services.

“Private messaging could involve interactions that are highly nuanced and context-dependent and could be misinterpreted as bullying, like a group of friends sharing an in-joke, or an argument between adults currently or formerly in a romantic relationship,” Facebook said.

“It does not seem clear that government regulation of these types of conversations are warranted, given there are already measures to protect against when these conversations become abusive.”

Facebook said private messaging services, such as Messenger or WhatsApp, already had tools to allow users to delete unwanted messages and block contacts.

Twitter, Facebook and Google each pointed out in their submissions the lack of transparency over how the commissioner’s decisions would be made, the lack of oversight, and the limited avenues for review – other than taking a case to the administrative appeals tribunal.


Google argued there should be an oversight body, made up of the companies and other stakeholders, to address both human rights and technical issues that may arise.

Many of the submissions to the Senate inquiry raised concerns about the speed at which the government was attempting to pass the bill into law.

The government held a consultation on the draft legislation between 23 December and 14 February, then waited less than two weeks before introducing the legislation into parliament.

The legislation was referred to a Senate committee last week, with a submissions deadline on Tuesday. A hearing has been scheduled for Friday, and the committee is due to report back to government on 11 March.

Facebook said there had not been enough time for the government to consider the “valid concerns” about the legislation that were raised before it was introduced into parliament.

“Even the best-intentioned policymakers would struggle to comprehend the significant changes contemplated in this legislation within that very short period of time. We urge the committee to recommend a longer period of time for Australian policymakers to properly review and consider the consequences of the draft legislation at hand.”