Facebook 'bug' stopped it removing terrorist content

The site also said it took down a million more newly-uploaded posts that it had found itself in the second quarter than during the first, with the overall number rising from 1.2m to 2.2m. - AFP

Facebook failed to delete posts promoting terrorism because of a "bug" in its systems, the company has admitted. 

The social network, which has been under mounting pressure to police extremist material, blamed a technical glitch for a sharp rise in the time it took to remove the posts.

Some of the posts could have been up for several weeks or even months before they were deleted, the Telegraph understands. 

The revelation raises more questions about Facebook's ability to police the content on its own website, even as it has invested in tools to automatically spot and delete terrorist images.


Facebook said the median time to take action on posts recently uploaded to the site leapt from less than a minute to 14 hours between the first and second quarters of the year, because it fixed a bug which had previously prevented it from removing older posts.

“The increase was prompted by multiple factors, including fixing a bug that prevented us from removing some content that violated our policies, and rolling out new detection and enforcement systems,” said Monika Bickert, Facebook's global head of policy management, and Brian Fishman, its head of counterterrorism policy.

The median time dropped again to less than two minutes in the third quarter of the year.

Facebook said that new technologies used to take down terrorist material “improve and get better over time, but during their initial implementation such improvements may not function as quickly as they will at maturity”.

“This may result in increases in the time-to-action, despite the fact that such improvements are critical for a robust counterterrorism effort,” the company added.

The site also said it took down 2.2 million newly-uploaded posts that its technology had found in the second quarter of the year, compared to 1.2 million in the first quarter.

So far this year, it has taken down 14.3 million terrorist-related posts. That includes newly-uploaded posts it has found itself, older posts it has found, and those reported by users.

Facebook is using AI to spot potentially harmful posts that appear to express support for Islamic State or al-Qaeda, with an automated tool giving each post a score indicating how likely it is to contain support for terrorism.

Human reviewers then prioritise the items with the highest scores, while posts whose scores indicate a very high likelihood of terrorist content are removed automatically.
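In outline, that is a score-and-triage pipeline: classify each post, remove it automatically above a high-confidence threshold, and queue everything else for humans in descending score order. The Python sketch below is a minimal illustration of that general pattern, not Facebook's actual system; the threshold values, function names and post IDs are all invented for the example.

```python
from dataclasses import dataclass, field
import heapq

# Hypothetical cut-offs -- Facebook has not published its real thresholds.
AUTO_REMOVE_THRESHOLD = 0.99   # assumed score above which posts are removed automatically
REVIEW_THRESHOLD = 0.50        # assumed score above which posts are queued for humans

@dataclass(order=True)
class QueuedPost:
    # Store the score negated so the highest-scoring posts come off the heap first.
    priority: float
    post_id: str = field(compare=False)

def triage(posts: dict[str, float]) -> tuple[list[str], list[QueuedPost]]:
    """Split scored posts into auto-removals and a human-review queue.

    `posts` maps a post ID to a classifier score in [0, 1] estimating how
    likely the post is to contain support for terrorism. All names and
    thresholds here are illustrative.
    """
    removed: list[str] = []
    queue: list[QueuedPost] = []
    for post_id, score in posts.items():
        if score >= AUTO_REMOVE_THRESHOLD:
            removed.append(post_id)  # very high likelihood: remove automatically
        elif score >= REVIEW_THRESHOLD:
            heapq.heappush(queue, QueuedPost(-score, post_id))  # reviewers see highest scores first
    return removed, queue

if __name__ == "__main__":
    scored = {"post_a": 0.999, "post_b": 0.87, "post_c": 0.62, "post_d": 0.10}
    removed, queue = triage(scored)
    print("auto-removed:", removed)
    while queue:
        item = heapq.heappop(queue)
        print(f"review {item.post_id} (score {-item.priority:.2f})")
```

Using a heap keeps the highest-scoring posts at the front of the review queue without re-sorting after every insertion, which matters when new posts are being scored continuously.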

Facebook said machine learning had helped cut the average time taken to remove posts reported by users from 43 hours in the first quarter to 18 hours in the third.

The site is among the social networks that have faced criticism for their role in allowing terror groups to spread propaganda and recruit new members.

It has also faced questions over its use of human content reviewers, who must watch violent and unsettling videos and images in order to determine whether they should be removed from the site.

In September the European Commission said Facebook, Google and other tech companies could face fines if they did not remove terrorist content within an hour of being notified about it by the authorities.