Online video giant YouTube told Parliament on Wednesday that it has changed its policy for preventing harmful and offensive content from being uploaded to its site since a video of the Christchurch massacre was mass distributed across the platform.
Footage of the shooting was uploaded to YouTube on 15 March several thousand times and in different versions, before eventually being removed.
Appearing in front of the Digital, Culture, Media and Sport Select Committee, Marco Pancini, YouTube’s public policy director for Europe, the Middle East and Africa (EMEA), told MPs that in the aftermath of the footage appearing on the platform, the company has taken the decision to block any videos referred by its algorithms first, before any review of the content happens.
Previously, referred content was often still allowed to be uploaded while under review. According to Mr Pancini, the original video of the shooting was identified “within minutes”.
Mr Pancini said: “Now if something like this happens and the machine is identifying pieces of the video, we block the videos, we review the videos and only after we have checked that it is from an authoritative source reporting the facts do we let the content go online.”
Mr Pancini added that YouTube now had an internal team “proactively looking for new trends in online harm” on its platform.
“The head of this team is a former deputy intelligence expert at the FBI, reporting directly to our CEO [Susan Wojcicki]. Part of this team is based here in London.”
Committee chair Damian Collins MP asked: “Do you feel that Christchurch demonstrates that more needs to be done at the level at which these sort of companies operate at present, that the level of adequacy is not enough?”
Mr Pancini replied that, as evidence that more had to be done to prevent undesirable content getting online, YouTube “had signed a pledge and call to action” on Wednesday with President Macron of France and New Zealand Prime Minister Jacinda Ardern.
Mr Collins said: “I guess the question lots of people would ask is why does it take a terrible atrocity like that to make companies like yours sign a pledge?”
YouTube, its parent company Google and Instagram were appearing before the select committee as part of its inquiry into immersive and addictive technologies.
One of the MPs’ concerns was that users might be spending too much time on the platform, given that 70% of the content consumed was recommended by YouTube’s algorithms, rather than discovered directly by users.
Mr Collins asked: “Do you think there should be a function where people are prompted to take a break?”
Rich Waterworth, YouTube’s marketing director for EMEA, said that the platform had now introduced “wellbeing tools” to address this issue.
“Users can set a timer to deliver a notification after a set period of time. The video will pause and prompt them to think about whether they need to take a break. The default will be at 75 minutes of continuous viewing, but the user can adjust that to higher or lower settings.”