What is live facial recognition? Man jailed for stabbing bus driver after police use tech

A man has been sentenced to more than five years in prison after stabbing a London bus driver. Twenty-year-old Bradley Peek fled the scene but was identified within hours by detectives using CCTV images and facial recognition technology.

The bus driver suffered a punctured lung and a tear to the heart after being stabbed in the back while driving his route in east London.

Peek saw a friend arguing with the driver in June 2023. He joined the row and knifed the victim, who was found at the junction with Arbour Square with a wound to his back.

The driver's life was saved by the expertise of medics at the Royal London Hospital in Whitechapel.


Metropolitan Police Detective Inspector Jonathan Potter, who led the investigation, said: “This was a senseless attack and a huge team effort from officers across the Met to secure justice for the victim.

“The use of facial recognition technology and CCTV footage allowed officers to quickly identify and locate Peek.

“I would also like to thank colleagues from the Royal London Hospital who performed an emergency operation which ultimately saved the victim’s life.”

Siwan Hayward, director of security, policing and enforcement at Transport for London, said: “This was an appalling act of violence against our colleague who was just trying to do their job.

“Our staff have the right to do their job without fear or intimidation and we do not tolerate any violence, aggression or threatening behaviour towards them.

“We will always work with the police to push for the strongest sentences possible for offenders, and we're pleased that the offender has been brought to justice.”

The case comes as ministers push for live facial recognition (LFR) technology to be included in routine law-enforcement activities.

But LFR has long been controversial, and its use by UK police forces has previously been found to be unlawful.

What is live facial recognition and how does it work?

Live facial recognition technology works by using cameras to scan the faces of people in a specific area.

Their images are then streamed to an LFR system, which compares them against a database of people the police are looking for.

This “watchlist” includes offenders, people wanted by the police or the courts, or people who pose a risk of harm to themselves or others.
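
In rough outline, the matching step reduces each captured face to a numerical “fingerprint” and scores it against the fingerprints of everyone on the watchlist. The sketch below is purely illustrative, not the Met’s actual system: the embedding model, watchlist contents and threshold are all hypothetical stand-ins.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed cut-off; real deployments tune this carefully

# Hypothetical watchlist: person ID -> precomputed 128-d face embedding
WATCHLIST = {
    "person_001": np.random.rand(128),
    "person_002": np.random.rand(128),
}

def embed_face(frame: np.ndarray) -> np.ndarray:
    """Stand-in for a trained face-embedding model (placeholder output)."""
    return np.random.rand(128)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(frame: np.ndarray):
    """Score one camera frame's face against every watchlist entry."""
    probe = embed_face(frame)
    best_id, best_score = None, 0.0
    for person_id, ref in WATCHLIST.items():
        score = cosine_similarity(probe, ref)
        if score > best_score:
            best_id, best_score = person_id, score
    # Alert only if the best match clears the threshold; faces that
    # match no one on the watchlist are discarded.
    return best_id if best_score >= MATCH_THRESHOLD else None
```

On this model, a passer-by who matches nothing produces no alert, which is consistent with the Met’s description below of LFR as a tool for locating “a limited number of people” rather than tracking everyone.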

The Met says “LFR is not a ubiquitous tool that uses lots of CCTV cameras from across London to track every person’s movements. It is a carefully deployed overt policing tactic to help locate a limited number of people the police need to find in order to keep London safe.”

London’s police force most recently used LFR for the King’s coronation, scanning the crowds at locations such as Buckingham Palace, Westminster Abbey and the Mall “in order to keep people safe”.


Why is facial recognition controversial?

The use of facial recognition technology is controversial for a number of reasons.

One criticism is that thousands of people’s biometric data is captured without their consent. The images used to create the watchlists can also come from anywhere, likewise taken without the individual’s consent.

Additionally, the police have been using LFR in public without parliamentary debate, and the public has not been given an opportunity to vote on its use.

Furthermore, people have been misidentified and subsequently wrongly stopped and searched, and the technology has also been found to be discriminatory, with Black people more likely to be misidentified.

In an article for Harvard University, Alex Najibi wrote that “a growing body of research exposes divergent error rates across demographic groups, with the poorest accuracy consistently found in subjects who are female, Black, and 18-30 years old.”

The 2018 “Gender Shades” study found that three algorithms performed worst on darker-skinned females, with error rates up to 34% higher than for lighter-skinned males.

What have the courts and civil liberties groups said about facial recognition?

In August 2020, the Court of Appeal ruled that a UK police force’s use of facial recognition technology breached privacy and data protection laws: South Wales Police’s deployment was found unlawful after lawyers argued that it interfered with privacy rights and was potentially discriminatory.

In June 2021, the Information Commissioner expressed concern about the potential misuse of live facial recognition in public places, and warned that “supercharged CCTV” could be used “inappropriately, excessively or even recklessly”.

In August 2021, a group of civil society bodies urged the Government to ban facial recognition cameras. They also accused the police and the Home Office of bypassing Parliament over guidance for the use of the technology.

“In a democratic society, it is imperative that intrusive technologies are subject to effective scrutiny,” the letter said.

In October 2021, the European Parliament called for a ban on police use of facial recognition technology in public places, as well as a ban on private facial recognition databases.

Campaigners also worry that police may employ face-scanning technology on protesters.

In October 2023, UK police forces and private firms were urged to drop the technology because of its impact on human rights.

The campaign was spearheaded by the privacy group Big Brother Watch and backed by 31 organisations, including Liberty, Amnesty International and the Race Equality Foundation.