U.S. agency upgrades Tesla Autopilot safety probe, step before possible recall

By David Shepardson

WASHINGTON (Reuters) - The National Highway Traffic Safety Administration (NHTSA) on Thursday said it was upgrading its probe into 830,000 Tesla vehicles with the company's advanced driver assistance system Autopilot, a required step before it could seek a recall.

The auto safety agency in August opened a preliminary evaluation to assess the performance of the system in 765,000 vehicles after about a dozen crashes in which Tesla vehicles struck stopped emergency vehicles -- and said Thursday https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF it had identified six additional crashes.

NHTSA is upgrading its probe to an engineering analysis, which it must do before demanding a recall if deemed necessary.

The auto safety regulator is reviewing whether Tesla vehicles adequately ensure drivers are paying attention. The agency added evidence suggested drivers in most crashes under review had complied with Tesla's alert strategy that seeks to compel driver attention, raising questions about its effectiveness.

In 2020, the National Transportation Safety Board https://www.ntsb.gov/investigations/Pages/HWY18FH011.aspx criticized Tesla's "ineffective monitoring of driver engagement" after a 2018 fatal Autopilot crash and said NHTSA had provided "scant oversight."

NHTSA said the upgrade https://static.nhtsa.gov/odi/inv/2022/INOA-EA22002-3184.PDF is "to extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision."

Tesla, which has disbanded its press offices, did not respond to a request for comment.

NHTSA said it has reports of 16 crashes, including seven injury incidents and one death, involving Tesla vehicles in Autopilot that had struck stationary first-responder and road maintenance vehicles.

Democratic Senator Ed Markey praised NHTSA's upgrade. "Every day that Tesla disregards safety rules and misleads the public about its 'Autopilot' system, our roads become more dangerous," he wrote https://twitter.com/SenMarkey/status/1534970584299360278?s=20&t=8-y5q7ONYbproS5FM7sLYA on Twitter.

NHTSA said its analysis indicated that Forward Collision Warnings activated in the majority of incidents just prior to impact and that subsequent Automatic Emergency Braking intervened in approximately half of the crashes.

"On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact," the agency added.

NHTSA noted that "where incident video was available, the approach to the first responder scene would have been visible to the driver an average of 8 seconds leading up to impact."

The agency also reviewed 106 reported Autopilot crashes and said in approximately half, "indications existed that the driver was insufficiently responsive to the needs of the dynamic driving task."

"A driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect," the agency said.

NHTSA also found that in about a quarter of the 106 crashes, the primary crash factor appeared to relate to operating the system where Tesla says limitations may exist, such as on roadways other than limited-access highways, or in low-visibility conditions involving rain, snow, or ice.

Tesla says Autopilot https://static.nhtsa.gov/odi/inv/2022/INOA-PE22002-4385.PDF allows the vehicles to brake and steer automatically within their lanes but does not make them capable of driving themselves.

A NHTSA spokesperson said advanced driving assistance features can promote safety "by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly."

Last week, NHTSA said it asked Tesla https://static.nhtsa.gov/odi/inv/2022/INIM-PE22002-87085.pdf to respond to questions by June 20 after it received 758 reports of unexpected brake activation tied to Autopilot in its separate investigation of 416,000 newer vehicles.

Separately, NHTSA has opened 35 special crash investigations into incidents involving Tesla vehicles in which Autopilot or other advanced systems were suspected of being in use. Those incidents involve 14 reported deaths since 2016, including a crash that killed three people last month in California.

NHTSA asked a dozen other automakers, including General Motors, Toyota Motor Corp and Volkswagen, to answer questions on "driver engagement and attentiveness strategies" for driver assistance systems during its Tesla probe but has not released their responses.

(Reporting by David Shepardson; Editing by Bill Berkrot, Bernadette Baum and Chizu Nomiyama)
