US probes Tesla recall of 2 million vehicles over Autopilot, citing
concerns
[April 26, 2024] By David Shepardson
WASHINGTON (Reuters) -U.S. auto safety regulators said Friday they have
opened an investigation into whether Tesla's recall of more than 2
million vehicles announced in December to install new Autopilot
safeguards is adequate.
The National Highway Traffic Safety Administration (NHTSA) said it was
opening an investigation after the agency identified concerns due to
crash events after vehicles had the recall software update installed
"and results from preliminary NHTSA tests of remedied vehicles."
The agency's new probe comes after it closed its nearly three-year
investigation into Autopilot, saying it found evidence that "Tesla’s
weak driver engagement system was not appropriate for Autopilot’s
permissive operating capabilities" that result in a "critical safety
gap."
NHTSA also cited Tesla's statement "that a portion of the remedy both
requires the owner to opt in and allows a driver to readily reverse it."
The agency said Tesla has issued software updates to address issues that
appear related to its concerns but has not made them "a part of the
recall or otherwise determined to remedy a defect that poses an
unreasonable safety risk."
Tesla said in December its largest-ever recall, covering 2.03 million
U.S. vehicles - or nearly all of its vehicles on U.S. roads - was to
better ensure drivers pay attention when using its advanced driver
assistance system.
The new recall investigation covers Model Y, X, S, 3 and Cybertruck
vehicles in the U.S. equipped with Autopilot produced between the 2012
and 2024 model years, NHTSA said.
Tesla said in December Autopilot's software system controls "may not be
sufficient to prevent driver misuse" and could increase the risk of a
crash.
The auto safety agency disclosed Friday that during the Autopilot safety
probe it first launched in August 2021, it identified at least 13 Tesla
crashes involving one or more deaths, and many more involving serious
injuries, in which "foreseeable driver misuse of the system played an
apparent role."
NHTSA also on Friday raised concerns that Tesla's Autopilot name "may
lead drivers to believe that the automation has greater capabilities
than it does and invite drivers to overly trust the automation."
Tesla did not immediately respond to a request for comment.
In February, Consumer Reports, a nonprofit organization that evaluates
products and services, said its testing of Tesla's Autopilot recall
update found changes did not adequately address many safety concerns
raised by NHTSA and urged the agency to require the automaker to take
"stronger steps," saying Tesla's recall "addresses minor inconveniences
rather than fixing the real problems."
Autopilot is shown on a 2018 Tesla Model 3 electric vehicle in this
photo illustration taken in Solana Beach, California, U.S., June 1,
2018. Picture taken June 1, 2018. REUTERS/Mike Blake/File Photo
Tesla's Autopilot is intended to enable cars to steer, accelerate
and brake automatically within their lane, while enhanced Autopilot
can assist in changing lanes on highways but does not make vehicles
autonomous.
One component of Autopilot is Autosteer, which maintains a set speed
or following distance and works to keep a vehicle in its driving
lane.
Tesla said in December it did not agree with NHTSA's analysis but
would deploy an over-the-air software update that will "incorporate
additional controls and alerts to those already existing on affected
vehicles to further encourage the driver to adhere to their
continuous driving responsibility whenever Autosteer is engaged."
NHTSA's then top official, Ann Carlson, said in December the agency
probe determined that more needed to be done to ensure drivers are
engaged when Autopilot is in use. "One of the things we determined
is that drivers are not always paying attention when that system is
on," Carlson said.
NHTSA opened its August 2021 probe of Autopilot after identifying
more than a dozen crashes in which Tesla vehicles hit stationary
emergency vehicles.
NHTSA said in December it found Autopilot "can provide inadequate
driver engagement and usage controls that can lead to foreseeable
misuse."
Separately, since 2016, NHTSA has opened more than 40 Tesla special
crash investigations in cases where driver systems such as Autopilot
were suspected of being used, with 23 crash deaths reported to date.
Tesla's recall includes increasing the prominence of visual alerts,
disengaging Autosteer if drivers do not respond to inattentiveness
warnings, and additional checks upon engaging Autosteer. Tesla said
it will restrict Autopilot use for one week if significant improper
usage is detected.
Tesla disclosed in October that the U.S. Justice Department had
issued subpoenas related to its Full Self-Driving (FSD) and Autopilot
systems.
Reuters reported in October 2022 that Tesla was under criminal
investigation.
Tesla in February 2023 recalled 362,000 U.S. vehicles to update its
FSD Beta software after NHTSA said the vehicles did not adequately
adhere to traffic safety laws and could cause crashes.
(Reporting by David Shepardson; editing by Jason Neely and Louise
Heavens)
[© 2024 Thomson Reuters. All rights reserved.]
This material may not be published, broadcast, rewritten or redistributed.
Thomson Reuters is solely responsible for this content.