US to probe Tesla's 'Full Self-Driving' system after pedestrian killed
in low visibility conditions
October 19, 2024 | By TOM KRISHER
DETROIT (AP) — The U.S. government's road safety agency is investigating
Tesla's “Full Self-Driving” system after getting reports of crashes in
low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration said in documents
that it opened the probe on Thursday after the company reported four
crashes when Teslas encountered sun glare, fog and airborne dust.
In addition to the pedestrian's death, another crash involved an injury,
the agency said.
Investigators will look into the ability of “Full Self-Driving” to
“detect and respond appropriately to reduced roadway visibility
conditions, and if so, the contributing circumstances for these
crashes.”
The investigation covers roughly 2.4 million Teslas from the 2016
through 2024 model years.
A message was left Friday seeking comment from Tesla, which has
repeatedly said the system cannot drive itself and human drivers must be
ready to intervene at all times.
Last week Tesla held an event at a Hollywood studio to unveil a fully
autonomous robotaxi without a steering wheel or pedals. Tesla CEO Elon
Musk, who has promised autonomous vehicles before, said the company
plans to have autonomous Models Y and 3 running without human drivers
next year.
Robotaxis without steering wheels would be available in 2026 starting in
California and Texas, he said.
The investigation's impact on Tesla's self-driving ambitions isn't
clear. NHTSA would have to approve any robotaxi without pedals or a
steering wheel, and it's unlikely that would happen while the
investigation is in progress. But if the company tries to deploy
autonomous vehicles in its existing models, that likely would fall to
state regulations. There are no federal regulations specifically focused
on autonomous vehicles, although they must meet broader safety rules.
NHTSA also said it would look into whether any other similar crashes
involving “Full Self-Driving” have happened in low visibility
conditions, and it will seek information from the company on whether any
updates affected the system’s performance in those conditions.
“In particular, this review will assess the timing, purpose and
capabilities of any such updates, as well as Tesla’s assessment of their
safety impact,” the documents said.
Tesla reported the four crashes to NHTSA under an order from the agency
covering all automakers. An agency database says the pedestrian was
killed in Rimrock, Arizona, in November of 2023 after being hit by a
2021 Tesla Model Y. Rimrock is about 100 miles (161 kilometers) north of
Phoenix.
The Arizona Department of Public Safety said in a statement that the
crash happened just after 5 p.m. Nov. 27 on Interstate 17. Two vehicles
collided on the freeway, blocking the left lane. A Toyota 4Runner
stopped, and two people got out to help with traffic control. A red
Tesla Model Y then hit the 4Runner and one of the people who exited from
it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the
scene.
The collision happened because the
sun was in the Tesla driver's eyes, so the driver was not
charged, said Raul Garcia, public information officer for the
department.
department. Sun glare also was a contributing factor in the first
collision, he added.
Tesla has twice recalled “Full Self-Driving” under pressure from
NHTSA, which in July sought information from law enforcement and the
company after a Tesla using the system struck and killed a
motorcyclist near Seattle.
The recalls were issued because the system was programmed to run
stop signs at slow speeds and because the system disobeyed other
traffic laws. Both problems were to be fixed with online software
updates.
Critics have said that Tesla’s system, which uses only cameras to
spot hazards, doesn’t have proper sensors to be fully self-driving.
Nearly all other companies working on autonomous vehicles use radar
and laser sensors in addition to cameras to see better in the dark
or poor visibility conditions.
Musk has said that humans drive with only eyesight, so cars should
be able to drive with just cameras. He has called lidar (light
detection and ranging), which uses lasers to detect objects, a
“fool's errand.”
The “Full Self-Driving” recalls arrived after a three-year
investigation into Tesla's less-sophisticated Autopilot system
crashing into emergency and other vehicles parked on highways, many
with warning lights flashing.
That investigation was closed last April after the agency pressured
Tesla into recalling its vehicles to bolster a weak system that made
sure drivers were paying attention. A few weeks after the recall,
NHTSA began investigating whether the recall was working.
NHTSA began its Autopilot crash investigation in 2021, after
receiving 11 reports that Teslas that were using Autopilot struck
parked emergency vehicles. In documents explaining why the
investigation was ended, NHTSA said it ultimately found 467 crashes
involving Autopilot resulting in 54 injuries and 14 deaths.
Autopilot is a fancy version of cruise control, while “Full
Self-Driving” has been billed by Musk as capable of driving without
human intervention.
The investigation that was opened Thursday enters new territory for
NHTSA, which previously had viewed Tesla's systems as assisting
drivers rather than driving themselves. With the new probe, the
agency is focusing on the capabilities of “Full Self-Driving” rather
than simply making sure drivers are paying attention.
Michael Brooks, executive director of the nonprofit Center for Auto
Safety, said the previous investigation of Autopilot didn't look at
why the Teslas weren't seeing and stopping for emergency vehicles.
“Before they were kind of putting the onus on the driver rather than
the car,” he said. “Here they're saying these systems are not
capable of appropriately detecting safety hazards whether the
drivers are paying attention or not.”
All contents © copyright 2024 Associated Press. All rights reserved