Self-driving car industry confronts trust issues after Uber crash
March 22, 2018
By Alexandria Sage, Tina Bellon and Nick Carey
DETROIT (Reuters) - The fatal accident involving an Uber self-driving car is cranking up pressure on the self-driving vehicle industry to prove its software and sensors are safe in the absence of strong government standards, experts in the field said.
Automakers including General Motors Co <GM.N>, technology companies such
as Alphabet Inc <GOOGL.O> and ride services providers like Uber
Technologies Inc [UBER.UL] have all urged policy makers at the federal
and state level not to put a heavy regulatory hand on an industry still
in development. They have said their extensive testing demonstrates
commitment to safety.
But the Uber accident in Tempe, Arizona - the first death attributed to a self-driving car operating in autonomous mode - has given ammunition to critics of the industry who worry that the lack of clear standards allows manufacturers to test faulty or partially developed technology on public streets.
Well before Sunday's accident, industry executives had begun trying to defuse such concerns by opening up about their testing methods - without giving away the secrets of their system designs.
Public disclosure of self-driving car testing data is inconsistent and
varies by state. California, for example, requires manufacturers to
report instances when an autonomous vehicle system disengages. Arizona
does not.
"There is no question whatsoever that regulations are coming," said Doug
Mehl, a partner at A.T. Kearney's automotive practice, based in Detroit.
"But right now (automakers), software developers and service providers
have an opportunity to shape what those regulations are going to look
like."
Alphabet's Waymo self-driving car unit, for example, has underscored in
a report that its autonomous vehicles have now logged 5 million miles in
real-world testing, and billions more in computer simulations. GM's
Cruise Automation unit has highlighted its decision to teach its driving
system to navigate San Francisco's congested streets.
Still, Amnon Shashua, head of Intel Corp's <INTC.O> Mobileye vision
systems unit, said the industry must do more. He has called for the
self-driving vehicle industry to develop "provable safety assurances".
"We need to prove that these vehicles are much, much safer than humans,"
Shashua told Reuters. "How do you go and guarantee that you have a
technology that the probability of a fatality per one hour of driving is
1,000 times better than a human? Nobody talks about that because nobody
knows what to do."
(Interactive Graphic: Autonomous vehicle performance in California -
http://tmsnrt.rs/2DFUPgA)
A still frame taken from video released March 21, 2018 shows the
exterior view of the self-driving Uber vehicle leading up to the
fatal collision in Tempe, Arizona, U.S. on March 18, 2018. Tempe
Police Department/Handout via REUTERS
NO FEDERAL STANDARDS
Most self-driving vehicles are equipped with radar sensors and with lidar sensors, which use lasers to detect obstacles around the vehicle. There are no federal standards yet specifying how such systems should work, and Congress and federal regulators are still debating how tightly to regulate them.
"There should be vision tests for the sensors they are using, both
static and dynamic to see how well they work," said Missy Cummings, a
Duke University mechanical engineering professor.
The short video recorded by cameras in the Uber vehicle that struck pedestrian Elaine Herzberg as she crossed a street in Tempe, Arizona, late Sunday raises questions about whether the Uber system responded any better than a human driver would have, experts said on Wednesday.
Uber has hired human operators to sit in the driver's seats of its
autonomous vehicles to intervene if necessary. The video released by
Tempe police shows a human operator behind the wheel of the Uber vehicle
before the impact.
The operator is seen looking down, away from the street, in the seconds before the vehicle struck Herzberg, who was pushing a bicycle across the street from the left lane into the right lane, where the Uber vehicle was driving.
"It seems it should have detected her," Daniel Sperling, director of the
Institute for Transportation Studies at University of California Davis
told Reuters in an email after viewing the video. "It seems unlikely
that a human driver would have done better. We do want AVs to do better
than us and the potential exists."
Americans were wary of autonomous vehicle technology even before
Sunday's fatality.
According to a Reuters/Ipsos opinion poll released in late January,
two-thirds of Americans are uncomfortable about the idea of riding in
self-driving cars.
"The greater risk for the industry is that if people feel it is unsafe,
or the testing is unsafe, you'll see a real backlash against this
technology," Matthew Johnson-Roberson, co-director of the University of
Michigan Ford Center for Autonomous Vehicles.
(Reporting by Alexandria Sage, Nick Carey, Tina Bellon and Paul Lienert.
Editing by Joseph White and Kenneth Maxwell)
© 2018 Thomson Reuters. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.