U.S. senator slams Tesla's 'misleading' name for
Autopilot driver assistance system
January 25, 2020 | By David Shepardson
WASHINGTON (Reuters) - A U.S. senator on
Friday urged Tesla Inc <TSLA.O> to rebrand its driver assistance system
Autopilot, saying it has "an inherently misleading name" and is subject
to potentially dangerous misuse.
But Tesla said in a letter that it had taken steps to ensure driver
engagement with the system and enhance its safety features.
The electric automaker introduced new warnings for red lights and stop
signs last year "to minimize the potential risk of red light- or stop
sign-running as a result of temporary driver inattention," Tesla said in
the letter.
Senator Edward Markey said he believed the potential dangers of
Autopilot can be overcome. But he called for "rebranding and remarketing
the system to reduce misuse, as well as building backup driver
monitoring tools that will make sure no one falls asleep at the wheel."
Markey's comments came in a press release, with a copy of a Dec. 20 letter from
Tesla attached that addressed some of the Democratic senator's concerns.
Autopilot has been engaged in at least three Tesla vehicles involved in
fatal U.S. crashes since 2016.
Crashes involving Autopilot have raised questions about the
driver-assistance system’s ability to detect hazards, especially
stationary objects.
There are mounting safety concerns globally about systems that can
perform driving tasks for extended stretches of time with little or no
human intervention, but which cannot completely replace human drivers.
Markey cited videos of Tesla drivers who appeared to fall asleep behind
the wheel while using Autopilot, and others in which drivers said they
could defeat safeguards by sticking a banana or water bottle in the
steering wheel to make it appear they were in control of the vehicle.
An advertisement promotes Tesla Autopilot at a showroom of U.S. car
manufacturer Tesla in Zurich, Switzerland March 28, 2018. REUTERS/Arnd
Wiegmann/File Photo
Tesla, in its letter, said its revisions to steering wheel monitoring meant that
in most situations "a limp hand on the wheel from a sleepy driver will not work,
nor will the coarse hand pressure of a person with impaired motor controls, such
as a drunk driver."
It added that devices "marketed to trick Autopilot, may be able to trick the
system for a short time, but generally not for an entire trip before Autopilot
disengages."
Tesla also wrote that while videos like those cited by Markey showed "a few bad
actors who are grossly abusing Autopilot" they represented only "a very small
percentage of our customer base."
Earlier this month, the U.S. National Highway Traffic Safety Administration (NHTSA)
said it was launching an investigation into a 14th crash involving a Tesla in
which it suspects Autopilot or another advanced driver assistance system was in
use.
NHTSA is probing a Dec. 29 fatal crash of a Tesla Model S in Gardena,
California. In that incident, the vehicle exited the 91 Freeway, ran a red light
and struck a 2006 Honda Civic, killing the Civic's two occupants.
The National Transportation Safety Board will hold a Feb. 25 hearing to
determine the probable cause of a 2018 fatal Tesla Autopilot crash in Mountain
View, California.
(Reporting by David Shepardson; Editing by Chizu Nomiyama and Tom Brown)
© 2020 Thomson Reuters. All rights reserved. This material may not be published,
broadcast, rewritten or redistributed.
Thomson Reuters is solely responsible for this content.