FSD, short for Full Self-Driving, is an advanced driver assistance system that handles some driving tasks, but Tesla says it does not make vehicles completely autonomous. The features "require a fully attentive driver", according to the company.
If Tesla's cars are deemed autonomous by California, state laws would require the company to disclose all crashes on public roads, including those that occur while the vehicle is under manual control. Those reports are made public, as is data on how often the self-driving systems are disengaged.
California's Department of Motor Vehicles (DMV) informed Tesla of its review last week, the Los Angeles Times reported.
"Recent software updates, videos showing dangerous use of that
technology, open investigations by the National Highway Traffic
Safety Administration, and the opinions of other experts in this
space prompted the reevaluation," the DMV said, according to the
report.
The DMV and Tesla did not immediately respond to Reuters
requests for comment.
In October last year, Tesla vehicles running the then-latest FSD 10.3 software repeatedly issued forward collision warnings when there was no immediate danger, according to videos posted by beta users. Tesla fixed the software within a day.
(Reporting by Jaiveer Shekhawat and Shubham Kalia in Bengaluru;
Editing by Shounak Dasgupta)