Federal regulator finds Tesla Autopilot has ‘critical safety gap’ linked to hundreds of collisions

A Tesla Model X burns after crashing on U.S. Highway 101 in Mountain View, California, U.S., on March 23, 2018.

S. Engleman | Via Reuters

Federal authorities say a “critical safety gap” in Tesla’s Autopilot system contributed to at least 467 collisions, 13 resulting in fatalities and “many others” resulting in serious injuries.

The findings come from a National Highway Traffic Safety Administration analysis of 956 crashes in which Tesla Autopilot was thought to have been in use. The results of the nearly three-year investigation were published Friday.

Tesla’s Autopilot design has “led to foreseeable misuse and avoidable crashes,” the NHTSA report said. The system did not “sufficiently ensure driver attention and appropriate use.”

NHTSA’s filing pointed to a “weak driver engagement system,” and an Autopilot that stays switched on even when a driver isn’t paying adequate attention to the road or the driving task. The driver engagement system includes various prompts, including “nags” or chimes, that tell drivers to pay attention and keep their hands on the wheel, as well as in-cabin cameras that can detect when a driver is not looking at the road.

The agency also said it was opening a new probe into the effectiveness of a software update Tesla previously issued as part of a recall in December. That update was meant to fix Autopilot defects that NHTSA identified as part of this same investigation.

The voluntary recall via an over-the-air software update covered 2 million Tesla vehicles in the U.S., and was supposed to specifically improve driver monitoring systems in Teslas equipped with Autopilot.

NHTSA suggested in its report Friday that the software update was probably inadequate, since more crashes linked to Autopilot continue to be reported.

In one recent example, a Tesla driver in Snohomish County, Washington, struck and killed a motorcyclist on April 19, according to records obtained by CNBC and NBC News. The driver told police he was using Autopilot at the time of the collision.

The NHTSA findings are the most recent in a series of regulator and watchdog reports that have questioned the safety of Tesla’s Autopilot technology, which the company has promoted as a key differentiator from other car companies.


On its website, Tesla says Autopilot is designed to reduce driver “workload” through advanced cruise control and automatic steering technology.

Tesla has not issued a response to Friday’s NHTSA report and did not respond to a request for comment sent to Tesla’s press inbox, investor relations team and the company’s vice president of vehicle engineering, Lars Moravy.

Following the release of the NHTSA report, Sens. Edward J. Markey, D-Mass., and Richard Blumenthal, D-Conn., issued a statement calling on federal regulators to require Tesla to restrict its Autopilot feature “to the roads it was designed for.”


On its Owner’s Manual website, Tesla warns drivers not to operate the Autosteer function of Autopilot “in areas where bicyclists or pedestrians may be present,” among a host of other warnings.

“We urge the agency to take all necessary actions to prevent these vehicles from endangering lives,” the senators said.

Earlier this month, Tesla settled a lawsuit from the family of Walter Huang, an Apple engineer and father of two, who died in a crash when his Tesla Model X, with Autopilot features switched on, hit a highway barrier. Tesla has sought to seal the terms of the settlement from public view.

In the face of these events, Tesla and CEO Elon Musk signaled this week that they are betting the company’s future on autonomous driving.

“If somebody doesn’t believe Tesla’s going to solve autonomy, I think they should not be an investor in the company,” Musk said on Tesla’s earnings call Tuesday. He added, “We will, and we are.”

Musk has for years promised customers and shareholders that Tesla would be able to turn its existing cars into self-driving vehicles with a software update. However, the company offers only driver assistance systems and has not produced self-driving vehicles to date.

He has also made safety claims about Tesla’s driver assistance systems without allowing third-party review of the company’s data.

For example, in 2021, Musk claimed in a post on social media, “Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle.”

Philip Koopman, an automotive safety researcher and Carnegie Mellon University associate professor of computer engineering, said he views Tesla’s marketing and claims as “autonowashing.” He also said in response to NHTSA’s report that he hopes Tesla will take the agency’s concerns seriously moving forward.

“People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety,” Koopman said. “Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can’t routinely become absorbed in their cellphones while Autopilot is in use.”


A version of this story was published on NBCNews.com.
