NEW YORK (Legal Newsline) – On May 7, Ohio resident Joshua Brown was driving
his Tesla in semi-autonomous mode when a tractor-trailer pulled in front of
him in Williston, Florida.
Brown’s vehicle, on Autopilot, crashed into
the truck, killing the 40-year-old. According to Tesla, it was the first known
fatal crash involving a vehicle operating in semi-autonomous mode.
What are the legal ramifications of this incident?
There are a lot.
Douglas Bohn, a commercial litigator with Cullen and Dykman LLP in New York, said there
is potential for a wrongful death lawsuit, but many questions still have to be answered.
“I don’t think it has been filed yet,” Bohn
told Legal Newsline. “Is there potential for one? Yes.”
One reason it has not been filed yet is
that the case is still being investigated in a number of different ways.
“What you need to do in this situation is
figure out what happened,” Bohn said. “How did the accident occur? Was there a
malfunction in the technology?”
Bohn said such accidents are not always the result of a
technological malfunction or a product defect.
“The technology is not always to blame,” Bohn
said. “You also have human error. There are product liability cases that are
tragedies where it came down to human error, not the technological design.”
The Tesla press release said the vehicle was on a divided
highway with Autopilot engaged when a tractor-trailer drove across the highway
perpendicular to the Model S. Neither Autopilot nor the driver noticed the
white side of the tractor-trailer against a brightly lit sky, so the brake was
not applied. The high ride height of the trailer, combined with its
positioning across the road and the extremely rare circumstances of the impact,
caused the Model S to pass under the trailer, with the bottom of the trailer
impacting the windshield of the Model S, the press release says.
Had the Model S impacted the front or
rear of the trailer, even at high speed, its advanced crash safety system would
likely have prevented serious injury as it has in numerous other similar
incidents, Tesla said.
There has been speculation in the legal community that the driver may have been
distracted, possibly watching a movie at the time of the accident, and that the
sensors used on the Model S have a “blind spot” that limits the Autopilot
system’s ability to detect some objects at the height of the tractor-trailer.
A Tesla executive, who
would only speak to the New York Times on the condition of anonymity, said the Autopilot system had performed safely in millions of miles driven. “It’s not
like we are starting to test this using our customers as guinea pigs,” the executive said.
The Tesla executive also said that drivers should be constantly
aware of road conditions and be prepared to take control of the car at any time.
Elon Musk, Tesla’s chief
executive, told the Wall Street Journal that the company planned a blog post
for Tesla owners on how to operate their vehicles safely.
The federal government
is getting involved as well. It is
investigating the crash, and Sen. John Thune
(R-S.D.), who chairs the U.S. Senate Commerce Committee, sent a letter to Musk asking the
Tesla executive to come to Washington to testify. The letter also requested
“information regarding the actions Tesla Motors has taken thus far, as well as
future actions planned in response to this accident,” Thune wrote.
Tesla also received a
letter from the National Highway Traffic Safety Administration indicating
the agency was investigating whether there are defects in the various
crash-prevention systems related to Autopilot.
The letter asked about
those systems, including automatic emergency braking, which is supposed to stop
Tesla models from running into other vehicles detected by radar and a camera.
“We have to figure out
what happened,” Bohn said, “not only from a technological standpoint but a
legal one as well. You can’t warn away a defect.”