Florida Judge Rules Tesla Knew About Autopilot Defect That Led To Deadly 2019 Crash

On a dark 2019 morning in Delray Beach, Fla., Jeremy Banner was using the driver-assist system marketed as “Autopilot” in his red Tesla Model 3. He took his hands off the wheel and trusted the system to drive for him, which is exactly what it had been pitched to do. The system’s sensors failed to detect a tractor trailer crossing both lanes in front of him, and the car ran at full speed under the side of the trailer. The roof was torn from the car, Banner was killed instantly, and the car kept driving for nearly a minute before coming to a stop at the curb. Last week, a judge ruled that Banner’s wife’s negligence lawsuit against Tesla can proceed to trial.

Bryant Walker Smith, a University of South Carolina law professor, told Reuters that the judge’s summary is significant because “it suggests alarming inconsistencies between what Tesla knew internally, and what it was saying in its marketing.”

“This opinion opens the door for a public trial in which the judge seems inclined to admit a lot of testimony and other evidence that could be pretty awkward for Tesla and its CEO,” Smith continued. “And now the result of that trial could be a verdict with punitive damages.”

The judge cited Tesla’s 2016 video unveiling its so-called Autopilot and Full Self-Driving driver-assistance technology as part of his reasoning. “Absent from this video is any indication that the video is aspirational or that this technology doesn’t currently exist in the market,” he wrote.

“It would be reasonable to conclude that the Defendant Tesla through its CEO and engineers was acutely aware of the problem with the ‘Autopilot’ failing to detect cross traffic,” according to the judge’s ruling.

The plaintiff should be able to argue to a jury that Tesla did not provide sufficient warning that Autopilot and Full Self-Driving require the driver’s attention to take over in an emergency. Even today, after dozens of related deaths, I still hear from Tesla drivers who trust FSD to drive them home when they are impaired (whether from fatigue or alcohol) or who simply engage in other activities behind the wheel.

According to TechCrunch, the judge drew on a variety of evidence to reach his ruling.

The judge compared Banner’s crash to a similar 2016 fatal crash involving Joshua Brown in which Autopilot failed to detect crossing trucks, which led to the vehicle crashing into the side of a tractor trailer at high speed. The judge also based his finding on testimony given by Autopilot engineer Adam Gustafsson and Dr. Mary “Missy” Cummings, director of the Autonomy and Robotics Center at George Mason University.

Gustafsson, who was the investigator on both Banner’s and Brown’s crashes, testified that Autopilot in both cases failed to detect the semitrailer and stop the vehicle. The engineer further testified that despite Tesla being aware of the problem, no changes were made to the cross-traffic detection warning system from the date of Brown’s crash until Banner’s crash to account for cross traffic.

The judge wrote in his ruling that the testimony of other Tesla engineers leads to the reasonable conclusion that Musk, who was “intimately involved” in the development of Autopilot, was “acutely aware” of the problem and failed to remedy it.

The case — No. 50-2019-CA-009962 — will go to trial in the Circuit Court for Palm Beach County, Florida.