Lee’s estate sued Tesla, alleging that the company knew its assisted-driving technology and enhanced safety features were defective when it sold the car. The plaintiff’s case also rests heavily on the claim that Tesla markets its Autopilot features in a way that misleads drivers into believing the technology is more autonomous than it actually is.
“They sold the hype and people bought it,” Jonathan Michaels, the attorney representing Lee’s estate and fiancée Lindsay Molander, said in his opening arguments. Tesla “made a decision to put their company over the safety of others.”
Thursday’s opening arguments offered a glimpse into Tesla’s strategy for defending its Autopilot features, which have been linked to more than 700 crashes since 2019 and at least 17 fatalities, according to a Washington Post analysis of National Highway Traffic Safety Administration data. The crux of the company’s defense is that the driver is ultimately in control of the vehicle and must keep their hands on the wheel and eyes on the road while using the feature.
Michael Carey, the attorney for Tesla, argued the technology was not at fault and that the crash — in which Lee’s car made a sharp right turn across two lanes of traffic — “is classic human error,” adding that Autopilot is “basically just fancy cruise control.” He also said it is “inconclusive” whether Autopilot was actually engaged, because the data box in the car that would have recorded that information was damaged in the fiery crash.
“This case is not about Autopilot. Autopilot didn’t cause the crash,” Carey said in his opening statements Thursday. “This is a bad crash with bad injuries and may have resulted from bad mistakes — but you can’t blame the car company when that happens. This is a good car with a good design.”
The cluster of trials set for the next year is also likely to demonstrate how much the technology actually relies on human intervention — despite CEO Elon Musk’s claims that cars operating in Autopilot are safer than those controlled by humans. The outcomes could amount to a pivotal moment for Tesla, which has for years tried to absolve itself from responsibility when one of its cars on Autopilot is involved in a crash.
“For Tesla to continue to get its technology on the road, it is going to have to be successful in these cases,” said Ed Walters, who teaches autonomous vehicle law at Georgetown University. “If it faces a lot of liability from accidents … it is going to be very hard for Tesla to continue getting this tech out.”
The company is facing several other lawsuits around the country involving its Autopilot technology. Some take issue with Tesla’s marketing of its autonomous features, arguing that it lulls drivers into a false sense of security.
Many of the cases heading to trial in the next year involve crashes that occurred several years ago, a reflection of both the increased use of driver-assistance features and the lengthy legal process of bringing such a case through the court system. In the years since, Tesla has continued to roll out its technology — some of it still in a test phase — to hundreds of thousands more vehicles on the nation’s roadways.
Autopilot, which Tesla introduced in 2014, is a suite of features that enables the car to maintain speed and distance behind other vehicles and follow lane lines, among other tasks. Tesla says drivers must monitor the road and intervene when necessary. Autosteer — a specific Autopilot feature designed to keep the car centered in the lane — is in beta test mode, and Carey said drivers are warned of the potential limitations before they enable the feature.
“We’re telling drivers that because we want you to be extra vigilant. Not because there’s something wrong with it, but because when people are driving in Autopilot, we don’t want them to think the thing is fully self driving,” Carey said. “It is an advisory to everyone who is using Autosteer that you gotta be careful with this stuff.”
While Teslas still require a human to be paying attention behind the wheel, the increasingly capable driver-assistance systems — and growing prevalence of features rooted in automation on the nation’s roads — have prompted legislators and safety advocates to push for more regulation. Musk has repeatedly touted the safety and sophistication of his technology over human drivers, citing crash rates when the modes of driving are compared.
In several of the cases headed to trial within the next year, cars allegedly on Autopilot didn’t act as expected — suddenly accelerating, for instance, or failing to react when another vehicle was in front of them. In one case, set to go before a jury in the coming months, a 50-year-old man driving on Autopilot was killed when his Tesla plowed under a semi-truck.
Another case concerns a Tesla on Autopilot that ran through an intersection while the driver wasn’t paying attention, hit a parked car and killed a person standing outside the vehicle. In another, a Tesla on Autopilot rear-ended a car that had changed lanes in front of it; a 15-year-old was thrown from the front car and killed. The suit alleges that the Tesla did not see or react to the traffic conditions in front of it.
Faced with a sharp increase in Tesla-related crashes involving Autopilot, the National Highway Traffic Safety Administration has opened dozens of investigations into the collisions over the past few years. NHTSA has also issued 16 recalls of the 2019 Tesla Model 3 and opened seven investigations into aspects of the technology — like sudden unintended acceleration and crashes with emergency vehicles.
The 2019 crash involving Lee is not being investigated by NHTSA, and a spokesperson for the agency declined to explain why. The agency has also said that a report of a crash involving driver assistance does not itself imply that the technology was the cause.
“NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully engaged in the driving task at all times,” NHTSA spokesperson Veronica Morales previously told The Post. “Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.”
Regardless of the outcome of the trials, said David Zipper, a visiting fellow at the Harvard Kennedy School’s Taubman Center for State and Local Government, they will at least highlight the United States’ need for more regulation of the emerging technology.
“It’s possible the drivers (of Teslas) understand the risks,” Zipper said. “But even if they accept that, what about everyone on a public road or street who is not in a Tesla? None of us signed on to be a guinea pig.”
On Thursday, the attorney for Lee’s fiancée painted a picture of an idyllic evening that ended in sudden tragedy. On the day of the crash, the couple went to Downtown Disney, where they walked around and ate dinner, Michaels said. At dinner, Molander posted a selfie of the two with the caption “Life is short. Don’t forget to be happy.”
After dinner — where both Lee and Molander consumed alcohol — the pair picked up Molander’s son and headed back home. In a deposition played for the jury, Molander said all she remembers from the moments leading up to the crash is wondering, “Why are we jerking all of a sudden?”
Before Lee’s car collided with the palm tree, court documents say, he attempted to regain control of the car, but the “Autopilot and/or Active Safety features would not allow” it. That failure, according to the complaint, led to Lee’s “gruesome and ultimately fatal injuries.”
“Had the vehicle’s Autopilot and/or Active Safety features operated properly, decedent Micah Lee’s death would have been avoided,” according to the court documents. A toxicology report taken after the crash showed Lee’s blood alcohol content was 0.051 percent — within the legal limit in California.
Along with alleging the software was defective, the complaint outlines several allegations related to the physical design of the car. In response, Tesla said the car was not in “a defective condition at any time when it left the possession, custody or control of Tesla.”
Lee was on life support for eight days after the crash before his family took him off it. The injuries sustained by Molander and her son were catastrophic: Molander broke her back, wrist and jaw, and suffered a traumatic brain injury. Her son, who was 8 years old at the time, was disemboweled.
At the trial Thursday, the paramedic who responded to the scene testified about the horror he encountered that night: a car on fire, and a young boy screaming in pain.
“It is locked in my memory,” he said.