
Depositions illuminate Tesla Autopilot programming flaws


In Tesla’s marketing materials, the company’s Autopilot driver-assistance system is cast as a technological marvel that uses “advanced cameras, sensors and computing power” to steer, accelerate and brake automatically — even change lanes so “you don’t get stuck behind slow cars or trucks.”

Under oath, however, Tesla engineer Akshay Phatak last year described the software as fairly basic in at least one respect: the way it steers on its own.

“If there are clearly marked lane lines, the system will follow the lane lines,” Phatak said under questioning in July 2023. Tesla’s groundbreaking system, he said, was simply “designed” to follow painted lane lines.

Phatak’s testimony, which was obtained by The Washington Post, came in a deposition for a wrongful-death lawsuit set for trial Tuesday. The case involves a fatal crash in March 2018, when a Tesla in Autopilot careened into a highway barrier near Mountain View, Calif., after getting confused by what the company’s lawyers described in court documents as a “faded and nearly obliterated” lane line.

The driver, Walter Huang, 38, was killed. An investigation by the National Transportation Safety Board later cited Tesla’s failure to limit the use of Autopilot in such conditions as a contributing factor: the company has acknowledged to the NTSB that Autopilot is designed for areas with “clear lane markings.”

Phatak’s testimony marks the first time Tesla has publicly explained these design decisions, peeling back the curtain on a system shrouded in secrecy by the company and its controversial CEO, Elon Musk. Musk, Phatak and Tesla did not respond to requests for comment.

Following lane lines is not unique to Tesla: Many modern cars use technology to alert drivers when they’re drifting. But by marketing the technology as “Autopilot,” Tesla may be misleading drivers about the cars’ capabilities — a central allegation in numerous lawsuits headed for trial this year and a key concern of federal safety officials.

For years, Tesla and federal regulators have been aware of problems with Autopilot following lane lines, including cars being guided in the wrong direction of travel and placed in the path of cross-traffic — with sometimes fatal results. Unlike vehicles designed to be fully autonomous, such as cars from Waymo or Cruise, Teslas do not currently use sensors such as radar or lidar to detect obstacles; they rely on cameras alone.

After the crash that killed Huang, Tesla told officials that it updated its software to better recognize “poor and faded” lane markings and to audibly alert drivers when vehicles might lose track of a fading lane. The updates stopped short of forcing the feature to disengage on its own in those situations, however. About two years after Huang died, federal investigators said they could not determine whether those updates would have been sufficient to “accurately and consistently detect unusual or worn lane markings” and therefore prevent Huang’s crash.
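
To make the distinction investigators drew concrete, here is a minimal sketch, in Python, of warn-but-stay-engaged logic of the kind described. The threshold, names and structure are assumptions for illustration, not Tesla’s code.

```python
# Illustrative sketch only: warn on degraded lane markings without
# self-disengaging. Threshold and names are assumptions, not Tesla's code.

WARN_THRESHOLD = 0.5  # below this, markings count as "poor and faded"

def handle_lane_confidence(confidence: float, chime) -> str:
    """Process one control cycle given a 0.0-1.0 lane-detection confidence."""
    if confidence < WARN_THRESHOLD:
        chime()                   # audible alert: the driver must take over
        return "engaged_warning"  # note: still engaged; no forced handoff
    return "engaged"

# The design investigators questioned is the missing third branch: one
# that disengages (after a handoff period) when confidence stays low.
```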

Huang, an engineer at Apple, bought his Tesla Model X in fall 2017 and drove it regularly to work along U.S. Highway 101, a crowded multilane freeway that connects San Francisco to the tech hubs of Silicon Valley. On the day of the crash, his car began to drift as a lane line faded. It then picked up a clearer line to the left — putting the car between lanes and on a direct trajectory for a safety barrier separating the highway from an exit onto State Route 85.

Huang’s car hit the barrier at 71 mph, pulverizing its front end and twisting it into an unrecognizable heap. Huang was pronounced dead hours later, according to court documents.

In the months preceding the crash, Huang’s vehicle swerved in a similar location eleven times, according to internal Tesla data discussed by Huang’s lawyers during a court hearing last month. According to the data, the car corrected itself seven times. Four other times, it required Huang’s intervention. Huang was allegedly playing a game on his phone when the crash occurred.

The NTSB concluded that driver distraction and Autopilot’s “system limitations” likely led to Huang’s death. In its report, released about two years after the crash, investigators said Tesla’s “ineffective monitoring” of driver engagement also “facilitated the driver’s complacency and inattentiveness.”

Investigators also said that the California Highway Patrol’s failure to report the damaged crash barrier — which was ruined in a previous collision — contributed to the severity of Huang’s injuries.

Huang’s family sued Tesla, alleging wrongful death, and sued the state of California over the damaged crash barrier. The Post obtained copies of several depositions in the case, including testimony that has not previously been reported. Reuters also recently reported on some depositions from the case.

The documents shed light on one of the biggest frustrations federal regulators and safety officials have with Tesla: why Autopilot at times engages on streets where Tesla’s manual says it is not designed to be used. Such areas include streets with cross traffic, urban streets with frequent stoplights and stop signs, and roads without clear lane markings.

In his deposition, Phatak said Autopilot will work wherever the car’s cameras detect lines on the road: “As long as there are painted lane lines, the system will follow them,” he said.
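
Phatak’s description matches a textbook lane-centering loop: estimate the lane center from the detected line positions and steer to close the gap. The sketch below, with invented gains and names, shows that geometry and why a mislabeled line is enough to misdirect the car; it is not Tesla’s implementation.

```python
# Minimal lane-centering sketch (gains and names invented for illustration).
# Inputs are the lateral offsets of the detected lane lines, in meters,
# relative to the vehicle's centerline (negative = left, positive = right).
from typing import Optional

STEER_GAIN = 0.4   # proportional gain, arbitrary for the example
HALF_LANE_M = 1.8  # half of a typical ~3.6 m US freeway lane

def steering_command(left_m: Optional[float],
                     right_m: Optional[float]) -> Optional[float]:
    """Return a steering angle (positive = right), or None if no lines seen."""
    if left_m is not None and right_m is not None:
        lane_center = (left_m + right_m) / 2
    elif left_m is not None:
        lane_center = left_m + HALF_LANE_M   # infer the center from one line
    elif right_m is not None:
        lane_center = right_m - HALF_LANE_M
    else:
        return None                          # no painted lines: nothing to follow
    return STEER_GAIN * lane_center          # steer the lateral offset to zero

# The failure mode in Huang's crash falls out of this structure: if the
# detector substitutes a clearer line nearby for a faded one, lane_center
# shifts with it, and the controller faithfully tracks the wrong path.
```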

Asked about another crash involving the software, Phatak disputed the NTSB’s contention that Autopilot should not have functioned on the Florida road where driver Jeremy Banner was killed in 2019, when his Tesla barreled into a semi-truck and slid under its trailer. “If I’m not mistaken, that road had painted lane lines,” Phatak said. Banner’s family has filed a wrongful-death lawsuit, which has not yet gone to trial.

Musk has said cars operating in Autopilot are safer than those controlled by humans, a message that several plaintiffs — and some experts — have said creates a false sense of complacency among Tesla drivers. The company has argued that it is not responsible for crashes because it makes clear to Tesla drivers in user manuals and on dashboard screens that they are solely responsible for maintaining control of their car at all times. So far, that argument has prevailed in court, most recently when a California jury found Tesla not liable for a fatal crash that occurred when Autopilot was allegedly engaged.

Autopilot is included in nearly every Tesla. It will steer on streets, follow a set course on freeways and maintain a set speed and distance without human input. It will even change lanes to pass cars and maneuver aggressively in traffic depending on the driving mode selected. It does not stop at stop signs or traffic signals. For an additional $12,000, drivers can purchase a package called Full Self-Driving that can react to traffic signals and gives the vehicles the capability to follow turn-by-turn directions on surface streets.

Since 2017, officials with NTSB have urged Tesla to limit Autopilot use to highways without cross traffic, the areas for which the company’s user manuals specify Autopilot is intended. Asked by an attorney for Huang’s family if Tesla “has decided it’s not going to do anything” on that recommendation, Phatak argued that Tesla was already following the NTSB’s guidance by limiting Autopilot use to roads that have lane lines.

“In my opinion we already are doing that,” Phatak said. “We are already restricting usage of Autopilot.”

A Washington Post investigation last year detailed at least eight fatal or serious Tesla crashes that occurred with Autopilot activated on roads with cross traffic.

Last month, the Government Accountability Office called on the National Highway Traffic Safety Administration, the top auto safety regulator, to provide additional information on driver-assistance systems “to clarify the scope of intended use and the driver’s responsibility to monitor the system and the driving environment while such a system is engaged.”

Phatak’s testimony also shed light on other driver-assist design choices, such as Tesla’s decision to monitor driver attention through sensors that gauge pressure on the steering wheel. Asked repeatedly by the Huang family’s lawyer what tests or studies Tesla performed to ensure the effectiveness of this method, Phatak said it simply tested it with employees.
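
Torque-based monitoring of that kind reduces to a timer: if the measured torque on the steering wheel stays below a threshold for too long, escalate alerts. A hypothetical sketch, with every threshold invented for illustration:

```python
# Hypothetical torque-based attention monitor (all values invented).
# Escalates alerts the longer the wheel goes untouched.
import time

TORQUE_THRESHOLD_NM = 0.3  # below this, hands are assumed off the wheel
VISUAL_ALERT_S = 15.0      # seconds hands-off before a visual nag
AUDIBLE_ALERT_S = 30.0     # seconds hands-off before an audible alarm

class HandsOnMonitor:
    def __init__(self) -> None:
        self.last_torque_time = time.monotonic()

    def update(self, wheel_torque_nm: float) -> str:
        """Feed one torque sample; return the current alert level."""
        now = time.monotonic()
        if abs(wheel_torque_nm) >= TORQUE_THRESHOLD_NM:
            self.last_torque_time = now  # any firm grip resets the timer
        hands_off = now - self.last_torque_time
        if hands_off >= AUDIBLE_ALERT_S:
            return "audible_alert"
        if hands_off >= VISUAL_ALERT_S:
            return "visual_alert"
        return "ok"
```

The weakness the NTSB described as “ineffective monitoring” is visible in the sketch: torque measures grip, not attention, so a hand resting on the wheel while a driver plays a phone game reads as “ok.”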

Other Tesla design decisions have differed from competitors pursuing autonomous vehicles. For one thing, Tesla sells its systems to consumers, while other companies tend to deploy their own fleets as taxis. It also employs a unique, camera-based system and places fewer limits on where the software can be engaged. For example, a spokesperson for Waymo, the Alphabet-owned self-driving car company, said its vehicles operate only in areas that have been rigorously mapped and where the cars have been tested in conditions including fog and rain, a process known as “geo-fencing.”

“We’ve designed our system knowing that lanes and their markings can change, be temporarily occluded, move, and sometimes, disappear completely,” Waymo spokeswoman Katherine Barna said.

California regulators also restrict where these driverless cars can operate, and how fast they can go.
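
Geo-fencing, at its simplest, is a gate in front of engagement: check the vehicle’s position against polygons describing mapped, validated territory and refuse to activate outside them. Below is a minimal sketch using a standard ray-casting point-in-polygon test; the service area is an invented box, not any company’s real map.

```python
# Minimal geofence gate (illustrative). The polygon is invented; real
# deployments use rigorously mapped and tested service areas.

def point_in_polygon(lat: float, lon: float,
                     polygon: list[tuple[float, float]]) -> bool:
    """Ray-casting test: is (lat, lon) inside the polygon of vertices?"""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        lat_i, lon_i = polygon[i]
        lat_j, lon_j = polygon[j]
        if (lon_i > lon) != (lon_j > lon):
            crossing = (lat_j - lat_i) * (lon - lon_i) / (lon_j - lon_i) + lat_i
            if lat < crossing:
                inside = not inside  # toggle on each edge crossing
        j = i
    return inside

SERVICE_AREA = [(37.30, -122.10), (37.30, -121.95),
                (37.45, -121.95), (37.45, -122.10)]  # hypothetical box

def may_engage(lat: float, lon: float) -> bool:
    """Allow activation only inside the mapped service area."""
    return point_in_polygon(lat, lon, SERVICE_AREA)
```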

When asked whether Autopilot would use GPS or other mapping systems to ensure a road was suitable for the technology, Phatak said it would not. “It’s not map based,” he said — an answer that diverged from Musk’s statement in a 2016 conference call with reporters that Tesla could turn to GPS as a backup “when the road markings may disappear.” In an audio recording of the call cited by Huang family attorneys, Musk said the cars could rely on satellite navigation “for a few seconds” while searching for lane lines.
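
Musk’s 2016 description implies a short, timed fallback: when the lines vanish, bridge the gap with satellite navigation while the cameras search, then demand a handoff. A sketch of that state machine, with the “few seconds” budget assumed:

```python
# Hypothetical fallback state machine for lost lane lines (timing assumed).
# Models Musk's 2016 description: lean on GPS "for a few seconds" while
# the cameras search for lane lines, then hand control back to the driver.

FALLBACK_LIMIT_S = 3.0  # assumed "few seconds" budget on GPS alone

class LaneLossFallback:
    def __init__(self) -> None:
        self.seconds_without_lines = 0.0

    def step(self, lines_visible: bool, dt: float) -> str:
        """Advance one control cycle of dt seconds; return the guidance source."""
        if lines_visible:
            self.seconds_without_lines = 0.0
            return "camera_lane_following"
        self.seconds_without_lines += dt
        if self.seconds_without_lines <= FALLBACK_LIMIT_S:
            return "gps_path_following"    # temporary bridge, not a substitute
        return "request_driver_takeover"   # budget exhausted: hand off
```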

Tesla’s heavy reliance on lane lines reflects the broader lack of redundancy within its systems when compared with rivals. The Post has previously reported that Tesla’s decision to omit radar from newer models, at Musk’s behest, led to an uptick in crashes.

Rachel Lerman contributed to this report.
