Tesla Autopilot recall prompts criticism from federal officials

Tesla this week agreed to issue a remote update to 2 million cars aimed at improving driver attention while Autopilot is engaged, especially on surface roads with cross traffic and other hazards the driver-assistance technology is not designed to detect.

But the recall — the largest in Tesla’s 20-year history — quickly drew condemnation from experts and lawmakers, who said new warnings and alerts are unlikely to solve Autopilot’s fundamental flaw: that Tesla fails to limit where drivers can turn it on in the first place.

“What a missed opportunity,” said Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies. “I have yet to see Tesla, or anyone defending Tesla, come up with an argument for why we should be letting people use [Autopilot] on roads that could have cross traffic. That’s how a lot of these crashes are happening.”

“It’s far from sufficient,” added Sen. Richard Blumenthal (D-Conn.), a frequent Tesla critic.

The recall comes more than two years after the National Highway Traffic Safety Administration (NHTSA) first launched an investigation into Autopilot after a string of Teslas plowed into parked emergency vehicles. Since then, the agency said it had reviewed more than 900 crashes involving Autopilot. It found that Autopilot’s key Autosteer feature “may not” have sufficient controls to “prevent driver misuse,” including using the feature outside the controlled-access highways for which it was designed.

Tesla did not concur with the agency’s findings, the notice said, though it began sending remote software updates on Tuesday, according to NHTSA.

Blumenthal said regulators should have required more significant changes to the software, given its history of crashes. Days before the recall, The Washington Post published an investigation that identified eight fatal or serious crashes on roads for which Autopilot was not intended. Tesla has repeatedly acknowledged in user manuals, legal documents and communications with federal regulators that Autosteer is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.”

“Relying on self-enforcement is really problematic given the company’s statements about how seriously they take the whole recall system, the comments by Elon Musk … They regard recalls as more of entertainment than enforcement,” Blumenthal said. “When a car is going to hit an obstacle or another car or go off the road or hit a barrier, there ought to be more than just voluntary compliance.”

Officials and lawmakers expressed concern that NHTSA may have been reluctant to come down harder on the automaker, which has a cultlike following among consumers and enormous influence over the country’s transition to electric vehicles — a priority for the Biden administration. However, NHTSA said its investigation into Autopilot remains open, and some Tesla critics held out hope that the recall may not be NHTSA’s final action.

In a statement, NHTSA spokeswoman Veronica Morales said, “It is now Tesla’s responsibility under the law to provide a remedy, free of charge to consumers, that fully addresses the safety defect.”

Tesla did not respond to a request for comment Friday. In a statement this week responding to The Post’s report on Autopilot crashes, Tesla said it has a “moral obligation” to continue improving its safety systems and also said that it is “morally indefensible” to not make these features available to a wider set of consumers.

In its investigation, The Post found that Autopilot can be engaged on a wide range of roads with intersections, stoplights and cross traffic. One fatal crash occurred when a Tesla in Autopilot plowed through a T intersection and hit a couple looking at the stars; another occurred when a Tesla in Autopilot failed to recognize a semi-truck crossing the road.

As part of the recall, Tesla agreed to issue a software update containing new “controls and alerts,” such as “additional checks” when drivers activate the feature outside controlled-access highways. The update will also suspend a driver’s ability to use Autosteer if they repeatedly fail to stay engaged, with eyes on the road and hands on the wheel, while using it.
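
In software terms, the remedy the notice describes is a strike counter rather than a hard gate: repeated inattention eventually locks the feature out. A minimal Python sketch of that logic follows; the class name, threshold and check cadence are assumptions for illustration, since the recall notice specifies none of them and Tesla’s actual code is not public:

```python
# Hypothetical sketch of the strike-based lockout the recall describes.
# All names and thresholds here are assumptions, not Tesla's parameters.

class AutosteerMonitor:
    MAX_STRIKES = 5  # assumed limit; the recall notice gives no number

    def __init__(self) -> None:
        self.strikes = 0
        self.suspended = False

    def on_attention_check(self, eyes_on_road: bool, hands_on_wheel: bool) -> None:
        """Record one periodic driver-attention check."""
        if eyes_on_road and hands_on_wheel:
            return
        self.strikes += 1
        if self.strikes >= self.MAX_STRIKES:
            # Repeated inattention suspends access to Autosteer.
            self.suspended = True

    def can_engage(self) -> bool:
        return not self.suspended
```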

Nowhere in the recall language, however, does the company say it will restrict the technology to its so-called Operational Design Domain (ODD), the industry term for the specific locations and set of circumstances for which Autopilot is designed. That means consumers will still be able to engage the feature outside the ODD and will simply experience more alerts and precautions when they do.
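
The distinction the recall preserves can be stated precisely: outside the ODD, the software warns but does not refuse. A hypothetical sketch of that difference, with road attributes and function names assumed for clarity rather than drawn from Tesla’s software:

```python
# Hypothetical contrast between alert-only behavior and ODD enforcement.
# Road attributes and function names are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Road:
    controlled_access: bool   # divided highway, no cross traffic
    clear_lane_markings: bool

def in_odd(road: Road) -> bool:
    """True if the road matches Autosteer's stated design domain."""
    return road.controlled_access and road.clear_lane_markings

def request_autosteer(road: Road) -> bool:
    # Post-recall behavior per the notice: warn and add checks outside
    # the ODD, but still allow engagement.
    if not in_odd(road):
        print("Warning: outside design domain; additional checks apply.")
    return True
    # Strict enforcement, which the recall does not require, would
    # instead be: return in_odd(road)
```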

In a statement to The Post last week, NHTSA said it would be too complex and resource-intensive to verify that systems such as Tesla Autopilot are used within the ODD. It also expressed doubt that doing so would fix the problem.

Tesla critic Dan O’Dowd, who has pushed for the company’s software to be banned through his advocacy group the Dawn Project, said the recall fell short.

“The correct solution is to ban Tesla’s defective software, not to force people to watch it more closely,” he said in a statement. “NHTSA’s recall misses the point that Tesla must address and fix the underlying safety defects in its self-driving software to prevent further deaths.”

Jennifer Homendy, the chair of the National Transportation Safety Board (NTSB), an investigative body that has been critical of the approach taken by regulators at NHTSA, said she was pleased to see the agency take action, though it comes seven years after the first known Autopilot fatality.

“I’m happy they’re taking action, but in the meantime people have died,” Homendy said. “They need to verify that the change being made is being made. And then with a voluntary recall, how do you verify?”

NHTSA’s Morales said that the agency will test several Teslas at a vehicle center in Ohio to “evaluate the adequacy of remedies.”

Tesla and Musk have contended that using the term “recall” is inappropriate for fixes issued through a software update — “anachronistic,” in Musk’s view. But past recalls have been effective in mandating updates that otherwise would not have taken place.

After the recall was announced Wednesday, Tesla’s stock briefly dipped around 3 percent. But the company ended the week more than 4 percent higher than it had started, as investors realized the recall would not dramatically impact Tesla’s business.

Gene Munster, a managing partner at Deepwater Asset Management, said he doesn’t expect this recall to deter Tesla from aggressively charging ahead on Musk’s vision of a fully autonomous future.

“People will still use [Autopilot],” he said. “I don’t think that NHTSA has made the roads measurably safer by having these notifications, and I don’t think that Tesla is going to slow its pursuit … because of this.”

Rep. Anna G. Eshoo (D-Calif.), whose district includes Palo Alto, Calif., where Tesla has its engineering headquarters, said the recall was “jaw-dropping.” Even if the recall mostly adds extra notifications, she said, it serves a purpose in alerting drivers that Autopilot is not as autonomous as the name suggests.

“It’s up to [Tesla] to take this very seriously,” she said.

Homendy said her agency has consistently found problems with Tesla’s approach to driver assistance in its investigations of fatal Autopilot crashes in Williston, Fla.; Mountain View, Calif.; and Delray Beach, Fla. As early as 2017, the NTSB recommended action to stop drivers from engaging Autopilot outside the conditions for which it was designed.

Homendy was skeptical the problem could be addressed voluntarily through warning chimes or precautionary checks. Other automakers, including Ford, General Motors and Subaru, offer driver-assistance software in their vehicles, but it is Tesla’s Autopilot crashes that have drawn repeated scrutiny from federal agencies.

“If you look at all the [advanced driver-assistance] technology out there, the NTSB is not investigating all that other technology,” she said. “We’ve seen it, we have not found that there is a problem.

“We’ve consistently found that there is a problem with Tesla.”


