(Reuters) — The National Transportation Safety Board on Tuesday sharply criticized Tesla’s lack of system safeguards in a fatal 2018 Autopilot crash in California and called U.S. regulators’ approach in overseeing the driver assistance systems “misguided.”
NTSB board members questioned Tesla’s design of its semi-automated driving assistance system and condemned the National Highway Traffic Safety Administration (NHTSA) for a “hands-off approach” to regulating the increasingly popular systems.
NHTSA has “taken a nonregulatory approach to automated vehicle safety” and should “complete a further evaluation of the Tesla Autopilot system to ensure the deployed technology does not pose an unreasonable safety risk,” NTSB said.
The board faulted Apple and other smartphone makers for refusing to disable devices when users are driving. It also called on the U.S. Occupational Safety and Health Administration to use its authority to take action against “employers who fail to address the hazards of distracted driving.”
The board’s criticism posed a direct challenge to the auto industry’s efforts to profit from partially automated vehicles and the smartphone industry’s quest to keep user eyes on their devices.
The NTSB can only make recommendations, while NHTSA regulates U.S. vehicles. NHTSA has sent teams to investigate 14 Tesla crashes in which Autopilot is suspected of being in use but has taken no action against the company.
Concerns have grown about systems that can perform driving tasks for extended stretches with little or no human intervention but that cannot completely replace human drivers.
“It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars. Because they don’t have driverless cars,” NTSB chair Robert Sumwalt said.
The Mountain View, California crash — involving a driver who was playing a game on his phone during the fatal trip — illustrates “semi-autonomous vehicles can lead drivers to be complacent … and it also points out that smartphones manipulating [drivers] can be so addictive that people aren’t going to put them down,” Sumwalt added.
Walter Huang, a 38-year-old Apple software engineer, was driving his Tesla Model X in 2018 in Autopilot mode at about 70 miles per hour (113 kph) when it crashed into a safety barrier. The NTSB said Huang had been using an iPhone and recovered logs showed a word-building game was active.
The probable cause of Huang’s crash was Autopilot steering the vehicle off the highway “due to system limitations and the driver’s lack of response due to distraction, likely from a cell phone game application and overreliance [on Autopilot],” the NTSB said. On prior trips, Huang had intervened when Autopilot steered the vehicle toward the same “highway gore” area.
NHTSA said it will carefully review the NTSB’s recommendations. The agency added that commercial motor vehicles “require the human driver to be in control at all times.”
Tesla drivers are able to avoid holding the steering wheel for extended periods while using Autopilot, but the company advises drivers to keep their hands on the wheel and pay attention. Sumwalt said Tesla allowed drivers to remove their hands from the wheel for up to three minutes under certain conditions when using Autopilot.
The NTSB said Tesla added safeguards to require quicker warnings at higher speeds for drivers without their hands on the wheel.
Regulators in Europe place limitations on Autopilot use, and Tesla issues alerts in the region for hands-off driving within 15 seconds, the NTSB said.
Tesla did not respond to requests for comment.
NTSB vice chair Bruce Landsberg called Autopilot “completely inadequate” and noted that Tesla vehicles have repeatedly crashed into large obstacles.
Sumwalt said Tesla — unlike five other auto manufacturers — has ignored NTSB safety recommendations issued in 2017.
Tesla’s Autopilot has been tied to at least three deadly crashes since 2016 and is suspected in others.
The NTSB in coming days will release its probable-cause finding on a third fatal Tesla Autopilot crash, which occurred in March 2019 in Florida; vehicle data showed no evidence the driver’s hands were on the steering wheel in the final 8 seconds before the car struck a semi-trailer truck.
The NTSB also called on cellphone manufacturers to add more safeguards to prevent the misuse of devices by drivers.
Sumwalt noted that Apple does not have a distracted driving policy for its employees and said it should adopt one. Apple says it expects its employees to follow the law.
(Reporting by David Shepardson, editing by Chris Reese, Sandra Maler, and Dan Grebler.)