Tesla fatality blamed on zeal for tech by U.S. regulators

Tesla Motors says the self-driving feature suspected of being involved in a May 7 fatal crash is experimental, yet it’s been installed on all 70,000 of its cars since October 2014.
For groups that have lobbied for stronger safety rules, that’s precisely what’s wrong with U.S. regulators’ increasingly anything-goes approach.
“Allowing automakers to do their own testing, with no specific guidelines, means consumers are going to be the guinea pigs in this experiment,” said Jackie Gillan, president of Advocates for Highway and Auto Safety, a longtime Washington consumer lobbyist who has helped shape numerous auto-technology mandates. “This is going to happen again and again and again.”
Tesla’s use of technology still in development, while common in its Silicon Valley home, contrasts with the cautious approach of General Motors and other automakers that have restricted their semi-autonomous cars to test tracks and professional drivers. It’s permitted because U.S. regulators have taken an intentionally light-touch approach to encourage innovation.
The May crash under investigation involved a 40-year-old Ohio man who was killed when his 2015 Model S drove under the trailer of an 18-wheeler on a highway near Williston, Florida, according to the Florida Highway Patrol. The truck driver told the Associated Press that he believes the Ohio man may have been watching a movie. Authorities recovered a portable DVD player but don’t know whether it was playing at the time of the crash.
The National Highway Traffic Safety Administration said Thursday that it is investigating the crash, which comes as the regulator says it is looking for ways to collaborate with the industry. The agency negotiated an agreement earlier this year to speed the introduction of automatic emergency braking, frustrating safety groups that said they had no input and that carmakers’ pledges to install the technology couldn’t be enforced by law.
NHTSA is expected to announce guidelines as soon as this month that will set some parameters for self-driving cars on U.S. roads. Transportation Secretary Anthony Foxx told reporters Wednesday the agency would be as exact as it could without being overly prescriptive.
In January, Foxx and NHTSA chief Mark Rosekind announced in Detroit that they’d allow automakers to demonstrate the safety of autonomous vehicles and apply for exemptions to existing safety rules. They said the government shouldn’t stand in the way of technological progress.
In the Florida crash, Tesla’s “Autopilot” semi-autonomous driving feature failed to detect the white side of the tractor trailer against a brightly lit sky, so it didn’t hit the brakes, according to the company.
The company says the cars are safer than conventional ones. In a blog post, Tesla said the May accident was the first known fatality in more than 130 million miles of Autopilot driving. That compares with one fatality in every 94 million miles among all U.S. vehicles, according to Tesla.
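Tesla’s comparison reduces to simple division of the two mileage figures it cites. As an illustration only, using the company’s own numbers (a single fatality is too small a sample to carry much statistical weight):

```python
# Back-of-the-envelope comparison of the fatality rates cited above.
# Both mileage figures are Tesla's claims, not independently verified data.
autopilot_miles_per_fatality = 130e6  # Tesla: 1 fatality in 130+ million Autopilot miles
us_miles_per_fatality = 94e6          # Tesla's cited U.S. average: 1 per 94 million miles

# How many times more miles per fatality Autopilot logs, per Tesla's figures
ratio = autopilot_miles_per_fatality / us_miles_per_fatality
print(f"Miles per fatality, Autopilot vs. U.S. average: {ratio:.2f}x")
```

By this arithmetic, Autopilot logs roughly 1.4 times as many miles per fatality as the U.S. average Tesla cites, though the comparison mixes highway-biased Autopilot miles with all driving conditions.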
In fact, highway deaths are on the rise. Preliminary data released Friday showed a 7.7 percent increase in U.S. highway fatalities in 2015, to 35,200, compared with the year before. NHTSA attributed the increase, in part, to an improving economy and falling gas prices. But it said that human error is a factor in 94 percent of car crashes.
“Autopilot is by far the most advanced driver-assistance system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility,” the company said. “Since the release of Autopilot, we’ve continuously educated customers on the use of the feature.”
Using customers to help refine a technology is common in the tech industry, where software features are rolled out in “beta” phase to consumers who provide programmers with real-time feedback used to identify and fix bugs. The industry has concluded that accepting some flaws is the price of getting to use new products more quickly.