Fatal crash of Tesla Model S in autopilot mode leads to investigation by federal officials
Federal regulators opened a preliminary probe into the autopilot feature on a Tesla Model S electric car after a fatal crash involving the technology, Tesla said Thursday.
The fatality – thought to be the first in the auto industry related to an autopilot feature – sparked questions about the limitations of the technology and its place in what is seen as an inevitable march toward self-driving vehicles. It followed other recent incidents in which drivers reported collisions while using such technology.
The male driver died in a May 7 crash in Williston, Fla., when a big rig made a left turn in front of his Tesla.
In a blog post, Tesla Motors Inc. said the 2015 car passed under the trailer, with the bottom of the trailer hitting the Model S’ windshield.
“Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied,” Tesla said.
Tesla said it immediately reported the fatal crash to the National Highway Traffic Safety Administration. The automaker emphasized that its autopilot feature is still in a beta phase of introduction and has limitations.
...
The need to maintain control became clear to Aaron Souppouris when he test-drove a Model S in April. Souppouris, a senior editor at the blog Engadget, said Tesla loaned him the car for an article about the autopilot feature and he drove it about 500 miles around England.
There were times at night, he said, when the car went back and forth within a lane and seemed “skittish.”
Once, while on autopilot, the car tried to change lanes but then abruptly reverted, and another time it disengaged autopilot in the middle of a lane change, Souppouris said. The car did better during the day than at night, he said.
I should probably do the smart thing and hold off until we see how things shake out, but I'm going to make a prediction:
This is not going to be that big of a deal.
According to the standard narrative, this should be huge; the story has always been one of yet another good-to-go transformative technology kept out of our hands by lawyers and bureaucrats. One of the recurring comments in the reporting was that the first time a self-driving car was involved in a serious accident, the result would be a litigious nightmare.
The truth is that the main reason we don't have fully autonomous cars is that parts of the technology have been remarkably hard to crack. Furthermore, the potential economic and environmental impact of driverless cars (level 4 or 5 depending on which scale you prefer) has been overstated, while far-less-hyped semi-autonomous features like adaptive cruise control and automatic braking are already revolutionizing auto safety.
There may be a lawsuit or some regulatory tightening after this, but these things will probably be minor factors (I'd worry more about the PR hit). The main problem facing Tesla is still the inability to turn a profit, and no amount of business-friendly policies will save a company that can't manage that.