As is the case in many, if not most, serious traffic accidents, the fatal collision between a Tesla Model S operating in Autopilot mode and a tractor-trailer hauling blueberries on a Florida highway resulted from a series of errors.
Now, as federal investigators from the National Highway Traffic Safety Administration, along with Tesla’s own engineers, work to sort out the chain of mistakes that led to the May 7 crash that killed the driver of the Tesla Model S, some facts have become clear.
Those facts point to driver error; at this point, there’s no indication that the Tesla’s Autopilot software was responsible for the accident. That means the dire predictions by some in the (non-technology) media that the accident would spell the end of autonomous vehicles are wrong.
But if the self-driving functions of the Tesla weren’t at fault, then what was? The short answer is that the drivers of both vehicles apparently did things that contributed to the accident. The truck driver made an improper left turn, and the driver of the Tesla apparently wasn’t paying as close attention as he should have been.
The accident happened like this: The Tesla was heading east on U.S. Route 27, a four-lane divided highway in Williston, Fla. The truck was heading west on the other side of the highway. The truck driver made a left turn into the path of the Tesla, which struck the trailer. The impact sheared off the Tesla’s roof, and the rest of the car continued off the highway until it hit a power pole. The New York Times published a clear diagram of the accident.
Some facts about the accident are not in dispute, notably that the Tesla driver was using the car’s Autopilot software at the time of the crash. Nor were the Tesla’s brakes applied before impact: the car hit the trailer that had turned across its path at an estimated 65 miles per hour.
Also not in dispute is the fact that the truck driver steered directly into the path of the Tesla when he made a left turn onto a side road. According to his statements in the police report, the truck driver didn’t see the oncoming Tesla until it was too late to prevent the accident.
One witness reported that a DVD player in the Tesla was showing a Harry Potter movie at the time of the crash. That claim has not been confirmed, although the police report did note that a DVD player was found in the wreckage.
The police report said that the driver of the tractor-trailer failed to yield the right of way to oncoming traffic. No charges were filed at the time of the accident, but that doesn’t mean charges won’t be filed once the investigation is complete.
So what does all of this mean for the future of self-driving cars and other autonomous vehicles? A lot less than you might think. It certainly does not mean that such vehicles have suffered a setback; if anything, the accident may strengthen the case that fully autonomous vehicles will save lives.
What’s missing from most of the discussion about the Tesla fatality is the fact that Autopilot does not make the Model S an autonomous vehicle. While the software includes some features that will appear in future autonomous vehicles, Tesla warns drivers that it is still in beta and that they must remain alert and in control of their cars at all times.
What Tesla is currently doing with its Autopilot software is similar to, but somewhat more advanced than, the driver assistance software from other automakers. A number of existing cars can track lane markings on highways, slow down and stop for traffic ahead, and speed up again when that traffic speeds up. A few cars, including Tesla’s, will change lanes to get around slower traffic, and some will take the proper exit when navigating by GPS.
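To make that concrete, here is a minimal sketch of the control logic behind adaptive cruise control and lane keeping. It is an illustration only, not Tesla’s implementation; every function name, gain and threshold below is invented for the example.

```python
# Illustrative sketch of adaptive-cruise / lane-keeping logic.
# NOT Tesla's code: all names, gains and thresholds are invented.

def target_speed(set_speed_mph, lead_gap_m, lead_speed_mph, min_gap_m=30.0):
    """Follow the lead vehicle when it is close; otherwise cruise
    at the driver's set speed."""
    if lead_gap_m is None:          # nothing detected ahead
        return set_speed_mph
    if lead_gap_m < min_gap_m:      # too close: match the slower lead
        return min(set_speed_mph, lead_speed_mph)
    return set_speed_mph

def steering_correction(lane_offset_m, gain=0.1):
    """Proportional nudge back toward the lane center, based on the
    camera's estimate of how far the car has drifted."""
    return -gain * lane_offset_m

# Lead car 20 m ahead doing 40 mph, driver's set speed 65 mph:
print(target_speed(65, 20.0, 40.0))   # -> 40 (slow down and follow)
print(steering_correction(0.5))       # drifted 0.5 m -> steer back
```

The point of the sketch is how reactive these systems are: they respond to what the sensors report in the car’s own lane, which is exactly why a vehicle crossing the lane from the side is a much harder problem.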
Those are useful features, and in many cases they are important safety features. The ability to stop for stopped traffic on the highway ahead is a good example, and a number of cars besides Tesla’s can do that.
But that’s not the same thing as stopping when a truck makes an unsafe left turn in front of the car. The Tesla software apparently wasn’t designed to handle that situation, and even though the car’s onboard radar may have detected the truck, it’s not clear that the system recognized it as a threat.
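One plausible explanation, and it is only informed speculation here, is that forward-looking radar deliberately discounts certain returns, such as overhead signs and bridges, to avoid false braking events. The sketch below shows how that kind of filtering could, hypothetically, also dismiss the high, reflective side of a crossing trailer; the classes, cutoffs and numbers are all invented for illustration.

```python
# Hypothetical illustration of why a radar return might be discounted.
# This is NOT Tesla's logic; the classes and thresholds are invented.

from dataclasses import dataclass

@dataclass
class RadarReturn:
    range_m: float            # distance to the reflection
    height_m: float           # estimated height of the reflection
    closing_speed_mps: float  # how fast the car is closing on it

def is_braking_threat(r: RadarReturn,
                      overhead_cutoff_m=1.2,
                      min_closing_mps=1.0):
    """Flag a return as a threat only if it sits low enough to be a
    vehicle body and the car is actually closing on it. Filtering out
    high reflections avoids false braking for signs and overpasses,
    but could also dismiss a high-riding crossing trailer."""
    if r.height_m > overhead_cutoff_m:
        return False          # treated like an overhead sign or bridge
    return r.closing_speed_mps > min_closing_mps

# A trailer side might reflect mostly above the cutoff:
trailer = RadarReturn(range_m=60.0, height_m=1.5, closing_speed_mps=29.0)
print(is_braking_threat(trailer))   # -> False: filtered as "overhead"
```

Whether anything like this happened in the Florida crash is for the NHTSA investigation to determine.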
Had the Tesla and the other traffic on that Florida highway been equipped as autonomous vehicles, and had the highway itself been instrumented, the vehicles would have been in communication. In the best scenario, the truck would have waited until the Tesla had passed, and the accident would have been avoided.
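Here is a toy sketch of the kind of vehicle-to-vehicle negotiation that scenario imagines. The message format and the time-gap threshold are invented for illustration; real V2V protocols, such as the DSRC-based standards, are far more involved.

```python
# Toy vehicle-to-vehicle (V2V) yield check, purely illustrative.
# Real V2V protocols (e.g., DSRC-based standards) are far more involved.

def seconds_to_intersection(distance_m, speed_mps):
    """Time until an oncoming vehicle reaches the turn point."""
    return float('inf') if speed_mps <= 0 else distance_m / speed_mps

def may_turn_left(oncoming_broadcasts, required_gap_s=8.0):
    """The turning vehicle proceeds only if every broadcasting
    oncoming vehicle is more than required_gap_s away."""
    return all(
        seconds_to_intersection(msg['distance_m'], msg['speed_mps'])
        > required_gap_s
        for msg in oncoming_broadcasts
    )

# An oncoming car broadcasting position and speed is ~4.5 s away:
broadcasts = [{'distance_m': 130.0, 'speed_mps': 29.0}]
print(may_turn_left(broadcasts))    # -> False: the truck waits
```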
What happened instead is that the Tesla driver may not have been paying full attention to the road, and the truck driver failed to notice and avoid the Tesla while turning left across oncoming traffic.
This is exactly the type of accident that autonomous vehicles are intended to prevent, but in this case neither vehicle was autonomous. The result was a tragedy that need not have happened if only both drivers had been giving the traffic situation their full attention.
Sadly, until vehicles gain something like full situational awareness, accidents like this one will keep happening. There’s little a car can do to make its driver fully attentive and responsible, but that’s what was required in this case.