Autopilot Feature Likely Not at Fault in Tesla Model S Fatal Crash
So what does all of this mean for the future of self-driving cars and other autonomous vehicles? A lot less than you might think. It certainly does not mean that such vehicles have suffered a setback; in fact, it may mean that fully autonomous vehicles will save lives.

What's missing in most of the discussion about the Tesla fatality is the fact that the Autopilot software in the Tesla does not make it an autonomous vehicle. While Autopilot does have some features that will appear in future autonomous vehicles, Tesla warns drivers that the software is still in beta and that they must remain alert and in control of their cars at all times.

What Tesla is currently doing with its Autopilot software is similar to, but somewhat more advanced than, the driver-assistance software offered by other carmakers. A number of existing cars can watch for lane markings on highways, slow down and stop for traffic ahead, and speed up again when that traffic speeds up. A few cars, including the Tesla, will change lanes to avoid slower traffic, and some will turn onto the proper exit when using GPS navigation. Those are useful features, and in many cases important safety features. The ability to stop when encountering stopped traffic on the highway ahead is a good example, and a number of cars besides the Tesla can do that.

Had the Tesla and the other traffic on that Florida highway been equipped as autonomous vehicles, and had the highway been instrumented, the vehicles would have been in communication, and in the best scenario the truck would have waited until the Tesla had passed, avoiding the accident. What happened instead is that the Tesla driver may not have been paying full attention to the road, and the truck driver failed to notice and avoid the Tesla while making a left turn across oncoming traffic.
This is the type of accident that autonomous vehicles are intended to prevent, but in this case neither vehicle was autonomous. What happened instead was a tragic accident that need not have happened if both drivers had been giving the traffic situation their full attention. Sadly, until vehicles can provide drivers with that kind of full situational awareness, accidents like this one will keep happening. There is little a car can do to make its driver fully attentive and responsible, but that is what was required in this case.
Stopping for stopped traffic ahead, however, is not the same thing as stopping when a truck makes an unsafe left turn in front of the car. The Tesla software apparently wasn't designed to handle that situation, and even though the car's onboard radar may have detected the truck, it's not clear that the system recognized it as a threat.