Second Tesla Owner Claims Computer Glitch Led To Crash

On April 26, Arianna Simpson was driving her Tesla Model S on Interstate 5 north of Los Angeles with Autopilot engaged. One of the features of Autopilot is adaptive cruise control, which Tesla calls Traffic-Aware Cruise Control, or TACC. It tracks the vehicle ahead and is supposed to adjust the Tesla’s speed automatically.

Tesla Model S crash on I-5

Those of us who are doomed to live with conventional cruise control are all too familiar with what happens when traffic ahead is slower than our selected speed. We have to tap the brakes or hit the cancel button to deactivate the system, then hit resume or reprogram it when the road ahead clears. TACC is supposed to lift that burden from our shoulders, as the sketch below illustrates.
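To make the distinction concrete, here is a minimal, purely illustrative sketch of the decision logic a traffic-aware cruise control might use. This is emphatically not Tesla’s implementation; the function name, the two-second time-gap rule, and all the thresholds are assumptions invented for the example.

```python
# A minimal, illustrative sketch of traffic-aware cruise control logic.
# NOT Tesla's implementation -- the function name, the 2-second
# time-gap rule, and all thresholds here are assumptions.

def tacc_target_speed(set_speed, ego_speed, lead_speed=None, gap_ft=None):
    """Pick a target speed (mph) for this control cycle."""
    TIME_GAP_S = 2.0          # assumed desired following gap, in seconds
    FT_PER_S_PER_MPH = 1.467  # 1 mph is roughly 1.467 ft/s

    if lead_speed is None or gap_ft is None:
        # Nothing detected ahead: hold the driver's set speed.
        return set_speed

    # Following distance we want to keep at our current speed.
    desired_gap_ft = ego_speed * FT_PER_S_PER_MPH * TIME_GAP_S

    if gap_ft < desired_gap_ft:
        # Closing on the lead car: slow to (at most) its speed.
        # A stopped lead car means braking to zero -- the behavior
        # Simpson says her car failed to perform.
        return min(lead_speed, set_speed)

    # Adequate gap: resume the set speed, no driver input needed.
    return set_speed


if __name__ == "__main__":
    # Lead car halts 120 feet ahead while we cruise at 60 mph:
    print(tacc_target_speed(set_speed=60, ego_speed=60,
                            lead_speed=0, gap_ft=120))  # -> 0
```

The point of the sketch is simply that the system, not the driver, is supposed to make the slow-down decision; where a conventional cruise control would sail on at the set speed until the driver intervenes, a traffic-aware system is expected to brake on its own.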

That’s not what happened, Simpson reports. She tells Ars Technica, “All of a sudden, the car ahead of me came to a halt. There was a decent amount of space so I figured that the car was going to brake as it is supposed to and didn’t brake immediately. When it became apparent that the car was not slowing down at all, I slammed on the brakes but was probably still going 40 when I collided with the other car.” She blames her car for the accident, saying the Autopilot system failed to respond correctly to the situation.

Not so, says Tesla. As in the case of Jared Overton, the Utah Tesla owner who says his Model S drove itself into the back of a truck, Tesla accessed the data log in Simpson’s car and concluded she was entirely at fault for the collision. In a statement to Ars Technica, the company said:

“Safety is the top priority at Tesla, and we engineer and build our cars with this foremost in mind. We also ask our customers to exercise safe behavior when using our vehicles. Since the release of Autopilot, we’ve continuously educated customers on the use of the feature, reminding them that they’re responsible for remaining alert and present when using Autopilot and must be prepared to take control at all times.

“Tesla Autopilot is designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable. Autopilot is by far the most advanced such system on the road, but it does not turn a Tesla into an autonomous vehicle and does not allow the driver to abdicate responsibility.”

Tesla’s response has left Simpson with a bad taste in her mouth. She describes herself as “super pro-Tesla and self-driving things in general.” But getting thrown under the bus by the company is not what she expected. She says Tesla lacks empathy and has been “pretty awful throughout the process.”

One of the issues that concerns people who design autonomous driving systems is what is known as the “handoff” — the transition between the car operating in self-driving mode and the driver resuming full control of the automobile. It’s safe to assume that someone using Autopilot may let his or her attention wander a bit, trusting the car to take care of routine driving chores. That’s what it’s for, after all. What the experts worry about is how the computer should alert the driver when a dangerous situation is developing.

Tesla says its system issues several visual and audible warnings in such instances. Simpson says her car did none of those things but simply cruised serenely ahead without slowing. So here we have another classic “she said, the computer said” situation. What has Simpson upset is that the company has decided she is a clueless clod who can’t drive properly. That stings.

Of course she stepped on the brake when she realized there was danger ahead. Of course that disengaged Autopilot, as it is supposed to do. The question Tesla seems to want no part of (no doubt on the advice of counsel) is why the car did not detect the danger ahead and trigger automatic emergency braking. Notice that the company’s statement says the system is designed to issue warnings when the driver needs to take over, but it does not say there is any evidence in the data logs that the system in Simpson’s car actually did so.

Tesla’s defensive, “blame the driver” mode seems at odds with the glowing claims Elon Musk makes for his autonomous driving technology. On one hand, the head of the company expounds on how awesome the technology is. On the other, the company runs for cover every time someone suggests there might be an issue that needs addressing. Tesla’s fallback position is always that its technology is in beta testing, and hey, stuff happens. If it does, it’s your fault, not Tesla’s. Not only that, the company will use the data stored in your car’s computer to undermine you.

Is that really the message Tesla wants to send to all those Model 3 reservation holders out there?

Photo credit: Arianna Simpson


Steve Hanley

Closely following the transition from internal combustion to electricity. Whether it's cars, trucks, ships, or airplanes, sustainability is the key. Please follow me on Google+ and Twitter.