Tesla Model S Crashes Into Stalled Van On Highway

A video uploaded to YouTube on May 25 shows why Tesla Autopilot, with its Traffic-Aware Cruise Control feature, is still only a Level 2 system on the autonomous driving scale. Despite all the hype from Elon Musk about how awesome his company's technology is, it still requires constant active supervision by a human driver.

UPDATE: The video has been marked private by the person who uploaded it. However, you can watch a GIF created by CNET at this link.

The driver who uploaded the video claims his car malfunctioned. Clearly he is looking for Tesla to take responsibility for the crash. But he is not getting a lot of love from Tesla or the Tesla community. The company says its analysis of the data shows all systems functioned correctly.

One Tesla owner posted this online:

Tesla Model S owner's manual, page 69:
Warning: Traffic-Aware Cruise Control can not detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object, bicycle, or pedestrian is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death. In addition, Traffic-Aware Cruise Control may react to vehicles or objects that either do not exist or are not in the lane of travel, causing Model S to slow down unnecessarily or inappropriately.

What’s going on here? The owner of the car claims Autopilot and TACC worked flawlessly for him in the past. He doesn’t understand why they didn’t this time. The answer is that Autopilot is still a work in progress. Tesla says explicitly that the software is still in beta mode.

Even though the company collects millions of miles' worth of data from cars using Autopilot every day, there are still what Elon Musk calls “corner cases.” Those are the instances when the software still requires assistance from a human driver. What’s happening is that drivers are being lulled into a false sense of security by how well the system operates most of the time.

Take this frank admission from another Tesla owner who was involved in a collision in April. “Once I recognized the car was stopped in front of me, I explicitly remember panicking with the following thoughts going through my head: ‘Does my car see this? Is it going to do anything? NO. NO IT ISN’T. EMERGENCY.’ In retrospect, the actions I needed to take were obvious. I should have regained control immediately. That half of a second or more probably would have made a lot of difference; the problem is that my brain wasn’t primed to have that conversation with itself.”

There it is in a nutshell. Experts worry about what is called “the hand-off.” That’s the tiny sliver of time between when autonomous driving systems are functioning normally and when a situation arises that requires human intervention. In the video, an alarm can be heard prior to the crash. The driver assumed it was the chime that sounds when emergency braking occurs. But he was wrong. According to other Tesla owners who have watched the video, the alarm is the one that warns drivers to re-assert direct control of the car.

Tesla’s Autopilot and TACC are very good and getting better all the time. But they are not perfect and won’t be for some time. The problem is not that we humans don’t trust our machines to function correctly, it is that we trust them too much. We want to believe they will work their magic every time without fail. When they don’t, it takes a beat or two for our own fallible brains to recognize there is danger ahead and to respond appropriately.

The dilemma is that on one hand, we have machines that can handle 99% of driving chores for us. On the other hand, we are expected to be fully alert and engaged at all times to handle that 1% of the time when our input is needed. Our attention wanders and we get lulled into believing we really can read the newspaper or fall asleep at the wheel and nothing bad will happen. The biggest danger from autonomous driving systems is not that computers might fail but that humans will.


Steve Hanley

Closely following the transition from internal combustion to electricity. Whether it's cars, trucks, ships, or airplanes, sustainability is the key. Please follow me on Google+ and Twitter.