Breaking News: Another Tesla Model X Crashes On Autopilot

There are thousands of motor vehicle crashes every day. Very few of them ever get press coverage. But add the names “Tesla” and “Autopilot” to a story and it suddenly becomes news, especially after the first reported fatal accident involving a Tesla driven in Autopilot mode in Florida in May. NHTSA and NTSB have opened investigations into that crash, and their findings may have a significant impact on how regulators view Autopilot and semi-autonomous driving systems from other manufacturers.

Model X crash in Montana


The latest crash involves a Model X driving on a two-lane country road in Montana in the wee hours of the morning. The report comes via a friend of the driver, who started a thread on the Tesla Motors Club forum entitled “My friend model X crash bad on AP yesterday.” The first thing to point out is that the poster does not appear to be a native English speaker, so some allowances for grammar need to be made. TMC member Eresan began the thread this way:

“Both 2 people on car survived. It was late at night, Autopilot did not detect a wood stake on the road, hit more than 20 wood stakes, tire on front passenger side and lights flyed away. The speed limit is 55, he was driving 60 on autopilot. His car is completely destroyed. The place he had accident does not have cellphone signal, it is 100 miles from the hotel. We are on a 50 people Wechat messenger group. I woke up today saw he managed to get internet to ask people in the Wechat group to call tesla for assistant.”

A few hours later, Eresan added this post: “Just got more photos from the driver. The car was in autopilot at speed between 56-60, the car drove off the road hit the guard rail wood posts. I questioned him how can AP drove off the road himself, he said he also want to find out. Photo attached the wood posts he hit.”

Whoever the driver is, he is not getting much love from others on the TMC forum. Most question the wisdom of operating in Autopilot mode on a dark country road with intersecting roads at 2 am under any circumstances. Here’s a comment posted by mrElbe: “And I thought Tesla drivers were a bit brighter than the average one out there. But apparently not! Using AP at 2 am on a sketchy road is just negligent.” To which Eclectic replies, “Or even suicidal. I know the area where the accident is said to have happened and it’s not a place for AP use at 2 AM. There are all sorts of animals that cross roads in the area, from deer to antelope to even elk. Whatever the person was doing at 2AM, he or she made a series of bad calls. I couldn’t imagine using AP on I 90 under those conditions, let alone a county road in farm/game country.”

No doubt Tesla will have something to say about this latest accident once it has downloaded the data stored in the car’s computer. It’s easy to see that there is a storm gathering around Tesla and its Autopilot system. Some argue that the system should not be able to be switched on at all if conditions are not suitable for its use. Others argue that people are still responsible for controlling their cars under all circumstances. For instance, Autopilot can be engaged at speeds up to 90 mph even though Tesla warns it will not be able to detect all objects in the path of the car at those speeds.

Why didn’t Tesla limit the capabilities of the system initially to avoid its use in dangerous circumstances? That’s a good question and one that has no easy answer. It appears the stupidity of human drivers came as a surprise to Elon Musk and Tesla. And Musk loves to push boundaries. It’s what he does. Perhaps some of that famous Musk hubris is at fault. Whatever the explanation, it becomes clearer every day that the main cause of these collisions is not software or hardware malfunctions but poor judgment by the nut behind the wheel.

Expect NTSB to rein in some of the features of Autopilot, and when it does, Musk and Tesla will have no one to blame but themselves. They made the decision to put Autopilot out there in beta form. They could have restricted it to no more than 5 mph over the posted speed limit from day one. They could have programmed it not to activate on two-lane roads with cross traffic. They could have let the system prove itself, then rolled out upgrades and updates as people became more familiar with the new technology.

They chose not to. Looking back at the issues that beset the launch of the Model X, Musk admitted the company had bitten off more than it could chew. An argument can be made that it has done the same with Autopilot. Elon Musk is a visionary, a force of nature. Once he makes up his mind about something, he is not easily deterred. But there is a fine line between determination and pigheadedness. Sometimes, I think Elon Musk is his own worst enemy.

That is my opinion, and one that many will disagree with, as is their right. It would be unfortunate, however, if Musk’s bullheaded nature provides the basis for regulatory actions that impede the development of autonomous driving systems for all drivers in the future. Sometimes bold action needs to be leavened with a dash of common sense, and a realization that there are a lot of really stupid people out there who will use products in ways that were never intended.

Steve Hanley

Closely following the transition from internal combustion to electricity. Whether it’s cars, trucks, ships, or airplanes, sustainability is the key. Please follow me on Google+ and Twitter.