Update, July 26 at 7:00 a.m.: A response from Tesla Motors that was unknown to me when this story was written has been posted in the comments by Mitchell Burns. Please see his comment below for the full response; it adds a vital component to this story. Thanks to Mitchell for making this information available.
An angry driver whose Model X crashed on a Montana back road has posted a long and bitter note on the Tesla Motors Club forum, condemning Tesla for making a beta version of its Autopilot software available in its cars. He says Tesla drivers are “lab rats.” He then calls out Elon Musk personally: “Mr. Musk should stand up as a man, face up the challenge to thoroughly investigate the cause of the accident, and take responsibility for the mistakes of Tesla product.”
The driver, who identifies himself as Pang on TMC, gives the following account of the events that led up to the crash. He says he and a friend drove about 600 miles on Interstate 90 on the way to Yellowstone National Park. When he exited the highway to get on Montana route 2, he drove for about a mile, saw conditions were clear, and turned on Autopilot again.
“After we drove about another mile on state route 2, the car suddenly veered right and crashed into the safety barrier post. It happened so fast, and we did not hear any warning beep. Autopilot did not slow down at all after the crash, but kept going in the original speed setting and continued to crash into more barrier posts in high speed. I managed to step on the break, turn the car left and stopped the car after it crashed 12 barrier posts.
“After we stopped, we heard the car making abnormal loud sound. Afraid that the battery was broken or short circuited, we got out and ran away as fast as we could. After we ran about 50 feet, we found the sound was the engine were still running in high speed. I returned to the car and put it in parking, that is when the loud sound disappeared.”
Tesla reviewed the driving logs from the Model X and reported that the car operated for more than two miles with no hands on the steering wheel, despite numerous alarms and warnings issued by the car.
There are three possible explanations for the crash: 1) Everything happened exactly as Pang says it did. 2) Everything happened exactly as Tesla says it did. 3) Evil aliens patrolling nearby in a UFO decided to have some fun with Mr. Pang.
Comments on TMC ranged from the incredulous to the acerbic, as people pointed out over and over again that Teslas are programmed to put themselves in Park as soon as the driver’s door is opened with the driver’s seat vacant. The idea that the car’s motors could continue churning away at high speed without the wheels moving also was met with great skepticism.
Still, new technology has always caused fear in people, and fear can dangerously distort perception. Some have hinted darkly that the only one who knows what those computer logs actually reveal is Tesla Motors. The company, of course, has a vested interest in covering up any bad news about its Autopilot system, especially now that NHTSA, NTSB, and the US Senate have opened investigations into the cause of a fatal crash in Florida in May. Even so, almost everyone agrees that using Autopilot on a dark country road in Montana in the wee hours of the morning may not have been the best possible decision.
Pang says Tesla never contacted him to get his side of the story, although an update suggests the company may have since reached out to him. Is it fair to say that Tesla could have been more proactive in dealing with Mr. Pang’s complaints? It shouldn’t take an emotional and highly embarrassing (if true) blog post to get Tesla to deal appropriately with a customer who has experienced an unexplained crash while driving one of the company’s cars. Perception is reality, especially in a world dominated by internet communications.
I shared this story with my wife, and before I was even done she said, “He fell asleep at the wheel and didn’t wake up until the crash.” Oddly enough, that is my assessment as well. That being said, Autopilot is supposed to activate the emergency flashers, slow down, and stop if it thinks the driver is not paying attention. Somehow, it appears that didn’t happen in this case. Surely driving on a curvy country road for two miles without a hand on the wheel should be cause enough for the fail-safe provision of the Autopilot system to kick in, no?
Some information about that, rather than a bland “The data shows driver error” report, would probably go a long way toward reassuring the public about the safety of the system. With all the bazillions of dollars Tesla has spent to develop Autopilot, surely there is a little camera inside the rearview mirror that monitors the driver’s facial expressions for signs of fatigue? Tesla wouldn’t overlook such an obvious piece of the puzzle, would it?
Source: Teslarati via TMC Photo credit: Steven Xu via TMC