On August 7, Mark Molthan was driving his Tesla Model S on Highway 175 near Kaufman, Texas. He was using Autopilot, something he says he had done many times before on that road. He told police he reached into the glove compartment for a rag and was cleaning the dashboard when the car suddenly swerved off the road and hit a guardrail made of steel cables. The car bounced along the cables several times before coming to a stop.
The Model S was heavily damaged. Molthan thinks it’s a total loss. But he says he won’t sue Tesla as a result of the collision, since he obviously was not paying attention to the road when the accident happened. “I used Autopilot all the time on that stretch of the highway,” he said during a phone interview with Bloomberg. “But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn’t stop. It actually continued to accelerate after the first impact into the guardrail.”
Molthan may be philosophical about his loss, but his insurance company is less than thrilled about being on the hook for the cost of a new Tesla. Lawyers for the carrier have sent a letter to Tesla asking the company to conduct a joint inspection of the car. Tesla says it is looking into the Texas crash. As usual in these instances, Tesla stresses that Autopilot is only an assist feature. Drivers need to keep their hands on the wheel and be prepared to take over control of the car at any time.
Others are annoyed that only the bad news about Autopilot ever makes the news. “I’m disgusted that the only time Autopilot is in the news is when there are crashes,” said Diana Becker of Los Angeles. “Nobody hears about the accidents that don’t happen.” She recently completed a 27-day road trip with her two children in a Tesla. She credits the Autopilot in her Model X with avoiding a collision with a driver who crossed suddenly in front of them. “I drove 400 miles a day on our road trip, and Autopilot was my second pair of eyes,” said Becker. “I depend on it.”
This incident raises two questions: why did Molthan’s Tesla fail to negotiate a curve in the road it had handled successfully many times before, and why did it keep going after the first collision?
That behavior has become a recurring theme in accidents involving Autopilot. Witnesses to the accident that killed Joshua Brown in Florida on May 7 say his car continued on down the road after striking a tractor trailer crossing the road ahead of him. The driver of a Model X in Montana says his car clipped a guardrail post one dark night while on Autopilot and continued on, hitting a dozen more posts before finally coming to a halt after its right front wheel was ripped off the car. Some people find the apparent inability of the Tesla system to recognize that a collision has occurred and stop the car troubling.
The other question is whether an insurer can sue Tesla for what it will claim is a defect in the Autopilot system when its own insured admits he was at least partially at fault. The answer, of course, is that in an increasingly litigious country, anyone can sue anyone else at any time about anything. As a former colleague of mine loved to say, “Only lawyers and painters can turn black to white.”
If Tesla does get sued, any decision will not be binding on other courts. But it may open the floodgates to a litigation bonanza for trial attorneys. Tesla must be aware that its Autopilot system, which admittedly is still in beta testing, will be a magnet for legal challenges. But Elon Musk’s “Damn the torpedoes. Full speed ahead!” attitude keeps him from slackening his headlong rush toward autonomous cars.
In the end, how quickly or slowly autonomous driving systems are adopted by other manufacturers and the general public may have less to do with what regulators say and more to do with what courts and juries decide.
Source: Bloomberg Photo credit: Mark Molthan