Below is a dashcam video showing how the latest version of Tesla Autopilot can navigate a snow-covered two-lane road with no visible lane markings and no car ahead to follow. It’s difficult to keep all the details about Tesla Autopilot straight, so here’s a quick refresher. (Note: I am not a Tesla engineer, nor do I eat, sleep, and breathe Tesla 24/7, so if I get a few details wrong, I apologize in advance.)
Tesla began installing the hardware for its Autopilot system on every car built starting in October of 2014. The system consisted of a forward-facing camera supplied by MobilEye, a forward-facing radar, and 12 ultrasonic sensors mounted around the perimeter of the car. For the first year, the system operated in shadow mode, meaning it captured data from real cars in real-world driving and fed that information back to engineers at Tesla. They used that data to refine and validate the software that would become the basis for the suite of semi-autonomous features known as Autopilot.
Then in the fall of 2015, Tesla rolled out a working version of Autopilot via an over-the-air update. In the original scheme of things, the camera was the primary input device and the radar was secondary. The system relied on visible lane markings on road surfaces and could lock on to a car ahead to help guide it. Then in May of this year, a driver using Tesla Autopilot was killed on a Florida highway when his Model S failed to recognize a tractor trailer crossing the road. That incident led to a parting of the ways between Tesla and MobilEye, with each blaming the other for the tragedy.
Tesla went back to the drawing board. It completely reversed the priorities of its hardware, making the radar primary and the camera secondary. It rolled out the new programming in September of this year. In the video, we see how the new software is able to drive the car without human assistance on a road where the lane markings are obscured by snow and with no car ahead for the system to follow.
In October of this year, Tesla completely changed the hardware that operates the Autopilot system. It added a camera with three lenses (a wide-angle lens, one with a close-range focus, and one with a medium-range focus) as well as cameras at all four corners of the car. The radar was upgraded, as were the ultrasonic sensors. Tesla also added a new “supercomputer in a box” from Nvidia with 40 times more processing power to manage the increased flow of data. The new package is referred to as Hardware 2.
Here’s where things get confusing. Once Tesla gets the new system calibrated, validated, and debugged, it will be known as Enhanced Autopilot. But Tesla is not at that point yet. When it announced the changes, it said the new system would operate in shadow mode only until its engineers could tweak the software and make it roadworthy. The company promised it would start rolling out the changes in December. As of this moment, in typical Tesla fashion, it has over-promised and under-delivered. Until Enhanced Autopilot is fully deployed, cars with the original hardware package and the latest software updates have greater semi-autonomous functionality than newer cars.
The wait should be worth it in the long run. When fully realized, Enhanced Autopilot is intended to allow full Level 5 autonomous driving in those jurisdictions where it is permitted. At the moment, only Michigan allows it.