Don’t Throw Out the Autonomous Baby With the Tesla Autopilot Bathwater
My dad never graduated college, but he had common sense. If an apple went bad, he would throw it out, then he would check the other apples. But no matter how many bad apples he found, he definitely wouldn’t throw out the oranges. He didn’t know the science, but he knew bad apples don’t affect good oranges, even if they’re in the same bowl.
The same goes for engineering. Take escalators and elevators. Sure, both require electricity and motors, and both move people, but that’s where the similarities end. Even my three-year-old knows the difference between them.
As with a bad apple, when an escalator fails, you just deal with it. If it’s a problem common to that model of escalator, you improve the rest before others fail. You might even look at escalators in general, just to be safe. But even if you found one particular escalator to have a recurring issue, you wouldn’t suddenly lose trust in elevators. It would make no sense.
But that’s exactly what’s happening right now with the growing controversy around Tesla Autopilot. Almost every time a Tesla is involved in a crash, the words “self-driving,” “driverless” or “autonomous” end up in the headlines.
These crashes have nothing to do with autonomous vehicles (AVs) or actual self-driving technology, because Tesla Autopilot is nothing more than an advanced driver assistance system (ADAS).
It is essential to distinguish between ADAS and AVs, because no coherent discussion is possible if one is confused with the other. ADAS and AVs are fundamentally different, and the razor-sharp dividing line comes into focus by looking at the user experience.
ADAS requires users to sit up front and drive — even if some aspects of driving are partially automated.
AVs let users sleep in the back.
Tesla Autopilot is ADAS. It is not self-driving or autonomous or driverless, no matter what some fans say, or how headlines may sensationalize it. Even Tesla’s website is clear: “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”
After eleven incidents of first responder vehicles being struck by Teslas allegedly on Autopilot, the National Highway Traffic Safety Administration (NHTSA) has now opened a formal investigation, a potential first step toward regulation of…well, this is where things look complicated, but are not.
Any Tesla Autopilot issue is an ADAS issue. Tesla Autopilot is but one type of escalator, while true AVs are elevators. Whatever NHTSA decides to do about Tesla Autopilot must recognize that they’re dealing with ADAS, not AVs.
Tesla Autopilot’s core issue is inherent to all ADAS: the better it is, the more drivers trust it, and the more likely they are to zone out. This isn’t theoretical. This is fact, backed by decades of research, and what inspired Argo AI, Waymo and others to focus on AV technology instead of ADAS.
Actual AVs don’t have “zone out” issues, because users — that’s us — can literally sleep in the back. If you want to know the Top 10 things an AV must be able to do to actually be an AV, take a look here.
AVs offer the potential to improve road safety wherever they are deployed, and I believe they will someday make a difference, at scale. But that day won’t come any sooner if they get unfairly dragged into the Tesla Autopilot controversy, or confused with ADAS in general.
We don’t blame oranges for bad apples, or elevators for finicky escalators. And we should call a spade a spade, or in this case, ADAS, ADAS.
Don’t throw out the autonomous vehicle baby with the Tesla Autopilot bathwater.
Alex Roy loves driving, self-driving, and commuting in his Tesla. He is also the Director of Special Operations at Argo AI, host of the No Parking and Autonocast podcasts, editor-at-large at The Drive, founder of the Human Driving Association, author of The Driver, and producer of APEX: The Secret Race Across America. He held the Cannonball Run record from 2006 to 2013. You can follow him on Twitter and Instagram.