
What Are the SAE Levels of Automated Driving? Think of the Hands, Feet, Eyes and Brain

[Image: the inputs needed when a person drives a car: a foot over a pedal, hands beside a steering wheel, eyes, and a brain]

Editor’s Note: We know it’s hard to understand the SAE Levels of Driving Automation, so we asked two writers, one layperson and one expert, to offer some guidance. You can find the companion piece here.

One of the many things I learned in my two decades in the engineering profession is that communication is not a strength of most engineers, at least not when the audience is people who aren’t engineers. They tend to use so many technical terms that their explanations come across as an alien language to non-engineers. That’s why a standard definition of automated driving levels developed by a bunch of engineers inevitably became misunderstood and misused by marketers, policymakers, and even industry leaders. But there are better ways to express what a vehicle’s electronic brain can actually do to assist, support, or replace a human driver.

In a companion article to my piece, writer Andrew Lewis, who is not an engineer, relates automation to animals for those looking for a fun, easy analogy. I’m going to dive a little deeper, however, and explore the relationship between our own senses and how vehicle automation evolves.

When it comes to describing assisted and automated driving technologies, we actually face multiple problems. The first is that marketers love to come up with branding that creates the impression they are selling you something you can’t get from any other company. To that end, when shopping for a new ride, you may hear terms like intelligent cruise control, smart speed control, dynamic radar cruise control, and more. All of these refer to exactly the same capability: a forward-facing sensor measures the gap and closing speed to the vehicle ahead, and the car automatically adjusts its speed to maintain a safe gap. The generic term is adaptive cruise control.
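To make that capability concrete, here is a minimal sketch of the gap-keeping logic behind adaptive cruise control, using the common constant time-gap policy. The 1.8-second headway and the two gains are illustrative assumptions for this sketch, not any manufacturer’s actual tuning.

```python
def acc_speed_command(own_speed, gap, closing_speed, set_speed,
                      time_headway=1.8, k_gap=0.2, k_speed=0.5):
    """Return an adjusted speed command (m/s) for adaptive cruise control.

    own_speed     -- our vehicle's current speed (m/s)
    gap           -- measured distance to the vehicle ahead (m)
    closing_speed -- rate at which that gap is shrinking (m/s)
    set_speed     -- the driver's chosen cruise speed (m/s)

    The headway and gains are illustrative, not production values.
    """
    desired_gap = own_speed * time_headway      # a safe gap grows with speed
    gap_error = gap - desired_gap               # positive means extra room
    # Slow down when too close or closing fast; speed up when there is room,
    # but never exceed the driver's set speed and never command reverse.
    adjustment = k_gap * gap_error - k_speed * closing_speed
    return max(0.0, min(set_speed, own_speed + adjustment))
```

For example, a car doing 25 m/s with only a 30 m gap (the policy wants 45 m) gets a lower speed command, while a car with a large, stable gap simply cruises at the driver’s set speed.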

The other main problem is the misuse of the automated driving levels as defined in the SAE J3016 standard. Like other industry standards published by SAE International, J3016 was developed by a volunteer committee of members and put to a vote before publication. As a member of SAE since my college days in the late-1980s, I’m a supporter of the organization, and most of their standards are well reasoned and have helped move the industry forward.

This particular standard defines six levels of driving automation, from Level 0, where features are limited to warnings and momentary assistance, to Level 5, full automation without human supervision or intervention in all conditions.

The problem is that the way each of SAE Levels 1-4 is classified leaves far too much ambiguity and does little to define the capabilities in a way that is useful to ordinary people.

Rather than numerical levels that have no inherent meaning, we need a completely different approach to describing the functionality of assisted and automated driving systems. A variety of alternatives to the SAE levels have been suggested over the years, but they often involve jargon that doesn’t immediately express the capabilities to ordinary drivers.

At a minimum, we generally use four main parts of our bodies while driving: feet, hands, eyes and brain. While there are adaptations available for those with certain physical differences such as hand controls for those without the ability to use feet, the functional mode of a vehicle can be well described by the on/off state of each of these four body parts.

I’ve used this way of describing assisted and autonomous driving functionality during conference talks and interviews for years, but I have no recollection of where I first heard this system and lay no claim to creating it. If it was you, please let me know and I’ll provide full credit.


Feet On, Hands On, Eyes On, Brain On

This is equivalent to SAE Level 0, where all aspects of the driving task fall to the human. The driver must work the accelerator, brakes, and clutch; handle the steering and shifting; watch the road; and use their brain to interpret the environment and make decisions.


Feet Off, Hands On, Eyes On, Brain On

This category describes driver assist systems that allow the vehicle to control speed while the driver remains responsible for everything else. The driver must watch the road, retains full responsibility for directional control, and must also supervise the speed the system selects. It includes basic driver assist systems such as cruise control and lane departure warnings, which require hands on the steering wheel. This group spans both SAE Levels 1 and 2, but the key is that these systems are designed to assist or augment the human driver, not take over.


Feet Off, Hands Off, Eyes On, Brain On

This category allows the driver to take their hands off the steering wheel but still requires them to watch the road and be ready to take back full control at any moment. These systems technically fall into SAE Level 2, though many companies have taken to calling them “L2+,” even though no such level is actually defined. Despite relieving some of the driving task, this is still not autonomous or self-driving.


Feet Off, Hands Off, Eyes Off, Brain On

This is the category where things get more complicated. It generally corresponds to SAE Level 3, and while J3016 groups this with automated rather than assisted driving, it should really be considered assistive.

This is the first category where the driver no longer has to actively watch the road at all times. The system can handle most of the driving under certain limited circumstances, such as low-speed, stop-and-go traffic on a highway. While in this mode, the driver can read, text, or watch videos.

However, as soon as the vehicle exceeds whatever the design limits of the system are, it will request that the human resume control within a few seconds. That means the driver can’t climb in the back seat and take a nap. The hand-off period, plus the time it takes the driver to regain full situational awareness, is a potentially dangerous window in which neither the software nor the human is fully capable. Needless to say, this is still not self-driving or autonomous.


Feet Off, Hands Off, Eyes Off, Brain Off

This final category is the only one where the vehicle is capable of autonomous driving without any human supervision or requirement for a takeover. It can include vehicles with no human occupants on board, such as an automated delivery vehicle or a robotaxi. Even when people are on board, they may take a nap or do other activities. These vehicles need fail-operational capabilities so they can continue operating, at least until they can stop in a safe location, even after a fault has been detected. That means redundant sensing systems, compute platforms, and the actuators that make the vehicle go, stop, and steer. This category includes vehicles that are geographically limited (SAE Level 4) as well as those that can go anywhere (SAE Level 5).

These five new categories aren’t perfect either, but they give the people using vehicles a better understanding of what the vehicle can do and what responsibilities they retain as a driver or passenger in each mode. There is still a spectrum of capability within each category, and a single vehicle may well be usable in multiple modes.
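The whole framing can be lined up in a simple lookup. The category names and SAE mappings below follow the article’s groupings; the dictionary itself is just an illustrative way to show that only one category ever qualifies as self-driving.

```python
# Each category is defined by which of the driver's four "inputs"
# (feet, hands, eyes, brain) must stay engaged. SAE level mappings
# follow the article's groupings; the data structure is illustrative.
DRIVING_MODES = {
    "Feet On, Hands On, Eyes On, Brain On":     {"sae_levels": [0],    "driver_must_watch": True},
    "Feet Off, Hands On, Eyes On, Brain On":    {"sae_levels": [1, 2], "driver_must_watch": True},
    "Feet Off, Hands Off, Eyes On, Brain On":   {"sae_levels": [2],    "driver_must_watch": True},
    "Feet Off, Hands Off, Eyes Off, Brain On":  {"sae_levels": [3],    "driver_must_watch": False},
    "Feet Off, Hands Off, Eyes Off, Brain Off": {"sae_levels": [4, 5], "driver_must_watch": False},
}

def is_self_driving(mode: str) -> bool:
    """Only the last category is autonomous: brain off, no supervision,
    and no requirement to take over. Eyes Off alone (Level 3) is not enough."""
    return "Brain Off" in mode and not DRIVING_MODES[mode]["driver_must_watch"]
```

Note that `is_self_driving` is false for the Level 3 category even though the driver’s eyes can leave the road, which is exactly the distinction the article argues gets lost in marketing terms like “L2+.”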

In addition to feet, hands, eyes, and brain, we often use at least one other sense while driving: our ears. While listening is not strictly essential to driving, and people with hearing loss can certainly drive safely, hearing can aid situational awareness.

Telematics systems can already “listen” to the cloud to provide traffic and road condition alerts. As V2X or vehicle-to-everything communications becomes more common in the coming years, our cars will be listening for signals from other vehicles, road infrastructure and vulnerable road users to provide alerts to drivers. Automated vehicles are also already beginning to use external microphones to listen for emergency vehicles.

Unless and until we achieve automated driving in all conditions, we will need to understand what the vehicle can do in any given mode. Clear communication is therefore critical as this technology becomes more pervasive.
