Self-driving non-sequitur: Level 4 autonomy requires more intelligence than Level 5 (!)

Original article was published by Michael Ali on Artificial Intelligence on Medium



Michael Ali

Greg Martin

August 2020

The SAE levels of driving automation ([1] and Appendix) define self-driving capabilities ranked in order of increasing artificial intelligence (AI) exhibited by motor vehicles. Level 0 is no automation: the human controls everything. At Level 5, automation controls everything, all the time. One implication is that lower levels are less capable, from an AI standpoint, than upper levels. Another implication is that the design/engineering effort to create the AI increases as the level increases. Unfortunately, Level 4 autonomy requires more AI capability and more design/engineering effort to implement than Level 5.

In theory, Level 4 autonomy is simpler because the AI only operates in some bounded environment, whereas the Level 5 AI must operate in every environment that a human being does. In practice, however, Level 4 requires the AI not only to operate within the bounded environment, but also to recognize when it is outside those boundary limits. Recognizing limit violations requires the Level 4 AI to have three reasoning capabilities beyond those required for Level 5:

#1 knowing what it doesn’t know (meta knowledge)

#2 identifying operational boundaries

#3 transitioning to human control or another fallback when it finds itself operating outside those boundaries

Capability #1: Knowing what it doesn’t know (meta knowledge)

Level 4 is defined as “The sustained and ODD-specific performance by an ADS of the entire DDT and DDT fallback without any expectation that a user will respond to a request to intervene.” [1]

Where:

DDT: dynamic driving task

ADS: automated driving system

ODD: operational design domain

A Level 5 AI has no more limitations than a human driver, whereas a Level 4 AI is designed to have limits. So the Level 4 AI must know what those limits are, as defined by the operational design domain (ODD). Example domains include highway driving in clear weather, a geofenced retirement community, or a school zone at night. The task of defining ODDs raises serious questions, not the least of which is how different vehicle manufacturers will standardize on common definitions [2]. However, let’s assume that a way to define ODDs exists, along the lines of Figure 1 [3], which purports to show what an ODD must cover. The meta-knowledge issue is this: a Level 4 vehicle must know not only how to operate inside its ODD, but also how to stay inside its ODD and how to detect when it is, or might soon be, outside of it. That requires meta knowledge about the ODD itself. Let’s demonstrate this point via a simple example.

Figure 2 shows a driving domain consisting of three attributes: amount of rain (none to heavy), type of road (highway, street, dirt), and vehicle speed (low, medium, high). Using these three dimensions, we have defined an ODD that includes highway and street driving, in anything from no rain to heavy rain, at low to high speed. The ODD also includes dirt roads, but only up to light rain, at no more than medium speed.

Now consider a vehicle traveling at high speed on a street in light rain, heading toward a dirt road as in Figure 3. Approaching the dirt road in the rain, a Level 5 vehicle, like a human, would slow down. Let’s say that conditions indicate that slowing to somewhere between medium and high speed is sufficient. Then, as it moved onto the dirt road, the Level 5 vehicle would take further action upon detecting any sliding or slippage. A Level 4 vehicle, because of the limits defined by the ODD, must reason about the situation differently. As it approaches the dirt road, the Level 4 vehicle will also determine that the right action is to slow down. However, if it only slowed to somewhere between medium and high speed, then as it entered the dirt road it would exit the ODD (it can only handle a dirt road in light rain at low to medium speed). Examining the boundaries of the ODD, the Level 4 vehicle determines that it can stay inside the ODD by slowing to medium speed before it moves onto the dirt road. The Level 4 vehicle must understand both the road conditions and the ODD to make the right response.

This simple example demonstrates that a Level 4 vehicle must perform sophisticated ODD boundary analysis in addition to the normal reasoning about changing road conditions. The analysis requires both the information in the ODD and meta knowledge about the boundary of the ODD and how to stay within it. A general real-world ODD will have far more than three dimensions, with discontinuous, non-convex boundaries. Finding the right action(s) to stay within such an ODD therefore becomes a daunting task. A potential answer might be to skip the analysis and simply exit the ODD, executing a fallback behavior. However, that would mean the vehicle exits its ODD regularly, which exacerbates the issues raised by capabilities #2 and #3, discussed next.
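As a concrete sketch, the example ODD of Figure 2 and the dirt-road boundary analysis can be expressed as a membership test plus a search for the fastest speed that keeps the upcoming road segment inside the ODD. Everything here (the discrete scales, the predicate, and the helper names) is invented for illustration, not drawn from J3016 or any real ADS:

```python
# Hypothetical sketch of the Figure 2 ODD and the dirt-road boundary analysis.
# The discrete rain and speed scales are invented for illustration.
RAIN = ["none", "light", "moderate", "heavy"]
SPEED = ["low", "medium", "high"]

def in_odd(road: str, rain: str, speed: str) -> bool:
    """Membership test: the operational knowledge of the example ODD."""
    if road in ("highway", "street"):
        return True  # any rain, any speed
    if road == "dirt":
        # Dirt roads: up to light rain, at low to medium speed only.
        return (RAIN.index(rain) <= RAIN.index("light")
                and SPEED.index(speed) <= SPEED.index("medium"))
    return False  # an unknown road type is outside the ODD

def speed_to_stay_in_odd(next_road: str, rain: str):
    """Meta-level boundary analysis: the fastest speed that keeps the
    upcoming segment inside the ODD, or None if no speed will do."""
    for speed in reversed(SPEED):  # try high, then medium, then low
        if in_odd(next_road, rain, speed):
            return speed
    return None  # no speed works: the vehicle must exit the ODD and fall back

# Street in light rain, dirt road ahead: slow to medium before entering.
print(speed_to_stay_in_odd("dirt", "light"))   # medium
# In heavy rain, no speed keeps a dirt road inside the ODD.
print(speed_to_stay_in_odd("dirt", "heavy"))   # None
```

Note that the meta-level helper needs the boundary structure of the ODD, not just point membership; in a real ODD with many discontinuous dimensions this search is exactly the daunting task described above.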

Figure 1. Constituent parts of an operational design domain (ODD)
Figure 2. Example ODD with boundaries
Figure 3. Two-lane highway becoming a street, then a dirt road

Capability #2: Identifying the boundaries

By definition, a Level 4 vehicle has to know when it is outside of its ODD. However, the transitions between inside and outside may be hard to define and/or detect. Boundaries defined by geofencing or coordinates are easy, but ODDs go beyond those. Consider the example of a Level 4 vehicle that only does highway driving. What defines leaving a highway? Taking an exit, or coming to a stop after leaving via an exit? Some highways terminate in a stoplight or a stop sign, but not all; some simply become one- or two-lane roads. Another example: a Level 4 vehicle that cannot handle snowy conditions. Is “snowy” the appearance of the first snowflake, the level of slipperiness detected upon hitting the brakes, the degree of visibility, or all three? A Level 5 car just has to handle the conditions; a Level 4 car has to handle the conditions it is designed for AND know when those conditions no longer apply.
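To make the “snowy” boundary concrete, a designer might fuse several imperfect signals into a single boundary decision. The signal names and thresholds below are invented for illustration; real condition monitors are far more involved:

```python
# Hypothetical sketch: deciding whether "snowy" conditions (outside the ODD)
# apply, by fusing several imperfect signals. All thresholds are invented.
def snowy_conditions(snowfall_rate_mm_h: float,
                     wheel_slip_ratio: float,
                     visibility_m: float) -> bool:
    """True if any signal suggests the vehicle has left its fair-weather ODD."""
    falling_snow = snowfall_rate_mm_h > 0.5    # more than trace snowfall
    slippery = wheel_slip_ratio > 0.15         # slip detected under braking
    low_visibility = visibility_m < 150        # visibility degraded
    # Conservative policy: any single indicator triggers the boundary.
    return falling_snow or slippery or low_visibility

print(snowy_conditions(0.0, 0.02, 500))   # False: clear conditions
print(snowy_conditions(1.2, 0.02, 500))   # True: snow is falling
```

Even this toy detector shows the design burden: each threshold is a boundary definition the Level 4 designer must commit to, and each one the Level 5 designer never has to write down.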

Capability #3: Transitioning to a fallback control

When the Level 4 vehicle finds itself outside of the ODD (noting again the difficulty of identifying the boundaries), it has to transition to some other operational mode. Per the definition of Level 4, it cannot be assumed that a human driver is available to take over, so the AI must have some other fallback mode. This fallback requires the vehicle to handle, at least with some minimal skill, a situation for which it is not designed. A Level 5 vehicle only needs a fallback when a component fails (e.g., a sensor breaks, a tire goes flat, loss of actuation). A Level 4 vehicle needs fallbacks for system failures and for leaving the ODD. So the Level 4 vehicle must be skilled within its ODD, but also “skilled enough” when it finds itself outside of its ODD.
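One way to structure this requirement is a small supervisory state machine in which component failure and ODD exit are distinct triggers that both funnel into a minimal-risk fallback. This is a hypothetical sketch, not a description of any real ADS architecture; a Level 5 design would need only the failure trigger:

```python
# Hypothetical supervisory loop for a Level 4 vehicle: both component
# failures and ODD exits must transition to a minimal-risk fallback,
# with no assumption that a human driver will take over.
from enum import Enum, auto

class Mode(Enum):
    NOMINAL = auto()   # operating inside the ODD
    FALLBACK = auto()  # minimal-risk maneuver (e.g., come safely to a stop)

def next_mode(mode: Mode, component_failure: bool, inside_odd: bool) -> Mode:
    """Transition rule: any failure OR ODD exit forces fallback.
    A Level 5 design would only need the component_failure trigger."""
    if mode is Mode.FALLBACK:
        return Mode.FALLBACK  # fallback is terminal in this sketch
    if component_failure or not inside_odd:
        return Mode.FALLBACK
    return Mode.NOMINAL

print(next_mode(Mode.NOMINAL, False, True))    # Mode.NOMINAL
print(next_mode(Mode.NOMINAL, False, False))   # Mode.FALLBACK: left the ODD
```

The extra `inside_odd` trigger is the point: it forces the fallback behavior to be competent in conditions the vehicle was, by definition, not designed for.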

To summarize, in comparison to the Level 4 capabilities above, a Level 5 vehicle does not deal with:

A. Knowing what it doesn’t know: a Level 5 car knows everything it needs to know

B. Identifying the boundaries: A Level 5 ODD covers the entire driving domain

C. Transitioning to fallback control after exiting the ODD: a Level 5 car’s ODD is functionally complete; it only needs fallbacks for subsystem or component failures.

Since Level 4 autonomy must address conditions not addressed in Level 5 autonomy, Level 4 must require more intelligence than Level 5. Therefore the SAE Levels cannot serve as a logically consistent maturity model for achieving vehicle autonomy. At issue is the complexity of dealing with general ODDs. To salvage the model, we must limit the Level 4 ODDs to measurable boundaries and proficient adult human driver capabilities within those boundaries.

We recommend a definition such as this:

An ODD consists of only two elements: a geographical boundary and a speed constraint, which can be no greater than 25 miles per hour. Within the ODD, the vehicle can handle normal environmental conditions. The only fallback is to glide to a stop as quickly as possible without putting passengers or pedestrians at risk (i.e., in the case of a system failure, an exceptional condition, or an exceeded boundary limit, the vehicle comes to a stop).
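Both elements of this narrowed definition are directly measurable, so the ODD membership test collapses to two checks. A minimal sketch, with an invented rectangular geofence standing in for a real boundary polygon:

```python
# Sketch of the proposed narrowed ODD: a geographic boundary plus a
# 25 mph speed cap. The rectangular geofence and its coordinates are
# invented stand-ins for a real boundary polygon.
GEOFENCE = (37.0, -122.1, 37.1, -122.0)  # (lat_min, lon_min, lat_max, lon_max)
SPEED_CAP_MPH = 25.0

def inside_narrowed_odd(lat: float, lon: float, speed_mph: float) -> bool:
    """Both checks are directly measurable, unlike a general ODD."""
    lat_min, lon_min, lat_max, lon_max = GEOFENCE
    in_fence = lat_min <= lat <= lat_max and lon_min <= lon <= lon_max
    return in_fence and speed_mph <= SPEED_CAP_MPH

print(inside_narrowed_odd(37.05, -122.05, 20.0))   # True: in fence, under cap
print(inside_narrowed_odd(37.05, -122.05, 30.0))   # False: over the speed cap
```

Compared with the multi-dimensional boundary analysis earlier, there is nothing here for the vehicle to be uncertain about: position and speed are continuously measured, and crossing either limit unambiguously triggers the single stop fallback.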

This narrowed definition of an ODD has the following implications:

A. Level 4 highway driving would be precluded, because of the speed limit, the need to handle any and all conditions (such as weather, lane closures and shifts, etc.), and the fact that the only available fallback action is stopping

B. It gives the passenger a good understanding of the vehicle’s capabilities

C. Since there is only one fallback behavior, the vehicle manufacturer can more easily validate that foreseeable contingencies are addressed

D. A vehicle can execute multiple ODDs, defined by geography

In conclusion, a Level 4 vehicle running a generalized ODD requires a higher level of intelligence than a Level 5 vehicle. This is a non sequitur in the SAE Levels of Driving Automation. ODDs need to be confined to measurable boundaries with simple fallbacks for the model to be logically consistent.

REFERENCES

[1] SAE, Surface Vehicle Recommended Practice Rationale, J3016™, September 2016

[2] Eliot, Lance, “Key To Driverless Cars, Operational Design Domains (ODD), Here’s What They Are, Woes Too”, https://medium.com/@lance.eliot/key-to-driverless-cars-operational-design-domains-odd-heres-what-they-are-woes-too-a0f1059e0bdb, last accessed: 08/09/2020.

[3] Ray, Paul, and Eric Thorn, “A Framework for Automated Driving System Testable Cases and Scenarios”, National Highway Traffic Safety Administration, United States, Paper 19-0301

APPENDIX