4D imaging radar: a “catfish” entering the field of L2+ autonomous driving

The advent of 4D imaging radar technology for automotive sensing and autonomous driving applications has changed the timeline and economics of the industry’s evolution from L0 cars to fully automated L5 cars. Radar now offers new capabilities that enable accurate environmental mapping, significantly enhancing a car’s overall sensing and perception, and the automotive industry has placed high expectations on radar’s future role alongside camera and lidar sensors.

Ultra-high-resolution radar mapping has proven to be an attractive and cost-effective alternative to lidar. The arrival of commercially viable 4D imaging radar technology will have a significant impact on the ADAS sensor portfolio for L2+ and higher vehicle deployments. Like a catfish entering the field of autonomous driving, it is set to disrupt the entire technology and market landscape.

Across a range of performance and reliability metrics, imaging radar has closed the gap with lidar, and even surpasses it on some, while its commercial cost structure gives it an advantage lidar cannot match. As these sensor technologies begin to overlap in functionality, their respective roles and costs deserve a detailed assessment.

At the same time, the automotive industry stands at a critical juncture in the transition from L2 to L3 safety and automation, and key questions have arisen about the timing and duration of that upgrade. The L2+ level has become the new hotspot, and OEMs are working hard to solve the complex design issues that must be resolved before reaching L3.

The cost of full L3 autonomous driving remains considerable, mainly because of the system redundancy required once the driver is no longer expected to be on standby. L2+ autonomous driving, by contrast, has attracted strong attention and experienced strong growth precisely because the driver remains on standby while L3-like functions are provided, which reduces the need for additional redundancy.

Levels of complexity

SAE defines levels of driving automation from L0 to L5, a framework that is very helpful for evaluating the potential impact of 4D imaging radar on ADAS and AD applications, especially given the large difference between L2 and L3. In an L2 car, the driver must remain attentive at all times; the driver is ultimately responsible for the safety of the car and is held liable in the event of an accident. Starting at L3, however, in-vehicle safety automation becomes robust enough that responsibility for safety shifts to the car OEM.

There are also important distinctions between L3 and L4/L5. At L3, driver intervention is still required in some situations; at L4/L5, the driver intervenes only if requested, and in some L5 use cases the driver cannot intervene at all. L4 and L5 cars must, at a minimum, be able to bring themselves to a safe stop in any situation without human intervention.

These levels of autonomous driving impose new system redundancy requirements as more of the driving responsibility shifts to the car. At L3, the driver must be able to take over the car in challenging traffic conditions, while remaining “eyes off and hands off” in other situations.

In these scenarios, it may take up to a minute for the driver to fully retake control of the car, and this function of safely handing control back from the car to the driver demands significantly more redundancy, which adds system complexity and cost.

Therefore, the number and configuration of camera, radar, and lidar sensors required per vehicle to achieve L3 performance, and how far this departs from a typical L2 sensor configuration, can have a large impact on OEM manufacturing costs.

Figure 1: Levels of Advanced Driver Assistance Systems and Autonomous Driving

This helps explain why the L2+ level exists: its purpose is to help OEMs minimize the cost increase over L2 while beginning to offer customers advanced ADAS capabilities close to L3, without fully crossing the L3 line, since that would shift responsibility from the driver to the OEM. L2+ can make full use of the sensors and semiconductor components associated with L3 while keeping manufacturing costs at the L2 level, avoiding the additional cost of the system redundancy that L3 requires to hand control back from the car to the driver. At the same time, on the road to L3, L4, and L5, OEMs are striving for market differentiation, and over the next few years many will introduce new safety and comfort features that benefit consumers.

L2+: The next key battlefield

These new safety and comfort features are concentrated at the L2+ level, at a price consumers are comfortable with. The central question is whether consumers are willing to pay a higher price for the additional system redundancy needed to reach L3.

For OEMs, the L2+ level allows them to avoid the substantial costs of addressing L3 redundancy requirements and corner cases, costs that would reduce the car’s competitiveness in the market. L2+ also lets OEMs roll out advanced safety and comfort features gradually, allowing more time for sensor technology to mature and reach commercial-grade adoption at higher levels of autonomous driving. At this transitional level, the driver continues to provide the necessary redundancy, and OEMs can strike a better balance between comfort features and cost.

As they approach L3, OEMs must seriously consider some important questions: if the cost burden of implementing L3 system redundancy is similar to the expected cost burden of L4, why stop at L3? Are customers willing to pay more for L3 safety system redundancy if they still need to keep their attention on driving? While OEMs may not agree on these questions, it is still reasonable to assume that L2+ car production will far exceed L3 production for years to come.

A recent Yole Développement report suggests that the market penetration of L4/L5 vehicles will remain in the single digits until at least 2030, with a portion of these vehicles used as robotic vehicles. Meanwhile, as the market penetration of L0-L2 vehicles begins to decline, adoption of L2+ vehicles will continue to grow steadily, likely reaching nearly 50% of the market by 2030. Over the next decade, therefore, L2+ vehicles are expected to be the focus of automotive OEMs.

Figure 2: Market Penetration Forecast for Autonomous Vehicles (2021-2030)

Three sensors, no single solution is perfect

To fully understand the benefits of 4D imaging radar for L2+ vehicles, we take a high-level look at the three main sensing technologies that enable ADAS and AD: camera, radar, and lidar. Ultimately, no single solution is perfect; each of the three has advantages and disadvantages, and they complement each other and provide redundancy for the other sensor types.

Cameras and radar sensors are already widely deployed today because the two technologies are mature, affordable, and complementary. Lidar sensors are not functionally complementary to cameras and radar; instead, they serve as redundancy for both.
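To make this complementarity concrete, here is a minimal sketch (not NXP’s fusion stack; the field names and the simple fallback rule are illustrative assumptions) of how a fused object track might take classification and bearing from the camera and range and radial speed from radar, and how either sensor alone still yields a degraded but usable track when the other drops out:

```python
# Illustrative only: a toy camera/radar fusion rule, not a production stack.
from dataclasses import dataclass
from typing import Optional

@dataclass
class CameraDetection:
    object_class: str        # cameras excel at classification ("pedestrian", "sign", ...)
    bearing_deg: float       # angular position, precise thanks to megapixel resolution

@dataclass
class RadarDetection:
    range_m: float           # radar measures distance directly
    radial_speed_mps: float  # Doppler yields speed in a single measurement

@dataclass
class FusedTrack:
    object_class: str
    bearing_deg: float
    range_m: float
    radial_speed_mps: float

def fuse(cam: Optional[CameraDetection],
         radar: Optional[RadarDetection]) -> Optional[FusedTrack]:
    """Combine complementary strengths; if one sensor drops out (fog, darkness,
    blockage), the other still provides a degraded but usable track."""
    if cam and radar:
        return FusedTrack(cam.object_class, cam.bearing_deg,
                          radar.range_m, radar.radial_speed_mps)
    if radar:  # camera blinded: keep range/speed, class and bearing unknown
        return FusedTrack("unknown", float("nan"),
                          radar.range_m, radar.radial_speed_mps)
    if cam:    # radar blocked: keep class/bearing, no reliable range or speed
        return FusedTrack(cam.object_class, cam.bearing_deg,
                          float("nan"), float("nan"))
    return None

print(fuse(CameraDetection("pedestrian", 4.2), RadarDetection(35.0, -1.3)))
```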

The combination of RGB color information and megapixel resolution makes the camera sensor indispensable for “reading” traffic signs and similar tasks, and for improving the accuracy of object recognition and classification.

But the efficiency and reliability of camera technology can be severely compromised under difficult lighting conditions, as well as in harsh weather and road conditions. New technologies on the market can automatically remove moisture and dust from automotive camera lenses, but these mechanisms increase bill-of-materials cost and introduce mechanical failure points that affect system stability.

The camera’s ability to measure distance and speed also remains significantly limited. Velocity and depth can be estimated from a stereo camera configuration, but the accuracy is limited and must be compensated by the radar layer.
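A rough back-of-the-envelope comparison illustrates why. Assuming example values for focal length, stereo baseline, disparity noise, and radar chirp bandwidth (none taken from the white paper), stereo depth error grows with the square of range while radar range resolution stays constant:

```python
# Illustrative arithmetic only; parameter values are assumed, not from the paper.
C = 3e8  # speed of light, m/s

def stereo_depth_error(range_m, focal_px=1000.0, baseline_m=0.3, disparity_noise_px=0.25):
    """Stereo depth Z = f*B/d, so a disparity error dd maps to a depth error
    dZ ~= Z^2 / (f*B) * dd -- it grows quadratically with range."""
    return (range_m ** 2) / (focal_px * baseline_m) * disparity_noise_px

def radar_range_resolution(sweep_bandwidth_hz=1e9):
    """FMCW radar range resolution c / (2*B) is independent of distance."""
    return C / (2 * sweep_bandwidth_hz)

for r in (20, 50, 100, 150):
    print(f"{r:3d} m: stereo depth error ~{stereo_depth_error(r):5.2f} m, "
          f"radar range cell ~{radar_range_resolution():.2f} m")
# The stereo error climbs from ~0.3 m at 20 m to ~19 m at 150 m, while the
# radar cell stays at ~0.15 m -- hence the compensation from the radar layer.
```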

Lidar: Delivering performance benefits to handle extreme situations

The main differentiating feature of lidar is its ultra-precise angular resolution, down to the 0.1-degree level both horizontally and vertically, in addition to high-resolution distance measurement, thanks to its use of extremely short wavelengths and pulses. These advantages make lidar ideal for high-resolution 3D environmental mapping, enabling precise detection of space, boundaries, and the car’s own position.
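A quick bit of arithmetic shows what a 0.1-degree resolution cell means in practice (the distances below are illustrative examples, not figures from the white paper):

```python
# Illustrative arithmetic: lateral size of a 0.1-degree resolution cell vs. range.
import math

def cross_range_cell_m(range_m, angular_resolution_deg=0.1):
    """Small-angle approximation: cross-range cell ~= range * resolution in radians."""
    return range_m * math.radians(angular_resolution_deg)

for r in (50, 100, 200):
    print(f"at {r:3d} m, a 0.1-degree cell spans ~{cross_range_cell_m(r) * 100:.0f} cm")
# Roughly 9 cm at 50 m, 17 cm at 100 m and 35 cm at 200 m -- fine enough to
# separate a pedestrian from an adjacent vehicle, which is why lidar maps so precisely.
```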

However, lidar shares some of the camera sensor’s disadvantages. Compared with radar sensors, lidar is very limited in its ability to estimate speed and detect objects at long range. It is also susceptible to severe weather and road conditions, which incurs higher costs to address stability and maintenance challenges.

Over the past few years, a variety of new lidar designs have come to market, such as solid-state, MEMS, and electronically scanned lidars. These new technologies aim to make lidar more “friendly” for automotive applications in terms of size, cost, and stability. They are a big improvement over mechanically rotating lidar, but overall it will be a while before they match the maturity of other ADAS sensors.

The biggest obstacle to widespread lidar adoption in mainstream passenger vehicles is cost. According to recent OEM assessments, in 2021 a lidar in small-scale deployment cost about ten times as much as a 12-TX, 16-RX imaging radar built from four cascaded radar transceivers. While the cost of both lidar and radar will decline over time, lidar is still expected to cost twice as much as radar by 2030, even when used at scale in advanced automation use cases.
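For context, the short sketch below works out why cascading four transceivers yields an “imaging” class radar. The 3-TX/4-RX split per chip is an assumption used for illustration only, though it is consistent with the 12-TX/16-RX total quoted above:

```python
# Illustrative only: virtual-channel count of a cascaded MIMO radar.
def mimo_virtual_channels(n_chips=4, tx_per_chip=3, rx_per_chip=4):
    """In a MIMO radar every TX/RX pair acts as one virtual antenna, so the
    virtual array grows multiplicatively: (chips*TX) * (chips*RX)."""
    total_tx = n_chips * tx_per_chip
    total_rx = n_chips * rx_per_chip
    return total_tx, total_rx, total_tx * total_rx

tx, rx, virt = mimo_virtual_channels()
print(f"{tx} TX x {rx} RX -> {virt} virtual channels")
# A single 3x4 transceiver gives 12 virtual channels; cascading four gives 192,
# a 16x larger virtual aperture and correspondingly finer angular resolution --
# at a fraction of lidar's cost.
```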

Going forward, lidar will still offer performance advantages to handle extreme situations that arise in complex driving scenarios. Therefore, it will still be an important part of the redundancy necessary for L4 and L5 autonomous driving at an acceptable price.

Read the full white paper

This article is excerpted from NXP’s latest white paper, “4D Imaging Radar: A Sensor Ideal for L2+ Autonomous Vehicles”. Building on this overview of the status quo and development trends in autonomous driving, the following chapters analyze the advantages of 4D imaging radar and its application in a series of highly challenging scenarios in greater detail.

The Chinese version of this white paper is available for download and contains the following chapters:

Levels of complexity
L2+: The next key battlefield
Three sensors, no single solution is perfect
Lidar: Delivering performance benefits to handle extreme situations
4D Imaging: The Next Leap in Radar
Challenging use cases

Conclusion

The author of this article is Huanyu Gu, Senior Manager of ADAS Product Marketing and Business Development at NXP Semiconductors.
