Radar technology has continued to develop over the decades, becoming a crucial enabling technology for automotive functional safety. The automotive radar market is projected to exceed $18 billion by 2033, driven largely by the rollout of advanced driver assistance systems (ADAS). Many functions of modern automobiles, such as automatic emergency braking, forward collision warning, blind spot detection, lane change assist, rear collision warning, adaptive cruise control, and automatic stop-and-go, rely on radar.
For years, improving radar resolution has been a major challenge for engineers, and recent innovations have enabled radar to deliver more accurate target information. Traditional 3D automotive radar sensors use radio-frequency signals to measure an object's range, angular position, and Doppler shift (radial velocity). To enhance radar's role in the safety value chain and support autonomous driving, the industry keeps pushing the boundaries of 3D radar. Since 2022, the European Telecommunications Standards Institute (ETSI) and the US Federal Communications Commission (FCC) have put spectrum regulations and standards in place under which Europe and the US are phasing out the 24GHz ultra-wideband (UWB) radar band while opening up a continuous 5GHz band from 76GHz to 81GHz. The 76-77GHz band is used for long-range detection, while the 77-81GHz band is used for short-range, high-precision detection. Because range resolution is determined by sweep bandwidth, radar systems at higher frequencies with wider available bandwidths resolve range more finely: a 24GHz system with roughly 200MHz of bandwidth has a range resolution of 75cm, while a 77GHz system sweeping 4GHz improves it to about 4cm, enabling closely spaced targets to be distinguished. This has driven the development of 4D radar, which extends 3D radar data with elevation measurement, providing more accurate and detailed spatial information such as the vertical position of objects. The emergence of 4D imaging radar allows autonomous vehicles to detect small objects at higher resolution, build more complete "all-around" environmental maps, correctly interpret objects in the vertical dimension, and avoid misjudgments.
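The resolution figures quoted above follow directly from the standard relation between range resolution and sweep bandwidth, Δr = c / (2B). A minimal sketch (the 200MHz figure for 24GHz radar is an assumption consistent with the 75cm number, not stated in any regulation cited here):

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(bandwidth_hz: float) -> float:
    """Range resolution delta_r = c / (2 * B) for a swept-bandwidth radar."""
    return C / (2.0 * bandwidth_hz)

# 24 GHz radar with ~200 MHz of usable bandwidth
print(f"{range_resolution(200e6):.3f} m")  # ~0.75 m
# 77 GHz radar sweeping the full 4 GHz (77-81 GHz)
print(f"{range_resolution(4e9):.4f} m")    # ~0.0375 m, i.e. roughly 4 cm
```

The tenfold-plus improvement comes entirely from the twenty-fold wider sweep, which is why the move to the 76-81GHz band matters more for resolution than the higher carrier frequency itself.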
Human drivers rely on a combination of visual, auditory, and experiential information to navigate complex traffic environments, while autonomous vehicles depend on radar sensors, cameras, lidar, and vehicle-to-everything (V2X) systems for accurate data about the traffic environment. These data streams feed ADAS or autonomous driving algorithms, helping the vehicle perceive the relative positions and speeds of objects and triggering passive or active responses from the control algorithms.
Currently, automakers and radar module providers test radar module functionality with both software and hardware. Hardware testing primarily employs two methods: one uses corner reflectors to represent static targets, but these must be physically repositioned whenever the scene changes; the other uses a radar target simulator (RTS), which electronically emulates radar targets, both static and dynamic, along with their associated parameters. However, in complex scenarios with more than 32 targets, RTS-based functional testing reaches its limits. It also cannot assess the ability of 4D and imaging radar to detect extended targets (objects represented by point clouds rather than single points) and fails to capture the full complexity of the real world.
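Conceptually, an RTS emulates a point target by delaying, Doppler-shifting, and attenuating the radar's own transmitted signal. A hedged sketch of the underlying arithmetic (an illustrative model only, not any vendor's API):

```python
C = 299_792_458.0  # speed of light, m/s

def rts_target_params(range_m: float, radial_velocity_ms: float,
                      carrier_hz: float = 77e9):
    """Delay and Doppler shift an RTS would apply to emulate one point
    target at the given range and closing speed (simplified model)."""
    delay_s = 2.0 * range_m / C                          # round-trip delay
    doppler_hz = 2.0 * radial_velocity_ms * carrier_hz / C
    return delay_s, doppler_hz

# A vehicle 100 m ahead, closing at 30 m/s, seen by a 77 GHz radar:
delay, doppler = rts_target_params(100.0, 30.0)
# delay is well under a microsecond; doppler is on the order of 15 kHz
```

Each simulated target needs its own delay/Doppler channel, which is one reason hardware RTS setups hit a practical ceiling (around 32 targets) long before real traffic scenes do.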
Machine learning helps developers train ADAS algorithms to better interpret and classify data from sensor systems such as radar. "You only look once" (YOLO)-based radar target detection methods can simultaneously and accurately detect and segment multiple objects. Rigorous testing of physical radar sensors and ADAS algorithms is crucial before autonomous driving systems reach the road. Automakers are using radar scene simulation to "bring" real-world road scenarios into the lab for 360-degree, full-surround testing. New radar scene simulation technologies use ray tracing and point cloud techniques to extract data from highly realistic traffic simulation scenarios, enabling better object detection and differentiation. Through new millimeter-wave (mmWave) over-the-air (OTA) technology, radar scene simulators can generate multiple static and dynamic targets at different distances and speeds, providing more realistic scenarios for radar sensor testing.

In radar scene simulation, sensors and algorithms can go through rapid design iterations, correcting errors and fine-tuning designs; this greatly supports pre-road testing and helps automakers verify vehicle-integration effects, such as the impact of bumper design on radar performance. Autonomous driving platform providers and radar system manufacturers can improve vehicles' perception of varied real-world traffic scenarios through repeatable, customizable scenarios, providing a wealth of data for machine learning in autonomous driving algorithms. High-speed digital signal processing (DSP) also plays a crucial role in refining radar detection results: it can collect detailed information about targets such as pedestrians and support training radar algorithms to identify them.
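The point-cloud processing mentioned above typically starts by grouping individual radar detections into extended targets. A minimal sketch of distance-threshold clustering, assuming 2D (x, y) detections in metres (naive single-linkage for illustration; production stacks usually use algorithms such as DBSCAN):

```python
def cluster_point_cloud(points, eps=1.0):
    """Group radar detections into extended targets: detections closer
    than `eps` metres end up in the same cluster (naive single-linkage)."""
    clusters = []
    for p in points:
        # Find every existing cluster this point touches.
        touching = [c for c in clusters
                    if any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= eps ** 2
                           for q in c)]
        merged = [p]
        for c in touching:          # merge all touched clusters with the point
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)
    return clusters

# Two nearby detections (one vehicle) plus one distant pedestrian:
detections = [(0.0, 0.0), (0.5, 0.2), (10.0, 4.0)]
print(len(cluster_point_cloud(detections)))  # 2 extended targets
```

Each resulting cluster can then be summarized (extent, centroid, mean Doppler) and handed to the classification stage, which is the step a point-target RTS cannot exercise.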
From chip design to manufacturing and radar module testing, every stage of automotive radar design, development, and manufacturing requires rigorous testing. The use of millimeter-wave frequencies in automotive radar applications presents numerous challenges, but it also drives continuous innovation and development in radar technology, evolving it into a super sensor. In the future, with continued technological advancements, automotive radar is expected to possess even more powerful functions, such as more accurate target recognition and more intelligent decision support. Deep integration with other sensors will bring higher safety and reliability to autonomous driving, playing a greater role in the field of intelligent transportation and unlocking limitless possibilities.