
Unmanned Delivery Vehicle Navigation System: A Zonal Perception Strategy Using LiDAR and Ultrasonic Sensors

2026-04-06

I. Sensor Characteristics and Complementarity Analysis

LiDAR (Light Detection and Ranging) constructs a 3D point cloud of the environment by emitting laser pulses and measuring their time of flight. Its core advantages are long range (0.1-200 meters), high precision (±2 cm), and operation independent of ambient light. For example, the RoboSense M1 LiDAR offers 128-line-equivalent scanning with a vertical angular resolution of 0.2°, enough to resolve a pedestrian's limb movements at 100 meters. However, LiDAR is prone to noise in direct sunlight, rain, and fog, and has blind spots for highly reflective objects such as glass and polished metal.

Ultrasonic sensors measure distance from the round-trip time of emitted ultrasonic pulses. Their characteristics include short range (0.02-5 meters), low cost (unit cost < $5), and strong environmental adaptability. For example, the MaxBotix HRMAX-V3 ultrasonic module maintains a measurement accuracy of ±1 cm across a temperature range of -40℃ to 85℃ and is insensitive to dust and rain. However, ultrasonic sensors suffer from a wide beam angle (typically 60°) and a low data refresh rate (<20 Hz), making it difficult for them to independently support obstacle avoidance in high-speed scenarios.

The complementarity between the two is reflected in the following aspects: LiDAR provides global environment modeling, while ultrasonic sensors supplement near-field detail perception; when LiDAR is susceptible to environmental interference, ultrasonic sensors provide redundant data; when ultrasonic sensors cannot identify object types, LiDAR point clouds can assist in classification. This difference in characteristics lays the foundation for the zonal perception strategy.

II. Technical Implementation of the Zonal Perception Strategy

Unmanned delivery vehicle navigation systems typically divide the perception range into three zones: near field (0-1 meter), mid field (1-5 meters), and far field (5-200 meters), with a sensor fusion scheme tailored to the characteristics of each zone.
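As a minimal sketch of this zoning logic (the distance boundaries come from the three-zone split above; the function name and return labels are illustrative assumptions, not from any real system):

```python
def classify_zone(distance_m: float) -> str:
    """Map an obstacle distance (meters) to a perception zone.

    Boundaries follow the three-zone split in the text:
    near field 0-1 m, mid field 1-5 m, far field 5-200 m.
    """
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    if distance_m < 1.0:
        return "near"   # ultrasonic-dominated
    if distance_m < 5.0:
        return "mid"    # LiDAR + ultrasonic cross-validation
    if distance_m <= 200.0:
        return "far"    # LiDAR-dominated
    return "out_of_range"

print(classify_zone(0.5))   # near
print(classify_zone(3.0))   # mid
print(classify_zone(50.0))  # far
```

Each zone's fusion scheme can then be dispatched on the returned label.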

1. Near-field region: Ultrasonic pre-contact warning

The near-field zone is where the delivery vehicle faces the highest collision risk, requiring centimeter-level accuracy and millisecond-level response. The system deploys 8-12 ultrasonic sensors in this zone, arranged around the vehicle at roughly 30° intervals. For example, JD.com's unmanned delivery vehicles use a "4 front + 4 rear + 2 side" layout, with the forward sensors covering a 120° arc around the front of the vehicle and the side sensors monitoring the door-opening area.

The ultrasonic sensors perform three tasks in this zone:

Static obstacle detection: Identifies fixed obstacles such as steps and curbs by continuously monitoring changes in distance;

Dynamic obstacle avoidance: estimates obstacle movement trends by combining range readings with wheel-speed sensor data, triggering emergency braking when the relative speed exceeds 0.5 m/s;

Vehicle body protection: in automatic parking scenarios, monitors the distance to surrounding vehicles to prevent scratches.

To overcome the problem of data sparsity in ultrasonic sensors, the system adopts a multi-sensor voting mechanism: a threat is confirmed only when three or more sensors detect an obstacle simultaneously, reducing the false alarm rate from 15% to 0.3%.
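The voting mechanism can be sketched in a few lines. The quorum of three follows the text; the function name and interface are assumptions for illustration:

```python
def confirm_threat(detections, quorum=3):
    """Multi-sensor voting: confirm a threat only when at least
    `quorum` ultrasonic sensors report an obstacle in the same
    sampling cycle, suppressing single-sensor false alarms."""
    return sum(bool(d) for d in detections) >= quorum

# Example: 8-sensor ring, only two sensors fire -> likely noise, no alarm
ring = [True, True, False, False, False, False, False, False]
print(confirm_threat(ring))  # False

ring[2] = True  # a third sensor confirms the obstacle
print(confirm_threat(ring))  # True
```

A real system would additionally require the voting sensors to be spatially adjacent, so that echoes from unrelated directions cannot combine into a false quorum.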

2. Mid-field region: Cross-validation of LiDAR and ultrasonic data

The mid-field region is the main activity range of dynamic targets such as pedestrians and non-motorized vehicles, requiring a balance between sensing range and data refresh rate. The system deploys one or two multi-line LiDARs in this region (such as the 40-line Hesai Pandar40P), with a 360° horizontal field of view and a vertical field of view covering -15° to +15°, scanning the environment at 10 Hz.

The fusion strategies for LiDAR and ultrasonic data include:

Target-level fusion: dynamic targets (such as pedestrians and bicycles) detected by LiDAR are associated with ultrasonic data, and target trajectory prediction is refined with an extended Kalman filter (EKF). For example, when the LiDAR detects a pedestrian suddenly accelerating, the ultrasonic sensors supply near-field distance data in real time to correct the time-to-collision (TTC) calculation.

Feature-level fusion: reflection intensity features from the LiDAR point cloud and echo amplitude features from the ultrasonic sensors are combined into a multimodal target descriptor. Experiments show that fusion raises target classification accuracy from 78% to 92%.

Decision-level fusion: in rain or fog, when LiDAR point-cloud density drops by 30%, the system automatically increases the weight of the ultrasonic readings to keep obstacle avoidance decisions reliable.
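The decision-level reweighting can be illustrated with a minimal sketch. The linear weighting scheme below is an assumption for illustration (the text states only that the ultrasonic weight increases as point-cloud density falls); the names are hypothetical:

```python
def fused_distance(lidar_d, ultra_d, lidar_density_ratio):
    """Decision-level fusion sketch: weight the ultrasonic range
    reading more heavily as LiDAR point-cloud density degrades
    (e.g. in rain or fog).

    lidar_density_ratio: current density / clear-weather density (0..1).
    The linear weight mapping is illustrative, not from the text.
    """
    w_lidar = max(0.0, min(1.0, lidar_density_ratio))
    w_ultra = 1.0 - w_lidar
    return w_lidar * lidar_d + w_ultra * ultra_d

# Clear weather: trust the LiDAR range entirely
print(fused_distance(4.0, 4.2, 1.0))            # 4.0
# Heavy fog: density down to 40% of normal -> ultrasonic gains weight
print(round(fused_distance(4.0, 4.2, 0.4), 2))  # 4.12
```

In practice the weights would be tuned per distance zone, since ultrasonic readings are only meaningful inside their 5-meter range.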

3. Far-field region: Global modeling dominated by LiDAR

The far-field region requires environmental perception and path planning out to 200 meters. The system deploys a 64-line or 128-line LiDAR in this region (such as the RoboSense M1 Pro), working in conjunction with a high-precision map to achieve SLAM positioning. For example, Meituan's unmanned delivery vehicles use a "1 main + 2 auxiliary" LiDAR layout, with the main radar modeling the 120° sector ahead and the auxiliary radars covering blind spots on the sides and rear.

The core tasks of the LiDAR in this region include:

Static map construction: centimeter-level positioning is achieved by matching real-time point clouds against the high-precision map with the ICP (Iterative Closest Point) algorithm;

Dynamic obstacle tracking: the DBSCAN clustering algorithm segments the point cloud, and the Hungarian algorithm associates detections across frames for multi-target tracking;

Drivable-area segmentation: lane lines, sidewalks, and other semantic information are identified from point-cloud density and ground normal-vector analysis.
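To make the clustering step concrete, here is a minimal, illustrative DBSCAN over 2-D points (an O(n²) teaching version; a production system would run an optimized library implementation over 3-D point clouds):

```python
import math

def dbscan(points, eps=0.5, min_pts=3):
    """Minimal DBSCAN: label each point with a cluster id >= 0,
    or -1 for noise. `points` is a list of (x, y) tuples."""
    n = len(points)
    labels = [None] * n

    def neighbors(i):
        return [j for j in range(n)
                if math.dist(points[i], points[j]) <= eps]

    cluster = 0
    for i in range(n):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1          # noise (may become a border point later)
            continue
        labels[i] = cluster          # i is a core point: start a cluster
        seeds = list(nbrs)
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cluster  # noise absorbed as a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbors(j)
            if len(jn) >= min_pts:   # j is also a core point: keep expanding
                seeds.extend(jn)
        cluster += 1
    return labels

# Two tight groups of returns plus one stray point
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (10, 10)]
print(dbscan(pts))  # [0, 0, 0, 1, 1, 1, -1]
```

Each resulting cluster would then be handed to the tracker, where the Hungarian algorithm matches clusters to existing tracks frame by frame.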

In the far field, ultrasonic sensors serve only as redundancy in extreme scenarios: if the LiDAR fails due to malfunction or strong interference, they provide basic obstacle avoidance so the vehicle can stop safely.

III. Challenges and Optimizations in Engineering Practice

1. Multi-sensor spatiotemporal synchronization

The difference in sampling frequency between the LiDAR and the ultrasonic sensors (10 Hz vs 20 Hz) easily leads to misaligned timestamps. The system uses a hardware-triggered synchronization scheme: with the LiDAR as the reference, ultrasonic sampling instants are synchronized via a PWM signal, keeping the timing error within ±1 ms. Spatial synchronization is achieved through joint calibration: a checkerboard calibration target is observed by both sensors simultaneously, and a 6-DOF transformation matrix is solved to align the point cloud and ultrasonic data in a unified coordinate system.
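On the software side, a simple complement to the hardware trigger is to interpolate the 20 Hz ultrasonic range series onto each 10 Hz LiDAR frame timestamp. The sketch below assumes time-sorted (timestamp, range) samples; the function name is an illustrative assumption:

```python
def interpolate_range(ultra_samples, t_query):
    """Linearly interpolate a 20 Hz ultrasonic range series to an
    arbitrary query time (e.g. a 10 Hz LiDAR frame timestamp).

    ultra_samples: list of (timestamp_s, range_m), sorted by time.
    """
    for (t0, r0), (t1, r1) in zip(ultra_samples, ultra_samples[1:]):
        if t0 <= t_query <= t1:
            alpha = (t_query - t0) / (t1 - t0)
            return r0 + alpha * (r1 - r0)
    raise ValueError("query time outside sample window")

samples = [(0.00, 2.0), (0.05, 2.1), (0.10, 2.3)]   # 20 Hz readings
print(round(interpolate_range(samples, 0.025), 3))  # 2.05
```

With the ±1 ms hardware timing error stated above, the interpolation error stays well below the ±1 cm sensor accuracy at typical delivery-vehicle closing speeds.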

2. Dynamic environmental adaptability optimization

In densely populated commercial areas, the system needs to handle occlusion and suddenly appearing obstacles. By introducing an attention mechanism, the LiDAR prioritizes scanning suspicious areas detected by the ultrasonic sensor, increasing the point cloud allocation ratio from 10% in uniform scanning to 30%, thus reducing the dynamic target detection latency from 200ms to 80ms.
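The attention-driven point budget can be sketched as follows. The 10% → 30% figures follow the text (here interpreted as ten scan sectors of 10% each under uniform scanning); the function interface is an assumption:

```python
def allocate_points(total, n_sectors, suspicious_sector, boost=0.30):
    """Attention sketch: the sector flagged by the ultrasonic sensors
    receives `boost` of the LiDAR point budget (vs 1/n_sectors when
    scanning uniformly); the remaining sectors split the rest evenly."""
    rest = (1.0 - boost) / (n_sectors - 1)
    return [round(total * (boost if s == suspicious_sector else rest))
            for s in range(n_sectors)]

# 10,000 points per frame, sector 2 flagged by an ultrasonic detection
budget = allocate_points(10000, 10, suspicious_sector=2)
print(budget[2])  # 3000 points: 30% instead of the uniform 10%
print(budget[0])  # 778 points in each unflagged sector
```

The denser sampling in the flagged sector is what shortens detection latency: more returns per frame mean the target is confirmed in fewer frames.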

3. Balancing low power consumption and cost

Each ultrasonic sensor consumes only 0.5 W, while a LiDAR consumes 20-50 W. The system therefore switches operating modes dynamically: at low speeds, LiDAR resolution is reduced (from 64 lines to 16 lines), cutting its power consumption by 60%; when stationary, some ultrasonic sensors are switched off, reducing overall energy consumption by 45%.
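A mode-switching policy along these lines might look like the sketch below. The speed thresholds and active-sensor counts are illustrative assumptions; only the line counts (64 → 16) come from the text:

```python
def power_mode(speed_mps, moving_threshold=0.1, low_speed=2.0):
    """Select a sensing configuration from vehicle speed.
    Thresholds are illustrative, not from the text."""
    if speed_mps < moving_threshold:
        # Stationary: park mode, most ultrasonic sensors switched off
        return {"lidar_lines": 16, "ultrasonic_active": 4}
    if speed_mps < low_speed:
        # Low speed: reduced LiDAR resolution saves ~60% LiDAR power
        return {"lidar_lines": 16, "ultrasonic_active": 12}
    # Normal driving: full perception
    return {"lidar_lines": 64, "ultrasonic_active": 12}

print(power_mode(0.0))
print(power_mode(1.0)["lidar_lines"])  # 16
print(power_mode(8.0)["lidar_lines"])  # 64
```

Hysteresis around each threshold would be needed in practice to avoid rapid mode flapping when the speed hovers near a boundary.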

IV. Future Trends and Outlook

With the maturation of solid-state LiDAR and 4D millimeter-wave radar, the perception system of unmanned delivery vehicles will evolve towards all-solid-state and multi-modal fusion. For example, Hesai Technology's FT120 solid-state LiDAR achieves 128-line equivalent performance through non-repetitive scanning technology, reducing power consumption to 10W; ultrasonic sensors are developing towards higher frequencies (1MHz) and arrays, with beam angles compressed to 15°, approaching the accuracy of LiDAR. In the future, the zoned perception strategy of LiDAR and ultrasonic sensors will be further refined, combined with AI algorithms to achieve end-to-end optimization of "perception-decision-control," driving unmanned delivery vehicles towards Level 4 autonomous driving.
