
Will lidar used in autonomous driving damage cameras?

2026-04-06 03:22:48 · #1

With the widespread application of LiDAR in the field of autonomous driving, a pressing question has emerged: can LiDAR damage cameras?

Working principle and characteristics of lidar

LiDAR (Light Detection and Ranging) primarily relies on time-of-flight (ToF) ranging. During operation, the transmitting system emits short laser pulses into the surrounding environment. When a pulse encounters a target object, a portion of the light is reflected back to the LiDAR's receiving system. By precisely measuring the time difference between emission and reception, and combining it with the known speed of light, the distance between the target object and the LiDAR can be accurately calculated. To achieve full 360-degree coverage of the environment around a vehicle, LiDAR typically uses high-speed rotation or oscillating-mirror scanning so that the laser beam sweeps continuously across all directions.
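The ToF calculation above is simple enough to sketch directly; the pulse travels to the target and back, so the one-way distance is half of the speed of light times the round-trip time:

```python
# Time-of-flight ranging: distance from the round-trip time of a laser pulse.
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Return the target distance in metres. The pulse travels out and
    back, so the one-way distance is c * t / 2."""
    return C * round_trip_seconds / 2.0

# An echo received 1 microsecond after emission corresponds to a target
# roughly 150 m away.
print(tof_distance(1e-6))
```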

Currently, the commonly used operating wavelengths for automotive LiDAR are 1550 nm and 905 nm. Because 905 nm light is relatively close to the visible band, it passes through the eye's optics and is focused onto the retina. To ensure eye safety, its laser power must therefore be strictly limited, which constrains its detection range and accuracy to some extent. The 1550 nm wavelength, by contrast, is largely absorbed by the cornea and lens before reaching the retina, making it relatively safe for the human eye. It can therefore operate at higher output power, achieving a longer detection range and higher accuracy, which is a significant advantage in autonomous driving scenarios.

Working principle and key components of a camera

The core working principle of a camera is to convert received light into electrical signals. When light from a scene is focused by the lens, the resulting optical image is projected onto the surface of the image sensor. The image sensor, the key component of a camera, comes in two main types: CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor). Taking a CMOS sensor as an example, its surface contains a large array of photodiodes. When light strikes a photodiode, a corresponding charge is generated. This charge is read out as an analog voltage, which an A/D (analog-to-digital) converter turns into a digital image signal. The signal is then passed to a digital signal processor (DSP) and finally transmitted to a display or storage device through the relevant interface.
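The pipeline described above (photons → charge → voltage → digital code) can be sketched in a few lines. The quantum efficiency and conversion gain below are assumed, illustrative values, not figures for any real sensor:

```python
# Minimal sketch of the CMOS imaging chain: photodiode charge is converted
# to a voltage, then quantized by an n-bit A/D converter.
def adc_convert(voltage: float, v_ref: float = 1.0, bits: int = 10) -> int:
    """Quantize an analog voltage into an n-bit digital code, clamped
    to the converter's range."""
    code = int(voltage / v_ref * (2**bits - 1))
    return max(0, min(2**bits - 1, code))

QUANTUM_EFFICIENCY = 0.6  # fraction of photons converted to electrons (assumed)
CONVERSION_GAIN = 5e-5    # volts per electron (assumed)

def pixel_response(photons: float) -> int:
    """Digital value produced by one pixel for a given photon count."""
    electrons = photons * QUANTUM_EFFICIENCY
    return adc_convert(electrons * CONVERSION_GAIN)

print(pixel_response(10_000))     # normal exposure: mid-range code
print(pixel_response(1_000_000))  # intense light: clamped at full scale (1023)
```

The clamp at full scale is why an intense laser spot appears as a saturated bright blob; the damage mechanisms discussed next go beyond saturation to physical failure.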

CMOS sensors are extremely sensitive to light, capable of capturing light across a wide spectrum, from ultraviolet to infrared. However, this high sensitivity also makes them vulnerable to certain light sources. Compared to the human eye, CMOS sensors lack natural protective layers and energy absorption and dispersion mechanisms such as the cornea and lens. When directly exposed to high-energy light, energy can rapidly accumulate in localized areas within a very short time, potentially leading to a series of problems.

Analysis of the possibility of lidar damaging cameras

Damage to CMOS sensors caused by high-energy lasers

When a high-power laser beam directly illuminates a CMOS sensor, it can trigger several negative effects. First, the localized high-energy focusing of the laser beam on the sensor surface causes a rapid rise in local temperature, resulting in severe localized overheating. This overheating can damage the internal structure of the semiconductor material, causing some pixels to lose their normal light-sensing function, manifesting as obvious burn marks or abnormal bright spots in the captured image.

Secondly, direct irradiation by high-energy lasers can also cause short circuits or breakdown in the sensor's internal circuitry. Because the internal circuitry of a CMOS sensor is extremely delicate, a high-energy laser can drive excessive voltage and current through the electronic components, causing circuit connections in some pixels to fail. This can leave entire pixels inoperative, producing black or abnormally colored pixel blocks in the image and severely degrading image quality.

Furthermore, even if the energy of a single laser irradiation is insufficient to cause immediate and severe damage to the sensor, if the sensor is repeatedly irradiated by lasers within a short period, the cumulative effect of the energy may gradually exceed the sensor's tolerance limit. As the cumulative damage intensifies, it may lead to a gradual decline in sensor performance, or even ultimately cause complete damage to the entire sensor area, rendering the camera unusable.

The influence of the relative position of LiDAR and camera

The relative position, shooting angle, and distance between the camera and the LiDAR are crucial factors in determining whether the camera will be damaged. When the camera and LiDAR are close together and the shooting angle is directly facing the laser emission port, the laser energy received by the camera's CMOS sensor is relatively concentrated, greatly increasing the risk of sensor damage. This is because, in this situation, the laser beam acts almost directly on the sensor surface, resulting in high energy density and making it more susceptible to problems such as overheating and short circuits.

Conversely, if the camera and LiDAR are kept at a considerable distance, or if the camera is positioned at a side angle, the laser energy will significantly attenuate during propagation due to divergence, scattering, and absorption by the air. Once the laser energy has attenuated to a certain level, its impact on the CMOS sensor becomes extremely limited, thus greatly reducing the possibility of damage to the camera.
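The distance effect described above follows from simple beam geometry: a diverging laser spreads the same pulse energy over a spot whose radius grows with distance, so the energy density at the sensor falls roughly with the square of distance. The beam parameters below are assumed, illustrative values:

```python
import math

# Energy density (J/m^2) at the sensor for a diverging laser beam:
# the same pulse energy spreads over a spot that grows with distance.
def irradiance(pulse_energy_j: float, initial_radius_m: float,
               divergence_rad: float, distance_m: float) -> float:
    radius = initial_radius_m + divergence_rad * distance_m
    return pulse_energy_j / (math.pi * radius**2)

near = irradiance(1e-6, 0.005, 0.003, 0.1)   # camera 10 cm from emitter
far  = irradiance(1e-6, 0.005, 0.003, 10.0)  # camera 10 m from emitter
print(near / far)  # energy density drops by a factor of roughly 44
```

This is why the documented incidents below all involve a camera held close to, and pointed directly at, the emitter.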

Case Studies and Research

Back at CES 2019, an incident occurred involving LiDAR damaging a camera. At the time, an attendee used a Sony camera to photograph a display vehicle equipped with an autonomous driving system using a 1550nm LiDAR. Because the camera was close to the LiDAR and directly facing it, the camera sensor was affected by the LiDAR's high-power output during the shot, resulting in noticeable defects in the image data. This incident vividly demonstrated the potential damage that LiDAR can cause to cameras.

In addition, a video of a mobile phone filming a LiDAR sensor has circulated online. In the video, a Xiaomi 12S Ultra phone is filming a NIO ES7. When the lens gets close to the LiDAR sensor on the roof of the car, multiple horizontal green lines quickly appear in the phone's image. Analysis shows that this is because the laser light directly shines on the phone's camera for a short period of time, causing damage to some pixels and thus forming abnormal lines in the image.

Protective Measures and Future Outlook

To mitigate the potential harm of LiDAR to cameras, several targeted protective measures have been proposed and implemented. One is to install a narrow band-stop (notch) filter in front of the camera, which blocks the specific laser wavelengths while passing the rest of the spectrum, reducing their impact on the sensor. Applying a special coating to the lens can also disperse laser energy to some extent, lowering the energy density when laser light reaches the sensor. Furthermore, improving the lens structure and optimizing the sensor layout and protective design to enhance the camera's overall anti-interference capability is another effective way to strengthen its resistance to high-energy lasers.
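Filter attenuation is conventionally specified as optical density (OD), where the transmitted fraction at the blocked wavelength is 10 to the power of minus OD. A minimal sketch, with the OD value assumed for illustration:

```python
# Attenuation through a band-stop filter at its blocked wavelength,
# specified by optical density: transmission = 10^(-OD).
def transmitted_power(incident_mw: float, optical_density: float) -> float:
    return incident_mw * 10 ** (-optical_density)

# An OD-4 filter passes only 0.01% of incident power at the laser
# wavelength, while (ideally) leaving the rest of the spectrum untouched.
print(transmitted_power(100.0, 4))
```

Higher OD gives better protection but, in practice, a real notch filter also slightly attenuates neighboring wavelengths, which is one reason sensor-level hardening is pursued alongside filtering.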

With the continuous development and popularization of autonomous driving technology, the safety and compatibility of LiDAR and cameras, as core sensors, will receive increasing attention. In the future, on the one hand, automakers and component suppliers need to fully consider the mutual influence between LiDAR and cameras during product design and production, reducing potential risks at the source by optimizing system architecture and adjusting component parameters. On the other hand, relevant industry standards and specifications also need further improvement and refinement, clarifying the safety thresholds for key indicators such as LiDAR emission power, wavelength range, and camera anti-interference capabilities. This will provide stricter guidance and constraints for product research and development, production, and application, ensuring the safety and reliability of autonomous driving systems and bringing a safer and more convenient travel experience to users.

In conclusion, under certain conditions, automotive LiDAR can indeed potentially damage cameras. Although the automotive industry has implemented a series of measures to ensure the safety of LiDAR users, further strengthening of protection for electronic devices such as cameras is still necessary. Both drivers and industry professionals should be fully aware of this potential risk and take appropriate protective measures during actual use and operation to avoid unnecessary losses.
