Design of Anti-halo Image Acquisition System with Dual CMOS Image Sensor
1. Introduction

When motor vehicles meet oncoming traffic at night, glare from the headlights of other vehicles can create blind spots for drivers, leading to serious traffic accidents. To address this hazard, this paper proposes an anti-halo image acquisition system based on DSP and image processing technology that reduces glare during nighttime driving.

Image acquisition today relies mainly on CCD and CMOS image sensors. CMOS image sensors offer low power consumption, low cost, single-supply operation, long service life, and easy on-chip system integration, making them well suited to an anti-halo image acquisition system. This paper therefore adopts the OV7620 CMOS color image sensor from OmniVision in place of a traditional CCD sensor. The OV7620 can be configured by software and outputs digital image data directly, which greatly reduces design difficulty, shrinks the system, and improves the flexibility and stability of the design.

2. System Hardware Design

The anti-halo image acquisition system consists of an image acquisition part and an image processing part. Image acquisition is implemented by two OV7620 image sensors; image processing is implemented by a TMS320C6414 DSP. An EPM3128 CPLD generates the memory addresses under program control. The system block diagram is shown in Figure 1.

2.1 Image Acquisition Section

In the anti-halo image acquisition system, the TMS320C6414 emulates the I2C bus on general-purpose GPIO pins to set the internal register parameters of the OV7620. The OV7620 provides 125 control registers, addressed 00H to 7CH, for settings such as shutter mode, integration time, A/D converter operating characteristics, gamma correction, window position, output data format, frame rate, and pixel clock.
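The register writes described above travel over an I2C bus that the DSP bit-bangs on two GPIO pins. The following is a minimal, PC-testable sketch of such a master, not the system's actual firmware: the write address 0x42 is the OV7620's conventional I2C write address, and the pin-driver functions stand in for GPIO register writes. Here they feed a tiny simulated slave so the byte framing can be checked on a desktop compiler.

```c
#include <stdint.h>
#include <stddef.h>

static int sda_line = 1, scl_line = 1;
static uint8_t rx_bits[64];   /* bits sampled by the simulated slave */
static size_t  rx_count = 0;

static void set_sda(int v) { sda_line = v; }
static void set_scl(int v)
{
    if (v && !scl_line && rx_count < 64)  /* slave samples SDA on SCL rise */
        rx_bits[rx_count++] = (uint8_t)sda_line;
    scl_line = v;
}

static void i2c_start(void) { set_sda(1); set_scl(1); set_sda(0); set_scl(0); }
static void i2c_stop(void)  { set_sda(0); set_scl(1); set_sda(1); }

static void i2c_send_byte(uint8_t b)
{
    for (int i = 7; i >= 0; i--) {       /* shift out MSB first */
        set_sda((b >> i) & 1);
        set_scl(1);
        set_scl(0);
    }
    set_sda(1); set_scl(1); set_scl(0);  /* release SDA for the slave ACK */
}

/* Write one OV7620 register: address byte, register index, register value. */
void ov7620_write_reg(uint8_t reg, uint8_t val)
{
    i2c_start();
    i2c_send_byte(0x42);  /* assumed 7-bit address 0x21 plus write bit */
    i2c_send_byte(reg);
    i2c_send_byte(val);
    i2c_stop();
}
```

On the real board, set_sda()/set_scl() would write the TMS320C6414 GPIO data register and the bit timing would be padded to meet the 400 Kbit/s limit mentioned below.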
Among these, the registers that directly affect pixel integration time are the exposure control register, the clock prescaler register, and the frame-rate adjustment register. By default the OV7620 acts as the master device on the I2C bus; setting bit 6 of register 29H to 1 switches it to a slave device, so that the TMS320C6414, as bus master, can write to it. I2C bus initialization completes only when the SBB pin is held low. As a slave, the OV7620 supports a 7-bit-address transmission protocol at up to 400 Kbit/s.

Setting bit 5 of the OV7620's internal register 28H to 1 selects progressive-scan mode; clearing bit 5 of registers 13H and 14H selects 16-bit YUV 4:2:2 output at 640 x 480 pixels (the high 8 bits carry the luminance signal, the low 8 bits the chrominance signal); the output rate is set to 25 frames/s.

The OV7620's video timing circuit generates synchronization signals (horizontal sync, vertical sync, and composite sync) as well as internal clock signals such as the pixel clock. The EPM3128 schedules image read and write operations based on these synchronization signals. The system uses a single-frame output mode: through its I/O pins, the EPM3128 drives the CE, WE, LB, and UB pins of the IS61LV51216 low, allowing the OV7620 to write 16-bit image data into the IS61LV51216. When the OV7620's internal SRAM control bit is high, the sensor is in external-RAM mode; all data buses become tri-state and ready to send data. The timing of the OV7620 writing a single frame to the external SRAM is shown in Figure 2. First, the sensor's internal SRAM control bit is checked; when it is high, the OV7620 enters external-RAM mode. Then an initialization pulse is sent to AGCFN via the EPM3128 to capture one frame of data.
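The register settings above can be collected into an initialization table. This is an illustrative sketch only: the register numbers are read as hexadecimal (the OV7620 map spans 00H to 7CH) and the bit masks are taken from the text, not from the datasheet, so they should be verified before use. A shadow copy of the register file stands in for the sensor so the read-modify-write logic can be tested.

```c
#include <stdint.h>

struct reg_init { uint8_t reg; uint8_t set_mask; uint8_t clr_mask; };

/* Settings described in the text (bit positions assumed, verify on hardware). */
static const struct reg_init ov7620_init[] = {
    { 0x29, 1u << 6, 0       },  /* bit 6 = 1: OV7620 acts as I2C slave    */
    { 0x28, 1u << 5, 0       },  /* bit 5 = 1: progressive-scan mode       */
    { 0x13, 0,       1u << 5 },  /* bit 5 = 0: 16-bit YUV 4:2:2 output     */
    { 0x14, 0,       1u << 5 },  /*           (Y on high 8, UV on low 8)   */
};

/* Apply read-modify-write entries to a shadow register file; returns the
 * number of registers touched. */
uint8_t apply_init(uint8_t *regs)
{
    uint8_t n = 0;
    for (unsigned i = 0; i < sizeof ov7620_init / sizeof ov7620_init[0]; i++) {
        const struct reg_init *e = &ov7620_init[i];
        regs[e->reg] = (uint8_t)((regs[e->reg] | e->set_mask) & ~e->clr_mask);
        n++;
    }
    return n;
}
```

Keeping the settings in a table makes it easy to replay the whole configuration over the bit-banged bus after a sensor reset.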
However, as Figure 2 shows, not all of the shifted-out data is valid image data. Valid data is delimited by HREF (horizontal reference output) and VSYNC (vertical synchronization). The logic therefore checks that VSYNC is 1 and waits for a rising edge on HREF, which indicates that the sensor has started outputting valid data. While HREF = 1, the pixel clock PCLK drives a counter whose value is placed on the address bus of the external SRAM; simultaneously, the image data DATA output by the OV7620 is placed on the SRAM data bus and written to the external SRAM. While HREF = 0, counting pauses. After the OV7620 has output one frame, VSYNC goes to 0, so the counter is stopped and acquisition terminated when VSYNC = 0 is detected.

2.2 Image Processing Section

The TMS320C6414 is a high-performance digital signal processor from TI whose hardware architecture and software support suit the anti-halo image acquisition system. It has 1024 KB of L2 cache; the L2MODE field of the cache configuration register (CCFG) is set to mode 5, configuring 768 KB as on-chip SRAM. The TMS320C6414 reads image data into this internal SRAM through the EMIFA port using EDMA. Enhanced Direct Memory Access (EDMA) supports real-time image processing: transfers between memory spaces complete in the background of the CPU, moving image data quickly and efficiently from external memory into the DSP's internal SRAM. Setting the EVT4 bit of the EER control register to 1 enables EDMA channel 4 (EDMA4) for image acquisition. The channel is configured for 32-bit transfers and moves one frame of image data into internal SRAM on each interrupt.
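The HREF/VSYNC gating implemented in the EPM3128 can be sketched in C for clarity. This is a desktop simulation of the counter logic, not CPLD code: a struct of per-clock samples stands in for the real signal lines, and the counter value doubles as the SRAM write address, as in the text.

```c
#include <stdint.h>
#include <stddef.h>

/* One sample of the sensor's signal lines per pixel-clock cycle. */
struct pclk_sample { int vsync; int href; uint16_t data; };

/* Gate pixel data into sram[]: count and write only while VSYNC = 1 and
 * HREF = 1; pause during HREF = 0 (horizontal blanking); stop when VSYNC
 * drops after the frame has started. Returns the number of pixels stored. */
size_t capture_frame(const struct pclk_sample *clk, size_t nclk,
                     uint16_t *sram, size_t sram_len)
{
    size_t addr = 0;          /* PCLK counter = external SRAM address */
    int frame_started = 0;
    for (size_t i = 0; i < nclk; i++) {
        if (clk[i].vsync == 0 && frame_started)
            break;                           /* VSYNC low: frame complete */
        if (clk[i].vsync && clk[i].href) {
            frame_started = 1;
            if (addr < sram_len)
                sram[addr++] = clk[i].data;  /* write pixel, advance address */
        }
        /* HREF low: counting pauses until the next valid line */
    }
    return addr;
}
```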
For synchronous sampling, the SRAM1 and VSYNC1 signals of the first OV7620 are gated through NAND gates so that both channels sample simultaneously and store their data in their respective IS61LV51216s. The falling edge of VSYNC1 marks the end of one frame of output from the OV7620. VSYNC1 is connected, through an inverter, to the AF5 pin of the TMS320C6414 to trigger an EDMA4 interrupt and start reading the image data stored in the IS61LV51216. The CPU processes the image information with a threshold segmentation algorithm. Image data is transferred and processed synchronously under the control of the TMS320C6414, fully meeting the real-time requirements of the system.

The TMS320C6414 uses an external 40 MHz crystal oscillator; with CLKMODE[1:0] set to 10, its internal clock reaches 480 MHz. Through its I/O pins the DSP monitors the OV7620's synchronization signals VSYNC and CHSYNC and the pixel clock PCLK, ensuring that it reads the sensor's digital image data accurately. The acquired data is stored under control of the synchronization signals and the pixel clock, preserving the integrity of the transferred image. Figure 3 shows the hardware circuit of the first acquisition channel.

3. System Software Design

The system is programmed in C and linear assembly in the CCS (Code Composer Studio) environment. The compiled program is written to the external Flash memory through the JTAG interface, so the software runs fast, stably, and reliably on the hardware platform. The software design flow is shown in Figure 4. The TMS320C6414 employs a threshold segmentation algorithm.
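The interrupt-driven transfer described above can be sketched as follows. This is a host-side model, not the TI EDMA API: the flag stands in for the EDMA4 event raised by the inverted VSYNC1 edge, the arrays stand in for the memory-mapped external and internal SRAM, and the copy proceeds in 32-bit units (two 16-bit pixels per transfer) to mirror the channel's 32-bit configuration.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

#define FRAME_PIXELS 8                  /* tiny frame, for illustration only */

static volatile int frame_done_event = 0;   /* stand-in for the EDMA4 event */

/* Fires on the inverted falling edge of VSYNC1 (end of frame). */
void vsync1_isr(void) { frame_done_event = 1; }

/* Move one frame from "external SRAM" to "internal SRAM" in 32-bit bursts
 * once the event has fired; returns the number of 32-bit words moved. */
size_t edma_copy_frame(const uint16_t *ext_sram, uint16_t *int_sram)
{
    if (!frame_done_event)
        return 0;                       /* no completed frame to fetch yet */
    frame_done_event = 0;
    size_t words = FRAME_PIXELS / 2;    /* two 16-bit pixels per 32-bit word */
    for (size_t w = 0; w < words; w++)
        memcpy((uint8_t *)int_sram + 4 * w,
               (const uint8_t *)ext_sram + 4 * w, 4);
    return words;
}
```

On the real chip the copy loop runs in the EDMA controller, not the CPU, which is what leaves the C64x core free for the segmentation algorithm.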
The exposure time of the first OV7620 channel is set to 1/50 s, and its image data is stored in memory 1 (the EPM3128 forces the highest address bit to 0, i.e., addresses of the form 0XXXXXXXXXXXXXXXXXX). The exposure time of the second OV7620 channel is set to 1/1000 s, and its image data is stored in memory 2 (highest address bit forced to 1, i.e., 1XXXXXXXXXXXXXXXXXX). Both channels sample the same scene synchronously under the control of the TMS320C6414, with the brightness threshold set to 245.

The TMS320C6414 reads each storage cell of memory 1 in sequence and examines its brightness value. If a pixel's brightness is below 245, no halo is present, and the pixel value and its address are stored temporarily in the DSP's internal SRAM. If the brightness is 245 or above, a halo is present: the data in that cell is discarded, and the pixel value and address are instead read from the corresponding address in memory 2 (the leading 0 of the address replaced by 1). After a row of pixels has been stored, the highest address bit of each pixel is examined. If it is 1, a halo occurred at that pixel; the pixel value together with the three pixels before and after it (seven values in total) is averaged, and the average replaces the pixel value, until the whole row has been processed and output. If the highest bit is not 1, no halo occurred and the pixel value is output directly. If a halo occurs at the beginning or end of a frame, only the pixel values actually available are averaged. The threshold and the number of pixels averaged can be chosen freely to suit different sampling conditions, giving the algorithm good flexibility.
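The per-row fusion and smoothing steps above can be sketched directly in C. This is a minimal model of the algorithm as described, assuming 8-bit luminance values; the function and array names are illustrative, and on the DSP the two input rows would come from memory 1 and memory 2 while the halo flag corresponds to the highest address bit.

```c
#include <stdint.h>
#include <stddef.h>

#define HALO_THRESHOLD 245   /* brightness threshold from the text */

/* Fuse one row: keep each 1/50 s pixel unless it is saturated by halo, in
 * which case substitute the 1/1000 s pixel and flag it; then replace each
 * flagged pixel by the average of itself and up to three neighbours on each
 * side (seven values in total, fewer at the row ends). */
void fuse_row(const uint8_t *longexp, const uint8_t *shortexp,
              uint8_t *out, uint8_t *halo_flag, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (longexp[i] < HALO_THRESHOLD) {  /* no halo: keep 1/50 s pixel */
            out[i] = longexp[i];
            halo_flag[i] = 0;
        } else {                            /* halo: take 1/1000 s pixel */
            out[i] = shortexp[i];
            halo_flag[i] = 1;
        }
    }
    for (size_t i = 0; i < n; i++) {
        if (!halo_flag[i])
            continue;
        unsigned sum = 0, cnt = 0;
        size_t lo = (i >= 3) ? i - 3 : 0;           /* clamp at row start */
        size_t hi = (i + 3 < n) ? i + 3 : n - 1;    /* clamp at row end   */
        for (size_t k = lo; k <= hi; k++) { sum += out[k]; cnt++; }
        out[i] = (uint8_t)(sum / cnt);              /* smooth halo pixel  */
    }
}
```

Both the threshold and the window width are plain constants here, reflecting the flexibility the text claims: they can be retuned for different scenes without changing the structure of the loop.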
A Sony DCR-DVD808E digital camcorder and a Plink data acquisition card were used to capture single still images on a PC. In the first experiment, the exposure time was set to 1/50 s: the bright car headlights produced a halo that made the headlight contours and the license plate indistinguishable, while distant people in relatively dim light remained visible. In the second experiment, the exposure time was set to 1/1000 s: the headlights and license plate were clearly visible, but the distant people disappeared entirely. The threshold segmentation algorithm was then simulated in MATLAB on the two images, with the threshold set to 245. The processed image was clearer and achieved the expected result; Figure 5 shows the MATLAB simulation.

4. Conclusion

The anti-halo acquisition system eliminates the halo produced by vehicle headlights during nighttime driving, which can substantially reduce traffic accidents and protect lives and property. In industrial arc welding, the system can both protect the health of operators and improve welding quality. Research on the anti-halo acquisition system therefore has significant practical value.