Optimizing the measurement accuracy of automated test systems
[b]Introduction[/b]

In test and measurement applications, engineers frequently encounter the term "measurement accuracy." For automated test systems, measurement accuracy is not only a crucial parameter for evaluating performance but also an indicator that engineers constantly strive to improve. While its importance is universally acknowledged, many people are unclear about its true definition and often conflate measurement accuracy with the resolution of an analog-to-digital converter (ADC). So what is the true definition of accuracy? What factors affect the measurement accuracy of a system? And how can customers read the correct accuracy figures out of an instrument's technical specifications?

All measurements are approximate estimates of the "true" value; that is, the measured value always deviates from the "true" value by some error. The magnitude of this error is what is commonly called measurement accuracy, and it reflects the ability of the measurement system to faithfully reproduce the value of the measured signal. The sources of measurement error are multifaceted. Within the measurement equipment itself, in addition to the error factors inherent in the ADC, the front-end signal conditioning and the overall board layout all affect the final measurement accuracy. Measurement accuracy is also affected by external factors such as environmental noise and operating temperature. Therefore, when evaluating the measurement accuracy of an instrument system, one should consider not only the number of bits in the ADC but also the device's absolute accuracy (the combined value of multiple error factors), as well as the influence of temperature, noise, and other external conditions the system encounters in a real-world environment. The discussion below starts with the instrument's technical parameters, then analyzes several important factors affecting measurement accuracy, and concludes with the calibration services that can improve it, helping readers correctly evaluate and optimize this important indicator.

[b]Correctly Interpreting Instrument Technical Parameters[/b]

Correctly interpreting an instrument's technical parameters is the most fundamental step in understanding measurement accuracy. Because different instrument manufacturers use different terminology when defining measurement accuracy, or use similar terminology to mean different things, it is crucial to understand all of the parameters involved in specifying an instrument's characteristics.

Take the most common data acquisition card as an example. Many customers believe that all 12-bit data acquisition cards on the market have the same accuracy, but this confuses resolution with accuracy. Resolution refers to the smallest increment into which the full-scale signal can be divided after sampling. For example, a data acquisition card with a 12-bit ADC has a best-case resolution of 1/2^12 = 1/4096 of full scale. That is, with an input range of ±10 V (a 20 V span), the smallest voltage step it can distinguish is 20 V / 4096 ≈ 4.88 mV. In theory, the higher the resolution, the more finely the signal is quantized, and thus the more faithful and smoother the reconstructed signal.
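To make the resolution arithmetic concrete, here is a minimal Python sketch; the helper name and the 16-bit comparison are illustrative additions, not values from any particular datasheet.

```python
def adc_lsb_size(v_min, v_max, bits):
    """Smallest voltage step (one LSB) an ideal ADC can distinguish."""
    return (v_max - v_min) / 2**bits

# 12-bit ADC over +/-10 V: 20 V / 4096, roughly 4.88 mV per step
print(adc_lsb_size(-10.0, 10.0, 12) * 1e3)  # ~4.88 (mV)

# A 16-bit ADC over the same range resolves ~0.305 mV per step, but as
# discussed below, a finer step alone does not guarantee better accuracy.
print(adc_lsb_size(-10.0, 10.0, 16) * 1e3)  # ~0.305 (mV)
```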
Absolute accuracy, by contrast, refers to the absolute value of the maximum deviation between the measured value and the "true" value. Before the signal under test reaches the ADC, it must pass through other devices on the data acquisition board, such as the multiplexer (MUX) and the programmable gain amplifier (PGA). This path can introduce random noise, drift of the reference source with time and temperature, and nonlinearity errors before and after the gain stage. The combined effect of all these factors on the measurement result is what we call absolute accuracy. For customers, knowing the absolute accuracy of a data acquisition board is therefore even more important than knowing the number of bits in its ADC; a 16-bit data acquisition card can be less accurate than a well-designed 12-bit card. The technical parameter table shown in Figure 1 details the individual error terms, such as gain error, offset error, and noise uncertainty, along with the combined absolute accuracy, giving customers the complete information needed to assess the accuracy of the final measurement.

Digital multimeters (DMMs) are specified differently. The industry typically describes the resolution of a DMM by its number of digits, leading customers to assume that a 6.5-digit DMM necessarily resolves 6.5 digits. This is not the case: the digit count relates only to the number of digits the instrument displays, not to the smallest resolvable change in the input signal. It is therefore necessary to check whether the instrument's sensitivity and effective resolution are high enough to deliver the required measurement resolution. For example, a 6.5-digit DMM can represent a given range with 1,999,999 counts. However, if the instrument exhibits 20 counts of peak-to-peak noise, the smallest resolvable change is 0.52 × 20 counts (for Gaussian noise, the resolvable change is approximately the peak-to-peak noise, in volts or counts, multiplied by 0.52). In the presence of this noise, the true effective number of digits (ENOD) of the 6.5-digit DMM is:

ENOD = log10(1,999,999 / (0.52 × 20)) ≈ 5.3 digits

The accuracy of a DMM is usually expressed as ±(ppm of reading + ppm of range). For example, if a DMM set to its 10 V range measures a 7 V signal 90 days after calibration at 23 °C ± 5 °C, then according to its parameter table (see Figure 2) the accuracy in this case is ±(20 ppm of reading + 6 ppm of range) = ±(20 ppm of 7 V + 6 ppm of 10 V) = ±200 µV. In addition to the familiar digit count, users should thus understand the concept of effective number of digits and know how to calculate a DMM's measurement accuracy; these indicators are crucial to its real measurement performance.

As the two examples above illustrate, understanding the measurement accuracy and resolution requirements of an automated test system makes it possible to assess the overall error budget of the instrument system and verify that it meets the test requirements. Customers should also proactively consult instrument manufacturers to grasp the exact meaning of each specification in the datasheet and the true performance of the instrument.
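Both DMM calculations above are easy to script. The sketch below assumes the spec values quoted from Figure 2 and the 0.52 Gaussian-noise factor; the function names are illustrative.

```python
import math

def dmm_worst_case_error(reading, range_v, ppm_of_reading, ppm_of_range):
    """Worst-case error for a spec of +/-(ppm of reading + ppm of range)."""
    return reading * ppm_of_reading * 1e-6 + range_v * ppm_of_range * 1e-6

def effective_digits(total_counts, pp_noise_counts):
    """Effective number of digits (ENOD) given peak-to-peak noise in counts,
    using the 0.52 factor for Gaussian noise quoted above."""
    smallest_resolvable = 0.52 * pp_noise_counts
    return math.log10(total_counts / smallest_resolvable)

# 7 V reading on the 10 V range, +/-(20 ppm of reading + 6 ppm of range)
print(dmm_worst_case_error(7.0, 10.0, 20, 6))  # 0.0002 -> +/-200 uV

# 6.5-digit DMM (1,999,999 counts) with 20 counts of peak-to-peak noise
print(effective_digits(1_999_999, 20))         # ~5.28 effective digits
```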
[b]Several Important Factors Affecting System Accuracy[/b]

[b]Cooling and Heat Dissipation[/b]

Cooling and heat dissipation are crucial to equipment accuracy because operating temperature directly affects measurement performance. Good cooling not only ensures stable operation of the chassis and its modules but also improves the mean time between failures (MTBF) of the boards and power supplies. Some professional measurement bus standards, such as the PXI bus, impose strict cooling specifications, including the direction of airflow within the chassis and per-slot heat dissipation limits (see Figure 3), ensuring that the system completes its measurement tasks at normal operating temperatures.

[b]Power Management[/b]

Like cooling and heat dissipation, a stable and sufficient power supply is essential to measurement accuracy. Instrument manufacturers should specify the maximum current and corresponding power that the chassis supply can deliver on each voltage rail, and especially the power derating under extreme temperatures (e.g., above 50 °C) (see Figure 4). This helps users fully understand the system's power budget and thereby avoid measurement inaccuracies or even instrument malfunctions. Some professional instrument manufacturers also perform full-load testing during design and production to evaluate power distribution and heat dissipation, ensuring the measurement accuracy of each module in the chassis; these are additional safeguards for customers.
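As a sketch of how such a power budget might be checked, the snippet below assumes a purely hypothetical linear derating curve; the supply rating, derating slope, and per-module loads are invented for illustration and should be replaced with the values from the chassis and module datasheets (e.g., the curves in Figure 4).

```python
def available_power_w(rated_w, temp_c, derate_start_c=50.0, derate_w_per_c=8.0):
    """Usable chassis power under a hypothetical linear derating curve.

    Below derate_start_c the full rating is available; above it, capacity
    drops by derate_w_per_c watts per degree C. All numbers are illustrative.
    """
    if temp_c <= derate_start_c:
        return rated_w
    return max(0.0, rated_w - derate_w_per_c * (temp_c - derate_start_c))

# Hypothetical per-slot module loads taken from module datasheets (W)
module_loads_w = [28.0, 15.5, 9.0, 22.0]

budget = available_power_w(rated_w=350.0, temp_c=55.0)  # 350 W chassis at 55 C
print(f"budget {budget} W, load {sum(module_loads_w)} W, "
      f"ok: {sum(module_loads_w) <= budget}")
```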
[b]EMI/EMC Certification[/b]

EMI/EMC certification guarantees the safety and electromagnetic compatibility of the entire test system, as well as its measurement accuracy across operating environments. When purchasing instrument systems, customers should pay attention to the relevant safety certifications. Beyond the CE marking, which manufacturers can self-declare, it is important to check whether the system has passed certification by authoritative third-party organizations; common examples include FCC certification in North America, C-Tick certification in Australia, and Demko and TÜV certifications in the European Union. Some instrument manufacturers obtain as few third-party certifications as possible, or provide only the CE marking, in order to save costs; such systems inevitably carry some safety risk, and users need to be mindful of this when purchasing test systems.

[b]Cables and Wiring Methods[/b]

Cables and wiring, as essential components of an automated test system, significantly affect measurement accuracy, especially in low-current and low-voltage measurements. High-quality cables and professional wiring practices are therefore recommended to minimize noise pickup, improve the signal-to-noise ratio, and maximize measurement accuracy.

High-quality cables are crucial. 50/60 Hz power-line noise is perhaps the most common noise source, and using shielded or coaxial cables is the most effective way to eliminate this interference. Some customers hope to remove such noise by filtering in post-processing; in low-current measurements, however, 50/60 Hz power-line noise can easily saturate the sensitive preamplifier of the ammeter, after which no amount of filtering can restore the lost accuracy. Shielded cables must therefore be used. Leakage current is a small current to ground caused by degradation or contamination of the insulation (such as dirt). For low-voltage measurement applications, low-leakage, low-thermal-EMF cables are recommended: such cables use Teflon insulation on both the inner and outer layers, preserving their shielding and insulation properties and thus avoiding leakage current.

When a test system has fewer than 50 test points or only a few instruments, it is easy to connect the instruments and the device under test (DUT) using junction boxes or screw terminals. For hundreds or thousands of test points, multiple instruments, or large systems requiring reconfiguration or frequent switching, however, a professional mass interconnect system is often necessary. A mass interconnect system is a mechanical device that simplifies the connection of input and output signals between a large number of DUTs and the instruments. It usually consists of a mechanical receiver frame through which all signals can be quickly routed from the instruments (usually mounted in a rack) to the DUT. Mass interconnect systems also protect the cables at the instrument front end from wear and damage during repeated connect/disconnect cycles.

[b]The Importance of Calibration Services[/b]

The factors discussed so far are all considerations when building a test system. During use, however, the accuracy of the electronic components inside an instrument drifts over time. As shown in Figure 5, prolonged continuous operation and harsh environmental conditions exacerbate this drift, introducing significant uncertainty into measurements. Regular instrument calibration is essential to address this problem.

Calibration falls into two categories: external calibration and self-calibration. External calibration compares the instrument's current performance against a standard of known accuracy and adjusts the instrument's measurement capability so that its accuracy stays within the manufacturer's specified range. To perform external calibration, the instrument can be sent back to the manufacturer or to a calibration and metrology laboratory; alternatively, given suitable calibration conditions, it can be performed in-house. Whichever method is chosen, it is crucial to observe the external calibration interval specified by the manufacturer. For example, one manufacturer's function generator may have an external calibration interval of one year, while another manufacturer's function generator with equal or better accuracy specifications may have an interval of two years; to reduce the maintenance cost of an automated test system, customers should choose the latter. The external calibration interval is therefore a parameter worth weighing carefully during instrument selection.

In addition to external calibration, some manufacturers' instruments include highly practical self-calibration functions. Instruments with self-calibration capability contain precision hardware resources such as on-board voltage references, allowing quick and easy calibration at any time. This reduces measurement errors caused by environmental factors without removing the instrument from the test system or connecting it to external calibration equipment. Of course, self-calibration cannot completely replace external calibration; it merely provides a way to improve measurement accuracy between external calibration cycles.
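To show how a self-calibration can be triggered programmatically, here is a sketch using PyVISA and the optional IEEE 488.2 *CAL? common command; the VISA resource string is hypothetical, and not every instrument implements *CAL?, so consult the instrument manual for its actual self-calibration command.

```python
import pyvisa

rm = pyvisa.ResourceManager()
# Hypothetical address; substitute your instrument's actual resource string.
dmm = rm.open_resource("GPIB0::22::INSTR")
dmm.timeout = 120_000  # self-calibration can take a while (timeout in ms)

print(dmm.query("*IDN?").strip())  # identify the instrument

# *CAL? runs the internal self-calibration routine on instruments that
# support this optional IEEE 488.2 command and returns 0 on success.
result = int(dmm.query("*CAL?"))
print("self-calibration passed" if result == 0 else f"failed with code {result}")

dmm.close()
rm.close()
```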
In conclusion, measurement accuracy, as a complex and comprehensive characteristic, truly reflects an instrument system's ability to reproduce signal values and has become one of the important indicators for evaluating automated test systems. This article has discussed how to interpret instrument technical parameters, several important factors affecting measurement accuracy, and finally calibration services: the issues test engineers encounter most often during selection, and sometimes the ones most easily overlooked. It is hoped that this discussion raises engineers' awareness of the importance of measurement accuracy and offers helpful guidance in the selection of test systems.

Editor: He Shiping