
DCS System Embedded Test Design Based on CodeTest Tool

2026-04-06 06:22:48

Abstract: This paper introduces the CodeTest embedded testing tool and a testing scheme based on it for embedded testing of large-scale DCS systems.

Keywords: embedded testing, CodeTest tool, DCS system

With the development of DCS systems, there is an urgent need for a tool that can perform real-time online testing and analysis of DCS system software during the integration and system testing phases, to ensure system performance and reliability.

The long-term stability and real-time requirements of DCS systems place extremely stringent demands on the software quality delivered by manufacturers. Furthermore, the distributed nature of DCS systems makes integration and system-level testing very challenging. This paper presents a testing scheme for DCS distributed systems that simultaneously tests the various system modules distributed across multiple computers in a network (each computer running several modules), monitoring key performance indicators such as coverage, memory leaks, and runtime performance. The testing tool used is the CodeTest embedded testing tool from Metrowerks, Inc.

1. DCS System Overview

DCS systems are generally physically distributed control systems with two basic structures: bus network and star network. Some DCS customers with a small production scale and modest system requirements may integrate the server, engineer station, and operator station onto a single machine; even then, counting the control stations and the software running on each machine, the system as a whole remains physically and logically distributed. Taking the bus network structure as an example, the system structure is shown in Figure 1.

2. Overview of CodeTest Embedded Testing Tool

CodeTest boasts powerful test analysis capabilities. Thanks to its improvements and enhancements in software tracking technology and bus data capture, CodeTest possesses robust performance analysis, memory analysis, advanced coverage analysis, and code tracing functions.

The CodeTest tool comes in three main versions. The pure hardware version has long been obsolete because it could not meet users' needs; the remaining two are a pure software version and a hardware-assisted software version, of which the hardware-assisted version is the most capable.

Pure software testing tools depend on two essential mechanisms: instrumentation functions inserted into the code and a preprocessing pass. The inserted functions increase code size, which can impact system efficiency. However, with continuous improvements in CPU speed and storage technology, pure software solutions remain viable.

3. Design of the Embedded Test Scheme for the DCS System

The DCS system is quite complex: the server comprises 15 lib files and 20 exe tasks, while the operator station has 4 dll projects and 6 exe tasks. These modules form a real-time whole at the management network layer, so exercising one program or one test case inevitably affects other tasks. For example, writing a value from the operator station to the I/O control station changes the on/off state of a valve: the value is transmitted to the real-time database, recorded in the operation history, and then sent to the system network driver; the gateway.exe and GatewayMonitor modules, which communicate with the I/O station, forward it to the field control station. The engineer station is mainly used for offline configuration and has more than a dozen dll and exe projects; during project configuration, multiple modules run simultaneously, and during a download, the download task module runs alongside the server and operator station programs (at a minimum the operator station and server daemons). Collecting complete coverage data under these conditions is very difficult, because a single tester action triggers code in multiple modules on several machines. The CodeTest testing tool, combined with a carefully designed test scheme, solves this problem.

3.1 CodeTest Testing Methodology (Pure Software Version)

When testing with the pure software version of CodeTest, first use CodeTest to instrument (mark points), generating an .exe or other executable file. Then, run CodeTest's ctserver.exe on machine A, where the test program is installed, and set the port for collecting test data. Next, run CodeTest Manager (ctmgr) on machine B (A and B can also be the same machine), create a workspace, specify the instrumentation file, memory check target file, port, and the IP address of the machine where ctserver is located, connect to ctserver, and execute it. Finally, run the program C.exe to be tested on machine A. The execution status, performance, coverage, and memory leak detection data of C.exe are collected in CodeTest Manager's Software Probe. CodeTest Manager provides a user-friendly window interface, allowing you to view the runtime coverage of each function and file, as well as save, export, and merge test results.

3.2 Analysis and Design of a Small Test Plan

Figure 1 has already shown the architecture of the DCS system. Here we combine it with CodeTest to design a test plan.

To make this easier to understand, let's start with a simple design example: suppose a small software system runs on machine A and machine B. Machine A runs two processes (task modules), A1.exe and A2.exe; A1.exe uses the ALIB1.lib and ALIB2.lib library files, and A2.exe uses the A.dll dynamic link library. B.exe runs on machine B, and operations on B.exe trigger the two processes A1 and A2 on machine A.

We are now conducting system testing on the system composed of three task modules, A1, A2, and B, monitoring important test indicators such as coverage, memory leaks, and operating performance.

The test plan is shown in Figure 2. Set up machine C (which can also be machine A or machine B) to collect test data.

The testing system for this simple system is already quite complex. For a DCS integrated automation control system with more than 60 projects and at least 20 processes running simultaneously, the testing plan is even more complicated, and there are many more issues to consider.

There are still some challenges to be addressed in the subsystem testing plan shown in Figure 2:

(1) For A1 and A2, how can we simultaneously collect code-execution test data from the processes and from the lib static libraries or dll dynamic link libraries they call, and check whether the library code leaks memory? After some exploration, the solution is as follows: use CodeTest's append tagging mode to tag A1 and A2 together with their library files into a single symbol database file (the IDB file generated by CodeTest tagging; the append tagging command format is -CTidb=E:\importan\test.idb; for the many finer points, refer to the user manual and the software's built-in help), and use one ctserver and one communication port to collect the test data. Note that to trace the execution status of each line of code in CodeTest Manager's Coverage Data, the path of each source file must be added to Source Code Directories in the Configuration window.
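The append-tagging flow just described can be sketched as a command sequence. This is a hypothetical illustration: the instrumenter name ctinstr and the project names are assumptions, and only the -CTidb= option format comes from the text above; consult the CodeTest user manual for the exact invocation:

```
rem Hypothetical sketch: tag A1, A2 and their libraries into one
rem shared symbol database, appending to the same IDB on every run.
ctinstr -CTidb=E:\importan\test.idb ALIB1_project
ctinstr -CTidb=E:\importan\test.idb ALIB2_project
ctinstr -CTidb=E:\importan\test.idb A1_project
ctinstr -CTidb=E:\importan\test.idb A2_project
```

The point of append mode is that all four runs accumulate into the same test.idb, so one ctserver port can serve the whole group.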

(2) A1 and A2 may have been developed by two different engineers who may not want their test data mixed together. In this case, two ctserver instances can be run on machine A, on two different ports, to collect the test data separately. An additional Software Probe should also be opened in CodeTest Manager with the corresponding configuration. When instrumenting, the two processes should be instrumented separately, each generating its own IDB symbol database file.

3.3 Test Plan for Large-Scale DCS Integrated Automation Control System

The testing scheme for large-scale DCS integrated automation control systems is similar to that of the smaller system above, but the impact of the instrumentation functions on the DCS system must be considered. To mitigate this impact, a dedicated high-configuration computer H (with 1.5 GB of memory) is used to run CodeTest Manager and collect test data from the various modules of the server, operator stations, and engineer stations. This way, the server, operator stations, and engineer stations only need to run the test data collection server (ctserver), significantly reducing the additional burden on the system under test.

Computer H was chosen as the central location for test data, primarily based on the following considerations:

(1) Test data is centralized, and test reports can be directly exported for analysis. In particular, for modules with low coverage, test managers and development engineers can identify from the code-execution data which functions lack corresponding test cases, and hand these to test engineers to further enrich the test cases.

(2) Lower testing costs. Centralized collection of test information reduces workload. It is also constrained by the CodeTest license: at the time there was only one network card and one license, so CodeTest Manager could run on only one machine. With better resources, using several computers to collect data from the server, operator station, and engineer station separately would yield better test results and the least impact on the software system, but at a correspondingly higher cost.

In summary, the test plan for the DCS system is shown in Figure 3.

As shown in Figure 3, a relatively large number of ctservers are used, mainly for two reasons. First, the system has many modules, and many of these modules are developed and maintained by different developers and tested by different test engineers. Using different ctservers allows for the separation of collected test information, facilitating the analysis and discussion of test cases, bug analysis, and the analysis of test intensity. Second, each module in the system undertakes different tasks or performs certain functions, thus providing convenience for functional testing.

3.4 Implementation of Embedded Testing Scheme for DCS System

At this point, the test plan design is complete. Guided by the experience gained on the small example system, the implementation itself presents few difficulties. Following the CodeTest testing process, instrumentation is done first, followed by system setup. Because the system is large, with numerous .exe and library files, instrumentation is demanding and requires a significant amount of work; however, once instrumentation is complete and the .exe files are generated, those executables can be reused for all subsequent testing. The system source code must also be present on the machine where CodeTest Manager runs, so that when tracing code execution, every page and line of source can be viewed. The main difficulties encountered were as follows:

(1) Instrumentation difficulties: the system uses a large number of library files, each of which is a VC project, and the key point is that each library is included by multiple EXE projects. To avoid inconsistency between the IDB symbol database and the instrumented programs once the test system is built, instrumentation must be performed separately for each EXE. For each EXE project, first determine the library files it depends on, instrument each library's VC project in IDB append mode, and then append the symbol database of the instrumented EXE project at the end.

(2) Difficulty operating the test system: the system has a large number of processes, and even more once multiple server processes are added. System startup, especially server startup, must follow a strict sequence, and starting the servers manually would be a painful process. The solution is to use Windows scripts. For example, to start two processes in sequence, the method is as follows:
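A minimal Windows batch sketch of such a startup script (the paths are illustrative assumptions; gateway.exe and GatewayMonitor are the module names mentioned earlier in the text):

```bat
@echo off
rem Launch the two server processes in order. "start" spawns a
rem process and returns, so the second launch follows immediately
rem after the first process has been created.
start "" "D:\DCS\server\gateway.exe"
start "" "D:\DCS\server\GatewayMonitor.exe"
```

Where one process must finish before the next may start, start /wait can be used instead, which blocks until the launched process exits.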

In summary, CodeTest offers effective testing solutions for distributed and embedded systems, with the hardware-assisted software version being the more powerful option. CodeTest supports designing different test plans at various stages of testing and can also serve as an auxiliary tool during software development.
