
Machine Vision Motion Control Integrated Machine Application Examples | Online Contour Extraction and Contour Trajectory Processing (Part 2)

2026-04-06 02:55:41 · #1

Application Background

In practical machine vision applications, it is often necessary to extract the contours of samples of various shapes within the camera's field of view and then machine along the extracted contour trajectory.

In the previous lesson, we covered the visual contour extraction part of online contour extraction and contour trajectory processing. In this lesson, we will share how to process the contour trajectory based on the extracted contour positions.

01 Detection Principle

(I) Preliminary Preparation Materials

1. One nine-point calibration board

2. One VPLC516E

3. One 1.3-megapixel camera

4. One 16mm lens

5. One ring light

6. One XY machining training platform

7. Camera power cord, network cable, and several power cords

8. One laser head

(II) Software Algorithm Implementation Steps

1. First, enable the motors via the bus scanning program to confirm that basic motor operation is normal. Building on the contour trajectory extraction and camera calibration functions from the previous chapters, we add a motor motion function so that the contour of any sample can be obtained from the image and the XY machine can move along that contour.

2. After confirming that the machine runs normally, home the machine to ensure its coordinates are correct. A nine-point calibration board is then needed: place it in the camera's field of view and obtain the center coordinates (X, Y) of the nine points on the image using the "ZV_CALGETSCAPTS" command.

3. Next, move the machine to align with each of the nine image centers, generating nine machine coordinates. This example aligns a laser point with the nine centers to generate the corresponding machine coordinates (Note: the alignment must follow the same order as the nine center coordinates in the image).

4. Pass the nine image center coordinates and the nine machine coordinates to the "ZV_CALCAM" command to establish the calibration coefficients. The "ZV_CALTRANSW" command can then use these coefficients to convert image trajectory coordinates into the corresponding machine coordinates.

5. Finally, the machine traverses the converted contour trajectory using the "MOVEABS" linear motion command.
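As a language-agnostic illustration of steps 4 and 5, the sketch below fits a 2×3 affine calibration matrix from nine image/machine point pairs by least squares and then maps a pixel contour to absolute machine targets. `fit_affine`, `play_contour`, and the `moveabs` callback are illustrative names for this sketch, not the controller's actual ZV_CALCAM / ZV_CALTRANSW / MOVEABS API.

```python
import numpy as np

def fit_affine(image_pts, machine_pts):
    """Least-squares fit of a 2x3 affine matrix A with machine ~= A @ [u, v, 1].

    This mirrors what a nine-point calibration (ZV_CALCAM-style) computes.
    """
    # Design matrix [u, v, 1] for every image point, shape (9, 3)
    M = np.hstack([np.asarray(image_pts, float),
                   np.ones((len(image_pts), 1))])
    # Solve M @ A.T ~= machine_pts in the least-squares sense
    A_t, *_ = np.linalg.lstsq(M, np.asarray(machine_pts, float), rcond=None)
    return A_t.T                                    # shape (2, 3)

def play_contour(A, contour_px, moveabs):
    """Convert each pixel point to machine coordinates (ZV_CALTRANSW-style)
    and issue one absolute linear move per point (MOVEABS-style)."""
    for u, v in contour_px:
        x, y = A @ np.array([u, v, 1.0])
        moveabs(x, y)
```

In a real setup `moveabs` would drive the axes; for testing, it can simply collect the generated (x, y) targets in a list.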

02 Software Implementation

1. Open ZDevelop software: Open the project file "Contour.zpj" from the previous course → Create a new file "EtherCat.bas" for EtherCAT bus initialization → Create a new file "vision.bas" for writing nine-point calibration related functions → Create a new file "motion.bas" for writing XY machine tool related operation functions → Add the files to the project.

2. The main Basic commands used:

A. Machine vision: nine-point calibration commands.

The pixel coordinates of the sample contour trajectory are converted into world coordinates using the calibration coefficients.

a. ZV_CALGETSCAPTS: obtain the coordinates of the mark points on the solid-circle calibration plate.

b. ZV_CALCAM: establish the calibration coefficients.

c. ZV_CALTRANSW: convert pixel coordinates to world coordinates.

B. Motion control: the MOVEABS linear motion command.

Linear interpolation motion is performed based on the coordinate data obtained by converting the pixel coordinates of the sample contour trajectory into world coordinates.
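The linear interpolation itself is the controller's job, but its effect can be sketched in Python: walk each straight segment of the world-coordinate trajectory in increments no larger than a chosen step. The function name and `step` parameter are illustrative, not part of the controller's API.

```python
import math

def interpolate_segment(p0, p1, step):
    """Yield evenly spaced points along the straight line from p0 to p1,
    at most `step` apart; the end point is always included."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    # Number of increments needed so no single move exceeds `step`
    n = max(1, math.ceil(math.hypot(dx, dy) / step))
    for i in range(1, n + 1):
        yield (p0[0] + dx * i / n, p0[1] + dy * i / n)
```

Chaining this over consecutive contour points approximates the motion profile that MOVEABS-style linear moves would produce between targets.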

Full code and sample image download address

03 Operation Demonstration

(I) Operating Procedures

→ First, connect to the controller IP and download the project to the controller.

→ Home the machine on the XY training platform.

→ Scan the camera; it captures an image of the nine-point calibration plate.

→ Click "Camera Calibration" to extract the 9 solid-circle mark points.

→ Click "Auto" to watch the laser head align with the points automatically (it moves the laser point to the nine center positions in turn, based on the X and Y coordinates of the first mark point's center and the grid spacing, generating the corresponding machine coordinates).

→ Click "Calibrate" to compute the calibration coefficients used to convert pixel coordinates into world coordinates. The calibration is now complete.

→ Click to capture an image of a sample.

→ Select the ROI type to use.

→ Click "Extract Contour" to obtain the coordinate information of the sample contour.

→ Click "Motion Contour Position" to see the laser point move along the sample contour. The contour trajectory processing is now complete.

→ End.
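The auto-alignment step above derives the remaining machine positions from the first point and the grid spacing, which amounts to generating a 3×3 grid of targets. The function name `nine_point_grid`, its row-major order, and the single `pitch` value are assumptions for illustration; the real routine must follow the order of the image-side centers.

```python
def nine_point_grid(x0, y0, pitch):
    """Machine targets for a 3x3 calibration grid, row-major from the
    first point (x0, y0), spaced `pitch` apart on both axes."""
    return [(x0 + col * pitch, y0 + row * pitch)
            for row in range(3) for col in range(3)]
```

Pairing each of these targets with the matching image-side center yields the nine point pairs that the calibration command consumes.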

(II) Effect Demonstration

This concludes our presentation on the application of the Zheng Motion Technology Machine Vision Motion Control All-in-One Machine: Online Contour Extraction and Contour Trajectory Processing (Part 2).

For more exciting content, please follow the "Zheng Motion Assistant" WeChat official account. For related development environment and example code, please contact Zheng Motion's technical sales engineer: 400-089-8936.

This article is original content from Zheng Motion Technology. We welcome everyone to reprint it for mutual learning and to jointly improve China's intelligent manufacturing level. Copyright belongs to Zheng Motion Technology. Please indicate the source if you reprint this article.

