
AI at the Edge: How Collaborative Robots Quickly Process Sensor Data

2026-04-06

Whether it's a traditional industrial robot system or today's most advanced collaborative robots (Cobots), they all rely on sensors that generate large amounts of highly variable data. This data helps build better machine learning (ML) and artificial intelligence (AI) models. Robots then use these models to become "autonomous," making real-time decisions and navigating in dynamic, real-world environments.

Industrial robots typically operate in "enclosed" environments, where they cease movement for safety reasons if humans enter. However, this limitation on human/robot collaboration prevents many benefits from being realized. Robots with autonomous operation capabilities can support safe and efficient coexistence between humans and robots.

Sensing and intelligent perception are crucial for robotic applications because the high performance of robotic systems, especially ML/AI systems, largely depends on the performance of the sensors that provide these systems with critical data. The vast number of increasingly sophisticated and accurate sensors available today, combined with systems capable of fusing data from all these sensors, can support robots with ever-improving perception and awareness.

The development of AI

Robotics automation has long been a revolutionary technology in manufacturing, and integrating AI into robots will undoubtedly bring about significant changes in robotics technology in the coming years. This article explores some of the key development trends in robotics, automation, and the most important technologies for tightly linking AI and the data it requires to achieve intelligence. It also discusses how to use and integrate different sensors in AI systems.

Pushing robots' AI processing to the edge

ML comprises two main parts: training and inference, which can be performed on entirely different processing platforms. Training is typically done offline on a desktop or in the cloud and involves feeding large datasets into a neural network. At this stage, real-time performance is not a concern. The result of the training phase is a trained AI system that is then deployed to perform specific tasks, such as investigating bottlenecks on an assembly line, counting and tracking people in a room, or determining whether an invoice is forged.

However, for AI to realize its potential applications across many industries, sensor data fusion must be performed in real-time or near real-time during inference (executing trained ML algorithms). To achieve this, designers need to implement ML and deep learning models at the edge, deploying inference capabilities into embedded systems.
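The train-offline / infer-at-the-edge split described above can be sketched in a few lines. This is an illustrative toy (the names `train_offline`, `infer_at_edge`, and the one-feature logistic model are assumptions, not a real framework): the expensive gradient loop runs in the cloud, and only the cheap linear decision rule ships to the embedded device.

```python
import math

def train_offline(samples, labels, epochs=100, lr=0.1):
    """Cloud/desktop phase: fit a one-feature logistic model by
    stochastic gradient ascent. Slow and iterative, but offline."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            w += lr * (y - p) * x
            b += lr * (y - p)
    return w, b  # the "trained model" shipped to the edge device

def infer_at_edge(model, x):
    """Edge phase: a single multiply-add and compare, cheap enough
    for real-time execution on an embedded target."""
    w, b = model
    return 1 if w * x + b > 0 else 0
```

The key design point is the asymmetry: training touches the whole dataset many times, while inference is a fixed, bounded computation per sensor reading, which is what makes real-time edge deployment feasible.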

For example, consider deploying collaborative robots in the workplace (as shown in Figure 1) to work closely with humans. These robots need to use data from near-field and vision sensors to ensure they avoid harming humans while supporting them in activities that are difficult for people to perform. All of this data needs to be processed in real time, but the cloud cannot provide the real-time, low-latency response that collaborative robots require. To overcome this bottleneck, today's advanced AI systems are being developed for the edge, meaning the intelligence resides in edge devices on the robot itself.

This distributed AI model relies on highly integrated processors that have:

• A comprehensive set of peripheral devices for interfacing with various sensors

• High-performance processing capabilities to run machine vision algorithms

• Hardware acceleration for deep learning inference

In addition, all these functions must work efficiently, with relatively low power consumption and in a relatively small footprint, so that they can be deployed at the edge.
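The three requirements above (sensor interfaces, vision processing, accelerated inference) meet in a single edge control loop. A minimal sketch, with all sensor names, units, and thresholds being illustrative assumptions rather than any specific product's API:

```python
def read_sensors():
    """Stand-in for the peripheral interfaces (I2C, SPI, camera bus)
    that a real SoC would expose; values here are fixed examples."""
    return {"tof_m": 1.2, "ultrasonic_m": 1.25, "radar_speed_mps": 0.4}

def fuse(readings):
    """Simple fusion: average two independent distance estimates and
    pass the radar's closing-speed estimate through."""
    distance = (readings["tof_m"] + readings["ultrasonic_m"]) / 2.0
    return distance, readings["radar_speed_mps"]

def decide(distance, speed, stop_margin_m=0.5, slow_margin_m=2.0):
    """Toy policy: stop when an object is close, slow down when
    something is nearby and approaching, otherwise run normally."""
    if distance < stop_margin_m:
        return "STOP"
    if distance < slow_margin_m and speed > 0:
        return "SLOW"
    return "RUN"
```

In a real system the `decide` step would be the output of a trained model rather than two thresholds, but the shape of the loop (read, fuse, decide, every cycle, with bounded latency) is exactly what the cloud cannot guarantee and the edge must.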

As ML becomes more widespread, the availability of power- and size-optimized "inference engines" is also increasing. These engines are hardware products specifically designed to perform ML inference.

Integrated System-on-Chip (SoC) is often a good choice in the embedded space because, in addition to wrapping various processing elements that can run deep learning inference, SoC also integrates many of the necessary components that make embedded applications complete.

Let's analyze the current trends in robot development.

Collaborative robots

Traditional industrial robots operate behind perimeter guarding and are generally inaccessible to humans. In contrast, collaborative robots are designed to interact safely with humans while operating, moving slowly and gracefully.

According to the technical specification ISO/TS 15066, a collaborative robot is a robot capable of being used in a collaborative operation. Collaborative operation means that the robot and a human work concurrently within a defined workspace to perform production tasks (this excludes robot-plus-robot systems, and human-robot setups that share a location but operate at different times). Defining and deploying collaborative robots requires anticipating potential contact between the robot's physical components (including functional extensions such as end-of-arm tooling) and the operator. More importantly, it relies on sensors to determine the operator's precise position and speed.

Collaborative robot manufacturers must implement high levels of environmental sensing and redundancy in their robotic systems to quickly detect and prevent potential collisions. Integrated sensors connected to the control unit will sense an imminent collision between the robotic arm and a person or other object, and the control unit will immediately shut down the robot. The robot will also shut down if any sensor or its electronic circuitry malfunctions.
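The redundancy rule described above (shut down on an imminent collision, and also on any sensor or electronics fault) can be sketched as a single fail-safe check. Channel layout and the distance threshold are illustrative assumptions:

```python
def safety_check(channels, min_distance_m=0.3):
    """channels: list of (distance_m_or_None, self_test_ok) tuples,
    one per redundant sensor. Returns True only if motion may
    continue; any fault or near object forces a shutdown."""
    for distance, self_test_ok in channels:
        if not self_test_ok or distance is None:
            return False  # sensor or electronics malfunction: stop
        if distance < min_distance_m:
            return False  # imminent collision detected: stop
    return True
```

Note the fail-safe bias: the function defaults to stopping on any abnormal input, mirroring the requirement that a malfunctioning sensor shuts the robot down just as a detected collision would.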

Logistics robots

Logistics robots are mobile devices that operate in environments with or without human presence, such as warehouses, distribution centers, ports, or industrial parks. Logistics robots pick up goods and carry them to packing stations, or transport goods from one building to another within a company site; some logistics robots can also pick and pack. These robots typically move within specific environments and require sensors for localization, mapping, and collision prevention (especially with humans).

Until recently, most logistics robots used predefined routes; now they are able to adjust their navigation based on the positions of other robots, people, and goods. Ultrasonic, infrared, and LiDAR sensors are currently in use. Given the robot's mobility, the internal control unit typically communicates wirelessly with a central remote control system. Advanced technologies currently employed in logistics robots include machine learning logic, human-robot collaboration, and environmental analysis.

Rising labor costs and stringent government regulations have spurred the wider adoption of logistics robots. Their popularity has also increased as the cost of components such as equipment and sensors has decreased, and the cost (and time required) of integration has also trended downwards.

Last-mile delivery robot

The "last mile" of delivery is the final step in the logistics process of moving products from warehouse shelves to the customer's doorstep: the moment when the goods are finally delivered to the buyer's door. This is not only crucial to customer satisfaction, but last-mile delivery is also costly and time-consuming.

Last-mile delivery costs account for a large portion of overall freight costs: making last-mile delivery more efficient has become a focus of the development and implementation of new robotic technologies that can drive process improvements and increase efficiency.

Sensor technology for AI in robots

As robotics advances, complementary sensor technology evolves with it. Much like the five human senses, combining different sensing technologies can provide optimal results when deploying robotic systems in constantly changing, uncontrolled environments. Even the simplest robotic tasks rely on 3D machine vision to feed data into AI systems. Without machine vision capable of reconstructing 3D images, and without AI translating that visual information into successful robotic actions, it is impossible to grasp an object whose position and motion are not predetermined.

The most popular and relevant sensor technologies used today to support AI in robotics include:

• Time-of-Flight (ToF) optical sensor: This sensor is based on the ToF principle and uses photodiodes (a single sensor element or an array) and active illumination to measure distance. The light wave reflected from an obstacle is compared to the emitted wave to measure the delay, which represents the distance. This data helps create 3D maps of objects.

• Temperature and humidity sensors: Many robots need to measure temperature, and sometimes also the humidity of their environment and components, including motors and the main AI motherboard, to ensure they operate within safe limits.

• Ultrasonic sensors: In overly bright or completely dark environments, a robot's visual sensors can fail. By transmitting ultrasonic waves and listening for the echoes reflected back from objects (similar to how bats echolocate), ultrasonic sensors work well in both dark and bright environments, overcoming the limitations of optical sensors.

• Vibration sensors: Industrial vibration sensing is a core component of preventive maintenance and condition monitoring. Integrated electronics piezoelectric (IEPE) sensors are the most commonly used vibration sensors in industrial environments.

• Millimeter-wave sensors: Millimeter-wave sensors use radio waves and their echoes to determine the direction and distance of moving objects by measuring three factors: speed, angle, and range. This helps robots take preventive action based on how quickly an object approaches the sensor. Radar sensors perform exceptionally well in dark environments and can sense through materials such as drywall, plastic, and glass.
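The physical relations behind three of the sensor bullets above reduce to simple round-trip formulas. A sketch, assuming ideal timing: ToF uses light (distance = c·Δt/2), ultrasound uses sound whose speed varies with air temperature (the standard linear approximation v ≈ 331.3 + 0.606·T°C, which is also why the temperature sensing mentioned above matters to other sensors), and a radar recovers radial speed from the Doppler shift via v = f_d·λ/2. The 77 GHz carrier is a common industrial radar band, used here as an example default.

```python
SPEED_OF_LIGHT_MPS = 299_792_458.0

def tof_distance_m(round_trip_delay_s: float) -> float:
    """ToF: the emitted light travels to the obstacle and back,
    so distance is half of (speed of light x measured delay)."""
    return SPEED_OF_LIGHT_MPS * round_trip_delay_s / 2.0

def speed_of_sound_mps(temp_c: float) -> float:
    """Linear approximation of the speed of sound in air."""
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance_m(echo_delay_s: float, temp_c: float = 20.0) -> float:
    """Same round-trip idea as ToF, but with temperature-compensated
    speed of sound instead of the speed of light."""
    return speed_of_sound_mps(temp_c) * echo_delay_s / 2.0

def doppler_speed_mps(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial speed from the Doppler shift of the radar echo:
    v = f_d * wavelength / 2, with wavelength = c / carrier."""
    wavelength_m = SPEED_OF_LIGHT_MPS / carrier_hz
    return doppler_shift_hz * wavelength_m / 2.0
```

For instance, a 10 ns optical round trip corresponds to roughly 1.5 m, and at 77 GHz a Doppler shift of about 514 Hz corresponds to a closing speed of about 1 m/s.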

While humans will still perform most tasks on factory floors, robots will adapt to human work and increase automation. To achieve this, they need more AI capability to identify and adapt to situations in real time, which is only possible when AI runs at the edge.

