A Review of Multi-Sensor Information Fusion Technology for Mobile Robots
2026-04-06
Abstract: Multi-sensor information fusion technology is currently a research hotspot in the field of mobile robots. This paper reviews the application and research progress of multi-sensor information fusion technology for mobile robots, examines its implementation methods in depth, and points out future development directions for the technology in this field.

Keywords: Mobile robot; Multi-sensor; Information fusion

Mobile robots are an important research branch of robotics. They are comprehensive systems integrating environmental perception, dynamic decision-making and planning, and behavior control and execution. With the continuing development of robot technology, the application scope and capabilities of mobile robots have expanded greatly. They are widely used not only in industry, defense, and services, but also in field operations, hazardous environments, extreme operations, and space applications, and have received close attention from countries around the world. Research on mobile robots began in the 1960s. Current topics such as perception-based position estimation, local and global navigation schemes, new methods for obstacle detection and avoidance, and multi-sensor information fusion have attracted wide attention from experts and scholars at home and abroad. Intelligent robots are robot systems that sense their environment and their own state through sensors, move autonomously toward targets in obstacle-filled environments, and thereby complete various tasks. Intelligentization is the development direction of mobile robots, and the development of sensor technology is a crucial foundation for realizing it.
Multi-sensor information fusion technology for mobile robots overcomes the inherent shortcomings of relying on a single sensor and has become a key technology in mobile robot intelligence research.

1. The Perception System of Mobile Robots

When working normally, mobile robots must not only monitor their own position, posture, speed, and internal system state, but also perceive their working environment, so that the robot's work sequence and operations can adapt naturally to environmental changes. Accurately acquiring external and internal state information is therefore essential for normal operation, improved work efficiency, energy saving, and accident prevention. Sensors used in mobile robots fall broadly into two categories: internal and external. Internal sensors monitor the internal state parameters of the robot system, such as power supply voltage and wheel position; they mainly include odometers, gyroscopes, magnetic compasses, and photoelectric encoders. External sensors perceive information about the environment, such as temperature, humidity, the color and texture of objects, and their distance from the robot. External sensors come in many types, mainly vision sensors, laser rangefinders, ultrasonic sensors, infrared sensors, and proximity sensors. The different sensors integrated on the mobile robot together form a multi-sensor information fusion perception system.

2. Implementation of Multi-Sensor Information Fusion in Mobile Robots

The multi-sensor information fusion methods currently used in the mobile robot field mainly include the weighted average method, Kalman filtering and extended Kalman filtering, Bayes estimation, Dempster-Shafer evidence reasoning, fuzzy logic, neural networks, and behavior-based and rule-based methods.
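Of the methods just listed, the weighted average (Section 2.1 below) is the simplest. A minimal sketch in Python, with invented sensor readings and weights:

```python
def weighted_average_fusion(readings, weights):
    """Fuse redundant sensor readings of the same quantity by a
    normalized weighted average."""
    total = sum(weights)
    return sum(r * w for r, w in zip(readings, weights)) / total

# Three range sensors measure the same distance (illustrative values);
# the weights reflect relative confidence in each sensor.
distance = weighted_average_fusion([2.02, 1.98, 2.10], [0.5, 0.3, 0.2])
```

The hard part in practice, as discussed below, is choosing the weights well, not computing the average.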
These methods can perform fusion at different levels (the data layer, feature layer, and decision layer) and can fuse ranging sensor information, internal dead-reckoning information, and global positioning information, yielding an accurate and comprehensive description of the measured object and environment that enables the mobile robot to make correct judgments and decisions.

2.1 Weighted Average Method

This method weights and averages the redundant information provided by a set of sensors and takes the weighted average as the fusion value. It is the simplest and most intuitive method for fusing low-level data from multiple sensors. Its biggest drawback is the difficulty of obtaining the optimal weights; determining them can require significant effort.

2.2 Kalman Filtering and Its Extensions

Kalman filtering is used for real-time fusion of dynamic, low-level redundant sensor data. The method recursively determines a statistically optimal estimate of the fused data from the statistical characteristics of the measurement model. If the system has a linear dynamic model and the system and sensor noise are Gaussian white noise, the Kalman filter provides the unique statistically optimal estimate of the fused data. Its recursive nature makes it computationally fast and keeps its storage requirements small. With the rapid development of computer technology, the computational cost and complexity of Kalman filtering no longer hinder its practical application, and the method is increasingly favored, showing unique advantages especially in multi-sensor, multi-target tracking systems. For example, Tomatis et al. used a Kalman filtering-based hybrid method to implement navigation for a mobile robot, and their experiments showed a success rate of 96% over a distance of 1.15 km.
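The recursive predict/update cycle at the heart of Kalman filtering can be sketched in one dimension. This is a minimal illustration, not the cited navigation system: the measurements and noise variances are invented, and a simple random-walk state model is assumed.

```python
def kalman_update(x, p, z, q, r):
    """One predict/update cycle of a scalar Kalman filter.
    x, p : prior state estimate and its variance
    z    : new sensor measurement
    q, r : process noise and measurement noise variances
    """
    # Predict: the state is assumed unchanged (random-walk model),
    # so only the uncertainty grows.
    p = p + q
    # Update: blend the prediction and the measurement using the
    # Kalman gain k, which weights whichever is more certain.
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p

# Fuse a sequence of noisy range readings (illustrative values).
x, p = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95]:
    x, p = kalman_update(x, p, z, q=0.01, r=0.1)
```

Note how the estimate converges toward the measurements while the variance p shrinks: the recursion needs only the previous estimate, which is why storage requirements stay minimal.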
In the same experiments, the robot's tracking error at the target point was only 9 mm. In practical engineering applications, nonlinearity in the system model or instability in data processing can significantly degrade the fusion process. In such cases, the Extended Kalman Filter (EKF) is used instead of the conventional Kalman filter. The EKF linearizes the nonlinear model around the current estimate and is an important method for real-time localization and navigation of mobile robots: sensor fusion and nonlinear model-based control methods are combined, with the EKF providing the optimal state estimate and minimizing the error caused by noise.

2.3 Bayes Estimation

Bayes estimation is a commonly used method for fusing low-level information from multiple sensors in static environments. Information is described as probability distributions, making the method suitable for uncertainty that can be modeled as additive Gaussian noise. This fusion method dates from the early stages of multi-sensor fusion technology. To apply it, one first describes the system model, then assigns a prior probability to each proposition, and finally performs probabilistic inference, estimating the confidence in each proposition from the sensor data to obtain the result. However, when new sensor information arrives and the number of unknown propositions exceeds the number of known propositions, the estimated probabilities of the known propositions become highly unstable. The method is mainly applied to state estimation of mobile robots and to the identification and tracking of moving targets.

2.4 Dempster-Shafer Evidence Reasoning

The concept of evidence reasoning was first proposed by Dempster in 1967 and was later developed and refined by his student Shafer.
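For the Gaussian case named in Section 2.3, Bayes fusion of two independent sensor estimates of the same quantity has a closed form: the posterior is again Gaussian, with precision-weighted mean. A minimal sketch (the sensor values are invented):

```python
def bayes_fuse_gaussian(m1, v1, m2, v2):
    """Fuse two independent Gaussian estimates (mean, variance) of the
    same quantity; returns the posterior mean and variance."""
    v = 1.0 / (1.0 / v1 + 1.0 / v2)   # posterior variance: precisions add
    m = v * (m1 / v1 + m2 / v2)       # precision-weighted mean
    return m, v

# Ultrasonic and infrared range estimates of the same wall (illustrative):
m, v = bayes_fuse_gaussian(2.0, 0.04, 2.2, 0.09)
```

The fused variance is always smaller than that of either sensor alone, which is the formal sense in which fusion improves on any single sensor.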
Dempster-Shafer evidence reasoning is an extension of the Bayes method, but differs from it. Bayes estimation represents the truth of a premise by a single probability value; when premises are correlated, it struggles to keep the estimates consistent. The Dempster-Shafer method instead uses an uncertainty interval, avoiding the Bayes method's reliance on prior probabilities for uncertain, unknown premises. Because the way Dempster-Shafer evidence reasoning frames and treats problems is particularly well suited to information fusion in multi-sensor integrated systems, it has become an important theoretical foundation of information fusion. In mobile robotics, the method has been successfully applied to target identification by mobile robots. Its advantage is that it does not require prior probabilities; its disadvantages are that the computational load is generally very large, and that how to obtain basic probability assignments effectively in practical engineering applications still requires further research. Reference [7] also points out that Dempster-Shafer theory merely accumulates individual information sources, and that when pieces of evidence are combined, the relationship between evidence weight and belief can be unreasonable; the theory therefore needs further study and improvement.

2.5 Fuzzy Logic and Artificial Neural Networks

Fuzzy logic can directly represent the uncertainty of the multi-sensor data fusion process within the reasoning itself. Target recognition fusion based on fuzzy rules is computationally very simple: a real number between 0 and 1 represents the degree of truth of each premise, and the premises are combined with fuzzy implication operators.
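Dempster's rule of combination, the core of the evidence reasoning described in Section 2.4, can be sketched as follows. The target classes and mass values are invented for illustration; note how the nested loop over all pairs of focal elements is what makes the computational load grow quickly in realistic systems.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.
    m1, m2: dicts mapping frozensets of hypotheses to mass (summing to 1)."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb   # mass assigned to the empty set
    # Renormalize by the non-conflicting mass.
    scale = 1.0 - conflict
    return {k: val / scale for k, val in combined.items()}

# Two sensors report evidence over targets {"person", "obstacle"};
# mass on the full set expresses ignorance, so no priors are needed.
m1 = {frozenset({"person"}): 0.6, frozenset({"person", "obstacle"}): 0.4}
m2 = {frozenset({"person"}): 0.5, frozenset({"obstacle"}): 0.2,
      frozenset({"person", "obstacle"}): 0.3}
fused = dempster_combine(m1, m2)
```

After combination, belief concentrates on "person", while the residual mass on the full set quantifies the remaining ignorance.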
Unlike the Dempster-Shafer method, however, fuzzy-rule fusion does not gradually raise the belief in plausible targets and lower the belief in implausible ones as evidence accumulates. In recent years, fuzzy set reasoning has been widely used in mobile robot target recognition and path planning; Sasiadek, for example, combined fuzzy logic with extended Kalman filtering to fuse sensor information. The artificial neural network method is an information processing approach that imitates the biological nervous system. A neural network consists of multiple layers of processing units connected in various ways; it applies nonlinear transformations to input data, thereby performing the data-to-attribute classification that cluster analysis techniques carry out. Multi-sensor information fusion based on neural networks has the following characteristics: it has a unified internal form of knowledge representation; through suitable learning algorithms, the sensor information acquired by the network can be fused into its network parameters; it can convert knowledge rules into numerical form, facilitating the construction of a knowledge base; it does not require a precise mathematical model of the system, making it well suited to nonlinear problems; and its large-scale parallel processing gives very fast information processing along with strong fault tolerance and robustness. Information fusion based on neural networks is essentially an uncertain reasoning process: it makes full use of information from the external environment to achieve automatic knowledge acquisition and associative reasoning, and through extensive learning and reasoning it fuses the complex relationships of an uncertain environment into symbols the system can understand.
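The fuzzy-rule fusion described earlier in this section can be sketched with simple membership functions. The "near" set, its breakpoints, and the sensor distances are all invented for illustration; the fuzzy AND (minimum) is one common choice of combination operator.

```python
def near(d):
    """Membership degree of distance d (meters) in the fuzzy set 'near':
    fully near below 0.5 m, not near beyond 2.0 m, linear in between."""
    if d <= 0.5:
        return 1.0
    if d >= 2.0:
        return 0.0
    return (2.0 - d) / 1.5

# Membership degrees from two range sensors observing the same obstacle.
mu_sonar = near(0.8)
mu_infrared = near(1.1)

# Fuse with the fuzzy AND (min): the obstacle is "near" only to the
# degree that both sensors support it.
mu_obstacle_near = min(mu_sonar, mu_infrared)
```

The fused degree of truth stays between 0 and 1 and can feed directly into a fuzzy rule such as "IF obstacle is near THEN slow down", with no probability model required.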
Research on neural networks thus provides a good approach to multi-sensor information fusion; their nonlinear approximation capability is particularly valuable here, typically exploited with a three-layer perceptron model trained by the backpropagation (BP) algorithm. In multi-sensor fusion for mobile robots, neural networks are currently used mainly for target recognition, for obtaining accurate estimates of obstacles from images, and for correctly guiding robot motion; neural-network fusion can also address the problem of autonomous walking in mobile robots. To improve the performance and speed of neural-network fusion, the fusion structure can be decomposed into an array of smaller sub-networks. However, for large numbers of training samples, traditional network structures require very many hidden nodes, and even many hidden layers, incurring a heavy computational burden; limited by computer processing speed, this leads to poor real-time performance, a problem that urgently needs further resolution.

3. Development Trends

With the rapid development of electronics and VLSI technology, sensor architectures are evolving toward parallel structures. Developing software and hardware with parallel computing capability, to meet the large data volumes and complex computations of multi-sensor information fusion, is therefore one of the main development trends of the technology.
The main hardware development directions for multi-sensor information fusion are: researching integrated circuit chips capable of processing multi-sensor information, continuing to develop new sensors for mobile robots, and continuing to standardize sensor models and interfaces. Many multi-sensor fusion algorithms exist today, but most assume a stationary random process with a linear, normally distributed model. Further research is therefore needed on novel fusion algorithms that improve the performance of multi-sensor fusion systems and handle nonlinear and non-stationary information fusion. Artificial intelligence (AI) gives systems flexibility and understandability, enabling them to handle complex problems. AI research will play a significant role in sensor selection, automatic task error detection and recovery, and related areas; the application of AI in multi-sensor information fusion is currently a research hotspot at home and abroad. For mobile robots in unknown environments, multi-sensor information fusion primarily addresses autonomous localization and navigation. Current results on autonomous localization and environment modeling based on multi-sensor fusion are mostly limited to indoor structured environments. Issues such as the robustness of decision rules, the effectiveness of sensor placement, the adaptability of biosensor methods, and the integrated treatment of self-localization, motion planning and control, and robot dynamics still require in-depth research. In particular, mobile robot technology for unstructured environments will be a key focus of future robot development.

4. Conclusion

Multi-sensor information fusion technology is one of the key technologies for intelligent mobile robots.
With the development of sensor technology and the improvement of information fusion techniques, mobile robots' ability to perceive environmental information and to make system-level decisions will continue to improve. The continuing development of sensing, intelligence, and computing technologies will drive mobile robots toward full intelligence and autonomy. Mobile robots will surely come to play human-like roles in fields such as hazardous environments, extreme operations, and space, becoming true friends of mankind.