
Abstract

In recent years, the benefits of multi-sensor fusion have motivated research in a variety of application areas. Redundant and complementary sensor data can be fused and integrated using multi-sensor fusion techniques to enhance system capability and reliability. This paper provides an overview of the paradigm of multi-sensor integration and fusion. Applications of multi-sensor fusion in robotics and other areas, such as biomedical systems, equipment monitoring, remote sensing, and transportation systems, are presented. Finally, future research directions of multi-sensor fusion technology, including micro-sensors, smart sensors, and adaptive fusion techniques, are addressed.

Introduction
A sensor is a device that detects or senses the value, or changes in value, of the variable being measured. The term sensor is sometimes used in place of the terms detector, primary element, or transducer. The fusion of information from sensors with different physical characteristics, such as light and sound, enhances our understanding of the surroundings and provides the basis for planning, decision making, and control of autonomous and intelligent machines. Sensors are used to provide a system with useful information concerning features of interest in the system's environment. Multi-sensor integration and fusion refers to the synergistic combination of sensory data from multiple sensors to provide more reliable and accurate information. The potential advantages of multi-sensor integration and fusion are the redundancy, complementarity, timeliness, and cost of the information. The integration or fusion of redundant information can reduce overall uncertainty and thus serve to increase the accuracy with which features are perceived by the system. Multiple sensors providing redundant information can also serve to increase reliability in the case of sensor error or failure. Complementary information from multiple sensors allows features in the environment to be perceived that are impossible to perceive using just the information from each individual sensor operating separately. More timely information may be provided by multiple sensors due to the processing parallelism that may be achieved as part of the integration process. Multi-sensor integration and fusion is a rapidly evolving research area and requires interdisciplinary knowledge in control theory, signal processing, artificial intelligence, probability and statistics, and related fields. There has been much research on multi-sensor integration and fusion in recent years, and a number of researchers have reviewed multi-sensor fusion algorithms, architectures, and applications. Luo and Kay reviewed the general paradigms, fusion techniques, and specific sensor combinations for multi-sensor integration and fusion; multi-sensor-based mobile robots and applications in industrial, space, and navigation domains were also surveyed. Hall and Llinas presented an overview of multi-sensor data fusion technology, the JDL fusion process model, and military and nonmilitary applications. Dasarathy reviewed various characterizations of sensor fusion in the literature and proposed an input/output representation of the fusion process. Varshney presented an introduction to multi-sensor data fusion, including its conceptual framework, system architectures, and applications. The above-mentioned papers and the references therein provide a framework for the study of multi-sensor integration and fusion.

Sensors Evolution
A sensor is a device that responds to some external stimulus and then provides a useful output. With this concept of input and output, one can begin to understand how sensors play a critical role in both closed and open loops. One problem is that sensors are not perfectly specific: they tend to respond to a variety of stimuli applied to them without being able to differentiate one from another. Nevertheless, sensors and sensor technology are necessary ingredients in any control-type application. Without the feedback from the environment that sensors provide, the system has no data or reference points, and thus no way of understanding what is right or wrong with its various elements. Sensors are especially important in automated manufacturing, particularly in robotics. Automated manufacturing is essentially the procedure of removing the human element as much as possible from the manufacturing process. Sensors in the condition-measurement category sense various types of inputs, conditions, or properties to help monitor and predict the performance of a machine or system and to support the control of autonomous and intelligent machines.

Sensor And Sensor Technology In The Past


The earliest examples of sensors were not inanimate devices but living organisms. A more recent example is the use of living organisms in the early days of coal mining in the United States and Europe to warn miners of dangerous conditions. Robots must have the ability to sense and discriminate between objects. They must then be able to pick up these objects, position them properly, and work with them without damaging or destroying them. An intelligent system equipped with multiple sensors can interact with and operate in an unstructured environment without the complete control of a human operator. Because the system is operating in a largely unknown environment, it may lack sufficient knowledge concerning the state of the outside world, and storing large amounts of data may not be feasible. Given a dynamically changing world and unforeseen events, it is usually difficult to know the state of the world in advance. Sensors allow a system to learn the state of the world as needed and to cautiously update its own model of the world.

Sensors Principle
A sensor is defined as a measurement device that can detect characteristics of an object through some form of interaction with it. Sensors can be classified into two categories: contact and non-contact. A contact sensor measures the response of a target to some form of physical contact; this group of sensors responds to touch, force, torque, pressure, temperature, or electrical quantities. A non-contact sensor measures the response brought about by some form of electromagnetic radiation; this group of sensors responds to light, x-ray, acoustic, electric, or magnetic radiation.
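As a small illustration of this two-way classification, the following sketch (hypothetical sensor names, standard library only) models a sensor catalogue tagged by interaction type and groups a few of the examples mentioned above.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Interaction(Enum):
    """Broad sensing principle from the classification above."""
    CONTACT = auto()      # touch, force, torque, pressure, temperature, electrical
    NON_CONTACT = auto()  # light, x-ray, acoustic, electric or magnetic radiation


@dataclass
class Sensor:
    name: str
    measured_quantity: str
    interaction: Interaction


SENSORS = [
    Sensor("strain gauge", "force", Interaction.CONTACT),
    Sensor("thermocouple", "temperature", Interaction.CONTACT),
    Sensor("video camera", "light intensity", Interaction.NON_CONTACT),
    Sensor("sonar", "acoustic range", Interaction.NON_CONTACT),
]

# Group sensors by sensing principle.
by_principle = {}
for s in SENSORS:
    by_principle.setdefault(s.interaction, []).append(s.name)

print(by_principle)
```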

Paradigm Of Multi-sensor Fusion And Integration


Multi-sensor integration is the synergistic use of the information provided by multiple sensory devices to assist in the accomplishment of a task by a system. Multi-sensor fusion refers to any stage in the integration process where there is an actual combination of different sources of sensory information into one representational format. The separate operation of a sensor still influences the other sensors indirectly through the effects the sensor has on the system controller and the world model. Guiding or cueing refers to the type of sensory processing in which the data from one sensor are used to guide or cue the operation of other sensors. The results of the sensory processing functions serve as inputs to the world model. A world model is used to store information concerning any possible state of the environment in which the system is expected to operate; it can include both a priori information and recently acquired sensory information. High-level reasoning processes can use the world model to make inferences that direct subsequent processing of the sensory information and the operation of the system controller. Sensor selection refers to any means used to select the most appropriate configuration of sensors from among the sensors available to the system.
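A minimal sketch of the guiding/cueing idea under these definitions, using hypothetical sensor and world-model names: a coarse wide-area sensor cues a more precise sensor, and both results update a shared world model.

```python
import random


class WorldModel:
    """Stores prior and recently acquired information about the environment."""
    def __init__(self):
        self.objects = []

    def update(self, observation):
        self.objects.append(observation)


def wide_area_sensor():
    # Hypothetical coarse detector: returns an approximate region of interest.
    return {"region": (random.uniform(0, 10), random.uniform(0, 10))}


def precise_sensor(region):
    # Hypothetical fine sensor cued to the region suggested by the first sensor.
    x, y = region
    return {"position": (x + random.gauss(0, 0.05), y + random.gauss(0, 0.05))}


world = WorldModel()
cue = wide_area_sensor()                # first sensor produces a cue
detail = precise_sensor(cue["region"])  # second sensor is guided by that cue
world.update({**cue, **detail})         # both results are stored in the world model
print(world.objects)
```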

Multi-sensor Fusion
The fusion of data or information from multiple sensors, or from a single sensor over time, can take place at different levels of representation. The different levels of multi-sensor fusion can be used to provide a system with information serving a variety of purposes. For example, signal-level fusion can be used in real-time applications and can be considered just an additional step in the overall processing of the signals; pixel-level fusion can be used to improve the performance of many image-processing tasks such as segmentation; and feature- and symbol-level fusion can be used to provide an object recognition system with additional features that increase its recognition capabilities.
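As a concrete, toy illustration of pixel-level fusion (not a method prescribed by the text), the sketch below assumes NumPy and two already-registered images of the same size and simply averages their pixel values with a chosen weight.

```python
import numpy as np


def fuse_pixels(img_a: np.ndarray, img_b: np.ndarray,
                weight_a: float = 0.5) -> np.ndarray:
    """Pixel-level fusion of two registered images by weighted averaging."""
    if img_a.shape != img_b.shape:
        raise ValueError("images must be registered to the same size")
    return weight_a * img_a.astype(float) + (1.0 - weight_a) * img_b.astype(float)


# Toy example: a visual image and a (hypothetical) thermal image of the same scene.
visual = np.random.randint(0, 256, size=(4, 4))
thermal = np.random.randint(0, 256, size=(4, 4))
print(fuse_pixels(visual, thermal, weight_a=0.7))
```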

Multi-sensor integration
Hierarchical structures, such as the sensory and control hierarchy, the logical sensor network, and the JDL model, are useful in allowing for the efficient representation of the different forms, levels, and resolutions of the information used in integration. Modularity in the operation of the integration functions enables much of the processing to be distributed across the system. The object-oriented programming paradigm and the distributed blackboard control structure are two constructs that are especially useful in promoting modularity for multi-sensor integration. The use of the artificial neural network formalism allows adaptability to be directly incorporated into the integration process.

The diagram shown in the figure represents multi-sensor integration as a composite of basic functions. A group of n sensors provides input to the integration process. In order for the data from each sensor to be used for integration, they must first be effectively modeled. A sensor model represents the uncertainty and error in the data from each sensor and provides a measure of their quality that can be used by the subsequent integration functions; a common assumption is that this uncertainty can be adequately modeled as a Gaussian distribution. The data from each sensor can then be integrated into the operation of the system in accord with three different types of sensory processing: fusion, separate operation, and guiding or cueing. Sensor registration refers to any of the means used to make the data from each sensor commensurate with the data from the other sensors.

(Figure: functional diagram of multi-sensor integration and fusion.)
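One simple way to realize such a sensor model is to attach an estimated bias and error variance to each sensor's readings so that later fusion steps can weight the data by quality; the sketch below uses hypothetical names and numbers.

```python
from dataclasses import dataclass


@dataclass
class SensorModel:
    """Represents the uncertainty and error of one sensor's measurements."""
    name: str
    bias: float      # systematic error estimated during calibration
    variance: float  # measurement noise variance (quality measure)

    def correct(self, raw_value: float) -> tuple[float, float]:
        """Return a bias-corrected value together with its variance."""
        return raw_value - self.bias, self.variance


# Two hypothetical range sensors with different quality.
sonar = SensorModel("sonar", bias=0.10, variance=0.04)
laser = SensorModel("laser", bias=0.01, variance=0.0025)

print(sonar.correct(2.45))  # (2.35, 0.04)
print(laser.correct(2.31))  # (2.30, 0.0025)
```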

Multi-sensor systems
The span of possible multi-sensor systems can be described by the product of three variables, 1) sensor, 2) property, and 3) data, with two possible values, single or multiple, for each variable. This yields a total of eight different configurations. For instance, the single sensor, single property, single data configuration is an example of a system having only one sensor (e.g., one visual image obtained by a video camera). The single sensor, single property, multiple data configuration is one in which a single sensor records a property as a function of time (e.g., a sequence of images describing a dynamic scene). An instance of the multiple sensor, single property, single data configuration is a system with many range finders employed for redundancy purposes. The multiple sensor, multiple property, multiple data configuration is the most general and complex; an example is an autonomous robot with several sensors.

There are several different methods for combining multiple data sources; among them are deciding, guiding, averaging, Bayesian statistics, and integration. 1. Deciding is the use of one of the data sources during a certain stage of the fusion process; usually the decision as to which source to use is based upon some confidence measure or on the most dominant or most certain data. 2. Averaging is the combination of several data sources, possibly with weighted values. This type of fusion ensures that all sensors play a role in the fusion process, but not all to the same degree. 3. Guiding is the use of one or more sensors to focus the attention of another sensor on some part of the scene. An example of guiding is the use of intensity data to locate objects in a scene, and then the use of a tactile sensor to explore some of the objects in more detail. 4. Integration is the delegation of various sensors to particular tasks. For instance, the intensity image may be used to find objects, the range image can then be used to determine their distances, and a tactile sensor can be used to help locate and pick up the close objects for further inspection. In this case, the data are not fused but are used in succession to complete a task; therefore, there is no redundancy in the sensor measurements.

Approaches to sensor fusion can be put into one general framework. In this framework the sensors are shown as circles, and their outputs are denoted by x1, x2, ..., xn. Corresponding to each sensor i there is an input transformation, denoted by fi and shown as an oval. The input transformation could be the identity transformation, which does nothing to the input (the input and output are the same). On the other hand, it could be a simple operation such as edge detection, or a more complex task such as object recognition, which outputs a list of possible interpretations of the objects present in the scene. The fusion is performed in the large rectangular block, and a number of possible fusion strategies can be used. The simplest fusion strategy is one in which raw sensor measurements of the same property obtained by multiple sensors are combined. For instance, focus and stereo range data can be combined using Bayes' rule, as in the sketch below. In another case, sonar and infrared depth measurements can be combined using simple if-then rules, or range and intensity edge maps can be fused using the logical AND operation. On the other hand, a more complex fusion strategy might use a weighted least-squares fit to determine an object's location and orientation from multiple sensor measurements.
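For the Bayes-rule case just mentioned, if the focus and stereo range estimates are each modeled as independent Gaussians, their combination reduces to the standard product-of-Gaussians (inverse-variance weighted) update. The sketch below uses made-up numbers.

```python
def fuse_gaussians(mean_a, var_a, mean_b, var_b):
    """Fuse two independent Gaussian estimates of the same quantity (Bayes' rule)."""
    fused_var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    fused_mean = fused_var * (mean_a / var_a + mean_b / var_b)
    return fused_mean, fused_var


# Hypothetical range estimates from focus and stereo (metres, variance in m^2).
focus_range, focus_var = 1.95, 0.09
stereo_range, stereo_var = 2.10, 0.01

mean, var = fuse_gaussians(focus_range, focus_var, stereo_range, stereo_var)
print(f"fused range = {mean:.3f} m, variance = {var:.4f}")
# The fused variance is smaller than either input variance,
# illustrating how redundant measurements reduce uncertainty.
```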

The Role Of Multi-sensor Integration And Fusion In Intelligent Systems


In the operation of an intelligent system, the role of multi-sensor integration and fusion can be understood with reference to the type of information that integrated multiple sensors can uniquely provide to the system. The potential advantages gained through the synergistic use of this multi-sensory information can be decomposed into a combination of four fundamental aspects: the redundancy, complementarity, timeliness, and cost of the information. Before discussing these aspects, this section first defines the distinction between the notions of integration and fusion of multi-sensor information; it then presents a general pattern of multi-sensor integration and fusion within the context of an overall system architecture to highlight some of the important functions in the integration process.

A. Multi-sensor integration versus fusion


Multi-sensor integration, as defined in this paper, refers to the synergistic use of the information provided by multiple sensory devices to assist in the accomplishment of a task by a system. An additional distinction is made between multi-sensor integration and the more restricted notion of multi-sensor fusion. Multi-sensor fusion, as defined in this paper, refers to any stage in the integration process where there is an actual combination (or fusion) of different sources of sensory information into one representational format. (This definition would also apply to the fusion of information from a single sensory device acquired over an extended time period.) Although the distinction between fusion and integration is not standard in the literature, it serves to separate the more general issues involved in the integration of multiple sensory devices at the system architecture and control level from the more specific issues involving the actual fusion of sensory information. For example, in many integrated multi-sensor systems the information from one sensor may be used to guide the operation of other sensors in the system without ever actually fusing the sensors' information.

B. A general pattern
While the fusion of information takes place at the nodes in the figure, the entire network structure, together with the integration functions shown as part of the system, is part of the multi-sensor integration process. In the figure, n sensors are integrated to provide information to the system. The outputs x1 and x2 from the first two sensors are fused at the lower left-hand node into a new representation x1,2.

The output x3 from the third sensor could then be fused with x1,2 at the next node, resulting in the representation x1,2,3, which might in turn be fused at nodes higher in the structure. In a similar manner, the outputs from all n sensors could be integrated into an overall network structure. The dashed lines from the system to each node represent any of the possible signals sent from the integration functions within the system. The three functions shown in the figure are some of the functions typically used as part of the integration process: sensor selection can select the most appropriate group of sensors to use in response to changing conditions, sensory information can be represented within the world model, and the information from different sensors may need to be transformed before it can be fused or represented in the world model. Shown along the right side of the figure is a scale indicating the level of representation of the information at the corresponding level in the network structure. The transformation from lower to higher levels of representation as the information moves up through the structure is common to most multi-sensor integration processes. At the lowest level, raw sensory data are transformed into information in the form of a signal. Through a series of fusion steps, the signal may be transformed into progressively more abstract numeric or symbolic representations. This signals-to-symbols paradigm is common in computational vision.
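A minimal sketch of this pairwise network structure, using an illustrative fusion rule (a plain average at each node): x1 and x2 are fused first, the result is fused with x3, and so on up the structure.

```python
from functools import reduce


def fuse_pair(a: float, b: float) -> float:
    """Hypothetical node-level fusion rule; here a plain average of two inputs."""
    return 0.5 * (a + b)


def fuse_network(outputs: list[float]) -> float:
    """Fuse sensor outputs pairwise up the structure: ((x1, x2), x3), ..."""
    return reduce(fuse_pair, outputs)


# Outputs x1..x4 of four hypothetical sensors observing the same quantity.
x = [2.0, 2.2, 1.9, 2.1]
print(fuse_network(x))
```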

C. Potential advantages in integrating multiple sensors


The purpose of external sensors is to provide a system with useful information concerning features of interest in the system's environment. The potential advantages of integrating and/or fusing information from multiple sensors are that the information can be obtained more accurately, can concern features that are impossible to perceive with individual sensors, and can be obtained in less time and at a lesser cost. These advantages correspond, respectively, to the notions of the redundancy, complementarity, timeliness, and cost of the information provided to the system. 1. Redundant information is provided by a group of sensors (or a single sensor over time) when each sensor is perceiving, possibly with a different fidelity, the same features in the environment. The integration or fusion of redundant information can reduce overall uncertainty and thus increase the accuracy with which the features are perceived by the system. Multiple sensors providing redundant information can also serve to increase reliability in the case of sensor error or failure.

2. Complementary information from multiple sensors allows features in the environment to be perceived that are impossible to perceive using just the information from each individual sensor operating separately. If the features to be perceived are considered dimensions in a space of features, then complementary information is provided when each sensor is only able to provide information concerning a subset of features that form a subspace in the feature space, i.e., each sensor can be said to perceive features that are independent of the features perceived by the other sensors; conversely, the dependent features perceived by sensors providing redundant information would form a basis in the feature space (see the sketch at the end of this subsection). 3. More timely information, as compared to the speed at which it could be provided by a single sensor, may be provided by multiple sensors due either to the actual speed of operation of each sensor or to the processing parallelism that may be achieved as part of the integration process. 4. Less costly information, in the context of a system with multiple sensors, is information obtained at a lesser cost when compared to the equivalent information that could be obtained from a single sensor. Unless the information provided by the single sensor is being used for additional functions in the system, the total cost of the single sensor should be compared to the total cost of the integrated multi-sensor system. The role of multi-sensor integration and fusion in the overall operation of a system can be defined in terms of the degree to which each of these four aspects is present in the information provided by the sensors to the system. Redundant information can usually be fused at a lower level of representation than complementary information because it can more easily be made commensurate. Complementary information is usually either fused at a symbolic level of representation or provided directly to different parts of the system without being fused. While in most cases the advantages gained through the use of redundant, complementary, or more timely information serve to improve a system's performance, in at least one case fused information was used in a distributed network of target-tracking sensors simply to reduce the bandwidth required for communication between groups of sensors in the network.
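To make the redundancy/complementarity distinction concrete, the toy sketch below (hypothetical feature names and values) treats each sensor's output as a partial feature vector: dimensions seen by both sensors (redundant) are averaged, while dimensions only one sensor can observe (complementary) simply fill out the feature space.

```python
# Each sensor reports only the feature dimensions it can perceive.
camera = {"color": 0.8, "width_m": 0.52}      # visual features
tactile = {"width_m": 0.49, "hardness": 0.9}  # contact features

fused = {}
for key in set(camera) | set(tactile):
    values = [d[key] for d in (camera, tactile) if key in d]
    # Redundant dimensions (seen by both) are averaged; complementary ones pass through.
    fused[key] = sum(values) / len(values)

print(fused)  # e.g. {'color': 0.8, 'width_m': 0.505, 'hardness': 0.9} (key order may vary)
```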

Applications Of Multi-sensor Fusion And Integration


In recent years, the benefits of multi-sensor fusion have motivated research in a variety of application areas, as described below.

1 Robotics
Robots with multi-sensor fusion and integration enhance their flexibility and productivity in industrial applications such as material handling, part fabrication, inspection, and assembly. Mobile robots present one of the most important application areas for multi-sensor fusion and integration. When operating in an uncertain or unknown environment, integrating and fusing data from multiple sensors enables mobile robots to achieve quick perception for navigation and obstacle avoidance. The MARGE mobile robot, for example, is equipped with multiple sensors. Perception, position location, obstacle avoidance, vehicle control, path planning, and learning are necessary functions for an autonomous mobile robot.

The Honda humanoid robot is equipped with an inclination sensor that consists of three accelerometers and three angular rate sensors. Each foot and wrist is equipped with a six-axis force sensor, and the robot head contains four video cameras. Multi-sensor fusion and integration of vision, tactile, thermal, range, laser radar, and forward-looking infrared sensors plays a very important role in robotic systems.

(Figures: the Honda humanoid robot; a five-fingered robotic hand holding an object in the field of view of a fixed camera; the MARGE mobile robot with a variety of sensors.)

2 Military applications
Multi-sensor fusion is used in intelligence analysis, situation assessment, force command and control, avionics, and electronic warfare. It is employed for tracking targets such as missiles, aircraft, and submarines.

3 Remote sensing
Applications of remote sensing include monitoring climate, the environment, water sources, soil, and agriculture, as well as discovering natural resources and fighting the import of illegal drugs. Fusing or integrating the data from passive multispectral sensors and active radar sensors is necessary for extracting useful information from satellite or airborne imagery.

4 Biomedical applications

Multi-sensor fusion techniques can enhance automatic cardiac rhythm monitoring by integrating electrocardiogram and hemodynamic signals. Redundant and complementary information from the fusion process can improve the performance and robustness of the detection of cardiac events, including ventricular activity and atrial activity.

5 Transportation systems
Transportation systems such as automatic train control systems, intelligent vehicle and highway systems, GPS-based vehicle systems, and aircraft landing and navigation tracking systems utilize multi-sensor fusion techniques to increase reliability, safety, and efficiency.

Future Research Directions


1 Fault detection
Fault detection has become a critical aspect of advanced fusion system design. Failures normally produce a change in the system dynamics and pose a significant risk. Many innovative detection methods have been developed.
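As an illustrative sketch (not a method prescribed by the text), one common and simple approach is residual-based detection: compare each sensor's reading against the consensus of the other redundant sensors and flag a fault when the deviation greatly exceeds the normal spread.

```python
def detect_faulty_sensors(readings, threshold=3.0):
    """Flag readings that deviate strongly from the consensus of the others.

    readings: dict of sensor name -> measurement of the same quantity.
    threshold: allowed deviation in units of the (crudely estimated) noise scale.
    """
    faulty = []
    for name, value in readings.items():
        others = [v for n, v in readings.items() if n != name]
        consensus = sum(others) / len(others)
        spread = max(1e-6, max(others) - min(others))  # crude noise scale
        if abs(value - consensus) > threshold * spread:
            faulty.append(name)
    return faulty


# Three redundant temperature sensors; sensor_c has drifted.
print(detect_faulty_sensors({"sensor_a": 21.2, "sensor_b": 21.4, "sensor_c": 35.0}))
```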

2 Micro sensors and smart sensors


Successful application of a sensor depends on its performance, cost, and reliability. A large sensor may have excellent operating characteristics, but its marketability can be severely limited by its size. Reducing the size of a sensor often increases its applicability through the following: 1) lower weight and greater portability, 2) lower manufacturing cost and fewer materials, and 3) a wider range of applications. Clearly, fewer materials are needed to manufacture a small sensor, but the cost of materials processing is often a more significant factor. The revolution in semiconductor technology has enabled the production of small, reliable processors in the form of integrated circuits, and microelectronic applications have led to a considerable demand for small sensors, or micro-sensors, that can fully exploit the benefits of IC technology. Smart sensors can integrate sensing with processing hardware and software. According to the definition proposed by Breckenridge and Husson, a smart sensor must possess three features: 1) perform a logical, computable function, 2) communicate with one or more other devices, and 3) make a decision using logical or fuzzy sensor data.
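A toy sketch of those three features in code (all names and thresholds hypothetical): the sensor computes an on-board function, makes a simple decision on its own data, and formats a message for other devices.

```python
class SmartSensor:
    """Toy smart sensor: computes, communicates, and makes a simple decision."""

    def __init__(self, name, alarm_threshold):
        self.name = name
        self.alarm_threshold = alarm_threshold

    def compute(self, raw_samples):
        # 1) Perform a logical/computable function: here, on-board averaging.
        return sum(raw_samples) / len(raw_samples)

    def decide(self, value):
        # 3) Make a decision from the (possibly noisy) sensor data.
        return "ALARM" if value > self.alarm_threshold else "OK"

    def communicate(self, value, status):
        # 2) Communicate with other devices; here, just a formatted message.
        return f"{self.name}: value={value:.2f}, status={status}"


node = SmartSensor("temp-node-1", alarm_threshold=80.0)
avg = node.compute([78.1, 79.4, 84.3])
print(node.communicate(avg, node.decide(avg)))
```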

3 Adaptive multi-sensor fusion


In general, multi-sensor fusion requires exact information about the sensed environment. In the real world, however, precise information about the sensed environment is scarce and sensors are not always perfectly functional. Therefore, algorithms that are robust in the presence of various forms of uncertainty are necessary. Researchers have developed adaptive multi-sensor fusion algorithms to address the uncertainties associated with imperfect sensors.
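A minimal sketch of one such adaptive scheme (an illustrative choice, not a specific algorithm from the surveyed literature): each sensor's fusion weight is set inversely proportional to the variance of its recent residuals, so a degrading sensor is automatically down-weighted.

```python
import statistics


def adaptive_weights(recent_errors):
    """Weight each sensor by the inverse variance of its recent residuals."""
    inv_vars = {name: 1.0 / max(statistics.pvariance(errs), 1e-9)
                for name, errs in recent_errors.items()}
    total = sum(inv_vars.values())
    return {name: w / total for name, w in inv_vars.items()}


def fuse(readings, weights):
    """Weighted combination of the current readings."""
    return sum(weights[name] * value for name, value in readings.items())


# Hypothetical residual histories: sensor_b has become noisy/imperfect.
history = {"sensor_a": [0.02, -0.01, 0.03], "sensor_b": [0.9, -1.1, 1.4]}
w = adaptive_weights(history)
print(w)
print(fuse({"sensor_a": 5.02, "sensor_b": 6.3}, w))
```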

Conclusion
Sensors play an important role in our everyday life because we need to gather information and process it for various tasks. Successful application of a sensor depends on its performance, cost, and reliability. The paradigm of multi-sensor fusion and integration, together with fusion techniques and sensor technologies, is applied in micro-sensor-based applications in robotics, defence, remote sensing, equipment monitoring, biomedical engineering, and transportation systems. Directions for future research in multi-sensor fusion and integration target micro-sensors and adaptive fusion techniques. This overview may be of interest to researchers and engineers studying the rapidly evolving field of multi-sensor fusion and integration.

Bibliography
1. Ren C. Luo, Chin Chen Yih, and Kuo Lan Su, "Multi-sensor Fusion and Integration: Approaches, Applications, and Future Research Directions," IEEE Sensors Journal, Vol. 2, No. 2, April 2002, pp. 107-118.
2. Encyclopedia of Instrumentation and Control, p. 610.
3. Paul Chasmpran, "Sensors, Evolution," International Encyclopedia of Robotics: Applications and Automation, Vol. 3, pp. 1505-1516.
4. M. Rahimi and P. A. Hancock, "Sensors, Integration," International Encyclopedia of Robotics: Applications and Automation, Vol. 3, pp. 1523-1531.
5. Kevin Hartwig, "Sensors, Principles," International Encyclopedia of Robotics: Applications and Automation, Vol. 3, pp. 1532-1536.
