What is sensor fusion?

On its face, sensor fusion, also known as multi-sensor data fusion, sounds simple: two or more sensors are better than one, so mash them together and you have sensor fusion. But the software and algorithms required to actually fuse the data will quickly have you rethinking that “simple” label.

What is sensor fusion?

Essentially, sensor fusion aims to overcome the limitations of individual sensors by gathering and fusing data from multiple sensors to produce more reliable information with less uncertainty. This more robust information can then be used to make decisions or take action. The technology behind the magic curtain is deceptively simple: a microcontroller runs software algorithms that aggregate, or fuse, data from various sensors to paint a more complete picture of the process or situation at hand. The idea is that this more comprehensive understanding offers more, and deeper, insights, which can in turn drive new, more intelligent, and more accurate responses.
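
As a concrete, minimal sketch of that idea, the snippet below fuses two noisy readings of the same quantity by weighting each by the inverse of its variance, so the more trustworthy sensor counts for more. The sensor types and noise figures are hypothetical, chosen only for illustration.

```python
# Minimal sketch: inverse-variance weighted fusion of two noisy sensors.
# All values below are made-up illustration numbers, not real sensor specs.

def fuse(reading_a: float, var_a: float,
         reading_b: float, var_b: float) -> tuple[float, float]:
    """Fuse two measurements of the same quantity.

    Each reading is weighted by the inverse of its variance, so the
    less noisy sensor contributes more. Returns the fused estimate
    and its variance, which is smaller than either input variance.
    """
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * reading_a + w_b * reading_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Example: an ultrasonic and an infrared range sensor measure one distance.
estimate, variance = fuse(2.10, 0.04, 1.95, 0.01)
print(f"fused distance: {estimate:.3f} m, variance: {variance:.4f}")
```

Note that the fused variance here (0.008) is lower than either sensor’s alone, which is exactly the “less uncertainty” the definition promises.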

The algorithms of sensor fusion

The complex nature of sensor fusion comes into play when we start to talk about the algorithms in sensor fusion software, as well as the various categories and defining characteristics of this nuanced technology. As the algorithms grow more complex, so do the software and processing costs.

  • Kalman Filter: This prediction-correction filtering algorithm is the most widely used in sensor fusion and is particularly useful in navigation and positioning technology (see the sketch after this list).
  • Bayesian Network: Based on Bayes’ rule, where the focus is probability, these algorithms predict the likelihood of contributing factors from multiple hypotheses.
  • Central Limit Theorem (CLT): With the law of large numbers at its core, CLT-based algorithms average many samples or readings to home in on an accurate estimate; the distribution of those averages tends toward a bell curve.
  • Convolutional Neural Network: These algorithms fuse image recognition data from multiple sources to classify results.
  • Dempster-Shafer: Considered a generalized version of Bayesian theory, these algorithms use uncertainty management and inference mechanisms and closely mirror human reasoning and perception.
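
Since the Kalman filter heads this list, here is a minimal one-dimensional sketch of its predict-correct cycle, assuming a constant-state model. The process noise, measurement noise, and readings are illustrative values, not parameters from any particular system.

```python
# Minimal sketch of the Kalman filter's predict-correct cycle in one
# dimension (e.g., smoothing a noisy position reading). The noise
# parameters q and r and the readings are assumed, illustrative values.

def kalman_1d(measurements, q=1e-3, r=0.25, x0=0.0, p0=1.0):
    x, p = x0, p0             # state estimate and its variance
    estimates = []
    for z in measurements:
        # Predict: the state is modeled as constant, so only the
        # uncertainty grows, by the process noise q.
        p = p + q
        # Correct: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)       # gain approaches 1 when the prediction is uncertain
        x = x + k * (z - x)   # pull the estimate toward the measurement
        p = (1.0 - k) * p     # the corrected estimate is more certain
        estimates.append(x)
    return estimates

readings = [1.2, 0.9, 1.1, 1.0, 1.3, 0.95]
print(kalman_1d(readings))    # successive estimates settle near 1.0
```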

Classifications of sensor fusion

To further illustrate the nuances of sensor fusion, here’s an overview of the various ways to classify this technology:

Data origination

  • Direct Fusion: The fusion of sensor data from heterogeneous or homogeneous sensors.
  • Indirect Fusion: The fusion of data from existing environmental or human sources.

The location of data fusion

  • Centralized: Data is sent to a central location where it is fused and processed.
  • Decentralized: Data is captured onsite where it is fused and processed.

Sensor configurations: the flow of information between sensors

  • Complementary: When “sensors do not directly depend on each other but can be combined to give a more complete image,” which is useful in motion recognition tasks.
  • Competitive or Redundant: When each sensor “delivers independent measurements of the same property,” which is helpful in error correction, for example (see the sketch after this list).
  • Cooperative: When information from independent sensors is used to “derive information that would not be available from single sensors,” which is useful in studying human motion in research and medicine, for example.
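
To make the competitive (redundant) configuration concrete, here is a minimal sketch in which three sensors measure the same property and a simple median vote masks a single faulty reading. The readings and the failure are invented for illustration; real systems often use more elaborate voting or outlier rejection.

```python
# Minimal sketch of competitive/redundant fusion: three sensors measure
# the same property, and a median vote masks one faulty reading.
from statistics import median

def fuse_redundant(readings: list[float]) -> float:
    """Return the median of redundant measurements of one property.

    With three sensors, a single arbitrary failure (stuck, dropped,
    or wild value) cannot pull the fused output outside the range of
    the healthy readings.
    """
    return median(readings)

# Hypothetical temperature sensors; the third has failed high.
print(fuse_redundant([21.4, 21.6, 85.0]))  # -> 21.6, the fault is masked
```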

[Figure: Block diagram for sensor fusion.]

Categories or levels of sensor fusion

As can be expected, the uses and applications of sensor fusion are wide-ranging. The following levels, which come from the Joint Directors of Laboratories (JDL) data fusion model, describe the stages of processing in a sensor fusion system:

  • Level 0: Data alignment
  • Level 1: Entity assessment (for object detection, etc.)
  • Level 2: Situation assessment
  • Level 3: Impact assessment
  • Level 4: Process refinement
  • Level 5: User refinement

Data type

The type of information used as input to the algorithms can also define the level of sensor fusion.

  • Data level: Raw data from various sources is used to feed the fusion algorithm.
  • Feature level: Information or features extracted from a variety of individual sensors are used to feed the fusion algorithm.
  • Decision level: Once data- and feature-level sensor fusion have occurred, decision-level fusion selects a hypothesis from a set of hypotheses (see the sketch after this list).
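
As a small illustration of decision-level fusion, the sketch below assumes each sensor pipeline has already produced a hypothesis with a confidence score, and it simply selects the hypothesis with the highest combined score. The detector names and scores are hypothetical.

```python
# Minimal sketch of decision-level fusion: each sensor pipeline has
# already made a (hypothesis, confidence) decision; we select the
# hypothesis with the largest summed confidence. Values are illustrative.
from collections import defaultdict

def fuse_decisions(decisions):
    """decisions: iterable of (hypothesis, confidence) pairs."""
    scores = defaultdict(float)
    for hypothesis, confidence in decisions:
        scores[hypothesis] += confidence
    return max(scores, key=scores.get)

votes = [
    ("pedestrian", 0.80),  # e.g., a camera-based classifier
    ("pedestrian", 0.55),  # e.g., a lidar-based classifier
    ("cyclist",    0.70),  # e.g., a radar-based classifier
]
print(fuse_decisions(votes))  # -> "pedestrian" (combined score 1.35)
```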

Industry uses & applications

Sensors are used in innumerable applications across a multitude of industries and sectors. Because sensors are so ubiquitous, it stands to reason that the applications for sensor fusion are just as vast. A report by Emergen Research predicts that the global sensor fusion market will reach USD 16.72 Billion by 2027. Here’s a short list of industries that benefit from sensor fusion:

  • Automotive Industry
  • Climate Monitoring
  • Computer Software
  • Consumer Electronics
  • Healthcare
  • Home Automation
  • Industrial Control
  • Internet of Things
  • Manufacturing
  • Military
  • Oil Exploration

References 

The Role of Sensor Fusion in the Internet of Things

What Is Sensor Fusion?

Sensor Fusion Algorithms Explained

Bayesian and Dempster-Shafer fusion

Sensor Fusion Using Dempster-Shafer Theory

Sensor Fusion

Sensor Fusion Market Size Worth USD 16.72 Billion by 2027 / CAGR of 19.6%: Emergen Research
