Robustness and Accuracy of Bayesian Information Fusion Systems

Authors: Gregor Pavlin, Jan Nunnink, Frans Groen




File

DagSemProc.05381.3.pdf
  • Filesize: 485 kB
  • 32 pages

Document Identifiers
  • DOI: 10.4230/DagSemProc.05381.3

Cite As

Gregor Pavlin, Jan Nunnink, and Frans Groen. Robustness and Accuracy of Bayesian Information Fusion Systems. In Form and Content in Sensor Networks. Dagstuhl Seminar Proceedings, Volume 5381, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2006) https://doi.org/10.4230/DagSemProc.05381.3

Abstract

Modern situation assessment and control applications often require efficient fusion 
of large amounts of heterogeneous and uncertain information. In addition, fusion 
results are often mission critical.

It turns out that Bayesian networks (BNs) are suitable for a significant class of 
such applications, since they facilitate modeling of very heterogeneous types of 
uncertain information and support efficient belief propagation techniques. BNs are 
based on a rigorous theory which facilitates (i) analysis of the robustness of fusion 
systems and (ii) monitoring of the fusion quality.
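
As a minimal, hypothetical illustration of such belief propagation (not taken from 
the paper; all names and numbers below are invented), the following Python sketch 
fuses two conditionally independent observations of a binary hidden state H in a 
tiny BN H -> O1, H -> O2 by exact enumeration, i.e. P(H | o1, o2) is proportional 
to P(H) P(o1 | H) P(o2 | H):

    # Illustrative sketch: fuse two conditionally independent observations
    # of a binary hidden state H by multiplying prior and per-observation
    # likelihoods, then normalizing.

    prior     = {"fire": 0.01, "no_fire": 0.99}   # P(H)
    lik_smoke = {"fire": 0.90, "no_fire": 0.10}   # P(smoke detector fires | H)
    lik_heat  = {"fire": 0.80, "no_fire": 0.05}   # P(heat camera reports hot spot | H)

    def posterior(prior, likelihood_tables):
        """Posterior over H after multiplying in one likelihood table per observation."""
        unnorm = dict(prior)
        for lik in likelihood_tables:
            for h in unnorm:
                unnorm[h] *= lik[h]
        z = sum(unnorm.values())
        return {h: p / z for h, p in unnorm.items()}

    print(posterior(prior, [lik_smoke, lik_heat]))   # {'fire': ~0.59, 'no_fire': ~0.41}

With these illustrative numbers, two individually weak reports raise the posterior 
of the hidden "fire" state from 0.01 to roughly 0.59.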

We assume domains where situations can be described through sets of discrete random 
variables. A situation corresponds to a set of hidden and observed states that 
nature `sampled' from some true distribution over the combinations of possible states. 
Thus, in a particular situation certain states materialized while others did not, which 
corresponds to a point-mass distribution over the possible states. Consequently, 
state estimation can be reduced to a classification of the possible combinations of 
relevant states. We assume that there exist mappings between the hidden states of 
interest and optimal decisions/actions.

In this context, we consider a classification of the states accurate if it is 
equivalent to the truth in the sense that knowing the truth would not change the 
action based on the classification. Clearly, BNs provide a mapping between the 
observed symptoms and hypotheses about hidden events. Consequently, BNs have a 
critical impact on fusion accuracy.
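
As a hypothetical sketch of this decision-equivalence notion of accuracy (the action 
map and numbers are invented, not taken from the paper), a classification can be 
scored by whether the action it induces matches the action induced by the true state:

    # Illustrative sketch: a classification counts as accurate if the action it
    # induces is the same action the true hidden state would induce.

    action_map = {"fire": "evacuate", "no_fire": "keep monitoring"}   # hidden state -> optimal action

    def classify(posterior):
        """MAP classification: pick the hidden state with the highest posterior probability."""
        return max(posterior, key=posterior.get)

    def decision_accurate(posterior, true_state):
        return action_map[classify(posterior)] == action_map[true_state]

    # The posterior from the previous sketch would trigger the same action as the truth:
    print(decision_accurate({"fire": 0.59, "no_fire": 0.41}, true_state="fire"))   # True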

We emphasize a fundamental difference between model accuracy and fusion 
(i.e., classification) accuracy. A BN is a generalization over many possible 
situations that captures probability distributions over the possible events in the 
observed domain. However, even a perfect generalization does not necessarily support 
accurate classification in a particular situation. We address this problem with the 
help of the Inference Meta Model (IMM), which describes information fusion in BNs 
from a coarse, runtime perspective.

IMM is based on a few realistic assumptions and exposes properties of BNs that are 
relevant for the construction of inherently robust fusion systems. With the help of 
IMM we show that in BNs featuring many conditionally independent network fragments, 
inference can be very insensitive to the modeling parameter values. This implies that 
fusion can be robust, which is especially relevant in many real-world applications 
where we cannot obtain precise models due to the lack of sufficient training data or 
expertise. In addition, IMM introduces a reinforcement propagation algorithm that can 
be used as an alternative to the common approaches to inference in BNs. We can show 
that the classification accuracy of this propagation algorithm asymptotically 
approaches 1 as the number of conditionally independent network fragments increases. 
Because of these properties, the propagation algorithm can be used as a basis for 
effective detection of misleading fusion results as well as discovery of inadequate 
modeling components and erroneous information sources.
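
IMM and the reinforcement propagation algorithm themselves are not reproduced in this 
abstract. As a rough Monte-Carlo sketch of the robustness claim only (all parameters 
below are invented), MAP classification over many conditionally independent 
observation fragments can remain accurate even when the model's likelihood parameters 
are perturbed away from the true ones:

    # Illustrative sketch: estimate classification accuracy of a binary hidden
    # state as the number of conditionally independent observations grows,
    # while the classifier uses deliberately mis-specified likelihoods.

    import math
    import random

    def simulate(n_obs, true_p=0.7, model_p=0.6, prior=0.5, trials=2000):
        """Fraction of trials where the MAP estimate equals the sampled true state.

        true_p  : P(an observation supports the true state) -- used to generate data
        model_p : the (mis-specified) likelihood the classifier believes in
        """
        correct = 0
        for _ in range(trials):
            h = random.random() < prior                      # sample the true hidden state
            obs = [random.random() < (true_p if h else 1 - true_p) for _ in range(n_obs)]
            # log posterior odds of H=True vs H=False under the perturbed model
            log_odds = math.log(prior / (1 - prior))
            for o in obs:
                log_odds += math.log(model_p / (1 - model_p)) if o \
                            else math.log((1 - model_p) / model_p)
            correct += (log_odds > 0) == h
        return correct / trials

    for n in (1, 5, 25, 100):
        print(n, simulate(n))

In this toy setting the estimated accuracy climbs toward 1 as the number of fragments 
grows, despite the mismatch between true_p and model_p, which mirrors the insensitivity 
to parameter values described above.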

Keywords
  • Robust Information Fusion
  • Bayesian Networks
  • Heterogeneous Information
  • Modeling Uncertainties
