Lazy propagation: A junction tree inference algorithm based on lazy evaluation

https://doi.org/10.1016/S0004-3702(99)00062-4

Abstract

In this paper we present a junction-tree-based inference architecture that exploits the structure of the original Bayesian network and the independence relations induced by evidence to improve the efficiency of inference. The efficiency improvements are obtained by maintaining a multiplicative decomposition of clique and separator potentials. This decomposition offers a tradeoff between off-line construction of junction trees and on-line exploitation of barren variables and independence relations induced by evidence.
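To make the idea of a multiplicative decomposition concrete, the sketch below (plain Python with illustrative names such as `multiply` and `eliminate`; it is not the paper's implementation) keeps a potential as a list of factors and, when eliminating a variable, combines only the factors that mention that variable, leaving the rest of the decomposition untouched.

```python
from itertools import product

# A factor is a pair (variables, table): `variables` is a list of names and
# `table` maps a tuple of values (in that order) to a probability.
# All names below are illustrative, not taken from the paper.

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their variables."""
    vars1, t1 = f1
    vars2, t2 = f2
    out_vars = vars1 + [v for v in vars2 if v not in vars1]
    domains = {v: sorted({k[vars1.index(v)] for k in t1}) if v in vars1
                  else sorted({k[vars2.index(v)] for k in t2})
               for v in out_vars}
    table = {}
    for assignment in product(*(domains[v] for v in out_vars)):
        a = dict(zip(out_vars, assignment))
        table[assignment] = (t1[tuple(a[v] for v in vars1)]
                             * t2[tuple(a[v] for v in vars2)])
    return (out_vars, table)

def eliminate(var, factors):
    """Lazily eliminate `var`: only factors mentioning `var` are combined;
    all other factors stay in the decomposition unmultiplied."""
    relevant = [f for f in factors if var in f[0]]
    untouched = [f for f in factors if var not in f[0]]
    if not relevant:          # nothing depends on `var`; no work is done
        return untouched
    combined = relevant[0]
    for f in relevant[1:]:
        combined = multiply(combined, f)
    vars_, table = combined
    i = vars_.index(var)
    out_vars = vars_[:i] + vars_[i + 1:]
    summed = {}
    for key, p in table.items():
        out_key = key[:i] + key[i + 1:]
        summed[out_key] = summed.get(out_key, 0.0) + p
    return untouched + [(out_vars, summed)]

# Example: eliminate A from the decomposition {P(A), P(B|A)} without ever
# forming a single joint table over variables that do not involve A.
p_a = (["A"], {(0,): 0.6, (1,): 0.4})
p_b_given_a = (["B", "A"], {(0, 0): 0.9, (1, 0): 0.1, (0, 1): 0.2, (1, 1): 0.8})
print(eliminate("A", [p_a, p_b_given_a]))   # -> [(['B'], {(0,): 0.62, (1,): 0.38})]
```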

We consider the impact of the proposed architecture on a number of commonly performed Bayesian network tasks. The tasks we consider include cautious propagation of evidence, determining a most probable configuration, and fast retraction of evidence, along with a number of other tasks. The general impression is that the proposed architecture increases the computational efficiency of performing these tasks.

The efficiency improvement offered by the proposed architecture is emphasized through empirical evaluations involving large real-world Bayesian networks. We compare the time and space performance of the proposed architecture with non-optimized implementations of the Hugin and Shafer–Shenoy inference architectures.

Keywords

Bayesian networks
Junction trees
Probabilistic inference
