Markov Chain Monte Carlo Methods for Hierarchical Bayesian Expert Systems

  • Conference paper
Selecting Models from Data

Part of the book series: Lecture Notes in Statistics ((LNS,volume 89))

Abstract

In a hierarchical Bayesian expert system, the probabilities relating the variables are not known precisely; rather, imprecise knowledge of these probabilities is described by placing prior distributions on them. After obtaining data, one would like to update those distributions to reflect the new information gained; however, this can prove computationally difficult if the observed data are incomplete. This paper describes a way around these difficulties: the use of Markov chain Monte Carlo methods.
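The updating scheme the abstract alludes to can be illustrated with a minimal sketch. The network, the Beta(1,1) priors, and the data below are all hypothetical, chosen only to show the two alternating Gibbs steps: impute each missing value from its full conditional given the current parameters, then draw the parameters from their conjugate posteriors given the completed data.

```python
import random

random.seed(1)

# Toy network A -> B with Beta(1,1) priors on theta_a = P(A=1)
# and theta_b[a] = P(B=1 | A=a).  Hypothetical records (a, b):
# a may be None (missing); b is always observed.
data = [(1, 1), (1, 1), (0, 0), (None, 1), (None, 0), (1, 0), (0, 0)]

def gibbs(data, iters=2000, burn=500):
    theta_a, theta_b = 0.5, [0.5, 0.5]
    draws = []
    for t in range(iters):
        # Step 1: impute each missing A from its full conditional,
        # p(A=1 | b) proportional to theta_a * P(b | A=1).
        completed = []
        for a, b in data:
            if a is None:
                p1 = theta_a * (theta_b[1] if b else 1 - theta_b[1])
                p0 = (1 - theta_a) * (theta_b[0] if b else 1 - theta_b[0])
                a = 1 if random.random() < p1 / (p1 + p0) else 0
            completed.append((a, b))
        # Step 2: sample parameters from their Beta posteriors
        # given the completed data (conjugate Beta(1,1) priors).
        n_a1 = sum(a for a, _ in completed)
        theta_a = random.betavariate(1 + n_a1, 1 + len(completed) - n_a1)
        for a_val in (0, 1):
            n = sum(1 for a, _ in completed if a == a_val)
            n_b1 = sum(b for a, b in completed if a == a_val)
            theta_b[a_val] = random.betavariate(1 + n_b1, 1 + n - n_b1)
        if t >= burn:
            draws.append(theta_a)
    # Monte Carlo estimate of the posterior mean of P(A=1)
    return sum(draws) / len(draws)

print(gibbs(data))
```

With complete data the Beta posteriors could be written down directly; the Markov chain is needed precisely because the missing entries of A make the exact posterior an intractable mixture over all completions.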




Copyright information

© 1994 Springer-Verlag New York

About this paper

Cite this paper

York, J.C., Madigan, D. (1994). Markov Chain Monte Carlo Methods for Hierarchical Bayesian Expert Systems. In: Cheeseman, P., Oldford, R.W. (eds) Selecting Models from Data. Lecture Notes in Statistics, vol 89. Springer, New York, NY. https://doi.org/10.1007/978-1-4612-2660-4_45

  • DOI: https://doi.org/10.1007/978-1-4612-2660-4_45

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-0-387-94281-0

  • Online ISBN: 978-1-4612-2660-4
