Bayesian Modeling Using WinBUGS provides an accessible introduction to WinBUGS programming techniques across a variety of Bayesian modeling settings. The author offers readers a smooth introduction to the principles of Bayesian modeling, together with detailed guidance on the practical implementation of those principles.
The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including:
- Markov chain Monte Carlo algorithms in Bayesian inference
- Generalized linear models
- Bayesian hierarchical models
- Predictive distribution and model checking
- Bayesian model and variable evaluation
Computational notes and screen captures illustrate the use of both WinBUGS and R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts, and all data sets and code are available on the book's related Web site.
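To give a sense of the modeling style the book teaches, below is a minimal sketch of a WinBUGS model for a simple normal (linear) regression. The variable names (y, x, beta0, beta1, n) and the vague priors are illustrative assumptions, not code taken from the book; the data would be supplied separately in the usual WinBUGS list or rectangular format.

    model {
      # illustrative example; y[], x[], and n are assumed to be loaded as data
      for (i in 1:n) {
        y[i] ~ dnorm(mu[i], tau)        # normal likelihood, parameterized by precision tau
        mu[i] <- beta0 + beta1 * x[i]   # linear predictor
      }
      beta0 ~ dnorm(0, 0.001)           # vague normal priors on the coefficients
      beta1 ~ dnorm(0, 0.001)
      tau ~ dgamma(0.001, 0.001)        # vague gamma prior on the precision
      sigma <- 1 / sqrt(tau)            # derived node: residual standard deviation
    }

Note that WinBUGS parameterizes dnorm by its precision (the inverse variance), which is why the standard deviation sigma is computed from tau as a derived node rather than assigned a prior directly.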
Requiring only a working knowledge of probability theory and statistics, Bayesian Modeling Using WinBUGS serves as an excellent textbook for courses on Bayesian statistics at the upper-undergraduate and graduate levels. It is also a valuable reference for researchers and practitioners in the fields of statistics, actuarial science, medicine, and the social sciences who use WinBUGS in their everyday work.
Table of Contents
Preface.
Acknowledgments.
Acronyms.
1. Introduction to Bayesian Inference.
1.1 Introduction: Bayesian modeling in the 21st century.
1.2 Definition of statistical models.
1.3 Bayes' theorem.
1.4 Model-based Bayesian inference.
1.5 Inference using conjugate prior distributions.
1.6 Nonconjugate analysis.
Problems.
2. Markov Chain Monte Carlo Algorithms in Bayesian Inference.
2.1 Simulation, Monte Carlo integration, and their implementation in Bayesian inference.
2.2 Markov chain Monte Carlo methods.
2.3 Popular MCMC algorithms.
2.4 Summary and closing remarks.
Problems.
3. WinBUGS Software: Introduction, Setup, and Basic Analysis.
3.1 Introduction and historical background.
3.2 The WinBUGS environment.
3.3 Preliminaries on using WinBUGS.
3.4 Building Bayesian models in WinBUGS.
3.5 Compiling the model and simulating values.
3.6 Basic output analysis using the sample monitor tool.
3.7 Summarizing the procedure.
3.8 Chapter summary and concluding comments.
Problems.
4. WinBUGS Software: Illustration, Results, and Further Analysis.
4.1 A complete example of running MCMC in WinBUGS for a simple model.
4.2 Further output analysis using the inference menu.
4.3 Multiple chains.
4.4 Changing the properties of a figure.
4.5 Other tools and menus.
4.6 Summary and concluding remarks.
Problems.
5. Introduction to Bayesian Models: Normal Models.
5.1 General modeling principles.
5.2 Model specification in normal regression models.
5.3 Using vectors and multivariate priors in normal regression models.
5.4 Analysis of variance models.
Problems.
6. Incorporating Categorical Variables in Normal Models and Further Modeling Issues.
6.1 Analysis of variance models using dummy variables.
6.2 Analysis of covariance models.
6.3 A bioassay example.
6.4 Further modeling issues.
6.5 Closing remarks.
Problems.
7. Introduction to Generalized Linear Models: Binomial and Poisson Data.
7.1 Introduction.
7.2 Prior distributions.
7.3 Posterior inference.
7.4 Poisson regression models.
7.5 Binomial response models.
7.6 Models for contingency tables.
Problems.
8. Models for Positive Continuous Data, Count Data, and Other GLM-Based Extensions.
8.1 Models with nonstandard distributions.
8.2 Models for positive continuous response variables.
8.3 Additional models for count data.
8.4 Further GLM-based models and extensions.
Problems.
9. Bayesian Hierarchical Models.
9.1 Introduction.
9.2 Some simple examples.
9.3 The generalized linear mixed model formulation.
9.4 Discussion, closing remarks, and further reading.
Problems.
10. The Predictive Distribution and Model Checking.
10.1 Introduction.
10.2 Estimating the predictive distribution for future or missing observations using MCMC.
10.3 Using the predictive distribution for model checking.
10.4 Using cross-validation predictive densities for model checking, evaluation, and comparison.
10.5 Illustration of a complete predictive analysis: Normal regression models.
10.6 Discussion.
Problems.
11. Bayesian Model and Variable Evaluation.
11.1 Prior predictive distributions as measures of model comparison: Posterior model odds and Bayes factors.
11.2 Sensitivity of the posterior model probabilities: The Lindley-Bartlett paradox.
11.3 Computation of the marginal likelihood.
11.4 Computation of the marginal likelihood using WinBUGS.
11.5 Bayesian variable selection using Gibbs-based methods.
11.6 Posterior inference using the output of Bayesian variable selection samplers.
11.7 Implementation of Gibbs variable selection in WinBUGS using an illustrative example.
11.8 The Carlin-Chib method.
11.9 Reversible jump MCMC (RJMCMC).
11.10 Using posterior predictive densities for model evaluation.
11.11 Information criteria.
11.12 Discussion and further reading.
Problems.
Appendix A: Model Specification via Directed Acyclic Graphs: The Doodle Menu.
A.1 Introduction: Starting with DOODLE.
A.2 Nodes.
A.3 Edges.
A.4 Panels.
A.5 A simple example.
Appendix B: The Batch Mode: Running a Model in the Background Using Scripts.
B.1 Introduction.
B.2 Basic commands: Compiling and running the model.
Appendix C: Checking Convergence Using CODA/BOA.
C.1 Introduction.
C.2 A short historical review.
C.3 Diagnostics implemented by CODA/BOA.
C.4 A first look at CODA/BOA.
C.5 A simple example.
Appendix D: Notation Summary.
D.1 MCMC.
D.2 Subscripts and indices.
D.3 Parameters.
D.4 Random variables and data.
D.5 Sample estimates.
D.6 Special functions, vectors and matrices.
D.7 Distributions.
D.8 Distribution-related notation.
D.9 Notation used in ANOVA and ANCOVA.
D.10 Variable and model specification.
D.11 Deviance information criterion (DIC).
D.12 Predictive measures.
References.
Index.