Understanding Computational Bayesian Statistics


Wiley Series in Computational Statistics, Volume 644, 1st edition

by: William M. Bolstad

129,99 €

Publisher: Wiley
Format: PDF
Published: January 20, 2012
ISBN/EAN: 9780470567340
Language: English
Number of pages: 336

DRM-protected eBook; to read it you need software such as Adobe Digital Editions and an Adobe ID.

Description

<b>A hands-on introduction to computational statistics from a Bayesian point of view</b> <p>Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, <i>Understanding Computational Bayesian Statistics</i> successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistical models, including the multiple linear regression model, the hierarchical mean model, the logistic regression model, and the proportional hazards model.</p> <p>The book begins with an outline of the similarities and differences between the Bayesian and likelihood approaches to statistics. Subsequent chapters present key techniques for using computer software to draw Monte Carlo samples from the incompletely known posterior distribution and performing Bayesian inference based on these samples. Topics of coverage include:</p> <ul> <li>Direct ways to draw a random sample from the posterior by reshaping a random sample drawn from an easily sampled starting distribution</li> <li>The distributions from the one-dimensional exponential family</li> <li>Markov chains and their long-run behavior</li> <li>The Metropolis-Hastings algorithm</li> <li>The Gibbs sampling algorithm and methods for speeding up convergence</li> <li>Markov chain Monte Carlo sampling</li> </ul> <p>Using numerous graphs and diagrams, the author emphasizes a step-by-step approach to computational Bayesian statistics. At each step, important aspects of application are detailed, such as how to choose a prior for the logistic regression model, the Poisson regression model, and the proportional hazards model. 
A related Web site houses R functions and Minitab macros for Bayesian analysis and Monte Carlo simulations, and detailed appendices in the book guide readers through the use of these software packages.</p> <p><i>Understanding Computational Bayesian Statistics</i> is an excellent book for courses on computational statistics at the upper-level undergraduate and graduate levels. It is also a valuable reference for researchers and practitioners who use computer programs to conduct statistical analyses of data and solve problems in their everyday work.</p>
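The book's central idea, drawing samples from a posterior whose density is known only up to its shape, can be illustrated with a toy random-walk Metropolis-Hastings chain. This is a generic sketch, not code from the book or its companion site; the book's own implementations use R functions and Minitab macros.

```python
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Sample from a density known only up to a normalizing constant,
    using a random-walk Metropolis-Hastings chain."""
    rng = random.Random(seed)
    x = x0
    log_p = log_target(x)
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        log_p_new = log_target(proposal)
        # Accept with probability min(1, target(proposal)/target(x));
        # the unknown normalizing constant cancels in this ratio.
        if math.log(rng.random()) < log_p_new - log_p:
            x, log_p = proposal, log_p_new
        samples.append(x)
    return samples

# Unnormalized standard-normal log density: -x^2/2, constant dropped.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
burned = draws[5000:]          # discard burn-in before summarizing
mean = sum(burned) / len(burned)
```

After discarding burn-in, posterior summaries (means, intervals, probabilities) are computed directly from the retained draws, which is the pattern the book applies to regression, hierarchical, and survival models.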
<p>Preface xi</p> <p><b>1 Introduction to Bayesian Statistics 1</b></p> <p>1.1 The Frequentist Approach to Statistics 1</p> <p>1.2 The Bayesian Approach to Statistics 3</p> <p>1.3 Comparing Likelihood and Bayesian Approaches to Statistics 6</p> <p>1.4 Computational Bayesian Statistics 19</p> <p>1.5 Purpose and Organization of This Book 20</p> <p><b>2 Monte Carlo Sampling from the Posterior 25</b></p> <p>2.1 Acceptance-Rejection Sampling 27</p> <p>2.2 Sampling-Importance-Resampling 33</p> <p>2.3 Adaptive-Rejection Sampling from a Log-Concave Distribution 35</p> <p>2.4 Why Direct Methods Are Inefficient for High-Dimension Parameter Space 42</p> <p><b>3 Bayesian Inference 47</b></p> <p>3.1 Bayesian Inference from the Numerical Posterior 47</p> <p>3.2 Bayesian Inference from Posterior Random Sample 54</p> <p><b>4 Bayesian Statistics Using Conjugate Priors 61</b></p> <p>4.1 One-Dimensional Exponential Family of Densities 61</p> <p>4.2 Distributions for Count Data 62</p> <p>4.3 Distributions for Waiting Times 69</p> <p>4.4 Normally Distributed Observations with Known Variance 75</p> <p>4.5 Normally Distributed Observations with Known Mean 78</p> <p>4.6 Normally Distributed Observations with Unknown Mean and Variance 80</p> <p>4.7 Multivariate Normal Observations with Known Covariance Matrix 85</p> <p>4.8 Observations from Normal Linear Regression Model 87</p> <p>Appendix: Proof of Poisson Process Theorem 97</p> <p><b>5 Markov Chains 101</b></p> <p>5.1 Stochastic Processes 102</p> <p>5.2 Markov Chains 103</p> <p>5.3 Time-Invariant Markov Chains with Finite State Space 104</p> <p>5.4 Classification of States of a Markov Chain 109</p> <p>5.5 Sampling from a Markov Chain 114</p> <p>5.6 Time-Reversible Markov Chains and Detailed Balance 117</p> <p>5.7 Markov Chains with Continuous State Space 120</p> <p><b>6 
Markov Chain Monte Carlo Sampling from Posterior 127</b></p> <p>6.1 Metropolis-Hastings Algorithm for a Single Parameter 130</p> <p>6.2 Metropolis-Hastings Algorithm for Multiple Parameters 137</p> <p>6.3 Blockwise Metropolis-Hastings Algorithm 144</p> <p>6.4 Gibbs Sampling 149</p> <p>6.5 Summary 150</p> <p><b>7 Statistical Inference from a Markov Chain Monte Carlo Sample 159</b></p> <p>7.1 Mixing Properties of the Chain 160</p> <p>7.2 Finding a Heavy-Tailed Matched Curvature Candidate Density 162</p> <p>7.3 Obtaining an Approximate Random Sample for Inference 168</p> <p>Appendix: Procedure for Finding the Matched Curvature Candidate Density for a Multivariate Parameter 176</p> <p><b>8 Logistic Regression 179</b></p> <p>8.1 Logistic Regression Model 180</p> <p>8.2 Computational Bayesian Approach to the Logistic Regression Model 184</p> <p>8.3 Modelling with the Multiple Logistic Regression Model 192</p> <p><b>9 Poisson Regression and Proportional Hazards Model 203</b></p> <p>9.1 Poisson Regression Model 204</p> <p>9.2 Computational Approach to Poisson Regression Model 207</p> <p>9.3 The Proportional Hazards Model 214</p> <p>9.4 Computational Bayesian Approach to Proportional Hazards Model 218</p> <p><b>10 Gibbs Sampling and Hierarchical Models 235</b></p> <p>10.1 Gibbs Sampling Procedure 236</p> <p>10.2 The Gibbs Sampler for the Normal Distribution 237</p> <p>10.3 Hierarchical Models and Gibbs Sampling 242</p> <p>10.4 Modelling Related Populations with Hierarchical Models 244</p> <p>Appendix: Proof That Improper Jeffreys' Prior Distribution for the Hypervariance Can Lead to an Improper Posterior 261</p> <p><b>11 Going Forward with Markov Chain Monte Carlo 265</b></p> <p>A Using the Included Minitab Macros 271</p> <p>B Using the Included R Functions 289</p> <p>References 307</p> <p>Topic Index 313</p>
"Understanding computational Bayesian statistics is an excellent book for courses on computational statistics at the advanced undergraduate and graduate levels. It is also a valuable reference for researchers and practitioners who use computer programs to conduct statistical analyses of data and solve problems in their everyday work." (Mathematical Reviews, 2011)
<p><b>WILLIAM M. BOLSTAD, P<small>H</small>D,</b> is Senior Lecturer in the Department of Statistics at The University of Waikato (New Zealand). Dr. Bolstad's research interests include Bayesian statistics, MCMC methods, recursive estimation techniques, multiprocess dynamic time series models, and forecasting. He is the author of <i>Introduction to Bayesian Statistics, Second Edition,</i> also published by Wiley.

You may also be interested in these products:

Statistics for Microarrays
by: Ernst Wit, John McClure
PDF ebook
90,99 €