Details

Applied Bayesian Modelling


Wiley Series in Probability and Statistics, Volume 595, 2nd Edition

By: Peter Congdon

68,99 €

Publisher: Wiley
Format: PDF
Published: 23 May 2014
ISBN/EAN: 9781118895054
Language: English
Number of pages: 464

DRM-protected eBook; to read it you need, for example, Adobe Digital Editions and an Adobe ID.

Description

This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modelling applications accessible using tested code that readers can readily adapt to their own analyses. The second edition has been thoroughly reworked and updated to take account of advances in the field, and a new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modelling using WinBUGS and OpenBUGS. This feature continues in the new edition, along with examples using R to broaden appeal and for completeness of coverage.
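The description refers to tested WinBUGS/OpenBUGS code called from R. Purely as a hedged illustration, not taken from the book, the sketch below shows that general workflow for a simple normal-mean model, assuming OpenBUGS and the R2OpenBUGS package are installed; the model, simulated data and parameter names are hypothetical.

# Minimal sketch (illustrative only): a normal-mean model written in the BUGS
# language and fitted from R via the R2OpenBUGS package.
library(R2OpenBUGS)

# BUGS model: normal likelihood with vague priors on the mean and precision.
model_code <- "
model {
  for (i in 1:N) {
    y[i] ~ dnorm(mu, tau)        # likelihood, tau is a precision
  }
  mu  ~ dnorm(0, 0.001)          # vague prior on the mean
  tau ~ dgamma(0.001, 0.001)     # vague prior on the precision
  sigma <- 1 / sqrt(tau)         # derived standard deviation
}"
model_file <- file.path(tempdir(), "normal_mean.txt")
writeLines(model_code, model_file)

# Simulated data standing in for a real data set.
set.seed(1)
y <- rnorm(50, mean = 3, sd = 2)
data_list <- list(y = y, N = length(y))
inits <- function() list(mu = 0, tau = 1)

# Run three MCMC chains and summarise the posterior for mu and sigma.
fit <- bugs(data = data_list, inits = inits,
            parameters.to.save = c("mu", "sigma"),
            model.file = model_file,
            n.chains = 3, n.iter = 5000)
print(fit)

The same pattern (a model file, a data list, initial values and a set of monitored parameters) carries over to WinBUGS and, with minor syntax changes, to JAGS.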
Contents

Preface xi

1 Bayesian methods and Bayesian estimation 1
1.1 Introduction 1
1.1.1 Summarising existing knowledge: Prior densities for parameters 2
1.1.2 Updating information: Prior, likelihood and posterior densities 3
1.1.3 Predictions and assessment 5
1.1.4 Sampling parameters 6
1.2 MCMC techniques: The Metropolis–Hastings algorithm 7
1.2.1 Gibbs sampling 8
1.2.2 Other MCMC algorithms 9
1.2.3 INLA approximations 10
1.3 Software for MCMC: BUGS, JAGS and R-INLA 11
1.4 Monitoring MCMC chains and assessing convergence 19
1.4.1 Convergence diagnostics 20
1.4.2 Model identifiability 21
1.5 Model assessment 23
1.5.1 Sensitivity to priors 23
1.5.2 Model checks 24
1.5.3 Model choice 25
References 28

2 Hierarchical models for related units 34
2.1 Introduction: Smoothing to the hyper population 34
2.2 Approaches to model assessment: Penalised fit criteria, marginal likelihood and predictive methods 35
2.2.1 Penalised fit criteria 36
2.2.2 Formal model selection using marginal likelihoods 37
2.2.3 Estimating model probabilities or marginal likelihoods in practice 38
2.2.4 Approximating the posterior density 40
2.2.5 Model averaging from MCMC samples 42
2.2.6 Predictive criteria for model checking and selection: Cross-validation 46
2.2.7 Predictive checks and model choice using complete data replicate sampling 50
2.3 Ensemble estimates: Poisson–gamma and Beta-binomial hierarchical models 53
2.3.1 Hierarchical mixtures for Poisson and binomial data 54
2.4 Hierarchical smoothing methods for continuous data 61
2.4.1 Priors on hyperparameters 62
2.4.2 Relaxing normality assumptions 63
2.4.3 Multivariate borrowing of strength 65
2.5 Discrete mixtures and Dirichlet processes 69
2.5.1 Finite mixture models 69
2.5.2 Dirichlet process priors 72
2.6 General additive and histogram smoothing priors 78
2.6.1 Smoothness priors 79
2.6.2 Histogram smoothing 80
Exercises 83
Notes 86
References 89

3 Regression techniques 97
3.1 Introduction: Bayesian regression 97
3.2 Normal linear regression 98
3.2.1 Linear regression model checking 99
3.3 Simple generalized linear models: Binomial, binary and Poisson regression 102
3.3.1 Binary and binomial regression 102
3.3.2 Poisson regression 105
3.4 Augmented data regression 107
3.5 Predictor subset choice 110
3.5.1 The g-prior approach 114
3.5.2 Hierarchical lasso prior methods 116
3.6 Multinomial, nested and ordinal regression 126
3.6.1 Nested logit specification 128
3.6.2 Ordinal outcomes 130
Exercises 136
Notes 138
References 144

4 More advanced regression techniques 149
4.1 Introduction 149
4.2 Departures from linear model assumptions and robust alternatives 149
4.3 Regression for overdispersed discrete outcomes 154
4.3.1 Excess zeroes 157
4.4 Link selection 160
4.5 Discrete mixture regressions for regression and outlier status 161
4.5.1 Outlier accommodation 163
4.6 Modelling non-linear regression effects 167
4.6.1 Smoothness priors for non-linear regression 167
4.6.2 Spline regression and other basis functions 169
4.6.3 Priors on basis coefficients 171
4.7 Quantile regression 175
Exercises 177
Notes 177
References 179

5 Meta-analysis and multilevel models 183
5.1 Introduction 183
5.2 Meta-analysis: Bayesian evidence synthesis 184
5.2.1 Common forms of meta-analysis 185
5.2.2 Priors for stage 2 variation in meta-analysis 188
5.2.3 Multivariate meta-analysis 193
5.3 Multilevel models: Univariate continuous outcomes 195
5.4 Multilevel discrete responses 201
5.5 Modelling heteroscedasticity 204
5.6 Multilevel data on multivariate indices 206
Exercises 208
Notes 210
References 211

6 Models for time series 215
6.1 Introduction 215
6.2 Autoregressive and moving average models 216
6.2.1 Dependent errors 218
6.2.2 Bayesian priors in ARMA models 218
6.2.3 Further types of time dependence 222
6.3 Discrete outcomes 229
6.3.1 INAR models for counts 231
6.3.2 Evolution in conjugate process parameters 232
6.4 Dynamic linear and general linear models 235
6.4.1 Further forms of dynamic models 238
6.5 Stochastic variances and stochastic volatility 244
6.5.1 ARCH and GARCH models 244
6.5.2 State space stochastic volatility models 245
6.6 Modelling structural shifts 248
6.6.1 Level, trend and variance shifts 249
6.6.2 Latent state models including historic dependence 250
6.6.3 Switching regressions and autoregressions 251
Exercises 258
Notes 261
References 265

7 Analysis of panel data 273
7.1 Introduction 273
7.2 Hierarchical longitudinal models for metric data 274
7.2.1 Autoregressive errors 275
7.2.2 Dynamic linear models 276
7.2.3 Extended time dependence 276
7.3 Normal linear panel models and normal linear growth curves 278
7.3.1 Growth curves 280
7.3.2 Subject level autoregressive parameters 283
7.4 Longitudinal discrete data: Binary, categorical and Poisson panel data 285
7.4.1 Binary panel data 285
7.4.2 Ordinal panel data 288
7.4.3 Panel data for counts 292
7.5 Random effects selection 295
7.6 Missing data in longitudinal studies 297
Exercises 302
Notes 303
References 306

8 Models for spatial outcomes and geographical association 312
8.1 Introduction 312
8.2 Spatial regressions and simultaneous dependence 313
8.2.1 Regression with localised dependence 316
8.2.2 Binary outcomes 317
8.3 Conditional prior models 321
8.3.1 Ecological analysis involving count data 324
8.4 Spatial covariation and interpolation in continuous space 329
8.4.1 Discrete convolution processes 332
8.5 Spatial heterogeneity and spatially varying coefficient priors 337
8.5.1 Spatial expansion and geographically weighted regression 338
8.5.2 Spatially varying coefficients via multivariate priors 339
8.6 Spatio-temporal models 343
8.6.1 Conditional prior representations 345
8.7 Clustering in relation to known centres 348
8.7.1 Areas or cases as data 350
8.7.2 Multiple sources 350
Exercises 352
Notes 354
References 355

9 Latent variable and structural equation models 364
9.1 Introduction 364
9.2 Normal linear structural equation models 365
9.2.1 Cross-sectional normal SEMs 365
9.2.2 Identifiability constraints 367
9.3 Dynamic factor models, panel data factor models and spatial factor models 372
9.3.1 Dynamic factor models 372
9.3.2 Linear SEMs for panel data 374
9.3.3 Spatial factor models 378
9.4 Latent trait and latent class analysis for discrete outcomes 381
9.4.1 Latent trait models 381
9.4.2 Latent class models 382
9.5 Latent trait models for multilevel data 387
9.6 Structural equation models for missing data 389
Exercises 392
Notes 394
References 397

10 Survival and event history models 402
10.1 Introduction 402
10.2 Continuous time functions for survival 403
10.2.1 Parametric hazard models 405
10.2.2 Semi-parametric hazards 408
10.3 Accelerated hazards 411
10.4 Discrete time approximations 413
10.4.1 Discrete time hazards regression 415
10.5 Accounting for frailty in event history and survival models 417
10.6 Further applications of frailty models 421
10.7 Competing risks 423
Exercises 425
References 426

Index 431
“A nice guidebook to intermediate and advanced Bayesian models.” (Scientific Computing, 13 January 2015)
Peter Congdon is Research Professor of Quantitative Geography and Health Statistics at Queen Mary University of London. He has written three earlier books on Bayesian modelling and data analysis techniques with Wiley, and has a wide range of publications in statistical methodology and in application areas. His current interests include applications to spatial and survey data relating to health status and health service research.

You might also be interested in these products:

Statistics for Microarrays
By: Ernst Wit, John McClure
PDF ebook
90,99 €