Details

The EM Algorithm and Extensions


Wiley Series in Probability and Statistics, Volume 382, 2nd edition

By: Geoffrey J. McLachlan, Thriyambakam Krishnan

142,99 €

Publisher: Wiley
Format: PDF
Published: 09.11.2007
ISBN/EAN: 9780470191606
Language: English
Number of pages: 400

DRM-protected eBook: to read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

The only single source, now completely updated and revised, to offer a unified treatment of the theory, methodology, and applications of the EM algorithm.

Complete with updates that capture developments from the past decade, The EM Algorithm and Extensions, Second Edition provides a basic understanding of the EM algorithm by describing its inception, implementation, and applicability in numerous statistical contexts. In conjunction with the fundamentals of the topic, the authors discuss convergence issues and the computation of standard errors, and unveil many parallels and connections between the EM algorithm and Markov chain Monte Carlo algorithms. Thorough discussions of the complexities and drawbacks of the basic EM algorithm, such as slow convergence and the lack of a built-in procedure for computing the covariance matrix of the parameter estimates, are also presented.

While the general philosophy of the First Edition has been maintained, this new edition has been updated, revised, and expanded to include:

- New chapters on Monte Carlo versions of the EM algorithm and on generalizations of the EM algorithm
- New results on convergence, including convergence of the EM algorithm in constrained parameter spaces
- Expanded discussion of standard error computation methods, such as methods for categorical data and methods based on numerical differentiation
- Coverage of the interval EM, which locates all stationary points in a designated region of the parameter space
- Exploration of the EM algorithm's relationship with the Gibbs sampler and other Markov chain Monte Carlo methods
- Plentiful pedagogical elements: chapter introductions, lists of examples, author and subject indices, computer-drawn graphics, and a related Web site

The EM Algorithm and Extensions, Second Edition serves as an excellent text for graduate-level statistics students and is also a comprehensive resource for theoreticians, practitioners, and researchers in the social and physical sciences who would like to extend their knowledge of the EM algorithm.
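As a flavor of what the book covers: the EM algorithm alternates an E-step, which computes the conditional expectation of the complete-data log likelihood given the observed data and the current parameter estimates, with an M-step, which maximizes that expectation over the parameters. The following minimal NumPy sketch is illustrative only and is not code from the book; the function names, stopping rule, and simulated data are the editor's own. It fits a two-component normal mixture, the setting of the book's finite normal mixture examples, and relies on the monotone increase of the log likelihood that the book's Chapter 3 establishes.

import numpy as np

def normal_pdf(x, mu, var):
    # Univariate normal density with mean mu and variance var.
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

def em_two_gaussians(x, n_iter=500, tol=1e-10, seed=0):
    # Fit a two-component univariate normal mixture to data x by EM.
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=2, replace=False)   # crude starting means
    var = np.array([x.var(), x.var()])          # pooled starting variances
    w = np.array([0.5, 0.5])                    # equal starting weights
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities, i.e. posterior component probabilities.
        dens = np.stack([w[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)])
        resp = dens / dens.sum(axis=0)
        # M-step: weighted maximum-likelihood updates of the parameters.
        nk = resp.sum(axis=1)
        w = nk / x.size
        mu = (resp @ x) / nk
        var = np.array([resp[k] @ (x - mu[k]) ** 2 / nk[k] for k in range(2)])
        # The observed-data log likelihood never decreases along EM iterates.
        ll = np.log(dens.sum(axis=0)).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return w, mu, var, ll

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.5, 700)])
print(em_two_gaussians(x))

On this simulated data the iteration recovers weights, means, and variances close to the generating values. The simple absolute-increase stopping rule stands in for more refined criteria, such as the Aitken acceleration-based criterion discussed in the book's Chapter 4.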
Preface to the Second Edition.
Preface to the First Edition.
List of Examples.
1. General Introduction.
1.1 Introduction.
1.2 Maximum Likelihood Estimation.
1.3 Newton-Type Methods.
1.4 Introductory Examples.
1.5 Formulation of the EM Algorithm.
1.6 EM Algorithm for MAP and MPL Estimation.
1.7 Brief Summary of the Properties of the EM Algorithm.
1.8 History of the EM Algorithm.
1.9 Overview of the Book.
1.10 Notations.
2. Examples of the EM Algorithm.
2.1 Introduction.
2.2 Multivariate Data with Missing Values.
2.3 Least Squares with Missing Data.
2.4 Example 2.4: Multinomial with Complex Cell Structure.
2.5 Example 2.5: Analysis of PET and SPECT Data.
2.6 Example 2.6: Multivariate t-Distribution (Known D.F.).
2.7 Finite Normal Mixtures.
2.8 Example 2.9: Grouped and Truncated Data.
2.9 Example 2.10: A Hidden Markov AR(1) Model.
3. Basic Theory of the EM Algorithm.
3.1 Introduction.
3.2 Monotonicity of the EM Algorithm.
3.3 Monotonicity of a Generalized EM Algorithm.
3.4 Convergence of an EM Sequence to a Stationary Value.
3.5 Convergence of an EM Sequence of Iterates.
3.6 Examples of Nontypical Behavior of an EM (GEM) Sequence.
3.7 Score Statistic.
3.8 Missing Information.
3.9 Rate of Convergence of the EM Algorithm.
4. Standard Errors and Speeding Up Convergence.
4.1 Introduction.
4.2 Observed Information Matrix.
4.3 Approximations to Observed Information Matrix: i.i.d. Case.
4.4 Observed Information Matrix for Grouped Data.
4.5 Supplemented EM Algorithm.
4.6 Bootstrap Approach to Standard Error Approximation.
4.7 Baker's, Louis', and Oakes' Methods for Standard Error Computation.
4.8 Acceleration of the EM Algorithm via Aitken's Method.
4.9 An Aitken Acceleration-Based Stopping Criterion.
4.10 Conjugate Gradient Acceleration of the EM Algorithm.
4.11 Hybrid Methods for Finding the MLE.
4.12 A GEM Algorithm Based on One Newton-Raphson Step.
4.13 EM Gradient Algorithm.
4.14 A Quasi-Newton Acceleration of the EM Algorithm.
4.15 Ikeda Acceleration.
5. Extensions of the EM Algorithm.
5.1 Introduction.
5.2 ECM Algorithm.
5.3 Multicycle ECM Algorithm.
5.4 Example 5.2: Normal Mixtures with Equal Correlations.
5.5 Example 5.3: Mixture Models for Survival Data.
5.6 Example 5.4: Contingency Tables with Incomplete Data.
5.7 ECME Algorithm.
5.8 Example 5.5: MLE of t-Distribution with Unknown D.F.
5.9 Example 5.6: Variance Components.
5.10 Linear Mixed Models.
5.11 Example 5.8: Factor Analysis.
5.12 Efficient Data Augmentation.
5.13 Alternating ECM Algorithm.
5.14 Example 5.9: Mixtures of Factor Analyzers.
5.15 Parameter-Expanded EM (PX-EM) Algorithm.
5.16 EMS Algorithm.
5.17 One-Step-Late Algorithm.
5.18 Variance Estimation for Penalized EM and OSL Algorithms.
5.19 Incremental EM.
5.20 Linear Inverse Problems.
6. Monte Carlo Versions of the EM Algorithm.
6.1 Introduction.
6.2 Monte Carlo Techniques.
6.3 Monte Carlo EM.
6.4 Data Augmentation.
6.5 Bayesian EM.
6.6 I.I.D. Monte Carlo Algorithm.
6.7 Markov Chain Monte Carlo Algorithms.
6.8 Gibbs Sampling.
6.9 Examples of MCMC Algorithms.
6.10 Relationship of EM to Gibbs Sampling.
6.11 Data Augmentation and Gibbs Sampling.
6.12 Empirical Bayes and EM.
6.13 Multiple Imputation.
6.14 Missing-Data Mechanism, Ignorability, and EM Algorithm.
7. Some Generalizations of the EM Algorithm.
7.1 Introduction.
7.2 Estimating Equations and Estimating Functions.
7.3 Quasi-Score and the Projection-Solution Algorithm.
7.4 Expectation-Solution (ES) Algorithm.
7.5 Other Generalizations.
7.6 Variational Bayesian EM Algorithm.
7.7 MM Algorithm.
7.8 Lower Bound Maximization.
7.9 Interval EM Algorithm.
7.10 Competing Methods and Some Comparisons with EM.
7.11 The Delta Algorithm.
7.12 Image Space Reconstruction Algorithm.
8. Further Applications of the EM Algorithm.
8.1 Introduction.
8.2 Hidden Markov Models.
8.3 AIDS Epidemiology.
8.4 Neural Networks.
8.5 Data Mining.
8.6 Bioinformatics.
References.
Author Index.
Subject Index.
"The <i>EM Algorithm and Extension, Second Edition</i>, serves as an excellent text for graduate-level statistics students and is also a comprehensive resource for theoreticians, practioners, and researchers in the social and physical sciences who would like to extend their knowledge of the EM algorithm." (<i>Mathematical Review,</i> Issue 2009e)
Geoffrey J. McLachlan, PhD, DSc, is Professor of Statistics in the Department of Mathematics at The University of Queensland, Australia. A Fellow of the American Statistical Association and the Australian Mathematical Society, he has published extensively on his research interests, which include cluster and discriminant analyses, image analysis, machine learning, neural networks, and pattern recognition. Dr. McLachlan is the author or coauthor of Analyzing Microarray Gene Expression Data, Finite Mixture Models, and Discriminant Analysis and Statistical Pattern Recognition, all published by Wiley.

Thriyambakam Krishnan, PhD, is Chief Statistical Architect, SYSTAT Software, at Cranes Software International Limited in Bangalore, India. Dr. Krishnan has over forty-five years of research, teaching, consulting, and software development experience at the Indian Statistical Institute (ISI). His research interests include biostatistics, image analysis, pattern recognition, psychometry, and the EM algorithm.

You might also be interested in these products:

Statistics for Microarrays
By: Ernst Wit, John McClure
PDF ebook
90,99 €