Long-Memory Time Series
Theory and Methods
Wiley Series in Probability and Statistics, Volume 662, 1st edition

By: Wilfredo Palma

€131.99

Publisher: Wiley
Format: PDF
Published: April 27, 2007
ISBN/EAN: 9780470131459
Language: English
Pages: 304

DRM-protected eBook; reading it requires e.g. Adobe Digital Editions and an Adobe ID.

Description

<b>A self-contained, contemporary treatment of the analysis of long-range dependent data</b> <p>Long-Memory Time Series: Theory and Methods provides an overview of the theory and methods developed to deal with long-range dependent data and describes the applications of these methodologies to real-life time series. Systematically organized, it begins with the foundational essentials, proceeds to the analysis of methodological aspects (Estimation Methods, Asymptotic Theory, Heteroskedastic Models, Transformations, Bayesian Methods, and Prediction), and then extends these techniques to more complex data structures.</p> <p>To facilitate understanding, the book:</p> <ul> <li> <p>Assumes a basic knowledge of calculus and linear algebra and explains the more advanced statistical and mathematical concepts</p> </li> <li> <p>Features numerous examples that accelerate understanding and illustrate various consequences of the theoretical results</p> </li> <li> <p>Proves all theoretical results (theorems, lemmas, corollaries, etc.) or refers readers to resources with further demonstration</p> </li> <li> <p>Includes detailed analyses of computational aspects related to the implementation of the methodologies described, including algorithm efficiency, arithmetic complexity, CPU times, and more</p> </li> <li> <p>Includes proposed problems at the end of each chapter to help readers solidify their understanding and practice their skills</p> </li> </ul> <p>A valuable real-world reference for researchers and practitioners in time series analysis, econometrics, finance, and related fields, this book is also excellent for a beginning graduate-level course in long-memory processes or as a supplemental textbook for those studying advanced statistics, mathematics, economics, finance, engineering, or physics. A companion Web site is available for readers to access the S-Plus and R data sets used within the text.</p>
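<p>As a purely illustrative sketch (not taken from the book; the function name and truncation length are our own choices), the ARFIMA(0, d, 0) processes treated in Chapter 3 can be simulated from a truncated MA(&infin;) expansion, whose coefficients &psi;<sub>j</sub> = &Gamma;(j + d)/(&Gamma;(j + 1)&Gamma;(d)) obey the recursion &psi;<sub>j</sub> = &psi;<sub>j-1</sub>(j - 1 + d)/j:</p>

```python
# Illustrative sketch only (not from the book): simulate an
# ARFIMA(0, d, 0) long-memory series via a truncated MA(infinity)
# expansion. Coefficients psi_j = Gamma(j+d) / (Gamma(j+1) Gamma(d))
# are computed with the recursion psi_j = psi_{j-1} * (j - 1 + d) / j.
import random

def arfima_0d0(n, d, trunc=500, seed=0):
    """Simulate n observations of fractional noise with memory parameter d."""
    rng = random.Random(seed)
    # MA(infinity) coefficients, truncated at `trunc` lags; psi_0 = 1
    psi = [1.0]
    for j in range(1, trunc):
        psi.append(psi[-1] * (j - 1 + d) / j)
    # Gaussian innovations, with a burn-in of `trunc` values
    eps = [rng.gauss(0.0, 1.0) for _ in range(n + trunc)]
    # X_t = sum_j psi_j * eps_{t-j}
    return [sum(p * eps[t + trunc - j] for j, p in enumerate(psi))
            for t in range(n)]

x = arfima_0d0(200, d=0.3)
print(len(x))  # 200
```

<p>For 0 &lt; d &lt; 0.5 such a series is stationary, but its autocorrelations decay hyperbolically rather than exponentially, which is the defining feature of the long-memory models the book studies.</p>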
<p>Preface xiii</p> <p>Acronyms xvii</p> <p><b>1 Stationary Processes 1</b></p> <p>1.1 Fundamental Concepts 2</p> <p>1.1.1 Stationarity 4</p> <p>1.1.2 Singularity and Regularity 5</p> <p>1.1.3 Wold Decomposition Theorem 5</p> <p>1.1.4 Causality 7</p> <p>1.1.5 Invertibility 7</p> <p>1.1.6 Best Linear Predictor 8</p> <p>1.1.7 Szego-Kolmogorov Formula 8</p> <p>1.1.8 Ergodicity 9</p> <p>1.1.9 Martingales 11</p> <p>1.1.10 Cumulants 12</p> <p>1.1.11 Fractional Brownian Motion 12</p> <p>1.1.12 Wavelets 14</p> <p>1.2 Bibliographic Notes 15</p> <p>Problems 16</p> <p><b>2 State Space Systems 21</b></p> <p>2.1 Introduction 22</p> <p>2.1.1 Stability 22</p> <p>2.1.2 Hankel Operator 22</p> <p>2.1.3 Observability 23</p> <p>2.1.4 Controllability 23</p> <p>2.1.5 Minimality 24</p> <p>2.2 Representations of Linear Processes 24</p> <p>2.2.1 State Space Form to Wold Decomposition 24</p> <p>2.2.2 Wold Decomposition to State Space Form 25</p> <p>2.2.3 Hankel Operator to State Space Form 25</p> <p>2.3 Estimation of the State 26</p> <p>2.3.1 State Predictor 27</p> <p>2.3.2 State Filter 27</p> <p>2.3.3 State Smoother 27</p> <p>2.3.4 Missing Observations 28</p> <p>2.3.5 Steady State System 28</p> <p>2.3.6 Prediction of Future Observations 30</p> <p>2.4 Extensions 32</p> <p>2.5 Bibliographic Notes 32</p> <p>Problems 33</p> <p><b>3 Long-Memory Processes 39</b></p> <p>3.1 Defining Long Memory 40</p> <p>3.1.1 Alternative Definitions 41</p> <p>3.1.2 Extensions 43</p> <p>3.2 ARFIMA Processes 43</p> <p>3.2.1 Stationarity, Causality, and Invertibility 44</p> <p>3.2.2 Infinite AR and MA Expansions 46</p> <p>3.2.3 Spectral Density 47</p> <p>3.2.4 Autocovariance Function 47</p> <p>3.2.5 Sample Mean 48</p> <p>3.2.6 Partial Autocorrelations 49</p> <p>3.2.7 Illustrations 49</p> <p>3.2.8 Approximation of Long-Memory Processes 55</p> <p>3.3 Fractional Gaussian Noise 56</p> <p>3.3.1 Sample Mean 56</p> <p>3.4 Technical Lemmas 57</p> <p>3.5 Bibliographic Notes 58</p> <p>Problems 59</p> <p><b>4 Estimation Methods
65</b></p> <p>4.1 Maximum-Likelihood Estimation 66</p> <p>4.1.1 Cholesky Decomposition Method 66</p> <p>4.1.2 Durbin-Levinson Algorithm 66</p> <p>4.1.3 Computation of Autocovariances 67</p> <p>4.1.4 State Space Approach 69</p> <p>4.2 Autoregressive Approximations 71</p> <p>4.2.1 Haslett-Raftery Method 72</p> <p>4.2.2 Beran Approach 73</p> <p>4.2.3 A State Space Method 74</p> <p>4.3 Moving-Average Approximation 75</p> <p>4.4 Whittle Estimation 78</p> <p>4.4.1 Other Versions 80</p> <p>4.4.2 Non-Gaussian Data 80</p> <p>4.4.3 Semiparametric Methods 81</p> <p>4.5 Other Methods 81</p> <p>4.5.1 A Regression Method 82</p> <p>4.5.2 Rescaled Range Method 83</p> <p>4.5.3 Variance Plots 85</p> <p>4.5.4 Detrended Fluctuation Analysis 87</p> <p>4.5.5 A Wavelet-Based Method 91</p> <p>4.6 Numerical Experiments 92</p> <p>4.7 Bibliographic Notes 93</p> <p>Problems 94</p> <p><b>5 Asymptotic Theory 97</b></p> <p>5.1 Notation and Definitions 98</p> <p>5.2 Theorems 99</p> <p>5.2.1 Consistency 99</p> <p>5.2.2 Central Limit Theorem 101</p> <p>5.2.3 Efficiency 104</p> <p>5.3 Examples 104</p> <p>5.4 Illustration 108</p> <p>5.5 Technical Lemmas 109</p> <p>5.6 Bibliographic Notes 109</p> <p>Problems 109</p> <p><b>6 Heteroskedastic Models 115</b></p> <p>6.1 Introduction 116</p> <p>6.2 ARFIMA-GARCH Model 117</p> <p>6.2.1 Estimation 119</p> <p>6.3 Other Models 119</p> <p>6.3.1 Estimation 121</p> <p>6.4 Stochastic Volatility 121</p> <p>6.4.1 Estimation 122</p> <p>6.5 Numerical Experiments 122</p> <p>6.6 Application 123</p> <p>6.6.1 Model without Leverage 123</p> <p>6.6.2 Model with Leverage 124</p> <p>6.6.3 Model Comparison 124</p> <p>6.7 Bibliographic Notes 125</p> <p>Problems 126</p> <p><b>7 Transformations 131</b></p> <p>7.1 Transformation of Gaussian Processes 132</p> <p>7.2 Autocorrelation of Squares 134</p> <p>7.3 Asymptotic Behavior 136</p> <p>7.4 Illustrations 138</p> <p>7.5 Bibliographic Notes 142</p> <p>Problems 143</p> <p><b>8 Bayesian Methods 147</b></p> <p>8.1 Bayesian Modeling 148</p>
<p>8.2 Markov Chain Monte Carlo Methods 149</p> <p>8.2.1 Metropolis-Hastings Algorithm 149</p> <p>8.2.2 Gibbs Sampler 150</p> <p>8.2.3 Overdispersed Distributions 152</p> <p>8.3 Monitoring Convergence 153</p> <p>8.4 A Simulated Example 155</p> <p>8.5 Data Application 158</p> <p>8.6 Bibliographic Notes 162</p> <p>Problems 162</p> <p><b>9 Prediction 167</b></p> <p>9.1 One-Step Ahead Predictors 168</p> <p>9.1.1 Infinite Past 168</p> <p>9.1.2 Finite Past 168</p> <p>9.1.3 An Approximate Predictor 172</p> <p>9.2 Multistep Ahead Predictors 173</p> <p>9.2.1 Infinite Past 173</p> <p>9.2.2 Finite Past 174</p> <p>9.3 Heteroskedastic Models 175</p> <p>9.3.1 Prediction of Volatility 176</p> <p>9.4 Illustration 178</p> <p>9.5 Rational Approximations 180</p> <p>9.5.1 Illustration 182</p> <p>9.6 Bibliographic Notes</p> <p>Problems 184</p> <p><b>10 Regression 187</b></p> <p>10.1 Linear Regression Model 188</p> <p>10.1.1 Grenander Conditions 188</p> <p>10.2 Properties of the LSE 191</p> <p>10.2.1 Consistency 192</p> <p>10.2.2 Asymptotic Variance 193</p> <p>10.2.3 Asymptotic Normality 193</p> <p>10.3 Properties of the BLUE 194</p> <p>10.3.1 Efficiency of the LSE Relative to the BLUE 195</p> <p>10.4 Estimation of the Mean 198</p> <p>10.4.1 Consistency 198</p> <p>10.4.2 Asymptotic Variance 199</p> <p>10.4.3 Normality 200</p> <p>10.4.4 Relative Efficiency 200</p> <p>10.5 Polynomial Trend 202</p> <p>10.5.1 Consistency 203</p> <p>10.5.2 Asymptotic Variance 203</p> <p>10.5.3 Normality 204</p> <p>10.5.4 Relative Efficiency 204</p> <p>10.6 Harmonic Regression 205</p> <p>10.6.1 Consistency 205</p> <p>10.6.2 Asymptotic Variance 205</p> <p>10.6.3 Normality 205</p> <p>10.6.4 Efficiency 206</p> <p>10.7 Illustration: Air Pollution Data 207</p> <p>10.8 Bibliographic Notes 210</p> <p>Problems 211</p> <p><b>11 Missing Data 215</b></p> <p>11.1 Motivation 216</p> <p>11.2 Likelihood Function with Incomplete Data 217</p> <p>11.2.1 Integration 217</p> <p>11.2.2 Maximization 218</p> <p>11.2.3 Calculation
of the Likelihood Function 219</p> <p>11.2.4 Kalman Filter with Missing Observations 219</p> <p>11.3 Effects of Missing Values on ML Estimates 221</p> <p>11.3.1 Monte Carlo Experiments 222</p> <p>11.4 Effects of Missing Values on Prediction 223</p> <p>11.5 Illustrations 227</p> <p>11.6 Interpolation of Missing Data 229</p> <p>11.6.1 Bayesian Imputation 234</p> <p>11.6.2 A Simulated Example 235</p> <p>11.7 Bibliographic Notes 239</p> <p>Problems 239</p> <p><b>12 Seasonality 245</b></p> <p>12.1 A Long-Memory Seasonal Model 246</p> <p>12.2 Calculation of the Asymptotic Variance 250</p> <p>12.3 Autocovariance Function 252</p> <p>12.4 Monte Carlo Studies 254</p> <p>12.5 Illustration 258</p> <p>12.6 Bibliographic Notes 260</p> <p>Problems 261</p> <p>References 265</p> <p>Topic Index 279</p> <p>Author Index 283</p>
"...Palma presents a textbook for a graduate course summarizing the theory and methods developed to deal with long-range-dependent data, and describing some applications to real-life time series." (<i>SciTech Book Reviews</i>, June 2007) <p>"...textbook for a graduate course summarizing the theory and methods developed to deal with long-range-dependent data, and describing some applications to real-life time series.... Problems and bibliographic notes are provided at the end of each chapter." (<i>SciTech Book News</i>, June 2007)</p> <p>"I believe that this text provides an important contribution to the long-memory time series literature. I feel that it largely achieves its aims and could be useful for those instructors wishing to teach a semester-long special topics course.... I strongly recommend this book to anyone interested in long-memory time series. Both researchers and beginners alike will find this text extremely useful." (<i>Journal of the American Statisticial</i> <i>Associatio</i>n<i>,</i> Dec 2008)</p> <p>"Very well-organized catalogue of long-memory time series analysis." (<i>Mathematical Reviews</i>, 2008)</p> <p>"Judging by its contents and scope [the aim of this book] has been largely achieved.... The list of references is selective but quite comprehensive. Each chapter concludes with a 'Problems' section which should be helpful to instructors wishing to use this book as standalone basis for a course in its subject area..." (<i>International Statistical Review</i>, 2007)</p>
<b>Wilfredo Palma</b>, PhD, is Chairman and Professor of Statistics in the Department of Statistics at Pontificia Universidad Católica de Chile. Dr. Palma has published several refereed articles and has received over a dozen academic honors and awards. His research interests include time series analysis, prediction theory, state space systems, linear models, and econometrics.
