Theory of Ridge Regression Estimation with Applications


Wiley Series in Probability and Statistics, Volume 285, 1st edition

by: A. K. Md. Ehsanes Saleh, Mohammad Arashi, B. M. Golam Kibria

111,99 €

Publisher: Wiley
Format: PDF
Published: 08.01.2019
ISBN/EAN: 9781118644522
Language: English
Number of pages: 384

DRM-protected eBook; reading it requires, for example, Adobe Digital Editions and an Adobe ID.

Description

<p><b>A guide to the systematic analytical results for ridge, LASSO, preliminary test, and Stein-type estimators with applications</b></p> <p><i>Theory of Ridge Regression Estimation with Applications</i> offers a comprehensive guide to the theory and methods of estimation. Ridge regression and LASSO are at the center of all penalty estimators in a range of standard models used in many applied statistical analyses. Written by noted experts in the field, the book contains a thorough introduction to penalty and shrinkage estimation and explores the role that ridge, LASSO, and logistic regression play in the computer-intensive areas of neural networks and big data analysis.</p> <p>Designed to be accessible, the book presents detailed coverage of the basic terminology related to various models, such as the location and simple linear models, and of normal and rank theory-based ridge, LASSO, preliminary test, and Stein-type estimators.
The authors also include problem sets to enhance learning. This book is a volume in the Wiley Series in Probability and Statistics and provides essential and invaluable reading for all statisticians. This important resource:</p> <ul> <li>Offers theoretical coverage and computer-intensive applications of the procedures presented</li> <li>Contains solutions and alternative methods for prediction accuracy and model selection procedures</li> <li>Is the first book to focus on ridge regression, unifying past research with current methodology</li> <li>Uses R throughout the text and includes a companion website containing convenient data sets</li> </ul> <p>Written for graduate students, practitioners, and researchers in various fields of science, <i>Theory of Ridge Regression Estimation with Applications</i> is an authoritative guide to the theory and methodology of statistical estimation.</p>
<p>List of Figures xvii</p> <p>List of Tables xxi</p> <p>Preface xxvii</p> <p>Abbreviations and Acronyms xxxi</p> <p>List of Symbols xxxiii</p> <p><b>1 Introduction to Ridge Regression </b><b>1</b></p> <p>1.1 Introduction 1</p> <p>1.1.1 Multicollinearity Problem 3</p> <p>1.2 Ridge Regression Estimator: Ridge Notion 5</p> <p>1.3 LSE vs. RRE 6</p> <p>1.4 Estimation of Ridge Parameter 7</p> <p>1.5 Preliminary Test and Stein-Type Ridge Estimators 8</p> <p>1.6 High-Dimensional Setting 9</p> <p>1.7 Notes and References 11</p> <p>1.8 Organization of the Book 12</p> <p><b>2 Location and Simple Linear Models </b><b>15</b></p> <p>2.1 Introduction 15</p> <p>2.2 Location Model 16</p> <p>2.2.1 Location Model: Estimation 16</p> <p>2.2.2 Shrinkage Estimation of Location 17</p> <p>2.2.3 Ridge Regression–Type Estimation of Location Parameter 18</p> <p>2.2.4 LASSO for Location Parameter 18</p> <p>2.2.5 Bias and MSE Expression for the LASSO of Location Parameter 19</p> <p>2.2.6 Preliminary Test Estimator, Bias, and MSE 23</p> <p>2.2.7 Stein-Type Estimation of Location Parameter 24</p> <p>2.2.8 Comparison of LSE, PTE, Ridge, SE, and LASSO 24</p> <p>2.3 Simple Linear Model 26</p> <p>2.3.1 Estimation of the Intercept and Slope Parameters 26</p> <p>2.3.2 Test for Slope Parameter 27</p> <p>2.3.3 PTE of the Intercept and Slope Parameters 27</p> <p>2.3.4 Comparison of Bias and MSE Functions 29</p> <p>2.3.5 Alternative PTE 31</p> <p>2.3.6 Optimum Level of Significance of Preliminary Test 33</p> <p>2.3.7 Ridge-Type Estimation of Intercept and Slope 34</p> <p>2.3.7.1 Bias and MSE Expressions 35</p> <p>2.3.8 LASSO Estimation of Intercept and Slope 36</p> <p>2.4 Summary and Concluding Remarks 39</p> <p><b>3 ANOVA Model </b><b>43</b></p> <p>3.1 Introduction 43</p> <p>3.2 Model, Estimation, and Tests 44</p> <p>3.2.1 Estimation of Treatment Effects 45</p> <p>3.2.2 Test of Significance 45</p> <p>3.2.3 Penalty Estimators 46</p> <p>3.2.4 Preliminary Test and Stein-Type Estimators 47</p> <p>3.3 Bias 
and Weighted L2 Risks of Estimators 48</p> <p>3.3.1 Hard Threshold Estimator (Subset Selection Rule) 48</p> <p>3.3.2 LASSO Estimator 49</p> <p>3.3.3 Ridge Regression Estimator 51</p> <p>3.4 Comparison of Estimators 52</p> <p>3.4.1 Comparison of LSE with RLSE 52</p> <p>3.4.2 Comparison of LSE with PTE 52</p> <p>3.4.3 Comparison of LSE with SE and PRSE 53</p> <p>3.4.4 Comparison of LSE and RLSE with RRE 54</p> <p>3.4.5 Comparison of RRE with PTE, SE, and PRSE 56</p> <p>3.4.5.1 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>PT</sup> (𝛼) 56</p> <p>3.4.5.2 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>S</sup> 56</p> <p>3.4.5.3 Comparison of 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) with 𝜽<sub>n</sub>^<sup>S+</sup> 57</p> <p>3.4.6 Comparison of LASSO with LSE and RLSE 58</p> <p>3.4.7 Comparison of LASSO with PTE, SE, and PRSE 59</p> <p>3.4.8 Comparison of LASSO with RRE 60</p> <p>3.5 Application 60</p> <p>3.6 Efficiency in Terms of Unweighted L2 Risk 63</p> <p>3.7 Summary and Concluding Remarks 72</p> <p>3A.
Appendix 74</p> <p><b>4 Seemingly Unrelated Simple Linear Models </b><b>79</b></p> <p>4.1 Model, Estimation, and Test of Hypothesis 79</p> <p>4.1.1 LSE of 𝜃 and 𝛽 80</p> <p>4.1.2 Penalty Estimation of 𝛽 and 𝜃 80</p> <p>4.1.3 PTE and Stein-Type Estimators of 𝛽 and 𝜃 81</p> <p>4.2 Bias and MSE Expressions of the Estimators 82</p> <p>4.3 Comparison of Estimators 86</p> <p>4.3.1 Comparison of LSE with RLSE 86</p> <p>4.3.2 Comparison of LSE with PTE 86</p> <p>4.3.3 Comparison of LSE with SE and PRSE 87</p> <p>4.3.4 Comparison of LSE and RLSE with RRE 87</p> <p>4.3.5 Comparison of RRE with PTE, SE, and PRSE 89</p> <p>4.3.5.1 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>PT </sup>89</p> <p>4.3.5.2 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>S</sup> 89</p> <p>4.3.5.3 Comparison of 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) with 𝜽<sub>n</sub>^<sup>S</sup><sup>+</sup> 90</p> <p>4.3.6 Comparison of LASSO with RRE 90</p> <p>4.3.7 Comparison of LASSO with LSE and RLSE 92</p> <p>4.3.8 Comparison of LASSO with PTE, SE, and PRSE 92</p> <p>4.4 Efficiency in Terms of Unweighted L2 Risk 93</p> <p>4.4.1 Efficiency for 𝜷 94</p> <p>4.4.2 Efficiency for 𝜽 95</p> <p>4.5 Summary and Concluding Remarks 96</p> <p><b>5 Multiple Linear Regression Models </b><b>109</b></p> <p>5.1 Introduction 109</p> <p>5.2 Linear Model and the Estimators 110</p> <p>5.2.1 Penalty Estimators 111</p> <p>5.2.2 Shrinkage Estimators 113</p> <p>5.3 Bias and Weighted L2 Risks of Estimators 114</p> <p>5.3.1 Hard Threshold Estimator 114</p> <p>5.3.2 Modified LASSO 116</p> <p>5.3.3 Multivariate Normal Decision Theory and Oracles for Diagonal Linear Projection 117</p> <p>5.3.4 Ridge Regression Estimator 119</p> <p>5.3.5 Shrinkage Estimators 119</p> <p>5.4 Comparison of Estimators 120</p> <p>5.4.1 Comparison of LSE with RLSE 120</p> <p>5.4.2 Comparison of LSE with PTE 121</p> <p>5.4.3 Comparison of LSE with SE and PRSE 121</p> <p>5.4.4 
Comparison of LSE and RLSE with RRE 122</p> <p>5.4.5 Comparison of RRE with PTE, SE, and PRSE 123</p> <p>5.4.5.1 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>PT</sup>(𝛼) 123</p> <p>5.4.5.2 Comparison Between 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) and 𝜽<sub>n</sub>^<sup>S</sup> 124</p> <p>5.4.5.3 Comparison of 𝜽<sub>n</sub>^<sup>RR</sup> (k<sub>opt</sub>) with 𝜽<sub>n</sub>^<sup>S</sup><sup>+</sup> 124</p> <p>5.4.6 Comparison of MLASSO with LSE and RLSE 125</p> <p>5.4.7 Comparison of MLASSO with PTE, SE, and PRSE 126</p> <p>5.4.8 Comparison of MLASSO with RRE 127</p> <p>5.5 Efficiency in Terms of Unweighted L2 Risk 127</p> <p>5.6 Summary and Concluding Remarks 129</p> <p><b>6 Ridge Regression in Theory and Applications </b><b>143</b></p> <p>6.1 Multiple Linear Model Specification 143</p> <p>6.1.1 Estimation of Regression Parameters 143</p> <p>6.1.2 Test of Hypothesis for the Coefficients Vector 145</p> <p>6.2 Ridge Regression Estimators (RREs) 146</p> <p>6.3 Bias, MSE, and L2 Risk of Ridge Regression Estimator 147</p> <p>6.4 Determination of the Tuning Parameters 151</p> <p>6.5 Ridge Trace 151</p> <p>6.6 Degrees of Freedom of RRE 154</p> <p>6.7 Generalized Ridge Regression Estimators 155</p> <p>6.8 LASSO and Adaptive Ridge Regression Estimators 156</p> <p>6.9 Optimization Algorithm 158</p> <p>6.9.1 Prostate Cancer Data 160</p> <p>6.10 Estimation of Regression Parameters for Low-Dimensional Models 161</p> <p>6.10.1 BLUE and Ridge Regression Estimators 161</p> <p>6.10.2 Bias and L2-risk Expressions of Estimators 162</p> <p>6.10.3 Comparison of the Estimators 165</p> <p>6.10.4 Asymptotic Results of RRE 166</p> <p>6.11 Summary and Concluding Remarks 168</p> <p><b>7 Partially Linear Regression Models </b><b>171</b></p> <p>7.1 Introduction 171</p> <p>7.2 Partial Linear Model and Estimation 172</p> <p>7.3 Ridge Estimators of Regression Parameter 174</p> <p>7.4 Biases and L2 Risks of Shrinkage Estimators 177</p> <p>7.5 
Numerical Analysis 178</p> <p>7.5.1 Example: Housing Prices Data 182</p> <p>7.6 High-Dimensional PLM 188</p> <p>7.6.1 Example: Riboflavin Data 192</p> <p>7.7 Summary and Concluding Remarks 193</p> <p><b>8 Logistic Regression Model </b><b>197</b></p> <p>8.1 Introduction 197</p> <p>8.1.1 Penalty Estimators 199</p> <p>8.1.2 Shrinkage Estimators 200</p> <p>8.1.3 Results on MLASSO 201</p> <p>8.1.4 Results on PTE and Stein-Type Estimators 202</p> <p>8.1.5 Results on Penalty Estimators 204</p> <p>8.2 Asymptotic Distributional L2 Risk Efficiency Expressions of the Estimators 204</p> <p>8.2.1 MLASSO vs. MLE 205</p> <p>8.2.2 MLASSO vs. RMLE 206</p> <p>8.2.3 Comparison of MLASSO vs. PTE 206</p> <p>8.2.4 PT and MLE 207</p> <p>8.2.5 Comparison of MLASSO vs. SE 208</p> <p>8.2.6 Comparison of MLASSO vs. PRSE 208</p> <p>8.2.7 RRE vs. MLE 209</p> <p>8.2.7.1 RRE vs. RMLE 209</p> <p>8.2.8 Comparison of RRE vs. PTE 211</p> <p>8.2.9 Comparison of RRE vs. SE 211</p> <p>8.2.10 Comparison of RRE vs. PRSE 212</p> <p>8.2.11 PTE vs. 
SE and PRSE 212</p> <p>8.2.12 Numerical Comparison Among the Estimators 213</p> <p>8.3 Summary and Concluding Remarks 213</p> <p><b>9 Regression Models with Autoregressive Errors </b><b>221</b></p> <p>9.1 Introduction 221</p> <p>9.1.1 Penalty Estimators 223</p> <p>9.1.2 Shrinkage Estimators 224</p> <p>9.1.2.1 Preliminary Test Estimator 224</p> <p>9.1.2.2 Stein-Type and Positive-Rule Stein-Type Estimators 225</p> <p>9.1.3 Results on Penalty Estimators 225</p> <p>9.1.4 Results on PTE and Stein-Type Estimators 226</p> <p>9.1.5 Results on Penalty Estimators 229</p> <p>9.2 Asymptotic Distributional L2-risk Efficiency Comparison 230</p> <p>9.2.1 Comparison of GLSE with RGLSE 230</p> <p>9.2.2 Comparison of GLSE with PTE 231</p> <p>9.2.3 Comparison of LSE with SE and PRSE 231</p> <p>9.2.4 Comparison of LSE and RLSE with RRE 232</p> <p>9.2.5 Comparison of RRE with PTE, SE, and PRSE 233</p> <p>9.2.5.1 Comparison Between 𝜷<sub>n</sub>^<sup>GRR</sup>(k<sub>opt</sub>) and 𝜷<sub>n</sub>^<sup>G(PT)</sup> 233</p> <p>9.2.5.2 Comparison Between 𝜷<sub>n</sub>^<sup>GRR</sup>(k<sub>opt</sub>) and 𝜷<sub>n</sub>^<sup>G(S)</sup> 234</p> <p>9.2.5.3 Comparison of 𝜷<sub>n</sub>^<sup>GRR</sup>(k<sub>opt</sub>) with 𝜷<sub>n</sub>^<sup>G(S+)</sup> 234</p> <p>9.2.6 Comparison of MLASSO with GLSE and RGLSE 235</p> <p>9.2.7 Comparison of MLASSO with PTE, SE, and PRSE 236</p> <p>9.2.8 Comparison of MLASSO with RRE 236</p> <p>9.3 Example: Sea Level Rise at Key West, Florida 237</p> <p>9.3.1 Estimation of the Model Parameters 237</p> <p>9.3.1.1 Testing for Multicollinearity 237</p> <p>9.3.1.2 Testing for Autoregressive Process 238</p> <p>9.3.1.3 Estimation of Ridge Parameter k 239</p> <p>9.3.2 Relative Efficiency 240</p> <p>9.3.2.1 Relative Efficiency (REff) 240</p> <p>9.3.2.2 Effect of Autocorrelation Coefficient 𝜙 243</p> <p>9.4 Summary and Concluding Remarks 245</p> <p><b>10 Rank-Based Shrinkage
Estimation </b><b>251</b></p> <p>10.1 Introduction 251</p> <p>10.2 Linear Model and Rank Estimation 252</p> <p>10.2.1 Penalty R-Estimators 256</p> <p>10.2.2 PTREs and Stein-Type R-Estimators 258</p> <p>10.3 Asymptotic Distributional Bias and L2 Risk of the R-Estimators 259</p> <p>10.3.1 Hard Threshold Estimators (Subset Selection) 259</p> <p>10.3.2 Rank-Based LASSO 260</p> <p>10.3.3 Multivariate Normal Decision Theory and Oracles for Diagonal Linear Projection 261</p> <p>10.4 Comparison of Estimators 262</p> <p>10.4.1 Comparison of RE with Restricted RE 262</p> <p>10.4.2 Comparison of RE with PTRE 263</p> <p>10.4.3 Comparison of RE with SRE and PRSRE 263</p> <p>10.4.4 Comparison of RE and Restricted RE with RRRE 265</p> <p>10.4.5 Comparison of RRRE with PTRE, SRE, and PRSRE 266</p> <p>10.4.6 Comparison of RLASSO with RE and Restricted RE 267</p> <p>10.4.7 Comparison of RLASSO with PTRE, SRE, and PRSRE 267</p> <p>10.4.8 Comparison of Modified RLASSO with RRRE 268</p> <p>10.5 Summary and Concluding Remarks 268</p> <p><b>11 High-Dimensional Ridge Regression </b><b>285</b></p> <p>11.1 High-Dimensional RRE 286</p> <p>11.2 High-Dimensional Stein-Type RRE 288</p> <p>11.2.1 Numerical Results 291</p> <p>11.2.1.1 Example: Riboflavin Data 291</p> <p>11.2.1.2 Monte Carlo Simulation 291</p> <p>11.3 Post-Selection Shrinkage 293</p> <p>11.3.1 Notation and Assumptions 296</p> <p>11.3.2 Estimation Strategy 297</p> <p>11.3.3 Asymptotic Distributional L2-Risks 299</p> <p>11.4 Summary and Concluding Remarks 300</p> <p><b>12 Applications: Neural Networks and Big Data </b><b>303</b></p> <p>12.1 Introduction 304</p> <p>12.2 A Simple Two-Layer Neural Network 307</p> <p>12.2.1 Logistic Regression Revisited 307</p> <p>12.2.2 Logistic Regression Loss Function with Penalty 310</p> <p>12.2.3 Two-Layer Logistic Regression 311</p> <p>12.3 Deep Neural Networks 313</p> <p>12.4 Application: Image Recognition 315</p> <p>12.4.1 Background 315</p> <p>12.4.2 Binary Classification 316</p> <p>12.4.3 Image
Preparation 318</p> <p>12.4.4 Experimental Results 320</p> <p>12.5 Summary and Concluding Remarks 323</p> <p>References 325</p> <p>Index 333</p>
<p><b>A. K. Md. EHSANES SALEH, PhD,</b> is a Professor Emeritus and Distinguished Research Professor in the School of Mathematics and Statistics, Carleton University, Ottawa, Canada.</p> <p><b>MOHAMMAD ARASHI, PhD,</b> is an Associate Professor at Shahrood University of Technology, Iran, and an Extraordinary Professor and C2-rated researcher at the University of Pretoria, Pretoria, South Africa.</p> <p><b>B. M. GOLAM KIBRIA, PhD,</b> is a Professor in the Department of Mathematics and Statistics at Florida International University, Miami, FL.</p>
