Details

Performance Evaluation

Proven Approaches for Improving Program and Organizational Performance
Research Methods for the Social Sciences, 1st edition

by: Ingrid J. Guerra-López

51,99 €

Publisher: Wiley
Format: EPUB
Published: 27.07.2017
ISBN/EAN: 9781119461203
Language: English
Number of pages: 320

DRM-protected eBook. To read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

Performance Evaluation is a hands-on text that shows practitioners, researchers, educators, and students how to use scientifically based evaluations that are both rigorous and flexible. Author Ingrid Guerra-López, an internationally known evaluation expert, introduces the foundations of evaluation and presents the models most applicable to the performance improvement field. Her book offers a wide variety of proven tools and techniques and is organized to illustrate evaluation in the context of continual performance improvement.
Table of Contents

Acknowledgments xi
Preface xiii
The Author xv

Part One: Introduction to Evaluation

One: Foundations of Evaluation 3
A Brief Overview of Evaluation History 4
Evaluation: Purpose and Definition 5
Performance Improvement: A Conceptual Framework 8
Making Evaluation Happen: Ensuring Stakeholders' Buy-In 9
The Evaluator: A Job or a Role? 10
The Relationship to Other Investigative Processes 11
When Does Evaluation Occur? 15
General Evaluation Orientations 18
Challenges That Evaluators Face 20
Ensuring Commitment 23
Benefits of Evaluation 24
Basic Definitions 25

Two: Principles of Performance-Based Evaluation 27
Principle 1: Evaluation Is Based on Asking the Right Questions 28
Principle 2: Evaluation of Process Is a Function of Obtained Results 32
Principle 3: Goals and Objectives of Organizations Should Be Based on Valid Needs 33
Principle 4: Derive Valid Needs Using a Top-Down Approach 34
Principle 5: Every Organization Should Aim for the Best That Society Can Attain 34
Principle 6: The Set of Evaluation Questions Drives the Evaluation Study 35

Part Two: Models of Evaluation

Three: Overview of Existing Evaluation Models 39
Overview of Classic Evaluation Models 40
Selected Evaluation Models 42
Selecting a Model 43
Conceptualizing a Useful Evaluation That Fits the Situation 44

Four: Kirkpatrick's Four Levels of Evaluation 47
Kirkpatrick's Levels 49
Comments on the Model 54
Strengths and Limitations 55
Application Example: Wagner (1995) 56

Five: Phillips's Return-on-Investment Methodology 61
Phillips's ROI Process 63
Comments on the Model 67
Strengths and Limitations 70
Application Example: Blake (1999) 70

Six: Brinkerhoff's Success Case Method 75
The SCM Process 77
Strengths and Weaknesses 78
Application Example: Brinkerhoff (2005) 79

Seven: The Impact Evaluation Process 81
The Elements of the Process 83
Comments on the Model 96
Strengths and Limitations 97
Application Example 97

Eight: The CIPP Model 107
Stufflebeam's Four Types of Evaluation 108
Articulating Core Values of Programs and Solutions 111
Methods Used in CIPP Evaluations 112
Strengths and Limitations 113
Application Example: Filella-Guiu and Blanch-Pana (2002) 113

Nine: Evaluating Evaluations 117
Evaluation Standards 119
The American Evaluation Association Principles for Evaluators 120
Application Example: Lynch et al. (2003) 122

Part Three: Tools and Techniques of Evaluation

Ten: Data 133
Characteristics of Data 135
Scales of Measurement 137
Defining Required Data from Performance Objectives 139
Deriving Measurable Indicators 141
Finding Data Sources 152
Follow-Up Questions and Data 155

Eleven: Data Collection 159
Observation Methodology and the Purpose of Measurement 160
Designing the Experiment 186
Problems with Classic Experimental Studies in Applied Settings 188
Time-Series Studies 188
Simulations and Games 189
Document-Centered Methods 191
Conclusion 192

Twelve: Analysis of Evaluation Data: Tools and Techniques 195
Analysis of Models and Patterns 196
Analysis Using Structured Discussion 197
Methods of Quantitative Analysis 199
Statistics 200
Graphical Representations of Data 210
Measures of Relationship 212
Inferential Statistics: Parametric and Nonparametric 214
Interpretation 217

Thirteen: Communicating the Findings 221
Recommendations 222
Considerations for Implementing Recommendations 225
Developing the Report 226
The Evaluator's Role After the Report 235

Part Four: Continual Improvement

Fourteen: Common Errors in Evaluation 239
Errors of System Mapping 240
Errors of Logic 242
Errors of Procedure 244
Conclusion 246

Fifteen: Continual Improvement 249
What Is Continual Improvement? 250
Monitoring Performance 250
Adjusting Performance 253
The Role of Leadership 254

Sixteen: Contracting for Evaluation Services 257
The Contract 258
Contracting Controls 260
Ethics and Professionalism 262
Sample Statement of Work 262

Seventeen: Intelligence Gathering for Decision Making 271
Performance Measurement Systems 273
Issues in Performance Measurement Systems 275
Conclusion 277

Eighteen: The Future of Evaluation in Performance Improvement 279
Evaluation and Measurement in Performance Improvement Today 281
What Does the Future Hold? 282
Conclusion 283

References and Related Readings 285
Index 295
Ingrid J. Guerra-López, PhD, is an associate professor at Wayne State University, director of the Institute for Learning and Performance Improvement, associate research professor at the Sonora Institute of Technology in Mexico, and principal of Intelligence Gathering Systems.
Designed to be comprehensive, the book highlights five major and classic evaluation models, including:

- Kirkpatrick's Four Levels
- Phillips's ROI
- Brinkerhoff's Success Case Method
- Stufflebeam's CIPP
- Guerra-López's Impact Evaluation Process

Guerra-López's book bridges the gap between theory and practice. It illustrates the various models in accessible terms, explores the research evidence behind each, explains how each model looks in practice, and shows how each approach can be tailored for specific evaluations.

You might also be interested in these products:

Now We Get It!
by: Janette K. Klingner, Sharon Vaughn, Alison Boardman, Elizabeth Swanson
PDF ebook
19,99 €

Now We Get It!
by: Janette K. Klingner, Sharon Vaughn, Alison Boardman, Elizabeth Swanson
EPUB ebook
19,99 €

Phonics for Dummies
by: Susan M. Greve
PDF ebook
17,99 €