Details

Evaluation Theory, Models, and Applications

Research Methods for the Social Sciences, 2nd edition

By: Daniel L. Stufflebeam, Chris L. S. Coryn

72,99 €

Publisher: Wiley
Format: EPUB
Published: September 26, 2014
ISBN/EAN: 9781118870228
Language: English
Number of pages: 800

DRM-protected eBook; to read it you need, for example, Adobe Digital Editions and an Adobe ID.

Descriptions

The gold standard evaluation reference text

Now in its second edition, Evaluation Theory, Models, and Applications is the vital text on evaluation models, suited to classroom use as a textbook and to use as a professional evaluation reference. The book begins with an overview of the evaluation field and program evaluation standards, then covers the most widely used evaluation approaches. With new evaluation designs and the latest literature from the field, this Second Edition is an essential update for professionals and students who want to stay current. Understanding and choosing evaluation approaches is critical to many professions, and Evaluation Theory, Models, and Applications, Second Edition is the benchmark evaluation guide.

Authors Daniel L. Stufflebeam and Chris L. S. Coryn, widely considered experts in the evaluation field, introduce and describe 23 program evaluation approaches, including, new to this edition, transformative evaluation, participatory evaluation, consumer feedback, and meta-analysis. Evaluation Theory, Models, and Applications, Second Edition facilitates the process of planning, conducting, and assessing program evaluations. The highlighted evaluation approaches include:

- Experimental and quasi-experimental design evaluations
- Daniel L. Stufflebeam's CIPP Model
- Michael Scriven's Consumer-Oriented Evaluation
- Michael Patton's Utilization-Focused Evaluation
- Robert Stake's Responsive/Stakeholder-Centered Evaluation
- Case Study Evaluation

Key readings listed at the end of each chapter direct readers to the most important references for each topic. Learning objectives, review questions, student exercises, and instructor support materials complete the collection of tools. Choosing among evaluation approaches can be an overwhelming process, but Evaluation Theory, Models, and Applications, Second Edition updates the core evaluation concepts with the latest research, making this complex field accessible in a single book.
List of Figures, Tables, and Exhibits xiii
Dedication xvii
Preface xix
Acknowledgments xxiii
The Author xxv
Introduction xxvii
Changes to the First Edition xxviii
Intended Audience xxviii
Overview of the Book's Contents xxix
Study Suggestions xxxii

Part One: Fundamentals of Evaluation 1

1 OVERVIEW OF THE EVALUATION FIELD 3
What Are Appropriate Objects of Evaluations and Related Subdisciplines of Evaluation? 3
Are Evaluations Enough to Control Quality, Guide Improvement, and Protect Consumers? 4
Evaluation as a Profession and Its Relationship to Other Professions 4
What Is Evaluation? 6
How Good Is Good Enough? How Bad Is Intolerable? How Are These Questions Addressed? 17
What Are Performance Standards? How Should They Be Applied? 18
Why Is It Appropriate to Consider Multiple Values? 20
Should Evaluations Be Comparative, Noncomparative, or Both? 21
How Should Evaluations Be Used? 21
Why Is It Important to Distinguish Between Informal Evaluation and Formal Evaluation? 26
How Do Service Organizations Meet Requirements for Public Accountability? 27
What Are the Methods of Formal Evaluation? 29
What Is the Evaluation Profession, and How Strong Is It? 29
What Are the Main Historical Milestones in the Evaluation Field's Development? 30

2 EVALUATION THEORY 45
General Features of Evaluation Theories 45
Theory's Role in Developing the Program Evaluation Field 47
Functional and Pragmatic Bases of Extant Program Evaluation Theory 48
A Word About Research Related to Program Evaluation Theory 49
Program Evaluation Theory Defined 50
Criteria for Judging Program Evaluation Theories 52
Theory Development as a Creative Process Subject to Review and Critique by Users 56
Status of Theory Development in the Program Evaluation Field 57
Importance and Difficulties of Considering Context in Theories of Program Evaluation 58
Need for Multiple Theories of Program Evaluation 58
Hypotheses for Research on Program Evaluation 59
Potential Utility of Grounded Theories 62
Potential Utility of Metaevaluations in Developing Theories of Program Evaluation 63
Program Evaluation Standards and Theory Development 63

3 STANDARDS FOR PROGRAM EVALUATIONS 69
The Need for Evaluation Standards 71
Background of Standards for Program Evaluations 73
Joint Committee Program Evaluation Standards 74
American Evaluation Association Guiding Principles for Evaluators 80
Government Auditing Standards 83
Using Evaluation Standards 97

Part Two: An Evaluation of Evaluation Approaches and Models 105

4 BACKGROUND FOR ASSESSING EVALUATION APPROACHES 107
Evaluation Approaches 109
Importance of Studying Alternative Evaluation Approaches 109
The Nature of Program Evaluation 110
Previous Classifications of Alternative Evaluation Approaches 110
Caveats 112

5 PSEUDOEVALUATIONS 117
Background and Introduction 117
Approach 1: Public Relations Studies 119
Approach 2: Politically Controlled Studies 120
Approach 3: Pandering Evaluations 122
Approach 4: Evaluation by Pretext 123
Approach 5: Empowerment Under the Guise of Evaluation 125
Approach 6: Customer Feedback Evaluation 127

6 QUASI-EVALUATION STUDIES 133
Quasi-Evaluation Approaches Defined 133
Functions of Quasi-Evaluation Approaches 134
General Strengths and Weaknesses of Quasi-Evaluation Approaches 134
Approach 7: Objectives-Based Studies 135
Approach 8: The Success Case Method 137
Approach 9: Outcome Evaluation as Value-Added Assessment 143
Approach 10: Experimental and Quasi-Experimental Studies 147
Approach 11: Cost Studies 152
Approach 12: Connoisseurship and Criticism 155
Approach 13: Theory-Based Evaluation 158
Approach 14: Meta-Analysis 164

7 IMPROVEMENT- AND ACCOUNTABILITY-ORIENTED EVALUATION APPROACHES 173
Improvement- and Accountability-Oriented Evaluation Defined 173
Functions of Improvement- and Accountability-Oriented Approaches 174
General Strengths and Weaknesses of Decision- and Accountability-Oriented Approaches 174
Approach 15: Decision- and Accountability-Oriented Studies 174
Approach 16: Consumer-Oriented Studies 181
Approach 17: Accreditation and Certification 184

8 SOCIAL AGENDA AND ADVOCACY EVALUATION APPROACHES 191
Overview of Social Agenda and Advocacy Approaches 191
Approach 18: Responsive or Stakeholder-Centered Evaluation 192
Approach 19: Constructivist Evaluation 197
Approach 20: Deliberative Democratic Evaluation 202
Approach 21: Transformative Evaluation 205

9 ECLECTIC EVALUATION APPROACHES 213
Overview of Eclectic Approaches 213
Approach 22: Utilization-Focused Evaluation 214
Approach 23: Participatory Evaluation 219

10 BEST APPROACHES FOR TWENTY-FIRST-CENTURY EVALUATIONS 229
Selection of Approaches for Analysis 230
Methodology for Analyzing and Evaluating the Nine Approaches 230
Our Qualifications as Raters 230
Conflicts of Interest Pertaining to the Ratings 231
Standards for Judging Evaluation Approaches 231
Comparison of 2007 and 2014 Ratings 236
Issues Related to the 2011 Program Evaluation Standards 237
Overall Observations 237
The Bottom Line 240

Part Three: Explication of Selected Evaluation Approaches 247

11 EXPERIMENTAL AND QUASI-EXPERIMENTAL DESIGN EVALUATIONS 249
Chapter Overview 249
Basic Requirements of Sound Experiments 250
Prospective Versus Retrospective Studies of Cause 251
Uses of Experimental Design 251
Randomized Controlled Experiments in Context 252
Suchman and the Scientific Approach to Evaluation 256
Contemporary Concepts Associated with the Experimental and Quasi-Experimental Design Approach to Evaluation 265
Exemplars of Large-Scale Experimental and Quasi-Experimental Design Evaluations 269
Guidelines for Designing Experiments 271
Quasi-Experimental Designs 280

12 CASE STUDY EVALUATIONS 291
Overview of the Chapter 291
Overview of the Case Study Approach 292
Case Study Research: The Views of Robert Stake 294
Case Study Research: The Views of Robert Yin 297
Particular Case Study Information Collection Methods 301

13 DANIEL STUFFLEBEAM'S CIPP MODEL FOR EVALUATION: AN IMPROVEMENT AND ACCOUNTABILITY-ORIENTED APPROACH 309
Overview of the Chapter 309
CIPP Model in Context 309
Overview of the CIPP Categories 312
Formative and Summative Uses of Context, Input, Process, and Product Evaluations 313
Philosophy and Code of Ethics Underlying the CIPP Model 314
The Model's Values Component 317
Using the CIPP Framework to Define Evaluation Questions 319
Delineation of the CIPP Categories and Relevant Procedures 319
Use of the CIPP Model as a Systems Strategy for Improvement 332

14 MICHAEL SCRIVEN'S CONSUMER-ORIENTED APPROACH TO EVALUATION 341
Overview of Scriven's Contributions to Evaluation 341
Scriven's Background 343
Scriven's Basic Orientation to Evaluation 343
Scriven's Definition of Evaluation 343
Critique of Other Persuasions 344
Formative and Summative Evaluation 345
Amateur Versus Professional Evaluation 347
Intrinsic and Payoff Evaluation 347
Goal-Free Evaluation 347
Needs Assessment 348
Scoring, Ranking, Grading, and Apportioning 349
Checklists 352
Key Evaluation Checklist 353
The Final Synthesis 354
Metaevaluation 357
Evaluation Ideologies 357
Avenues to Causal Inference 361
Product Evaluation 363
Professionalization of Evaluation 366
Scriven's Look to Evaluation's Future 366

15 ROBERT STAKE'S RESPONSIVE OR STAKEHOLDER-CENTERED EVALUATION APPROACH 373
Stake's Professional Background 374
Factors Influencing Stake's Development of Evaluation Theory 374
Stake's 1967 "Countenance of Educational Evaluation" Article 375
Responsive Evaluation Approach 383
Substantive Structure of Responsive Evaluation 390
Functional Structure of Responsive Evaluation 390
An Application of Responsive Evaluation 392
Stake's Recent Rethinking of Responsive Evaluation 397

16 MICHAEL PATTON'S UTILIZATION-FOCUSED EVALUATION 403
Adherents of Utilization-Focused Evaluation 404
Some General Aspects of Patton's Utilization-Focused Evaluation 405
Intended Users of Utilization-Focused Evaluation 407
Focusing a Utilization-Focused Evaluation 407
The Personal Factor as Vital to an Evaluation's Success 408
The Evaluator's Roles 408
Utilization-Focused Evaluation and Values and Judgments 409
Employing Active-Reactive-Adaptive Processes to Negotiate with Users 410
Patton's Eclectic Approach 411
Planning Utilization-Focused Evaluations 411
Collecting and Analyzing Information and Reporting Findings 412
Summary of Premises of Utilization-Focused Evaluation 413
Strengths of the Utilization-Focused Evaluation Approach 414
Limitations of the Utilization-Focused Evaluation Approach 415

Part Four: Evaluation Tasks, Procedures, and Tools 421

17 IDENTIFYING AND ASSESSING EVALUATION OPPORTUNITIES 423
Sources of Evaluation Opportunities 423
Bidders' Conferences 431

18 FIRST STEPS IN ADDRESSING EVALUATION OPPORTUNITIES 435
Developing the Evaluation Team 436
Developing Thorough Familiarity with the Need for the Evaluation 437
Stipulating Standards for Guiding and Assessing the Evaluation 437
Establishing Institutional Support for the Projected Evaluation 437
Developing the Evaluation Proposal's Appendix 438
Planning for a Stakeholder Review Panel 439

19 DESIGNING EVALUATIONS 445
A Design Used for Evaluating the Performance Review System of a Military Organization 446
Generic Checklist for Designing Evaluations 462

20 BUDGETING EVALUATIONS 479
Ethical Imperatives in Budgeting Evaluations 480
Fixed-Price Budget for Evaluating a Personnel Evaluation System 483
Other Types of Evaluation Budgets 486
Generic Checklist for Developing Evaluation Budgets 493

21 CONTRACTING EVALUATIONS 505
Definitions of Evaluation Contracts and Memorandums of Agreement 506
Rationale for Evaluation Contracting 508
Addressing Organizational Contracting Requirements 511
Negotiating Evaluation Agreements 511
Evaluation Contracting Checklist 512

22 COLLECTING EVALUATIVE INFORMATION 519
Key Standards for Information Collection 519
An Information Collection Framework 540
Useful Methods for Collecting Information 543

23 ANALYZING AND SYNTHESIZING INFORMATION 557
General Orientation to Analyzing and Synthesizing Information 558
Principles for Analyzing and Synthesizing Information 559
Analysis of Quantitative Information 560
Analysis of Qualitative Information 575
Justified Conclusions and Decisions 580

24 COMMUNICATING EVALUATION FINDINGS 589
Review of Pertinent Analysis and Advice from Previous Chapters 590
Complex Needs and Challenges in Reporting Evaluation Findings 591
Establishing Conditions to Foster Use of Findings 592
Providing Interim Evaluative Feedback 600
Preparing and Delivering the Final Report 603
Providing Follow-Up Support to Enhance an Evaluation's Impact 619

Part Five: Metaevaluation and Institutionalizing and Mainstreaming Evaluation 629

25 METAEVALUATION: EVALUATING EVALUATIONS 631
Rationale for Metaevaluation 632
Evaluator and Client Responsibilities in Regard to Metaevaluation 634
Formative and Summative Metaevaluations 634
A Conceptual and Operational Definition of Metaevaluation 634
An Instructive Metaevaluation Case 640
Metaevaluation Tasks 643
Metaevaluation Arrangements and Procedures 647
Comparative Metaevaluations 662
Checklists for Use in Metaevaluations 664
The Role of Context and Resource Constraints 664

26 INSTITUTIONALIZING AND MAINSTREAMING EVALUATION 671
Review of This Book's Themes 671
Overview of the Remainder of the Chapter 672
Rationale and Key Principles for Institutionalizing and Mainstreaming Evaluation 673
Early Efforts to Help Organizations Institutionalize Evaluation 674
Recent Advances of Use in Institutionalizing and Mainstreaming Evaluation 675
Checklist for Use in Institutionalizing and Mainstreaming Evaluation 676

Glossary 691
References 713
Index 744
DANIEL L. STUFFLEBEAM, PhD, is Distinguished University Professor Emeritus at Western Michigan University, Kalamazoo.

CHRIS L. S. CORYN, PhD, is director of the Interdisciplinary PhD in Evaluation (IDPE) program and assistant professor in the Evaluation, Measurement, and Research (EMR) program at Western Michigan University. He is the executive editor of the Journal of MultiDisciplinary Evaluation.
Evaluation Theory, Models, and Applications, Second Edition

The second edition of the highly acclaimed Evaluation Theory, Models, and Applications presents the core concepts, approaches, and methods in program evaluation. The book's main contents are an overview of the evaluation field, a discussion of evaluation theory, a review of standards for guiding and judging evaluations, descriptions and judgments of the most widely used evaluation approaches, explanations of methods needed to apply any evaluation approach, and discussions of the overarching topics of metaevaluation and of institutionalizing and mainstreaming systematic evaluation. With the inclusion of the field's latest literature and reference to many actual evaluations, this Second Edition is an essential update for students, evaluators, and evaluation clients who need to stay current.

Authors Daniel Stufflebeam and Chris Coryn, widely experienced and published evaluation experts, describe and examine twenty-three evaluation approaches, including, new to this edition, transformative evaluation, participatory evaluation, consumer feedback, and meta-analysis. Building from its theoretical foundations, this Second Edition provides detailed practical direction along with supporting checklists for planning, budgeting, contracting, implementing, and reporting evaluations; for conducting metaevaluations; and for guiding organizations to institutionalize and mainstream sound evaluation practices. The book's highlighted evaluation approaches include:

- Experimental and quasi-experimental design evaluations
- Daniel Stufflebeam's CIPP Model
- Michael Scriven's Consumer-Oriented Evaluation
- Michael Patton's Utilization-Focused Evaluation
- Robert Stake's Responsive/Stakeholder-Centered Evaluation
- Case Study Evaluation

Key readings listed at the end of each chapter direct readers to the most important references for each topic. Learning objectives, review questions, student exercises, and instructor support materials complete the collection of tools.
