Data Collection

Planning for and Collecting All Types of Data
Measurement and Evaluation Series, 1st edition

by: Patricia Pulliam Phillips, Cathy A. Stawarski

39,99 €

Publisher: Wiley
Format: EPUB
Published: May 12, 2016
ISBN/EAN: 9781119254782
Language: English
Number of pages: 192

DRM-protected eBook; to read it you will need, for example, Adobe Digital Editions and an Adobe ID.

Description

Data Collection

Data Collection is the second of six books in the Measurement and Evaluation Series from Pfeiffer. The proven ROI Methodology, developed by the ROI Institute, provides a practical system for evaluation planning, data collection, data analysis, and reporting. All six books in the series offer the latest tools, most current research, and practical advice for measuring ROI in a variety of settings.

Data Collection offers an effective process for collecting data that is essential to the implementation of the ROI Methodology. The authors outline the techniques, processes, and critical issues involved in successful data collection. The book examines the various methods of data collection, including questionnaires, interviews, focus groups, observation, action plans, performance contracts, and monitoring records. Written for evaluators, facilitators, analysts, designers, coordinators, and managers, Data Collection is a valuable guide for collecting data that are adequate in quantity and quality to produce a complete and credible analysis.
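For context, the ROI figure this series is built around is conventionally reported as an ROI percentage (net program benefits divided by fully loaded program costs, times 100) together with a benefit-cost ratio. The short Python sketch below is purely illustrative and is not taken from the book; the function names and dollar figures are invented, and it only shows where the collected benefit and cost data ultimately end up.

# Illustrative sketch only, not from the book: the two summary figures
# conventionally reported in the ROI Methodology, computed from monetary
# program benefits and fully loaded program costs.

def benefit_cost_ratio(program_benefits: float, program_costs: float) -> float:
    """BCR = program benefits / program costs."""
    return program_benefits / program_costs


def roi_percent(program_benefits: float, program_costs: float) -> float:
    """ROI (%) = (net program benefits / program costs) * 100."""
    return (program_benefits - program_costs) / program_costs * 100


if __name__ == "__main__":
    # Invented example figures: 750,000 in benefits against 425,000 in costs.
    print(f"BCR: {benefit_cost_ratio(750_000, 425_000):.2f}")  # 1.76
    print(f"ROI: {roi_percent(750_000, 425_000):.1f}%")        # 76.5%

The data collection methods listed in the contents that follow are what produce the benefit figures feeding a calculation like this one.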
Contents

Principles of the ROI Methodology.

1. Using Questionnaires and Surveys.
   Types of Questions.
   Questionnaire Design Steps.
   Determine the Specific Information Needed.
   Involve Stakeholders in the Process.
   Select the Types of Questions.
   Develop the Questions.
   Check the Reading Level.
   Test the Questions.
   Address the Anonymity Issue.
   Design for Ease of Tabulation and Analysis.
   Develop the Completed Questionnaire and Prepare a Data Summary.
   Improving the Response Rate for Questionnaires and Surveys.
   Provide Advance Communication.
   Communicate the Purpose.
   Describe the Data Integration Process.
   Keep the Questionnaire as Simple as Possible.
   Simplify the Response Process.
   Use Local Manager Support.
   Let the Participants Know That They Are Part of a Sample.
   Consider Incentives.
   Have an Executive Sign the Introductory Letter.
   Use Follow-Up Reminders.
   Send a Copy of the Results to the Participants.
   Review the Questionnaire with Participants.
   Consider a Captive Audience.
   Communicate the Timing of Data Flow.
   Select the Appropriate Media.
   Consider Anonymous or Confidential Input.
   Pilot Test the Questionnaire.
   Explain How Long Completing the Questionnaire Will Take.
   Personalize the Process.
   Provide an Update.
   Final Thoughts.

2. Using Tests.
   Types of Tests.
   Norm-Referenced Tests.
   Criterion-Referenced Tests.
   Performance Tests.
   Simulations.
   Electromechanical Simulation.
   Task Simulation.
   Business Games.
   In-Basket Simulation.
   Case Study.
   Role-Playing.
   Informal Tests.
   Exercises, Problems, or Activities.
   Self-Assessment.
   Facilitator Assessment.
   Final Thoughts.

3. Using Interviews, Focus Groups, and Observation.
   Interviews.
   Types of Interviews.
   Interview Guidelines.
      Develop the Questions to Be Asked.
      Test the Interview.
      Prepare the Interviewers.
      Provide Clear Instructions to the Participants.
      Schedule the Interviews.
   Focus Groups.
   Applications of Focus Groups.
   Guidelines.
      Plan Topics, Questions, and Strategy Carefully.
      Keep the Group Size Small.
      Use a Representative Sample.
      Use Experienced Facilitators.
   Observations.
   Guidelines for Effective Observation.
      Observations Should Be Systematic.
      Observers Should Be Knowledgeable.
      The Observer’s Influence Should Be Minimized.
      Observers Should Be Selected Carefully.
      Observers Must Be Fully Prepared.
   Observation Methods.
      Behavior Checklist.
      Delayed Report.
      Video Recording.
      Audio Monitoring.
      Computer Monitoring.
   Final Thoughts.

4. Using Other Data Collection Methods.
   Business Performance Monitoring.
   Using Current Measures.
      Identify Appropriate Measures.
      Convert Current Measures to Usable Ones.
   Developing New Measures.
   Action Planning.
   Developing an Action Plan.
   Using Action Plans Successfully.
      Communicate the Action Plan Requirement Early.
      Describe the Action Planning Process at the Beginning of the Program.
      Teach the Action Planning Process.
      Allow Time to Develop the Plan.
      Have the Facilitator Approve Action Plans.
      Require Participants to Assign a Monetary Value to Each Improvement.
      Ask Participants to Isolate the Effects of the Program.
      Ask Participants to Provide a Confidence Level for Estimates.
      Require That Action Plans Be Presented to the Group.
      Explain the Follow-Up Process.
      Collect Action Plans at the Stated Follow-Up Time.
      Summarize the Data and Calculate the ROI.
   Applying Action Plans.
   Identifying Advantages and Disadvantages of Action Plans.
   Performance Contracts.
   Final Thoughts.

5. Measuring Reaction and Planned Action.
   Why Measure Reaction and Planned Action?
   Customer Satisfaction.
   Immediate Adjustments.
   Team Evaluation.
   Predictive Capability.
   Importance of Other Levels of Evaluation.
   Areas of Feedback.
   Data Collection Issues.
   Timing.
   Methods.
   Administrative Guidelines.
   Uses of Reaction Data.
   Final Thoughts.

6. Measuring Learning and Confidence.
   Why Measure Learning and Confidence?
   The Learning Organization.
   Compliance Issues.
   Development of Competencies.
   Certification.
   Consequences of an Unprepared Workforce.
   The Role of Learning in Programs.
   Measurement Issues.
   Challenges.
   Program Objectives.
   Typical Measures.
   Timing.
   Data Collection Methods.
   Administrative Issues.
   Validity and Reliability.
   Consistency.
   Pilot Testing.
   Scoring and Reporting.
   Confronting Failure.
   Uses of Learning Data.
   Final Thoughts.

7. Measuring Application and Implementation.
   Why Measure Application and Implementation?
   Obtain Essential Information.
   Track Program Focus.
   Discover Problems and Opportunities.
   Reward Effectiveness.
   Challenges.
   Linking Application with Learning.
   Building Data Collection into the Program.
   Ensuring a Sufficient Amount of Data.
   Addressing Application Needs at the Outset.
   Measurement Issues.
   Methods.
   Objectives.
   Areas of Coverage.
   Data Sources.
   Timing.
   Responsibilities.
   Data Collection Methods.
   Questionnaires.
      Progress with Objectives.
      Use of Program Materials and Handouts.
      Application of Knowledge and Skills.
      Changes in Work Activities.
      Improvements or Accomplishments.
      Definition of the Measure.
      Amount of Change.
      Unit Value.
      Basis for Value.
      Total Annual Impact.
      Other Factors.
      Improvements Linked with the Program.
      Confidence Level.
      Perception of Investment in the Program.
      Link with Output Measures.
      Other Benefits.
      Barriers.
      Enablers.
      Management Support.
      Other Solutions.
      Target Audience Recommendations.
      Suggestions for Improvement.
   Interviews, Focus Groups, and Observation.
   Action Plans.
   Barriers to Application.
   Uses of Application Data.
   Final Thoughts.

8. Measuring Impact and Consequences.
   Why Measure Business Impact?
   Impact Data Provide Higher-Level Information on Performance.
   Impact Data Represent the Business Driver of a Program.
   Impact Data Provide Value for Sponsors.
   Impact Data Are Easy to Measure.
   Effective Impact Measures.
   Hard Data Measures.
   Soft Data Measures.
   Tangible Versus Intangible Measures.
   Impact Objectives.
   Linking Specific Measures to Programs.
   Sources of Impact Data.
   Data Collection Methods.
   Monitoring Business Performance Data.
      Identify Appropriate Measures.
      Convert Current Measures to Usable Ones.
      Develop New Measures.
   Action Plans.
      Set Goals and Targets.
      Define the Unit of Measure.
      Place a Monetary Value on Each Improvement.
      Implement the Action Plan.
      Document Specific Improvements.
      Isolate the Effects of the Program.
      Provide a Confidence Level for Estimates.
      Collect Action Plans at Specified Time Intervals.
      Summarize the Data and Calculate the ROI.
   Performance Contracts.
   Questionnaires.
   Final Thoughts.

9. Selecting the Proper Data Collection Method.
   Matching Exercise.
   Selecting the Appropriate Method for Each Level.
   Type of Data.
   Investment of Participants' Time.
   Investment of Managers' Time.
   Cost.
   Disruption of Normal Work Activities.
   Accuracy.
   Built-In Design Possibility.
   Utility of an Additional Method.
   Cultural Bias of Data Collection Method.
   Final Thoughts.

Index.

About the Authors.
Patricia Pulliam Phillips is an internationally recognized author, consultant, and president and CEO of the ROI Institute, Inc. Phillips provides consulting services to organizations worldwide. She helps organizations build capacity in the ROI Methodology by facilitating the ROI certification process and teaching the ROI Methodology through workshops and graduate-level courses.

Cathy A. Stawarski is program manager of the Strategic Performance Improvement and Evaluation program at the Human Resources Research Organization (HumRRO) in Alexandria, Virginia. She has more than twenty-five years of experience in research, training and development, and program evaluation. Throughout her nearly twenty years at HumRRO, she has worked primarily with clients in the federal sector. Her work includes leading and conducting the evaluation of leadership and human capital initiatives as well as assisting organizations in developing comprehensive evaluation strategies.

The ROI Institute, Inc., is a benchmarking, research, and information sharing organization that provides consulting services, workshops, and certification in the ROI Methodology. Widely considered the leading authority on evaluation and measurement of learning and development in organizations, the ROI Institute conducts workshops and offers certification for thousands of practitioners through a variety of strategic partners.

You might also be interested in these products:

Mindfulness
by: Gill Hasson
PDF ebook
12,99 €

Counterparty Credit Risk, Collateral and Funding
by: Damiano Brigo, Massimo Morini, Andrea Pallavicini
EPUB ebook
69,99 €