10-Step Evaluation for Training and Performance Improvement
Author: Seung Youn (Yonnie) Chyung
Language: English
Electronic book text – 10 Oct 2018
Price: 403.65 lei
Old price: 474.88 lei
-15% · New
Express points: 605
Estimated price in other currencies:
68.16€ • 71.48$ • 56.83£
Temporarily unavailable
Order line: 021 569.72.76
Specifications
ISBN-13: 9781544323978
ISBN-10: 1544323972
Pages: 352
Dimensions: 187 x 232 mm
Edition: 1
Publisher: SAGE Publications
Series: Sage Publications, Inc
Place of publication: Thousand Oaks, United States
Reviews
“This is a very well written book. It is easy to read, follow, and the application of the material from chapter to chapter is well constructed.”
“Yonnie Chyung has clearly and concisely discussed a ten-step process for evaluation that will appeal to scholarly practitioners across multiple disciplines. Incorporated throughout the text are user-friendly examples, tables, and samples.”
“10-Step Evaluation for Training and Performance Improvement provides tools for practitioners, students, professors, evaluators, and so many more to address questions as they relate to practical program evaluation. This text offers a solid theoretical framework while offering practicality and readability to its audiences. The tools provided within the text share a best-practice point of view that is easily adaptable to many situations and various environments.”
"This book was an exceptional point-by-point, systematic process for my students to develop project-based learning cases of their own. Overall, it was a practical application to program evaluation."
Contents
List of Tables
List of Figures
List of Exhibits
Preface
About the Author
Introduction
Performance Improvement and Evaluation
What Is Evaluation?
What Is Not Evaluation?
How Does Evaluation Compare With Research?
Program Evaluation in the HPI Context
Evaluation Is Often Neglected
Different Evaluation Designs Used in Program Evaluation
Descriptive Case Study Type Evaluation Design
Frameworks for Conducting Evaluations in the HPI Context
The 10-Step Evaluation Procedure
Chapter Summary
Chapter Discussion
Chapter 1. Identify an Evaluand (Step 1) and Its Stakeholders (Step 2)
Identify a Performance Improvement Intervention as an Evaluand
Use the 5W1H Method to Understand the Intervention Program
Ask Why the Intervention Program Was Implemented
Check If Program Goals Are Based on Needs
Sell Evaluation to the Client
Identify Three Groups of Stakeholders
Chapter Summary
Chapter Discussion
Now, Your Turn—Identify an Evaluand and Its Stakeholders
Chapter 2. Identify the Purpose of Evaluation (Step 3)
Differentiate Evaluation From Needs Assessment
Gather Information About the Evaluation Purpose
Assess Stakeholders’ Needs for the Program and the Evaluation
Determine If the Evaluation Is a Formative or Summative Type
Determine If the Evaluation Is Goal Based or Goal Free
Determine If the Evaluation Is Merit Focused or Worth Focused
Keep in Mind Using a System-Focused Evaluation Approach
Write an Evaluation Purpose Statement
Chapter Summary
Chapter Discussion
Now, Your Turn—Identify the Purpose of Evaluation
Chapter 3. Assess Evaluation Feasibility and Risk Factors
Incorporate Macro-Level Tasks Into Micro-Level Steps
Assess Feasibility of the Evaluation Project
List Project Assumptions
Estimate Tasks and Time Involving Stakeholders
Assess Risk Factors for the Evaluation Project
Chapter Summary
Chapter Discussion
Now, Your Turn—Assess Feasibility and Risk Factors
Chapter 4. Write a Statement of Work
Prepare a Statement of Work for the Evaluation
Determine Sections to Be Included in a Statement of Work
Develop a Gantt Chart
Review a Sample Statement of Work
Now, Your Turn—Write a Statement of Work
Chapter 5. Develop a Program Logic Model (Step 4)
Apply a Theory-Based, If–Then Logic to Developing a Program
Review United Way’s Program Outcome Model
Review Kellogg Foundation’s Program Logic Model
Review Brinkerhoff’s Training Impact Model Compared to the Four-Level Training Evaluation Framework
Compare Elements Used in Different Frameworks
Develop a Program Logic Model
Develop a Training Impact Model
Chapter Summary
Chapter Discussion
Now, Your Turn—Develop a Program Logic Model or a Training Impact Model
Chapter 6. Determine Dimensions and Importance Weighting (Step 5)
Think About Dimensions of the Evaluand to Investigate
Start With the Stakeholders’ Needs
Relate the Purpose of Evaluation to the Program Logic Model Elements
Incorporate Relevant Theoretical Frameworks and Professional Standards
Write Dimensional Evaluation Questions
Determine Importance Weighting Based on Usage of Dimensional Findings
Recognize a Black Box, Gray Box, or Clear Box Evaluation
Finalize the Number of Dimensions
Chapter Summary
Chapter Discussion
Now, Your Turn—Determine Dimensions and Importance Weighting
Chapter 7. Determine Data Collection Methods (Step 6)
Determine Evaluation Designs for Dimensional Evaluations
Select Data Collection Methods That Allow Direct Measures of Dimensions
Apply Critical Multiplism
Triangulate Multiple Sets of Data
Select Appropriate Methods When Using the Four-Level Training Evaluation Model
Select Appropriate Methods When Using Brinkerhoff’s Success Case Method
Review an Example of Data Collection Methods
Use an Iterative Design Approach
Assess Feasibility and Risk Factors Again
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Determine Data Collection Methods
Chapter 8. Write an Evaluation Proposal and Get Approval
Determine Sections to Be Included in an Evaluation Proposal
Review a Sample Evaluation Proposal
Now, Your Turn—Write an Evaluation Proposal
Chapter 9. Develop Data Collection Instruments I—Self-Administered Surveys (Step 7)
Comply With IRB Requirements
Use Informed Consent Forms
Determine Materials to Be Developed for Different Data Collection Methods
Distinguish Anonymity From Confidentiality
Develop Materials for Conducting Self-Administered Surveys
Determine Whether to Use Closed-Ended Questions, Open-Ended Questions, or Both
Ask Specific Questions That Measure the Quality of a Dimension
Design Survey Items Using a Question or Statement Format
Recognize Nominal, Ordinal, Interval, and Ratio Scales
Decide Whether to Include or Omit a Midpoint in the Likert Scale
Decide Whether to Use Ascending or Descending Order of the Likert Scale Options
Follow Other Guidelines for Developing Survey Items
Develop Survey Items That Measure a Construct
Test Validity and Reliability of a Survey Instrument
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Develop Survey Instruments
Chapter 10. Develop Data Collection Instruments II—Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests (Step 7)
Determine Whether to Use a Structured, Unstructured, or Semi-Structured Interview
Develop Materials for Conducting Interviews or Focus Groups
Solicit Interview Volunteers at the End of a Self-Administered Web-Based Survey
Develop Materials for Conducting Observations
Develop Materials for Conducting Extant Data Reviews
Develop Materials for Administering Tests
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Develop Instruments for Conducting Interviews, Focus Groups, Observations, Extant Data Reviews, and Tests
Chapter 11. Collect Data (Step 8)
Follow Professional and Ethical Guidelines
What Would You Do?
Use Strategies to Collect Data Successfully and Ethically
Use Strategies When Collecting Data From Self-Administered Surveys
Use Strategies When Collecting Data From Interviews and Focus Groups
Use Strategies When Collecting Data From Observations and Tests
Use Strategies to Ensure Anonymity or Confidentiality of Data
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Collect Data
Chapter 12. Analyze Data With Rubrics (Step 9)
Use Evidence-Based Practice
Keep in Mind: Evaluation = Measurement + Valuation With Rubrics
Apply the Same or Different Weighting to the Multiple Sets of Data
Analyze Structured Survey Data With Rubrics
Analyze Unstructured Survey or Interview Data With Rubrics
Analyze Semi-Structured Survey or Interview Data With Rubrics
Analyze Data Obtained From Observations, Extant Data Reviews, and Tests With Rubrics
Determine the Number of Levels and Labels for Rubrics
Triangulate Results Obtained From Multiple Sources for Each Dimension
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Analyze Data With Rubrics
Chapter 13. Draw Conclusions (Step 10)
Revisit Formative or Summative Use of Evaluation Findings
Develop a Synthesis Rubric
Draw Evidence-Based Conclusions and Recommendations
Conduct Formative Meta-Evaluations
Chapter Summary
Chapter Discussion
Now, Your Turn—Draw Conclusions and Make Recommendations
Chapter 14. Write a Final Report and Conduct a Summative Meta-Evaluation
Extend the Evaluation Proposal to a Final Report
Present Dimensional Results in the Evaluation Results Section
Present Supporting Information in Appendices
Present Conclusions
Report the Findings Ethically
Conduct a Summative Meta-Evaluation
Report Limitations
Write an Executive Summary
Present the Final Report to Stakeholders
Follow Up With Stakeholders
Present Complete Sections in a Final Report
Now, Your Turn—Write a Final Report
Appendix A. A Summary of the Frameworks Used
Appendix B. Evaluation Development Worksheets
Appendix C. Survey Questionnaire Make
Appendix D. A Sample Survey Questionnaire Measuring Multiple Dimensions, Sample Rubrics, and Reliability Testing With IBM® SPSS® Statistics
Appendix E. Experimental Studies and Data Analysis With t-Tests Using Excel
Glossary
References
Index
Description
Written with a learning-by-doing approach in mind, Yonnie Chyung’s 10-Step Evaluation for Training and Performance Improvement gives students actionable instruction for identifying, planning and implementing a client-based program evaluation. The book introduces readers to multiple evaluation frameworks and uses problem-based learning to guide them through a 10-step evaluation process. As students read the chapters, they produce specific deliverables that culminate in a completed evaluation project.