Available:
Item Barcode | Call Number | Material Type | Item Category |
---|---|---|---|
30000010179770 | HF5549.5.R3 G84 2008 | Open Access Book | Book |

On Order
Summary
Performance Evaluation is a hands-on text for practitioners, researchers, educators, and students on how to use scientifically based evaluations that are both rigorous and flexible. Author Ingrid Guerra-López, an internationally known evaluation expert, introduces the foundations of evaluation and presents the models most applicable to the performance improvement field. Her book offers a wide variety of tools and techniques that have proven successful and is organized to illustrate evaluation in the context of continual performance improvement.
Author Notes
Ingrid J. Guerra-López, PhD, is an associate professor at Wayne State University, director of the Institute for Learning and Performance Improvement, an associate research professor at the Sonora Institute of Technology in Mexico, and principal of Intelligence Gathering Systems.
Table of Contents
Acknowledgments | p. xi |
Preface | p. xiii |
The Author | p. xv |
Part 1 Introduction to Evaluation | |
1 Foundations of Evaluation | p. 3 |
A Brief Overview of Evaluation History | p. 4 |
Evaluation: Purpose and Definition | p. 5 |
Performance Improvement: A Conceptual Framework | p. 8 |
Making Evaluation Happen: Ensuring Stakeholders' Buy-In | p. 9 |
The Evaluator: A Job or a Role? | p. 10 |
The Relationship to Other Investigative Processes | p. 11 |
When Does Evaluation Occur? | p. 15 |
General Evaluation Orientations | p. 18 |
Challenges That Evaluators Face | p. 20 |
Ensuring Commitment | p. 23 |
Benefits of Evaluation | p. 24 |
Basic Definitions | p. 25 |
2 Principles of Performance-Based Evaluation | p. 27 |
Principle 1 Evaluation Is Based on Asking the Right Questions | p. 28 |
Principle 2 Evaluation of Process Is a Function of Obtained Results | p. 32 |
Principle 3 Goals and Objectives of Organizations Should Be Based on Valid Needs | p. 33 |
Principle 4 Derive Valid Needs Using a Top-Down Approach | p. 34 |
Principle 5 Every Organization Should Aim for the Best That Society Can Attain | p. 34 |
Principle 6 The Set of Evaluation Questions Drives the Evaluation Study | p. 35 |
Part 2 Models of Evaluation | |
3 Overview of Existing Evaluation Models | p. 39 |
Overview of Classic Evaluation Models | p. 40 |
Selected Evaluation Models | p. 42 |
Selecting a Model | p. 43 |
Conceptualizing a Useful Evaluation That Fits the Situation | p. 44 |
4 Kirkpatrick's Four Levels of Evaluation | p. 47 |
Kirkpatrick's Levels | p. 49 |
Comments on the Model | p. 54 |
Strengths and Limitations | p. 55 |
Application Example: Wagner (1995) | p. 56 |
5 Phillips's Return-on-Investment Methodology | p. 61 |
Phillips's ROI Process | p. 63 |
Comments on the Model | p. 67 |
Strengths and Limitations | p. 70 |
Application Example: Blake (1999) | p. 70 |
6 Brinkerhoff's Success Case Method | p. 75 |
The SCM Process | p. 77 |
Strengths and Weaknesses | p. 78 |
Application Example: Brinkerhoff (2005) | p. 79 |
7 The Impact Evaluation Process | p. 81 |
The Elements of the Process | p. 83 |
Comments on the Model | p. 96 |
Strengths and Limitations | p. 97 |
Application Example | p. 97 |
8 The CIPP Model | p. 107 |
Stufflebeam's Four Types of Evaluation | p. 108 |
Articulating Core Values of Programs and Solutions | p. 111 |
Methods Used in CIPP Evaluations | p. 112 |
Strengths and Limitations | p. 113 |
Application Example: Filella-Guiu and Blanch-Pana (2002) | p. 113 |
9 Evaluating Evaluations | p. 117 |
Evaluation Standards | p. 119 |
The American Evaluation Association Principles for Evaluators | p. 120 |
Application Example: Lynch et al. (2003) | p. 122 |
Part 3 Tools and Techniques of Evaluation | |
10 Data | p. 133 |
Characteristics of Data | p. 135 |
Scales of Measurement | p. 137 |
Defining Required Data from Performance Objectives | p. 139 |
Deriving Measurable Indicators | p. 141 |
Finding Data Sources | p. 152 |
Follow-Up Questions and Data | p. 155 |
11 Data Collection | p. 159 |
Observation Methodology and the Purpose of Measurement | p. 160 |
Designing the Experiment | p. 186 |
Problems with Classic Experimental Studies in Applied Settings | p. 188 |
Time-Series Studies | p. 188 |
Simulations and Games | p. 189 |
Document-Centered Methods | p. 191 |
Conclusion | p. 192 |
12 Analysis of Evaluation Data: Tools and Techniques | p. 195 |
Analysis of Models and Patterns | p. 196 |
Analysis Using Structured Discussion | p. 197 |
Methods of Quantitative Analysis | p. 199 |
Statistics | p. 200 |
Graphical Representations of Data | p. 210 |
Measures of Relationship | p. 212 |
Inferential Statistics: Parametric and Nonparametric | p. 214 |
Interpretation | p. 217 |
13 Communicating the Findings | p. 221 |
Recommendations | p. 222 |
Considerations for Implementing Recommendations | p. 225 |
Developing the Report | p. 226 |
The Evaluator's Role After the Report | p. 235 |
Part 4 Continual Improvement | |
14 Common Errors in Evaluation | p. 239 |
Errors of System Mapping | p. 240 |
Errors of Logic | p. 242 |
Errors of Procedure | p. 244 |
Conclusion | p. 246 |
15 Continual Improvement | p. 249 |
What Is Continual Improvement? | p. 250 |
Monitoring Performance | p. 250 |
Adjusting Performance | p. 253 |
The Role of Leadership | p. 254 |
16 Contracting for Evaluation Services | p. 257 |
The Contract | p. 258 |
Contracting Controls | p. 260 |
Ethics and Professionalism | p. 262 |
Sample Statement of Work | p. 262 |
17 Intelligence Gathering for Decision Making | p. 271 |
Performance Measurement Systems | p. 273 |
Issues in Performance Measurement Systems | p. 275 |
Conclusion | p. 277 |
18 The Future of Evaluation in Performance Improvement | p. 279 |
Evaluation and Measurement in Performance Improvement Today | p. 281 |
What Does the Future Hold? | p. 282 |
Conclusion | p. 283 |
References and Related Readings | p. 285 |
Index | p. 295 |