Title:
Performance evaluation : proven approaches for improving program and organizational performance
Personal Author:
Guerra-López, Ingrid J.
Publication Information:
New York : Jossey-Bass, 2008
Physical Description:
xiv, 304 p. : ill. ; 23 cm.
ISBN:
9780787988838

Available:

Item Barcode: 30000010179770
Call Number: HF5549.5.R3 G84 2008
Material Type: Open Access Book
Item Category 1: Book

Summary

Performance Evaluation is a hands-on text that shows practitioners, researchers, educators, and students how to use scientifically based evaluations that are both rigorous and flexible. Author Ingrid Guerra-López, an internationally known evaluation expert, introduces the foundations of evaluation and presents the models most applicable to the performance improvement field. Her book offers a wide variety of proven tools and techniques and is organized to illustrate evaluation in the context of continual performance improvement.


Author Notes

Ingrid J. Guerra-López, PhD, is an associate professor at Wayne State University, director of the Institute for Learning and Performance Improvement, associate research professor at the Sonora Institute of Technology in Mexico, and principal of Intelligence Gathering Systems.


Table of Contents

Acknowledgments p. xi
Preface p. xiii
The Author p. xv
Part 1 Introduction to Evaluation
1 Foundations of Evaluation p. 3
A Brief Overview of Evaluation History p. 4
Evaluation: Purpose and Definition p. 5
Performance Improvement: A Conceptual Framework p. 8
Making Evaluation Happen: Ensuring Stakeholders' Buy-In p. 9
The Evaluator: A Job or a Role? p. 10
The Relationship to Other Investigative Processes p. 11
When Does Evaluation Occur? p. 15
General Evaluation Orientations p. 18
Challenges That Evaluators Face p. 20
Ensuring Commitment p. 23
Benefits of Evaluation p. 24
Basic Definitions p. 25
2 Principles of Performance-Based Evaluation p. 27
Principle 1 Evaluation Is Based on Asking the Right Questions p. 28
Principle 2 Evaluation of Process Is a Function of Obtained Results p. 32
Principle 3 Goals and Objectives of Organizations Should Be Based on Valid Needs p. 33
Principle 4 Derive Valid Needs Using a Top-Down Approach p. 34
Principle 5 Every Organization Should Aim for the Best That Society Can Attain p. 34
Principle 6 The Set of Evaluation Questions Drives the Evaluation Study p. 35
Part 2 Models of Evaluation
3 Overview of Existing Evaluation Models p. 39
Overview of Classic Evaluation Models p. 40
Selected Evaluation Models p. 42
Selecting a Model p. 43
Conceptualizing a Useful Evaluation That Fits the Situation p. 44
4 Kirkpatrick's Four Levels of Evaluation p. 47
Kirkpatrick's Levels p. 49
Comments on the Model p. 54
Strengths and Limitations p. 55
Application Example: Wagner (1995) p. 56
5 Phillips's Return-on-Investment Methodology p. 61
Phillips's ROI Process p. 63
Comments on the Model p. 67
Strengths and Limitations p. 70
Application Example: Blake (1999) p. 70
6 Brinkerhoff's Success Case Method p. 75
The SCM Process p. 77
Strengths and Weaknesses p. 78
Application Example: Brinkerhoff (2005) p. 79
7 The Impact Evaluation Process p. 81
The Elements of the Process p. 83
Comments on the Model p. 96
Strengths and Limitations p. 97
Application Example p. 97
8 The CIPP Model p. 107
Stufflebeam's Four Types of Evaluation p. 108
Articulating Core Values of Programs and Solutions p. 111
Methods Used in CIPP Evaluations p. 112
Strengths and Limitations p. 113
Application Example: Filella-Guiu and Blanch-Pana (2002) p. 113
9 Evaluating Evaluations p. 117
Evaluation Standards p. 119
The American Evaluation Association Principles for Evaluators p. 120
Application Example: Lynch et al. (2003) p. 122
Part 3 Tools and Techniques of Evaluation
10 Data p. 133
Characteristics of Data p. 135
Scales of Measurement p. 137
Defining Required Data from Performance Objectives p. 139
Deriving Measurable Indicators p. 141
Finding Data Sources p. 152
Follow-Up Questions and Data p. 155
11 Data Collection p. 159
Observation Methodology and the Purpose of Measurement p. 160
Designing the Experiment p. 186
Problems with Classic Experimental Studies in Applied Settings p. 188
Time-Series Studies p. 188
Simulations and Games p. 189
Document-Centered Methods p. 191
Conclusion p. 192
12 Analysis of Evaluation Data: Tools and Techniques p. 195
Analysis of Models and Patterns p. 196
Analysis Using Structured Discussion p. 197
Methods of Quantitative Analysis p. 199
Statistics p. 200
Graphical Representations of Data p. 210
Measures of Relationship p. 212
Inferential Statistics: Parametric and Nonparametric p. 214
Interpretation p. 217
13 Communicating the Findings p. 221
Recommendations p. 222
Considerations for Implementing Recommendations p. 225
Developing the Report p. 226
The Evaluator's Role After the Report p. 235
Part 4 Continual Improvement
14 Common Errors in Evaluation p. 239
Errors of System Mapping p. 240
Errors of Logic p. 242
Errors of Procedure p. 244
Conclusion p. 246
15 Continual Improvement p. 249
What Is Continual Improvement? p. 250
Monitoring Performance p. 250
Adjusting Performance p. 253
The Role of Leadership p. 254
16 Contracting for Evaluation Services p. 257
The Contract p. 258
Contracting Controls p. 260
Ethics and Professionalism p. 262
Sample Statement of Work p. 262
17 Intelligence Gathering for Decision Making p. 271
Performance Measurement Systems p. 273
Issues in Performance Measurement Systems p. 275
Conclusion p. 277
18 The Future of Evaluation in Performance Improvement p. 279
Evaluation and Measurement in Performance Improvement Today p. 281
What Does the Future Hold? p. 282
Conclusion p. 283
References and Related Readings p. 285
Index p. 295