Title:
Data collection : planning for and collecting all types of data
Personal Author:
Phillips, Patricia Pulliam
Series:
The measurement and evaluation series ; 2
Publication Information:
San Francisco, CA : Pfeiffer, 2008
Physical Description:
xxv, 155 p. : ill. ; 23 cm.
ISBN:
9780787987183

Available:

Item Barcode: 30000010215635
Call Number: HD69.P75 P446 2008
Material Type: Open Access Book
Item Category 1: Book


Summary

Data Collection

Data Collection is the second of six books in the Measurement and Evaluation Series from Pfeiffer. The proven ROI Methodology--developed by the ROI Institute--provides a practical system for evaluation planning, data collection, data analysis, and reporting. All six books in the series offer the latest tools, most current research, and practical advice for measuring ROI in a variety of settings.

Data Collection offers an effective process for collecting data that is essential to the implementation of the ROI Methodology. The authors outline the techniques, processes, and critical issues involved in successful data collection. The book examines the various methods of data collection, including questionnaires, interviews, focus groups, observation, action plans, performance contracts, and monitoring records. Written for evaluators, facilitators, analysts, designers, coordinators, and managers, Data Collection is a valuable guide for collecting data that are adequate in quantity and quality to produce a complete and credible analysis.


Author Notes

Patricia Pulliam Phillips is an internationally recognized author, consultant, and president and CEO of the ROI Institute, Inc.
Cathy A. Stawarski is program manager of the Strategic Performance Improvement and Evaluation program at the Human Resources Research Organization (HumRRO) in Alexandria, Virginia.


Table of Contents

Acknowledgments from the Editors p. xxi
Principles of the ROI Methodology p. xxiii
Chapter 1 Using Questionnaires and Surveys p. 1
Types of Questions p. 1
Questionnaire Design Steps p. 2
Determine the Specific Information Needed p. 2
Involve Stakeholders in the Process p. 3
Select the Types of Questions p. 3
Develop the Questions p. 3
Check the Reading Level p. 3
Test the Questions p. 4
Address the Anonymity Issue p. 4
Design for Ease of Tabulation and Analysis p. 4
Develop the Completed Questionnaire and Prepare a Data Summary p. 5
Improving the Response Rate for Questionnaires and Surveys p. 5
Provide Advance Communication p. 5
Communicate the Purpose p. 6
Describe the Data Integration Process p. 6
Keep the Questionnaire as Simple as Possible p. 6
Simplify the Response Process p. 6
Use Local Manager Support p. 7
Let the Participants Know That They Are Part of a Sample p. 7
Consider Incentives p. 7
Have an Executive Sign the Introductory Letter p. 8
Use Follow-Up Reminders p. 8
Provide a Copy of the Results to the Participants p. 8
Review the Questionnaire with Participants p. 9
Consider a Captive Audience p. 9
Communicate the Timing of Data Flow p. 9
Select the Appropriate Medium p. 10
Consider Anonymous or Confidential Input p. 10
Pilot Test the Questionnaire p. 10
Explain How Long Completing the Questionnaire Will Take p. 11
Personalize the Process p. 11
Provide an Update p. 11
Final Thoughts p. 12
Chapter 2 Using Tests p. 13
Types of Tests p. 13
Norm-Referenced Tests p. 13
Criterion-Referenced Tests p. 14
Performance Tests p. 14
Simulations p. 16
Electromechanical Simulation p. 17
Task Simulation p. 17
Business Games p. 17
In-Basket Simulation p. 17
Case Study p. 18
Role-Playing p. 18
Informal Tests p. 19
Exercises, Problems, or Activities p. 19
Self-Assessment p. 20
Facilitator Assessment p. 20
Final Thoughts p. 21
Chapter 3 Using Interviews, Focus Groups, and Observation p. 23
Interviews p. 23
Types of Interviews p. 24
Interview Guidelines p. 24
Develop the Questions to Be Asked p. 24
Test the Interview p. 24
Prepare the Interviewers p. 25
Provide Clear Instructions to the Participants p. 25
Schedule the Interviews p. 25
Focus Groups p. 25
Applications of Focus Groups p. 26
Guidelines p. 27
Plan Topics, Questions, and Strategy Carefully p. 27
Keep the Group Size Small p. 27
Use a Representative Sample p. 27
Use Experienced Facilitators p. 28
Observations p. 28
Guidelines for Effective Observation p. 28
Observations Should Be Systematic p. 29
Observers Should Be Knowledgeable p. 29
The Observer's Influence Should Be Minimized p. 29
Observers Should Be Selected Carefully p. 30
Observers Must Be Fully Prepared p. 30
Observation Methods p. 30
Behavior Checklist p. 30
Delayed Report p. 31
Video Recording p. 31
Audio Monitoring p. 32
Computer Monitoring p. 32
Final Thoughts p. 32
Chapter 4 Using Other Data Collection Methods p. 35
Business Performance Monitoring p. 35
Using Current Measures p. 36
Identify Appropriate Measures p. 36
Convert Current Measures to Usable Ones p. 36
Developing New Measures p. 37
Action Planning p. 38
Developing an Action Plan p. 40
Using Action Plans Successfully p. 42
Communicate the Action Plan Requirement Early p. 42
Describe the Action Planning Process at the Beginning of the Program p. 42
Teach the Action Planning Process p. 42
Allow Time to Develop the Plan p. 43
Have the Facilitator Approve Action Plans p. 43
Require Participants to Assign a Monetary Value to Each Improvement p. 43
Ask Participants to Isolate the Effects of the Program p. 44
Ask Participants to Provide a Confidence Level for Estimates p. 44
Require That Action Plans Be Presented to the Group p. 45
Explain the Follow-Up Process p. 45
Collect Action Plans at the Stated Follow-Up Time p. 46
Summarize the Data and Calculate the ROI p. 46
Applying Action Plans p. 48
Identifying Advantages and Disadvantages of Action Plans p. 51
Performance Contracts p. 51
Final Thoughts p. 54
Chapter 5 Measuring Reaction and Planned Action p. 55
Why Measure Reaction and Planned Action? p. 55
Customer Satisfaction p. 55
Immediate Adjustments p. 56
Team Evaluation p. 56
Predictive Capability p. 56
Importance of Other Levels of Evaluation p. 58
Areas of Feedback p. 58
Data Collection Issues p. 63
Timing p. 63
Methods p. 64
Administrative Guidelines p. 65
Uses of Reaction Data p. 67
Final Thoughts p. 69
Chapter 6 Measuring Learning and Confidence p. 71
Why Measure Learning and Confidence? p. 71
The Learning Organization p. 71
Compliance Issues p. 72
Development of Competencies p. 73
Certification p. 73
Consequences of an Unprepared Workforce p. 73
The Role of Learning in Programs p. 74
Measurement Issues p. 75
Challenges p. 75
Program Objectives p. 75
Typical Measures p. 76
Timing p. 77
Data Collection Methods p. 79
Administrative Issues p. 81
Validity and Reliability p. 81
Consistency p. 82
Pilot Testing p. 83
Scoring and Reporting p. 83
Confronting Failure p. 84
Uses of Learning Data p. 84
Final Thoughts p. 85
Chapter 7 Measuring Application and Implementation p. 87
Why Measure Application and Implementation? p. 87
Obtain Essential Information p. 88
Track Program Focus p. 88
Discover Problems and Opportunities p. 89
Reward Effectiveness p. 90
Challenges p. 90
Linking Application with Learning p. 90
Building Data Collection into the Program p. 90
Ensuring a Sufficient Amount of Data p. 91
Addressing Application Needs at the Outset p. 91
Measurement Issues p. 92
Methods p. 92
Objectives p. 92
Areas of Coverage p. 93
Data Sources p. 93
Timing p. 95
Responsibilities p. 96
Data Collection Methods p. 96
Questionnaires p. 96
Progress with Objectives p. 97
Use of Program Materials and Handouts p. 97
Application of Knowledge and Skills p. 97
Changes in Work Activities p. 104
Improvements or Accomplishments p. 105
Definition of the Measure p. 105
Amount of Change p. 105
Unit Value p. 105
Basis for Value p. 106
Total Annual Impact p. 106
Other Factors p. 106
Improvements Linked with the Program p. 107
Confidence Level p. 107
Perception of Investment in the Program p. 107
Link with Output Measures p. 107
Other Benefits p. 108
Barriers p. 108
Enablers p. 108
Management Support p. 108
Other Solutions p. 109
Target Audience Recommendations p. 109
Suggestions for Improvement p. 109
Interviews, Focus Groups, and Observation p. 110
Action Plans p. 110
Barriers to Application p. 111
Uses of Application Data p. 112
Final Thoughts p. 112
Chapter 8 Measuring Impact and Consequences p. 115
Why Measure Business Impact? p. 115
Impact Data Provide Higher-Level Information on Performance p. 115
Impact Data Represent the Business Driver of a Program p. 116
Impact Data Provide Value for Sponsors p. 117
Impact Data Are Easy to Measure p. 117
Effective Impact Measures p. 117
Hard Data Measures p. 118
Soft Data Measures p. 120
Tangible Versus Intangible Measures p. 121
Impact Objectives p. 122
Linking Specific Measures to Programs p. 123
Sources of Impact Data p. 126
Data Collection Methods p. 127
Monitoring Business Performance Data p. 127
Identify Appropriate Measures p. 128
Convert Current Measures to Usable Ones p. 128
Develop New Measures p. 129
Action Plans p. 129
Set Goals and Targets p. 130
Define the Unit of Measure p. 130
Place a Monetary Value on Each Improvement p. 131
Implement the Action Plan p. 131
Document Specific Improvements p. 132
Isolate the Effects of the Program p. 132
Provide a Confidence Level for Estimates p. 133
Collect Action Plans at Specified Time Intervals p. 133
Summarize the Data and Calculate the ROI p. 133
Performance Contracts p. 134
Questionnaires p. 134
Final Thoughts p. 138
Chapter 9 Selecting the Proper Data Collection Method p. 139
Matching Exercise p. 139
Selecting the Appropriate Method for Each Level p. 143
Type of Data p. 143
Investment of Participants' Time p. 143
Investment of Managers' Time p. 144
Cost p. 144
Disruption of Normal Work Activities p. 144
Accuracy p. 145
Built-In Design Possibility p. 145
Utility of an Additional Method p. 146
Cultural Bias of Data Collection Method p. 146
Final Thoughts p. 146
Index p. 147
About the Authors p. 153