Available:
Item Barcode | Call Number | Material Type | Item Category |
---|---|---|---|
30000010336710 | QA76.9.U83 C664 2014 | Open Access Book | Book |
33000000008718 | QA76.9.U83 C664 2014 | Open Access Book | Book |
Summary
This book provides the tools needed to evaluate the interaction between a disabled user and the computer system designed to assist that person. It sets out an evaluation process for assessing a user's satisfaction with a developed system. Presenting a new theoretical perspective on human-computer interaction evaluation for disabled persons, it takes into account all of the individuals involved in the evaluation process.
Author Notes
Simone Borsci holds a PhD in cognitive psychology from the Sapienza University of Rome and currently works as a researcher at Brunel University London. His research focuses on several aspects of interaction: the user experience evaluation of interfaces and artifacts, the analysis of user preferences before and after use, the application of estimation models to determine an optimized sample size for an evaluation test, and the matching of assistive technologies and medical devices to users' needs. He is the author of or a contributor to more than 30 publications.
Masaaki Kurosu is a professor at the Open University of Japan and president of the Human-Centered Design Network in Japan. Drawing on his experience as a usability professional in industry and academia, he proposed the concept of user engineering and the idea of artifact development analysis, as well as the new concept of experience engineering. Professor Kurosu received his MA in psychology from Waseda University. He has served as conference chair of many international conferences and is the author of or a contributor to more than 40 books.
Stefano Federici, PhD, is a professor of general psychology and psychology of disability at the University of Perugia, Italy. He is a member of the editorial boards of Disability and Rehabilitation: Assistive Technology and Cognitive Processing, as well as of the Scientific Committee of the International Conference on Spatial Cognition. He has authored more than 150 international and national publications on cognitive psychology, psychotechnology, disability, sexuality and disability, and usability. He currently leads the CognitiveLab research team at the University of Perugia.
Maria Laura Mele is a psychologist. She received her PhD in cognitive, physiological, and personality psychology from the Interuniversity Center for Research on Cognitive Processing in Natural and Artificial Systems (ECoNA) at the Sapienza University of Rome. Her research focuses on the usability and user experience of visual and sonified human-computer interfaces, addressing both the implicit and the explicit cognitive components involved in human interaction processes. She is currently a member of the CognitiveLab research team at the University of Perugia.
Table of Contents
Foreword | p. xi |
Foreword | p. xiii |
Preface | p. xv |
Acknowledgments | p. xxiii |
Authors | p. xxv |
Contributors | p. xxvii |
Chapter 1 Brief History of Human-Computer Interaction | p. 1 |
1.1 Historical Progress of Evaluation Models in Human-Computer Interaction Science | p. 1 |
1.1.1 First Period, from 1950 to 1963: The Programmer Is the User | p. 3 |
1.1.2 Second Period, from 1963 to 1984: Evolution of Human-Computer Interaction Models | p. 5 |
1.1.3 Third Period, from 1984 to 1998: Personal Computer and the Internet Era | p. 9 |
1.1.4 Fourth Period, from 1998 until Now: From Interaction Standards to User Interface for All | p. 12 |
1.2 Political Movement and the Standards: Accessibility as the First Pillar | p. 12 |
1.3 Usability and Design Philosophy: The Second and the Third Pillars | p. 15 |
1.3.1 From a Fragmented Set of Usability Evaluation Methods to the Need for a Unified Evaluation Approach | p. 23 |
1.3.2 Design Philosophy | p. 28 |
1.4 Merging Design and Evaluation of Interaction: An Integrated Model of Interaction Evaluation | p. 33 |
Focus Sections of Chapter 1
Box 1.1 A Brief Introduction to the Visualization of Networked Data Sets | p. 6 |
Box 1.2 From WCAG 1.0 to WCAG 2.0 | p. 16 |
Box 1.3 GOMS Evaluation Technique | p. 25 |
Box 1.4 ACCESS and AVANTI Project: International Initiatives toward User Interface for All | p. 32 |
Chapter 2 Defining Usability, Accessibility, and User Experience | p. 37 |
2.1 Introduction: Accessibility, Usability, and User Experience in Human-Computer Interaction | p. 37 |
2.2 Concept of Accessibility | p. 39 |
2.3 Usability: From the Small to the Big Perspective | p. 41 |
2.3.1 Usability: Toward a Unified Standard | p. 43 |
2.4 Relationships and Differences between Accessibility and Usability | p. 46 |
2.5 User Experience | p. 49 |
2.5.1 Steps of UX: From the Expectations of the Users before Product Purchase to the Final Impression of the Product | p. 52 |
2.6 Conclusion | p. 54 |
Chapter 3 Why We Should Be Talking about Psychotechnologies for Socialization, Not Just Websites | p. 57 |
3.1 Introduction: The Psychotechnological Evolution | p. 57 |
3.2 What Is Psychotechnology? | p. 58 |
3.3 From Artifacts to Psychotechnologies | p. 66 |
3.4 Psychotechnologies for Socialization | p. 69 |
3.4.1 Studies on Personality Characteristics Associated with Social Networking Sites | p. 77 |
3.4.2 Studies on Social Networking Sites and Identity Construction | p. 78 |
3.5 Web 2.0: From a Network System to an Ecosystem | p. 81 |
3.6 Conclusion | p. 87 |
Focus Sections of Chapter 3
Box 3.1 The Biopsychosocial Model and Reciprocal Triadic Causation | p. 60 |
Box 3.2 Positive Technology | p. 71 |
Box 3.3 Mind, Body, and Sex in Cyberspace | p. 73 |
Box 3.4 Facebook Contribution to the 2011 Tunisian Revolution: What Can Cyberpsychology Teach Us about the Arab Spring Uprisings? | p. 85 |
Chapter 4 Equalizing the Relationship between Design and Evaluation | p. 89 |
4.1 Active Role of Today's End-User in the Pervasive Interaction with Psychotechnologies | p. 89 |
4.2 Equalizing the Design and the Evaluation Processes | p. 93 |
4.2.1 Intrasystemic Solution: A New Perspective on the Relation between Design and Evaluation | p. 94 |
4.3 Intrasystemic Solution from a Psychotechnological Perspective | p. 98 |
4.4 Conclusion | p. 102 |
Focus Section of Chapter 4
Box 4.1 Smart Future Initiative: The Disappearing Computer and Ubiquitous Computing | p. 90 |
Chapter 5 Why We Need an Integrated Model of Interaction Evaluation | p. 105 |
5.1 Evaluator's Perspective in the Product Life Cycle | p. 105 |
5.2 Objectivity and Subjectivity in Interaction: When the System Overrides the User | p. 107 |
5.2.1 Bridge between Object and Subject: The Integrated Model of Evaluation | p. 109 |
5.3 Problems and Errors in the Evaluation | p. 114 |
5.3.1 Problems and Errors: From the Integrated Model to the Integrated Methodology | p. 117 |
5.4 Discrimination and Matching of Problems and Errors: The Integrated Methodology of Interaction Evaluation | p. 119 |
5.4.1 From the Concept of Mental Model to the Integrated Methodology of Interaction Evaluation | p. 120 |
5.4.2 Goals of the Integrated Methodology of Interaction Evaluation | p. 124 |
5.4.2.1 Identification of the Interaction Problems | p. 127 |
5.4.2.2 Distance between the User and the Designer | p. 127 |
5.4.2.3 How to Measure the Distance: The Evaluator's Role and Evaluation Model | p. 132 |
5.5 How to Use the Integrated Methodology: The Decision Process Carried Out by the Evaluator | p. 136 |
5.6 Conclusion | p. 140 |
Chapter 6 Why Understanding Disabled Users' Experience Matters | p. 143 |
6.1 Disabled Users' Experience | p. 143 |
6.1.1 Big Accessibility Approach | p. 145 |
6.2 Modeling Users' Interaction Behavior: The Simulation Process | p. 151 |
6.3 Decision Process for User Testing: Sample Selection and Representativeness of Data | p. 154 |
6.3.1 Three Keys for Monitoring Participants' Selection Process | p. 157 |
6.3.2 Representativeness of the Sample | p. 160 |
6.4 Simulation and Selection of Disabled Users for Composing Mixed Samples | p. 161 |
6.5 Testing Disabled Users | p. 163 |
6.6 Conclusion | p. 164 |
Focus Section of Chapter 6
Box 6.1 How Many People with a Disability Are There in the World? | p. 146 |
Chapter 7 How You Can Set Up and Perform an Interaction Evaluation: Rules and Methods | p. 167 |
7.1 What Is the Evaluation Process? | p. 167 |
7.1.1 Significance of Evaluation: From Commonsense to Evaluation Criteria | p. 168 |
7.1.2 Evaluation in Terms of Measurements and Criteria | p. 170 |
7.1.3 Process of Goal Achievement and Its Assessment | p. 172 |
7.2 UX and Usability: The Importance of the User's Long- and Short-Term Use of a Product | p. 173 |
7.2.1 Dynamic Process of the User Experience | p. 174 |
7.3 Brief Overview of the Techniques for Assessing UX and Usability | p. 176 |
7.4 Effectiveness and Efficiency of the Evaluation Process and the Management of the Gathered Data | p. 179 |
7.4.1 Management of the Qualitative Data: An Overview of the Grounded-Theory Approach | p. 181 |
7.5 Grounded Procedure for the Management of Data and to Determine the Number of Problems Discovered by a Sample | p. 183 |
7.5.1 What Does It Mean to Monitor Problems? | p. 184 |
7.5.2 Refining the p-Value of Heterogeneous Samples through Estimation Models | p. 187 |
7.5.3 Making a Decision on the Basis of the Sample Behavior | p. 190 |
7.6 Conclusion | p. 190 |
Chapter 8 Evaluation Techniques, Applications, and Tools | p. 193 |
8.1 Introduction | p. 193 |
8.2 Inspection and Simulation Methods of the Expected Interaction | p. 196 |
8.2.1 Inspection of the Interaction | p. 197 |
8.2.2 Heuristic Evaluation | p. 198 |
8.2.3 Cognitive Walkthrough Method | p. 200 |
8.2.4 Task Analysis | p. 203 |
8.2.5 Summary of Inspection and Simulation Methods of the Expected Interaction | p. 204 |
8.3 Qualitative and Subjective Measurements for Interaction Analysis | p. 205 |
8.3.1 Questionnaire and Psychometric Tools | p. 206 |
8.3.2 Interview | p. 211 |
8.3.3 Observation | p. 213 |
8.3.4 Diary | p. 214 |
8.3.5 Eye-Tracking Methodology and Biofeedback | p. 214 |
8.3.5.1 Biofeedback Usability and UX Testing | p. 215 |
8.3.5.2 Eye-Tracking Usability and UX Testing | p. 216 |
8.3.6 Summary of the Qualitative and Subjective Measurements for Interaction Analysis | p. 218 |
8.4 Usability Testing and Analysis of Real Interaction | p. 219 |
8.4.1 Usability Testing | p. 219 |
8.4.2 Concurrent Thinking Aloud in Usability Testing | p. 221 |
8.4.3 Retrospective Thinking Aloud in Usability Testing | p. 224 |
8.4.4 Alternative Verbal Protocols for Disabled Users and Partial Concurrent Thinking Aloud | p. 225 |
8.4.5 Remote Testing | p. 227 |
8.4.6 Summary of Usability Testing and the Analysis of Real User Interaction | p. 229 |
8.5 Conclusion | p. 230 |
References | p. 233 |
Index | p. 261 |