Integration of Usability Evaluation Studies via a Novel Meta-Analytic Approach: What are Significant Attributes for Effective Evaluation?
- Authors
- Hwang, Wonil; Salvendy, Gavriel
- Issue Date
- 2009
- Publisher
- TAYLOR & FRANCIS INC
- Citation
- INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, v.25, no.4, pp.282 - 306
- Journal Title
- INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION
- Volume
- 25
- Number
- 4
- Start Page
- 282
- End Page
- 306
- URI
- http://scholarworks.bwise.kr/ssu/handle/2018.sw.ssu/16737
- DOI
- 10.1080/10447310802629793
- ISSN
- 1044-7318
- Abstract
- Overall discovery rates, that is, the ratios of the number of unique usability problems detected by all experiment participants to the number of usability problems existing in the evaluated systems, were investigated to identify significant factors in usability evaluation through a meta-analytic approach using the n-corrected effect sizes newly defined in this study. Because many usability evaluation studies have been conducted in specific contexts and have produced mixed findings, usability practitioners need holistic, more generalized conclusions. Owing to the limited applicability of traditional meta-analysis to usability evaluation studies, a new meta-analytic approach was established and applied to 38 experiments that reported overall discovery rates of usability problems as a criterion measure. Through this meta-analytic approach with the n-corrected effect sizes, we successfully combined the 38 experiments and found evaluator expertise, report type, and the interaction between usability evaluation method and time constraint to be significant factors. We suggest that, to increase overall discovery rates of usability problems, (a) free-style written reports are better than structured written reports; (b) when heuristic evaluation or cognitive walkthrough is used, usability evaluation experiments should be conducted without a time constraint, whereas when think-aloud is used, time constraint is not an important experimental condition; (c) usability practitioners need not be concerned about the unit of evaluation, the fidelity of the evaluated systems, or task type; and (d) HCI experts are better than novice users or evaluators. These conclusions can guide usability practitioners in determining evaluation contexts, and the meta-analytic approach of this study offers an alternative to traditional meta-analysis for combining the empirical results of usability evaluation.
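The overall discovery rate in the abstract is a simple ratio, which the following minimal sketch illustrates. The function name, inputs, and example problem sets are illustrative assumptions, not taken from the paper itself:

```python
def overall_discovery_rate(problems_per_participant, total_known_problems):
    """Ratio of unique usability problems detected by all participants
    to the number of known problems in the evaluated system."""
    # Union across participants counts each problem only once.
    unique_problems = set().union(*problems_per_participant)
    return len(unique_problems) / total_known_problems

# Hypothetical example: two participants, five known problems.
rate = overall_discovery_rate([{"p1", "p2"}, {"p2", "p3"}], 5)
print(rate)  # 3 unique problems out of 5 -> 0.6
```

The deduplication step matters: summing per-participant counts would double-count problems found by more than one participant, whereas the set union matches the "unique usability problems" wording in the abstract.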