Usability Inspection: Novice Crowd Inspectors versus Expert
Abstract
Objective: This study investigates the use of novice crowd inspectors for usability inspection with respect to time spent and cost incurred. It compares the results of novice crowd usability inspection guided by a single expert's heuristic usability inspection (henceforth, novice crowd usability inspection) with expert heuristic usability inspection. Background: Traditional usability evaluation methods are time-consuming and expensive. Crowdsourcing has emerged as a cost-effective and quick means of software usability evaluation. Method: To this end, we designed an experiment to evaluate the usability of two websites and a web dashboard. Results: The results of the experiment show that novice crowd usability inspection guided by a single expert's heuristic usability inspection (a) finds the same usability issues, in both content and quantity, as expert heuristic usability inspection, and (b) is more cost-effective than expert heuristic usability inspection while requiring less time. Conclusion: Based on these findings, we conclude that novice crowd usability inspection guided by a single expert's heuristic usability inspection and expert heuristic usability inspection, on average, give the same results in terms of issues identified.
- Publication:
- arXiv e-prints
- Pub Date:
- October 2021
- DOI:
- 10.48550/arXiv.2110.14228
- arXiv:
- arXiv:2110.14228
- Bibcode:
- 2021arXiv211014228N
- Keywords:
- Computer Science - Software Engineering; Computer Science - Human-Computer Interaction
- E-Print:
- doi:10.1016/j.jss.2021.111122