Open Access. Powered by Scholars. Published by Universities.®

Computer Engineering Commons


University of Texas at El Paso

Usability


Articles 1 - 6 of 6

Full-Text Articles in Computer Engineering

Usability Inspection Methods After 15 Years Of Research And Practice, David G. Novick, Tasha Hollingsed Oct 2007


Departmental Papers (CS)

Usability inspection methods, such as heuristic evaluation, the cognitive walkthrough, formal usability inspections, and the pluralistic usability walkthrough, were introduced fifteen years ago. Since then, these methods, analyses of their comparative effectiveness, and their use have evolved in different ways. In this paper, we track the fortunes of the methods and analyses, looking at which led to use and to further research, and which led to relative methodological dead ends. Heuristic evaluation and the cognitive walkthrough appear to be the most actively used and researched techniques. The pluralistic walkthrough remains a recognized technique, although not the subject of significant further ...


Toward A More Accurate View Of When And How People Seek Help With Computer Applications, David G. Novick, Edith Elizalde, Nathaniel Bean Oct 2007


Departmental Papers (CS)

Based on 40 interviews and 11 on-site workplace observations of people using computer applications at work, we confirm that use of printed and on-line help is very low, and we find that providing greater detail in categories of solution methods can present a more realistic picture of users’ behaviors. Observed study participants encountered a usability problem on average about once every 75 minutes and typically spent about a minute looking for a solution. Participants consumed much more time when they were unaware of a direct way of doing something and instead used less effective methods. Comparison of results from different data-collection methods ...


Why Don't People Read The Manual?, David G. Novick, Karen Ward Oct 2006


Departmental Papers (CS)

Few users of computer applications seek help from the documentation. This paper reports the results of an empirical study of why this is so and examines how, in real work, users solve their usability problems. Based on in-depth interviews with 25 subjects representing a varied cross-section of users, we find that users do avoid using both paper and online help systems. Few users have paper manuals for the most heavily used applications, but none complained about their lack. Online help is more likely to be consulted than paper manuals, but users are equally likely to report that they solve their ...


Usability Over Time, Valerie Mendoza, David G. Novick Sep 2005


Departmental Papers (CS)

Testing of usability could perhaps be more accurately described as testing of learnability. We know more about the problems of novice users than we know of the problems of experienced users. To understand how these problems differ, and to understand how usability problems change as users change from novice to experienced, we conducted a longitudinal study of usability among middle-school teachers creating Web sites. The study looked at both the use of documentation and the use of the underlying software, tracking the causes and extent of user frustration over eight weeks. We validated a categorization scheme for frustration episodes. We found ...


Root Causes Of Lost Time And User Stress In A Simple Dialog System, Nigel G. Ward, Anais G. Rivera, Karen Ward, David G. Novick Sep 2005


Departmental Papers (CS)

As a priority-setting exercise, we compared interactions between users and a simple spoken dialog system to interactions between users and a human operator. We observed usability events, places in which system behavior differed from human behavior, and for each we noted the impact, root causes, and prospects for improvement. We suggest some priority issues for research, involving not only such core areas as speech recognition and synthesis and language understanding and generation, but also less-studied topics such as adaptive or flexible timeouts, turn-taking and speaking rate.


Accounting For Domain Context In Evaluation, Meriem Chater, David G. Novick Jan 2001


Departmental Papers (CS)

Work is situated activity. Taking human factors into account in evaluation involves considering not only users but also their contexts of use. Consequently, the evaluation of systems — from video games to safety-critical interfaces — requires analysis of context to understand not only the effect of context on usability but also the impact of artifacts' usability on users' environments. In the case of safety-critical systems (SCS), errors (by users or designers) may threaten human lives.
To assess the degree to which interface evaluation methods currently account for context, we have used the research strategy taxonomy of McGrath as a framework for classifying existing ...