MUSiC video analysis and context tools for usability measurement

Miles Macleod and Nigel Bevan
National Physical Laboratory, Division of Information Technology and Computing
Teddington, Middlesex, TW11 0LW, UK
miles@hci.npl.co.uk
tel: +44 81 943 6097

INTERCHI '93, 24-29 April 1993

KEYWORDS: Usability evaluation, usability metrics, usability engineering, observation, video analysis.

INTRODUCTION
Analysis of interaction between users and a system, based on video-assisted observation, can provide a highly informative and effective means of evaluating usability. To obtain valid and reliable results, the people observed should be representative users performing representative work tasks in appropriate circumstances, and the analysis should be methodical. The MUSiC Performance Measurement Method (PMM), developed at NPL as part of the ESPRIT Project MUSiC (Metrics for Usability Standards in Computing), provides a validated method for making and analysing such video recordings to derive performance-based usability metrics. PMM is supported by the DRUM software tool, which greatly speeds up analysis of video and helps manage evaluations.

USABILITY AND CONTEXT
Usability can be defined in terms of the efficiency and satisfaction with which specified users can achieve specified work goals in given environments. The MUSiC Context Guidelines Handbook provides a structured method for identifying and describing key characteristics of the 'context of use' (the users, tasks and environments for which a system is designed) and key characteristics of the context of evaluation. It documents how accurately the context of evaluation matches the intended context of use. The PMM gives measures of effectiveness and efficiency of system use, by evaluating task goal achievement and times. It also gives measures of unproductive time (e.g.
problems and seeking help), plus diagnostic data about the location of difficulties, helping identify where specific improvements need to be made. Efficiency and user satisfaction are not necessarily correlated (a system can be satisfying but not very efficient to use, or vice versa), so there is great advantage in measuring both. The PMM is concerned with one of the core components of usability, efficiency; another MUSiC tool (SUMI) measures user satisfaction.

SOFTWARE SUPPORT: DRUM
Video analysis has previously been very time-consuming. It can now be performed considerably faster using the Diagnostic Recorder for Usability Measurement (DRUM), which provides support for the management and analysis of usability evaluations, including the derivation of usability metrics. DRUM assists in many aspects of the evaluator's work:
• management of data through all stages of an evaluation
• task analysis to assist identification and analysis of specific events and usability problems
• video control, and creation of an interaction log of each evaluation session
• automated find and video replay of any logged event
• analysis of logged data and calculation of metrics
Iteratively developed since 1990 in collaboration with industry to meet the actual needs of usability testing, DRUM has a graphical user interface, online context-sensitive help and a comprehensive user manual. It runs on Apple Macintosh, and drives a variety of video machines. DRUM allows evaluators to define at a suitable level of analysis the events they wish to log (hierarchically organised if desired). This overcomes difficulties of data analysis which can be encountered with capture of data at the low level of keystrokes and mouse events (Theaker et al., 1989; Hammontree et al., 1992). DRUM provides full control of the video during logging of a tape. Once any event has been logged, it can be automatically located on the video, and reviewed.
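The logging scheme just described lends itself to a simple data structure: timestamped events, optionally nested, from which measures such as unproductive time and the productive fraction of a task can be derived. The following Python sketch is illustrative only; the `Event` class, the event kinds, and the formulas are assumptions made for exposition, not DRUM's actual design.

```python
# Illustrative sketch (not DRUM's implementation) of a hierarchical
# interaction log and the timing arithmetic behind PMM-style measures.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class Event:
    start: float                 # seconds from session start (video position)
    end: float
    kind: str                    # e.g. "task", "problem", "help" (assumed kinds)
    label: str = ""
    children: List["Event"] = field(default_factory=list)  # hierarchical log

    def duration(self) -> float:
        return self.end - self.start


def unproductive_time(events: List[Event],
                      kinds: Tuple[str, ...] = ("problem", "help")) -> float:
    """Total time spent in evaluator-defined unproductive event kinds."""
    total = 0.0
    for e in events:
        if e.kind in kinds:
            total += e.duration()
        total += unproductive_time(e.children, kinds)  # descend the hierarchy
    return total


def productive_period(task: Event) -> float:
    """Fraction of the task time that was productive (0..1)."""
    return (task.duration() - unproductive_time(task.children)) / task.duration()
```

For example, a 100-second task containing a 10-second problem episode and 15 seconds spent consulting a manual yields 25 seconds of unproductive time and a productive period of 0.75. In DRUM itself each logged event is tied to a video position, so an analysis like this sits alongside automated find-and-replay of the events it counts.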
There is easy access to previously created logs and other evaluation data files from its database. DRUM supports diagnostic evaluation, including the identification of evaluator-defined critical incidents, and can be used explicitly for this purpose.

ACKNOWLEDGEMENTS
This work was supported jointly by the Commission of the European Communities and the Department of Trade and Industry, UK.

REFERENCES
1. Theaker, C.J., et al. HIMS: A Tool for HCI Evaluations, in Proc. HCI '89 Conf. (Nottingham, UK, 5-8 Sept 1989), Cambridge University Press, pp 427-439.
2. Hammontree, M.L., et al. Integrated Data Capture and Analysis Tools for Research and Testing on Graphical User Interfaces, in Proc. CHI '92, ACM Press, pp 431-432.


Association for Computing Machinery — May 1, 1993

Datasource: Association for Computing Machinery
Copyright: © 1993 by ACM Inc.
ISBN: 0-89791-575-5
DOI: 10.1145/169059.169072


