MUSiC video analysis and context tools for usability measurement
Macleod, Miles; Bevan, Nigel
1993-05-01
MUSiC Video Analysis and Context Tools for Usability Measurement

Miles Macleod and Nigel Bevan
National Physical Laboratory, Division of Information Technology and Computing, Teddington, Middlesex, TW11 0LW, UK
miles@hci.npl.co.uk  tel: +44 81 943 6097

INTERCHI '93, 24-29 April 1993

KEYWORDS: Usability evaluation, metrics, usability engineering, observation, video analysis.

INTRODUCTION
Analysis of interaction between users and a system, based on video-assisted observation, can provide a highly informative and effective means of evaluating usability. To obtain valid and reliable results, the people observed should be representative users performing representative work tasks in appropriate circumstances, and the analysis should be methodical. The MUSiC Performance Measurement Method (PMM), developed at NPL as part of the ESPRIT Project MUSiC (Metrics for Usability Standards in Computing), provides a validated method for making and analysing such video recordings to derive performance-based usability metrics. PMM is supported by the DRUM software tool, which greatly speeds up analysis of video and helps manage evaluations.

USABILITY AND CONTEXT
Usability can be defined in terms of the efficiency and satisfaction with which specified users can achieve specified work goals in given environments. The MUSiC Context Guidelines Handbook provides a structured method for identifying and describing key characteristics of the context of use (the users, tasks and environments for which a system is designed) and key characteristics of the context of evaluation. It documents how accurately the context of evaluation matches the intended context of use. The PMM gives measures of effectiveness and efficiency of system use, by evaluating task goal achievement and times. It also gives measures of unproductive time (e.g. problems and seeking help), plus diagnostic data about the location of difficulties, helping identify where specific improvements need to be made.
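The performance measures just described (goal achievement, task times, unproductive time) can be sketched as simple calculations. The abstract does not give the PMM formulas, so the definitions below (effectiveness as quantity of the goal achieved weighted by quality, efficiency as effectiveness per unit time) are illustrative assumptions, not the published MUSiC method.

```python
# Illustrative sketch of performance-based usability measures in the
# spirit of PMM. Formulas and names here are assumptions for
# illustration, not the published MUSiC definitions.

def effectiveness(quantity_pct: float, quality_pct: float) -> float:
    """Proportion of the task goal achieved, weighted by output quality
    (both expressed as percentages)."""
    return (quantity_pct * quality_pct) / 100.0

def efficiency(effectiveness_pct: float, task_time_min: float) -> float:
    """Effectiveness achieved per minute of task time."""
    return effectiveness_pct / task_time_min

def productive_period(task_time_min: float, unproductive_min: float) -> float:
    """Percentage of the session not spent on problems or seeking help."""
    return 100.0 * (task_time_min - unproductive_min) / task_time_min

# Example session: 90% of the goal achieved at 80% quality in 12 minutes,
# of which 3 minutes were spent on problems and help-seeking.
eff = effectiveness(90, 80)          # 72.0
print(efficiency(eff, 12))           # 6.0 (% effectiveness per minute)
print(productive_period(12, 3))      # 75.0
```

Measuring unproductive time separately from overall task time is what makes such figures diagnostic: two systems with equal task times can differ sharply in how much of that time users spent in difficulty.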
Efficiency and user satisfaction are not necessarily correlated (a system can be satisfying but not very efficient to use, or vice versa), so there is great advantage in measuring both. The PMM is concerned with one of the core components of usability, efficiency; another MUSiC tool (SUMI) measures user satisfaction.

SOFTWARE SUPPORT: DRUM
Video analysis has previously been very time-consuming. It can now be performed considerably faster using the Diagnostic Recorder for Usability Measurement (DRUM), which provides support for the management and analysis of usability evaluations, including the derivation of usability metrics. DRUM assists in many aspects of the evaluator's work:
- management of data through all stages of an evaluation
- task analysis to assist identification and analysis of specific events and usability problems
- video control, and creation of an interaction log of each evaluation session
- automated find and video replay of any logged event
- analysis of logged data and calculation of metrics
Iteratively developed since 1990 in collaboration with industry to meet the actual needs of usability testing, DRUM has a graphical user interface, online context-sensitive help and a comprehensive user manual. It runs on Apple Macintosh, and drives a variety of video machines. DRUM allows evaluators to define at a suitable level of analysis the events they wish to log (hierarchically organised if desired). This overcomes difficulties of data analysis which can be encountered with capture of data at the low level of keystrokes and mouse events (Theaker et al., 1989; Hammontree et al., 1992). DRUM provides full control of the video during logging of a tape. Once any event has been logged, it can be automatically located on the video and reviewed. There is easy access to previously created logs and other evaluation data files from its database.
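The idea of evaluator-defined, hierarchically organised events logged against video timecodes can be sketched as a small data structure. The type and field names below are hypothetical, intended only to illustrate the approach, not DRUM's actual data model.

```python
# Minimal sketch of a DRUM-style interaction log: evaluator-defined
# event types, hierarchically organised, each logged against a video
# timecode so the event can later be relocated on tape and reviewed.
# Names and structure are hypothetical, not DRUM's data model.
from dataclasses import dataclass, field

@dataclass
class Event:
    timecode: float                  # seconds from start of the session tape
    duration: float                  # seconds the event lasted
    kind: str                        # evaluator-defined, e.g. "task", "problem", "help"
    note: str = ""
    children: list = field(default_factory=list)   # sub-events, arbitrarily nested

def unproductive_time(events) -> float:
    """Sum the durations of events classed as unproductive, recursively."""
    total = 0.0
    for e in events:
        if e.kind in ("problem", "help"):
            total += e.duration
        total += unproductive_time(e.children)
    return total

# A 12-minute editing task containing two unproductive sub-events.
log = [
    Event(0.0, 720.0, "task", "edit document", children=[
        Event(95.0, 60.0, "problem", "could not find paste command"),
        Event(300.0, 120.0, "help", "consulted manual"),
    ]),
]
print(unproductive_time(log))   # 180.0
```

Logging at this evaluator-chosen level of abstraction, rather than at the level of raw keystrokes and mouse events, is what keeps the subsequent analysis tractable, as the paper notes.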
DRUM supports diagnostic evaluation, including the identification of evaluator-defined critical incidents, and can be used explicitly for this purpose.

ACKNOWLEDGEMENTS
This work was supported jointly by the Commission of the European Communities and the Department of Trade and Industry, UK.

REFERENCES
1. Theaker, C.J., et al. HIMS: A Tool for HCI Evaluations, in Proc. HCI 89 Conf. (Nottingham, UK, 5-8 Sept 1989), Cambridge University Press, pp 427-439.
2. Hammontree, M.L., et al. Integrated Data Capture and Analysis Tools for Research and Testing on Graphical User Interfaces, in Proc. CHI 92, ACM Press, pp 431-432.

ACM 0-89791-575-5/93/0004/0055