The use of privacy-protected computer vision to measure the quality of healthcare worker hand hygiene

Abstract

Objectives: (i) To demonstrate the feasibility of automated, direct observation and collection of hand hygiene data; (ii) to develop computer vision methods capable of reporting compliance with moment 1 (the performance of hand hygiene before touching a patient); and (iii) to report the diagnostic accuracy of automated, direct observation of moment 1.

Design: Observation of simulated hand hygiene encounters between a healthcare worker and a patient.

Setting: Computer laboratory in a university.

Participants: Healthy volunteers.

Main outcome measures: Sensitivity and specificity of automatic detection of the first moment of hand hygiene.

Methods: We captured video and depth images using a Kinect camera and developed computer vision methods to automatically detect, from the depth imagery, the use of alcohol-based hand rub (ABHR), the rubbing together of hands and the subsequent contact of the patient by the healthcare worker.

Results: We acquired images from 18 different simulated hand hygiene encounters in which the healthcare worker complied with the first moment of hand hygiene, and 8 encounters in which they did not. The diagnostic accuracy of determining that ABHR was dispensed and that the patient was touched was excellent (sensitivity 100%, specificity 100%). The diagnostic accuracy of determining that the hands were rubbed together after dispensing ABHR was good (sensitivity 83%, specificity 88%).

Conclusions: We have demonstrated that it is possible to automate the direct observation of hand hygiene performance in a simulated clinical setting. We used cheap, widely available consumer technology and depth imagery, which potentially increases clinical applicability and decreases privacy concerns.

Keywords: hand hygiene [MeSH]; image processing, computer-assisted [MeSH]; cross infection [MeSH]; quality assurance, healthcare [MeSH]

Introduction

Healthcare-associated infections (HAI) contribute to morbidity and mortality in healthcare facilities; 5–15% of patients admitted to hospital in developed countries will acquire an HAI [1, 2]. The problem is even greater in high-risk environments such as intensive care units (9–37% of admissions) [3]. HAIs affect almost 200 000 patients in Australian healthcare facilities and result in ~2 million extra hospital bed days annually [4]. Pathogens can be transmitted to susceptible patients by the hands of healthcare workers. Inadequate hand hygiene among healthcare workers was identified as an important cause of HAI by Ignaz Semmelweis in 1846 [1] and remains a problem today. Properly performed hand hygiene effectively reduces HAI [5]. Current World Health Organisation (WHO) and Hand Hygiene Australia guidelines describe the five moments at which hand hygiene must be performed [6, 7]. Unfortunately, compliance with hand hygiene is frequently low. Hand hygiene compliance across 860 Australian hospitals was estimated to be 82.8% in June 2015 [8]. Low compliance rates are widespread and vary between 5% and 81% globally [1].

Surveillance of hand hygiene and the collection of quality assurance data are difficult; an ideal method is not available. Direct observation of the five moments is currently the most common method for auditing hand hygiene compliance. The WHO hand hygiene technical reference manual recommends observing a minimum of 200 opportunities per observation period and per unit of observation (e.g. a single ward area) to reliably compare results before and after hand hygiene improvement interventions [6].
Direct observation has major limitations: it is expensive, laborious and prone to bias. It is subject to an observation bias (the Hawthorne effect), whereby healthcare workers change their behaviour whilst being audited, as well as to other observation and selection biases [1]. Periods of audit are extremely short compared with the breadth of usual clinical care, resulting in gross undersampling. Bias and undersampling are threats to the accuracy of hand hygiene data and to its validity as a performance indicator.

Computer vision is a branch of artificial intelligence that studies how to automatically understand the content of images and video in a human-like manner [9–11]. While computer vision is well established in the area of medical imaging (medical image computing) [12], it is used extremely rarely in clinical medicine, where patient (and healthcare worker) privacy is of utmost concern [13]. Concerns about the use of video surveillance in privacy-sensitive environments may be mitigated by the introduction of depth images. Unlike red, green, blue (RGB) video images, depth (or range) images record only the distance of objects from the camera; they do not permit identification of the viewed subjects or the distinction of features beyond outlines. Depth image cameras have become cheap and widely available. We therefore investigated whether computer vision and depth image cameras could be used to surveil hand hygiene in a way that was both clinically feasible and privacy protecting.

Study objectives

(i) To demonstrate the feasibility of automated, direct observation and collection of hand hygiene data.
(ii) To develop computer vision methods capable of reporting compliance with moment 1 (the performance of hand hygiene before touching a patient).
(iii) To report the diagnostic accuracy of automated, direct observation of moment 1.

Methods

Simulation of the clinical environment and the first moment of hand hygiene

We simulated a hospital bed-space in a laboratory at the University of Technology Sydney. Four volunteers performed the roles of patient and healthcare worker in turn. The camera was placed above the patient's head and pointed toward the foot of the bed. Alcohol-based hand rub (ABHR) was placed on a pedestal at the foot of the bed, near the centre of the camera's field of view. When the patient was supine, the top of their head was visible to the camera; their face was not. Healthcare workers approached the patient on the bed, with the interaction ending in usual physical examination contact with the patient. Clinically realistic approaches by healthcare workers to the bedside were simulated, including various combinations with/without dispensing of hand rub and with/without rubbing of the hands together.

Capture and processing of RGB and depth images

The distances from the camera to the bottle and from the camera to the bed were fixed and measured. We used a Kinect camera (Microsoft Corp.) to capture depth images along with RGB images. Depth images bring significant advantages to automated processing by enabling accurate volumetric scene reconstruction, object tracking and disambiguation of the occlusions that take place when other objects block the camera's view of the targeted objects [14]. Depth images are formed by projecting dots onto the scene in the near infra-red spectrum and triangulating their distance.
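For readers unfamiliar with structured-light sensing, the triangulation mentioned above follows standard stereo geometry. The relation below is a textbook identity rather than a formula stated in this paper; the symbols (focal length f, projector-camera baseline b, dot disparity d) are illustrative.

```latex
% Textbook structured-light relation (illustrative; not stated in the paper):
% a projected near infra-red dot observed with disparity d (pixels) against
% the reference pattern, for focal length f (pixels) and baseline b, lies at
% depth
\[
  z = \frac{f\,b}{d}
\]
% Each dot therefore contributes one depth sample, and the projected dot grid
% yields the full depth image.
```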
To capture the images, we used nuiCapture (v 1.4.0, Cadavid Concepts). The software records synchronous depth and RGB images and automatically extracts the skeleton and face of the tracked subjects from one or multiple Kinect cameras. It also visualises the data using a 3D media player. We exported the files in Matlab format, suitable for processing. To automatically detect the hand hygiene events, we used the depth images and small RGB patches centred on the hand rub.

Determination of compliance with moment 1 of hand hygiene

Compliance with moment 1 by a healthcare worker comprises the detection of two events, which are expected to take place in the correct order. Event 1 is the use of ABHR which, in our simulations, was placed at the foot of the bed. This event was subdivided into Event 1A (dispensing of the hand rub) and Event 1B (rubbing of the hands together vigorously for a minimum amount of time). Event 2 is the subsequent touching of the patient. Event 2, when not preceded by Event 1, was considered non-compliance with moment 1. The computer vision techniques we used to detect Events 1A, 1B and 2 are described below.

Computer vision techniques for detection of dispensing ABHR (Event 1A)

In each frame, we selected a window of pixels centred on the hand rub bottle. Dispensing of hand rub was inferred if a hand remained in contact with the bottle for a minimum duration (set to 10 frames). Detection consisted of: (i) skin segmentation (detection of the presence of skin-coloured pixels in the pixel window); (ii) counting of skin pixels in close proximity to the hand rub bottle; and (iii) declaring detection if the pixel count was above a given threshold and persisted for a minimum of 10 frames. A minimal sketch of this logic is given below.
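The paper does not publish its implementation, so the following Python sketch only illustrates the Event 1A rule as described: skin segmentation in a window around the bottle, a pixel-count threshold and a 10-frame persistence requirement. The HSV skin bounds, window coordinates and pixel threshold are assumptions for illustration, not values from the study.

```python
import numpy as np
import cv2

# Illustrative parameters -- not the study's actual values.
SKIN_LO = np.array([0, 40, 60], np.uint8)     # assumed HSV lower bound for skin
SKIN_HI = np.array([25, 160, 255], np.uint8)  # assumed HSV upper bound for skin
PIXEL_THRESHOLD = 150  # minimum skin pixels near the bottle to count as contact
MIN_FRAMES = 10        # persistence required to declare dispensing (from the paper)

def skin_pixel_count(rgb_patch):
    """Count skin-coloured pixels in an RGB patch centred on the bottle."""
    hsv = cv2.cvtColor(rgb_patch, cv2.COLOR_RGB2HSV)
    mask = cv2.inRange(hsv, SKIN_LO, SKIN_HI)
    return int(np.count_nonzero(mask))

def detect_dispensing(frames, window):
    """Event 1A: hand in contact with the bottle for >= MIN_FRAMES consecutive frames.

    frames: iterable of RGB images; window: (y0, y1, x0, x1) patch around the bottle.
    """
    y0, y1, x0, x1 = window
    run = 0
    for frame in frames:
        if skin_pixel_count(frame[y0:y1, x0:x1]) > PIXEL_THRESHOLD:
            run += 1
            if run >= MIN_FRAMES:
                return True
        else:
            run = 0
    return False
```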
Computer vision techniques for detection of hand rubbing (Event 1B)

This detection followed only if Event 1A was detected. It included: (i) detection and removal of the static background scene to highlight the subjects, achieved by running a temporal filter that returned the maximum depth recorded at each pixel location over a period of time (assuming that the background scene would be in view at some point in time); (ii) division of the area of interest (hands) into a grid of overlapping windows; and (iii) selection of pixels in each window if they were (a) within a given depth range, (b) segmented as skin and (c) changing depth value over time (i.e. moving objects). A 'hand hypothesis' was then formed if the number of selected pixels was above a threshold. When a hand hypothesis was detected, we used a machine learning classifier (a support vector machine) to detect the rubbing of hands [15, 16]. The classifier was trained with 600 manually annotated images, half depicting hand rubbing and half still hands. Hand rubbing was declared if it was detected continuously for at least 50 frames. A sketch of this pipeline follows.
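Again as an illustration rather than the authors' implementation, the sketch below combines the temporal-maximum background filter, the pixel-selection rules for a 'hand hypothesis' and the SVM stage. The depth range, motion threshold, minimum pixel count and SVM feature representation are assumptions; the paper specifies only the 600 training images and the 50-frame persistence rule.

```python
import numpy as np
from sklearn import svm

def background_depth(depth_frames):
    """Temporal max filter: the farthest depth seen at each pixel approximates
    the static background (assumed to be unoccluded at some point in time)."""
    return np.max(np.stack(depth_frames), axis=0)

def hand_hypothesis(depth, skin_mask, prev_depth, background,
                    depth_range=(500, 1500), motion_mm=15, min_pixels=200):
    """Select pixels that are foreground, in depth range, skin-coloured and
    moving; declare a hand hypothesis if enough such pixels exist.
    All numeric thresholds are illustrative assumptions."""
    foreground = depth < (background - 50)  # closer to camera than background
    in_range = (depth > depth_range[0]) & (depth < depth_range[1])
    moving = np.abs(depth.astype(int) - prev_depth.astype(int)) > motion_mm
    selected = foreground & in_range & skin_mask & moving
    return np.count_nonzero(selected) >= min_pixels, selected

# The paper does not describe the SVM's feature representation; flattened
# hand-region patches stand in here purely as a placeholder feature.
clf = svm.SVC(kernel="rbf")
# clf.fit(training_patches, labels)  # 600 annotated images: rubbing vs still

def detect_rubbing(patch_sequence, min_frames=50):
    """Event 1B: declare rubbing if the classifier fires for >= min_frames
    consecutive frames (the paper's 50-frame rule)."""
    run = 0
    for patch in patch_sequence:
        if clf.predict(patch.reshape(1, -1))[0] == 1:
            run += 1
            if run >= min_frames:
                return True
        else:
            run = 0
    return False
```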
Computer vision techniques for detection of touching the patient (Event 2)

This was similar to the method used for Event 1A (dispensing hand rub). The area of interest around the bed/patient was selected, and detection of skin pixels above a threshold was used as a proxy for the detection of bed/patient contact by the healthcare worker's hands, as sketched below.
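A minimal sketch of Event 2, reusing the Event 1A mechanism over the bed region instead of the bottle. The region coordinates, HSV bounds, pixel threshold and persistence value are illustrative assumptions; in particular, the paper does not state a persistence requirement for this event.

```python
import numpy as np
import cv2

SKIN_LO = np.array([0, 40, 60], np.uint8)     # assumed HSV skin bounds
SKIN_HI = np.array([25, 160, 255], np.uint8)
BED_WINDOW = (120, 400, 80, 560)  # (y0, y1, x0, x1) patch covering the bed

def detect_patient_contact(frames, window=BED_WINDOW, threshold=300, min_frames=5):
    """Event 2: skin pixels above threshold inside the bed region, used as a
    proxy for the healthcare worker's hands touching the patient."""
    y0, y1, x0, x1 = window
    run = 0
    for frame in frames:
        hsv = cv2.cvtColor(frame[y0:y1, x0:x1], cv2.COLOR_RGB2HSV)
        skin = np.count_nonzero(cv2.inRange(hsv, SKIN_LO, SKIN_HI))
        run = run + 1 if skin > threshold else 0
        if run >= min_frames:
            return True
    return False
```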
Outcome measures and diagnostic accuracy

Automatic detection of hand hygiene events requires machine learning (training) from a set of manually annotated data. The learned procedure can then be applied to another set for testing (validation). Cycles of training and testing should be repeated several times and the results averaged, in order to marginalise the impact of the data set as a random variable in the experiment [17]. For this reason, our experiments were carried out following an 'n-fold cross-validation' protocol. The data set was divided into three subsets, A, B and C; in each experiment we used two subsets, joined, for training and the third for testing. This process was repeated three times and the accuracy averaged.
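A generic sketch of the three-fold protocol just described; train_fn and eval_fn are hypothetical callables standing in for an event detector's training and evaluation routines, and the random seed is arbitrary.

```python
import numpy as np

def three_fold_cv(samples, labels, train_fn, eval_fn,
                  rng=np.random.default_rng(0)):
    """Three-fold cross-validation: split the data into subsets A, B and C,
    train on two joined subsets, test on the third, rotate through all three
    held-out subsets and average the accuracy.

    samples, labels: NumPy arrays indexed along their first axis.
    """
    order = rng.permutation(len(samples))
    folds = np.array_split(order, 3)  # subsets A, B, C
    accuracies = []
    for held_out in range(3):
        test_idx = folds[held_out]
        train_idx = np.concatenate([folds[i] for i in range(3) if i != held_out])
        model = train_fn(samples[train_idx], labels[train_idx])
        accuracies.append(eval_fn(model, samples[test_idx], labels[test_idx]))
    return float(np.mean(accuracies))
```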
The gold standard for compliance with moment 1 was direct observation of the RGB images by study personnel. We developed automated computer vision methods to detect the three events necessary to determine compliance with moment 1: (1A) dispensing of hand rub by the healthcare worker; (1B) rubbing together of hands by the healthcare worker; and (2) touching the patient. For each of these three events, we measured true positive (TP), false negative (FN), true negative (TN) and false positive (FP) detections. Compliance with moment 1 was defined as the complete performance of Events 1A, 1B and 2 in the correct order. Violation of moment 1 was defined as the performance of Event 2 without the preceding performance of Events 1A and 1B in the correct order.

Ethics and reporting

This project was exempt from the need for ethical review, according to guidelines for quality improvement in our institution [18, 19]. We followed the SQUIRE 2.0 reporting guidelines [20].

Results

A total of 26 videos (both depth and colour frames) were acquired for the experiments. An actor simulating a healthcare worker correctly complied with moment 1 in 18 videos (positive samples) and failed to do so in 8 videos (negative samples). Figure 1 shows typical RGB and depth images from our simulated experiments.

Figure 1. An example of the images that were used in this work. Left: video (RGB) image of the actual scene. Right: processed depth imagery of the same scene.

Application of computer vision to hand hygiene observation

The use of computer vision to detect the use of ABHR and the rubbing together of the hands is shown in Figure 2. The detection of subsequent touching of the patient is summarised in Figure 3.

Figure 2. The use of computer vision to detect the use of alcohol-based hand rub (Event 1). Top left: video image of the scene. Top centre: the depth frame. Top right: the skeleton extracted from the depth frame, clearly showing the detected position of the hands. Middle: Event 1A, detection of hand rub use. Bottom: Event 1B, sustained rubbing of hands.

Figure 3. The use of computer vision to detect contact between the healthcare worker and patient (Event 2).

Diagnostic accuracy of computer vision detection of hand hygiene moment 1

The videos acquired contained the following true events: 26 samples of Event 1A (18 positives and 8 negatives); 26 samples of Event 1B (18 positives and 8 negatives); and 52 samples of Event 2 (26 positives and 26 negatives, obtained by considering the parts where the clinician was close to the patient and did, or did not, touch them, respectively). The diagnostic accuracy of detecting the three separate events is reported in Table 1 as true positives (TP), true negatives (TN), false positives (FP) and false negatives (FN), with the corresponding sensitivity (TP/(TP + FN)) and specificity (TN/(TN + FP)). Overall, the sensitivity of our methods in correctly detecting compliance with moment 1 was 83% and the specificity was 88%.

Table 1 Diagnostic accuracy of computer vision detection of the three events comprising the first moment of hand hygiene

Event 1A: Dispensing hand rub
                  Dispensed hand rub    Did not dispense hand rub
  Detected        TP: 18/18             FP: 0/8                      Sensitivity = 100%
  Not detected    FN: 0/18              TN: 8/8                      Specificity = 100%

Event 1B: Rubbing of hands
                  Rubbed hands          Did not rub hands
  Detected        TP: 15/18 (83%)       FP: 1/8 (12%)                Sensitivity = 83%
  Not detected    FN: 3/18 (17%)        TN: 7/8 (88%)                Specificity = 88%

Event 2: Touching the patient
                  Touched patient       Did not touch patient
  Detected        TP: 26/26             FP: 0/26                     Sensitivity = 100%
  Not detected    FN: 0/26              TN: 26/26                    Specificity = 100%

TP, true positive; FN, false negative; FP, false positive; TN, true negative.
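As a quick arithmetic check of Table 1, the definitions of sensitivity and specificity given above can be applied to the Event 1B counts:

```python
def sensitivity(tp, fn):
    """Sensitivity = TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Specificity = TN / (TN + FP)."""
    return tn / (tn + fp)

# Event 1B counts from Table 1: TP=15, FN=3, TN=7, FP=1.
assert round(sensitivity(15, 3), 2) == 0.83  # 15/18
assert round(specificity(7, 1), 2) == 0.88   # 7/8
```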
Discussion

We have demonstrated the feasibility of auditing hand hygiene using depth imagery and computer vision. Our methods were excellent at detecting the dispensing of hand rub and the subsequent manual contact of the patient by the healthcare worker (100% detection). Detection occurred in real time and without the need for video (RGB) images. We used widely available, affordable consumer technology (a Microsoft Kinect camera). These methods have the potential to greatly reduce the human labour involved in the collection of hand hygiene compliance data.

Our findings are significant because HAI and inadequate hand hygiene are a very important public health problem, and the existing strategies for measuring and managing them are lacking. The bias, undersampling and cost problems of direct observation by human auditors could all potentially be improved by an objective, continuous and inexpensive electronic method such as the one we have described. There is a large Hawthorne effect of auditing on hand hygiene compliance [21]. This can decrease the validity of performance indicator data, but is good for actual hand hygiene practice during periods of audit. Auditing of hand hygiene may be an effective therapeutic intervention for HAI if it can be applied for long periods; we think automated electronic methods are the only way to achieve this.

Technological approaches to improving hand hygiene have been employed before [22]. Remote video auditing with feedback [23, 24] is effective but is unlikely to be feasible or affordable on a large scale. Electronic devices can improve training, but are not always effective at improving compliance [25, 26]. Other methods involving sensors on hand rub dispensers, healthcare workers or both are also relatively expensive and require special equipment [27–31]. Our methods do not require special equipment, do not require transmitters or sensors to be applied in the bed area, and are readily deployable anywhere (a single depth image camera is mounted above the head of the bed).

Despite the potential of this approach, our study had important limitations. The clinical setting was simulated and highly controlled: a single healthcare worker approached a supine patient and used ABHR positioned in an elevated location at the foot of the bed. Real clinical care is comparatively chaotic, and we have not evaluated these methods in that environment. Our methods were less accurate at detecting the rubbing together of hands by the healthcare worker (83% true positive rate). Skin segmentation relies on skin-coloured pixel detection and is reasonably accurate [32], but it is untested in clinical areas, where non-skin-coloured gloves are frequently worn. We believe the use of the skeletal data provided by the Kinect camera may potentially overcome this problem. We do not know how our methods would perform with multiple healthcare workers in the same area, or in detecting the other moments of hand hygiene.

We avoided the substantial ethical and privacy concerns that would arise if electronic surveillance measures were deployed in clinical areas by conducting this work in a laboratory simulation. These concerns would be insurmountable if our methods required the capture (and especially the storage) of video (RGB) images. By excluding the patient's face from the field of view and exclusively using non-identifying depth imagery, we believe our methods provide a substantial level of inherent privacy protection. Further development and deployment in clinical areas would need to be conducted with great care and sensitivity [13, 33].

In conclusion, the potential for clinical application is significant. No video imagery needs to be stored (or even captured). The equipment needed is widely available and can be deployed anywhere. It could be paired with real-time feedback to healthcare workers to encourage ABHR use prior to touching their patient. It could generate continuous auditing data for use by managers in real time, or provide aggregate reports, whilst avoiding identification or video surveillance of staff. The next logical step would be to evaluate these methods in a real clinical area using volunteers instead of patients. The technology should only be applied widely outside research settings if it is known to reduce HAI, raises no significant privacy concerns, and is affordable and robust.

Funding

No funding.
References

1 World Health Organisation (WHO). WHO guidelines on hand hygiene in health care: a summary. 2009. Available from: http://www.who.int/gpsc/5may/tools/who_guidelines-handhygiene_summary.pdf (1 Aug 2015, date last accessed).
2 Spelman DW. Hospital-acquired infections. Med J Aust 2002;176:286–91.
3 Vincent JL. Nosocomial infections in adult intensive-care units. Lancet 2003;361:2068–77.
4 Australian Commission on Safety and Quality in Health Care; Cruickshank M, Ferguson J, editors. Reducing Harm to Patients from Health Care Associated Infection: The Role of Surveillance. 2008. Available from: http://www.safetyandquality.gov.au/wp-content/uploads/2008/01/Reducing-Harm-to-Patient-Role-of-Surveillance1.pdf (1 Sep 2016, date last accessed).
5 Bernard S. Refractory out-of-hospital cardiac arrest treated with mechanical CPR, hypothermia, ECMO and early reperfusion. NCT01186614. Australian New Zealand Clinical Trials Registry, 2010. Available from: https://www.anzctr.org.au/Trial/Registration/TrialReview.aspx?id=2876&isClinicalTrial=True.
6 World Health Organisation. Hand hygiene technical reference manual: to be used by health-care workers, trainers and observers of hand hygiene practices. 2009. Available from: http://apps.who.int/iris/bitstream/10665/44196/1/9789241598606_eng.pdf (1 Aug 2015, date last accessed).
7 Hand Hygiene Australia. 5 moments for hand hygiene. Available from: http://www.hha.org.au/home/5-moments-for-hand-hygiene.aspx (1 Sep 2016, date last accessed).
8 Do A, Cretikos M, Muscatello D et al. Epidemiology of out-of-hospital cardiac arrests, NSW, 2012: time, place and person. Sydney: Centre for Epidemiology and Evidence, NSW Ministry of Health, 2013.
9 Haralick RM, Shapiro LG. Glossary of computer vision terms. Pattern Recognit 1991;24:69–93.
10 Fisher RB, Breckon TP, Dawson-Howe K et al. Dictionary of Computer Vision and Image Processing. Hoboken, NJ: John Wiley & Sons, Ltd, 2016:324–72.
11 Shah M. Fundamentals of computer vision. 1997. Available from: http://crcv.ucf.edu/gauss/BOOK.PDF (1 Sep 2015, date last accessed).
12 Handels H, Deserno TM, Meinzer HP et al. Image analysis and modeling in medical image computing: recent developments and advances. Methods Inf Med 2012;51:395–7.
13 Palmore TN, Henderson DK. Big Brother is washing... Video surveillance for hand hygiene adherence, through the lenses of efficacy and privacy. Clin Infect Dis 2012;54:8–9.
14 Jana A. Kinect for Windows SDK Programming Guide. Birmingham, GB: Packt Publishing, 2012.
15 Cortes C, Vapnik V. Support-vector networks. Mach Learn 1995;20:273–97.
16 Avalli L, Maggioni E, Formica F et al. Favourable survival of in-hospital compared to out-of-hospital refractory cardiac arrest patients treated with extracorporeal membrane oxygenation: an Italian tertiary care centre experience. Resuscitation 2012;83:579–83.
17 Dietterich TG. Approximate statistical tests for comparing supervised classification learning algorithms. Neural Comput 1998;10:1895–923.
18 NSW Health: Office for Health and Medical Research. Human Research Ethics Committees—Quality improvement & ethical review: a practice guide for NSW (2007, GL2007_020). Available from: http://www0.health.nsw.gov.au/policies/gl/2007/GL2007_020.html (29 Aug 2016, date last accessed).
19 BMJ Quality & Safety. Policy on ethics review for quality improvement reports. Updated April 2014. Available from: http://qualitysafety.bmj.com/site/misc/PolicyonEthicReviews.pdf (29 Aug 2016, date last accessed).
20 Ogrinc G, Davies L, Goodman D et al. SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process. BMJ Qual Saf 2016;25:986–92.
21 Hagel S, Reischke J, Kesselmeier M et al. Quantifying the Hawthorne effect in hand hygiene compliance through comparing direct observation with automated hand hygiene monitoring. Infect Control Hosp Epidemiol 2015;36:957–62.
22 Srigley JA, Gardam M, Fernie G et al. Hand hygiene monitoring technology: a systematic review of efficacy. J Hosp Infect 2015;89:51–60.
23 Armellino D, Hussain E, Schilling ME et al. Using high-technology to enforce low-technology safety measures: the use of third-party remote video auditing and real-time feedback in healthcare. Clin Infect Dis 2012;54:1–7.
24 Armellino D, Trivedi M, Law I et al. Replicating changes in hand hygiene in a surgical intensive care unit with remote video auditing and feedback. Am J Infect Control 2013;41:925–7.
25 Kwok YL, Callard M, McLaws ML. An automated hand hygiene training system improves hand hygiene technique but not compliance. Am J Infect Control 2015;43:821–5.
26 Higgins A, Hannan MM. Improved hand hygiene technique and compliance in healthcare workers using gaming technology. J Hosp Infect 2013;84:32–7.
27 Fisher DA, Seetoh T, May-Lin HO et al. Automated measures of hand hygiene compliance among healthcare workers using ultrasound: validation and a randomized controlled trial. Infect Control Hosp Epidemiol 2013;34:919–28.
28 Levchenko AI, Boscart VM, Fernie GR. The effect of automated monitoring and real-time prompting on nurses' hand hygiene performance. Comput Inform Nurs 2013;31:498–504.
29 Marra AR, D'Arco C, Bravim Bde A et al. Controlled trial measuring the effect of a feedback intervention on hand hygiene compliance in a step-down unit. Infect Control Hosp Epidemiol 2008;29:730–5.
30 Swoboda SM, Earsing K, Strauss K et al. Electronic monitoring and voice prompts improve hand hygiene and decrease nosocomial infections in an intermediate care unit. Crit Care Med 2004;32:358–63.
31 Sahud AG, Bhanot N, Radhakrishnan A et al. An electronic hand hygiene surveillance device: a pilot study exploring surrogate markers for hand hygiene compliance. Infect Control Hosp Epidemiol 2010;31:634–9.
32 Jones MJ, Rehg JM. Statistical color models with application to skin detection. Int J Comput Vis 2002;46:81–96.
33 Lahey T. A watchful eye in hospitals. New York Times, 16 Feb 2014.

© The Author(s) 2018. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved.
