TY - JOUR
AU - Datta, V
AU - Bann, S
AU - Aggarwal, R
AU - Mandalia, M
AU - Hance, J
AU - Darzi, A
AB - Abstract
Background: The technical skills of surgical trainees are difficult to assess and compare objectively. This study involved a structured, multistation, technical skills examination that enables the stratification of surgical trainees.
Methods: Twenty-two surgeons (five basic surgical trainees, eight junior specialist trainees, four senior specialist trainees and five consultants) participated in the study. All undertook a five-station technical skills examination consisting of three synthetic simulations (bowel anastomosis, vascular anastomosis, saphenofemoral dissection) and two virtual reality-based (flexible sigmoidoscopy and laparoscopy) assessment stations. Video-based analyses and in-built computer scoring were used to measure each surgeon's performance. The mean rank was determined for each variable, and the sum of the mean ranks produced a total score.
Results: There was a significant improvement in overall performance with increasing seniority (P < 0·001). Significant differences were observed between basic surgical trainees and junior specialist trainees (P = 0·019), and between junior and senior specialist trainees (P = 0·048), but not between senior trainees and consultants.
Conclusion: This examination successfully differentiated surgical skill, both between surgeons with different grades of experience and within the target study group of specialist trainees. The examination is feasible in terms of the timeframe needed to complete tasks, cost, and efficiency in performing video-based assessments.

Introduction
Recent interest in cases of medical mishap has led to increased public attention regarding assessment of competence before independent practice. Surgical competence involves several attributes, including knowledge, decision-making ability, communication and technical skills1,2. In the UK, assessment of surgical trainees has become more rigorous and structured in recent years3, with many of these attributes now appraised as part of a surgical training curriculum4. Trainees undergo annual assessment designed to evaluate progress through the training programme. This process involves the current consultant supervisor providing feedback on overall performance, including a subjective appraisal of operative ability. Other methods of technical skills evaluation have concentrated on analysis of the number of procedures performed5, although these evaluations are also subjective and may suffer from bias6. Tools for objective assessment of surgical skill can be divided into rating scales and motion analysis systems. Checklist and Objective Structured Clinical Examination (OSCE)-type scales for assessment of technical ability are effective and reliable methods of appraising surgical dexterity in inanimate bench, animal, cadaveric and live operating simulation environments7–12. Motion analysis has been utilized for the assessment of component surgical tasks, such as knot-tying, suturing and bowel anastomosis13,14. Virtual reality (VR) simulation is also an objective and valid assessor of proficiency in the domains of laparoscopy and endoscopy15–19. There have been recent developments in tools for assessing surgical skill with inanimate models, to reduce the effects of patient variability in the assessment process12,14,20. Other studies have related technical skills assessments to outcomes, such as leak rates following vascular anastomosis21.
For such assessments to become incorporated into the surgical curriculum, a stepwise approach that parallels the defined stages of training is necessary. Surgical training in the UK currently lasts for about 8 years and is divided into basic training (usually about 3 years, involving exposure to a broad range of surgical specialties) and higher or specialist training (usually about 5 years). In the case of general surgical trainees, this second phase of training typically involves rotation through all of the recognized subspecialties (for example, vascular and colorectal) followed by a dedicated period of senior training in a specific area (such as hepatobiliary pancreatic surgery). Formal examinations dictate progression between basic and specialist training, and a system of annual appraisal plus an exit examination determines the completion of satisfactory training and entry to the consultant grade.

Mackay et al.14 have developed an approach to assessing the technical ability of junior surgical trainees that is designed to be incorporated early in a surgical career, thereby ensuring that basic skills are learnt before entry to a higher surgical training programme. The examination is generic and involves bench-top models of surgical tasks such as knot-tying and enterotomy closure. The next stage of assessment is to develop a higher-fidelity examination for higher surgical trainees, with the aim of ensuring ongoing proficiency and enabling poorly performing trainees to be identified. This is more difficult, as the nature of the procedures becomes increasingly complex.

The aim of this study was to integrate a set of previously validated laboratory-based tasks into a multistation examination designed to appraise and compare the technical ability of higher surgical trainees.

Methods
Twenty-two participants from within the North West Thames General Surgical Training Programme were recruited to the study. They were divided into four groups: five basic surgical trainees, eight junior specialist trainees in their first 3 years of higher training, four senior specialist trainees in their final 2 years of higher training, and five consultant general surgeons. The examination was aimed specifically at assessing performance in the junior specialist trainee group, with basic trainees and senior registrar/consultant groups enabling definition of the lower and upper limits of performance respectively. All trainees had completed the Royal College of Surgeons' Basic Surgical Skills Course; each participant performed the assessment procedures in a clinical setting.

Assessment tasks
The examination consisted of five separate tasks designed to assess a variety of technical skills involving gastrointestinal and vascular anastomoses, laparoscopy and endoscopy. The tasks were chosen to reflect procedures that specialist trainees in general surgery are expected to perform proficiently within the first 3 years of speciality training (Table 1). Three tasks were based on synthetic simulation models: small bowel anastomosis (Limbs and Things, Bristol, UK), vein patch insertion into an artery at depth (Annex Art, Anglesey, UK) and dissection of a saphenofemoral junction in the groin (Limbs and Things). Two tasks were based on VR surgical simulations: flexible sigmoidoscopy (Pre-Op™ VR endoscopic simulator; Immersion Medical, Gaithersburg, Maryland, USA) and laparoscopic manipulation (MIST-VR™ laparoscopic trainer; Mentice Corporation, Gothenburg, Sweden).
Face, construct and content validity had been established previously for all five procedures13,15,18–20,22.

Table 1 Assessment tasks and skills tested
Task | Type of material | Technical skills tested | Cognitive skills tested
Saphenofemoral vein junction dissection | Synthetic tissues in groin simulator | Incision, recognition of tissue planes and tissue handling | Saphenofemoral dissection
Small bowel anastomosis | Synthetic bowel | Approximation skills, suturing, knot-tying | Bowel anastomosis
Vascular vein patch insertion | Synthetic vein patch and artery | Suturing skills, suture handling, apposition skill, operating at depth | Vascular anastomosis
Flexible sigmoidoscopy | Virtual reality programme with interactive sigmoidoscope | Use and manipulation of endoscope, recognition of pathology | Lower gastrointestinal endoscopy
MIST-VR™ laparoscopic tasks | Virtual reality simulator with interactive instruments | Hand–eye coordination, laparoscopic skills and manipulation | Basic laparoscopy
The examination lasted for a maximum of 2·5 h (30 min per station), as these timings had been shown previously to be sufficient even for the slowest and least experienced trainees13,15,18–20,22.

Methods of assessment
Performance data from the synthetic simulation tasks were recorded on digital videotape. They were analysed retrospectively and independently by three trained observers who were not involved in organizing the examinations and were unaware of the identities of participants. Anonymity was ensured by videotaping only the bench model and the gloved surgeon's hands. The Objective Structured Assessment of Technical Skill (OSATS) technique, which consists of an eight-variable structured global rating scale (each variable rated 1–5, maximum score 40), was used to measure performance (Table 2). A generic scale was used for assessment of surgical skill, as in previously published validation studies for each of these three tasks20,22.

Table 2 Global rating scale from the Objective Structured Assessment of Technical Skill (OSATS)9

The two computer-based simulators have in-built assessment packages based on the variables time, economy of movement and error scores. These can be summed to provide an overall score of performance. These assessment methods were again identical to those used in previously validated studies of VR simulation13,15,18.

Statistical analysis
As the underlying distribution of the observations was unknown, a non-parametric approach was employed for statistical analysis. Kruskal–Wallis and Mann–Whitney U tests were used to compare performance between the experience groups. P < 0·050 was considered statistically significant.
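As a rough illustration of this analysis, the following Python sketch ranks 22 candidates at each station so that the best performance scores 22 and the worst 1, sums the ranks into a combined score (the approach described in the Results section below), and applies the Kruskal–Wallis and pairwise Mann–Whitney U tests across the four experience groups. It is not the authors' code: the scores are synthetic and the station names and group labels are assumptions made for the example.

import numpy as np
from scipy.stats import rankdata, kruskal, mannwhitneyu

rng = np.random.default_rng(0)
n = 22  # candidates

# Synthetic station scores (assumption: not the study data). Higher is better
# for the OSATS and sigmoidoscopy scores; lower is better for MIST-VR.
scores = {
    "bowel_anastomosis_osats": rng.integers(15, 41, n),
    "vein_patch_osats": rng.integers(15, 41, n),
    "saphenofemoral_osats": rng.integers(15, 41, n),
    "sigmoidoscopy_vr": rng.uniform(40, 100, n),
    "mist_vr": rng.uniform(50, 300, n),
}

# Rank each station so the best performance receives 22 and the worst 1,
# then sum the ranks to give each candidate a combined score.
combined = np.zeros(n)
for station, s in scores.items():
    s = -s if station == "mist_vr" else s   # flip so higher always means better
    combined += rankdata(s)                 # tied scores share the mean rank

# Assumed recruitment order: 5 basic trainees, 8 junior specialist trainees,
# 4 senior specialist trainees, 5 consultants.
groups = np.array(["BST"] * 5 + ["JST"] * 8 + ["SST"] * 4 + ["CONS"] * 5)
by_group = [combined[groups == g] for g in ("BST", "JST", "SST", "CONS")]

# Overall effect of seniority, then the pairwise comparisons reported in the paper.
print("Kruskal-Wallis:", kruskal(*by_group))
print("BST vs JST:", mannwhitneyu(by_group[0], by_group[1], alternative="two-sided"))
print("JST vs SST:", mannwhitneyu(by_group[1], by_group[2], alternative="two-sided"))
print("SST vs CONS:", mannwhitneyu(by_group[2], by_group[3], alternative="two-sided"))

Ranking the raw station scores places the OSATS totals and the simulator metrics on a common scale before they are combined, which is why the direction of the MIST-VR score (lower is better) is flipped before ranking.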
Results
The results for each candidate in the five tasks are shown in Figs 1–3. The scores obtained for each task were then ranked. As there were 22 participants, the best performance at each station was given a value of 22 and the worst a value of one. The combined scores are shown in Fig. 4.

Fig. 1 Objective Structured Assessment of Technical Skill (OSATS) global rating score for small bowel anastomosis, vein patch insertion and saphenofemoral dissection tasks for each candidate. A higher score indicates better performance
Fig. 2 MIST-VR™ laparoscopic simulator results for each candidate. A lower score indicates better performance
Fig. 3 Sigmoidoscopy simulator results for each candidate. A higher score indicates better performance
Fig. 4 Combined ranking for all five tasks for each candidate

There was a significant improvement in overall performance with increasing seniority (P < 0·001, Kruskal–Wallis test). The median (range) values of the ranks for each group were: basic surgical trainees, 26·5 (9·0–45·0); junior specialist trainees, 54·0 (40·0–77·0); senior specialist trainees, 78·0 (58·5–83·5); and consultants, 82·0 (64·5–89·5). There were significant differences in the performance of basic trainees and junior specialist trainees (P = 0·019), and junior and senior specialist trainees (P = 0·048), but not senior trainees and consultants (P = 0·623) (all Mann–Whitney U test). For the three video-based tasks, the reliability of scores between raters was assessed by comparing the rankings of each rater for each candidate. Fig. 5 shows the findings for small bowel anastomosis, and reveals close agreement between the three raters (rS = 0·602, P < 0·001). Similar results were achieved for vein patch insertion and saphenofemoral dissection (rS = 0·738, P < 0·001 and rS = 0·803, P < 0·001 respectively).

Fig. 5 Objective Structured Assessment of Technical Skill (OSATS) global rating score for small bowel anastomosis for individual examiners, demonstrating inter-rater agreement
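The inter-rater agreement reported above is expressed as rank correlation coefficients (rS) between observers' scores. A minimal sketch of such a comparison, assuming three raters and synthetic OSATS scores for 22 candidates, might look like the following; it is illustrative only and does not reproduce the study's data or the exact procedure used by the authors.

from itertools import combinations

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)

# Synthetic OSATS scores for 22 candidates from three raters (assumption:
# each rater observes the same underlying performance with independent noise).
true_skill = rng.uniform(15, 40, 22)
raters = {
    f"rater_{i}": np.clip(true_skill + rng.normal(0, 3, 22), 8, 40)
    for i in (1, 2, 3)
}

# Pairwise Spearman rank correlations between raters' scores.
for a, b in combinations(raters, 2):
    rho, p = spearmanr(raters[a], raters[b])
    print(f"{a} vs {b}: rS = {rho:.2f}, P = {p:.4f}")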
Discussion
There is growing appreciation both within and outside the surgical community of the need for more objective methods of assessing technical performance. Skills assessment is starting to be incorporated into training programmes, and other groups7–12 have demonstrated the validity of multistation skills examinations, run in an OSCE format, with objective measurements of performance. Assessments developed for surgical trainees by Martin et al.9 involved diverse tasks, ranging from excision of a skin lesion to control of inferior vena cava haemorrhage. In the present study, tasks were designed specifically to examine techniques and procedures that a junior specialist registrar is expected to perform independently and competently after the first 3 years of higher surgical training. The attainment of these skills parallels recommendations made by the American Council of Graduate Medical Education23. Competence in basic tasks is assumed as a criterion for entry to the surgical training programme.

A problem with multistation assessments has been the number of expert examiners required. The decision to videotape the three tasks involving synthetic models reduced the number of examiners required as well as the possibility of rater bias. A number of factors need to be considered when choosing appropriate models for comparisons of technical ability. Is the task valid—can it discriminate correctly between experienced, intermediate and novice performers? Is there a ceiling or floor effect—is the task too easy, so that the whole subject group performs well, or too difficult, so that even good candidates may perform badly, making discrimination within the target group impossible? Can the particular skill being assessed be isolated from the task? Inanimate and VR models of surgical tasks were used throughout as they allow exact replication of conditions for all candidates, permitting reliable comparisons between individuals. Human and animal models suffer in this respect, as anatomical and pathological variations can result in large differences between subjects.

This examination model also differed from previous studies employing multistation assessments in that the tasks had already been validated individually. Therefore, the question was not whether a task was valid, but whether it was appropriate for the defined target group of subjects. Evaluations of technical skill have many potential uses beyond competency assessment. Repeated examinations can determine technical performance during training, leading to the development of longitudinal analysis and formative feedback. The latter would enable constructive criticism of performance, allowing specific deficiencies to be highlighted and corrected with targeted training. This could be used to determine progress or a need for repetition within a training schedule. Evaluations could also be used to assess the quality of a training programme and highlight deficiencies within it. This study did reveal a ceiling effect in performance at senior specialist trainee and consultant level, in agreement with previous studies19–22.

Technical ability depends on inherent knowledge of the procedure (cognitive element) and the process of execution (manual dexterity component). It is well established that manual dexterity skills are acquired first in the process of gaining expertise. Cognitive factors take longer to learn, and it is the acquisition of cognitive skill that differentiates competent from virtuoso performance24,25. The target group of junior specialist trainees enrolled in the present study was in the middle of overall training, and the combination of more complex technical skills and more basic cognitive traits needed to complete the tasks successfully was reflected in their scores.

For technical assessments to be used with confidence, objectivity, validity, reliability and feasibility need to be demonstrated. The objectivity and validity of the tasks used in this study have already been shown. The study demonstrated good reliability between observers. From a clinical perspective, the tasks are complementary and test the skills required of a junior surgical specialist trainee. This examination has confirmed feasibility at several levels. In terms of time, it took 2·5 h to complete the tasks and up to five trainees could be assessed at any one time, requiring a maximum of only two researchers to run the examination. With regard to cost, this was approximately £100 per trainee based on the total cost of all simulations, sutures and instruments used per candidate. Finally, the examination itself is flexible; both the models and assessment tools can be transported easily between sites. Although this study focused on the assessment of skill in general surgical trainees, the same methodology could be applied in other surgical specialties or in fields of medicine that require proficiency in manual dexterity.

References
1 Baldwin PJ, Paisley AM, Brown SP. Consultant surgeons' opinion of the skills required of basic surgical trainees. Br J Surg 1999; 86: 1078–1082.
2 Thomas WE. Core skills, courses and competency. Ann R Coll Surg Engl 2000; 82(Suppl): 18–20.
3 Calman KC. Specialist training in the UK. Lancet 1997; 350: 1852.
4 Pollock AV. How do we measure surgical competence? Eur J Surg 1996; 162: 355–360.
5 Schwartz RW, Donnelly MB, Sloan DA, Johnson SB, Strodel WE. The relationship between faculty ward evaluations, OSCE, and ABSITE as measures of surgical intern performance. Am J Surg 1995; 169: 414–417.
6 Darzi A, Smith S, Taffinder N. Assessing operative skill. Needs to become more objective. BMJ 1999; 318: 887–888.
7 Kopta JA. An approach to the evaluation of operative skills. Surgery 1971; 70: 297–303.
8 Faulkner H, Regehr G, Martin J, Reznick R. Validation of an objective structured assessment of technical skill for surgical residents. Acad Med 1996; 71: 1363–1365.
9 Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C et al. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg 1997; 84: 273–278.
10 Anastakis DJ, Regehr G, Reznick RK, Cusimano M, Murnaghan J, Brown M et al. Assessment of technical skills transfer from the bench training model to the human model. Am J Surg 1999; 177: 167–170.
11 Szalay D, MacRae H, Regehr G, Reznick R. Using operative outcome to assess technical skill. Am J Surg 2000; 180: 234–237.
12 Bruce NC. Evaluation of procedural skills of internal medical residents. Acad Med 1989; 64: 213–216.
13 Datta V, Mackay S, Mandalia M, Darzi A. The use of electromagnetic motion tracking analysis to objectively measure open surgical skill in the laboratory-based model. J Am Coll Surg 2001; 193: 479–485.
14 Mackay S, Datta V, Chang A, Shah J, Kneebone R, Darzi A. Multiple Objective Measures of Skill (MOMS): a new approach to the assessment of technical ability in surgical trainees. Ann Surg 2003; 238: 291–300.
15 Taffinder N, Sutton C, Fishwick RJ, McManus IC, Darzi A. Validation of virtual reality to teach and assess psychomotor skills in laparoscopic surgery: results from randomised controlled studies using the MIST VR laparoscopic simulator. Stud Health Technol Inform 1998; 50: 124–130.
16 Tasto JL, Verstreken K, Brown JM, Bauer JJ. PreOp endoscopy simulator: from bronchoscopy to ureteroscopy. Stud Health Technol Inform 2000; 70: 344–349.
17 Johnston R, Weiss P. Analysis of virtual reality technology applied in education. Minimally Invasive Ther Allied Technol 1997; 6: 126–127.
18 Taffinder NJ, McManus IC, Gul Y, Russell RC, Darzi A. Effect of sleep deprivation on surgeons' dexterity on laparoscopy simulator. Lancet 1998; 352: 1191.
19 Datta V, Mandalia M, Mackay S, Darzi A. The PreOp flexible sigmoidoscopy trainer. Validation and early evaluation of a virtual reality based system. Surg Endosc 2002; 16: 1459–1463.
20 Datta V, Bann S, Beard J, Mandalia M, Darzi A. Comparison of bench test evaluations of surgical skill with live operating performance assessments. J Am Coll Surg 2004; 199: 603–606.
21 Datta V, Mandalia M, Mackay S, Chang A, Cheshire N, Darzi A. Relationship between skill and outcome in the laboratory-based model. Surgery 2002; 131: 318–323.
22 Datta V, Chang A, Mackay S, Darzi A. The relationship between motion analysis and surgical technical assessments. Am J Surg 2002; 184: 70–73.
23 American Council of Graduate Medical Education and American Board of Medical Specialties. Toolbox of assessment methods. In Credentialing Physician Specialists: a World Perspective. American Council of Graduate Medical Education and American Board of Medical Specialties: Chicago, 2000; 67–79.
24 Wade MG. Developmental motor learning. Exerc Sport Sci Rev 1976; 4: 375–394.
25 Annett J. Acquisition of skill. Br Med Bull 1971; 27: 266–271.

Copyright © 2006 British Journal of Surgery Society Ltd. Published by John Wiley & Sons, Ltd. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model)
TI - Technical skills examination for general surgical trainees
JF - British Journal of Surgery
DO - 10.1002/bjs.5330
DA - 2006-08-17
UR - https://www.deepdyve.com/lp/oxford-university-press/technical-skills-examination-for-general-surgical-trainees-94dHps7aNe
SP - 1139
EP - 1146
VL - 93
IS - 9
DP - DeepDyve
ER -