The script concordance test in radiation oncology: validation study of a new tool to assess clinical reasoning

Abstract

Background: The Script Concordance Test (SCT) is a reliable and valid tool to evaluate clinical reasoning in complex situations where experts' opinions may be divided. Scores reflect the degree of concordance between the performance of examinees and that of a reference panel of experienced physicians. The purpose of this study is to demonstrate the SCT's usefulness in radiation oncology.

Methods: A 90-item radiation oncology SCT was administered to 155 participants. Three levels of experience were tested: medical students (n = 70), radiation oncology residents (n = 38) and radiation oncologists (n = 47). Statistical tests were performed to assess reliability and to document validity.

Results: After item optimization, the test comprised 30 cases and 70 questions. Cronbach's alpha was 0.90. Mean scores were 51.62 (± 8.19) for students, 71.20 (± 9.45) for residents and 76.67 (± 6.14) for radiation oncologists. The difference between the three groups was statistically significant when compared by the Kruskal-Wallis test (p < 0.001).

Conclusion: The SCT is reliable and useful for discriminating among participants according to their level of experience in radiation oncology. It appears to be a useful tool for documenting the progression of reasoning during residency training.

Background

In oncology, a constant flow of new data from research exposes the physician to an abundance of treatment alternatives [1-3]. The clinician is often challenged by ill-defined problems [4,5] characterized by uncertainty, and opinions on the treatment of a particular patient may differ considerably [6]. Reasoning on treatment options should be monitored to provide information on residents' strengths and weaknesses and to guide their learning.

Experienced practitioners possess elaborate networks of knowledge, called scripts [7], adapted to their clinical tasks. Scripts allow the clinician to determine diagnosis, strategies of investigation, or treatment options. Scripts begin to appear during medical school and are refined over years of clinical experience [8]. The script concordance test (SCT), which is based on cognitive psychology script theory [9], provides a way to assess reasoning skills in the context of uncertainty that often characterizes oncology.

The SCT makes it possible to include real-life situations that are scarcely ever measured with usual tests. It probes the multiple judgments that are made in the clinical reasoning process. Scoring reflects the degree of concordance of these judgments with those of a reference panel. A series of studies in domains such as intra-operative decision-making skills [10], urology [11] and family medicine [12,13] documents the reliability and construct validity of test scores.

The research questions were:
1. Is it possible to obtain reliable scores on clinical reasoning in radiation oncology?
2. Do SCT scores reflect participants' level of clinical experience?
3. How is the test perceived by residents and experienced professionals?

Methods

Instrument
SCTs [9] are made up of cases that incorporate the uncertainty of practice situations. Several options are relevant in solving the diagnostic or management problem posed by the situation. Case scenarios are followed by a series of questions, presented in three parts (see Figure 1). The first part ("if you were thinking of") contains a relevant option. The second part ("and then you find") presents a new clinical finding, such as a physical sign, a pre-existing condition, an imaging study or a laboratory test result. The third part ("this option becomes") is a five-point Likert scale that captures the examinee's decision. The task for examinees is to decide what effect the new finding has, in direction (positive, negative or neutral) and intensity, on the status of the option. This effect is captured with a Likert scale because script theory assumes that clinical reasoning is composed of a series of qualitative judgments [7].

[Figure 1. Items from the pulmonary, urological and breast cancer portions of the test.]

The radio-oncology test was constructed by two radiation oncologists. Cases were taken from the three most prevalent fields in the cancer patient population: pulmonary, urological and breast cancers (10 cases per field). Each case, presented in a short scenario, was followed by three related test items.
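To make the three-part item format concrete, the following is a minimal sketch of one SCT question represented as a small data structure. The vignette, option and finding texts are hypothetical placeholders rather than items from the actual instrument, and the wording of the -2 to +2 anchors is illustrative; only the three-part structure and the five-point scale come from the paper.

    # Hypothetical representation of a single SCT question (not from the real test).
    # Each case scenario in the study carried three such questions.
    LIKERT_ANCHORS = {
        -2: "much less indicated / practically ruled out",   # anchor wordings are illustrative
        -1: "less indicated",
         0: "neither more nor less indicated",
        +1: "more indicated",
        +2: "much more indicated / practically certain",
    }

    example_question = {
        "case": "68-year-old man with newly diagnosed localized prostate cancer ...",  # hypothetical vignette
        "if_you_were_thinking_of": "external beam radiotherapy alone",                 # part 1: relevant option
        "and_then_you_find": "a repeat PSA of 25 ng/mL",                                # part 2: new clinical finding
        "this_option_becomes": None,  # part 3: examinee's answer on the -2..+2 Likert scale
    }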
Subjects
Three groups were sampled, representing three levels of clinical experience. The first group consisted of 4th-year medical students (n = 70) from the University of Montreal. Students took the exam immediately after a lecture given on radiation oncology. Only four of them possessed clinical experience in radiation oncology, acquired through an elective rotation (students taking these elective rotations often consider specializing in radio-oncology). The second group consisted of the population of residents of the three residency programs in radiation oncology in the province of Quebec (Montreal, Laval and McGill Universities), a total of 52 residents. The third group consisted of the whole population of board-certified practitioners in radiation oncology in the province of Quebec (n = 62). The test was administered in the French language.

Each examinee received instructions on the particular format of the SCT and on the AJCC (American Joint Committee on Cancer) classification of the tested cancers. Participation was voluntary. Demographic data were collected for students and residents. Anonymity was guaranteed for board-certified specialists. The project was approved by the ethics committee of the University of Montreal.

Scoring
SCT scoring is based on the comparison of answers provided by examinees with those of a reference panel composed of physicians with experience in the field. Panel members are asked to complete the test individually, and their answers are used to develop the scoring key [9]. In this study the radio-oncologists of the province of Quebec constituted both the third level of clinical experience and the reference panel.

With this scoring method, the maximum score for each question is 1, awarded for the modal answer of the reference panel. Other panel members' choices receive partial credit, proportional to the number of members having provided that answer on the Likert scale divided by the number who chose the modal answer for the item. Answers not chosen by any panel member receive zero. For example, suppose the reference panel, made up of 42 radio-oncologists, responds to a question in the following way: none choose the "+2" and "+1" ratings, 2 choose the "0" rating, 10 choose the "-1" rating and 30 choose the "-2" rating. The modal answer in this example is "-2". An examinee choosing this rating receives 1; selecting the "-1" rating earns 0.33 (10/30) and the "0" rating 0.06 (2/30). No points are accorded for selecting the "+1" or "+2" ratings.

With this method, all questions have the same maximum (1) and minimum (0) value. Scores obtained on each question are added to obtain a total score for the test. With SCTs, a theoretical score of 100 would mean that the person had answered each item in the same way as the majority of panel members. In reality, such a score is never reached, even by panel members. Panel means found in tests administered in other specialities were generally in the 80s.
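The aggregate scoring rule described above can be expressed compactly. The sketch below (function and variable names are ours, not from the paper) builds a scoring key from a list of panel answers and reproduces the worked example of 42 panellists.

    from collections import Counter

    def build_scoring_key(panel_answers):
        """Credit for each Likert rating = (number of panellists who chose it) /
        (number who chose the modal rating). Ratings chosen by nobody score 0."""
        counts = Counter(panel_answers)
        modal_count = max(counts.values())
        return {rating: n / modal_count for rating, n in counts.items()}

    def total_score(examinee_answers, scoring_keys):
        """Sum of per-question credits; answers outside the key earn 0."""
        return sum(key.get(ans, 0.0) for ans, key in zip(examinee_answers, scoring_keys))

    # Worked example from the paper: 42 panellists answer one question with
    # 30 x "-2", 10 x "-1", 2 x "0", and nobody chooses "+1" or "+2".
    panel = [-2] * 30 + [-1] * 10 + [0] * 2
    key = build_scoring_key(panel)
    # key[-2] == 1.0, key[-1] == 0.333..., key[0] == 0.066...
    # (reported in the paper, after rounding, as 1, 0.33 and 0.06)

Total scores in the paper are reported on a 0 to 100 scale, which implies a final rescaling of the summed per-question credits; the exact rescaling step is not spelled out in the text.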
Statistical analysis
To avoid bias, when radio-oncologists were used as members of the panel, scores for each question were computed using a scoring key that excluded their own response to that question. When they were studied as the third level of experience, their scores were computed with the scoring key used for the other participants.

An item analysis was done to detect problematic questions: questions with a low item-total correlation (r < 0.10) were removed. The normality of the distribution of total scores in each group was evaluated with the Kolmogorov-Smirnov statistical test. Reliability was estimated using the Cronbach alpha internal consistency coefficient.

The Levene test was used to evaluate the homogeneity of variance across the three groups. ANOVA was planned for group comparison; in case of lack of variance homogeneity, non-parametric alternatives were used. To evaluate the capacity to discriminate the scores of the three groups, the non-parametric Kruskal-Wallis test was applied. The non-parametric Mann-Whitney test was used to assess more specifically the differences in scores between residents and radiation oncologists, and between junior residents (PGY-1 to PGY-3) and senior residents (PGY-4 to PGY-5).

All tests were two-sided, and p values < 0.05 were considered statistically significant. No correction for multiple tests was applied. The test results were treated anonymously. The analysis was done with SPSS (Statistical Package for the Social Sciences) software, version 11.0.

Feasibility
Data were collected informally on test construction difficulties and on residents' and board-certified specialists' reactions to the content and format of the test.
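As a rough illustration of this analysis plan, the sketch below uses NumPy and SciPy in place of the SPSS 11.0 procedures the authors actually ran; the function names are ours, and the arrays of per-participant total scores (and the examinees-by-items matrix for reliability) are assumed inputs.

    import numpy as np
    from scipy import stats

    def cronbach_alpha(item_scores):
        """Internal-consistency reliability; item_scores is an examinees x items array."""
        x = np.asarray(item_scores, dtype=float)
        k = x.shape[1]
        return k / (k - 1) * (1 - x.var(axis=0, ddof=1).sum() / x.sum(axis=1).var(ddof=1))

    def normality_p(scores):
        """Kolmogorov-Smirnov test of normality on standardized total scores."""
        return stats.kstest(stats.zscore(scores), "norm").pvalue

    def group_comparisons(students, residents, oncologists, juniors, seniors):
        """Levene test for variance homogeneity; ANOVA if homogeneous, otherwise
        Kruskal-Wallis; Mann-Whitney U for the two pairwise contrasts of interest."""
        _, p_levene = stats.levene(students, residents, oncologists, center="mean")
        if p_levene < 0.05:
            _, p_groups = stats.kruskal(students, residents, oncologists)
        else:
            _, p_groups = stats.f_oneway(students, residents, oncologists)
        _, p_res_onc = stats.mannwhitneyu(residents, oncologists, alternative="two-sided")
        _, p_jun_sen = stats.mannwhitneyu(juniors, seniors, alternative="two-sided")
        return {"levene": p_levene, "three_groups": p_groups,
                "residents_vs_oncologists": p_res_onc, "junior_vs_senior": p_jun_sen}

Note that SciPy's defaults differ slightly from SPSS (for example, Levene's test defaults to median centering, overridden here), so exact p values would not necessarily match those reported in the paper.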
Results

Subjects
Participants signed a consent form before taking the test. All 70 students agreed to participate. Among residents, all those from the University of Montreal (22), half of those from McGill (8/16) and 8 out of 14 from Laval took the test (no reason was provided by those who declined, but at McGill it was mainly a result of the language barrier). The 38 participating residents represent 72% of the radiation oncology residents of the province; 70% (26) were juniors and 30% (11) were seniors (one resident did not specify his/her year of residency). Forty-seven (76%) of the 62 board-certified radiation oncologists in the province agreed to participate; among them, 81% had their practice in university hospitals. While there was no time constraint, most participants completed the test in under an hour.

Among board-certified radiation oncologists, three were outliers (total test score more than two standard deviations below the mean) and two had too many missing data. All five were removed from the group. The panel and the third level of experience were therefore made up of 42 persons.

Missing data
One student and one resident were removed from the study because of too many missing data (more than four missing answers in the pulmonary, urological or breast cancer sections). For all other participants, missing answers were replaced by the average score of all other questions from that section of the test; this represents less than 0.5% of all test answers. Participating in the analyses were 69 students, 37 radiation oncology residents and 42 radiation oncologists.

Item analysis
The test taken by participants was composed of 10 cases in each section, each with three related questions. After item analysis, five questions were removed from the lung cancer section, five from the urological cancer section, and 10 from the breast cancer section; a maximum of two questions were discarded per case. After item optimization, the test comprised 30 cases and 70 questions. The normality of score distributions was verified with the Kolmogorov-Smirnov statistical test (Z > 0.558; p > 0.736).

Reliability
The Cronbach alpha coefficient for the optimized test is 0.90 (0.72 for the 25 questions on lung cancer, 0.78 for the 25 questions on urological cancer, and 0.78 for the 20 items on breast cancer).

Participant scores
Mean scores (and their variability) were 51.6 (SD = 8.2; range 32.7–74.9) for students, 71.2 (SD = 9.5; range 53.2–85.8) for residents, and 76.7 (SD = 6.1; range 61.8–90.2) for board-certified radiation oncologists. The score distributions for each group are presented graphically in Figure 2. The score of one of the students (74.9) differs markedly from his group's average; this individual did a one-month rotation in radiation oncology and a one-month rotation in medical oncology.

[Figure 2. SCT score distributions for each group (students, residents, panel); horizontal axis: SCT score, 30 to 100.]

Since the variances of the scores were not homogeneous (p = 0.016), non-parametric tests were used to evaluate the capacity of the test to detect differences according to clinical experience. There was a significant difference (p < 0.001) between the mean scores of the three groups of examinees. There was also a significant difference within residents, with junior residents (PGY-1 to PGY-3) having a mean of 68.9 (± 10.0) and senior residents (PGY-4 to PGY-5) a mean of 76.5 (± 5.0; Z = -2.193, p = 0.028).
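The item screening and missing-data handling reported above (drop questions with item-total correlation below 0.10; impute an examinee's few missing answers with the mean of their other answers in the same section) could be sketched as follows. The data layout and function names are assumptions, and a corrected item-total correlation (item excluded from the total) is used here, which the paper does not specify.

    import numpy as np

    def impute_within_section(scores):
        """scores: examinees x items array for one test section (NaN = missing answer).
        Replace each missing credit with that examinee's mean over the section's
        other questions, as done for participants with at most four missing answers."""
        x = np.array(scores, dtype=float)
        row_means = np.nanmean(x, axis=1, keepdims=True)
        return np.where(np.isnan(x), row_means, x)

    def drop_weak_items(scores, threshold=0.10):
        """Remove questions whose (corrected) item-total correlation falls below the threshold."""
        x = np.asarray(scores, dtype=float)
        totals = x.sum(axis=1)
        r = np.array([np.corrcoef(x[:, j], totals - x[:, j])[0, 1] for j in range(x.shape[1])])
        return x[:, r >= threshold], r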
Feasibility of and reactions to the test format
Despite their lack of knowledge and experience in the field, students were stimulated by the test format and were eager to get their test results. Students, residents and specialists completed the test with pleasure, and recruitment for the study was easy. Participants from the three groups completed the test in a short time (less than an hour for most) and expressed their appreciation of the similarity between the situations described in the scenarios and real situations encountered in their practice.

Discussion
With the SCT, examinees are probed on a specific component of clinical reasoning: data interpretation, a crucial step within the clinical reasoning process [14]. It measures the degree of concordance between the examinee's performance and that of a reference panel on a series of case-based tasks. As it is inferred that high scores correspond to optimal use of information in the context of these specific tasks, the test provides an indication of clinical reasoning quality. Clinicians find the test appealing because it contains cognitive tasks similar to those they encounter in their daily practice. Furthermore, as opposed to many other tests that require revision of knowledge for optimal performance, a clinician can complete the test at any time without any preparation.

With the SCT, examinees are not assessed against pre-set criteria or compared to their own group. Residents are physicians who wish to become, after training, members of the population of certified specialists in their field of study. It is therefore legitimate to compare their reasoning performance to a panel that is representative of physicians in that field. The panel, made up of 76% of the board-certified radio-oncologists of the province, was highly representative of this population.

Test scores appear reliable, with a Cronbach alpha coefficient reaching 0.90 for 70 questions and one hour of testing time.
This compares very favourably with other test formats when compared per unit of testing time [15]. The test showed a capacity to reflect clinical experience in the field: participants with more experience in radiation oncology scored higher on the SCT. This was true for students when compared with the other two levels, for residents when compared with board-certified radiation oncologists, and for junior residents when compared with senior residents. These results indicate that the SCT format should be useful for documenting learning throughout residency training.

Some of the results warrant comment. One student obtained a markedly higher score than the other members of his group; this student had completed two elective rotations in the field and was aiming for a residency in radiation oncology. Senior residents had scores that were close to those of the panel, indicating readiness for autonomous practice in the tested domains. On the other hand, some students and residents had low scores, indicating that the SCT may potentially be used to identify residents whose clinical judgment is lagging and who may need remedial action.

The study has several limitations. It addresses only three specific areas of radiation oncology, and participants come from a limited geographic area. In the future, it would be interesting to repeat this experiment using a tool extended to include other pathologies in oncology (gynaecological, digestive, head and neck, etc.). These spheres of competency have fewer practising experts and the results could be different. The majority of experts who completed the SCT practise in a university centre (81%) and often specialize in one or more specific areas of radiation oncology; therefore, certain panel members who answered questions on lung cancer have not actually treated this pathology for many years. A forthcoming study will examine the influence of this specialization of radiation oncologists and the optimal number of panel members.

Conclusion
The SCT seems to measure a dimension of reasoning and knowledge that is different from those evaluated by usual assessment tools. It explores the interpretation of data in a clinical context, an ability clearly related to clinical experience. This study provides evidence in favour of the SCT as a reliable and valid tool to evaluate the clinical reasoning of radiation oncology residents. The use of this instrument will allow for a more comprehensive evaluation of a resident's performance in this specialty. A low score on the SCT could indicate residents who need assistance in developing their reasoning capacity.

Competing interests
The authors declare that they have no competing interests.

Authors' contributions
CL contributed to conception and design, acquisition of data and interpretation of data, and was involved in drafting the manuscript. RG contributed to conception and design, analysis and interpretation of data, and was involved in revising the manuscript critically. DN contributed to acquisition of data and revised the manuscript. BC contributed to conception and design, analysis and interpretation of data, and was involved in drafting the manuscript. All authors read and approved the final manuscript.

Acknowledgements
Évelyne Sauvé for data collection. Source of funding: CPASS (Centre de Pédagogie Appliquée aux Sciences de la Santé), Faculté de médecine, Université de Montréal.

References
1. Blackstock AW, Govindan R: Definitive chemoradiation for the treatment of locally advanced non-small-cell lung cancer. J Clin Oncol 2007, 25(28):4146-4152.
2. Boughey JC, Gonzalez RJ, Bonner E, Kuerer HM: Current treatment and clinical trial developments for ductal carcinoma in situ of the breast. Oncologist 2007, 12(11):1276-1287.
3. Hede K: Radioactive "seed" implants may rival surgery for low-risk prostate cancer patients. J Natl Cancer Inst 2007, 99(20):1507-1509.
4. Schön D: The Reflective Practitioner: How Professionals Think in Action. New York: Basic Books; 1983.
5. Fox R: Medical uncertainty revisited. In Handbook of Social Studies in Health and Medicine. Edited by Albrecht G, Fitzpatrick R, Scrimshaw S. London: Sage Publications; 2000:409-425.
6. Hool GR, Church JM, Fazio VW: Decision-making in rectal cancer surgery: survey of North American colorectal residency programs. Dis Colon Rectum 1998, 41(2):147-152.
7. Charlin B, Boshuizen HPA, Custers EJFM, Feltovich PJ: Scripts and clinical reasoning. Med Educ 2007, 41:1179-1185.
8. Schmidt HG, Norman GR, Boshuizen HP: A cognitive perspective on medical expertise: theory and implication. Acad Med 1990, 65(10):611-621.
9. Charlin B, Roy L, Brailovsky C, Van der Vleuten C: The Script Concordance Test: a tool to assess the reflective clinician. Teach Learn Med 2000, 12:189-195.
10. Meterissian S, Zabolotny B, Gagnon R, Charlin B: Is the script concordance test a valid instrument for assessment of intraoperative decision-making skills? Am J Surg 2007, 193:248-251.
11. Sibert L, Darmoni SJ, Dahamna B, Hellot MF, Weber J, Charlin B: Online clinical reasoning assessment with the Script Concordance Test: results of a French pilot study. BMC Med Educ 2006, 6:45.
12. Gagnon R, Charlin B, Coletti M, Sauvé E, Van der Vleuten C: Assessment in the context of uncertainty: how many members are needed on the panel of reference of a script concordance test? Med Educ 2005, 39:284-291.
13. Charlin B, Van der Vleuten C: Standardized assessment in context of uncertainty: the script concordance approach. Eval Health Prof 2004, 27:304-319.
14. Fournier JP, Demeester A, Charlin B: Script Concordance Tests: guidelines for construction. BMC Med Inform Decis Mak 2008, 8:18.
15. Wass V: Assessment of clinical competence. Lancet 2001, 357(9260):945-949.

Journal: Radiation Oncology
Publisher: Springer Journals / BioMed Central
Published: February 9, 2009
Copyright: © 2009 Lambert et al; licensee BioMed Central Ltd.
Subject: Medicine & Public Health; Oncology; Radiotherapy
eISSN: 1748-717X
DOI: 10.1186/1748-717X-4-7
PMID: 19203358

