Continuing Medical Education and the Physician as a Learner: Guide to the Evidence

One faculty member in a professional school referred to continuing education as "shouting out of windows," and an analysis of the programs at his institution shows the aptness of his metaphor: Faculty members who can be persuaded to do so give lectures on subjects of their own choosing to audiences they do not know, who have assembled only because they want to put in enough hours of classroom attendance so that they can meet a relicensure requirement. As a result, every profession now has members who vigorously oppose what they regard as the excessive promotion of continuing education.—Cyril O. Houle, 1980.1

Researchers of the past decade produced systematic reviews of continuing medical education (CME) and other strategies intended to change physician behavior and improve patient outcomes.2-7 The subjects of the reviews included such concepts as audit and feedback, chart-based reminders, clinical practice guidelines, and formal lectures. Although these strategies were defined as interventions to change the behavior of physicians, their effects were inconsistent across practitioners, settings, and behaviors.3-8 As a result, in the midst of contemporary discussions about quality improvement and the effects of continuing education, there is no singularly effective method for improving physician performance.6,8 Physicians must therefore accept responsibility for their own continuous learning: setting goals and selecting educational activities to achieve those goals. We searched the Research and Development Resource Base in Continuing Medical Education and the Specialised Register of the Cochrane Effective Practice and Organisation of Care group, supplemented by searches of MEDLINE from 1992 to February 2002, for systematic reviews and evidence of CME and its effect on both physicians and CME planners.
A New Definition of CME

In 1992, the traditional definition of CME broadened as a result of a systematic review of 50 randomized controlled trials (Box).7 This review reported that physicians and CME providers were engaged in learning activities extending beyond the conventional lecture hall. Computer-aided instruction on patient-related problems, reading materials, and visits to practice sites from health care professionals who were trained to improve physician performance were described as positive CME interventions because they prepared physicians for change and further learning. Patient education materials, clinical practice guidelines, and flow charts enabled change to occur. Chart audit with feedback, reminders about desired clinical actions, and the opinions of influential local physicians confirmed or reinforced change in the desired direction. Subsequent studies4,5,9 identified these activities as more discrete interventions (Table), with 3 major consistent findings. The factors identified in these studies as most effective include assessment of learning needs, a necessary precursor to effective CME5-7; interaction among physician-learners with opportunities to practice the skills learned2,3,5; and sequenced and multifaceted educational activities.3-5 Continuing medical education strategies that enable and reinforce change are more likely than other more traditional, passive activities to influence behavior.3-5 Physician-learners and CME providers should design and select strategies to optimize improvement of both physician performance and health care outcomes.

Box. Continuing Medical Education Interventions

Educational materials: distribution of published or printed recommendations for clinical care, including clinical practice guidelines, audiovisual materials, and electronic publications

Conferences: participation in conferences, lectures, workshops, or traineeships outside the practice setting

Outreach visits: use of a trained person who meets with providers in their practice settings to provide information for improving the providers' performance

Local opinion leaders: use of providers explicitly nominated by their colleagues as educationally influential

Patient-mediated interventions: interventions for which information was sought from or given directly to patients by others (eg, direct mailings to patients, patient counseling delivered by others, or clinical information collected directly from patients and given to the physician)

Audit and feedback: any summary of clinical performance of health care over a specified period, with or without recommendations for clinical action; the information may have been obtained from medical records, computerized databases, patients, or by observation

Reminders: any intervention (manual or computerized) that prompts the physician to perform a clinical action (eg, concurrent or intervisit reminders to professionals about desired actions such as screening or other preventive services, enhanced laboratory reports, or administrative support [eg, follow-up appointment systems or stickers on charts])

Multifaceted interventions: select combinations of the above 7 interventions (eg, outreach visits followed by clinical information collected directly from patients and a computer reminder to counsel certain patients regarding a specific disorder)

Needs Assessment: Precursor to Change

Assessment of learning needs is crucial for effective CME.5-7 Regardless of whether physicians are involved in conferences or reading that prepares them for change, workshops or demonstrations enabling change at
the practice site, or audit with feedback and reminders to evaluate patients' progress, it is important for physicians to recognize the need to change their behavior, knowledge base, or skills.7 Irrespective of hospital- or office-based practice, primary or specialty care, a change in physicians' knowledge or skills was associated with an identified reason for the change prior to its implementation.5 Physician performance improved when learning experiences incorporated tests of knowledge and assessments of clinical practice needs.7 Physician-learners progress at their own rates, depending upon their motivation, their knowledge of a problem, or the perception of a gap between current knowledge and skills and those that are desired. When gaps are demonstrated and educational resources are extended strategically to help the learner, change occurs more frequently within each type of intervention.5

A variety of tools is available to help physicians and CME providers determine learning needs. Most medical specialty boards in the United States offer written or oral examinations, or both, of knowledge.10 Such tests present excellent opportunities for physicians to assess their knowledge against facts and principles that inform essential clinical decisions. Benchmarking11 is a tool for physicians to compare their personal performance with standards of excellence demonstrated by top performers in a peer group. This approach to assessment has been shown to enhance the effectiveness of physician performance in ambulatory care.12 Utilization review provides institutional information to make comparisons based upon hospital admission rates, mortality and morbidity rates, and medical error rates. Such data may be used before and after educational interventions to judge success in promulgating change.13

Personal learning portfolios describe significant learning events,14 enabling physicians to assess, on an ongoing basis, the questions they find important to answer in maintaining competent clinical performance. The Accreditation Council for Graduate Medical Education (ACGME) lists self-assessment tools for use by practicing physicians as they contemplate ethics, professionalism, and practice-based learning and improvement.15 Physicians can search the ACGME Web site to learn about the validity, feasibility, and psychometric characteristics of selected self-assessment tools. Descriptions of who has used each instrument, how many times, and in what settings also can be found.16 The Change Readiness Inventory (CRI),17 developed from analysis of 775 changes described by 340 North American physicians,18 may be used by CME providers to give physicians a voice in the development of efforts that may facilitate changes in their clinical performance. Reasons to change, such as regulations or clinical advances, and barriers to change, such as low motivation, lack of time, or lack of proper equipment in systems of health care, can be discovered with the CRI. Continuing medical education providers can improve the prospects for change by helping physicians integrate systematic quality improvement efforts with CME, including the assessment of need and evaluation of progress toward clinical goals.11,12

Interactive Learning and Opportunities to Practice

Two-way communication maintained over time enables the convergence of ideas between CME teachers and physician-learners.
Adding enabling strategies such as patient education materials or reminders can help facilitate change at the practice site.3,5 While lectures, conferences, and short courses may predispose physicians toward change, didactic lectures by themselves do not play a significant role in immediately affecting physician performance or improving patient health care.2,3,5 Educational activities that use interactive techniques such as case discussion or hands-on practice sessions generally are more effective in changing behavior and patient outcomes.3 Interactive workshops can result in changes to knowledge or skills; didactic sessions alone are unlikely to change professional practice.2,3,5,7

Sequenced and Multifaceted Activities

Continuing medical education strategies designed to use 2 or more interventions can lead to change in practice.3-5 For example, physicians provided with educational material on the measurement of survival probabilities and the cost of intensive care, followed up by a bedside display of probabilities, reduced test ordering.19 Physicians who received educational material for patients, a reminder to offer them nicotine gum, and a 4-hour training session on smoking cessation counseling experienced higher rates of success helping patients to stop smoking at 1 year.20 Mailed materials, follow-up telephone calls, and presentations at primary care meetings produced a significant decrease in inappropriate referrals and an increase in appropriate referrals to otolaryngologists.4,21 Physicians should choose educational activities with clear goals and the opportunity to progress incrementally toward achievement of those goals.

Outcome Evaluation

Apart from specialty certification or recertification, once graduate medical education is completed there is no formal measure of progress comparable to graduation from undergraduate and graduate medical education.
Each physician monitors his or her own learning, managing the design of its continuity and effects. Changes in clinical behavior can be accomplished and measured through chart audit with feedback.4,5,7 The strategy is central to continuous quality improvement in health care.11 Constructs of the strategy may be found in the Bi-Cycle Approach to Quality Assurance, described by Brown22 as an outer patient and health care cycle and an inner change or education cycle. Outcomes may be measured by improved patient compliance with selected regimens5,7,23 or reduced numbers of inappropriate hospital stays.5,7,24 Two instruments shown to be effective in tracking change in physician behavior and functional outcome improvement for patients are the Karnofsky Performance Status Scale,25 which enables the monitoring of patients whose performance may vary from able to carry on normal activity to unable to work or unable to care for self, and the Short Form-36,26 which enables assessments of physical and emotional well-being.

Questions regarding truth in measurement and methods for assessing outcomes continue to challenge CME and health care professionals. Randomized controlled trials may continue enabling physicians and CME providers to examine the effects of learning and the performance of clinical behaviors, but minor clinical actions often have major consequences remote in space and time.11 Some believe that randomized controlled trials take too long and that cohort and case-control designs are more appropriate to the practicalities of quality improvement studies.11 The Cochrane Effective Practice and Organisation of Care group27 recently accepted a new protocol for a systematic review of the best evidence on continuous quality improvement and its effects on professional practice and patient outcomes.

Conclusions

Traditional CME is a time-based system of credits awarded for attending conferences, workshops, or lectures.
The activities are typically teacher-initiated, using passive educational models (eg, lecture). Recent studies14,28 suggest that physicians benefit from reflection on their progress and development of their next learning projects or questions. What can physicians do? Physicians should reconsider the perspective of CME as consisting solely of lectures, grand rounds, or medical staff meetings. They should participate in educational activities that offer personal involvement in thinking about professional practice and in identifying learning needs. To achieve its greatest potential, CME must be truly continuing, not casual, sporadic, or opportunistic.1 Physicians must recognize the ongoing opportunities to generate important questions, interpret new knowledge, and judge how to apply that knowledge in clinical settings. Essentially, this means that CME must be self-directed by the physician, including management of the content of and context for learning. In turn, the opportunities for self-directed learning must enhance the knowledge and skills required for critical reflection on practice and measurement of improvement.

References

1. Houle CO. Continuing Learning in the Professions. San Francisco, Calif: Jossey-Bass; 1980:266.
2. O'Brien MA, Freemantle N, Oxman AD, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes [Cochrane Review]. Oxford, England: Cochrane Library, Update Software; 2002; issue 1.
3. Davis D, O'Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867-874.
4. Davis DA, Taylor-Vaisey A. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. CMAJ. 1997;157:408-416.
5. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705.
6. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423-1431.
7. Davis D, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA. 1992;268:1111-1117.
8. Grol R. Improving the quality of medical care: building bridges among professional pride, payer profit, and patient satisfaction. JAMA. 2001;284:2578-2585.
9. Davis DA, Lindsay EA, Mazmanian PE. The effectiveness of CME interventions. In: Davis DA, Fox RD, eds. The Physician as Learner: Linking Research to Practice. Chicago, Ill: American Medical Association; 1994:245-280.
10. Norcini JJ. Recertification in the United States. BMJ. 1999;319:1183-1185.
11. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med. 1998;128:651-656.
12. Kiefe CI, Allison JJ, Williams OD, et al. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA. 2001;285:2871-2879.
13. Soumerai SB, McLaughlin TH, Gurwitz JH, et al. Effect of local medical opinion leaders on quality of care for acute myocardial infarction: a randomized controlled trial. JAMA. 1998;279:1358-1363.
14. Campbell CM, Parboosingh J, Gondocz T, et al. Study of the factors influencing the stimulus to learning recorded by physicians keeping a learning portfolio. J Cont Educ Health Prof. 1999;19:16-24.
15. ACGME Outcome Project. Practice-based learning and improvement: assessment approaches. Available at: http://www.acgme.org/outcome/assess/PBLI_Index.asp. Accessibility verified May 9, 2002.
16. ACGME Outcome Project. Available at: http://www.acgme.org/Outcome/. Accessibility verified May 29, 2002.
17. Fox RD. Theory and practice in continuing professional development. J Cont Educ Health Prof. 2000;20:238-246.
18. Fox RD, Mazmanian PE, Putnam RW. A theory of learning and change. In: Fox RD, Mazmanian PE, Putnam RW, eds. Changing and Learning in the Lives of Physicians. New York, NY: Praeger Publishers; 1989:161-175.
19. Pollack MM, Getson PR. Pediatric critical care cost containment: combined actuarial and clinical program. Crit Care Med. 1991;19:12-20.
20. Wilson DM, Taylor DW, Gilbert JR, et al. A randomized trial of a family physician intervention for smoking cessation. JAMA. 1988;260:1570-1574.
21. Benninger MS, King F, Nichols RD. Management guidelines for improvement of otolaryngology referrals from primary care physicians. Otolaryngol Head Neck Surg. 1995;113:446-452.
22. Brown CR. The continuing education component of the bi-cycle approach to quality assurance. In: Egdahl RH, Gertman PM, eds. Quality Health Care: The Role of Continuing Medical Education. Germantown, Md: Aspen Systems Corp; 1977:11-18.
23. Maiman LA, Becker MH, Liptak GS, et al. Improving pediatricians' compliance-enhancing practices. AJDC. 1988;142:773-779.
24. Restuccia JD. The effect of concurrent feedback in reducing inappropriate hospital utilization. Med Care. 1982;20:46-62.
25. Schaafsma J, Osoba D. The Karnofsky Performance Status Scale re-examined: a cross-validation with the EORTC-C30. Qual Life Res. 1994;3:413-424.
26. Ware JE Jr. SF-36 health survey update. Spine. 2000;25:3130-3139.
27. The Cochrane Effective Practice and Organisation of Care Group (EPOC). Aberdeen, United Kingdom: University of Aberdeen; 2001.
28. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226-235.


JAMA, Volume 288 (9), September 4, 2002

Publisher: American Medical Association
Copyright: © 2002 American Medical Association. All Rights Reserved.
ISSN: 0098-7484
eISSN: 1538-3598
DOI: 10.1001/jama.288.9.1057

Abstract

One faculty member in a professional school referred to continuing education as "shouting out of windows," and an analysis of the programs at his institution shows the aptness of his metaphor: Faculty members who can be persuaded to do so give lectures on subjects of their own choosing to audiences they do not know, who have assembled only because they want to put in enough hours of classroom attendance so that they can meet a relicensure requirement. As a result, every profession now has members who vigorously oppose what they regard as the excessive promotion of continuing education.—Cyril O. Houle, 19801 Researchers of the past decade produced systematic reviews of continuing medical education (CME) and other strategies intended to change physician behavior and improve patient outcomes.2-7 The subjects of the reviews included such concepts as audit and feedback, chart-based reminders, clinical practice guidelines, and formal lectures. Defined as interventions to change the behavior of physicians, the effects of those strategies were inconsistent across practitioners, settings, and behaviors.3-8 As a result, in the midst of contemporary discussions about quality improvement and the effects of continuing education, there is no singularly effective method for improving physician performance.6,8 Physicians must accept responsibility for their own continuous learning: setting goals and selecting educational activities to achieve those goals. We searched the Research and Development Resource Base in Continuing Medical Education and the Specialised Register of the Cochrane Effective Practice and Organization of Care group, supplemented by searches of MEDLINE from 1992 to February 2002 for systematic reviews and evidence of CME and its effect on both physicians and CME planners. 
A New Definition of CME In 1992, the traditional definition of CME broadened as a result of a systematic review of 50 randomized controlled trials (BOX).7 This review reported that physicians and CME providers were engaged in learning activities extending beyond the conventional lecture hall. Computer-aided instruction on patient-related problems, reading materials, and visits to practice sites from health care professionals who were trained to improve physician performance were described as positive CME interventions because they prepared physicians for change and further learning. Patient education materials, clinical practice guidelines, and flow charts enabled change to occur. Chart audit with feedback, reminders about desired clinical actions, and the opinions of influential local physicians confirmed or reinforced change in the desired direction. Subsequent studies4,5,9 identified these activities as more discrete interventions (Table), with 3 major consistent findings. The factors identified in these studies that are most effective include assessment of learning needs, a necessary precursor to effective CME5-7; interaction among physician-learners with opportunities to practice the skills learned2,3,5; and sequenced and multifaceted educational activities.3-5 Continuing medical education strategies that enable and reinforce change are more likely than other more traditional, passive activities to influence behavior.3-5 Physician-learners and CME providers should design and select strategies to optimize improvement of both physician performance and health care outcomes. Box. 
Continuing Medical Education Interventions Educational materials: distribution of published or printed recommendations for clinical care, including clinical practice guidelines, audiovisual materials, and electronic publications Conferences: participation in conferences, lectures, workshops, or traineeships outside the practice setting Outreach visits: use of a trained person who meets with providers in their practice settings to provide information for improving the providers' performance Local opinion leaders: use of providers explicitly nominated by their colleagues as educationally influential Patient-mediated interventions: interventions for which information was sought from or given directly to patients by others (eg, direct mailings to patients, patient counseling delivered by others, or clinical information collected directly from patients and given to the physician) Audit and feedback: any summary of clinical performance of health care over a specified period, with or without recommendations for clinical action; the information may have been obtained from medical records, computerized databases, patients, or by observation Reminders: any intervention (manual or computerized) that prompts the physician to perform a clinical action (eg, concurrent or intervisit reminders to professionals about desired actions such as screening or other preventive services, enhanced laboratory reports, or administrative support [eg, follow-up appointment systems or stickers on charts]) Multifaceted interventions: select combinations of the above 7 interventions (eg, outreach visits followed by clinical information collected directly from patients and a computer reminder to counsel certain patients regarding a specific disorder) Needs Assessment: Precursor to Change Assessment of learning needs is crucial for effective CME.5-7 Regardless of whether physicians are involved in conferences or reading that prepares them for change, workshops or demonstrations enabling change at 
the practice site, or audit with feedback and reminders to evaluate patients' progress, it is important for physicians to recognize the need to change their behavior, knowledge base, or skills.7 Irrespective of hospital- or office-based practice, primary or specialty care, a change in physicians' knowledge or skills was associated with an identified reason for the change prior to its implementation.5 Physician performance improved when learning experiences incorporated tests of knowledge and assessments of clinical practice needs.7 Physician-learners progress at their own rates, depending upon their motivation, their knowledge of a problem, or the perception of a gap between current knowledge and skills and those that are desired. When gaps are demonstrated and educational resources are extended strategically to help the learner, change occurs more frequently within each type of intervention.5 A variety of tools is available to help physicians and CME providers determine learning needs. Most medical specialty boards in the United States offer written or oral examinations, or both, of knowledge.10 Such tests present excellent opportunities for physicians to assess their knowledge against facts and principles that inform essential clinical decisions. Benchmarking11 is a tool for physicians to compare their personal performance with standards of excellence demonstrated by top performers in a peer group. This approach to assessment has been shown to enhance the effectiveness of physician performance in ambulatory care.12 Utilization review provides institutional information to make comparisons based upon hospital admission rates, mortality and morbidity rates, and medical error rates. 
Such data may be used before and after educational interventions to judge success in promulgating change.13 Personal learning portfolios describe significant learning events,14 enabling physicians to assess—on an ongoing basis—the questions they find important to answer in maintaining competent clinical performance. The Accreditation Council for Graduate Medical Education (ACGME) lists self-assessment tools for use by practicing physicians as they contemplate ethics, professionalism, and practice-based learning and improvement.15 Physicians can search the ACGME Web site to learn about the validity, feasibility, and psychometric characteristics of selected self-assessment tools. Descriptions of who has used each instrument, how many times, and in what settings also can be found.16 The Change Readiness Inventory (CRI),17 developed from analysis of 775 changes described by 340 North American physicians,18 may be used by CME providers to give physicians a voice in the development of efforts that may facilitate changes in their clinical performance. Reasons to change, such as regulations or clinical advances, and barriers to change, such as low motivation, lack of time, or lack of proper equipment in systems of health care, can be discovered with the CRI. Continuing medical education providers can improve the prospects for change by helping physicians integrate systematic quality improvement efforts with CME, including the assessment of need and evaluation of progress toward clinical goals.11,12 Interactive Learning and Opportunities to Practice Two-way communication maintained over time enables the convergence of ideas between CME teachers and physician-learners. 
Adding enabling strategies such as patient education materials or reminders can help facilitate change at the practice site.3,5 While lectures, conferences, and short courses may predispose physicians toward change, didactic lectures by themselves do not play a significant role in immediately affecting physician performance or improving patient health care.2,3,5 Educational activities that use interactive techniques such as case discussion or hands-on practice sessions generally are more effective in changing behavior and patient outcomes.3 Interactive workshops can result in changes to knowledge or skills; didactic sessions alone are unlikely to change professional practice.2,3,5,7 Sequenced and Multifaceted Activities Continuing medical education strategies designed to use 2 or more interventions can lead to change in practice.3-5 For example, physicians provided with educational material on the measurement of survival probabilities and the cost of intensive care, followed up by a bedside display of probabilities, reduced test ordering.19 Physicians who received educational material for patients, a reminder to offer them nicotine gum, and a 4-hour training session on counseling for smoking cessation advice experienced higher rates of success helping patients to stop smoking at 1 year.20 Mailed materials, follow-up telephone calls, and presentations at primary care meetings caused a significant decrease in inappropriate referrals and increased appropriate referrals to otolaryngologists.4,21 Physicians should choose educational activities with clear goals and the opportunity to progress incrementally toward achievement of those goals. Outcome Evaluation Apart from specialty certification or recertification, the type of progress measured formally by graduation from undergraduate and graduate medical education does not exist when graduate medical education is completed. 
Each physician monitors his or her own learning, managing the design of its continuity and effects. Changes in clinical behavior can be accomplished and measured through chart audit with feedback.4,5,7 The strategy is central to continuous quality improvement in health care.11 Constructs of the strategy may be found in the Bi-Cycle Approach to Quality Assurance, described by Brown22 as an outer patient and health care cycle and an inner change or education cycle. Outcomes may be measured by improved patient compliance with selected regimens5,7,23 or reduced numbers of inappropriate hospital stays.5,7,24 Two instruments shown to be effective in tracking change in physician behavior and functional outcome improvement for patients include the Karnofsky Performance Status Scale,25 which enables the monitoring of patients whose performance may vary from able to carry on normal activity to unable to work or unable to care for self, and the Short Form-36,26 which enables assessments of physical and emotional well-being. Questions regarding truth in measurement and methods for assessing outcomes continue to challenge CME and health care professionals. Randomized controlled trials may continue enabling physicians and CME providers to examine the effects of learning and the performance of clinical behaviors, but minor clinical actions often have major consequences remote in space and time.11 There are some who believe that randomized controlled trials take too long and that cohort and case-control designs are more appropriate to the practicalities of quality improvement studies.11 The Cochrane Effective Practice and Organisation of Care group27 recently accepted a new protocol for a systematic review of the best evidence on continuous quality improvement and its effects on professional practice and patient outcomes. Conclusions Traditional CME is a time-based system of credits awarded for attending conferences, workshops, or lectures. 
The activities are typically teacher-initiated, using passive educational models (eg, lecture). Recent studies14,28 suggest that physicians benefit from reflection on their progress and development of their next learning projects or questions. What can physicians do? Physicians should reconsider the view that CME consists solely of lectures, grand rounds, or medical staff meetings. They should participate in educational activities that offer personal involvement in thinking about professional practice and in identifying learning needs. To achieve its greatest potential, CME must be truly continuing, not casual, sporadic, or opportunistic.1 Physicians must recognize the ongoing opportunities to generate important questions, interpret new knowledge, and judge how to apply that knowledge in clinical settings. Essentially, this means that CME must be self-directed by the physician, including management of the content of and context for learning. In turn, the opportunities for self-directed learning must enhance the knowledge and skills required for critical reflection on practice and measurement of improvement.

References

1. Houle CO. Continuing Learning in the Professions. San Francisco, Calif: Jossey-Bass; 1980:266.
2. O'Brien MA, Freemantle N, Oxman AD, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes [Cochrane Review]. Oxford, England: Cochrane Library, Update Software; 2002; issue 1.
3. Davis D, O'Brien MA, Freemantle N, et al. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282:867-874.
4. Davis DA, Taylor-Vaisey A. Translating guidelines into practice: a systematic review of theoretic concepts, practical experience and research evidence in the adoption of clinical practice guidelines. CMAJ. 1997;157:408-416.
5. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-705.
6. Oxman AD, Thomson MA, Davis DA, Haynes RB. No magic bullets: a systematic review of 102 trials of interventions to improve professional practice. CMAJ. 1995;153:1423-1431.
7. Davis D, Thomson MA, Oxman AD, Haynes RB. Evidence for the effectiveness of CME: a review of 50 randomized controlled trials. JAMA. 1992;268:1111-1117.
8. Grol R. Improving the quality of medical care: building bridges among professional pride, payer profit, and patient satisfaction. JAMA. 2001;284:2578-2585.
9. Davis DA, Lindsay EA, Mazmanian PE. The effectiveness of CME interventions. In: Davis DA, Fox RD, eds. The Physician as Learner: Linking Research to Practice. Chicago, Ill: American Medical Association; 1994:245-280.
10. Norcini JJ. Recertification in the United States. BMJ. 1999;319:1183-1185.
11. Berwick DM. Developing and testing changes in delivery of care. Ann Intern Med. 1998;128:651-656.
12. Kiefe CI, Allison JJ, Williams OD, et al. Improving quality improvement using achievable benchmarks for physician feedback: a randomized controlled trial. JAMA. 2001;285:2871-2879.
13. Soumerai SB, McLaughlin TH, Gurwitz JH, et al. Effect of local medical opinion leaders on quality of care for acute myocardial infarction: a randomized controlled trial. JAMA. 1998;279:1358-1363.
14. Campbell CM, Parboosingh J, Gondocz T, et al. Study of the factors influencing the stimulus to learning recorded by physicians keeping a learning portfolio. J Cont Educ Health Prof. 1999;19:16-24.
15. ACGME Outcome Project. Practice-based learning and improvement: assessment approaches. Available at: http://www.acgme.org/outcome/assess/PBLI_Index.asp. Accessibility verified May 9, 2002.
16. ACGME Outcome Project. Available at: http://www.acgme.org/Outcome/. Accessibility verified May 29, 2002.
17. Fox RD. Theory and practice in continuing professional development. J Cont Educ Health Prof. 2000;20:238-246.
18. Fox RD, Mazmanian PE, Putnam RW. A theory of learning and change. In: Fox RD, Mazmanian PE, Putnam RW, eds. Changing and Learning in the Lives of Physicians. New York, NY: Praeger Publishers; 1989:161-175.
19. Pollack MM, Getson PR. Pediatric critical care cost containment: combined actuarial and clinical program. Crit Care Med. 1991;19:12-20.
20. Wilson DM, Taylor DW, Gilbert JR, et al. A randomized trial of a family physician intervention for smoking cessation. JAMA. 1988;260:1570-1574.
21. Benninger MS, King F, Nichols RD. Management guidelines for improvement of otolaryngology referrals from primary care physicians. Otolaryngol Head Neck Surg. 1995;113:446-452.
22. Brown CR. The continuing education component of the bi-cycle approach to quality assurance. In: Egdahl RH, Gertman PM, eds. Quality Health Care: The Role of Continuing Medical Education. Germantown, Md: Aspen Systems Corp; 1977:11-18.
23. Maiman LA, Becker MH, Liptak GS, et al. Improving pediatricians' compliance-enhancing practices. AJDC. 1988;142:773-779.
24. Restuccia JD. The effect of concurrent feedback in reducing inappropriate hospital utilization. Med Care. 1982;20:46-62.
25. Schaafsma J, Osoba D. The Karnofsky Performance Status Scale re-examined: a cross-validation with the EORTC-C30. Qual Life Res. 1994;3:413-424.
26. Ware JE Jr. SF-36 health survey update. Spine. 2000;25:3130-3139.
27. The Cochrane Effective Practice and Organisation of Care Group (EPOC). Aberdeen, United Kingdom: University of Aberdeen; 2001.
28. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226-235.

JAMA, American Medical Association

Published: Sep 4, 2002
