Lessons learned from piloting mHealth informatics practice curriculum into a medical elective

Journal of the American Medical Informatics Association, Oxford University Press. Published April 1, 2018. doi: 10.1093/jamia/ocx076

Abstract

Introduction: This case study reports the development and delivery of an mHealth elective piloted for first-year undergraduate medical students at Monash University (Australia) and the lessons learned by its designers.

Results: The students were not as adept at using mHealth devices as the literature had predicted. Expert speakers who used mHealth in practice perceptibly engaged students. Force-field analysis was a useful basis for devising end-user evaluative research tools for practice. Combining small- and large-group discussions with eLearning discussions promoted student engagement with new concepts and their associated jargon. Assessment by mHealth informatics champions supported the students’ independent learning.

Lessons learned: Promotion of an mHealth curriculum must be transparent and clear. Our elective delivery was hampered by a lack of suitable mobile device ownership and by the limited availability of useful, free apps. Technological jargon required clarification. Educators require particular mHealth informatics and educational expertise to support mHealth pedagogies. This learning helps to prepare medical curriculum designers to address evolving mHealth practice horizons.

Keywords: curriculum innovation, health informatics education, medical education, telehealth, telemedicine

INTRODUCTION

Despite the growing presence of mobile devices (mHealth) and medical applications (apps) in the health care workplace, designers of medical education curricula have largely overlooked these aspects of health practice.1 Medical registration requirements and changing legislation necessitate the inclusion of health informatics components in the curriculum. These pressures have intensified recently, growing increasingly urgent in the face of Australian legislation and culminating in the Australian Medical Association’s support of government practice assessments that evaluate information handling operations.2,3 Evidence suggests that health care curricula, particularly in medicine courses, are overcrowded, working against the explicit inclusion of eHealth, let alone mHealth, informatics components.4,5

Assessments of mHealth quality of patient care outcomes in the literature are complex, with most claims based on patchy, understudied, and inconsistent data.6,7 The literature also indicates that real-time medical communication, improved care outcomes, and patient health gains signify the worthiness of the mHealth patient-benefit goal.6–8 Most public health patients and their physicians already rely upon mHealth informatics for care, regardless of their competency; the application of mHealth in the private sector appears to be less consistent.7 Medical students seem adept at using mobile apps, although this does not necessarily translate into comfort using mHealth for practice in real life.1 The literature suggests that universities need to educate and train medical students to use this technology to improve the quality of patient care.
Such training can provide a foundation for medical students to develop comfort and confidence in using the technology in care settings after they graduate and go on to practice as doctors.1–3,9,10 Indeed, a Monash (Australia) study shows that the majority of undergraduate medical students plan to use mHealth applications to improve the quality of patient care outcomes after graduation, although they receive no formal training during their medical education.11 The scope of practice in the Australian Curriculum Framework for Junior Doctors suggests that we should expect medical education to link patient outcomes to current physician competence in mHealth.5,13 Therefore, it is necessary to develop strategies that address this learning deficit.

This study documents our attempt to embed mHealth informatics into current medical curricula. To address the mismatch between our curricula and reality, a single-semester elective option, “Computer Games and Applications for Health and Wellbeing,” was introduced into the first year of a medical course at Monash University. This student-centered, experiential elective aimed to allow students to acquire and develop skills in using devices and mHealth apps framed in a clinical context.

Study design

This qualitative case study analyzes the process of embedding mHealth informatics into an undergraduate elective program, part of the Year 1 medical curriculum. The faculty-educator worked through the Gibbs model of self-reflection with a faculty-mentor for guidance during elective delivery.12 The reflections, completed after every elective session, collated data from educator self-assessment notes, observed student body language and participation, and the outcomes of straw polls of students. This approach is well suited to exploring and sustaining rapid change processes while a curriculum evolves.12 University human ethics approval was obtained for this study.

mHEALTH ELECTIVE DESIGN AND DELIVERY

The Monash undergraduate bachelor of medicine and bachelor of surgery is a hybrid problem-based learning curriculum arranged in themes.13 About 300 first-year students were required to take 1 elective, selected from a range of options, during the second semester. Approximately 20 electives were advertised on the Learning Management System (LMS): painting, indigenous culture, medical humanities, surgical anatomy, music, mental health first aid, the science of sleep, and others. The students studied independently for 4 h and attended 2 h of in-class delivery. The electives ran for 10 weeks, as 2-week clinical placement opportunities occurred during the same semester. To pass Semester 2, students were required to successfully complete 1 elective.

The mHealth elective covered themes spanning the first-year medical education, including knowledge management, critical thinking, and professional behavior. The cohort of 15 students who enrolled in the elective was organized into small groups of 3. Each student group selected 1 category of mHealth application, such as color blindness tools, to evaluate. Choices were limited to free, open-source applications on tablets or smartphones that could be connected to the university system and the Internet from the classroom.
Learning outcomes

Learning outcomes (LOs) for the mHealth component were informed by feedback from students, consultation with colleagues, familiarity with the relevant educational and health informatics literature, medical registration and regulatory expertise, and professional competence.14–16 Literature discussing mHealth curricula often pointed to a need for an initial health and mobile technology skills assessment and follow-up training sessions for students; this requirement is borne out by overseas experience.16,17 The Australasian College of Health Informatics membership made several suggestions for meaningful LOs on their e-mail forum, and members often used their own professional networks to support the design and development of the elective.14,15 The final LOs designed by faculty for this elective drew on all of this feedback and are illustrated in Figure 1.

Figure 1. Learning outcomes.

The syllabus

The mHealth syllabus was published on the LMS and in the unit guides that supported each elective (Supplementary Appendix 1). The unit guides facilitate a contextual understanding for students, support their engagement, and ensure clarity. The titles, focus, and types of classes for the elective are shown in Table 1.

Table 1. Elective session topics and delivery

Session title | Focus | Type
1. Introduction to course and learning outcomes | Explore screenshots and videos | Small-group discussion; technical skills quiz
2. Identify and critique evaluation tools | Apply learner knowledge and experience of mHealth to practice scenarios | Small-group discussion; force-field analysis
3. Devise small-group evaluation tools | Guest presentation; discuss force-field analyses and workshop student-developed tools | Small-group discussion; devise evaluation tool
4. Computer games for health and wellness | Explore screenshots and videos | Workshop; large-group discussion
5. Computer games for health and wellness | Synthesize information and analyze own Internet-based practice | Small-group problem solving with evaluation tool and own devices
6. Smartphone apps for health and wellness | Explore screenshots and videos | Workshop; large-group discussion
7. Smartphone apps for health and wellness | Synthesize information and analyze own smartphone practice | Small-group problem solving with evaluation tool and own devices
8. Social networking for health and wellness | Guest presentation; explore screenshots and videos | Workshop; large-group discussion
9. Social networking for health and wellness | Synthesize information and analyze own social media practice | Small-group problem solving with evaluation tool and own devices
10. 3D applications for health and wellness | Explore virtual worlds in health care; usability analysis | Large- and small-group discussion; technical skills quiz
11. Group presentations* | Present small-group evaluation findings | Assessment: small-group presentations

*Small groups analyzed sessions of their choice for final presentations outside of the elective.
RESULTS

During the technical quiz in the first session, it became apparent that many students were not sophisticated end users of mHealth for practice or study. A straw poll indicated that they needed explicit, contextual explanations of mHealth terminology and jargon to meet the LOs. Small groups of students researched and devised a useful health IT glossary from which to launch their learning. The glossary included definitions for terms such as “cache” and “interoperable” and for medical informatics communication standards such as SNOMED CT. The glossary was uploaded to the LMS, so students were able to discuss, modify, and add useful definitions throughout the elective. (The glossary is located on a legacy LMS, so we are unable to show an example here.)

The first session of the elective was modified on the fly when a smartphone discussion flowed from the technical quiz. The students could use smartphones for social media and the university LMS, but evaluating apps did not seem possible to them. Their smartphones were old and had been superseded by much newer models. No student owned a computer tablet, although 2 owned laptop computers. While we toured some pertinent mHealth apps together for group discussion, the students argued that the screen space on their mobile phones was too small for useful evaluation. A set of iPad tablets was therefore borrowed from another department in the faculty and loaned to students for use during the elective.

In week 3, students commenced designing evaluative tools, using force-field analysis to conduct end-user assessments of mHealth apps. Force-field analysis weighs the reasons for and against the phenomenon in question. The analysis assessed the measures embodied in student tools to ensure that these moved toward specific goals. The educator led discussions about construct validity and whether the students’ tools were actually measuring the constructs they had decided upon in small groups. We also examined whether each measure related appropriately to the others embodied in their tools. Students reported extended use of these instruments to analyze other apps they had discovered in clinical settings across the semester.
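To make this step concrete, the sketch below tallies weighted driving and restraining forces for a candidate app, in the spirit of the force-field analyses described above. It is a minimal illustration of ours: the force descriptions, weights, and example app are hypothetical and are not taken from the students’ actual instruments (a representative student tool appears in Figure 2).

```python
# Minimal sketch of a force-field tally for an mHealth app evaluation.
# All forces, weights, and the example app are hypothetical illustrations,
# not the instruments the students built.
from dataclasses import dataclass


@dataclass
class Force:
    description: str
    weight: int  # 1 (weak) to 5 (strong), assigned by the end-user evaluator


def net_force(driving: list[Force], restraining: list[Force]) -> int:
    """Positive totals favor adopting the app for the stated goal."""
    return sum(f.weight for f in driving) - sum(f.weight for f in restraining)


# Hypothetical evaluation of a color blindness screening app.
driving = [
    Force("Screening steps match the clinical practice guideline", 5),
    Force("Runs offline on a ward tablet", 3),
]
restraining = [
    Force("Phone screen too small to render test plates accurately", 4),
    Force("No published validation against standard plate tests", 5),
]

print(net_force(driving, restraining))  # -1: restraining forces dominate
```

A tally of this kind makes the trade-offs explicit before the richer comparison, described next, against end-user expectations and guideline-derived gold standards.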
The evaluative tools compared app measures for fitness of purpose against both the students’ own expectations as end users and the gold standard health management measures articulated in clinical practice guidelines. Figure 2 illustrates a representative sample of the student-constructed evaluative tools applied during the elective.

Figure 2. Sample evaluative tool.

Two dedicated sessions were delivered by expert speakers. In week 3, a final-year student discussed the application of telehealth and telemedicine to practice reality, using the preliminary results of his research. The week 8 session, on the application of mobile social networking to practice, was led by a physician who is also an academician and mHealth informatics champion (Supplementary Appendix 2). Student engagement was palpable during these sessions; the educator observed positive body language, lively discussion, and attentive listening.

Student teams took turns presenting and leading class discussions about their work every fortnight. By the time of the oral technical skills quiz in week 10, our discussions showed a greater level of analytical detail, combining medical concepts with mHealth concepts. More nuanced discussions occurred over time, and students grew more comfortable using mHealth and other IT terminology, as shown in Figure 2. An ongoing feedback process during in-class presentations allowed students to recalibrate and improve their practices.18 They could consider whether their work was successful and ensure that everything was on track to meet LO expectations, while having sufficient time to ameliorate concerns. Facilitating student discussions required higher-order thinking from the educator later in the elective than it had earlier.

During the last session, each group presented its evaluations of mHealth tools for practice to invited assessors. The final student presentations were:

- Gaming and Addiction
- 3D Anatomy Apps
- Medical Smartphone Apps
- The Usability of Online Brain-Training Apps
- Investigating and Critiquing the Relationship Between Electronic Games and the Onset of Dementia

Assessors included faculty members, academicians, and external health informatics experts, who provided summative assessments in accordance with the rubric depicted in Figure 3.

Figure 3. Assessment rubric.

Faculty evaluation

Assessment, during educator self-review with mentors, of the faculty resources needed for the elective yielded useful data. Developing and delivering the elective required tailoring to real-life learning and teaching facilities in a financially constrained context. As a corollary, funding was not sufficient to allow the purchase of most serious games and apps for student review, limiting those that could be used in the program. Ultimately, educator research funds provided students with a small sum of money for the purchase of useful apps.

As the student presentation sessions commenced, we needed to devise ways to connect iPads on the Macintosh platform to personal computers on the Windows platform in classrooms and on IT networks, because Windows and Macintosh are not interoperable without third-party apps. Sometimes a smartphone or laptop computer based on yet another platform needed to be connected to an iPad or personal computer to facilitate large-group discussions and presentations.
We therefore needed to display presentation files in various formats and combine numerous platforms concurrently during sessions, which frequently required the educator to devise on-the-fly workarounds.

Faculty evaluative data did not identify specific responses for individual electives. However, the educator reviewed all aspects of the elective with her mentor weekly, in accordance with the Gibbs model. The reviews considered issues such as the title of the elective, the expectations of students, their motivations as articulated in class, and the gender imbalance in enrollment; only 1 female student selected the elective. We also reviewed the curriculum design and the assessment approach. Qualitative data were gathered from invited speakers and assessors. The conclusions drawn from this evaluative process are limited, but nevertheless valuable for planning future mHealth medical curricula.

DISCUSSION OF LESSONS LEARNED

Designing and implementing the “Computer Games and Applications for Health and Wellbeing” elective, underpinned by a structured self-reflection process, yielded several helpful lessons. Our key lessons concern promoting the program, using technological terms, addressing mobile device ownership, devising simple end-user evaluative tools using force-field analysis, growing students’ confidence in their presentation and communication skills, harnessing experts, and tailoring the syllabus for real life in a financially constrained university context. They are discussed and illustrated in Table 2.

Table 2. Key lessons learned

Topic: Promotion
Discussion: The term “computer games” was embedded in the elective title to appeal to students. Discussion in the first session showed the title was misleading: some students expected to play games rather than seriously explore games and apps relating to practice. Enrollment was also gender biased (1 of 15 students was female).
Lesson: Construct clear and transparent titles to avoid such misconceptions; this may also help redress gender bias in enrollment.

Topic: Terminology
Discussion: Designers assumed a high level of mHealth skill among students, an assumption disproved during the week 1 technical quiz. Extensive explanation and clarification of mHealth informatics jargon and terminology was required. The educator adopted an action learning approach, facilitating a glossary on the LMS that students could create, examine, and refresh as needed.
Lesson: Include these elements at the beginning of the program design process.

Topic: Device ownership
Discussion: Students reported that their smartphone screens were not large or clear enough to assess mobile apps; a faculty loan of devices resolved the issue.
Lesson: Borrow suitable devices from faculty, or approach business regarding devices for students.

Topic: Force-field analysis
Discussion: Small- and large-group force-field discussions enabled students to see the big picture.
Lesson: Prepared students for rigorous research design.

Topic: Skills confidence
Discussion: Educator observations and technical quiz results provided evidence of student confidence increasing during the elective, and improved student communication and presentation skills were demonstrated. Formal educator self-reflection supported the interactive peer feedback provided to students during presentation and large-group discussion activities.
Lesson: Maintain an ongoing feedback loop in which students identify issues and understand the actions resulting from in-class discussions.

Topic: Experts
Discussion: Including experts ensured suitably constructed LOs that reflected practice reality, identified content that was meaningful to students, supplied robust assessors with key expertise, and allowed students to see their future selves reflected in the experts.
Lesson: Built links, enhanced relationships, and provided a high profile with the faculty.
Topic: Faculty
Discussion: The faculty needs an educator with technical and educational health informatics and mHealth expertise; the technical expertise includes strong knowledge of platforms and systems and the capability to link a range of platforms into the university IT system and the Internet.
Lesson: Demands technically and educationally qualified faculty-educators.

Topic: Budget
Discussion: The budget was tight; self-reflection was a critical supporting component underpinning our adjustments during elective delivery.
Lesson: Have a flexible educator, open to on-the-fly adjustments.

CONCLUSION

Future preparations to embed mHealth into the medical curriculum will require more planning and research than our mHealth elective pilot, which was opportunistic.
The first-year elective program offered an opportunity to use the existing course structure, but it limited the sustainability and inclusion of mHealth in the curriculum. Experience in the design and delivery of this elective has informed content for knowledge management, which is a core component of the current undergraduate medical curriculum at Monash. We also plan to embed mHealth into problem-based learning scenarios, such as a scenario in which an app is used to manage a specified patient condition. We hope that our experience fosters further robust academic exploration of this domain to meet evolving legal and medical scope-of-practice curricula.

ACKNOWLEDGMENTS

Dr Chris Bain, Dr Kaihan Yao, Mr Mick Foy, Ms Nicole Peeters, and the Australasian College of Health Informatics members and fellows contributed to the design and delivery of the mHealth elective component.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sector.

Competing interests

The authors have no competing interests to declare.

Authorship contribution

Equal co-authorship, that is, 50% JF and 50% JL.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.

References

1. Otto A, Kushniruk A. Incorporation of medical informatics and information technology as core components of undergraduate medical education: time for change! Stud Health Technol Inform. 2009;143:62–67.
2. Department of Human Services. Practice Incentives Program (PIP), Medicare [Australian Government web page]. http://www.medicareaustralia.gov.au/provider/incentives/pip/. Accessed October 26, 2014.
3. Australian Medical Association (AMA). OAIC to assess use of PCEHR by practices. https://ama.com.au/gpnn/oaic-assess-use-pcehr-practices. Accessed October 27, 2014.
4. Gray K, Dattakumar A, Maeder A, Butler-Henderson K, Chenery H. Advancing Ehealth Education for the Clinical Health Professions: Final Report. Sydney: Australian Government Department of Education, Office for Learning and Teaching; 2014. http://clinicalinformaticseducation.pbworks.com/w/page/37009016/Clinical%20Informatics%20Education. Accessed June 3, 2015.
5. Confederation of Postgraduate Medical Councils (CPMC). Australian Curriculum Framework for Junior Doctors. 2011. http://curriculum.cpmec.org.au/. Accessed February 9, 2017.
6. van der Vaart R. Development of the digital health literacy instrument: measuring a broad spectrum of Health 1.0 and 2.0 skills. J Med Internet Res. 2017;19(1):e27.
7. World Health Organization. mHealth: New Horizons for Health Through Mobile Technologies. Global Observatory for eHealth Series, Vol. 3. 2011. http://www.who.int/goe/publications/goe_mhealth_web.pdf. Accessed January 27, 2017.
8. Coiera E, Westbrook J. Should clinical software be regulated? Med J Aust. 2006;184(12):600–01.
9. Fernando J. Clinical software on personal mobile devices needs regulation. Med J Aust. 2012;196(7):437.
10. Watson M, Jolly B. The future of Australian medical education: a focus on technology. Med J Aust. 2012;(1 Suppl 3):26–28.
11. Koehler N, Yao K, Vujovic O, McMenamin C. Medical students’ use of and attitudes towards medical applications. J Mobile Technol Med. 2012;1(4):16–21.
12. Gibbs G. Learning by Doing: A Guide to Teaching and Learning Methods. Further Education Unit. Oxford: Oxford Polytechnic; 1988. https://thoughtsmostlyaboutlearning.files.wordpress.com/2015/12/learning-by-doing-graham-gibbs.pdf. Accessed July 29, 2017.
13. Monash University. MED1011 Medicine [handbook entry]. https://www.monash.edu.au/pubs/handbooks/units/MED1011.html. Accessed February 26, 2017.
14. Australasian College of Health Informatics (ACHI) [splash page]. 2013. http://www.achi.org.au. Accessed June 3, 2016.
15. Fernando J. Personal web page. 2013. http://users.monash.edu.au/~juanitaf/. Accessed June 3, 2016.
16. Berglund M, Nilsson C, Révay P, Petersson G, Nilsson G. Nurses’ and nurse students’ demands of functions and usability in a PDA. Int J Med Inform. 2007;76(7):530–37.
17. November N, Day K. Using undergraduates’ digital literacy skills to improve their discipline-specific writing: a dialogue. Int J Scholarship Teaching Learn. 2012;6(2):Article 5.
18. Molloy E, Boud D. Changing conceptions of feedback. In: Boud D, Molloy E, eds. Feedback in Higher and Professional Education. London: Routledge; 2013:11–33.

© The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
