TY - JOUR
AU - Bailey, Thomas
AB - Objective: Audio-enhanced computer-assisted self-interviews (ACASIs) are useful adjuncts for clinical care but are rarely integrated into the electronic health record (EHR). We created a flexible framework for integrating an ACASI with clinical decision support (CDS) into the EHR. We used this program to identify adolescents at risk for sexually transmitted infections (STIs) in the emergency department (ED). We provide an overview of the software platform and qualitative user acceptance.
Materials and Methods: We created an ACASI with a CDS algorithm to identify adolescents in need of STI testing. We offered it to 15- to 21-year-old patients in our ED, regardless of ED complaint, and collected user feedback via the ACASI. The questionnaire and CDS logic were programmed into REDCap (Research Electronic Data Capture), and an iOS application utilizing Apple ResearchKit generated a tablet-compatible representation of the ACASI for patients. A custom software program created HL7 (Health Level Seven) messages containing a summary of responses, CDS recommendations, and STI test orders, which were transmitted to the EHR.
Results: In the first year, 1788 of 6227 (28.7%) eligible adolescents completed the survey. Technical issues led to decreased use for several months. Patients rated the system favorably, with 1583 of 1787 (88.6%) indicating that it was "somewhat easy" or "very easy" to answer questions electronically and 1291 of 1787 (72.2%) preferring this format over face-to-face interviews or paper questionnaires.
Conclusions: We present a novel use for REDCap, combining patient-answered questionnaires and CDS to improve care for adolescents at risk for STIs. Our program was well received, and the platform can be used across disparate patients, topics, and information technology infrastructures.
Keywords: emergency care, sexually transmitted diseases, electronic health record, clinical decision support, REDCap

INTRODUCTION
The ubiquity of electronic health records (EHRs) provides new opportunities to implement clinical decision support (CDS) tools at the point of care. One such opportunity is applying CDS to information obtained from patients via electronic questionnaires. Computer-assisted self-interviews (CASIs), self-driven questionnaires, and audio-enhanced CASIs (ACASIs) have been documented in the literature as methods that facilitate obtaining accurate, sensitive social information from adolescent and adult patients in a variety of clinical settings.1-4 Despite these reports, the penetration of these methods into clinical care remains limited. In the pediatric population, sustained implementations of systems using patient-reported outcomes (PROs), including some that combine PROs with CDS, have been reported.5-7 The best known of these is PROMIS (Patient-Reported Outcomes Measurement Information System), which uses validated methods to measure quality of life in several domains. However, these systems are proprietary, limiting the ability to adapt them for use in new settings. While some EHRs can deliver patient questionnaires, the tools to build and deliver them are contained within the EHR itself, which may limit their use and adaptability because EHR deployments vary between institutions. Direct modifications to the EHR can be costly, and CDS tools developed for use within one institution may require modification to fit the workflow of another.
We have previously demonstrated in a pilot study the ability of an ACASI with integrated CDS to facilitate screening for sexually transmitted infections (STIs) among adolescents in the pediatric emergency department (ED).8 The software used in our pilot study was designed for use by research staff rather than for sustainable use in a clinical setting, but it did provide information in real time to ED clinicians to identify adolescents who might need STI testing. With the success of our pilot, we endeavored to develop a software platform that could be integrated into our ED workflow, provide EHR integration, and potentially be adapted for other clinical domains and environments. The objective of this article is to describe the electronic framework we developed, the ED workflow used for our program, the challenges we faced, and the feedback we received during implementation.

MATERIALS AND METHODS
Setting, inclusion and exclusion, patient confidentiality, and institutional review board
Our pediatric ED is a level I trauma center and tertiary referral center with approximately 50 000 patient visits per year. The hospital is located in an area with a very high prevalence of STIs, with incidence of gonorrhea and chlamydia frequently ranking among the top 3 in the United States.9,10 The questionnaire was offered to all patients 15-21 years of age presenting for care in our ED. The intervention was intended as "opt-out," such that patients had to actively decline participation. Families and patients received a brochure that provided an overview of our program; however, the questionnaire content was not described in the brochure and was not disclosed by staff. Parents were not allowed to view the tablet and could be told about patient responses or the need for STI testing only if the patient consented. Our staff was instructed not to offer the questionnaire if a family member demanded to know the content of the questionnaire or the patient's responses. This approach was modeled after our pilot study8 and reflects Missouri state law, which allows minors to confidentially receive care for STIs without parental consent. Patients were not eligible if they were receiving care for sexual abuse or assault, were unable to independently use a tablet, or were deemed too ill to participate by the ED treatment team. Patients with repeat ED visits could take the survey multiple times. We recommended a 3-month interval before retaking the survey; however, we could not create EHR prompts to inform staff when the patient last completed it, so it was at the patient's discretion whether to complete the survey on subsequent visits. The Washington University Institutional Review Board (IRB) classified our intervention as "non-human subject research" not subject to IRB review, because we instituted the program as the standard of care in our ED, making it similar to a quality improvement initiative. Secondary use of these data does require IRB review.

Patient questionnaire
We developed a questionnaire designed to obtain a sexual history from adolescents receiving care in our ED, asking questions related to types of sexual intercourse, number of partners, prior testing for STIs, and prior positive STI tests.
The questionnaire was based on our pilot work, combining questions from the Centers for Disease Control and Prevention with those generated by an interdisciplinary group of content experts.8 The questionnaire had previously been tested through cognitive interviews of adolescents; we made minimal changes to it for this intervention and did not repeat this process. We developed a CDS algorithm to provide recommendations as to whether a patient should receive testing for gonorrhea or chlamydia or for human immunodeficiency virus (HIV) (Supplementary Appendix). The algorithm is based on existing recommendations from the Centers for Disease Control and Prevention10 and the American Academy of Pediatrics,11 as well as our pilot work. After the patient completed the questionnaire, our CDS tool provided 1 of 4 classifications for gonorrhea or chlamydia testing: (1) patient should be offered testing today; (2) patient has received sufficient testing but is at increased risk for STIs based on their sexual history; (3) patient does not need to be offered testing today; and (4) patient did not answer enough questions to provide a recommendation. For HIV testing, our CDS tool provided 1 of 3 classifications: (1) patient should be offered testing today; (2) patient has HIV, and thus testing is not indicated; and (3) patient did not answer enough questions to provide a recommendation. Patients who received a recommendation for immediate testing could electronically opt in to the test(s) using the tablet, which drove steps in the EHR, as described subsequently. Female patients could also request pregnancy testing through the questionnaire; those who requested pregnancy testing during the questionnaire were asked at the end to confirm their desire for testing and to opt in to pregnancy testing. We restricted our intervention to these infections based on national guidelines and recommendations from local experts in STIs. The decision to test for other infections in symptomatic patients was beyond the scope of our intervention and was determined by ED providers using local guidelines and our ED standard of care. Patients could provide phone numbers to facilitate notification of positive tests and creation of treatment plans after discharge. Those who provided an email address automatically received a message with a link to a hospital website with reproductive health information; email was not used for any other purpose. Questionnaire responses were integrated into the EHR, as described subsequently. If a guardian requests records, our medical records department reviews and removes sensitive information, such as information related to STIs, unless the patient has given consent.

REDCap
Our pilot study relied on Microsoft Silverlight and a Windows-based notebook computer and was designed for research purposes only.8 For this implementation we made substantial changes, employing a mobile-friendly format and a software architecture flexible enough to allow easy modification. Figure 1 provides a schematic overview of the interaction among the patient, software programs, and EHR.

Figure 1. Flow of information from patient to physician. EHR: electronic health record; HL7: Health Level Seven; REDCap: Research Electronic Data Capture.
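Before turning to the individual software components, the classification scheme from the Patient questionnaire subsection can be made concrete. The sketch below enumerates the gonorrhea or chlamydia CDS outcomes; the branching shown is a deliberately simplified placeholder of our own (the actual risk algorithm is in the Supplementary Appendix and was implemented as REDCap calculated fields rather than compiled code).

```csharp
// Sketch of the CDS output categories described above. The branching is a
// simplified placeholder, not the published risk algorithm.
public enum GcCtRecommendation
{
    OfferTestingToday = 1,     // (1) offer gonorrhea/chlamydia testing today
    TestedButAtRisk = 2,       // (2) sufficiently tested, but at increased risk
    NoTestingNeededToday = 3,  // (3) no testing needed today
    InsufficientAnswers = 4    // (4) too few questions answered to classify
}

public static class CdsSketch
{
    // Hypothetical inputs; the real questionnaire captures many more items.
    public static GcCtRecommendation ClassifyGcCt(bool? everHadSex, bool? testedRecently)
    {
        if (everHadSex is null || testedRecently is null)
            return GcCtRecommendation.InsufficientAnswers;
        if (!everHadSex.Value)
            return GcCtRecommendation.NoTestingNeededToday;
        return testedRecently.Value
            ? GcCtRecommendation.TestedButAtRisk
            : GcCtRecommendation.OfferTestingToday;
    }
}
```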
We used our university's instance of REDCap (Research Electronic Data Capture) to program our branch-logic questionnaire and CDS. REDCap is a widely available, secure web application for building and managing online surveys and databases.12 There is no fee for use at our institution, although using advanced functions requires paid support. We created our questionnaire within REDCap and used its calculated values functions to provide the CDS. See the Supplementary Appendix for the full questionnaire, risk evaluation algorithm, and calculated value equations used in REDCap to classify responses. REDCap provides considerable flexibility: users without extensive experience can easily modify questionnaires. Modifications to existing questions and answer choices can be made instantaneously via REDCap, as can changes to the calculated fields that provide the CDS. We used functionality within REDCap to automatically generate and send the email described previously to participants who provided an email address.

Status/post
We employed a software platform called Status/post to implement our ACASI on iPads (version 1.5, developed by Christopher Metts). This native iOS application calls functions provided by the REDCap application programming interface to download the survey design and the CDS logic defined in calculated fields. Changes made to the questionnaire in REDCap are easily pulled into Status/post using a simple refresh feature. The downloaded metadata are parsed to render the survey using Apple ResearchKit user interface elements.13 An advantage of this approach over the standard REDCap mobile app format is the presentation of surveys in a cleaner, more responsive, and more intuitive format that guides the patient, 1 question at a time. Additionally, as a native iOS application, the platform integrates with the iPad operating system to provide functionality that is either not possible or poorly supported through web-based applications on the iPad. This functionality includes wristband barcode scanning using the iPad's built-in camera for patient identification, an option for questions and answers to be read aloud, survey delivery and temporary on-device storage of survey results during network outages, and transmission of results when network connectivity is restored. The native architecture also allows survey results to be transmitted directly from the iPad both to REDCap, for storage and analysis, and to a custom software program we called "the Mediator," which formats the survey results and provides the CDS output for delivery to the EHR.

Mediator, HL7, and the EHR
The EHR used during this program was Wellsoft (Somerset, NJ). When 15- to 21-year-old patients were registered in the ED, the EHR generated a nursing order for an "Adolescent Health Questionnaire" that reminded ED staff to provide the questionnaire to patients. The Mediator is a custom software program developed in the C# programming language; its primary function is to generate an HL7 (Health Level Seven) message.
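To illustrate this step, the sketch below assembles a minimal pipe-delimited HL7 v2 ORU (result) message of the kind described next. The MSH/PID/OBX segment grammar follows the HL7 v2 standard, but the sender and receiver names, message profile, and field contents are illustrative assumptions of ours, not the Mediator's actual output.

```csharp
using System;
using System.Text;

// Minimal sketch of a pipe-delimited HL7 v2 ORU^R01 (result) message carrying
// a free-text questionnaire summary. All field values are illustrative.
public static class Hl7Sketch
{
    public static string BuildSummaryMessage(string accountNumber, string summaryText)
    {
        string ts = DateTime.Now.ToString("yyyyMMddHHmmss");
        var sb = new StringBuilder();
        // MSH: message header (sending/receiving application and facility,
        // timestamp, message type ORU^R01, control ID, HL7 version).
        sb.Append($"MSH|^~\\&|MEDIATOR|PED_ED|WELLSOFT|PED_ED|{ts}||ORU^R01|{Guid.NewGuid():N}|P|2.3\r");
        // PID: patient identification. PID-18 conventionally carries the visit
        // account number, which a receiving EHR can use to match the message
        // to the ED encounter (17 separators place the value in field 18).
        sb.Append("PID|1" + new string('|', 17) + accountNumber + "\r");
        // OBX: the paragraph-format summary of responses and CDS recommendations.
        sb.Append($"OBX|1|TX|ADOLQ^Adolescent Health Questionnaire||{summaryText}||||||F\r");
        return sb.ToString();
    }
}
```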
HL7 is widely used for transmitting text reports, and support for HL7-based electronic data exchange is now required.14,15 The Mediator received patient responses from Status/post coded identically to how they are stored in REDCap; it used the survey results to generate 2 separate HL7 messages, which were subsequently transmitted to our EHR:
1. A summary of patient responses in paragraph format that made it easy for clinicians to review the questionnaire responses (Supplementary Appendix). The EHR had a dedicated tab to review this summary.
2. Test orders for patients who received a recommendation for STI testing and consented via the ACASI to receive testing. The EHR generated the appropriate test for gonorrhea or chlamydia, HIV, and/or pregnancy when appropriate. Test orders were actionable by nursing staff without requiring review or an electronic signature from the ED provider; orders could be signed later without impairing test processing or results.
HL7 messages were deposited in a network folder that was scanned every 2 minutes by the EHR. The time from survey completion to patient responses and test orders appearing in the EHR was 1-3 minutes. We did not formally measure this with timestamps in our data collection; however, 1-3 minutes was an accurate representation of the time for results to appear during testing and, anecdotally, during implementation. The Mediator can also store patient answers in a database similar to that uploaded to REDCap, providing a redundant store of patient responses for both clinical and research use. When a completed survey was imported into the EHR, the EHR provided a track board notification to alert clinicians that the responses were available for review.

Software and hardware security
As with all iOS applications, installing Status/post on the iPads places the app in a "sandbox," a set of controls that limits the app's access to files, preferences, network resources, and hardware. This security measure also ensures that other apps installed on the same device do not have access to data collected by Status/post. Collected survey responses were sent to the Mediator and REDCap concurrently, each through Secure Sockets Layer-encrypted network connections, and each requiring that an authorized security token be submitted. Upon successful posting of a participant's responses to both endpoints, the responses were deleted from the on-device database. The Status/post application was additionally reviewed by the St. Louis Children's Hospital security team to ensure compliance with all facility and regulatory requirements. Incomplete sessions were cancelled and deleted if not completed within 20 minutes. We installed Status/post on Apple iPads. Each iPad was secured inside a rolling cart with an adjustable arm, providing mobility and security, and had a privacy screen that limited the viewable angle so the patient could use the tablet privately. Each patient received a pair of disposable earbuds so they could listen to the introductory video and use the audio assistance for questions without others in the room being able to hear.

Survey process
After launching the program, ED staff entered the patient's account number into Status/post by either (1) utilizing the rear-facing iPad camera to scan the barcode on the patient's wristband, which auto-populated the 11-digit patient account number, or (2) manually entering the number, as sketched below.
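A hypothetical sketch of that entry step follows; the 11-digit rule comes from the account number format described here, but the names and structure are ours, as Status/post's source is not published.

```csharp
using System.Text.RegularExpressions;

// Hypothetical sketch: whether scanned from the wristband barcode or typed by
// staff, the value should be an 11-digit visit account number before a survey
// session is started.
public static class AccountEntry
{
    private static readonly Regex AccountPattern = new Regex(@"^\d{11}$");

    public static bool IsValidAccountNumber(string input) =>
        input != null && AccountPattern.IsMatch(input.Trim());
}
```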
The account number is unique to the patient visit and was used by the EHR to match the HL7 message from the Mediator to the patient's ED visit. ED staff also manually entered the patient's date of birth and biologic gender into the program. The gender entered by ED staff was used to drive elements of the branch-logic questionnaire that depended on the patient's gender, such as pregnancy testing. While patients' self-reported gender was included in their questionnaire results, it was not part of the CDS tool.

Survey and program feedback
At the end of each survey, patients could evaluate the survey in a number of domains (Supplementary Appendix S1). They were asked several multiple-choice questions with Likert-type scales and were also given the opportunity to type open-ended feedback about the program. We created an online survey to which ED staff and providers could respond anonymously. They were asked on a 5-point Likert-type scale about their satisfaction with the program as a whole, with responses ranging from 1 (very dissatisfied) to 5 (very satisfied), and were asked for comments regarding what was and was not working well with the program. We also encouraged staff to provide feedback directly to the physician and nursing leadership overseeing the program.

Outcomes
We used descriptive statistics (mean, median, interquartile range) to characterize the feedback given by patients and ED staff. In June 2018, our hospital switched EHRs, and our program was paused until resources were available to integrate it into the new EHR. Thus, we present data from the first 13 months of implementation (May 2017 to May 2018). Eligible patients, reasons for exclusion, and nonparticipants were identified via information abstracted from the EHR, including nursing documentation. Nurses documented a patient as having "recently completed" the survey based on patient self-report. Illness severity was based on patients having an Emergency Severity Index triage acuity score of 1 or on nursing documentation. Abuse victims were identified based on the ED chief complaint or nursing documentation. Nurses documented a survey as "complete" when a patient agreed to take the questionnaire. We defined participants as those who had a completed survey in our EHR.

RESULTS
Implementation
During the 13-month period, 7128 patients 15-21 years of age visited our ED, of whom 6194 were eligible to participate. Among eligible patients, 1788 of 6194 (28.9%) completed the questionnaire. See Figure 2 for additional information. Over 50% of nonparticipants had no nursing documentation regarding survey status. This group comprises patients without completed surveys who met age, acuity, and chief complaint criteria, but for whom ED staff did not document a refusal or exclusion. It could therefore include patients from any exclusion or nonparticipant category, as well as instances in which the survey was simply not offered.

Figure 2. Eligible audio-enhanced computer-assisted self-interview patients. EHR: electronic health record.

Figure 3 provides a monthly breakdown of completed surveys and eligible patients. Peak usage occurred in the first 3 months. In November 2017, we began encountering software issues: Apple iOS operating system updates caused compatibility problems with Status/post.
These errors prevented surveys from being completed and occurred simultaneously with an increase in ED volume, leading to decreased utilization of the program. Survey completion reached a nadir in January 2018, when only 17 of 508 (3.3%) patients completed the ACASI, as our program was unavailable for most of the month. Updates to the Status/post software resolved the issue, after which utilization increased.

Figure 3. Audio-enhanced computer-assisted self-interview (ACASI) completion by month.

In 155 instances, ED staff documented that the patient had completed the ACASI, yet no questionnaire results were captured in the database. These may represent data lost to software or hardware difficulties not identified in advance of using the software program. While we were unable to delineate the etiology of lost survey responses, issues such as those noted above, as well as transient loss of the tablet's wireless connection and temporary outages of our institution's REDCap server, represent potential causes of missing survey results. Barcode scanning of patients' wristbands with the iPad camera demonstrated some inconsistencies: ED staff reported that positioning the patient's wristband to be scanned was awkward, so use of this feature was inconsistent. Overall, 815 of 1788 (45.6%) enrollments used the barcode scan, and 973 of 1788 (54.4%) used manual entry. The patient survey results and test recommendations could be found in the EHR in the same area where laboratory and radiograph results were available. Figure 4 provides a representative example of the summary of patient responses and testing recommendations available for review by clinicians within the EHR.

Figure 4. Example electronic health record summary of audio-enhanced computer-assisted self-interview responses and clinical decision support recommendations.

Unfortunately, we had 1 instance of a Status/post app update being put into production before testing was complete. This version did not properly handle creation of unique participant identifiers in REDCap for each user. This did not impact transmission of patient responses into the EHR, but it led to erasure of entries in the REDCap database. Fortunately, the Mediator functioned as a secondary store of patient responses, and we were able to recover all lost survey responses from it.

Program feedback
We received 28 responses from ED staff and physicians through our anonymous survey: 18 from nurses, 4 from emergency medical technicians (EMTs), 4 from physicians, 1 from a paramedic, and 1 in which the role was not given. We have approximately 55 nurses, 20 paramedics or EMTs, and 15 physicians who work primarily in our ED; this does not include physicians and nurses who only occasionally work in our ED and may have had minimal interaction with the program. Staff reviewed the program positively: 4 of 28 (14.3%) indicated that they were very satisfied, 16 of 28 (57.1%) were satisfied, 4 of 28 (14.3%) were unsure, 3 of 28 (10.7%) were dissatisfied, and 0 were very dissatisfied.
Table 1 summarizes the feedback provided by participants at the end of the ACASI. Most participants indicated the ACASI was short, easy to use, and easy to understand. Over 70% indicated they preferred the electronic format over other methods of answering these questions. While only 386 of 1787 (21.6%) patients indicated they made use of the audio component to have questions read aloud, over 80% of those who used the audio found it helpful. Of the 1787 participants, 551 (30.8%) chose to provide an email address and 1167 (65.3%) provided at least 1 phone number.

Table 1. Feedback from adolescents who used the audio-enhanced computer-assisted self-interview

Survey time (n = 1787)
  Very short: 736 (41.2)
  Somewhat short: 441 (24.7)
  Moderate: 439 (24.6)
  Somewhat long: 83 (4.6)
  Very long: 39 (2.2)
  Skipped^a: 49 (2.7)
Ease of answering on tablet (n = 1787)
  Very easy: 1406 (78.7)
  Somewhat easy: 177 (9.9)
  Moderate: 140 (7.8)
  Somewhat hard: 17 (1.0)
  Very hard: 8 (0.4)
  Skipped^a: 39 (2.2)
Ease of understanding language (n = 1787)
  Very easy: 1477 (82.7)
  Somewhat easy: 137 (7.7)
  Moderate: 113 (6.3)
  Somewhat hard: 13 (0.7)
  Very hard: 5 (0.3)
  Skipped^a: 42 (2.4)
How helpful were the headphones?^b (n = 386)
  Very helpful: 277 (71.8)
  Somewhat helpful: 42 (10.9)
  Moderate: 40 (10.4)
  Somewhat unhelpful: 4 (1.0)
  Very unhelpful: 9 (2.3)
  Skipped^a: 14 (3.6)
Comfort level in answering electronically (n = 1787)
  Very comfortable: 1102 (61.7)
  Somewhat comfortable: 256 (14.3)
  Neutral: 282 (15.8)
  Somewhat uncomfortable: 72 (4.0)
  Very uncomfortable: 30 (1.7)
  Skipped^a: 45 (2.5)
Preference for answering in the future (n = 1787)
  Computer: 1291 (72.2)
  Paper and pencil: 54 (3.0)
  Face to face: 226 (12.6)
  Skipped^a: 216 (12.1)

Values are n (%).
^a Patient selected the option indicating that they wished to skip the question.
^b Among those who indicated they used headphones to have the questionnaire read aloud.
Table 2 gives representative open-ended feedback provided by patients at the end of the ACASI and feedback given by ED staff members through the anonymous online survey. Adolescents provided a variety of responses, most of them brief and supportive. Specific comments related to appreciation of the interventions being offered, questions related to privacy with parents, and 2 instances of patients indicating they were uncomfortable answering the questions in the ED setting.

Table 2. Representative verbatim feedback from ED patients and staff

Adolescent patient feedback
  "I dislike that we have to answer these question although it’s not complicated because we can be honest and not worry about being judged on our sexual actions even if they aren't things we should do as adolescents."
  "I'm proud that this hospital offers us this in order to voice somethings we often keep to ourselves."
  "I think this is a great way to understand what's going on with kids my age especially like the question to see if it was okay to eat my parents know."
  "I'm glad that y'all have this for the teens because we do have a lot going on in this generation with sex."
  "Privately with a doctor would be better some teens would rather talk to a unbiased doctor rather then say they were gay or having sex on a computer their parents could read at anytime."
  "Why were my sexual preferences and history important today? It felt odd being asked if i have had anal sex when I'm here do to a car accident."
  "I don't feel some of those questions were necessary. It made me uncomfortable."

ED staff feedback
  "The multiple interruptions the pt receives while trying to complete the questionnaire make it quite difficult, The survery times out & the pt gets frustrated having to start again and does not want to complete it. Otherwise, I like the program and am excited about it."
  "I feel the patients like the way it it presented and I rarely have difficulty getting them to start it. Its the completion of it that becomes difficult."
  "Once we worked out the kinks, things have been going pretty smoothly. My only question is in terms of pt privacy. I've had parents speak for the kids and I would ask again to make the this is what the patient wants. Is it alright to have to parents leave the room in those cases?"
  "My concern is that I've had a few residents say they didn't ask sexual history questions because they knew they were doing the survey and I'm not sure that that is the goal nor is appropriate. I provided in the moment feedback but if others are reporting this it might need a little guidance."

ED: emergency department.
Staff members primarily used the online survey to provide suggestions or voice frustrations with the program, along with some positive feedback. Their concerns mostly related to process issues, such as patients being interrupted, talking with parents, and technical problems. One physician provided feedback that residents were not asking sexual history questions because patients were answering them on the ACASI, and wanted to ensure trainees were still discussing the topic with their patients.

DISCUSSION
To our knowledge, this is the first demonstration of a method for using REDCap as a platform to integrate patient-answered questionnaires into the EHR. In doing so, we were able to use our platform as a novel way to integrate CDS into the EHR, automate downstream actions using that CDS, and make results available in real time for clinical use. While REDCap is "specifically geared to support online or offline data capture for research studies and operations,"16 our effort demonstrates a new method to integrate patient data directly into the EHR without significant modifications to the EHR itself. This approach provides flexibility that may lend itself to widespread use. Constructing surveys in REDCap is straightforward and does not require significant experience. Using the calculated values functions allows CDS to be applied, giving the ability to generate recommendations based on patient responses. This function can be applied in any situation in which data entered into REDCap can be "scored" to provide a recommendation. This is also useful in situations in which PROs are of value, even if CDS is not required, as providing easy methods to capture these data is increasingly relevant for modern health care.17,18 This approach can also be used directly by clinicians to enter clinical data and receive recommendations that could be integrated into the EHR. In situations in which creating CDS in the EHR is limited, this platform could be used as an alternative. In the ED setting, this could be applied to scenarios such as deciding when a patient needs head computed tomography after blunt trauma19 or testing for a pulmonary embolus.20 We had some notable successes. Our platform allowed changes to the patient questions and answer choices to be made directly in REDCap. These changes were automatically propagated to Status/post without EHR modifications, greatly simplifying the change process. Larger changes, such as adding entirely new questions, did require updates to the Mediator that generated our HL7 messages; however, these changes did not require any modifications to the EHR and were performed rapidly. Our CDS tool was built entirely within REDCap, which allowed us to test it in REDCap and make changes without touching other parts of the platform, demonstrating the platform's flexibility.
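As an illustration of this calculated-field approach, a single REDCap calculated field can return a classification code like the 4-level gonorrhea or chlamydia recommendation described in the Methods. The field names below are hypothetical stand-ins, not fields from our actual instrument:

```
if([ever_sex] = '', 4,
  if([ever_sex] = '0', 3,
    if([tested_3mo] = '1', 2, 1)))
```

Nested if() expressions of this kind are standard REDCap calculated-field syntax, and the resulting code can travel with the record through the REDCap API to downstream consumers such as the Mediator.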
The Mediator worked without difficulty; when it received patient responses from Status/post, it was able to generate an HL7 message and transmit it to our EHR, Wellsoft, in essentially 100% of cases. While we have not tested integration with other EHRs, the standards that allowed us to integrate with Wellsoft are available in all modern EHRs. This manuscript does not discuss STI testing outcomes related to our program; the details of the information disclosed by patients and their STI testing outcomes will be explored in future manuscripts. We believe reporting the technical aspects of our platform as a precursor to outcomes reporting is necessary to frame the outcomes themselves.21 Interestingly, only about 20% of participants indicated they used the audio. In our pilot, over 50% of all participants found it "very helpful" or "helpful." As we recruited from a broader patient population, it is possible the literacy level was different, and comfort with electronic devices has changed over time as well. Additionally, the audio in our pilot study was a recorded voice, while in this version it was computer generated. Audio is not a core component of systems such as PROMIS, and further research is needed to assess its utility. The physician who expressed concern about how trainees use the questionnaire results raised an interesting point. When designing our program, our focus was largely on the asymptomatic patient (ie, those whose ED chief complaints were unlikely to be related to STIs). Such patients are unlikely to be privately interviewed by ED physicians regarding STI risk factors or to receive STI testing in the absence of our program. While we encouraged all providers to review questionnaire results when available, our primary focus during implementation was using the CDS to drive STI test ordering in patients classified as at risk, and this will be explored in future studies. Our software- and risk-based STI screening in the ED is novel, but similar approaches have been used to screen adolescents in the ED for substance abuse, depression, and suicide risk.22-25 Our work contributes to the growing literature demonstrating the feasibility of screening adolescents in the ED setting for high-risk conditions. Our novel approach did encounter some challenges. We had a significant decrease in program use due to technical issues. We believe this was multifactorial: (1) technical issues occurred during peak ED volumes and acuity, giving staff less time to troubleshoot; (2) as the problems were resolved, our ED was preparing for a new EHR, diverting resources from monitoring and feedback; and (3) this was adopted as our new "standard of care" with no internal precedent and little external precedent, so identifying solutions to resolve issues was challenging. While our application was built using a mobile app platform and benefitted from testing that occurs with other studies using the same platform, our study had certain requirements that were unique. Chief among these is that each installed app engaged multiple users, rather than following a single user longitudinally. As such, rigorous testing of the unique aspects of our app was needed whenever updated versions were installed.
As mentioned in the Results, in one instance, updates to the iPad operating system were installed by nursing staff when prompts appeared during use in the ED; Status/post had not been fully tested with the new version of the iPad operating system, leading to technical problems until a new version of Status/post could be installed. Device management software can be used to restrict iOS updates; however, these controls had not been activated on our iPads. We activated them after recognizing this oversight. Status/post is used at multiple institutions with different versions of REDCap; as we received notification of REDCap updates at our institution, the advance notice was used to help test across different versions. Our REDCap servers were occasionally offline for upgrades or maintenance. This did not significantly impact the program, as Status/post did not require an active connection to REDCap to provide the survey to patients and transmit results to the EHR; when the REDCap server was unavailable, Status/post was designed to transmit the data once the connection resumed.

Limitations
The technical difficulties that limited questionnaire use were a significant limitation. Using this platform at other institutions would require some initial availability of software programmers and minor EHR modifications. Once implemented, some level of ongoing support is required in case of software malfunctions. This program focused solely on implementing a sustainable method to electronically identify risk factors for STIs and provide testing among asymptomatic patients; in-depth education for patients identified as at risk was not an integral part of our program. Implementation in the ED was limited by the lack of an automated function to identify patients who had recently completed the questionnaire. Our ability to obtain feedback from our ED staff was limited by limited engagement with our anonymous survey.

CONCLUSION
We created a novel software platform to integrate patient-answered questionnaires and CDS into the EHR. While additional refinements of the platform are necessary, it holds significant promise for clinical use.

FUNDING
Research reported in this publication was supported by Washington University Institute of Clinical and Translational Sciences grant KL2 TR000450 (F.A.A.) from the National Center for Advancing Translational Sciences of the National Institutes of Health. It was also supported by a grant from the St. Louis Children's Hospital Foundation (PR-2015-57) (F.A.A.).

AUTHOR CONTRIBUTIONS
FA conceived the program; oversaw its creation and implementation; performed data analysis; wrote the first draft of the manuscript; critically appraised the manuscript; and oversaw the final manuscript. IL, RK, KK, and BM developed the Mediator, oversaw its integration with Status/post and the EHR, and managed the databases. CM developed the Status/post software and oversaw its integration with REDCap and the Mediator. PP and TB provided guidance in the creation of the program and its implementation, and provided critical appraisal of the manuscript. All authors reviewed and approved the final manuscript.

ACKNOWLEDGMENTS
Thank you to the patients, staff, and physicians of the St. Louis Children's Hospital Emergency Unit for their support of this program. Thank you to Sherry Lassa-Claxton for her assistance in designing the REDCap questionnaire and clinical decision support algorithms.

CONFLICT OF INTEREST STATEMENT
None declared.

REFERENCES
1. Kurth AE, Martin DP, Golden MR. A comparison between audio computer-assisted self-interviews and clinician interviews for obtaining the sexual history. Sex Transm Dis 2004;31(12):719-26.
2. Beauclair R, Meng F, Deprez N, et al. Evaluating audio computer assisted self-interviews in urban South African communities: evidence for good suitability and reduced social desirability bias of a cross-sectional survey on sexual behaviour. BMC Med Res Methodol 2013;13(1):11.
3. Van Der Elst EM, Okuku HS, Nakamya P, et al. Is audio computer-assisted self-interview (ACASI) useful in risk behaviour assessment of female and male sex workers, Mombasa, Kenya? PLoS One 2009;4(5):e5340.
4. Richens J, Copas A, Sadiq ST, et al. A randomised controlled trial of computer-assisted interviewing in sexual health clinics. Sex Transm Infect 2010;86(4):310-4.
5. Anand V, Biondich PG, Liu GC, Rosenman MB, Downs SM. Child health improvement through computer automation: the CHICA system. Stud Health Tech Inform 2004;107(Pt 1):187-91.
6. Carroll AE, Biondich PG, Anand V, et al. Targeted screening for pediatric conditions with the CHICA system. J Am Med Inform Assoc 2011;18(4):485-90.
7. Broderick J, DeWit EM, Rothrock N, Crane P, Forrest CB. Advances in patient-reported outcomes: the NIH PROMIS® measures. EGEMS (Wash DC) 2013;1(1):1015.
8. Ahmad FA, Jeffe DB, Plax K, et al. Computerized self-interviews improve Chlamydia and gonorrhea testing among youth in the emergency department. Ann Emerg Med 2014;64(4):376-84.
9. Bureau of HIV, STD, and Hepatitis, Division of Community and Public Health. 2013 Epidemiologic Profiles of HIV, STD, and Hepatitis in Missouri. Jefferson City, MO: Missouri Department of Health and Senior Services; 2014.
10. Centers for Disease Control and Prevention. Sexually Transmitted Disease Surveillance 2017. Atlanta, GA: U.S. Department of Health and Human Services; 2018.
11. Committee on Adolescence; Society for Adolescent Health and Medicine. Screening for nonviral sexually transmitted infections in adolescents and young adults. Pediatrics 2014;134(1):e302-11.
12. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap)–a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42(2):377-81.
13. ResearchKit and CareKit. https://www.apple.com/researchkit/. Accessed April 25, 2019.
14. Quinn J. An HL7 (Health Level Seven) overview. J AHIMA 1999;70(7):32-4; quiz 35-6.
15. Health Level Seven International. HL7 Backgrounder Brief; 2018. http://www.hl7.org/newsroom/HL7backgrounderbrief.cfm. Accessed December 17, 2018.
16. Vanderbilt University. REDCap; 2018. https://www.project-redcap.org/. Accessed December 17, 2018.
17. Rose M, Bezjak A. Logistics of collecting patient-reported outcomes (PROs) in clinical practice: an overview and practical examples. Qual Life Res 2009;18(1):125-36.
18. Jensen RE, Rothrock NE, DeWitt EM, et al. The role of technical advances in the adoption and integration of patient-reported outcomes in clinical care. Med Care 2015;53(2):153-9.
19. Goldberg HS, Paterno MD, Grundmeier RW, et al. Use of a remote clinical decision support service for a multicenter trial to implement prediction rules for children with minor blunt head trauma. Int J Med Inform 2016;87:101-10.
20. Drescher FS, Chandrika S, Weir ID, et al. Effectiveness and acceptability of a computerized decision support system using modified Wells criteria for evaluation of suspected pulmonary embolism. Ann Emerg Med 2011;57(6):613-21.
21. Yen PY, Bakken S. Review of health information technology usability study methodologies. J Am Med Inform Assoc 2012;19(3):413-22.
22. Williams JR, Ho ML, Grupp-Phelan J. The acceptability of mental health screening in a pediatric emergency department. Pediatr Emerg Care 2011;27(7):611-5.
23. Fein JA, Pailler ME, Barg FK, et al. Feasibility and effects of a web-based adolescent psychiatric assessment administered by clinical staff in the pediatric emergency department. Arch Pediatr Adolesc Med 2010;164(12):1112-7.
24. Bromberg JR, Spirito A, Chun T, et al. Methodology and demographics of a brief adolescent alcohol screen validation study. Pediatr Emerg Care 2017 Jul 3 [Epub ahead of print].
25. Linakis JG, Bromberg JR, Casper TC, et al. Reliability and validity of the Newton Screen for alcohol and cannabis misuse in a pediatric emergency department sample. J Pediatr 2019;210:154-60.e1.

© The Author(s) 2019. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: journals.permissions@oup.com. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model).

TI - Using REDCap and Apple ResearchKit to integrate patient questionnaires and clinical decision support into the electronic health record to improve sexually transmitted infection testing in the emergency department
JO - Journal of the American Medical Informatics Association
DO - 10.1093/jamia/ocz182
DA - 2020-02-01
UR - https://www.deepdyve.com/lp/oxford-university-press/using-redcap-and-apple-researchkit-to-integrate-patient-questionnaires-pvN1EhBELJ
SP - 265
VL - 27
IS - 2
DP - DeepDyve
ER -