Krenzelok, Edward P.

Abstract

Purpose. Current and alternative approaches to documenting drug information (DI) questions at DI centers in the United States and features of a software program used for documenting poison information are described.

Methods. A survey was distributed through an Internet listserver for DI specialists in institutional or academic practices. The survey requested information regarding the use of electronic databases to document DI inquiries and the type of databases used. DI questions documented during a nine-month period using Visual Dotlab (Visual Dotlab Enterprises, Fresno, CA) at the University of Pittsburgh Medical Center (UPMC) DI center were analyzed to demonstrate the feasibility and efficiency of the software for documenting DI questions.

Results. Thirty-three DI centers responded to the survey, and 22 (67%) used electronic databases for question documentation. Of these, 15 (68%) were created with Microsoft Office Access (Microsoft Corporation, Redmond, WA). At the UPMC DI center, 841 DI questions were documented using Visual Dotlab. Questions from pharmacists, nurses, and physicians accounted for 654 (78%) of entries. Drug–drug interactions (12%) and dosage recommendations (12%) were the most common types of questions received. On average, DI specialists spent 46 minutes per response, which required an average of 1.7 follow-up calls per inquiry. Quality assurance was performed on 98% of questions documented.

Conclusion. Visual Dotlab is not well known and is currently not used widely by DI centers. However, it includes features that may help DI centers document and retrieve DI questions efficiently and comprehensively.
Keywords: Computers, Data collection, Databases, Documentation, Dosage, Drug information centers, Drug information, Drug interactions, Poisoning, Quality assurance

Drug information (DI) centers across the country collectively receive hundreds of medication-related questions daily.1 When a question is received, documentation of the research and the response is necessary to provide rapid recall of information and to avoid litigation.2 The documentation of DI requests also allows for quality assessment of medication-information responses.3 Therefore, a method of documentation enabling DI centers to record and retrieve each question and response quickly and accurately is of great importance. Traditionally, DI centers have used manual paper systems to store DI questions and responses, and more than 50% of centers were still using this type of system in 2003.1 This approach is inefficient and antiquated and may lead to improper recording of information and to delays in question retrieval. Electronic documentation eliminates some of the problems associated with these systems, including illegible handwriting, incomplete documentation, and limited storage space.2 Other benefits of electronic documentation systems include increased efficiency, cost savings, security, disaster recovery, remote access, and process consistency.4 To our knowledge, commercially available electronic options for DI centers to record and document queries and responses are limited, and there are no publications in the pharmacy literature detailing commercial electronic DI documentation programs. One option, Visual Dotlab (Visual Dotlab Enterprise 4.3.6, WBM Software, Fresno, CA), is currently being used for DI question documentation at the University of Pittsburgh Medical Center (UPMC) DI center.
Because this program was originally created for documenting poison information cases, DI centers may not be aware of its unique features or of its availability as a commercial electronic documentation tool in the DI setting. Therefore, the objectives of this analysis were to determine current documentation methods at DI centers and to describe Visual Dotlab, a novel approach to DI question documentation, as an alternative option.

Data collection

In order to establish that the system being used at the UPMC DI center was unique, it was necessary to determine the current approach to question documentation at other DI centers. A survey was posted on the Consortium for the Advancement of Medication Information Policy and Research DI Internet listserver. The questions asked prompted respondents for their DI center setting, the means of DI question documentation, the type of center clientele, and the number of questions received on a monthly basis. If participants indicated that their center used an electronic database, they were asked whether their system was developed internally or purchased commercially. Information about available entry fields of the program, user capabilities, and overall satisfaction was also collected. All responses were collected between September 12, 2007, and October 12, 2007, and no reminder e-mail was sent. After the results of the survey were collected and analyzed, DI call data from the UPMC DI center were extracted from Visual Dotlab. DI inquiry reports were generated using Crystal Reports (version 11.0.0.895, Business Objects SA, San Jose, CA). The DI center began using Visual Dotlab at the start of 2007; therefore, calls received from January 1, 2007, until September 15, 2007, were analyzed.
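Restricting an exported call log to that evaluation window is a simple date filter. The sketch below is purely illustrative: the record layout and field names are hypothetical, not Visual Dotlab's actual export format.

```python
from datetime import date

# Hypothetical export: one dict per documented call (invented sample data).
calls = [
    {"received": date(2006, 12, 30), "category": "Dosage"},
    {"received": date(2007, 3, 14), "category": "Drug-drug interactions"},
    {"received": date(2007, 9, 20), "category": "Other"},
]

# Keep only calls received during the evaluation period (inclusive bounds).
START, END = date(2007, 1, 1), date(2007, 9, 15)
in_window = [c for c in calls if START <= c["received"] <= END]

print(len(in_window))  # only the March 2007 call falls inside the window -> 1
```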
Data collected included the number of calls received, requestor type, category of request, research time, follow-up contacts (defined as any communication generated to send or retrieve information related to the DI question), and quality assurance performed. All data were analyzed using descriptive statistics.

Documentation of DI questions

The Internet survey generated 33 responses from DI centers that subscribed to the listserver. However, questions regarding service areas and call volumes were at the bottom of the survey, and only 27 of the 33 respondents addressed these queries. Because the total number of DI centers that received the survey could not be identified, a response rate was not assessed. Nineteen centers (58%) provided services in a university setting, and 13 (39%) provided services in an institutional setting. The remaining respondent (3%) was associated with a managed care organization. Fifty-six percent of responding centers provided services to health care professionals only (15 of 27), while 44% (12 of 27) also provided DI to the public. Monthly question volume varied among responding centers: 9 centers (33%) fielded fewer than 50 questions per month, 8 centers (30%) fielded between 51 and 100, 3 centers (11%) fielded between 101 and 200, and 7 centers (26%) fielded more than 200. Of the 33 participating centers, 22 (67%) indicated the use of an electronic database for DI question documentation. Only 2 (9%) of these centers employed a commercial database (Toxicall, n = 1; DocuPharm DI, n = 1), while the remaining 20 (91%) developed their own databases. Of these 20 internally developed electronic databases, 15 (75%) were created using Microsoft Office Access (Microsoft Corporation, Redmond, WA). Table 1 lists information entered in the databases.
Twenty-one (95%) respondents indicated that their electronic programs were capable of running reports to analyze data, 20 (91%) had the capability to restrict access with username and password functions, and 16 (73%) used programs that allowed specialists to perform quality assurance on documented questions. Twenty-one (95%) respondents found their systems to be very reliable, and 20 (91%) were very satisfied with their current programs.

Data analysis from Visual Dotlab

The UPMC DI center responded to 841 DI requests during the nine-month evaluation period. The most common callers were pharmacists (n = 230, 27%), nurses (n = 218, 26%), and physicians (n = 206, 25%), together accounting for 78% of the inquiries. Additional categories represented by requestors included nurse practitioners (n = 44, 5%) and physician assistants (n = 28, 3%). Questions were recorded in 25 different categories, with the most common being dosage (12%) and drug–drug interaction (12%) questions (Table 2). Call research time was recorded in 5-minute intervals. The average time to response was 46 minutes per call, though the most common time recorded was 5 minutes, accounting for 141 (17%) calls. One hundred nine calls (13%) required 30 minutes, followed by 86 calls (10%) requiring 60 minutes. The majority of calls required either 1 (377 of 841, 45%) or 2 (328 of 841, 19.5%) follow-up contacts, with an average of 1.7 contacts required per DI question. Administration at the UPMC DI center performed quality assurance on 98% of calls.

Description of Visual Dotlab

In January 2007, the UPMC DI center and the Pittsburgh Poison Center (PPC) merged administratively, but not operationally, because of the similar nature and requirements of their services. Before this union, the UPMC DI center recorded questions and responses in a Microsoft Office Access database. Over time, various issues complicated the experience with this program at the UPMC DI center.
The program responded slowly and crashed on a regular basis. Retrieving documented questions proved challenging because of misspelled words and the inability of the system to ignore older entries of similar questions. Additionally, the database had limited predetermined options when entering information into data fields and limited character space for each entry. Another problem was that a user could accidentally overwrite previously entered questions. Lastly, the database did not have a mechanism for providing specialist-directed quality-assurance communication. Collectively, these problems led to an overall decrease in accuracy and efficiency. At the time of the merger, the PPC was using Visual Dotlab, a program developed by a pharmacist to improve poison information question documentation. It is a Windows-based, 32-bit, client-server, transaction-based patient-management system currently in use at 12 poison control centers throughout North America (T. Carson, personal communication, 2008 Aug).5 The UPMC DI center chose to evaluate Visual Dotlab as an option for DI question documentation because of its immediate availability at the newly merged center. While Visual Dotlab is designed for documenting poison information questions and primarily serves poison control centers, many of its features are applicable to DI. A list of the features of Visual Dotlab can be found in Table 3.5 Documenting DI questions in Visual Dotlab was found to be systematic and comprehensive. When a call is received, the specialist opens a new case and selects the “Drug Information” category from the “Call Type” dropdown menu. Based on the question, he or she can then select a subcategory (Table 2). The caller name, site, occupation or relation, and contact information are collected and documented in Visual Dotlab. A Microsoft Office Word (version 11.8106.8202 SP2, 2003) DI question template that correlates with the designated subcategory is opened.
These question templates were developed by DI center administrators and list common background questions. Only 7 (32%) survey respondents’ databases incorporated background information into their documentation. The ability to record background information and patient history enables our specialists to conduct thorough research and provide individualized responses on a case-by-case basis, as well as to determine the applicability of past responses to current clinical scenarios. Once background questions are answered by the caller, the specialist or student documents each medication associated with the call in the substances section by typing in the medication name. Because Visual Dotlab uses the Micromedex Poisindex (Thomson Reuters, Greenwood Village, CO) database for DI purposes, when a medication is entered, it is standardized through a direct link, reducing spelling errors of medication names and improving recall of questions. Each medication listed in Poisindex is associated with a unique seven-digit code and a generic substance code that facilitate data collection, analysis, and location of all records that contain that code. Similar capabilities of medication name standardization were only available in 4 databases (18%) according to responses from our survey. Once the research has been conducted and a response has been formulated, the user copies all information from the document and pastes it into the “History” field. All entries recorded in this field are automatically time stamped with the specialist’s name. Finally, after documenting the number of follow-up contacts and research time required to answer the query, the user closes the case after Visual Dotlab ensures its completeness. Once the history field is closed, the document may not be altered, thereby protecting the response for future reference and for medical–legal purposes. 
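The read-only, time-stamped history field described above behaves like an append-only log: entries are stamped on entry and become immutable once the case is closed. A minimal sketch of that behavior, with class and method names that are ours for illustration, not Visual Dotlab's:

```python
from datetime import datetime, timezone

class CaseHistory:
    """Append-only log: entries are time-stamped and, once the case is
    closed, can no longer be added to or altered (illustrative model)."""

    def __init__(self):
        self._entries = []
        self._closed = False

    def append(self, specialist, text):
        if self._closed:
            raise PermissionError("case is closed; history is read-only")
        # Each entry is automatically stamped with the time and specialist name.
        self._entries.append((datetime.now(timezone.utc), specialist, text))

    def close(self):
        self._closed = True

    @property
    def entries(self):
        # Return an immutable view so callers cannot edit past entries.
        return tuple(self._entries)

history = CaseHistory()
history.append("J. Smith, PharmD",
               "Response: no clinically significant interaction expected.")
history.close()
# Any further append now raises PermissionError, preserving the record
# for future reference and for medical-legal purposes.
```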
When generating reports, any field captured in Visual Dotlab may be queried, and multiple fields may be queried simultaneously. The data captured by Visual Dotlab are extensive yet uncomplicated to work with. In addition, Visual Dotlab allows for a simple quality-assurance (QA) process. Administrators can check cases completed by specialists and provide feedback through specialist-directed “QA notes,” allowing the user to review comments, make corrections, and learn from each case. Another advantage is that the QA notes do not become part of the visible documentation record. Because the program was developed for poison information, it is particularly well suited to DI questions that involve an adverse effect. Only two survey respondents indicated that their databases were capable of recording adverse drug event (ADE) data. Visual Dotlab captures information critical to ADE cases, including symptoms, laboratory values, treatments, medication errors, and outcomes. Furthermore, all signs and symptoms are standardized and organized by organ system, ensuring uniform data; for example, increased heart rate is always documented as tachycardia. An additional field allows the specialist to document whether the symptom is related or unrelated to the medication. This capability is important because of the increased emphasis placed on medication safety and ADE reporting. While the incorporation of the program into daily use at the DI center has been straightforward, there have been some challenges with Visual Dotlab. As evidenced by the analysis of the call data, specialists initially used the “other” category too often when documenting calls, most likely because they were unfamiliar with the new categories available within Visual Dotlab. This reduced the ability of reviewers to interpret what types of questions the center was receiving.
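Detecting this kind of over-use of the “other” category is a simple frequency count over the documented call categories. A sketch, with invented sample data for illustration:

```python
from collections import Counter

# Invented sample of documented call categories.
categories = ["Other", "Dosage", "Other",
              "Drug-drug interactions", "Other", "Dosage"]

counts = Counter(categories)
share_other = counts["Other"] / len(categories)

# A high "Other" share signals that specialists may need a refresher
# on the available subcategories.
print(f"'Other' share: {share_other:.0%}")  # -> 'Other' share: 50%
```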
Since then, the specialists have been educated about all the available categories and now reserve the “other” category for questions that cannot be classified elsewhere. While question recall has improved since the database change, finding previous entries can still be difficult because poison and DI calls are combined in a single database. However, filters allow segregation of the DI data into a separate data set. Overall, the reaction to the change by the DI staff has been positive. The program requires minimal training for students and residents on rotation at the DI center, and they typically demonstrate independence within three days. Though not required, Visual Dotlab installation and training time are each billed at a rate of $150 per hour. At the UPMC DI center, installation required eight hours at a total cost of $1,200. Maintenance fees are $3,000 per year, which includes all updates and software support. There are no limitations on the number of users with access to the database. These fees are based on current pricing for poison information centers; Visual Dotlab costs may vary for DI centers. Currently, Visual Dotlab is not certified for Microsoft Windows Vista and must be run on Microsoft Windows XP. The program requires a local server with 512 MB of random access memory and 5–10 GB of free disk space per computer. Limitations of this analysis primarily involve the survey portion. Because the primary focus of this paper was to disseminate information about Visual Dotlab, the survey did not encompass all DI centers, limiting the extrapolation of the results. It is possible that more DI centers are using computerized databases to document DI questions, and some may even be using Visual Dotlab. There was also no way to calculate a response rate to the survey because multiple DI specialists from DI centers subscribe to the listserver. Therefore, it is unknown how many centers received the request for information.
Additionally, the survey did not determine whether the surveyed centers were using paper documentation in addition to their electronic databases. The survey was also designed to assess the computer system requirements and database costs of the responding centers, but many participants did not answer these questions. Another limitation was that the majority of responding centers indicated a high level of reliability and satisfaction with their current databases. However, the survey was not designed to influence DI centers to use Visual Dotlab but to determine whether dissemination of information about its use at UPMC was warranted. Finally, questions measuring the clientele and call volume of the centers were at the end of the survey, so some of the centers not using electronic systems did not answer them.

Conclusion

Visual Dotlab is not well known and is currently not used widely by DI centers. However, it includes features that may help DI centers document and retrieve DI questions efficiently and comprehensively.

Table 1. Available Entry Fields in Drug Information Center Databases

Database Field                            No. (%) Responses (n = 22)
Date/time of call                         22 (100)
Caller occupation                         22 (100)
Caller location                           21 (95)
Category of question                      20 (91)
Time spent to answer question             19 (86)
Workload or follow-up count               14 (64)
References                                12 (55)
Responder                                 11 (50)
Question                                   9 (41)
Answer                                     8 (36)
Background/history                         7 (32)
Medication name standardization            4 (18)
Preferred delivery method for response     4 (18)
Urgency of response                        3 (14)
Log or case number                         3 (14)
Adverse drug event data                    2 (9)
Caller specialty or area of practice       2 (9)
Materials sent to caller                   2 (9)

Table 2. Category of Information Requests Captured in Visual Dotlab at the University of Pittsburgh Medical Center Drug Information Center

Category                                  No. (%) Calls (n = 841)
Other                                     163 (19.4)
Dosage                                    100 (11.9)
Drug–drug interactions                    100 (11.9)
Dosage form/formulation                    75 (8.9)
Medication administration                  59 (7.0)
Adverse effects (no exposure)              54 (6.4)
Medication availability                    49 (5.8)
Indications/subtherapeutic use             36 (4.3)
Stability/storage                          31 (3.7)
Compatibility of parenteral medications    29 (3.5)
Brand/generic name clarifications          28 (3.3)
Compounding                                24 (2.9)
Pharmacokinetics                           23 (2.7)
Exposure                                   18 (2.1)
Contraindications                          12 (1.4)
Therapeutic drug monitoring                 9 (1.1)
Pharmacology                                6 (0.7)
Regulatory                                  6 (0.7)
Calculations                                5 (0.6)
Drug–food interactions                      5 (0.6)
Dietary supplement, herbal, homeopathic     4 (0.5)
Foreign drug                                3 (0.4)
Drug use during breastfeeding               2 (0.2)
Generic substitution                        0
Medication disposal                         0
Table 3. Features of Visual Dotlab (VDL)5

Hospital information integrated into case entry: Limitless hospital phone numbers, addresses, or personnel storage
Staff information database: Holds addresses, phone numbers, hire dates, evaluation due dates, and other important employee information
Multiple open case-entry forms: Allows for more than one simultaneous case
Case follow-up tracking: Separate follow-up form with multiple sorting and reporting options; links case entry and follow-up forms
Limitless substance entries: May enter multiple medications on a single case
Poisindex (Micromedex, Thomson Reuters, Greenwood Village, CO) access: Substance codes, generic codes, and product descriptors are pulled into VDL records
Online calculator: Calculated results automatically entered into appropriate fields
Date and time of call: Automatically stamps time when a new case opens
Last edit time: Automatically displays the most recent edit time when a case is opened
Unlimited text entry: Once stored, information becomes read-only, preventing modification of previously entered cases
Automatic spell checking: Misspelled words display with a red tilde underscore; correct spellings may be prompted by a right-click
Data verification: Encompasses over 250 different criteria to ensure internal data consistency; generates a detailed list of logical errors
Data triggers: Can be established to fire additional data-collection notifications or collection tools
Case templates: Can be created to facilitate entry of cases involving common substances; determined by the user at implementation
Multiple preset and user-definable query parameters: User-friendly defaults for case retrieval
Limitable user access: Three levels of security for record access
Encrypted login passwords: Increases security
Association with Crystal Reports (Business Objects SA, San Jose, CA): Ad hoc reports are built in Crystal Reports and linked to VDL
Searching capabilities: Can search by patient name, caller name, substance, and other fields

References

1. Rosenberg JM, Koumis T, Nathan JP et al. Current status of pharmacist-operated drug information centers in the United States. Am J Health-Syst Pharm. 2004;61:2023-32.
2. Erbele SM, Heck AM, Blankenship CS. Survey of computerized documentation system use in drug information centers. Am J Health-Syst Pharm. 2001;58:695-7.
3. American Society of Health-System Pharmacists. ASHP guidelines on the provision of medication information by pharmacists. Am J Health-Syst Pharm. 1996;53:1843-5.
4. BuyerZone. Document management software buyer's guide. www.buyerzone.com/software/doc_man_software/printable_bg.html (accessed 2007 Aug 20).
5. WBM Software. Visual Dotlab product page. www.wbmsoft.com/prod01.htm (accessed 2008 Jun 12).

The authors have declared no potential conflicts of interest.

Documenting drug information questions using software for poison information documentation. American Journal of Health-System Pharmacy. 2009;66(11):1039. doi:10.2146/ajhp080338

Copyright © 2009, American Society of Health-System Pharmacists, Inc. All rights reserved.