Review Article

Use of Wearable, Mobile, and Sensor Technology in Cancer Clinical Trials

ABSTRACT

As the availability and sophistication of mobile health (mHealth) technology (wearables, mobile technology, and sensors) continue to increase, there is great promise that these tools will be transformative for clinical trials and drug development. This review provides an overview of the current landscape of potential measurement options, including the various types of data collected, methods/tools for collecting them, and a crosswalk of available options. The opportunities and potential drawbacks of mHealth in cancer clinical trials are discussed. Specific concerns related to data accuracy, provenance, and regulatory issues are highlighted, with suggestions for how to address these in future research. Next steps for establishing mHealth methods and tools as legitimate and accepted measures in oncology clinical trials include continuation of regulatory definition by the FDA; establishment of security standards and protocols; refinement and implementation of methods to establish and document data accuracy; and finally, creation of feedback loops wherein regulators receive updates from researchers with better and more timely data, which should decrease trial times and lessen drug development costs. Implementing mHealth technologies into cancer clinical trials has the potential to transform and propel oncology drug development and precision medicine to keep pace with the rapidly increasing developments in genomics and immunology. Clin Cancer Inform.
© 2018 by American Society of Clinical Oncology

Suzanne M. Cox
Ashley Lane
Samuel L. Volchenboum

Author affiliations and support information (if applicable) appear at the end of this article.

Corresponding author: Samuel L. Volchenboum, MD, PhD, University of Chicago, 900 E 57 St, 5th Floor, Chicago, IL 60637; e-mail: slv@uchicago.edu.

Licensed under the Creative Commons Attribution 4.0 License.

BACKGROUND AND SIGNIFICANCE

The pace and cadence of cancer clinical trials have not kept stride with the exponential growth and discovery now seen in genomics and immunotherapy. Furthermore, development of new therapies for cancer is disproportionately expensive and time consuming. Estimates of the cost to bring a new cancer drug to market range from $648 million to $2.7 billion, with a median time of development of 7.3 years.1,2 Only 35% of the drugs that make it to phase II graduate to phase III, with millions already spent on research and development. These failures are multifactorial, but small sample sizes and differences between phase II and phase III study designs are primary drivers.3 These differences include the end points used, the populations enrolled, and an overestimation of therapeutic benefit in prior studies. The collection of better data during phase I and II trials would increase the likelihood of a successful transition to phase III.

New technologies for data collection present prime opportunities to standardize data collected across phases to increase comparability of results. These technologies include wearable devices, smart electronic devices worn on the body as implants or accessories (including activity trackers); mobile technology, portable devices that perform tasks through wireless cellular services; and sensors, devices, modules, and subsystems that measure the environment. Use of these technologies as part of data collection methods in multi-institution trials could help to increase sample size and participant diversity. Finally, better and more-specific health outcome data provided by wearables and mobile technology could help to justify sufficient clinical benefit of a new treatment, which may increase the likelihood of proceeding to a large randomized trial.

Currently, clinical trials data are collected during the clinic visit, from the electronic health record (EHR), or from participants as patient-reported outcomes (PROs). The sourcing of data from the EHR is beyond the scope of this review, but tremendous gains are likely as data from the medical record are made available through automated systems rather than through the current manual curation into proprietary electronic or even paper forms. This review mainly focuses on data collected from the patient outside their standard-of-care interactions with the health care system. These data are part of a larger body of what often is termed real-world data, which include patient-related data and external data sources such as publicly available data sets and environmental data.4 When applied to clinical trials, real-world data include EHR; PRO; or wearable, sensor, and mobile device data.

The term eHealth refers to any use of information and communication technologies for health care.5 mHealth is a subset of eHealth and refers specifically to the use of mobile and wireless devices to improve health. eHealth (and by extension, mHealth) technologies are important to improving the success rates of clinical trials by increasing comparability of data, sample size, and insight into clinical benefit. eHealth technologies that simply relocate a validated metric to the digital realm have been widely adopted through smartphones and mobile and desktop devices; examples include standard surveys used in PROMIS (Patient-Reported Outcomes Measurement Information System) and performance tests of cognitive, motor, and sensory function and self-reported measures of emotional function for adults and children in the National Institutes of Health Toolbox HealthMeasures (www.healthmeasures.net). Researchers are still grappling with the implications of adopting wearables and sensors for data collection in clinical trials, however, and their concerns are not unfounded. Thus, this review also focuses on the mHealth and eHealth technologies that the clinical research industry has been more reluctant to adopt, particularly where novel data types or new end points are concerned. In most cases, the discussion is equally applicable to cancer and noncancer trials; issues specific to oncology studies are denoted.

CURRENT AND POTENTIAL MEASUREMENT OPTIONS

Types of Data

Most cancer clinical trials test the safety and/or efficacy of a new therapy. Participants are observed in the health care system during office visits, and data are collected through laboratory, imaging, and other studies. Other research-related data reported for clinical trials are categorized as either clinical outcomes assessment (COA) or non-COA measures, such as biomarkers or surrogate end points (Table 1). A COA must be a PRO, an observer-reported outcome (ObsRO), a clinician-reported outcome, or a performance outcome. The latter two require input from a clinician, whereas PROs and ObsROs do not. Many of the currently accepted metrics for assessing PROs (eg, sleep quality, activity, mood) in clinical trials are paper surveys or eHealth/mHealth tools validated through published trials. The various COA measures are defined by the Food and Drug Administration (FDA) and listed in Table 1.

Data Collection

The categorization of the various consumer-facing technologies used to support clinical trials is helpful. A report issued by the Duke University Margolis Center for Health Policy described four types of consumer-facing mHealth data: patient/caregiver-reported data, task-based measures, active sensor data, and passive sensor data.9 Collectively, these data are valuable to clinical research, but the utility of each type varies according to the individual study. Researchers who want to incorporate data from wearables, mobile devices, and sensors into their clinical trials should consider which of these data types to incorporate into their study design. Nonmobile eHealth technologies (eg, Web applications [apps] accessible through desktop computers) also should be evaluated as possible ways to collect data. The potential intersections of mHealth data and outcome measures are listed in Table 2.

Patient/caregiver-reported data can be collected on a smartphone, tablet, or computer through an app or Web site and include patient surveys, questionnaires, and compliance and symptom diaries. Electronic collection of these PRO/ObsRO measures is, by far, the most common way in which mHealth technologies currently are used to support clinical trials. Use of PRO measures to evaluate symptomatic adverse events is an important area of study for cancer clinical trials. Outside COA measures, mHealth apps are being used to encourage and track medication adherence,11,12 and emerging platforms are using artificial intelligence and image capture to document compliance. Although many platforms for symptom monitoring of patients treated with chemotherapy exist, comprehensive systems that include a full range of alerts and reports for clinicians and pharmacists, reminders for patients, a clinician interface, and decision support only now are emerging to support clinical trials.
Table 1. Clinical Outcomes Assessment (COA) Measures

COA: Any assessment that may be influenced by human choices, judgment, or motivation and may support either direct or indirect evidence of treatment benefit. Unlike biomarkers that rely completely on an automated process or algorithm, COAs depend on implementation, interpretation, and reporting from a patient, a clinician, or an observer.

PRO: A measurement that is based on a report that comes from the patient (ie, study participant) about the status of his or her health condition without amendment or interpretation of the report by a clinician or anyone else. A PRO can be measured by self-report or interview, provided that the interviewer records only the patient's response. Symptoms or other unobservable concepts known only to the patient (eg, pain severity, nausea) can be measured only by PRO measures. PROs also can assess the patient's perspective on functioning or activities that others also may observe.

ObsRO: A measurement that is based on an observation by someone other than the patient or a health professional, including a parent, spouse, or other nonclinical caregiver who is in a position to observe and report regularly on a specific aspect of the patient's health. An ObsRO measure does not include medical judgment or interpretation and can include only events or behaviors that can be observed (eg, observers cannot validly report an infant's pain intensity [a symptom] but can report infant behavior believed to be caused by pain [crying]).

ClinRO: A measurement that is based on a report that comes from a trained health care professional after observation of a patient's health condition. A ClinRO measure involves a clinical judgment or interpretation of the observable signs, behaviors, or other physical manifestations believed to be related to a disease or condition. ClinRO measures cannot directly assess symptoms that are known only to the patient (eg, pain intensity).

PerfO: A measurement that is based on tasks performed by a patient according to instructions administered by a health care professional. PerfOs require patient cooperation and motivation and include measures of gait speed (eg, timed 25-foot walk test), memory recall, or other cognitive testing (eg, digit symbol substitution test). For the purpose of embracing mHealth's potential to capture these data remotely, the ePRO Consortium has proposed extending the definition of PerfO to include unsupervised settings where performance data are captured and/or measured by sensors. We adopt the consortium's extended definition for the purposes of Table 2.

NOTE. Definitions are derived from the FDA's COA glossary of terms.
Abbreviations: ClinRO, clinician-reported outcome; COA, clinical outcomes assessment; mHealth, mobile health; ObsRO, observer-reported outcome; PerfO, performance outcome; PRO, patient-reported outcome.

Task-based measures are objective measurements sometimes collected in the clinical setting. For instance, the 6-minute walk test (6MWT) is used to measure functional status in patients with lung disease and has been validated as a measure of health in patients with breast cancer. Devices that contain accelerometers and gyroscopes can be used to produce surrogate measures for the 6MWT, with mixed reports of accurate and inaccurate results.17-19 The accuracy seems to depend on the device used, so standardized approaches and expert recommendations may help researchers to make decisions about the best device for task-based measures. Research participants can be given task-based measures outside the clinical setting, such as completion of a smartphone-based memory or dexterity test, but these tests would not qualify by the current definitions of COAs. Although these measures currently would be considered surrogate end points (and not COA measures), groups like the ePRO Consortium are promoting the appropriate use of wearables in trials.
Active sensor data include measurements about a person that quantify something about his or her physiology, mental state, or ability to complete an activity, with the key condition of requiring an activation step. For example, research participants might measure their blood glucose, take their blood pressure, or record their weight. These data would mostly qualify as non-COA measures, although this is an important area of active development, and there is increasing interest in understanding how these measures can be used to support a clinical trial. Although not traditional COA measures, active sensor data could be used to support these assessments. For instance, a patient with cancer who reports pain might record an elevated heart rate, which could provide evidentiary support for this PRO. Numerous studies have demonstrated the accuracy and representativeness of data collected through active sensors, mostly from sensors designed to collect a single measure. Mobile-enabled blood glucose monitors, wireless pulmonary artery pressure monitors, and balance quality assessment are a few of the sensors that have been developed and validated for specific measures.22-24

Table 2. Intersections of mHealth Data and Outcome Measures

Patient/caregiver-reported data
- PRO (no HCP): Online or mobile device-based survey, questionnaire, or diary
- ClinRO (requires HCP): Photo of wound uploaded by patient and reviewed by clinician
- PerfO (requires HCP): NA
- Biomarker or surrogate end point: AI system to photograph and document medication adherence

Task-based measures
- PRO (no HCP): Necessarily subjective, but an mHealth app could be used to support and document medication adherence
- ClinRO (requires HCP): NA
- PerfO (requires HCP): 6MWT through a wearable or smartphone
- Biomarker or surrogate end point: Smartphone-based test of dexterity or memory

Active sensor data
- PRO (no HCP): NA, although a sensor could be used to confirm or add data to a PRO (eg, feeling faint and heart rate)
- ClinRO (requires HCP): NA
- PerfO (requires HCP): Smartphone-connected spirometer to assess asthma control
- Biomarker or surrogate end point: Home blood glucose, weight, and blood pressure monitors

Passive sensor data
- PRO (no HCP): NA, although passive data (eg, the Apple Watch [Cupertino, CA] sleep app) can be used to confirm PRO data (eg, PSQI)
- ClinRO (requires HCP): NA
- PerfO (requires HCP): Evidence of activity or fitness in a patient's wearable data (v 6MWT)
- Biomarker or surrogate end point: Wearables and sensors to monitor heart rate, sleep, and activity

Abbreviations: 6MWT, 6-minute walk test; AI, artificial intelligence; app, application; ClinRO, clinician-reported outcome; COA, clinical outcomes assessment; HCP, health care provider; mHealth, mobile health; NA, not applicable; PerfO, performance outcome; PRO, patient-reported outcome; PSQI, Pittsburgh Sleep Quality Index.

Passive sensor data are similar to active sensor data in that they are measurements of a person's physiology, mental state, and activities. In contrast, however, passive data collection, by definition, does not require active intervention. For example, a Fitbit fitness tracker (Fitbit, San Francisco, CA) or other wearable can be used to passively collect heart rate, step count, and sleep/wake activity. Although the prevailing, and perhaps most exciting, application of mHealth technology to clinical trials, these measurements do not technically fit within the common framework of COA measurements. Rather, a research participant's heart rate or sleep quality is more likely to be considered a digital biomarker or surrogate end point. In addition, these passive measurements could be used to confirm PRO/ObsRO data, such as the use of step counts to corroborate activity survey metrics. Similarly, although performance outcomes traditionally have required a health care provider to actively engage the research participant in a test (eg, the 6MWT), passive sensor data could be mined for evidence that supports the participant's functional status even better. These data also could be used to determine inclusion/exclusion eligibility for studies, especially those that measure an oncology patient's functional status or health-related quality of life before trial.
WEARABLES AND SENSORS: OPPORTUNITY AND POTENTIAL DRAWBACKS

Most PRO measures are surrogates for quantifiable activities. For instance, in oncology clinical trials, sleep quality often is assessed by administering the Pittsburgh Sleep Quality Index, a survey given every 2 weeks that has been validated against polysomnography (the gold standard for sleep quality). Rather than relying on the research participant to faithfully recall sleep patterns over a 2-week period, this metric might be better measured through nightly continuous sleep monitoring with a wearable such as the wGT3X-BT activity monitor (ActiGraph, Pensacola, FL). Likewise, physical activity recorded in a daily diary likely is better captured through step counts from a wearable or mobile device. With the former, patients record their perception of how they slept or how active they were, whereas with the latter, the device measures patients' activity; these also may reflect different measures. Wearables and sensors can capture physiologic data such as heart rate, pulse oximetry, and blood glucose. Although these metrics do not have an obvious PRO correlate, they could be useful in some clinical trial settings. The capture of data at the point of experience is primed to transform the clinical trial landscape, offering researchers access to high-quality, real-time data on research participants. The capture of these data may shorten trial times, decrease costs, and increase safety. To realize these benefits, the clinical trial community must address limitations and overcome the drawbacks in the current technology.

Concerns about the use of wearables and sensors in clinical trials generally fall into three categories: data accuracy, data provenance, and regulatory issues. First are concerns about the accuracy of the data collected. Comparisons of the data from various consumer and commercial-grade wearables demonstrate variability among devices related to body placement and other factors.27-30 Comparison across devices has shown that step count, for instance, is relatively accurate for most wearables for 18- to 39-year-olds but more variable in older age groups.27 Another study found a significant difference in acceleration values between hip and wrist placement of the same wearable.28 A comparison across eight different wearables showed error rates between 9.3% and 23.5% in the measurement of daily energy expenditure.29 Although issues may need to be worked out with respect to the accuracy of wearable technologies, researchers have begun to identify the relevant issues, related to hardware in some cases and data analysis in others; testing of solutions will follow rapidly. The validity of using mobile devices to collect PROs through app-based surveys and text messages has been more straightforward (with less variability between versions) than has that of current wearable technology with physical measures.31

The second concern, data provenance, is documentation of the source and lineage of the data. Raw data are difficult or impossible to retrieve directly from most wearable devices; instead, the data are transformed and filtered on the device before storage on the connected mobile device and upload to the manufacturer's server. Only a few devices provide software developer kits or protocols to access data directly. Where that is not an option, the data must first pass through the manufacturer's proprietary software and/or hardware and be routed to the manufacturer's server. Only then can the data be downloaded for additional analysis. During this process, the data may be aggregated, normalized, and/or transformed, and these operations are opaque to the researcher, which makes comprehensive data analysis and interpretation difficult. This concern is especially problematic for consumer devices because few offer the ability to trace the data to a particular device; instead, data must be retrieved through a user's account on the manufacturer's Web site. In addition, concerns exist about patients or others tampering with data; some wearable technology companies have implemented data encryption to minimize the ability to manually change data.32 Blockchain technology also may offer a way to increase data security and improve data provenance.33

Regulatory issues are the third main concern about the use of wearables and other sensors in clinical trials. These issues include uncertainties about how readily the data will be accepted by the FDA when submitted in support of an investigational new drug application. The FDA has released guidance on the use of wearables and mobile devices and their applications for clinical care but not for clinical trials. This lack of guidance for clinical trials may underlie the pharmaceutical industry's reluctance to adopt mobile technologies more broadly. Despite the lack of FDA guidance, the pace of adoption continues to increase, with at least 300 trials in 2015 incorporating wearables. More than other areas, oncology clinical trials have been slow to adopt wearable and mobile technology. One reason may be that the labeling claims for cancer drugs usually are tied to overall survival and not to improvements in symptoms. But as PROs increasingly are put forward as claims for oncology drug approval, the ability to measure and track metrics through wearables and sensors will become increasingly important.
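The tamper-evidence property that blockchain-style approaches promise for provenance can be illustrated with a minimal hash chain: each record stores the hash of the previous record, so retroactively editing any stored reading invalidates every later link. This is a toy sketch of the idea, not a production design, and none of the names below come from the article or any real platform.

```python
# Hypothetical sketch: a hash chain giving tamper evidence for device
# readings, illustrating the data-provenance idea discussed above.
import hashlib
import json

def add_record(chain, reading):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"reading": reading, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def chain_is_valid(chain):
    prev_hash = "0" * 64
    for rec in chain:
        body = {"reading": rec["reading"], "prev_hash": rec["prev_hash"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev_hash"] != prev_hash or rec["hash"] != digest:
            return False
        prev_hash = rec["hash"]
    return True

chain = []
for steps in (5000, 5210, 4890):   # daily step counts from a wearable
    add_record(chain, steps)
print(chain_is_valid(chain))       # True
chain[1]["reading"] = 12000        # tamper with one stored reading
print(chain_is_valid(chain))       # False
```

A real deployment would also need signing, distribution, and key management; the point here is only that lineage can be made verifiable rather than taken on trust from a manufacturer's server.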
NEXT STEPS FOR ONCOLOGY CLINICAL TRIALS

For clinical outcomes assessment, the industry already uses many mHealth and eHealth solutions. Several companies, such as Medidata, Rho, Parexel, and ICON, offer electronic COA platforms, and the most common tools are smartphones, computers, and tablets. Concerns about equivalence between electronic and paper versions of measurement instruments largely have been dismissed.36 An opportunity now exists to use wearable and sensor data to support COA measures. In addition, we should be able to leverage wearable and sensor measurements as surrogate end points and digital biomarkers (Table 2). To maximize the benefits of these new data collection methods for oncology clinical trials, we must address the accuracy, provenance, and regulatory issues.

Regulation

Although the FDA has not given explicit guidance on the use of mHealth technology for clinical trials, we can be optimistic about how these devices and sensors are being considered. The FDA's new draft guidance released in June 2017 clarifies the 2003 Code of Federal Regulations, Title 21, Part 11, rules for electronic records and signatures as they apply to clinical trials. A risk-based approach is recommended because devices that capture, record, and transmit data are inherently of lower risk than devices intended to process data. The information systems that import the data from wearables used in the clinical care of patients must comply with privacy standards as defined by the Health Insurance Portability and Accountability Act.

The sponsor is responsible for validating that the mHealth technology used reliably captures, transmits, and records data. For example, if a Vívoactive smartwatch (Garmin, Olathe, KS) records 5,000 steps in a particular day, the validation should ensure that this value is reliably captured, recorded, and transmitted to the data capture system. Of note, this guidance does not address the performance of wearables and sensors. The FDA considers the source data to be the information first recorded permanently in the EHR or data capture system, not the data stored on the device itself. To preserve lineage, the data must be encrypted at rest and in motion.

In addition to the new draft guidance, the FDA also announced the new Digital Health Innovation Action Plan, focused on fostering innovation at the intersection of medicine and digital health technology.38,39 Along with the new guidance in the 21st Century Cures Act, new rules will help to define which low-risk devices fall outside the FDA's jurisdiction. Other changes also have been announced as part of the Digital Health Innovation Action Plan, including a pilot of a third-party certification program to streamline marketing of higher-risk devices. As detailed in this plan, groups may be able to take advantage of the National Evaluation System for Health Technology to generate evidence and analytics for devices, which hopefully will facilitate the selection of the best mHealth technology for a given study.

Security

Security of wearables and sensors remains a concern. A recent analysis found > 8,000 security vulnerabilities in cardiac pacemakers.40,41 Another report found a widespread lack of proper security and privacy measures in fitness trackers, which leave users' data unencrypted and their geolocation information exposed. However, there is ample guidance from the FDA, the National Institute of Standards and Technology, and other agencies on the robust security policies and procedures that device manufacturers must heed, including end-to-end encryption and auditing/logging of data.42,43 Although a detailed analysis of security considerations is beyond the scope of this review, successful implementation of wearables and sensors in clinical trials will require adherence to these standards.

Supportive Evidence

Device accuracy remains a concern, but improvements to the technology, increasing knowledge about how to use the devices and for what purposes, and a better statistical approach to understanding the data all contribute to more comfort in the use of wearables and sensors for clinical trials. The ePRO Consortium has recommended four requirements to demonstrate that a device is suitable for clinical trials: content validity, reliability assessment, concurrent validity, and the ability to detect change. Content validity refers to the device's provision of evidence relevant to the population being studied. For instance, better ambulation may be important for a patient who undergoes radiation therapy for cancer, and a fitness tracker used for monitoring should be able to measure this metric in this specific population. Reliability assessment should be performed by the manufacturer or a third party to demonstrate that data are consistent within and between devices. A meta-analysis that evaluated the reliability of wearables and trackers in 22 validation studies published through 2015 recommended steps that consumers and manufacturers can take to improve the validity of these measurements. Concurrent validity must be achieved by comparing the device measurement to a gold standard. For some measurements, this may entail a simple comparison of the metric with a concurrently measured standard, such as Zoom HRV (LifeTrak, St Paul, MN) sleep data alongside polysomnography or the most frequently used PRO measure, the Pittsburgh Sleep Quality Index. For other metrics, concurrent validity will be much more challenging. For example, an analysis of the literature for studies of falls and freezing of gait in patients with Parkinson disease found a wide variety of methods, wearables, and sensitivities and specificities.46 Finally, any device used for a trial should have the ability to detect change when a difference exists, such as a chemotherapeutic regimen that increases resting heart rate by 10 beats per minute on average.
In addition to normalizing devices against clinical measures, researchers need guidance in choosing the best devices for a desired metric, and the FDA may provide such guidance. Modeling of error is critical to understanding the expected distributions of data and to looking for outliers. The collection of large amounts of baseline normal data also is important because such data will allow device manufacturers and data analysts to model normal populations in addition to groups whose measurements may fall outside the normal distribution. By understanding the accuracy of the measurements and the expected distribution of the data, researchers can monitor for compliance and use and even validate that the actual study participant is wearing the device. Finally, the informatics methods for analyzing these complex data are just emerging. Although most manufacturers provide aggregate measures, analysis needs to include a more-rigorous approach to these time series data.

Fig 1. Considerations for incorporating mobile health technology into oncology clinical trials. Shown are the steps to take when considering the incorporation of wearables, devices, and/or sensors into a clinical trial:
- Define the objective: patient engagement, ePRO, or data for regulatory submission.
- Categorize the tool: SaMD, MMA, or ePRO.
- Ensure tool validation. SaMD: ensure scientific validity, clinical performance, and analytical validity. MMA: ensure 510(k) approval, scientific and clinical performance, and analytic validity. ePRO: ensure content validity, conduct reliability assessment, and demonstrate concurrent validity and ability to detect change. Not SaMD, MMA, or ePRO: ensure analytic validity and 21 CFR 11 compliance.
- Develop oversight analytics of the vendor: measure service quality and conduct, and develop a customized set of key risk and performance indicators.
Abbreviations: 21 CFR 11, Code of Federal Regulations, Title 21, Part 11, which established the Food and Drug Administration regulations on electronic records and electronic signatures; 510(k), Food and Drug Administration premarket notification of intent to market a medical device; ePRO, electronic patient-reported outcome; MMA, mobile medical application; SaMD, software as a medical device.

Use of mHealth in an Oncology Clinical Trial

If a researcher determines that an mHealth measure fits within the clinical hypothesis, he or she should next consider its value for the study. Value may come from better or otherwise unavailable data or the ability to determine inclusion/exclusion criteria efficiently for eligible patients. Patient acceptance of the wearable or device also is a key consideration. To be effective, patients must wear the device or use the app; therefore, design, battery life, water resistance, and comfort all must be maximized. These patient-focused factors may be even more important for oncology patients who experience fatigue, nausea, anorexia, or other treatment-related adverse effects or events.

After these baseline factors are met, researchers who wish to use mHealth can follow the measurement and regulation principles outlined in Fig 1. Although regulations and recommendations continue to evolve, researchers should communicate with regulators to increase understanding of the use of mHealth tools in clinical trials and to help to reduce the risk of and barriers to incorporating wearables and other mHealth tools. To develop a comprehensive strategy to adopt mHealth technologies, pharmaceutical companies and research centers need to document an adoption plan and commit to the capital and time investment.
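Monitoring whether patients actually wear the device, as discussed above, is usually approached by flagging nonwear time in the activity trace. The sketch below uses a common actigraphy-style heuristic (a long run of zero activity counts is treated as "not worn"); the 90-minute threshold and all data are assumptions for illustration, not values from the article.

```python
# Hypothetical sketch: flagging nonwear time for compliance monitoring.
# Heuristic: any run of >= min_run consecutive zero-count minutes counts
# as time the device was not worn. Threshold and data are illustrative.

def nonwear_minutes(counts_per_min, min_run=90):
    """Total minutes inside zero-count runs of at least min_run minutes."""
    total = run = 0
    for c in counts_per_min + [None]:   # sentinel flushes the final run
        if c == 0:
            run += 1
        else:
            if run >= min_run:
                total += run
            run = 0
    return total

# 3 hours worn, 2 hours off-wrist (zeros), then 1 hour worn again.
day = [30] * 180 + [0] * 120 + [25] * 60
print(nonwear_minutes(day))  # 120
```

Published nonwear algorithms add refinements such as tolerating brief nonzero spikes within a nonwear run, but even this simple rule shows how wear-time compliance can be computed automatically from the raw trace.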
The of the measurements and the expected distri- establishment of a dedicated innovation team bution of the data, researchers can monitor for will aid in the research of all available options compliance and use and even validate that the and validate them internally. actual study participant is wearing the device. Finally, the informatics methods for analyzing In conclusion, the rise of wearables and sensors these complex data are just emerging. Although will be transformative for clinical trials and drug ascopubs.org/journal/cci JCO™ Clinical Cancer Informatics 7 development. Although valid concerns are cited through the development pipeline, whereas non- about accuracy, validity, security, and regulation, efficacious therapies or those with unacceptable regulators, device manufacturers, researchers, adverse effect profiles do not. The collection of and clinical systems likely will address these large amounts of disparate patient data will fuel issues as the speed at which mobile technol- better modeling of diseases and drug responses. ogy permeates health care continues to accel- Combined with the ever-increasing torrents of 50,51 erate. mHealth technologies will provide genomic and other omics data, we will see great researchers with better and more timely data, gains in our ability to diagnose and treat disease. which should decrease trial times and lessen drug development costs. Furthermore, better DOI: https://doi.org/10.1200/CCI.17.00147 algorithms for analyzing data and making infer- Published online on ascopubs.org/journal/cci on ences will ensure that effective drugs proceed June 29, 2018. AUTHOR CONTRIBUTIONS ASCO's conflict of interest policy, please refer to www. asco.org/rwc or ascopubs.org/jco/site/ifc. Conception and design: All authors Collection and assembly of data: Suzanne M. Cox, Ashley Suzanne M. Cox Lane Consulting or Advisory Role: Litmus Health Data analysis and interpretation: Suzanne M. 
Cox, Ashley Lane Ashley Lane Manuscript writing: All authors Employment: Allergan, Litmus Health (I) Final approval of manuscript: All authors Accountable for all aspects of the work: All authors Samuel L. Volchenboum Stock and Other Ownership Interests: Litmus Health AUTHORS' DISCLOSURES OF Consulting or Advisory Role: Accordant Health Services, a POTENTIAL CONFLICTS OF INTEREST CVS Caremark Company, Baxter The following represents disclosure information provided Travel, Accommodations, Expenses: Marcus Evans by authors of this manuscript. All relationships are considered compensated. Relationships are self-held ACKNOWLEDGMENT unless noted. I = Immediate Family Member, Inst = My Institution. Relationships may not relate to the subject We acknowledge the helpful input offered by Bill Byrom, matter of this manuscript. For more information about senior director of product innovation at ICON. Affiliations Suzanne M. Cox and Samuel L. Volchenboum, University of Chicago, Chicago, IL; and Ashley Lane and Samuel L. Volchenboum, Litmus Health, Austin, TX. REFERENCES 1. DiMasi JA, Grabowski HG, Hansen RW: Innovation in the pharmaceutical industry: New estimates of R&D costs. J Health Econ 47:20-33, 2016 2. Prasad V, Mailankody S: Research and development spending to bring a single cancer drug to market and revenues after approval. JAMA Intern Med 177:1569-1575, 2017 3. Fazzari M, Heller G, Scher HI: The phase II/III transition. Toward the proof of efficacy in cancer clinical trials. Control Clin Trials 21:360-368, 2000 4. Khozin S, Blumenthal GM, Pazdur R: Real-world data for clinical evidence generation in oncology. J Natl Cancer Inst 109: 2017 5. Healthcare Information and Management Systems Society: Definitions of mHealth, 2012. http:// www.himss.org/definitions-mhealth 6. US Food and Drug Administration: Clinical Outcome Assessment (COA): Glossary of T erms, 2017. 
https://www.fda.gov/Drugs/DevelopmentApprovalProcess/DrugDevelopmentToolsQualificationProgram/ucm370262.htm
7. Gnanasakthy A, Mordin M, Evans E, et al: A review of patient-reported outcome labeling in the United States (2011-2015). Value Health 20:420-429, 2017
8. Byrom B, Watson C, Doll H, et al: Selection of and evidentiary considerations for wearable devices and their measurements for use in regulatory decision making: Recommendations from the ePRO Consortium, 2017. http://linkinghub.elsevier.com/retrieve/pii/S1098301517335325
9. Duke Margolis Center for Health Policy: Mobilizing mHealth Innovation for Real-World Evidence Generation, 2017. https://healthpolicy.duke.edu/sites/default/files/atoms/files/duke-margolis_mhealth_action_plan.pdf
10. Basch E, Rogak LJ, Dueck AC: Methods for implementing and reporting patient-reported outcome (PRO) measures of symptomatic adverse events in cancer clinical trials. Clin Ther 38:821-830, 2016
11. Anglada-Martínez H, Martin-Conde M, Rovira-Illamola M, et al: Feasibility and preliminary outcomes of a Web and smartphone-based medication self-management platform for chronically ill patients. J Med Syst 40:99, 2016
12. Passardi A, Rizzo M, Maines F, et al: Optimisation and validation of a remote monitoring system (Onco-TreC) for home-based management of oral anticancer therapies: An Italian multicentre feasibility study. BMJ Open 7:e014617, 2017
13. Labovitz DL, Shafner L, Reyes Gil M, et al: Using artificial intelligence to reduce the risk of nonadherence in patients on anticoagulation therapy. Stroke 48:1416-1419, 2017
14. Beck SL, Eaton LH, Echeverria C, et al: SymptomCare@Home: Developing an integrated symptom monitoring and management system for outpatients receiving chemotherapy. Comput Inform Nurs 35:520-529, 2017
15. Enright PL: The six-minute walk test. Respir Care 48:783-785, 2003
16. Galiano-Castillo N, Arroyo-Morales M, Ariza-Garcia A, et al: The six-minute walk test as a measure of health in breast cancer patients. J Aging Phys Act 24:508-515, 2016
17. Capela NA, Lemaire ED, Baddour N: Novel algorithm for a smartphone-based 6-minute walk test application: Algorithm, application development, and evaluation. J Neuroeng Rehabil 12:19, 2015
18. Jehn M, Schmidt-Trucksäess A, Schuster T, et al: Accelerometer-based quantification of 6-minute walk test performance in patients with chronic heart failure: Applicability in telemedicine. J Card Fail 15:334-340, 2009
19. Hilton NP: A quantification of the treadmill 6-min walk test using the MyWellness Key™ accelerometer. J Sport Health Sci 4:188-194, 2014
20. Chowdhury A, Sehgal S, Rabih F, et al: Accuracy of accelerometer (Fitbit® Charge HR™) measured distance and heart rate in patients with pulmonary arterial hypertension. Am Thoracic Soc 195:A3116, 2017
21. Proffitt A: Navigating a connected world: Sensors in clinical trials and care, 2017. http://www.clinicalinformaticsnews.com/2017/04/11/navigating-a-connected-world-sensors-in-clinical-trials-and-care.aspx
22. Bailey T, Bode BW, Christiansen MP, et al: The performance and usability of a factory-calibrated flash glucose monitoring system. Diabetes Technol Ther 17:787-794, 2015
23. Hewson DJ, Duchêne J, Hogrel J-Y: Validation of balance-quality assessment using a modified bathroom scale. Physiol Meas 36:207-218, 2015
24. Abraham WT, Adamson PB, Hasan A, et al: Safety and accuracy of a wireless pulmonary artery pressure monitoring system in patients with heart failure. Am Heart J 161:558-566, 2011
25. Bai J, Sun Y, Schrack JA, et al: A two-stage model for wearable device data. Biometrics https://doi.org/10.1111/biom.12781 [epub ahead of print on October 10, 2017]
26. Buysse DJ, Reynolds CF III, Monk TH, et al: The Pittsburgh Sleep Quality Index: A new instrument for psychiatric practice and research. Psychiatry Res 28:193-213, 1989
27. Case MA, Burwick HA, Volpp KG, et al: Accuracy of smartphone applications and wearable devices for tracking physical activity data. JAMA 313:625-626, 2015
28. Hildebrand M, van Hees VT, Hansen BH, et al: Age group comparability of raw accelerometer output from wrist- and hip-worn monitors. Med Sci Sports Exerc 46:1816-1824, 2014
29. Lee J-M, Kim Y, Welk GJ: Validity of consumer-based physical activity monitors. Med Sci Sports Exerc 46:1840-1848, 2014
30. Modave F, Guo Y, Bian J, et al: Mobile device accuracy for step counting across age groups. JMIR Mhealth Uhealth 5:e88, 2017
31. Anthony CA, Lawler EA, Glass NA, et al: Delivery of patient-reported outcome instruments by automated mobile phone text messaging. Hand (N Y) 12:614-621, 2017
32. Hilts A, Parsons C, Knockel J: Every step you fake: A comparative analysis of fitness tracker privacy and security. Open Effect report, 2016. https://openeffect.ca/reports/Every_Step_You_Fake.pdf
33. Miseta E: Clinical news roundup: Industry seeks FDA guidance on mHealth technologies, 2016. https://www.clinicalleader.com/doc/clinical-news-roundup-industry-seeks-fda-guidance-on-mhealth-technologies-0001
34. Taylor NP: Number of clinical trials to use wearables nears 300, 2015. http://www.fiercebiotech.com/r-d/number-of-clinical-trials-to-use-wearables-nears-300
35. Le Saux O, Falandry C, Gan HK, et al: Changes in the use of end points in clinical trials for elderly cancer patients over time. Ann Oncol 28:2606-2611, 2017
36. Muehlhausen W, Doll H, Quadri N, et al: Equivalence of electronic and paper administration of patient-reported outcome measures: A systematic review and meta-analysis of studies conducted between 2007 and 2013. Health Qual Life Outcomes 13:167, 2015
37. US Food and Drug Administration: Use of electronic records and electronic signatures in clinical investigations under 21 CFR part 11 – questions and answers: Guidance for industry, 2017. https://www.fda.gov/downloads/drugs/guidancecomplianceregulatoryinformation/guidances/ucm563785.pdf
38. US Food and Drug Administration: Digital Health Innovation Action Plan. https://www.fda.gov/downloads/MedicalDevices/DigitalHealth/UCM568735.pdf
39. Gottlieb S: Fostering medical innovation: A plan for digital health devices, 2017. https://blogs.fda.gov/fdavoice/index.php/2017/06/fostering-medical-innovation-a-plan-for-digital-health-devices
40. Dietsche E: Analysis finds 8,000+ security flaws in pacemakers, 2017. https://medcitynews.com/2017/05/security-flaws-pacemakers
41. Rios B, Butts J: Security evaluation of the implantable cardiac device ecosystem architecture and implementation interdependencies, 2017. https://drive.google.com/file/d/0B_GspGER4QQTYkJfaVlBeGVCSW8/view
42. US Food and Drug Administration: Cybersecurity, 2017. https://www.fda.gov/MedicalDevices/DigitalHealth/ucm373213.htm
43. National Institute of Standards and Technology: Framework for improving critical infrastructure cybersecurity, 2017. https://www.nist.gov/sites/default/files/documents/2017/12/05/draft-2_framework-v1-1_without-markup.pdf
44. Evenson KR, Goto MM, Furberg RD: Systematic review of the validity and reliability of consumer-wearable activity trackers. Int J Behav Nutr Phys Act 12:159, 2015
45. Silva de Lima AL, Evers LJW, Hahn T, et al: Freezing of gait and fall detection in Parkinson's disease using wearable sensors: A systematic review. J Neurol 264:1642-1654, 2017
46. McCarthy M: Considerations for the use of wearables in clinical trials, 2017. https://www.lifescienceleader.com/doc/considerations-for-the-use-of-wearables-in-clinical-trials-0001
47. Wolfram T: Wearables: Where is clinical development headed next? 2015. http://www.appliedclinicaltrialsonline.com/print/292165?page=full
48. Zheng K: Wearable devices in clinical trials: The current landscape in pharma, 2017. http://www.prometrika.com/blog/2017/06/29/222
49. Alsumidaie M: A framework to incorporate mHealth, wearables in clinical trials, 2016. http://www.appliedclinicaltrialsonline.com/print/324480?page=full
50. SCORR Marketing and Applied Clinical Trials: mHealth wearables report 2016, 2016. https://www.scorrmarketing.com/wp-content/uploads/SCORR-ACT-mHeatlhWearablesReport-2016.pdf
51. Kotz D, Gunter CA, Kumar S, et al: Privacy and security in mobile health: A research agenda. Computer (Long Beach Calif) 49:22-30, 2016
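Supplementary illustration. The gold-standard comparison described in the text (eg, wearable sleep data evaluated alongside polysomnography) is commonly summarized with Bland-Altman limits of agreement. The sketch below is illustrative only and is not taken from the article: the function name, the simulated total-sleep-time values, and the use of the conventional 1.96 x SD limits are assumptions for demonstration.

```python
import numpy as np

def bland_altman(device, gold):
    """Bland-Altman agreement between a wearable measurement and a
    concurrently measured gold standard (illustrative helper)."""
    device = np.asarray(device, dtype=float)
    gold = np.asarray(gold, dtype=float)
    diff = device - gold          # per-participant disagreement
    bias = diff.mean()            # systematic over/underestimation
    sd = diff.std(ddof=1)         # sample SD of the differences
    # Conventional 95% limits of agreement: bias +/- 1.96 x SD
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Simulated example: total sleep time (minutes) for 8 participants,
# wearable tracker vs polysomnography; values are made up.
psg     = [402, 371, 455, 389, 418, 365, 440, 397]
tracker = [410, 380, 450, 400, 425, 370, 455, 405]

bias, (lo, hi) = bland_altman(tracker, psg)
print(f"bias = {bias:.1f} min, 95% limits of agreement = ({lo:.1f}, {hi:.1f})")
```

A near-zero bias with narrow limits of agreement supports concurrent validity, whereas a large systematic bias suggests the device would need calibration against the reference measure before its output could serve as a trial end point.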