TY - JOUR AU - Mendenhall, G. Stuart AB - Implanted cardiac devices, primarily implantable cardioverter-defibrillators (ICDs), pacemakers, and implantable loop recorders, are capable of recording physiological signals from the body that may reveal health status and a great deal of other information about habits. Never in history has this quantity of physiological and cardiac data been generated and recorded about humans and their behaviours through implanted devices, and as these devices become increasingly inexpensive and indications for use broaden, the generated data will be a valuable source for research, monitoring, and potential commercial use. Additionally, more lives depend on the proper functioning of the devices as they become more connected, with remote access and possible re-programming of functionality. With this functionality and these data stores, the techniques for securing information and protecting patient privacy must be continuously and expertly maintained, as a single point of compromise, whether on the device itself, through its communication protocols, or in online storage, may affect all patients and users of the device. The Muddy Waters case In August 2016, the Muddy Waters LLC investment firm (New York, NY, USA) reported that a subset of St. Jude Medical (Saint Paul, MN, USA) ICDs contained a security flaw and were vulnerable to unauthorized access. These claims were made without notifying St. Jude in advance, although no programmes or direct details of the exploit were provided. Immediate disclosure of a flaw, followed by compromise of a computer system, is often termed a 'zero-day' exploit, referring to the time the manufacturer has had to examine code and issue a repair. St. Jude vigorously denied that the device compromise was significant, citing lack of detail in the demonstration and inconsistencies. Furthermore, the Muddy Waters firm was 'short' St.
Jude Medical stock, and stood to profit from a drop in the value of St. Jude shares. In September 2016, St. Jude filed a lawsuit against Muddy Waters for defamation. Subsequent revelations during litigation showed that the cybersecurity corporation MedSec (Miami, FL, USA) had notified Muddy Waters of the potential security issue, and that no notice was given to St. Jude before the claims were widely made. As part of their defence, Muddy Waters hired an external information technology security firm, Bishop Fox (Tempe, AZ, USA), to conclusively demonstrate that the devices had truly been compromised and to form an expert witness report, giving further details of the device security issue. Security vulnerabilities identified In the expert witness report,1 Bishop Fox outlined the device attack and concluded that the Muddy Waters accusations were accurate as described. The attack used a compromised Merlin@home system monitor to gain authorization and total control over the device, a process known as 'rooting'. In general, electronic devices, whether cellphones, computers, or pacemaker system analysers (PSA), operate in user modes that limit control over the device, both for security purposes and for internal safety, to prevent unintended modification of critical programmes or parameters. However, preserving this limitation of access requires essentially perfect programming; once any hole is found, the protection programmes themselves can be modified and the device is compromised. The security group used the compromised St. Jude Merlin@home unit and a laptop to issue commands to an unmodified St. Jude ICD that allowed them to deliver a shock on the T wave, switch off therapy, or drain the device battery (Figure 1A). The firm noted that this could turn the relatively easily obtainable home monitoring unit, together with a laptop, into a mobile device that could compromise implanted devices, potentially with a lethal outcome.
With modifications, the range of the Merlin@home communication, nominally approximately 10 feet, could be increased (Figure 1B), allowing an attacker to scan a crowd or area to compromise a device from a distance. Figure 1 (A) Direct connection to the Merlin@home transmitter allows compromise of the device's internal security and issue of commands to a stock (unmodified, unpatched) implantable cardiac device. (B) A compromised Merlin device is portable and, with modification to increase transmission range, could be used to scan crowds and issue commands or re-programme vulnerable devices. St. Jude made design choices that generally assumed the devices would not have their internal software compromised, and maintained secret keys common to communication with all devices. They used weakened, but still encrypted, methods of radiofrequency transmission. A real-world analogy is the use of the same key for all houses in a gated community. Individual homes (patient devices) are still locked, and any intruder would have to get past the main gate, but once the key is known it is no longer a barrier to access, and there may be ways around the main gate (rooting the device). A better policy is to maintain individual keys and strong encryption even within areas that typically should not be vulnerable to attack. After initially denying the presence of the security issue, St. Jude released firmware updates that improved the security of affected devices. A Merlin@home update2 was released in January 2017, and new device firmware was released in August 2017.
There is a small but non-zero risk of the update placing the device in a backup safety mode or rendering it unresponsive, and the balance of these dual low risks is left up to the patient and physician. Food and Drug Administration response In April 2017, the United States Food and Drug Administration (FDA) sent a warning letter3 to St. Jude criticizing the inadequate response to a third-party cybersecurity risk assessment from April 2015, which had previously outlined the weak encryption and the vulnerability associated with hard-coded universal locking codes. The letter noted that failure to promptly correct the violations might result in regulatory action being initiated by the FDA. The new device firmware4 pushed to the system analysers contains updates that ensure compliance and upgrade the security to strong encryption. Other incidents Other cardiac device manufacturers have experienced similar attacks on their programmers or remote monitoring systems. At the 'Black Hat' hacker conference in 2018, the security firm WhiteScope (Half Moon Bay, CA, USA) demonstrated compromise of a Medtronic Corporation (Minneapolis, MN, USA) 2090 programmer bought from an internet auction and sales site. In this exploit,5 a network was created that fooled the programmer into thinking it was connecting to the official Medtronic servers, which allowed unsigned, potentially malicious code to be uploaded to the PSA and run. The researchers noted that the 2090 device accepted code as long as it claimed to come from Medtronic and passed a few minimal checks; there was no 'secure signing' that was checked or enforced. The researchers noted that a 'rogue' or badly programmed PSA device could then be re-introduced into a clinical care environment.
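The missing check, accepting any code that merely claims a Medtronic origin, can be contrasted with a verify-before-install discipline. The sketch below is illustrative only: all key material and function names are hypothetical, and for self-containment it uses a symmetric HMAC tag from the Python standard library, whereas a real programmer would verify an asymmetric signature so that the verifying device holds no signing secret itself.

```python
import hashlib
import hmac

# Hypothetical signing key. In practice the vendor keeps a private signing
# key and the programmer stores only the matching public verification key.
SIGNING_KEY = b"vendor-firmware-signing-key"

def sign_firmware(image: bytes) -> bytes:
    # Vendor side: compute an authentication tag over the full image.
    return hmac.new(SIGNING_KEY, image, hashlib.sha256).digest()

def verify_and_install(image: bytes, tag: bytes) -> bool:
    # Device side: refuse to run any image whose tag does not verify.
    # compare_digest performs a constant-time comparison.
    if not hmac.compare_digest(sign_firmware(image), tag):
        return False  # unsigned or tampered code is rejected outright
    # ... install and run the image only after verification ...
    return True

official = b"\x01\x02firmware-v2"
tag = sign_firmware(official)
assert verify_and_install(official, tag)          # genuine update accepted
assert not verify_and_install(b"malicious", tag)  # unsigned code rejected
```

The essential discipline is that verification happens before any byte of the new image executes; a check that can be skipped, or that trusts a self-declared origin, provides no protection.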
In a statement6 most recently released on 28 June 2018, Medtronic acknowledged the possibility of this compromise and elected not to update existing software, given the low overall risk and the requirement of both physical access and sophisticated programming techniques to alter the 2090. They note that all Medtronic programmers should connect only to secure, uncompromised networks for software updates, and thus the overall risk is acceptable. Future devices will have enhanced authentication and code-signing techniques implemented. Lessons learned from device compromises 1. Inter-device communication will be examined and protocols 'sniffed', so strong encryption must be used for any communication of sensitive information over public or readily accessible channels. For the device manufacturer, a major lesson is that the comfort of relatively lax security through proprietary or obscure protocols, an approach termed 'security through obscurity', is increasingly insufficient. As there are more internet-facing portals and more consumer-accessible equipment, there will be more eyes and more opportunity to pop the hood of devices, literally and figuratively, to explore protocols and attempt to decipher communication. This is particularly important for the next generation of implanted devices, which may use consumer or non-proprietary protocols to allow patient access to medical data, where many other devices may easily listen in on the communication. It may seem that strong encryption should always be used in any medical device storing data; however, encryption can make debugging and troubleshooting difficult, may introduce new software bugs, break compatibility, hinder authorized or legitimate access, require updates and software patches, and increase computational complexity and power requirements, and it must be implemented without error to be effective. Where it is not needed, encryption is therefore not universally advised.
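The 'sniffed protocols' point can be made concrete. Full confidentiality requires authenticated encryption of the radio link, which is beyond a short sketch, but the authentication half, a nonce-based challenge-response that a passive eavesdropper cannot replay, can be outlined with the Python standard library. All names and key values here are hypothetical.

```python
import hashlib
import hmac
import secrets

DEVICE_KEY = b"per-device-secret"  # hypothetical; provisioned per device

def challenge() -> bytes:
    # The implanted device issues a fresh random nonce for every session.
    return secrets.token_bytes(16)

def respond(key: bytes, nonce: bytes) -> bytes:
    # The programmer proves knowledge of the key without transmitting it.
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify(key: bytes, nonce: bytes, response: bytes) -> bool:
    return hmac.compare_digest(respond(key, nonce), response)

nonce = challenge()
resp = respond(DEVICE_KEY, nonce)
assert verify(DEVICE_KEY, nonce, resp)            # legitimate session succeeds
assert not verify(DEVICE_KEY, challenge(), resp)  # a sniffed response cannot
                                                  # be replayed against a new nonce
```

Because each session uses a fresh nonce, recording the radio traffic yields nothing reusable, in contrast to a fixed password or command sequence sent in the clear.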
As an analogy in the physical world, rapidly needed hospital supplies are typically not kept behind lock and key, to avoid the inefficiencies of needless authentication and potential delays or denials of access. Areas of already restricted access may have less-than-strong protection, such as a four-digit keylock, but doors leading in from a public city street would have higher security, such as a keycard, a user-specific combination, or both. Devices given to the customer, or with internet-accessible front-ends, are most similar to the public-facing front door in the city, with the continuous potential for, and low barrier to, attempted unauthorized access. 2. Any physical device that is released to the public can be examined and rooted. As best practice, device manufacturers should assume that their products will be examined in detail wherever technically possible once released and widely available. Thus, communication devices such as home monitors should never rely on simple common authentication factors, such as a common key which, once compromised through examination of device memory or other means, negates the security of the whole class of device. Maintaining the security of a device that can be physically examined is very difficult: as an example, Apple Inc. (Cupertino, CA, USA) has thousands of engineers working on its systems with security as a major priority, yet its devices are typically compromised ('jailbroken') within months of release. A single error in the implementation of a security protocol, or a single access vulnerability, can allow unchecked code to run, which opens the entire system to modification. A chain is only as strong as its weakest link.
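The remedy for the 'one key for all houses' flaw follows from this lesson: derive a distinct key for each device from a master secret that never leaves the factory, so that extracting the key from one explanted unit reveals nothing about any other. A minimal sketch, with hypothetical names, using an HMAC-based derivation in the spirit of HKDF:

```python
import hashlib
import hmac

# Hypothetical factory master secret: used once at manufacture and never
# stored on any shipped device, home monitor, or programmer.
FACTORY_MASTER = b"factory-master-secret"

def derive_device_key(serial: str) -> bytes:
    # Each device serial deterministically yields its own 32-byte key;
    # recovering the master from a derived key requires inverting HMAC-SHA256.
    return hmac.new(FACTORY_MASTER, serial.encode(), hashlib.sha256).digest()

key_a = derive_device_key("ICD-000123")
key_b = derive_device_key("ICD-000124")
assert key_a != key_b  # compromise of one unit's key does not expose its neighbour's
```

With this scheme a rooted home monitor exposes at most the keys it was provisioned with, rather than a universal code that unlocks every device in the product family.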
Under the assumption that protocols and key values will become known, standard computer security best practices (strong encryption, non-reuse or absence of master keys, and proper seeding of random cryptographic values) can ensure that compromise of, or complete knowledge of, a single unit does not lead to compromise of an entire family of devices. These techniques are used to maintain secure communication in other high-security enterprises such as banking, and there is no technical barrier to implementation by all major device manufacturers. 3. Any device that has critical functions accessed over an internet-facing or publicly accessible interface must be securely updatable or patchable. As software is unlikely to be perfect when released, continuous and automated patching of home programmers should take place as vulnerabilities are found, as a condition of continued interaction with an implanted device, and detected rooted programmers should be taken offline or completely reinitialized with updates. As in other industries, the updates issued to a device can verify its non-compromised state and then bring it to the latest software; in the same fashion, compromised devices can be identified and taken offline. Additionally, device manufacturers may increasingly choose to rely on well-established software technology specifications and suites, such as Bluetooth LE (Low Energy) (Bluetooth Special Interest Group, Kirkland, WA, USA), for consumer or inter-device connectivity. These protocols are generally well examined, and compromises are discovered early in their life cycle. In the event that a vulnerability in the underlying protocol is uncovered, the ability to patch devices using the protocol, possibly without intervention required from the user, is again essential. 4.
Just as in the physical world, complex security systems often have to choose between increased security and the convenience of clear and facile programming and ready user access to a device. In the case of the Medtronic 2090's lack of code-signing, the security firms WhiteScope and QED Secure Solutions notified Medtronic, which was aware of the potential for code compromise but, after evaluation, judged it an acceptable risk, since exploitation required a compromised network with imitation Medtronic servers. As an analogue, somebody could put on a white coat and a copied name badge and access numerous somewhat restricted areas of a hospital, but even after notification of such an event, putting up strict badge checks and access control may not be worth the overhead and the restriction on legitimate users. 5. Aggressive counterattack on individuals or groups uncovering security vulnerabilities is unlikely to be an optimal use of resources. With the Muddy Waters incident, it is disappointing that potential compromise of a device would be used as an avenue for profit via securities trading and anticipated stock movement, and worrisome that St. Jude was not given forewarning of the vulnerabilities. While alerting manufacturers before the public would certainly be expected of reputable security firms, in this case the exact implementation details of the exploits were not made public, no patients have been reported harmed by the potential compromise, and further detail emerged publicly only as a result of the defence against legal action brought by the manufacturer. Curiosity and the intellectual challenge of examining and decrypting complex systems will provide continued motivation for all hackers, while the monetary gain from holding companies and patient well-being for ransom, or from financial and securities manipulation, will lure more malicious or self-serving operators.
Aggressive counterattacks may instead encourage anonymous release of compromise information, rather than direct approaches to the company, owing to fears of legal or criminal retaliation. Focus should remain on the swift remedy of the vulnerability and the lessons provided by the exploit. Non-malicious, or 'white hat', hackers will generally share the information uncovered with the affected party and allow them to fix the flaws, while true 'black hat' hackers may keep the exploits they uncover and use them for malicious purposes. In between are 'grey hat' hackers, such as the Muddy Waters firm, who performed financial exploitation but did not release the potentially lethal details of the compromise to the public. What should physicians do? There have been no reports of the compromise of any implanted cardiac device being used to harm patients. Using an exploit for patient harm, to destroy or disorganize data, to cause confusion in the medical environment, or for other malicious purposes remains highly unethical and illegal under existing laws and regulations. At this point the risks remain largely theoretical, and compromises require significant overhead, equipment, knowledge, or expense to implement. However, with the increasing ubiquity of devices and technology, these barriers may fall. Physicians who deal with implanted devices should encourage updates of PSA and remote monitoring software, promptly install software updates and patches, and comply with directives issued by companies, with an understanding of the manufacturer-supplied estimates of software update failure. Often the existing risk of device failure or compromise has to be weighed against the risk of upgrade or intervention, particularly when upgrading the implanted device itself; these risks are typically low for the non-pacemaker-dependent or non-critically dependent patient.
Since the absolute risks of action vs. non-action remain extremely low, spending a great deal of time on the risk of upgrade with the typical patient with an implanted device is generally not needed. If a patient is dependent on the pacemaker functions of the device, the extremely small chance of a failed update leaving the device in reversion mode may outweigh the currently exceedingly low theoretical risk of device exploit during the period until device change is indicated. Manufacturers will have to ensure that compromise of consumer protocols does not allow unfettered access to critical device functions, and will have to maintain an infrastructure for patching and device updates, in order to prevent a potential widespread vulnerability. Conflict of interest: G.S.M. has consulted for Medtronic and Biotronik. He has equity in Grektek, LLC, which manufactures arrhythmia detectors. Footnotes The opinions expressed in this article are not necessarily those of the Editors of Europace or of the European Society of Cardiology. References 1 MedSec St. Jude Expert Witness Report. https://medsec.com/stj_expert_witness_report.pdf (20 September 2018, date last accessed). 2 St. Jude Medical Cybersecurity Update. https://www.sjm.com/en/patients/arrhythmias/resources-support/cyber-update (20 September 2018, date last accessed). 3 US Food & Drug Administration. https://www.fda.gov/ICECI/EnforcementActions/WarningLetters/2017/ucm552687.htm (20 September 2018, date last accessed). 4 US Food & Drug Administration Firmware Update to Address Abbott Cybersecurity Vulnerabilities. https://www.fda.gov/medicaldevices/safety/alertsandnotices/ucm573669.htm (20 September 2018, date last accessed). 5 Department of Homeland Security Cyber Emergency Response Team. https://ics-cert.us-cert.gov/advisories/ICSMA-18-058-01 (20 September 2018, date last accessed). 6 Medtronic Security Bulletin Vulnerability Summary CareLink 2090 Programmer.
http://www.medtronic.com/content/dam/medtronic-com/us-en/corporate/documents/REV-Medtronic-2090-Security-Bulletin_FNL.pdf (20 September 2018, date last accessed). Published on behalf of the European Society of Cardiology. All rights reserved. © The Author(s) 2018. For permissions, please email: journals.permissions@oup.com. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model) TI - The challenges of implanted cardiac device security: lessons from recent compromises JF - Europace DO - 10.1093/europace/euy264 DA - 2019-04-01 UR - https://www.deepdyve.com/lp/oxford-university-press/the-challenges-of-implanted-cardiac-device-security-lessons-from-53Kr16O1Ge SP - 535 VL - 21 IS - 4 DP - DeepDyve ER -