Sheppard, Maria K.

Abstract

mHealth, the use of mobile and wireless technologies in healthcare, and mHealth apps, a subgroup of mHealth, are expected to result in more person-focussed healthcare. These technologies are predicted to make patients more motivated in their own healthcare, reducing the need for intensive medical intervention. Thus, mHealth app technology might lead to a redesign of existing healthcare architecture, making the system more efficient, sustainable, and less expensive. As a disruptive innovation, it might destabilise the existing healthcare organisation through a changed role for healthcare professionals, with patients accessing care remotely or online. This account coincides with the broader narrative of National Health Service policy-makers, which focusses on personalised healthcare and greater patient responsibility with the potential for significant cost reductions. The article proposes that while the concept of mHealth apps as a disruptive technology and the narrative of personalisation and responsibilisation might support a transformation of the healthcare system and a reduction of costs, both are dependent on patient trust in the safety and security of the new technology. Fostering trust in this field may only be achieved with the application of traditional and other regulatory mechanisms, and with this comes the risk of reducing the effect of the technology’s disruptive potential.

Keywords: Disruptive innovation; Health apps; Patient self-management; Safety and security regulation; Transformation of existing healthcare architecture; Trust

I. INTRODUCTION

Since 2008, when the Apple iTunes App Store was launched, the number of mobile software applications has expanded exponentially.
This has also happened in the healthcare field with both inbuilt smartphone apps and downloaded mHealth apps,1 which include lifestyle and well-being apps and medical (device) apps.2 A figure of 350,000 health apps available worldwide from the major app stores of Apple, Google, and Windows/Microsoft has been cited recently.3 Needless to say, apps generally, as well as health apps, have evolved to play an increasingly important role in everyday life.4 mHealth apps can be patient/consumer-facing apps or those which target healthcare professionals, providing them with quicker access to patient information and triaging, patient monitoring, and medical information.5 This article will focus predominantly on the former apps with their emphasis on patient participation (eg apps supporting self-testing for patients on warfarin, diabetes self-help apps, apps tracking asthma medication, etc) as the apps targeted at healthcare professionals are subject to different legal and ethical considerations.6 mHealth apps and mobile health technologies generally have been predicted to result in more patient-focussed healthcare, with medicine becoming more personalised and patients/consumers more self-motivated, reducing the need for hospitalisation.
This innovative technology would not only transform people’s lives but could also lead to a redesign of existing healthcare infrastructures and of the role of healthcare professionals, with the potential for cost savings that make healthcare more efficient and sustainable.7 In effect, it has been suggested that this is due to its function as a disruptive innovation, which supports transformation in current healthcare practices while reducing the financial pressure on healthcare systems.8 At the same time, this account coincides with one of the policy narratives of the UK National Health Service (NHS), the narrative of personalised healthcare making patients/consumers more engaged in and responsible for their own health, with the potential for a significant reduction in healthcare costs.9 The article proposes that while the concept of health apps as a disruptive technology and the narrative of personalisation and responsibilisation might support a change of the healthcare system and a reduction of healthcare costs, both are dependent on trust, namely people’s trust in the safety and security of this new technology. The article proceeds as follows: it will discuss the thesis of disruptive innovation as propounded by Christensen in terms of the disruptive potential of ‘health’ apps. Next, the overlap of Christensen’s thesis with the policy of personalisation and responsibilisation of the patient in the UK NHS in terms of institutional change and envisaged cost reductions is considered. The article then discusses the current laws and regulations around the new app technology regarding user safety and data security, which potentially affect their uptake, and suggests that strategies of building patient trust could help neutralise the uncertainties around technological innovation.
It concludes that tension exists between the objective of a transformation of healthcare envisaged by both the concept of disruptive innovation and the policies of the NHS, and a regulatory environment conducive to encouraging patient trust. There is a need for balance, as this trust may be gained at the expense of the forces for change when traditional and other regulatory mechanisms are applied, reducing the effect of this technology’s disruptive potential.

II. mHEALTH APPS AND THE CONCEPT OF DISRUPTIVE INNOVATION

There is a general consensus in the UK, as in other developed countries, that current healthcare costs are unsustainable. In many developed countries, ageing populations and increasing chronic disease, combined with rising expectations, have led to escalating healthcare costs.10 Coupled with these factors is an expected mismatch between demand for primary care and the supply of doctors as baby boomers enter retirement.11 Efficiencies could be achieved through the opportunities created by new technologies which have a disruptive potential,12 while at the same time enabling person-centred care and patient choice, leading to a reduction in cost and in the need for primary care medicine specialists, especially for the treatment of chronic disease.13 The theory of disruptive innovation was developed by Christensen and others in the context of business administration in order to describe how successful products are challenged and replaced by newer and cheaper entrants to a market because existing firms seem unable to respond quickly enough.14 This theory has been transposed to the healthcare sector to help understand what inhibits innovation in a sector where the more dominant players are focussed on creating innovations aimed at more profitable, high-end customers.15 Although a paper of this length cannot do Christensen’s theory full justice, one of his arguments is that ‘existing companies have a high probability of beating
entrant attackers when the contest is about sustaining innovations’ but they ‘almost always lose to attackers armed with disruptive innovations’.16 Hence, disruptive innovation occurs when technology makes a more affordable or accessible service available for a new population of consumers at the lower end of the market,17 namely ‘[it] enable[s] a larger population of less-skilled, less-wealthy people to do things in a more convenient, lower-cost setting, which historically could only be done by specialists in a less convenient setting.’18 One might then argue that in the healthcare sector such disruptive technologies lead to the destabilisation of existing infrastructure and healthcare organisations through a changed role for healthcare professionals, with patients accessing care remotely or online, helping to reduce the high costs of existing health services. Smartphones and mHealth apps appear consistent with this disruptive innovation model.19 mHealth apps are a subgroup of mHealth, namely mobile technologies such as smartphones, patient monitoring devices, personal digital assistants, and other wireless devices for healthcare purposes.20 The defining characteristic of mHealth is that it is patient-facing; that is, patients interact directly with mHealth hardware and software, frequently without the direct involvement of conventional healthcare providers.21 As pillars of the mHealth paradigm, smartphones and mobile apps are emerging as powerful tools for health information transfer, able to support medical practice, health information delivery, patient screening, monitoring of physiological signs, direct care, and patient education.
They could thus support the delivery of high-quality care and enable more accurate diagnosis and treatment.22 As software equivalents of physical products and services, these mobile apps are aimed at convenient and decentralised care; given their potential to replace the healthcare professional with the patient, it has been suggested that they are a disruptive innovation.23 These patient-facing mHealth apps are likely to change the traditional face-to-face doctor–patient relationship at the core of medical practice and need to be distinguished from apps which are aimed at healthcare professionals, nurses, physicians, and carers. Unlike in the current healthcare infrastructure, where patients have to visit the healthcare professional, either in a General Practitioner (GP) practice or in a hospital setting, and their data are entered into a database, with mobile patient-facing apps it would be the patient entering the data, with no need or a reduced need to visit a fixed healthcare environment. It is likely therefore that these apps will destabilise the healthcare architecture and infrastructure, with less location-based interaction between healthcare provider and patient.24 At the same time, they are likely to enable more convenient and cheaper access to healthcare, with information and advice immediately and directly available to the patient/consumer. Thus, these ‘apps qualify as disruptive with their initial undershooting, low price, and convenience’.25 They are likely to encourage individuals to take greater responsibility for their health, with an ensuing greater emphasis on prevention rather than cure.
Technological disruption occurs through the replacement of ‘the medical professional with a far lower skilled, but much cheaper, caregiver and stakeholder—the patient’.26 In addition, mHealth apps are also disruptive because the app developers are not the current healthcare incumbents but tend to be small start-ups,27 and the incumbents do not own the platforms and app stores on which the apps are built. According to the European Commission’s Green Paper on mobile health, the app market is dominated by individuals or small companies, with 30% of mobile app developers being individuals and a further 34.3% small companies with two to nine employees.28 The Commission supports these app developers through its Digital Agenda for Europe with a series of entrepreneurship initiatives under ‘Startup Europe60’, a platform for tools and programmes for people who want to create web start-ups in Europe.29 With the rapid growth in the numbers of health apps, the question arises whether the disruptive force of the app technology might be stymied in future. The success of disruptive innovation depends on factors such as the regulation encountered by the technology. While the US market for app technology has been growing in a relatively unregulated space,30 this is true to a lesser extent in the European market. Any regulation of new technology can be problematic since traditional regulatory frameworks tend not to fit well and may slow down development.31 Regulatory control of new technologies, such as of their safety and security, may lead to an increase rather than a decrease in healthcare cost, thereby losing one of the advantages of disruption. Essentially, as will be discussed in more detail below, in the UK the safety regulation for apps depends on whether the app is identified as a medical device, in which case it is regulated by the Medical Device Regulation (MDR),32 or whether it is a non-medical app, a distinction which is by no means always clear.
Regarding the security regulation for apps, this is governed by the General Data Protection Regulation (GDPR);33 the regulation differs between health data and non-health-related personal data and imposes a considerable burden on the small start-up app developer. One might therefore conclude that the legal framework is devised with the incumbent stakeholders rather than small start-ups in mind, with new technology adding to the existing regulatory system and controls, resulting in increased costs.34 Therefore, rules designed to safeguard patients/consumers may have the unintended consequence of maintaining the status quo and hindering lower-cost technology from reaching the healthcare sector.35

III. PERSONALISATION AND RESPONSIBILISATION NARRATIVE OF UK POLICY-MAKERS

mHealth apps are not only apt candidates for proponents of disruptive innovation theory; they also feed into the personalisation and responsibilisation narrative of policy-makers in the UK, concepts not only of healthcare that is tailored to a person’s specific characteristics and to his choices but also of healthcare where more responsibility is given to individuals rather than medical professionals.36 The more person-centred care supported by health apps, with the patient’s direct access to healthcare services regardless of time and place, empowers patients and their families to self-care and is suitable to help address both chronic and lifestyle-related diseases.37 The NHS Apps Library, for example, lists apps helping patients take control of their diabetes, heart disease, asthma, COPD, obesity, and many other conditions.38 Personalisation of healthcare or patient-centred healthcare is a concept supported by UK healthcare policy-makers as giving the user more choice and autonomy over his treatment.39 This is in contrast to the more paternalistic, top-down approach.
‘With personalisation, the focus is on individuals and their idiosyncratic needs, values, and preferences rather than on groups and their common needs.’40 It is a concept also referred to in several government White Papers such as High Quality Care for All41 and Liberating the NHS: Greater Choice and Control.42 The former White Paper states that people ‘expect not just services that are there when they need them, and treat them how they want them to, but that they can influence and shape for themselves’,43 whereas the latter describes personalised healthcare as being ‘about engaging people in making choices about how they want to manage their care’.44 The development of health apps falls squarely within the theme of personalised healthcare and self-management in the English NHS. The Green Paper on mobile health by the European Commission also suggests that apps may be one of the tools contributing to more patient-focussed healthcare, supporting the changed role of patients from a passive to a more participative one.45 Thus, technology-based solutions are expected to support self-management, improving people’s well-being and quality of life, and can contribute to the empowerment of patients as they can manage their health more actively.46 Empowerment, in turn, has been defined as ‘the discovery and development of one’s inherent capacity to be responsible for one’s own life … as helping patients discover and develop the inherent capacity to be responsible for [their] own life’.47 At the same time, personalisation of healthcare, or involving patients in the choices to be made regarding their treatment, illustrates the double-edged character of this development: on the one hand, individuals are accorded more power; on the other, they are obliged to take responsibility for that choice and its outcomes.48 Policy-makers, in return for facilitating greater choice and personalised healthcare, support the idea of self-management by holding patients accountable for their choices.
In this light, the government White Paper Liberating the NHS: Greater Choice and Control describes the concept of ‘responsibilisation’ as patients taking more responsibility for their health and treatment choices49 and building ownership of, and a shared responsibility for, managing their conditions, especially when lifestyle changes may be needed.50 Likewise, the government White Paper High Quality Care for All speaks of patients who are empowered by choice as being more likely to take responsibility.51 This emphasis on assuming responsibility for the management of one’s health and healthcare is also encapsulated in the NHS Constitution for England: ‘Please recognise that you can make a significant contribution to your own, and your family’s good health and well-being and take some personal responsibility for it.’52 The personalisation agenda therefore seems to promise social justice by ‘liberating service-users from top-down, paternalistic control, transforming them from passive recipients of ‘care’ into active agents of their own well-being.’53 This policy narrative of patient responsibilisation is only to some extent grounded in the idea of individuals actively managing lifelong health and wellness. Rather, it can also be explained in the light of political concerns about how the escalating costs of the NHS, which are still at the forefront of the political debate, are to be managed.54 The responsibilisation of the patient can thus also be viewed as a strategy with a specific political objective: to reduce costs in the NHS.
Changing the role of patients from a rather passive to a more participative one and making them take responsibility for their health, it is thought, will reduce hospital admissions of patients with long-term chronic conditions and might therefore lessen the cost of NHS healthcare.55 The narrative of personalisation and responsibilisation is not only a strategy of fiscal prudence but may also be used by policy-makers as a strategy of destabilisation, to disrupt the entrenched institutional architecture of the NHS and to encourage reform.56 It might even be argued that a narrative of personalisation and responsibilisation of the empowered patient/consumer has a greater chance of garnering consensus around governments’ reform programmes, lending them legitimacy.57 Just as mHealth apps support the model of disruptive innovative technologies, they can also be viewed as coinciding with the personalisation and responsibilisation agenda of policy-makers; they too shift responsibility to the patient, potentially reducing the costs of healthcare and leading to a disruption of the structure of the NHS.
Whether considered from the perspective of health technologies or from that of the policy narrative used by policy-makers, mobile health technologies such as health apps can thus be regarded not just as ‘add-ons’ but as enablers of real system change.58 This theory is also expressed in the European Union (EU) Green Paper on mobile health, namely that empowering the patient, and the consequential shift towards more patient-centric care, may require a redesign of existing infrastructures and healthcare organisations currently organised around healthcare professionals.59 Not only would the role of healthcare professionals change, as they may have to monitor patients remotely; other factors further encouraging system change are the rapid advance in mobile technologies and apps, the increase in opportunities for the integration of mobile health into existing eHealth services, and the continued growth in coverage of mobile cellular networks.60 There is clearly a need for change because health systems are under increasing pressure to perform under multiple health challenges, chronic staff shortages, and limited budgets, so that the potential benefit from the opportunities mobile technologies represent, at relatively low cost, is welcome.61 Whether or not the desired system transformation will occur on the basis of disruptive innovation depends on the success of the new technology. As already mentioned above, that success may be hindered by the regulation encountered by the new technology. Two examples of how health app technology may be hampered by regulation are the legal rules concerning safety and security, which may slow down adoption in the healthcare sector. Both safety and security regulation are governed by EU regulations and directives transposed into English national law.

IV. THE REGULATION OF HEALTH APPS AND THE IMPLICATIONS FOR SAFETY

Whether or not a mHealth app is subject to an extensive legal framework depends on its classification as an active medical device.
Only this classification brings the specific safety and performance requirements of medical device law into play, such as the involvement of notified bodies and the certification mark (CE marking) declaring that the product meets EU standards for health, safety, and environmental protection. Apps designated as wellness or fitness apps (the majority), by contrast, are not subject to the same requirements. It is further debatable whether the latter are covered by any specific safety and performance regulations at all. This is because wellness or fitness software apps are unlikely to be governed by the General Product Safety Directive (GPSD),62 which targets tangible products.63 Specifically, the GPSD establishes a general obligation for manufacturers to ensure that their products, where these do not fall within the scope of sectoral legislation or national regulations covering product safety, are manufactured in compliance with its safety requirements.64 The directive appears to exclude software by its reference to tangible products, such as product characteristics, product composition, packaging, instructions for assembly,65 producers, manufacturers, and distributors.66 According to the Commission Staff Working Document on the Existing EU Legal Framework Applicable to Lifestyle and Wellbeing Apps, the GPSD applies to manufactured products, and it is therefore not clear whether it applies to lifestyle and well-being apps.67 Even if the GPSD does apply to apps, the level of safety requirements remains unclear. Under the GPSD, a safe product is interpreted as one that ‘under normal or reasonably foreseeable conditions of use does not present any risk or only the minimum risks compatible with the product’s use, considered to be acceptable’. Yet, it is uncertain what would constitute an acceptable risk regarding normal product usage.
Furthermore, it is not clear if, and to what extent, lifestyle and well-being apps could pose a risk to citizens’ health,68 one of the main objectives of the GPSD being to ensure the protection of the health and safety of consumers.69 While the lack of safety regulations for software such as wellness and fitness apps turns on the definition of the term ‘product’, medical device law recognises specific standalone software as a medical device. Even prior to the advent of apps, the original Medical Device Directive (MDD) (93/42/EEC) mentioned software and defined a medical device as being hardware with or without a software element.70 More recently, the amended MDD classified certain standalone software as an active medical device: any instrument, apparatus, appliance, software, material or other article, whether used alone or in combination, including the software intended by its manufacturer to be used specifically for diagnostic and/or therapeutic purposes and necessary for its proper application…71 (emphasis added) Under the new MDR,72 applicable from 26 May 2020, this classification has undergone little change regarding the inclusion of software in the definition of medical device. The MDR, however, does include implants and reagents as additional devices, which were formerly covered by separate directives.73 A major conundrum is the fact that the legal distinction between a medical device app and a lifestyle or well-being app is not easy to make.
The MDD, as amended (Council Directive 93/42/EEC of 14 June 1993 concerning medical devices), emphasises two aspects in its definition of software as a medical device,74 which the MDR largely follows.75 Under both sets of rules, the software, ie the health app, must be intended by its manufacturer to be used specifically for diagnostic and/or therapeutic purposes and must be necessary for its proper application, intended by the manufacturer to be used for human beings for one or more of the following purposes (the MDR slightly extends these purposes; its additions are ‘prediction, prognosis’ and ‘disability’):

— diagnosis, prevention, monitoring, prediction, prognosis, treatment or alleviation of disease,76
— diagnosis, monitoring, treatment, alleviation of or compensation for an injury or handicap (disability),
— investigation, replacement or modification of the anatomy or of a physiological process or state,

… and which does not achieve its principal intended action in or on the human body by pharmacological, immunological or metabolic means, but which may be assisted in its function by such means. There are therefore two essential elements to the definition of standalone software as a medical device, namely the objective element of the medical purpose that the device needs to fulfil and the subjective element of the manufacturer’s intent to produce a device to serve a medical purpose. It follows that any software which does not fulfil these two elements would be classified as a mere consumer product and not a medical device. This is also supported by the MEDDEV guidance 2.1/6,77 which refers to the MDD, stating that ‘software in its own right, when specifically intended by the manufacturer to be used for one or more of the medical purposes set out in the definition of a medical device, is a medical device.
Software for general purposes when used in a healthcare setting is not a medical device’.78 Regarding the objective element of the medical purpose which the software has to fulfil, the list of medical purposes is quite broad and allows different interpretations in individual cases. However, the MEDDEV and the UK MHRA guidelines,79 although generally considered non-binding,80 provide assistance in making the distinction between a medical app or device and a non-medical or wellness and fitness app. Both provide a decision tree with examples to aid developers. Accordingly, an app is most likely not a medical device if it is for patient education; monitors fitness/health/well-being; is for professional medical education; stores or transmits medical data without change; has only an administrative function, eg software that is used to book an appointment, request a prescription, or have a virtual consultation; provides reference information to help a healthcare professional use their knowledge to make a clinical decision; or stores data in databases or as simple data.81 MEDDEV provides some examples of software which could be considered a medical device, namely software intended ‘to create or modify medical information or, if any modifications of the medical information are made, to facilitate the perceptual and/or interpretative tasks performed by the healthcare professional’.
Software for the benefit of individual patients, if intended ‘to be used for the evaluation of patient data to support or influence the medical care provided’ to that patient, could be medical software, but not if it ‘refers to aggregate population data, provides generic diagnostic or treatment pathways, scientific literature, medical atlases, models and templates as well as software for epidemiologic studies or registers’.82 A borderline manual serves as a further tool for case-by-case applications of Community legislation by the Member States.83 However, although the Member States themselves determine the classification of an app as a medical device, the EU Commission can, on request by a Member State and after consultation with the Medical Device Coordination Group, decide whether a specific device (or app) falls within the medical device legislation.84 It becomes apparent that, with many apps currently available and designed for a wide range of health-related purposes, including diet and exercise, pregnancy, mental health, and self-monitoring of conditions, there will be considerable uncertainty in the definition. While apps would not be classified as medical devices if they do not perform an action on data, and hence would not be subject to any safety regulations, they do target users’ health and may provide incomplete, misleading, or wrong advice.85 A Dutch study found that approximately 20% of all apps investigated by the authors should have been classified as medical devices using the decision rules of the current MDD because they dealt with medical decisions or measurements.86 Further ambiguity may arise when, during a regular update, apps that were not originally classified as medical devices ought to be re-classified because of the addition of new features. This reclassification would then require the involvement of notified bodies to obtain market authorisation for the updated app.
The second component necessary for the determination of standalone software as a medical device is the subjective element, namely the manufacturer’s or app developer’s intent. The subjective intent is relevant for the definition of a medical device under both the MDD and the MDR. However, under the MDR there may arguably be a reduced emphasis on the app developer’s intent. This is because the wording used by the manufacturer in the clinical evaluation,87 rather than only the data supplied on the label, in the instructions for use, or in promotional or sales materials or statements as with the MDD,88 is included in the assessment of intended purpose. Unsurprisingly, the subjective element as part of the definition of standalone software as a medical device has come under scrutiny by the Court of Justice of the European Union (CJEU). The case of Brain Products89 concerned the product ActiveTwo, a biopotential measurement system manufactured by BioSemi that recorded electrical signals from the brain, heart, and muscles. According to BioSemi, its product was not designed for use in any medical diagnosis or treatment but was used by researchers carrying out clinical investigations, particularly in the cognitive sciences. Brain Products marketed a similar product for clinical purposes which had a CE marking in accordance with the MDD. It alleged that, regardless of its intended use, the system manufactured by BioSemi must be regarded as a medical device for the purposes of the directive and, accordingly, must be certified as such.
The CJEU determined that the fact that a device not explicitly conceived as medical can be put to a medical use is not sufficient to attribute medical device status, as the manufacturer’s explicit intent remains a leading factor.90 Thus, although the system was capable of recording electrical signals from the human body, such as in electroencephalograms (EEGs) and electrocardiograms (ECGs), and such measurements are frequently taken in a healthcare context, the product in question was not designed for the medical sector. Further, the related promotional material explicitly stated that it was not designed to be used for diagnosis and/or treatment and, therefore, it did not require certification.91 It should be pointed out that the CJEU’s determination contrasted with the prior opinion of the Advocate General, who had suggested that even if the information provided by the manufacturer is the key factor in determining whether a product is intended to be used for a medical purpose, any product which, by its very nature, is clearly intended to be used solely for a purpose of a medical nature will have to be regarded as a medical device.92 The problem the Advocate General seemed to have had in mind is that unscrupulous app developers could disclaim that their apps are intended for medical use, despite their apparent medical purpose, in order to avoid the stricter regulation of their products under the MDD as amended by Directive 2007/47/EC.93 Since medical device law is harmonised EU legislation valid in all Member States, excluding its applicability by using a disclaimer as to medical purpose would enable considerable cost savings. Specifically, there would then be no need for the prerequisite CE marking under the MDD. To what extent the caveat pronounced by the Advocate General, that the manufacturer must not act arbitrarily when using the disclaimer,94 might become an issue in future remains to be seen.
The role of CE marking under the MDD was a consideration in the case of Snitem.95 The CJEU reiterated that software only constitutes a medical device for the purposes of the MDD if it satisfies the two cumulative conditions96 set out in the Brain Products case: the objective element regarding the medical purpose and the subjective element. The CJEU further stated that once a product has received the CE marking, no further certification in any Member State is necessary because, once the CE marking of conformity has been obtained, the product, having regard to that function, may be placed on the market and circulate freely in the EU without having to undergo any additional procedure.97 Interestingly, the CJEU referred to its interpretation of the MDD as being based on MEDDEV,98 thus in effect elevating these guidelines to legal principles. Accordingly, the CJEU found that these guidelines indicate that software also constitutes a medical device where it is intended to create or modify medical information, in particular, by means of calculation, quantification, or comparison of the recorded data against certain references, in order to provide information about a particular patient.99 In contrast, software that only performs an action limited to storage, archiving, or simple search, functioning as a digital library that makes it possible to find information from metadata without modifying or interpreting it, should not be considered a medical device.100 A further point of interest in Snitem is the reference to software composed of modules, some of which are medical devices and some of which are not. In that regard, the Commission’s guidelines, mentioned in paragraph 33 of the judgment, confirm in essence, in Title 4, entitled ‘Modules’, that where software is composed of modules which satisfy the definition of a ‘medical device’ and others which do not, only the former must bear the CE marking; the others are not subject to the provisions of the directive. 
Those guidelines state that it is the responsibility of the manufacturer to identify the limits and interfaces of the different modules which, in the case of modules subject to Directive 93/42, must be clearly identified by the manufacturer and based on the use which will be made of the product. A. Classification of Devices and CE marking Generally, by affixing the CE marking to a product, a manufacturer declares that the product conforms with the relevant European legislation101 and can be sold throughout the EU. CE marking signifies that a product has been assessed to meet high safety, health, and environmental protection requirements. The safety requirements for active medical devices generally encompass a need for clinical performance studies as evidence that their safety and performance are proportionate to the risk associated with a given device. Since some medical devices are deemed to pose a higher risk than others, devices are split into four classes (Classes I, IIa, IIb, and III, with Class I being the lowest risk category).102 The conformity assessment procedures for Class I devices can be carried out under the sole responsibility of the manufacturers in view of the low level of vulnerability associated with these devices, whereas any other class requires the involvement of a notified body.103 Classification is decided upon by establishing the applicable rule within Annex IX of the Council Directive.104 Under the current MDD, most mHealth apps defined as medical devices qualify as Class I devices and, as a consequence, do not need to be certified by a notified body, which may potentially affect patient safety. 
However, the new MDR has consequences for the risk classification of medical device apps, as they will be classified in risk classes based on different rules, with many apps that are considered medical devices being classified in higher risk classes.105 Self-certification of medical device apps by developers will therefore become less common, and notified bodies will need to be involved to conduct the assessment of the apps. The classification rules applying to software intended to provide information which is used to take decisions are contained in Annex VIII of the MDR. The correct classification of a medical device app under these rules is of paramount importance. Rule 10 of the annex provides that if devices are intended for diagnosis and monitoring, they are classified as Class IIa unless ‘they are specifically intended for monitoring of vital physiological parameters and the nature of variations of those parameters is such that it could result in immediate danger to the patient … in which cases they are classified as class IIb’.106 Where apps are intended to provide information which is used to take decisions with diagnostic or therapeutic purposes and such decisions have an impact that may cause death or an irreversible deterioration of a person’s state of health, they are classified in Class III; where the decision leads to a serious deterioration of a person’s state of health or a surgical intervention, they are classified in Class IIb.107 Under these rules, therefore, not only are apps much less likely to fall under the self-certified risk Class I, but apps with the most serious potential consequences may even be classified in Class III. 
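Viewed schematically, the classification cascade for decision-support software just described can be expressed as a simple decision procedure. The following sketch is purely illustrative: the function name and boolean inputs are hypothetical conveniences, not drawn from the Regulation’s text, and the sketch assumes only the two-step cascade set out above (death or irreversible deterioration yields Class III, serious deterioration or surgical intervention yields Class IIb, Class IIa otherwise).

```python
# Illustrative sketch only: a hypothetical encoding of the classification
# cascade for software providing information used to take diagnostic or
# therapeutic decisions (MDR, Annex VIII). Not legal advice or an official tool.
def classify_decision_software(may_cause_death_or_irreversible_harm: bool,
                               may_cause_serious_harm_or_surgery: bool) -> str:
    """Return the risk class for decision-support software under the cascade."""
    if may_cause_death_or_irreversible_harm:
        return "Class III"   # most serious potential consequences
    if may_cause_serious_harm_or_surgery:
        return "Class IIb"
    return "Class IIa"       # the default for decision-support software
```

On this sketch, an app whose output could at worst prompt a surgical intervention would fall into Class IIb, while most decision-support apps would default to Class IIa; in no case does the cascade reach the self-certified Class I.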
Thus, regarding patient safety, the difficulties concern not only the classification of an app as a lifestyle/general-purpose app or a medical device app and, if classified as the latter, the complexities of objective medical purpose and the subjective intent of the developer, but also the analysis of the risk class in which the app is to be placed and whether self-certification is sufficient or a notified body will need to be involved. In contrast, as will be discussed next, matters regarding the security of app data are subject to different considerations: they do not turn on whether the app is a general-purpose or medical device app but depend on the type of data involved. V. THE REGULATION OF SECURITY As the EU Commission’s Green Paper on mHealth acknowledged as early as 2014, the rapid development of mHealth apps raises concerns about the appropriate processing of the data collected through apps by app developers, health professionals, public authorities, etc.108 Health apps, whether for a general or a medical purpose, collect and process large quantities of information. App users are rightly worried about the risks posed to their information, such as unwanted sharing with third parties, since this information will in many instances consist of personal data relating to a person who is directly or indirectly identified or identifiable. The processing of data concerning health is particularly sensitive and therefore requires special protection to ensure its security. Data protection law under the GDPR109 is called upon to regulate not the presence and use of health apps but the personal data processing performed by those apps. 
In order to help increase and promote trust amongst users, a Privacy Code of Conduct on mobile health apps (the Code) was facilitated by the EU Commission in 2016; it remains in draft form, as it has not yet been approved.110 Because of the sensitivity of health data, the Code will apply the criteria of the GDPR such as user consent, data minimisation, privacy by design and by default, etc. A. Processing of Personal Data The purpose of the Code is to foster trust amongst users of mHealth apps as they process personal data that include data concerning health. Therefore, an mHealth app must provide users with clear and prominent information about how their data will be used to help them make informed decisions prior to using an app. The GDPR itself applies to all apps processing personal data, whether or not those data concern health. In accordance with the Code, personal data include information on the user such as their name, device identifiers, location data, and any other information relating to an identified or identifiable natural person, whereas health data must be understood as any personal data related to the past, present, or future physical or mental health of an individual, including the provision of healthcare services, which reveal information about a person’s health status.111 Lifestyle data, such as data on an individual’s habits and behaviour that do not inherently relate to that individual’s health, are not necessarily considered data concerning health. Difficulties of interpretation are likely, as lifestyle data can be qualified as data concerning health when they have a clear and close link to the person’s health status.112 In the context of the GDPR, app developers may experience difficulties in determining whether their app processes merely lifestyle data or data concerning health. 
Thus, unlike with respect to app safety, where the developer needs to determine whether the health app is a lifestyle or a medical device app in order to avoid contravention of the medical device legislation, with respect to app security an app correctly identified as a lifestyle app may still process health data. An example would be a lifestyle app collecting data such as the heart rate or the pulse of an app user. Conversely, if an app is correctly identified as a medical device app, it will not solely contain lifestyle data as, by definition, the intended purpose of the app would then be medical and require health data input. The GDPR clearly places the emphasis on the increased rights of the individual and on the obligations on controllers113 and processors,114 thus enhancing data security for the end-user. In contrast, the medical device legislation errs in favour of the app developer with its emphasis on the medical purpose of the app and the subjective intent of the app developer, with a potentially negative impact on the safety of the end-user. Furthermore, unlike the more restrictive regulatory framework around medical devices, the GDPR applies whether the data concern health or other types of personal data. Thus, data protection law must be respected whatever type of personal data are being processed. 
Such processing of personal data must be carried out subject to the principles of purpose limitation, data minimisation, accuracy, storage limitation, integrity and confidentiality, and accountability.115 In particular, the GDPR read together with the Code emphasises the need for app developers to follow certain principles when designing apps, including data protection by design and by default.116 The former means that the privacy implications of the app and its use have been considered at each step of its development,117 and the latter that wherever the user has a choice with respect to the processing of the data but does not take any action to express a preference, the app developer has by default pre-selected the least privacy-invasive compliant choice.118 The GDPR prohibits the processing of special categories of personal data, including data concerning health, unless one of the exceptions under Article 9 applies.119 In this respect, with regard to apps, one of the most important subsections is the exception under Article 9(2)(a), which deals with the need for the data subjects’ explicit consent to the processing of their personal data. 
App developers must ensure that the consent is for one or more specified purposes and that the personal data collected are for specified, explicit, and legitimate purposes.120 Furthermore, the request for consent must be concise, transparent, intelligible, and in an easily accessible form, using clear and plain language.121 Individuals’ consent must be freely given and specific,122 and they have the right to withdraw consent at any time.123 Regarding health data as an especially sensitive category of personal data, the legal requirements are more stringent,124 with the processing of health data according to GDPR Article 9(2)(h) being permissible only in the medical context, for example, where it is necessary for the purposes of preventive medicine, medical diagnosis, the provision of healthcare, or the treatment or management of health.125 The subsection must be read together with Article 9(3), which requires that the data be processed by or under the responsibility of a professional subject to the obligation of professional secrecy. When this is not the case and the health data are either not processed by a professional or not under the responsibility of a professional bound by professional secrecy, the data subject’s consent is needed for the data processing not to be prohibited. A problem that may arise for app developers is knowing beforehand whether the health data will only be processed by a professional or under the responsibility of a professional subject to secrecy. Where health data are stored on a consumer’s smartphone or on a server not under the healthcare professional’s responsibility, a very likely scenario for many apps, including those that are medical devices, the consumer’s consent will still be required. 
Where the data subject’s consent is therefore needed, the app developer will also need to consider the articles of the GDPR relating to consent, including the principles relating to the processing of personal data (Article 5), the conditions under which the processing is lawful (Article 6), and the conditions for consent (Article 7). Furthermore, Articles 12 and 13 apply regarding the type of information which needs to be provided by the controller to the data subject (Article 13) and the quality of the information to be provided to the data subject about his or her rights (Article 12). Consent to the processing of their personal data given by data subjects does not necessarily make the processing lawful when they are unaware of what happens to their personal data.126 Although in his empirical research on health and general apps Mulder found that consent was usually offered in an intelligible and easily accessible form using clear and plain language, and that the request for consent was presented in a manner clearly distinguishable from other matters in accordance with Article 7, this may in practice not always be the case.127 Even if the data are data concerning health and the explicit consent of the data subject is not needed, the provisions of Articles 12 and 13 still apply. However, according to Mulder’s research, most apps examined did not comply with these articles.128 Thus, the apps had privacy policies which were too long, on average over 3,500 words, with only one app being compliant with the requirement that the controller provide information in a concise and transparent form. 
Particularly noticeable in this respect was the frequency of phrases such as ‘may share data’ in these policies.129 Furthermore, none of the privacy policies complied with all the requisite conditions of Article 13 in conjunction with Article 5 regarding purpose limitation of the processing of private data, as the clarity of purpose expressed was often very vague.130 The research concludes that although data subjects tend to be informed of their rights by app developers, they are not able to find out where their data are processed nor the exact purposes of the processing, thus making the processing of the data unlawful in most apps.131 Infringement of the GDPR by app developers, whether concerning health or other types of personal data, is therefore even more likely than infringement of the medical device legislation. Despite this legal and regulatory framework, mHealth apps, whether processing health data or other personal data, are clearly not as safe132 or as secure133 as would be desirable. The recent safety issues with the apps of Babylon Health134 and the potential security issues regarding the patient data collected with the Streams app after the sale of DeepMind to Google135 are evidence of this. One of the greatest concerns in this field is the question of public trust in this innovative technology, which will be discussed next. VI. HEALTH APPS AND THE QUESTION OF TRUST Support for self-management has been seen as a means of improving healthcare while limiting health service costs.136 Contrary to expectation, the introduction of mobile apps has so far not met with complete success, despite mobile health combining the decentralisation of healthcare with patient-centredness.137 This can possibly be explained partly by resistance to new technologies, but it may also be a question of trust in this new technology. 
Trust is crucial in situations where there is uncertainty or where undesirable outcomes are possible.138 Such uncertainty might be due to potential safety or security issues with health apps. In any case, it is not surprising that the stated purpose of much of the regulation around the safety and security of health apps is declared to be the fostering of patient trust.139 In healthcare, trust is imperative. There are numerous definitions of trust across different domains, ranging from institutional and interpersonal trust to trust in technology.140 As Söderström and others state, patients must trust clinicians and their ability to make correct judgments about diagnosis and treatment. However, in the healthcare setting, trust must also include the patients’ trust in the organisation as well as trust in devices or objects.141 Therefore, trust need not be located in interpersonal relations, which would be difficult regarding online technologies.142 Van den Berg and Keymolen speak of trust being closely connected to having positive expectations about the actions of others.143 It can be defined as reliance upon another, and in the domain of technology, it can also include situations where one relies on an object in situations where one is, or has to make oneself, vulnerable.144 According to Luhmann, trust can be defined as having confidence in somebody or something, or Vertrauen, the German term employed in his writing.145 Specifically, Luhmann views trust or Vertrauen as a blending of knowledge and ignorance; it is acting as if the future is certain. 
As we are aware of the unpredictability of the future, we need trust in order to set aside some of these uncertainties and to act as if we know for certain what the future will bring.146 Therefore, one of the functions of trust, whether reliance on another person or on an object, is to neutralise uncertainty; it is a strategy to deal with the complexities inherent in life.147 According to McKnight, trust in information technology has interesting implications: first, it should influence the use or adoption of a technology (trusting intentions), and secondly, it is a general assessment of the technology in terms, for example, of its relative advantages or usefulness, which may then influence beliefs and attitudes that affect one’s intention to continue to use a technology (trusting behaviour).148 Hence, regarding health app technology, trust can be regarded as essential for patients to be willing to adopt it despite likely uncertainties regarding its safety and security. Without an intention to trust, the uptake of technological innovation becomes difficult or even impossible, as the patient has to believe that the technology will perform as envisaged. Likewise, unless patients have a willingness to exhibit trust by continuing to rely on health app technology after assessing its benefits and usefulness, technological innovation will not be successful. To summarise, trust is needed under uncertainty, and uncertainty is a given with any innovation.149 Contrariwise, innovation suffers under distrust,150 whether this relates to trusting intention or trusting behaviour. However, there are also limits to trust. According to van den Berg and Keymolen, this is because trust is only one effective strategy to cope with uncertainties; it is not the sole strategy for reducing complexity.151 There are other factors and strategies which aid in the reduction of complexity and uncertainty. 
Laws and regulations, potentially anathema to the success of disruptive technologies, do play a part in enhancing certainty and predictability as regards both the security and the safety of technology. For the adoption and continued use of innovative technology in the healthcare sector, safety and security laws and regulations are key factors providing a safeguard for patients and enabling patient trust.152 In this light, laws and regulations could then be seen as building confidence and providing a reason for the suspension of the uncertainties characteristic of trust.153 The tension caused by the uncertainties around disruptive innovation may be reduced with the aforesaid strategies. However, they may also hinder the successful introduction of these innovative software products, with their potentially destabilising effect on the healthcare architecture and infrastructure, while at the same time making patients more responsible for their own healthcare. The question of public trust is a major consideration in the regulation, accreditation, and kitemarking of apps by the NHS.154 Although a Health Apps Library was launched in 2013 to pilot and recommend apps, it was withdrawn in 2015 after security flaws were discovered: data were being leaked because they were sent without encryption.155 A White Paper published in 2018 confirmed the critical need to build and maintain public trust: ‘We need to maintain a safe and secure data infrastructure that protects … patients and the public … It is critical that we maintain public trust in how we hold, share and use data.’156 A new apps library has been established which corresponds with these principles. It claims to help users to find trusted health and well-being apps that have been assessed as clinically safe and secure to use, helping to ensure that they are better able to take an active role in managing their own mental and physical health. 
Most importantly, not only medical device apps but also lifestyle apps need to meet standards higher than those required by the current legal framework, including evidence of clinical safety, security, and technical stability.157 This means, for example, that no app will be endorsed that can cause harm, for instance, by miscalculating a drug dose or giving incorrect medical advice. Apps must also be grounded in the best and most up-to-date knowledge, derived from research, clinical experience, and patient preferences.158 Likewise, for an mHealth app to be accepted into the Library, the capture and handling of personal data have to be carried out legally and securely, and end-users have to be able to understand what the app will do with any data they provide. End-users also have to be able to give ‘informed consent’ to the use of their personal data. Specific security standards to be met include a number of checks showing that the processes and architecture of the app are secure. This applies to the collection, transmission, and storage of user data.159 Compliance by app developers with these higher regulatory mechanisms and standards is likely to increase public trust in the apps, but to the detriment of the technology’s disruptive potential. It should therefore not be surprising that, as a consequence, fewer than 100 mHealth and lifestyle apps have so far been approved. The NHS Digital Programme Director for apps and wearables, Hazel Jones, admitted that the assessment process for apps was lengthy, as app developers often needed to remediate their apps to achieve the desired standards.160 VII. CONCLUSION It has been shown that patient-facing mHealth app technology has the potential to lead to a redesign of the existing healthcare infrastructure of the NHS, leading to considerable cost savings by making the system more efficient and sustainable. 
This is because health apps result in more person-focussed healthcare by supporting patients to become more involved in their self-care, requiring fewer visits to GP surgeries or hospitals and reducing the need for more intensive medical intervention. By encouraging individuals to take greater responsibility for their own health, they put greater emphasis on prevention rather than cure, potentially reducing healthcare costs in the long term. As has been argued, health app technology could act as a disruptive innovation supporting destabilisation of the existing healthcare organisation through a changed role for healthcare professionals. Patients will increasingly access care remotely or online, so displacing the traditional face-to-face doctor–patient relationship at the core of current medical practice. This account coincides with the broader narrative of NHS policy-makers, which focusses on personalised healthcare, giving patients more choice and autonomy over their treatment while at the same time obliging them to take responsibility for that choice and its outcomes. This policy narrative, as has been demonstrated, can also be explained as a strategy with a specific political objective: to reduce costs in the NHS. Changing the role of the patient from a rather passive to a more participative one will aid this endeavour. In effect, this narrative may be used by policy-makers as a strategy of destabilisation to disrupt the entrenched institutional architecture of the NHS and encourage reform. For this innovative healthcare technology to be successful in transforming the healthcare system and reducing overall costs, patients need to be able to trust in the safety and security of the new technology. As has been shown, the current legal and regulatory framework for mHealth apps is burdened with considerable problems not conducive to building and maintaining patients’ trust. 
The majority of apps currently on the market are designated as lifestyle apps rather than medical devices, so that their safety profile is not covered by any legal regulations. In addition, it is likely that apps which ought to be classified as medical devices rather than lifestyle or general-purpose apps will escape the stricter definition and therefore the need to comply with the rules for CE marking, the involvement of notified bodies, and clinical safety evidence. This is because, for classification as a medical device, the legal regulations emphasise the existence of an objective medical purpose. Even more demanding for the classification is the requirement of subjective intent by the app developer, namely that the app be intended specifically for diagnostic and/or therapeutic purposes. With respect to app security, the GDPR together with the Code, as will have become apparent, contains some grey areas with ensuing legal uncertainty. All apps, both lifestyle and medical device apps, collect large quantities of end-user information, but the GDPR displays considerable complexity as to what constitutes lifestyle data and what constitutes data concerning health, as well as to the processing of such data. As patient trust is imperative in healthcare, any new mHealth technology with a safety and security regime burdened by legal uncertainty will be hindered in its adoption and/or continued use. As has been demonstrated, one of the functions of trust is to help neutralise the uncertainty inherent in any innovation. Other factors that can reduce complexity and uncertainty are laws and regulations, as they play a part in enhancing the predictability of the new technology. In turn, they enable patient confidence or trust. It is this question of patient trust which has propelled the standards set for the NHS Apps Library, which are higher than those required by the current legal and regulatory framework. 
This higher standard extends to lifestyle apps, which are also subject to an assessment of clinical safety. Furthermore, apps will only be accepted into the Library if the capture and handling of personal data are carried out legally and securely, with security standards verified through a number of checks regarding the collection, transmission, and storage of user data. However, forcing trust by applying these regulatory mechanisms may carry the risk of reducing the technology’s possible disruptive effect. Thus, despite its avowed potential to disrupt NHS healthcare architecture and to aid fiscal prudence, the different regulatory mechanisms aimed at supporting patient trust may in the end impede the success of this technology. Clearly, a certain balance ought to be struck between enabling patient trust and encouraging innovation. Whether a more unregulated space is desirable for innovative technology is, however, also a question of one’s world view. Acknowledgement The author would like to thank the anonymous reviewers for their thoughtful comments and efforts towards improving this manuscript. Footnotes 1 The term mHealth app or health app is used in this article to refer to all health-related apps in the widest sense, including well-being and fitness apps and apps classified as medical devices. The term medical app or medical device app refers only to apps classified as medical devices. 2 S Castle-Clarke and C Imison, The Digital Patient: Transforming Primary Care? (Nuffield Trust 2016). 3 T Mulder, ‘Health Apps, Their Privacy Policies and the GDPR’ (2019) 10 EJLT 1, 2. 4 For example, M Szczepański, European App Economy: State of Play, Challenges and EU Policy (European Parliament 2018) last accessed 12 December 2019. 
5 An example of a physician-targeted app is the Streams app, see Linklaters LLP, Audit of the Acute Kidney Injury Detection System Known as Streams, The Royal Free London NHS Foundation Trust (Linklaters LLP 2018) reporting on DeepMind’s Streams apps for the detection of acute kidney injury last accessed 20 September 2018; see also DH McKnight and others, ‘Trust in a Specific Technology: An Investigation of Its Components and Measures’ (2011) 12 TMIS 1. 6 See, for example, CL Ventola, ‘Mobile Devices and Apps for Healthcare Professionals: Uses and Benefits’ (2014) 39 P&T 356. 7 See, for example, European Commission, Green Paper on Mobile Health (‘mHealth’) (European Commission 2014) last accessed 25 January 2020. 8 CM Christensen, R Bohmer and J Kenagy, ‘Will Disruptive Innovations Cure Healthcare?’ (2000) 78 Harv Bus Rev 102, 104 defining disruptive innovation as an innovation which is cheaper, simpler, more convenient, and aimed at the lower end of the ‘market’; see also CM Christensen, JH Grossman and J Hwang, The Innovator’s Prescription: A Disruptive Solution for Health Care (McGraw-Hill 2009). 9 European Commission (n 7) 5; see also K Veitch, ‘The Government of Health Care and the Politics of Patient Empowerment: New Labour and the NHS Reform Agenda in England’ (2010) 32 Law & Policy 313. 10 K Garrety, I McLoughlin and G Zelle, ‘Disruptive Innovation in Health Care: Business Models, Moral Orders and Electronic Records’ (2014) 13 Soc Pol Soc 579, 579; see also Department of Health and Social Care, The Government’s Mandate to NHS England for 2018-19 (The Stationery Office 2018) 10 stating that ‘The escalating demands of ill health driven by our lifestyles also threaten the long-term sustainability of the NHS.’ 11 A Majeed, ‘Shortage of General Practitioners in the NHS’ (2017) 358 BMJ 3191; Department of Health, General Practice Forward View (The Stationery Office 2016) last accessed 22 October 2019. 
12 TS Bodenheimer and MD Smith, ‘Primary Care: Proposed Solutions to the Physician Shortage Without Training More Physicians’ (2013) 32 Health Aff 1881, 1881; Garrety, McLoughlin and Zelle (n 10) 580. 13 See the NHS Apps Library for examples of patient-facing apps potentially leading to a reduction in the need for visits to primary care specialists, eg the ‘engage warfarin self-care’ app, which enables the patient to self-test safely at home, reducing the need for regular clinic visits by the patient and making the results available to the care team immediately; the ‘one touch reveal’ app, which together with a smartphone provides the patient with notifications about high or low blood sugar patterns to help avoid them in the future and which allows sharing of data with the physician; the ‘my mhealth: myAsthma’ app, which helps the patient to better manage her condition by perfecting her inhaler technique, tracking the medication, and understanding the treatment; the ‘my mhealth: myCOPD’ app, which helps patients with chronic obstructive pulmonary disease to better manage their condition, so reducing exacerbations; the ‘my mhealth: myHeart’ app, which provides the patient with self-help tools to record her symptoms and medication and also allows remote monitoring and tracking to reduce the need for elective treatment if the condition is stable; and the ‘pathway through pain’ app, which helps the patient to take a new approach to pain recovery activities and uses pacing techniques to avoid pain flare-ups, last accessed 2 April 2020. 14 Garrety, McLoughlin and Zelle (n 10) 580; Christensen, Bohmer and Kenagy (n 8) 105. 15 Christensen, Bohmer and Kenagy, ibid. 16 CM Christensen, The Innovator's Dilemma: When New Technologies Cause Great Firms to Fail (Harvard Business School Press 1997) xv. 17 Christensen, Bohmer and Kenagy (n 8) 105. 18 ibid. 
19 NP Terry, ‘Information Technology’s Failure to Disrupt Healthcare’ (2013) 13 Nev LJ 722, 753 suggesting that smartphones and their mobile apps take aim at existing products, offering convenient and decentralised care. 20 WHO, mHealth: New Horizons for Health Through Mobile Technologies: Second Global Survey on eHealth (WHO Global Observatory for eHealth 2011). 21 JL Baldwin and others, ‘Patient Portals and Health Apps: Pitfalls, Promises, and What One Might Learn from the Other’ (2017) 5 Healthcare 81, 82. 22 NP Terry, ‘Mobile Health: Assessing the Barriers’ (2015) 147 Chest 1429; see also European Commission (n 7). 23 Terry, ibid, 1432. 24 See Babylon Health, ‘Online Doctor Consultation and Advice’ (2019) last accessed 15 May 2019; see also the mApp ‘NHS GP at Hand Doctor Service’ (2019) last accessed 15 November 2019; see also n 13. 25 Terry (n 19) 757. 26 ibid 752. 27 European Commission (n 7) 7. 28 ibid. 29 ibid 19 referring to initiatives such as ‘Startup Europe’ or the European Innovation Partnership on Active and Healthy Ageing. 30 Terry (n 19) 755. 31 ibid. 32 Regulation (EU) 2017/745 of the European Parliament and of the Council of 5 April 2017 on medical devices, amending Directive 2001/83/EC, Regulation (EC) No 178/2002 and Regulation (EC) No 1223/2009 and repealing Council Directives 90/385/EEC and 93/42/EEC (henceforth the MDR) (OJ L117/1). 33 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) (henceforth the GDPR) (OJ L119/1). 34 LH Curtis and KA Schulman, ‘Overregulation of Health Care: Musing on Disruptive Innovation Theory’ (2006) 69 L & Contemp Prob 195, 206 arguing that the substantial paperwork required to comply with the regulations alone leads to an increase in costs. 35 ibid 196. 
36 Nuffield Council on Bioethics, Medical Profiling and Online Medicine: The Ethics of ‘Personalised Healthcare’ in a Consumer Age (Nuffield Press 2010) ch 2. 37 Castle-Clarke and Imison (n 2) 5; F Mirza, R Stockdale and T Norris, ‘Mobile Technologies and the Holistic Management of Chronic Diseases’ (2009) 14 Health Inform J 309, 310. 38 See examples in NHS Apps Library (n 13). 39 T Mladenov, J Owens and A Cribb, ‘Personalisation in Disability Services and Healthcare: A Critical Comparative Analysis’ (2015) 35 Crit Soc Policy 307, 311. 40 ibid 308. 41 Department of Health, High Quality Care for All: NHS Next Stage Review Final Report (The Stationery Office 2008). 42 Department of Health, Liberating the NHS: Greater Choice and Control, a Consultation on Proposals (The Stationery Office 2010). 43 Department of Health (n 41) 26. 44 Department of Health (n 42) 17. 45 European Commission (n 7) 5. 46 ibid. 47 MM Funnell and RM Anderson, ‘Empowerment and Self-Management of Diabetes’ (2004) 22 Clin Diabetes 123, 124. 48 Nuffield Council on Bioethics (n 36) 39. 49 Department of Health (n 42) 23. 50 ibid 4. 51 Department of Health (n 41) 26. 52 Department of Health, NHS Constitution for England (The Stationery Office 2015) 11. 53 Mladenov, Owens and Cribb (n 39) 309. 54 Veitch (n 9) 320. 55 European Commission (n 7) 5. 56 J Clarke, J Newman and L Westmarland, ‘The Antagonisms of Choice: New Labour and the Reform of Public Services’ (2008) 7 Soc Pol Soc 245, 249; W Schelkle, J Costa-i-Font and C Wijnbergen, ‘Consumer Choice, Welfare Reform and Participation in Europe. A Framework for Analysis’ (2010) RECON, Online Working Papers Series 26, 1. 57 Veitch (n 9) 321; Mladenov, Owens and Cribb (n 39) 320. 58 S Coughlin and others, ‘Looking to Tomorrow’s Healthcare Today: A Participatory Health Perspective’ (2018) 48 Internal Medicine Journal 92, 92. 59 European Commission (n 7) 5. 60 WHO (n 20) 1. 61 ibid 75. 
62 Directive 2001/95/EC of the European Parliament and of the Council of 3 December 2001 on general product safety (henceforth the General Product Safety Directive or GPSD) (OJ L11, 15.1.2002, pp. 4–17). 63 ibid, art 3(1). 64 ibid, art 3(2). 65 ibid, art 2(b). 66 The author is grateful to the reviewers for their comments. It is pointed out that an app embedded in a device, or a device-driven app, can be a product such as a glucometer. Similarly, an app can be a service or be service-driven, such as apps aimed at healthcare professionals for patient coaching or for monitoring a patient’s chronic condition. These types of apps fall under different legal regimes and are not discussed in the remainder of this article. 67 European Commission, Commission Staff Working Document on the Existing EU Legal Framework Applicable to Lifestyle and Wellbeing Apps (European Commission 2014) 3 last accessed 30 October 2019. 68 ibid. 69 GPSD (n 62) Recitals 4, 11 and 26 and art 9. 70 Council Directive 93/42/EEC of 14 June 1993 concerning medical devices (OJ L169, 12.7.1993, pp. 1–43). 71 Directive 2007/47/EC of the European Parliament and of the Council amending Council Directive 90/385/EEC on the approximation of the laws of the Member States relating to active implantable medical devices, Council Directive 93/42/EEC concerning medical devices and Directive 98/8/EC concerning the placing of biocidal products on the market, art 1(2)(a) (henceforth the MDD) (OJ L247/21). 72 MDR (n 32). 73 ibid, Recital 6 states that for historical reasons, active implantable medical devices, covered by Directive 90/385/EEC, and other medical devices, covered by Directive 93/42/EEC, were regulated in two separate legal instruments. In the interest of simplification, both directives, which have been amended several times, should be replaced by a single legislative act applicable to all medical devices other than in vitro diagnostic medical devices. 74 MDD (n 71) art 2(1)(a). 75 MDR (n 32) art 2(1). 
76 Note that the MDR expands the definition to include prediction and prognosis, which may bring certain apps, able to predict the likelihood of people incurring certain illnesses or their prognosis, within the definition of a medical device. 77 European Commission, Guidelines on the Qualification and Classification of Standalone Software Used in Healthcare Within the Regulatory Framework of Medical Devices, MEDDEV 2.1/6 January 2012 (European Commission 2012) (henceforth MEDDEV 2.1/6 or MEDDEV). 78 MDD (n 71) Recital 6; the author is grateful to the reviewers for pointing out that in the age of AI, it is questionable whether this distinction can still be made where fundamentally the same algorithm can have multiple uses depending on the ‘layer’ that is added on top of it, or even depending on the use made by the user, where the algorithmic platform is ready-made for multiple uses. 79 MHRA, Guidance: Medical Device Stand-alone Software Including Apps (Including IVDMDs) (The Stationery Office 2018) last accessed 30 October 2019. 80 But see the judgment in Case C-329/16 Syndicat national de l'industrie des technologies médicales (Snitem) and Philips France v Premier ministre and Ministre des Affaires sociales et de la Santé (2017) EU:C:2017:947 (henceforth Snitem). 81 European Commission (n 77) 10. 82 ibid 11 and 12. 83 European Commission, Manual on Borderline and Classification in the Community Regulatory Framework for Medical Devices, Version 1.22 (EU Commission 2019). 84 MDD (n 71) Recital ; MDR (n 32) Recital 8. 85 A Ferretti, E Vayena and E Ronchi, ‘From Principles to Practice: Benchmarking Government Guidance on Health Apps’ (2019) 1 The Lancet e55. 86 A van Drongelen and others, Apps under the Medical Devices Legislation (National Institute for Public Health and the Environment 2019) 23. 87 MDR (n 32) art 2(1)(12). 88 MDD (n 71) art 1(1)(ii)(f). 89 Case C-219/11 Brain Products GmbH v BioSemi VOF and Others (2012) EU:C:2012:742, para 34, henceforth Brain Products. 
90 ibid para 33. 91 ibid para 30. 92 ibid, opinion of AG Mengozzi, para 63. 93 MDD (n 71) art 2, Recital 6. 94 Brain Products (n 89) opinion of AG Mengozzi, para 63. 95 Snitem (n 80). 96 MDD (n 71) art 1(2)(a). 97 Snitem (n 80) para 35. 98 European Commission (n 77). 99 Snitem (n 80) para 33. 100 ibid. 101 Council Directive 93/42/EEC (n 70) art 11 and Annex II. 102 ibid, art 9. 103 ibid, Recital. 104 ibid, art 9 and Annex IX. 105 MDR (n 32) Annex VIII. 106 ibid, Annex VIII, ch III, r 10. 107 ibid, Annex VIII, ch III, r 11. 108 European Commission (n 7) 7. 109 GDPR (n 33). 110 European Commission, Privacy Code of Conduct on Mobile Health Apps (EU Commission 2017) last accessed 10 January 2020. 111 GDPR (n 33) art 4(15) and Recital 35. 112 European Commission (n 110) 2. 113 GDPR (n 33) art 4(7) determining the purposes and means of the processing of personal data. 114 ibid, art 4(8) processing the personal data on behalf of the controller. 115 ibid, art 5. 116 ibid, art 25; European Commission (n 110) 8 and 9. 117 GDPR (n 33) art 25(1) which defines ‘by design’ to mean appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of the GDPR and protect the rights of data subjects. 118 ibid, art 25(2) which defines ‘by default’ to include appropriate technical and organisational measures for ensuring that only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons. 119 ibid, art 9(1). 
120 ibid, art 5(1)(b). 121 ibid, art 12. 122 ibid, Recital 32. 123 ibid, art 7(3). 124 European Commission (n 110) 3; see also Mulder (n 3) 3. 125 GDPR (n 33) art 9(2)(h). 126 Mulder (n 3) 5. 127 ibid 7. 128 ibid 8. 129 ibid 10. 130 ibid. 131 ibid 14. 132 H Fraser, E Coiera and D Wong, ‘Safety of Patient-facing Digital Symptom Checkers’ (2018) 392 The Lancet 2263; S Akbar, E Coiera and F Magrabi, ‘Safety Concerns with Consumer-facing Mobile Health Applications and Their Consequences: A Scoping Review’ (2019) 27 JAMIA 330. 133 B Martínez-Pérez, I de la Torre-Díez and M López-Coronado, ‘Privacy and Security in Mobile Health Apps: A Review and Recommendations’ (2015) 39 J Med Syst 181. 134 G Iacobucci, ‘Babylon App Will Be Properly Regulated to Ensure Safety, Government Insists’ (2018) 362 BMJ k3215. 135 Madhumita Murgia, ‘Inside DeepMind as the Lines with Google Blur’ Financial Times (London, 19 November 2018) last accessed 15 September 2019. 136 VA Entwistle, A Cribb and J Owens, ‘Why Health and Social Care Support for People with Long-term Conditions Should Be Oriented Towards Enabling Them to Live’ (2018) 26 Health Care Anal 48. 137 Terry (n 22) 1433. 138 McKnight and others (n 5) 2. 139 European Commission (n 110) 1; Department of Health and Social Care, Personalised Health and Care 2020 (The Stationery Office 2020) 11, which speaks of patients’ trust in the security of their data and of building and sustaining that trust, last accessed 11 January 2020; M Honeyman, P Dunn and K McKenna, A Digital NHS? An Introduction to the Digital Agenda and Plans for Implementation (King’s Fund 2016). 140 B van den Berg and E Keymolen, ‘Regulating Security on the Internet: Control Versus Trust’ (2017) 31 Int Rev of Law, Computers & Technology 188, 194. 141 E Söderström, N Eriksson and R-M Åhlfeldt, ‘Managing Healthcare Information: Analyzing Trust’ (2016) 29 IJHCQA 786, 789. 142 Van den Berg and Keymolen (n 140) 194. 143 ibid. 144 McKnight and others (n 138) 3. 
145 N Luhmann, Trust and Power: Two Works by Niklas Luhmann (Howard Davies tr, John Wiley & Sons 1979). 146 Van den Berg and Keymolen (n 140) 193. 147 ibid 195, citing Luhmann (n 145) 25. 148 DH McKnight, ‘Trust in Information Technology’ in GB Davis (ed), The Blackwell Encyclopedia of Management, Vol 7 Management Information Systems (Blackwell 2005) 330. 149 B Nooteboom, ‘Trust and Innovation’ in R Bachmann and A Zaheer (eds), Handbook of Advances in Trust Research (Edward Elgar 2013) 106. 150 ibid 120. 151 Van den Berg and Keymolen (n 140) 195. 152 ibid. 153 ibid, citing G Möllering, Trust: Reason, Routine, Reflexivity (Elsevier 2006) 111. 154 Honeyman, Dunn and McKenna (n 139) 16. 155 K Huckvale and others, ‘Unaddressed Privacy Risks in Accredited Health and Wellness Apps: A Cross-sectional Systematic Assessment’ (2015) 13(1) BMC Medicine 214. 156 Department of Health and Social Care, The Future of Healthcare: Our Vision for Digital, Data and Technology in Health and Care (The Stationery Office 2018) last accessed 20 January 2020. 157 NHS Digital, NHS Apps Library (2019) last accessed 20 January 2020. 158 Public Health England, Criteria for Health App Assessment (2017) last accessed 20 January 2020. 159 ibid. 160 L Postelnicu, ‘More Than 100 Digital Health and Care Tools Evaluated Against Key Standards for NHS Apps Library’ (Mobihealthnews, 2018) last accessed 20 January 2020. © The Author(s) 2020. Published by Oxford University Press. All rights reserved. 
TI - mHealth Apps: Disruptive Innovation, Regulation, and Trust—A Need for Balance JF - Medical Law Review DO - 10.1093/medlaw/fwaa019 DA - 2020-08-01 UR - https://www.deepdyve.com/lp/oxford-university-press/mhealth-apps-disruptive-innovation-regulation-and-trust-a-need-for-B6wf6Z3Ycx SP - 549 EP - 572 VL - 28 IS - 3 DP - DeepDyve ER -