Considering the relationship between the civil law treatment of data and data protection law in Germany

A data-driven economy, in the way it is currently developing and politically desired, needs legal certainty. It is therefore important to consider the relationship between the civil law treatment of data and data protection law. This article seeks to provide some answers.

I. Introduction

In response to the rising importance of data for the economy, the possibility of a specific ‘data right’ has been discussed with increasing frequency. However, the debate suffers from conflicting interests and intentions. While the EU Commission proposes to introduce a ‘data producer’s right’1 to create clear rules for data markets, consumers’ representatives primarily want to enable data subjects to benefit from the profits generated by the use of their personal data. They therefore seek to establish a data right that goes beyond the merely defensive character of data protection law.2 Secondly, the debate lacks a clear application scenario for ‘data rights’. Quite the opposite: very heterogeneous scenarios are supposed to benefit from ‘data rights’ and, consequently, various interests are supposed to be satisfied. To date, these application scenarios have not been thoroughly examined and analysed. The question that arises is whether all of them can be addressed by a single uniform solution (a uniform data right with a uniform scope, for example) or whether sector-specific solutions would be preferable. On a more abstract level, the arguments for and against ‘data rights’ have already been exchanged comprehensively,3 just as the potential lack of economic4 and constitutional5 legitimacy of such data rights has already been discussed. Without examining these issues in detail, it is worth noting that a ‘data right’ need not be an overarching, property-like right. It is just as possible to think of access rights to data6 or to reduce the scope of a data right to specific sectors.7 Already today, data can be made the subject of contracts;8 it can even be treated as consideration.9 Thus, if a ‘data right’ is not warranted for economic or constitutional reasons, an alternative would be to design standard contractual terms which balance the parties’ interests adequately. But regardless of how one treats data legally, the way in which data protection law influences this treatment must be considered. Hence, this article examines the relationship between the civil law treatment of data and data protection law. Since data protection law distinguishes between different purposes of data processing, the examination begins with a phenomenological assessment of potential application scenarios for the civil law treatment of data (II.) and then explains the impact of data protection law on the design of possible ‘data rights’ and on the contractual treatment of data (III.).

II. Application scenarios

The scenarios in which ‘data rights’ or data contracts could be relevant can be divided into different categories: the first is commercial interest in data handling; the second concerns data handling in the interest of security, academia or archiving.

For instance, with regard to (partly) automated cars, both the producer of the car and the developer of the integrated software are interested in legal positions concerning the data collected in the car, for example on the driving behaviour of the car owner, because such data can be passed on to insurance providers against payment. As another example, health insurers and producers are interested in data generated by so-called smart devices (such as fitness trackers or similar applications) in order to develop customized rates based on the collected data and thereby minimize the insurer’s risk. Rating portals and social networks have an interest in extensive data rights relating to the data generated and collected through the use of their websites, since such data forms the essential value of those networks. Other potential application scenarios include data collection in smart homes and smart metering, covering heating behaviour, security behaviour (automated locking of doors, clock-timed lamps, automated control of electric blinds), use of water, and so on.

All such data collection is generally aimed at analysing the data through Big Data applications and thereby enabling the detailed creation of user profiles. The goal of creating these user profiles is commonly to provide individualized advertisement. The same is true of customer-card systems, with which shopping behaviour can be analysed, and of click-stream analysis on the internet. Apart from displaying individualized advertisements, the evaluation of data allows the behaviour of voters, insurance policy holders, employees and others to be predicted. Indeed, the picture of the data subject’s personality obtained through the evaluation of personal data has proven to be very precise. For instance, using the data of only 68 Facebook likes, analysts can determine a user’s skin colour (with 95% accuracy), sexual orientation (with 88% accuracy) and political persuasion (with 85% accuracy). Intelligence, religion, and consumption of alcohol, cigarettes and drugs can also be estimated on this basis.10 Using only 10 Facebook likes, analysts can assess the data subject’s character better than an average work colleague; with 70 likes, better than the data subject’s friends; and with 150 likes, better even than the data subject’s parents. With 300 likes, the data subject’s behaviour can be predicted more precisely than by the data subject’s own partner. With even more likes, the prediction can surpass what users believe to know about themselves.11 That is why even the government might have an interest in collecting data to predict and prevent crime: Mayer-Schönberger/Cukier described as early as 2013 how personal data can be used to calculate the probability of an individual committing a crime.12

Besides individualized advertisement and behavioural prediction, there are other, differently motivated interests. For instance, partly (and in future fully) automated cars rely on data that enables automated driving by using movement data of the car. Intelligent traffic-regulation technology based on car movement data could also assist in controlling traffic lights, making traffic flow more smoothly. In medical research, data are processed for scientific purposes. That the processing of such data can also be of commercial interest should not obscure the fact that, in those cases, data processing mainly serves scientific advancement, the development of product safety or other public interests. Such society-relevant interests may therefore have to be treated differently from purely commercial interests.

III. Relationship between data rights and data protection law

In view of these various constellations, which are, of course, non-exhaustive, the relationship between the civil law treatment of personal data and data protection law can be conceived in three different ways.

1. Data protection law as an instrument to assign data rights

First, data protection law can be used as an instrument to assign data rights. This, of course, requires data protection law to be developed beyond the initial understanding of it as merely an instrument of protection for the benefit of the data subject, an understanding that does not allow for a commercialization of data. Regardless of whether it is possible13 or desirable for data protection law to serve as such an instrument of assignment (recall the statement of the German Federal Constitutional Court on data protection law in its census decision: ‘The individual does not have a right in the sense of an absolute, illimitable control over their data; information, even if personal, reflects social reality, which cannot be assigned to one individual alone’), ‘data-subject rights’ can only operate for the benefit of data subjects. Hence, they cannot exist with regard to non-personal data. In many situations, though, distinguishing between personal and non-personal data proves problematic. That is because of the extensive definition of personal data,14 which renders a large amount of data personal. It is at least theoretically possible to remove data’s personal character by anonymizing it, or to process non-personal data from the outset, eg data on weather and ground conditions. But even such non-personal data can become personal data (again) when further data is added. Hence, the distinction between personal and non-personal data depends, in many cases, on the circumstances of the data processing, which can make data personal at one moment and non-personal at the next. It could therefore make sense not to distinguish between personal and non-personal data when shaping a ‘data right’ or when issuing rules for the contractual treatment of data.
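How easily supposedly non-personal data can revert to being personal may be illustrated by a minimal, purely hypothetical sketch (all names, fields and values are invented): a dataset stripped of direct identifiers is joined with auxiliary data on shared quasi-identifiers, and its records become attributable to individuals again.

```python
# Hypothetical illustration: records stripped of direct identifiers
# can become personal data again once auxiliary data is linked to them.

# 'Anonymized' smart-metering records: no names, but quasi-identifiers
# (postcode and birth year) remain in the data.
usage_records = [
    {"postcode": "53113", "birth_year": 1982, "heating_profile": "night-active"},
    {"postcode": "10115", "birth_year": 1990, "heating_profile": "irregular"},
]

# Auxiliary dataset from another source, eg a customer-card database.
customer_db = [
    {"name": "A. Mustermann", "postcode": "53113", "birth_year": 1982},
    {"name": "B. Beispiel", "postcode": "10115", "birth_year": 1990},
]

# A simple join on the quasi-identifiers re-personalizes the records.
for record in usage_records:
    for customer in customer_db:
        if (record["postcode"], record["birth_year"]) == (
                customer["postcode"], customer["birth_year"]):
            print(customer["name"], "->", record["heating_profile"])
```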
2. Data protection law as a participatory instrument

Secondly, one might think of data protection law as a participatory instrument which enables data subjects to participate in the profits generated by processing their personal data.15 Yet it is unclear how such participation in, and distribution of, profits could be calculated. It is nigh impossible to determine the portion of a data-processing company’s profit that is attributable to the processing of one person’s specific data. Even if the calculation succeeds, the distribution is difficult. A ‘collecting society for data’ (Verwertungsgesellschaft Daten) which helps distribute the profits earned by data processing does not appear to be a proper solution, since one would hardly know where, and which, data had been given out. Often enough, data collection happens with only rudimentary knowledge on the part of the data subject (as with cookies, data processing by social networks, or customer-card systems). In any case, knowledge of the giving away of data would be essential for participating in the data-processing profit.

Apart from that, it is open to dispute whether data protection law does not in fact stand in the way of the data subjects’ participation, since its function is deemed to be to discourage the data subject from giving away personal data and to prevent the data subject from losing control over ‘their’ personal data. The state is obligated to protect its citizens from becoming mere objects of data processing. It seems to be a myth that earning money by giving away personal data leads to a more conscious treatment of data by the data subject. Rather, a conscious handling of personal data can only be achieved when data subjects are made aware of which data processing arrangement they are agreeing to and which rights they can exercise to limit the use of their personal data. Visualizing this information seems at least worth a try. Nonetheless, the draft directive on digital content16 in particular shows a strong trend towards the commercialization of personal data which, should the directive be adopted, will have to be recognized by the national legislators of the EU Member States. Assuming (which would surely be difficult to argue) that the draft directive is compatible with Art. 8 ECHR, it would become possible, at least in ‘synallagmatic’ contracts where data are given as consideration, to implement legal rules guaranteeing adequate remuneration of the data subject. A right similar to § 32 UrhG is at least theoretically conceivable, but it seems impracticable in the absence of a satisfactory method of calculating the commercial worth of data. Unfortunately, economic research on the calculation of data value is only beginning.

3. Data protection law as an instrument to limit a data right

When data rights are assigned to a person other than the data subject, data protection law limits these rights in the same way that the right to one’s own image (§§ 22, 23 KUG) limits copyright law: with regard to a photograph, the copyright owner cannot decide on the use of the photograph without the consent of the person portrayed. The same limitation applies to the contractual treatment of data: data cannot be processed on the basis of a contract without the consent of the data subject. Data processing is an act relevant to data protection law and subject to its prohibition principle, regardless of whether the data are to be processed on a contractual basis or on the basis of a specific ‘data right’.

a) Scope of the limitation by data protection law

Theoretically, then, the relationship between the civil law treatment of data and data protection law can be conceived in ways which make good sense. In practice, however, the problem is that the limitation imposed by data protection law can undermine potential data rights or contractual agreements concerning data to such an extent that the legal position which civil law is supposed to confer fades into insignificance. Data rights and data contracts are meant to capture and mirror a reality in which data are used ever more frequently and are becoming a dominant economic factor.17 In every application scenario mentioned under II., a range of data is processed, especially with the intention of analysing it using Big Data technology.18 The EU General Data Protection Regulation (GDPR),19 however, is not made for such massive processing of personal data. There may be circumstances in which permission is granted under Art. 6 GDPR, but in the majority of Big Data scenarios the data subject’s consent is required. It has to be an informed consent: for the person to be able to consent, all information about the data processing has to be provided. The more data is gathered and the broader the purposes of the data processing are, the more difficult it is to provide all the necessary information. Whether a state of being informed is even achievable in Big Data scenarios is highly questionable. The possibility for the data subject to withdraw consent at any time leads to heightened uncertainty for the owner of a ‘data right’ or for anyone obtaining data on a contractual basis. The same is true of the data protection principles, for example the purpose limitation principle, the transparency principle and the principle of data minimisation.
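What purpose-bound consent combined with free withdrawal means in operational terms can be pictured as a ledger check before every processing operation. The following sketch is purely illustrative (all identifiers and purposes are hypothetical); it shows why withdrawal leaves any downstream ‘data right’ or contractual position precarious:

```python
from datetime import datetime, timezone

# Hypothetical consent ledger: consent is recorded per data subject and
# per specified purpose, and may be withdrawn at any time.
consents: dict[tuple[str, str], datetime] = {}

def give_consent(subject_id: str, purpose: str) -> None:
    consents[(subject_id, purpose)] = datetime.now(timezone.utc)

def withdraw_consent(subject_id: str, purpose: str) -> None:
    consents.pop((subject_id, purpose), None)

def may_process(subject_id: str, purpose: str) -> bool:
    # Processing is only permitted for a purpose the data subject has
    # consented to and has not yet withdrawn.
    return (subject_id, purpose) in consents

give_consent("user-1", "direct advertisement")
assert may_process("user-1", "direct advertisement")
assert not may_process("user-1", "personality profiling")  # never consented
withdraw_consent("user-1", "direct advertisement")          # withdrawal...
assert not may_process("user-1", "direct advertisement")    # ...ends the basis
```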
b) Purpose limitation principle

Art. 8 ECHR already requires that personal data be used only for the purpose for which it was collected. The purpose limitation principle is the dominant principle of data protection law.20 It legitimizes the processing of data and requires, according to Art. 5(1) lit. b GDPR, that the purpose of data processing be specified, explicit and legitimate. The criterion of specification is to be understood in a formal sense, while the criterion of explicitness refers to the purpose in its material sense.21 A legitimate purpose is one that is not condemned by the legal order; ethical considerations and social customs also need to be taken into account.22 For example, racially motivated discrimination against a group of people is not a legitimate purpose.23 Nevertheless, the criterion of legitimacy remains only a coarse filter for eliminating clearly illegitimate purposes.24 The definition of the purpose of the data processing has to be completed before the data is processed; otherwise the processing is not in accordance with the law.25 Although the required precision of the purpose specification is determined on a case-by-case basis, a blanket or overly generalized purpose is not sufficient.26 Against this background, specifying the purpose, and therefore complying with the GDPR, is very difficult in Big Data analysis. That is because it is essential for Big Data processes to keep the purpose flexible,27 in order to be able to use various analytical algorithms which search for hitherto unknown connections between data and put them in relationship to one another.28

(1) Data processing for statistical purposes

The German Federal Constitutional Court (BVerfG) determined in its census decision that, for data processing for statistical purposes, a narrow and concrete specification of purpose cannot be required. It argued that it is an essential characteristic of statistics that data, after their initial processing, are to be reused for various other tasks whose purpose cannot be determined at the initial stage of processing.29 The same applies to Big Data analysis, which is why this argument could be transferred to Big Data scenarios: statistical analysis as such is not limited to a concrete content; rather, it is a procedure that can be applied to a variety of contents.30 The GDPR does not clearly state whether statistical purposes of data processing comprise only purposes in the public interest or whether commercial interests may also qualify. The wording of Art. 5(1) lit. b half-sentence 2 GDPR indicates that only archiving purposes have to serve the public interest, whereas statistical purposes can also lie in commercial interests. By contrast, the provision which preceded Art. 5 GDPR, namely Art. 6(1) lit. b sentence 2 of the Data Protection Directive (95/46/EC), indicates that statistical purposes cannot be of a commercial character.31 That is because the other purposes named in Art. 6(1) lit. b sentence 2 of the Directive are non-commercial (historical and scientific) purposes.32 In using the term ‘statistical purposes’, the German Federal Constitutional Court likewise intended to privilege only data processing in the public interest (there: the census), not commercial interests. Acknowledging these historical and teleological arguments, one can hardly say that the purpose of a Big Data analysis may be specified as broadly as is permissible for statistical purposes in the public interest.

(2) Data processing for advertisement purposes as a sufficient purpose?

Since a significant part of data processing in Big Data scenarios serves market and opinion research, or more specifically customer profiling and individualized advertising, the question arises whether ‘advertisement purposes’ or ‘marketing purposes’ satisfy the required specification of purpose. On the one hand, it could be argued that the purpose of ‘advertisement’ or ‘marketing’ covers a whole series of advertising activities (addressing customers via newspapers, e-mail, telephone marketing, or live advertisement on the internet, etc.), and that even product and services research would be included; the specification of purpose could thus be deemed too imprecise. On the other hand, it could be reasoned that the customer only cares about the ‘primary’ purpose, being addressed with advertisement, regardless of the form in which the advertising is done and the steps taken beforehand.33 This argument is supported by Recital 47 sentence 7 GDPR, which expressly recognizes purposes of ‘direct advertisement’ as legitimate within the balancing of interests, without requiring the purpose to be further specified. ‘Advertisement purposes’ as well as ‘marketing purposes’ therefore seem to be possible, though opinions differ on that issue.

(3) Data processing with the purpose of personality profiling

However, even if ‘advertisement purposes’ and ‘marketing purposes’ are deemed sufficiently specified, they will not cover activities which aim at creating personality profiles, because creating personality profiles is prohibited by law. The German Federal Constitutional Court made this point in its census decision as well as in its decision on the constitutionality of a representative statistic: a broad registration and cataloguing of the personality through the merging of individual life data and personal data to create personality profiles of citizens is prohibited even in the anonymity of a statistical survey.34 This follows from the inviolable guarantee of human dignity in Art. 1 of the German Constitution (GG), which would be infringed by the creation of personality profiles (even where they cover only parts of a person’s personality; how many parts must be covered has to be determined on a case-by-case basis). If that already applies to statistical surveys which lie in the public interest, and which are therefore privileged in certain respects, then it must apply all the more when personality profiles are created for commercial purposes, which are not privileged at all. This view is shared by the overwhelming majority of academic literature, at least where individual data is not merely gathered but is also evaluated with the aim of depicting the consumer’s personality.35 The Frankfurt Higher Regional Court (OLG Frankfurt) even considers data processing admissible where the data would allow the creation of a detailed personality profile.36 But this jurisprudence is contrary to the census decision, which expressly speaks of a breach of the human dignity principle where the creation of personality profiles is concerned. No distinction should be drawn between constellations in which data has merely been collected and those in which it has already been analysed, because once the data has been collected, a single mouse click suffices to start the analysis. Thus, the personality right is in concrete danger as soon as the data is collected. In a nutshell: where data is generated for the purpose of analysis by Big Data technology, conflicts with the purpose limitation principle will inevitably arise as long as ‘advertisement purposes’ or ‘marketing purposes’ are found not to be sufficient. The creation of personality profiles is no valid purpose and therefore cannot justify data processing.

(4) Derogating from the purpose limitation principle

Processing data for purposes other than the purpose for which the data were collected is possible if the data subject consents. It is thus possible to justify data processing even for purposes not yet defined at the moment of collection. Theoretically, this might be a solution for Big Data scenarios; practically, it is questionable whether seeking consent from the data subject is a workable procedure, given the sheer volume of data in Big Data scenarios.37 It is also possible to derogate from the purpose limitation principle on the basis of European Union legislation or national legislation of the Member States. This possibility is provided for in Art. 6(4) GDPR and is limited only by Art. 23 GDPR.38 In any event, the purpose limitation principle only requires that data not be processed for a purpose incompatible with the purpose for which the data were collected. Therefore, if the initial purpose and the purpose of further processing are compatible, neither the consent of the data subject nor specific legislation at EU or national level is required.39 The initial purpose of the processing is the benchmark for judging which further processing is admissible.40 In order to ascertain whether a purpose of further processing is compatible with the purpose for which the personal data were initially collected, the controller should, according to Recital 50 sentence 6 GDPR, take into account: any link between those purposes and the purposes of the intended further processing; the context in which the personal data have been collected, in particular the reasonable expectations of data subjects based on their relationship with the controller as to their further use; the nature of the personal data; the consequences of the intended further processing for data subjects; and the existence of appropriate safeguards in both the original and intended further processing operations.
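Rendered schematically, and only as an illustration, these factors amount to a multi-factor balancing rather than a mechanical check. In the following sketch the field names merely paraphrase Recital 50, and the aggregation rule is an invented simplification; the GDPR prescribes no threshold or weighting:

```python
from dataclasses import dataclass

# Illustrative paraphrase of the Recital 50 sentence 6 factors. Legally this
# is a holistic balancing; the boolean fields and the aggregation rule below
# are simplifications for illustration only.
@dataclass
class CompatibilityFactors:
    link_between_purposes: bool           # link between initial and new purpose
    within_reasonable_expectations: bool  # context in which data was collected
    sensitive_data_involved: bool         # nature of the personal data
    adverse_consequences: bool            # consequences for the data subject
    safeguards_in_place: bool             # eg encryption or pseudonymization

def leans_compatible(f: CompatibilityFactors) -> bool:
    # Crude aggregation standing in for the case-by-case weighing of all
    # circumstances that the law actually requires.
    favourable = [
        f.link_between_purposes,
        f.within_reasonable_expectations,
        f.safeguards_in_place,
        not f.sensitive_data_involved,
        not f.adverse_consequences,
    ]
    return sum(favourable) >= 4
```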
Hence, a balancing of interests is to be undertaken, taking into account all circumstances of the case.41 Some purposes of further processing, however, have a privileged status, such as statistical and scientific purposes (Art. 89 GDPR).42 Recital 157 GDPR expressly names scientific research as a possible purpose of further data processing.43 But purposes not named in Art. 89 GDPR and Recital 157 can also be compatible with the purpose for which the data were collected.44 Nevertheless, the further purpose, too, must not be illegitimate.45 To sum up, if data is gathered for legitimate purposes and the purpose of the further processing is sufficiently precise and legitimately chosen, Big Data analysis for such purposes can comply with data protection law.46 A general exception in favour of Big Data analysis, however, is unknown to the GDPR; every processing by way of Big Data analysis thus requires a balancing of interests on a case-by-case basis.47 A legal framework providing legal certainty in the field of Big Data analysis is therefore lacking.

c) Principle of data and storage minimization, principle of transparency

According to the principle of data minimization, personal data must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.48 Thus, the data minimization principle, too, limits data processing by reference to its purpose. Since that purpose is already difficult to determine for Big Data analysis, there is a high risk that Big Data analysis will not comply with the principle of data minimization. The same applies to the principle of storage minimization (Art. 5(1) lit. e GDPR), which prohibits any data storage beyond the purpose of the data processing. The transparency principle also seems difficult to comply with, for it requires that the data subject be properly informed about the purpose of the data processing.49 Finally, the data subject has the right not to be subject to a decision based solely on automated processing (for example by Big Data algorithms) which produces legal effects concerning them or similarly significantly affects them (Art. 22 GDPR). This prohibits, for example, an automated decision on a bank loan made using algorithms.50
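The distinction drawn by Art. 22 GDPR can be sketched as follows (the scoring logic is a hypothetical stand-in for any Big Data model): what matters is whether the algorithm’s output is itself the decision, or merely a recommendation reviewed by a human.

```python
from typing import Callable

def credit_score(applicant: dict) -> float:
    # Hypothetical placeholder for any algorithmic or Big Data model.
    return applicant["income"] / max(applicant["debt"], 1)

def solely_automated_decision(applicant: dict) -> bool:
    # The model output *is* the decision. Where such a decision produces
    # legal effects (eg refusing a loan), Art. 22 GDPR prohibits this form.
    return credit_score(applicant) > 2.0

def human_reviewed_decision(applicant: dict,
                            officer: Callable[[dict, bool], bool]) -> bool:
    # The model only prepares a recommendation; a human takes the final
    # decision, so it is no longer based *solely* on automated processing.
    recommendation = credit_score(applicant) > 2.0
    return officer(applicant, recommendation)
```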
d) Informed consent

Another fundamental problem in Big Data scenarios is the informed consent required to justify massive data processing. That is partly because the purpose of the data processing is already difficult to determine precisely (see above).51 But even if such a purpose could be determined, informed consent is not a secure instrument for justifying the data processing. Whether, when and how free and fully informed consent can be obtained is a question that is difficult to answer. With regard to (partly) automated driving, one might consider whether consent can already be obtained at the stage of purchasing the car, but it remains unclear how far such consent could reach. Moreover, the mass of information necessary to inform the data subject sufficiently usually leads to an information overload, meaning that providing more information does not leave the data subject truly informed. The question arises whether a data subject who does not feel accurately informed can consent in a way that justifies data processing. Without going into detail, it must be underlined that informed consent is undergoing a serious crisis in the digital age. In a nutshell, relying on informed consent to justify Big Data analysis proves to be highly risky.52

e) Permissions

To avoid such risks, the question arises whether data processing in Big Data scenarios is permitted by law. According to Art. 6(1) lit. f GDPR, data processing is lawful if it is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require the protection of personal data, in particular where the data subject is a child. According to Art. 9 GDPR, the processing of sensitive data is generally prohibited and only exceptionally permitted within the scope of Art. 9(2) GDPR. Such an exception is, for example, granted on the basis of EU or national law where the data processing is necessary for reasons of substantial public interest.53 While Art. 9(2) lit. g GDPR allows the EU and the Member States to adopt exceptions permitting the processing of sensitive personal data, Art. 6(1) lit. f GDPR requires a weighing of interests on a case-by-case basis, in which the interests of the controller or of a third party are set against the rights and interests of the data subject. The legitimate interests of the controller or the third party are to be understood in a wide sense and, according to Recital 47 sentence 7 GDPR, can also include processing for the purposes of direct advertisement.54 The same was true within the scope of § 28 Bundesdatenschutzgesetz (BDSG). But even if such commercial interests are to be taken into account in the case-by-case weighing of interests under Art. 6(1) lit. f GDPR, they cannot justify just any change of purpose.55 This was hitherto true under German data protection law and must equally apply to Art. 6(1) lit. f GDPR, lest the purpose limitation principle be undermined. It follows that the massive processing of data in Big Data scenarios for purposes other than those for which the data were collected is difficult to justify under Art. 6(1) lit. f GDPR.

f) Possible solution: permission to anonymize data

One option for processing data using Big Data technology while still complying with data protection law is to anonymize the data. According to Recital 26 GDPR, anonymizing data means manipulating the data in such a way that the data subject can no longer be identified.56 Using mere pseudonyms, which do not effectively prevent identification, is not sufficient. Given the wide variety of methods for de-anonymizing data that are available already today, and those conceivable in the future, it cannot be said with any certainty whether it is actually possible to eliminate every chance of identifying the data subject.57 It could help to devise technical standards, or to declare existing standards legally binding, and to presume irrebuttably that data is anonymized when those standards are complied with. Moreover, the de-anonymizing of data could be prohibited, possibly even made punishable. Nevertheless, data can only be anonymized once they have been collected, not before, and it is usually not possible to anonymize data in the very moment of collection. In areas in which driving behaviour data are gathered for product improvement, for example, or where smart devices are used, the data are inevitably personal at the moment of collection; it seems difficult, if not impossible, to collect such data in already anonymized form.
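The technical core of the problem, and of the solution proposed next, can be sketched as follows (field names and generalization rules are hypothetical): the raw record is unavoidably personal for an instant, and an immediate-anonymization pipeline reduces that instant to a minimum, retaining only the generalized record.

```python
# Hypothetical collect-and-immediately-anonymize pipeline: the personal raw
# record exists only for the instant needed to strip direct identifiers and
# to generalize quasi-identifiers.

def anonymize(raw: dict) -> dict:
    return {
        # Direct identifiers (vehicle ID) are dropped entirely; quasi-
        # identifiers are generalized to resist linkage attacks.
        "region": raw["gps_position"].split(",")[0][:2],  # coarse latitude band
        "speed_band": round(raw["speed"], -1),            # 10 km/h bands
        "braking_event": raw["braking_event"],
    }

def collect(sensor_reading: dict, sink: list) -> None:
    sink.append(anonymize(sensor_reading))  # immediate anonymization
    sensor_reading.clear()  # discard the raw record; a real system must
                            # guarantee deletion, not merely drop a reference

store: list = []
collect({"vehicle_id": "WVW-123", "gps_position": "50.73,7.09",
         "speed": 47.3, "braking_event": True}, store)
print(store)  # only the generalized, non-attributable record remains
```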
Since data processing for the purpose of evaluation through Big Data analysis can also hardly be based on an existing permission or on consent, a permission de lege ferenda to collect and briefly store data for the purpose of immediately anonymizing it, similar to Art. 5(1) of the InfoSoc Directive58 (§ 44a UrhG), could be a solution. Such an exception to the general prohibition of data processing would be a significant step towards legal certainty for data processing which lies in the interest of society, politics and science. However, this solution presupposes that the collection and storage of data for the purpose of immediate anonymization qualifies as a relevant act of data processing under data protection law in the first place. A decision of the German Federal Constitutional Court on automated licence plate recognition could cast doubt on this assumption: according to the court, where data is collected automatically and stored only briefly for the purpose of immediate comparison with registry data, and is deleted immediately after that comparison, there is no infringement of the right to self-determination.59 From a teleological point of view, deleting and anonymizing data are similar acts, because in either case the data subject is less in need of protection. It could therefore be concluded that the automated collection and short-term storage of personal data are not relevant acts under data protection law if the data are deleted or anonymized immediately after collection.60 Art. 4 No. 2 GDPR, on the other hand, at least in its wording, seems to define ‘any’ data collection and storage as an act relevant to data protection law, no matter for how long the data are stored without being anonymized. Assuming that such short-term storage for the purpose of anonymization is indeed a relevant act under data protection law, data from connected cars and smart devices, for example, could be collected in accordance with data protection law under an exception similar to Art. 5(1) of the InfoSoc Directive (§ 44a UrhG), permitting short-term storage for the purpose of immediate anonymization. The right to informational self-determination would be sufficiently protected by the obligation to anonymize the data. Of course, a sufficient regulation of the anonymization process is additionally needed to effectively prevent de-anonymization; this is essential to safeguard the right to informational self-determination. As an alternative to an explicit permission rule similar to Art. 5(1) of the InfoSoc Directive (§ 44a UrhG), which would have to be agreed at EU level, one could work out guidelines for the weighing of interests within the scope of Art. 6(1) lit. f GDPR, at least for the processing of non-sensitive data. These guidelines could declare how the different interests are to be weighted under Art. 6(1) lit. f GDPR where the data processing lies in the common interest. This requires, of course, that politics and society first determine which data processing should be privileged. With such guidelines, a sufficient degree of legal certainty might also be achieved.

IV. Conclusion

In summary, there are three options for designing the relationship between the civil law treatment of data and data protection law.
Data protection law might function as an instrument to assign data rights, or as a participatory instrument which enables data subjects to participate in the profits generated with the use of their personal data. For both mechanisms, data protection law would have to be developed beyond the initial understanding of it as primarily an instrument of protection, a view that does not allow for a commercialization of data. In fact, data has a commercial value, and the draft directive on digital content seeks to recognize this commercial value in legal terms as well. Data protection law could not serve as an instrument of assignment for non-personal data, and establishing different assignment criteria for personal and non-personal data does not seem reasonable, given how easily each category of data can be transformed into the other. Treating data protection law as a participatory instrument, on the other hand, proves problematic because of the difficulties of assessing the value of data. Yet this function deserves intensive consideration, all the more so if the draft directive on digital content, which calls for data to be recognized as akin to consideration in contracts, is adopted. Adequate remuneration for giving away data becomes essential when data serve as consideration, and providing such remuneration requires a legal provision similar to § 32 UrhG.

In any case, data protection law limits the civil law treatment of data. This limitation can undermine potential data rights or contractual agreements concerning data to such an extent that the legal position which civil law is supposed to confer fades into insignificance. A significant limitation appears in relation to data collected for Big Data analysis for commercial purposes. But data processing for research and product development purposes also bears severe legal risks, which apply especially to data collection in the field of autonomous or automated driving. If a more data-driven economy is to act on a secure legal foundation, a permission similar to Art. 5(1) of the InfoSoc Directive (§ 44a UrhG), allowing the short-term storage of data for the purpose of immediate anonymization, or guidelines on the weighing of interests under Art. 6(1) lit. f GDPR, would help. If the data-driven economy is given the possibility of evaluating anonymized data using Big Data technology, at least where this lies in the interest of society, a fair balance between economic needs and an appropriate protection of the right to informational self-determination could be achieved.

Footnotes

1 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, ‘Building a European Data Economy’, COM(2017) 9 final, pp. 13 ff.
2 See for example: Schwartmann/Hentsch, PinG 2016, 117 ff.
3 See especially: Becker, GRUR 2017, 346 ff.; Wiebe, GRUR 2017, 338 ff.; Börding/Jülicher/Röttgen/Schönfeld, CR 2017, 134 ff.; Wandtke, MMR 2017, 6 ff.; Fezer, MMR 2017, 3 ff.; Drexl, Designing Competitive Markets for Industrial Data – Between Propertisation and Access, 2016, MPI for Innovation & Competition Research Paper No. 16-13, available at <https://ssrn.com/abstract=2862975> (accessed 5 April 2018); Kerber, GRUR Int. 2016, 989 ff.; Spindler, JZ 2016, 805 ff.; Heymann, CR 2016, 650 ff.; Härting, CR 2016, 646 ff.; Specht, CR 2016, 288 ff.; Dorner, CR 2014, 617; Hoeren, MMR 2013, 486 (488); Zech, CR 2015, 137; idem, Information als Schutzgegenstand, 2012.
4 Kerber, GRUR Int. 2016, 989 ff.
5 Wiebe/Schur, ZUM 2017, 461 ff.
6 See especially: Drexl (above fn. 3).
7 Most recently: Dreier, in: Weller et al., Tagungsband zum XI. Heidelberger Kunstrechtstag, forthcoming.
8 Specht, Ökonomisierung informationeller Selbstbestimmung – Die zivilrechtliche Erfassung des Datenhandels, 2012.
9 Specht, JZ 2017, 763 ff.; Metzger, AcP 2016, 817 ff.
10 Grassegger/Krogerus, Das Magazin No. 48/2016, ‘Ich habe nur gezeigt, dass es die Bombe gibt’, available at <https://www.dasmagazin.ch/2016/12/03/ich-habe-nur-gezeigt-dass-es-die-bombe-gibt/> (accessed 15 August 2017).
11 Grassegger/Krogerus (above fn. 10).
12 Mayer-Schönberger/Cukier, Big Data, 2013, p. 199; on behaviour prediction using Big Data analysis see also Boehme-Neßler, DuD 2016, 419 (421).
13 See for example: Specht/Rohmer, PinG 2016, 127.
14 For more on the scope of the concept of personal data see ECJ, GRUR Int. 2016, 1169 – Breyer; Klabunde, in: Ehmann/Selmayr, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 5 ff.; Ziebart, in: Sydow, Europäische Datenschutzgrundverordnung, 1st edn 2017, Art. 4 para. 7 ff.; Ernst, in: Paal/Pauly, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 3 ff.; Gola, in: Gola, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 3 ff.
15 See for example: Schwartmann/Hentsch, PinG 2016, 117.
16 European Parliament and Council Directive Proposal, COM(2015) 634 final, 2015/0287 (COD), available at <http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52015PC0634> (accessed 5 April 2018).
17 See for example: Van Asbroeck/Debussche/César, Building the European Data Economy, Data Ownership, White Paper, 2017.
18 On potential uses of Big Data analysis see: Orthmann/Schwierig, NJW 2014, 2984.
19 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
20 Dammann, ZD 2016, 307 (311); Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 para. 23: ‘Dreh- und Angelpunkt’ (‘linchpin’); Schantz, in: BeckOK Datenschutzrecht, 20th edn 2017, Art. 5 DSGVO para. 13: ‘beherrschendes Konstruktionsprinzip’ (‘dominant structural principle’).
21 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 para. 27.
22 Art. 29 Working Party, Opinion 03/2013 on purpose limitation, 00569/13/EN WP 203, p. 20, available at <http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf> (accessed 9 September 2017); Monreal, ZD 2016, 507 (509).
23 See Helbig, K&R 2015, 145 (146).
24 Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 17.
25 BVerfG, NJW 1984, 419 (422) – Volkszählung; Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 13.
26 Culik/Döpke, ZD 2017, 226 (227); Bergmann/Möhrle/Herb, Datenschutzrecht, 50th supplement 2016, § 4 para. 43.
27 Weichert, ZD 2013, 251 (256); Culik/Döpke, ZD 2017, 226 (230).
28 Boehme-Neßler (above fn. 12); Kring (above fn. 26), p. 553 with further references, available at <https://subs.emis.de/LNI/Proceedings/Proceedings232/551.pdf> (accessed 11 August 2017).
29 BVerfG, NJW 1984, 419 (423) – Volkszählung.
30 Richter, DuD 2015, 735 (738).
31 Likewise: Richter, DuD 2015, 735 (738); Culik/Döpke, ZD 2017, 226 (230); not excluding commercial purposes, at least not from the outset: Art. 29 Working Party, Opinion 03/2013 on purpose limitation, 00569/13/EN WP 203, p. 28, available at <http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf> (accessed 5 April 2018).
32 See COM(92) 422, pp. 30, 43.
33 On this discussion see: Kring (above fn. 26), p. 553 with further references, available at <https://subs.emis.de/LNI/Proceedings/Proceedings232/551.pdf> (accessed 11 August 2017).
34 BVerfG, NJW 1984, 419 (424) – Volkszählung; BVerfG, NJW 1969, 1707 – Repräsentativstatistik; Roßnagel, ZD 2013, 562 (565); Schaar, RDV 2013, 223 (225).
35 Wittig, RDV 2000, 59 (61); Moos, MMR 2006, 718 (721).
36 Moos, MMR 2006, 718 (721); OLG Frankfurt/M., CR 2001, 294 (296); also discussing the question: Scholz, in: Roßnagel, Handbuch Datenschutzrecht, 2003, pp. 1833 (1864).
37 See for example: Culik/Döpke, ZD 2017, 226 (228).
38 In more detail: Culik/Döpke, ZD 2017, 226 (228).
39 Recital 50 sentences 1 and 2 GDPR; Härting, Datenschutz-Grundverordnung, 2016, para. 515; Culik/Döpke, ZD 2017, 226 (230).
40 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 30; on the principle of compatible further processing see also: Monreal, ZD 2016, 507 ff.
41 Härting (above fn. 39), para. 516 ff.; Culik/Döpke, ZD 2017, 226 (229).
42 On the interpretation of statistical purposes see already above.
43 Grages, in: Plath, BDSG/DSGVO, 2nd edn 2017, Art. 89 DSGVO para. 2; see also: Paal/Hennemann, NJW 2017, 1679 (1700); on Big Data in medicine: Schwab/Becker, ZD 2015, 151 ff.; Spindler, MedR 2016, 691 ff.; Timm, MedR 2016, 681.
44 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 30.
45 See already above.
46 Dammann, ZD 2016, 307 (313 ff.).
47 Taking an even more liberal view: Helbig, K&R 2015, 145 ff.
48 Reimer, in: Sydow (above fn. 14), Art. 6 para. 67; Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 34 ff.; Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 24 ff.
49 See: Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 21.
50 Recital 71 GDPR; Dix, Stadtforschung und Statistik, 2016, pp. 59, 61, available at <https://www.eaid-berlin.de/wp-content/uploads/2016/05/StSt-1-2016_Dix.pdf> (accessed 8 September 2017).
51 See already above.
52 More specifically: Specht/Schröder/Bienemann, Die Chancen der Visualisierung, in: Handbuch Datenrecht in der Digitalisierung, forthcoming; Culik/Döpke, ZD 2017, 226 (229).
53 On the use of Big Data analysis in the area of sensitive data see: Schneider, ZD 2017, 303 ff.
54 See also: Frenzel, in: Paal/Pauly (above fn. 14), Art. 6 para. 28.
55 Simitis, in: Simitis, Bundesdatenschutzgesetz, 8th edn 2014, § 28 para. 99; Mattke, Adressenhandel, 1995, p. 191; Breinlinger, RDV 1997, 247 (249 ff.).
56 See Ernst, in: Paal/Pauly (above fn. 14), Art. 4 DSGVO para. 48.
57 See: Boehme-Neßler (above fn. 12), at 422.
58 Directive 2001/29/EC of the European Parliament and of the Council, OJ L 167, 22.6.2001, p. 10, available at <https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32001L0029&from=EN> (accessed 5 April 2018).
59 BVerfG, 11 March 2008, 1 BvR 2074/05, MMR 2008, 308 (309) – Automatisierte Erfassung von Kfz-Kennzeichen.
60 Personal communication from Prof. Hubertus Gersdorf; BVerfG, 11 March 2008, 1 BvR 2074/05, MMR 2008, 308 (309) – Automatisierte Erfassung von Kfz-Kennzeichen.
Published by Oxford University Press. All rights reserved. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/about_us/legal/notices) http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Journal of Intellectual Property Law & Practice Oxford University Press

Considering the relationship between the civil law treatment of data and data protection law in Germany

Loading next page...
 
/lp/ou_press/considering-the-relationship-between-the-civil-law-treatment-of-data-mP4P0dVQZ8
Publisher
Oxford University Press
Copyright
© The Author(s) 2018. Published by Oxford University Press. All rights reserved.
ISSN
1747-1532
eISSN
1747-1540
D.O.I.
10.1093/jiplp/jpy061
Publisher site
See Article on Publisher Site

Abstract

A data-driven economy, in the way it is currently developing and politically desired, needs legal certainty. Therefore, it is important to consider the relationship between the civil law treatment of data and data protection law. This article seeks to provide some answers. I. Introduction In response to the rising importance of data for the economy, the possibility of a specific ‘data right’ has been discussed with increasing frequency. However, the debate suffers from conflicting interests and intentions. While the EU Commission proposes to introduce a ‘data producer’s right’1 to create clear rules for data markets, consumers’ representatives primarily want to enable the data-subjects to benefit from the profits based on the use of their personal data. Therefore, they seek to establish a data right that goes beyond the merely defensive character of data protection law.2 Secondly, the debate lacks a clear application scenario of ‘data rights’. It even seems that - quite the opposite- very heterogenous scenarios shall benefit from ‘data rights’ and, consequently, various interests shall be satisfied. Until today, there has not been a profound workup and analysis of these application scenarios. The question that arises is whether all of them can be dealt with having only one uniform solution: a uniform data right with a uniform scope, for example, or whether branch-specific solutions would be preferable instead. On a more abstract level, arguments in favour and against ‘data rights’ have already been exchanged, comprehensively3, just as the potential lack of economic4 and constitutional5 legitimacy of these data rights has already been discussed. Without wanting to examine these issues in detail, it is worth mentioning that a ‘data right’ would not need to be an overarching, property-like designed right. It is just as possible to think about access rights to data6 or to reduce the scope of a data right to specific branches7. Already today, data can be made subject to contracts,8 it can even be treated as consideration.9 Thus, if a ‘data right’ is not needed due to economic or constitutional reasons, an alternative would be to design standard contractual terms which balance the parties’ interests adequately. But regardless how one treats data in a legal manner, it must be considered in which way data protection law influences this treatment. Hence, this article seeks to examine the relationship between civil law data treatment and data protection law. Hereby, data protection law allows for a distinction between different purposes of data processing. It is, therefore, necessary to begin the examination with a phenomenological assessment of potential application scenarios for civil law data treatment (II.) and as a second step to explain the impacts of data protection law on the design of possible ‘data rights’ and the contractual treatment of data (III.). II. Application scenarios The scenarios in which ‘data-rights’ or data contracts could be relevant can be divided into different categories, the first of which is commercial interest in data handling, the second of which concerns data handling in security-, academic-, or archiving interests. For instance, with regard to (partly-) automated cars, both the producer of the car and the developer of the integrated software are interested in legal positions concerning the data which is collected in the car, for example the driving behaviour of the car owner, because such data can be passed on to insurance providers against payment. 
As another example, health insurances and producers are interested in data generated by so called smart devices (like fitness trackers or similar applications), to develop customized rates, based on the collected data to minimize the risk of the insurance provider. Rating portals and social networks have interest in extensive data rights which relate to data being generated and collected with the use of their websites, since such data forms the essential value for those networks. Other potential application scenarios can be, for example, data collection in smart homes and smart metering, such as heating behaviour, security behaviour (automated looking of doors, clock-timed lamps, automated control of electric blinds), use of water, etc. All such data collection is generally aimed at analyzing the data through Big Data applications and, hence, to allow a detailed creation of user profiles. The goal of creating these user profiles is – commonly - to provide individualized advertisement. The same is true for customer-card-systems with which shopping behaviour can be analyzed, as well as for click-stream analysis over the internet. Apart from displaying individualized advertisements, the evaluation of data allows for the prediction of behaviour of voters, insurance policy holders and employees etc. Indeed, through the evaluation of personal data, the perception of the data-subjects’ personality has proven to be very precise. For instance, by using the data of only 68 Facebook-likes, analysts can find out a user’s skin colour (with 95% accuracy), sexual orientation (with 88% accuracy), and political persuasion (with 85% accuracy). Intelligence, religion, consumption of alcohol, cigarettes and drugs can also be estimated on this basis.10 Using only 10 Facebook-likes, analysts are able to rate the data subject’s character better than an average work colleague, while by 70 likes they can prove a better understanding of the data subject’s character than his friends can, and by 150 likes, analysts even outbid the data subject’s parents’ understanding of the data subject’s character. With 300 likes, the prediction of the data subject’s behaviour can be described with greater precision than it could be by the data subject’s own partner. With more likes, one can even outbid what the users believe to know about themselves.11 That is, why even the government might have an interest in collecting data to predict and prevent crime. Mayer-Schöneberger/Cukier described already in 2013 how personal data can be used to calculate the probability of an individual to commit a crime.12 Besides individualized advertisement and behavioural prediction, there are other differently motivated interests. For instance, partly- (and in future fully-) automated cars rely on data that enables automated driving by using movement data of the car. Intelligent technology used to regulate traffic and based on car movement data could also assist to regulate traffic lights, making traffic more fluent. In medical research data are processed for scientific purposes. That the processing of such data can also be of commercial interest should not cast a shadow on the fact that data processing in those cases is here mainly being used for scientific advancement, the development of product safety or other public interests. Such society-relevant interests may therefore have to be treated differently from purely commercial interests. III. 
Relationship between data rights and data protection law Regarding these various constellations, which are, of course, non-exhaustive, the relationship between the civil law treatment of personal data and data protection law can be thought about in three different ways. 1. Data protection law as an instrument to assign data rights Data protection law can be used as an instrument to assign data rights. This, of course, requires the development of data protection law beyond the initial understanding of it as just an instrument of protection to the benefit of the data subject. This is a view that does not allow for a commercialization of data. Regardless as to whether it is possible13 or desirable for data protection law to serve as such as an instrument of assignment of data rights (remember the statements made by the German Constitutional Court about data protection law in the census-decision: ‘The individual does not have a right in the sense of an absolute, illimitably control over their data; information, even if personal, reflects social reality, which cannot be assigned only to the one individual alone’), ‘data-subject’s-rights’ can only be to the benefit of data subjects. Hence, they cannot exist with regard to non-personal data. In many situations though, making a distinction between personal and non-personal data would prove problematic. That is because of the extensive definition of personal data,14 which leads to a large amount of data being personal. In any case, it is at least theoretically possible to remove any data’s personalization by anonymizing the data, or by processing non-personal data at the outset, eg weather and ground-conditions. But also this non-personal data can become personal data (again) by adding more data. Hence, the distinction between personal and non-personal data depends, in many cases, on the data processing circumstances which can make data personal at one moment and non-personal at the other. Therefore, it could make sense not to distinguish between personal an non-personal data, when shaping a ‘data-right’ or issue rules for the contractual treatment of data. 2. Data protection law as a participatory instrument Secondly, one might think of data protection law as a participatory instrument which enables the data subjects to participate in the profits generated by processing their personal data.15 Yet, it is unclear as to how such participation in and distribution of profits could be calculated. It is nigh impossible to determine the portion of the profit of a data processing company that is related to the processing of one person’s specific data. Even if the calculation succeeds, the distribution is difficult. A ‘collecting society data’ (Verwertungsgesellschaft Daten) which helps distributing the profits earned with data-processes does not appear to be a proper solution, since one would hardly know where and which data has been given out. Often enough, data collection happens with only rudimentary knowledge of the data subject (as with cookies, data processing by social networks, or customer-card-systems). In any case, notice of the giving away of data would be essential to participate in the data-processing profit. Anyway, it is open to dispute whether data protection law would even hinder the data subjects’ participation, because its function is actually deemed to discourage the data subject from giving away personal data and to prevent the data subject from losing control over ‘their’ personal data. 
The state is obligated to protect its citizens from becoming pure objects of data processing. It seems to be a myth that earning money by giving away personal data leads to a more conscious treatment of data by the data subject. Rather, conscious dealing with personal data can only be achieved when the data subjects can be made aware of which data processing arrangement they are agreeing to and which rights can be exercised to limit the use of their personal data. The visualization of information seems to be at least worth a try. Nonetheless, especially the draft directive for digital content16 shows that there is a strong growth of commercialization of personal data which has to, in case of the direction being adopted, recognized by the national legislators of the EU Member States. Assuming (which would surely be difficult to argue) a compatibility of the draft directive and Art 8 of ECHR it would become possible, at least in ‘synallagmatic’ contracts where data are given as consideration, to implement legal rules guaranteeing adequate remuneration of the data subject. It is, at least theoretically, thinkable to have a right similar to § 32 UrhG, but it does seem impracticable in the absence of a satisfactory method of calculating the commercial worth of data. Unfortunately, economic research concerning the calculation of data value is only beginning. 3. Data protection law as an instrument to limit a data right When data rights are assigned to a person other than the data subject, data protection law limits these rights in a way the right to one’s own image (§§ 22, 23 KUG) limits copyright law (regarding a photograph the copyright owner cannot decide about the use of the photograph without the consent of the portrayed person). The same limitation is true for the contractual treatment of data. You cannot process data on the basis of a contract without the consent of the data subject. Data Processing is an act relevant to data protection law, which underlies the prohibition principle, regardless of whether the data are intended to be processed on a contractual basis or because of a specific ‘data right’. a) Scope of the limitation by data protection law Theoretically, the relationship between the civil law treatment of data and data protection law can be thought about precisely in a way which does quite make sense. However practically, the problem is that a limitation by data protection law can undermine potential data rights or contractual agreements concerning data in a way that the legal position concerning the data which should be given under civil law would fade into insignificance. Data rights and data contracts are to capture and mirror reality, in which data are being used more frequently, and becoming a dominant economic factor.17 In any application scenario mentioned under II., a range of data is being processed, especially with the intention to analyse it, using Big Data technology.18 The EU General Data Protection Regulation (GDPR),19 however, is not made for such a massive processing of personal data. There may be circumstances in which permission is granted under Art. 6 GDPR, but in the majority of Big Data scenarios, the data subjects’ consent is required. It has to be an informed consent, meaning, in order for the person to be able to consent, all information about the data process has to be provided. The more data is gathered, the broader the purposes of the data processing are, the more difficult it is to provide all information necessary. 
Whether a state of being informed is even realisable in Big Data scenarios is thus highly questionable. Moreover, the possibility for the data subject to withdraw consent at any time leads to heightened uncertainty for a 'data right' owner or for anyone obtaining data on a contractual basis. The same holds true for the data protection principles, for example the purpose limitation principle, the transparency principle and the principle of data minimisation.

b) Purpose limitation principle

Already Art. 8 ECHR requires that personal data be used only for the purpose for which they were collected. The purpose limitation principle is the dominant principle of data protection law.20 It legitimizes the processing of data and requires that the purpose of data processing be specified, clear and legitimate, according to Art. 5 (1) lit. b) GDPR. The criterion of specification is to be understood in a formal sense, while the criterion of clarity refers to the purpose in its material understanding.21 A legitimate purpose is one that is not condemned by the legal order; ethical considerations and social customs also need to be taken into account.22 For example, racially motivated discrimination against a group of people is an illegitimate purpose.23 Nevertheless, the criterion of legitimacy remains only a coarse filter to eliminate clearly illegitimate purposes.24 The definition of the purpose of data processing has to be completed before the data are processed; otherwise the processing is not in accordance with the law.25 As the required precision in specifying the purpose is determined on a case-by-case basis, a blanket or overly generalized purpose is not sufficient.26 Against this background, specifying the purpose, and therefore complying with the GDPR, is very difficult when using Big Data analysis. That is because it is essential for Big Data processes to keep the purpose flexible,27 in order to be able to use various analytical algorithms which search for hitherto unknown connections between data and put them in relationship to one another.28

(1) Data processing for statistical purposes

In its census decision, the German Federal Constitutional Court (BVerfG) determined that data processing for statistical purposes cannot be subjected to a requirement of a narrow and concrete purpose. It argued that it is an essential characteristic of statistics that data, after their initial processing, are meant to be reused for various other tasks whose purposes cannot be determined at the initial stage of processing.29 The same applies to Big Data analysis, which is why this argument could be transferred to Big Data scenarios: statistical analysis as such is not limited to specific content; rather, it is a procedure that can be applied to various content.30 The GDPR does not clearly state whether statistical purposes of data processing comprise only purposes in the public interest, or whether commercial interests can also be covered. The wording of Art. 5 (1) lit. b) half-sentence 2 GDPR indicates that only archiving purposes have to serve the public interest, whereas statistical purposes can also lie in commercial interests. By contrast, the provision which preceded Art. 5 GDPR (namely Art. 6 (1) lit. b) sentence 2 of the Data Protection Directive 95/46/EC) indicates that statistical purposes cannot be of a commercial character.31 That is because the other purposes named in Art. 6 (1) lit.
b) sentence 2 of the Data Protection Directive are non-commercial (historical and scientific) purposes.32 In using the term 'statistical purposes', the German Federal Constitutional Court, too, intended to privilege only data processing in the public interest (here: the census), not commercial interests. Acknowledging these historical and teleological arguments, one can hardly argue that the purpose of a Big Data analysis may be specified as broadly as is accepted for statistical purposes in the public interest.

(2) Data processing for advertisement purposes as a sufficient purpose?

If a significant part of data processing in Big Data scenarios serves the purpose of market and opinion research, or more specifically customer profiling and individualised advertising, the question is whether 'advertisement purposes' or 'marketing purposes' can satisfy the required specification of the purpose. On the one hand, it could be argued that the purpose of 'advertisement' or 'marketing' covers a whole range of advertising activities (addressing customers via newspapers, e-mail, telephone marketing or live advertising on the internet, etc.) and would even include product and services research; the specification of purpose could thus be deemed too imprecise. On the other hand, it could be reasoned that the customer only cares about the 'primary' purpose, being addressed with advertising, regardless of the form in which the advertising is done and the steps taken before.33 This argument is supported by Recital 47 sentence 7 GDPR, which expressly recognizes purposes of 'direct advertisement' as legitimate within the balancing of interests, without the purpose having to be specified any further. 'Advertisement purposes' as well as 'marketing purposes' therefore seem possible as sufficient specifications, though opinions differ on that issue.

(3) Data processing with the purpose of personality profiling

However, even if 'advertisement purposes' and 'marketing purposes' are deemed sufficiently specified purposes, they will not cover activities aimed at creating personality profiles. That is because creating personality profiles is prohibited by law. The German Federal Constitutional Court made this point in its census decision as well as in its decision on the constitutionality of a representative statistical survey: a broad registration and cataloguing of the personality through the merging of individual life and personal data to create personality profiles of citizens is prohibited even in the anonymity of a statistical survey.34 This follows from the inviolable guarantee of human dignity in Art. 1 of the German Constitution (GG), which would be infringed by the creation of personality profiles (even when they cover only part of a person's personality; how much must be covered is to be determined on a case-by-case basis). If that already applies to statistical purposes which lie in the public interest, and which are therefore privileged in certain respects, it must apply all the more when personality profiles are created for commercial purposes, which are not privileged at all.
The vast majority of academic literature agrees with this view, at least where individual data are not merely gathered but are also evaluated with the aim of depicting the consumer's personality.35 The Frankfurt Higher Regional Court (OLG Frankfurt) even considers data processing admissible where the data would allow the creation of a detailed personality profile.36 But this jurisprudence of the OLG Frankfurt is contrary to the census decision, which expressly speaks of a breach of the human dignity principle in connection with the creation of personality profiles. One must not differentiate between constellations in which data have merely been collected and those in which they have already been analysed: once the data have been collected, a single mouse click suffices to start the analysis. Thus, the personality right is in concrete danger even where the data are merely being collected. In a nutshell: where data are generated for the purpose of analysis by Big Data technology, conflicts with the purpose limitation principle will inevitably arise if 'advertisement purposes' or 'marketing purposes' are found not to be sufficient. The creation of personality profiles is no valid purpose and is therefore unable to justify data processing.

(4) Derogations from the purpose limitation principle

Processing data for purposes other than the purpose for which the data were collected is possible if the data subject consents. Thus, it is possible to justify data processing even for purposes not yet defined at the moment the data are collected. Theoretically, this might be a solution for Big Data scenarios; practically, it is questionable whether seeking consent from the data subject would prove a workable procedure, given the vast amounts of data involved in Big Data scenarios.37 It is also possible to derogate from the purpose limitation principle on the basis of European Union legislation or national legislation of the Member States. This possibility is provided in Art. 6 (4) GDPR and is limited only by Art. 23 GDPR.38 In any event, the purpose limitation principle only requires that data not be processed for a purpose incompatible with the purpose for which the data were collected. Therefore, if the initial purpose and the purpose of further processing are compatible, neither the consent of the data subject nor specific legislation at EU or national level is required.39 The initial purpose of the processing is the benchmark for judging which further processing is admissible.40 In order to ascertain whether a purpose of further processing is compatible with the purpose for which the personal data were initially collected, the controller should, according to Recital 50 sentence 6 GDPR, take into account: any link between those purposes and the purposes of the intended further processing; the context in which the personal data have been collected, in particular the reasonable expectations of data subjects, based on their relationship with the controller, as to their further use; the nature of the personal data; the consequences of the intended further processing for data subjects; and the existence of appropriate safeguards in both the original and the intended further processing operations.
Hence, a balancing of interests is to be carried out, taking into account all circumstances of the case.41 There are, however, some purposes of further processing which enjoy a privileged status, such as statistical and scientific purposes (Art. 89 GDPR).42 Recital 157 GDPR expressly names scientific research as a possible purpose of further data processing.43 But purposes not named in Art. 89 GDPR and Recital 157 can also be compatible with the purpose for which the data were collected.44 Nevertheless, the further purpose, too, must not be illegitimate.45 To sum up, if data are gathered for legitimate purposes and the purpose of further processing is sufficiently precise and legitimately chosen, Big Data analysis for such purposes can comply with data protection law.46 A general exception in favour of Big Data analysis, however, is unknown to the GDPR. Thus, every processing by way of Big Data analysis requires a balancing of interests on a case-by-case basis.47 A legal framework providing legal certainty in the field of Big Data analysis is therefore not in place.

c) Principle of data and storage minimisation, principle of transparency

According to the principle of data minimisation, personal data must be adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.48 Thus, the data minimisation principle, too, limits data processing by reference to its purpose. Since that purpose is already difficult to determine for Big Data analysis, there is a high risk that Big Data analysis will not comply with the principle of data minimisation. The same applies to the principle of storage minimisation (Art. 5 (1) lit. e) GDPR), which prohibits any data storage beyond the purpose of the data processing. The transparency principle also seems difficult to comply with, for it requires that the data subject be properly informed about the purpose of the data processing.49 Finally, the data subject has the right not to be subjected to a decision based solely on automated data processing (for example by Big Data algorithms) which produces legal effects concerning them or similarly significantly affects them (Art. 22 GDPR). This prohibits, for example, an automated decision on a bank loan made by algorithms.50

d) Informed consent

Another fundamental problem in Big Data scenarios is the informed consent required to justify massive data processing. That is partly because the purpose of the data processing is already difficult to determine precisely (see above).51 But even if such a purpose could be determined, informed consent is not a secure instrument to justify the data processing. Whether, when and how free and fully informed consent can be obtained is a question difficult to answer. Regarding (partly) automated driving, one might consider whether consent can already be obtained at the stage of purchasing the car, but it remains unclear how far such consent could reach. Moreover, the mass of information necessary to inform the data subject sufficiently usually leads to an information overload, meaning that providing more information does not result in the data subject being truly informed. The question arises whether a data subject who does not feel accurately informed can consent in a way which justifies data processing. Without going into detail, it must be underlined that informed consent is undergoing a serious crisis in the digital age.
In a nutshell, relying on informed consent to justify Big Data analysis proves highly risky.52

e) Permissions

To avoid such risks, the question must be considered whether data processing in Big Data scenarios is permitted by statute. According to Art. 6 (1) lit. f) GDPR, data processing is lawful if it is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child. According to Art. 9 GDPR, the processing of sensitive data is generally prohibited and only exceptionally permitted within the scope of Art. 9 (2) GDPR. Such an exception is, for example, granted on the basis of EU or national law if the data processing is necessary for reasons of substantial public interest.53 While Art. 9 (2) lit. g) GDPR allows EU Member States to adopt exceptions for the processing of sensitive personal data, Art. 6 (1) lit. f) GDPR requires a weighing of interests on a case-by-case basis. In this balancing of interests, the interests of the controller or a third party on the one hand and the rights and interests of the data subject on the other are set in relation to one another. The legitimate interests of the controller or the third party are to be understood in a wide sense and, according to Recital 47 sentence 7 GDPR, can also include processing for the purposes of direct advertisement.54 This was also true within the scope of § 28 Bundesdatenschutzgesetz (BDSG). But even if such commercial interests are to be taken into consideration in the case-by-case weighing of interests under Art. 6 (1) lit. f) GDPR, they cannot justify just any change of purpose.55 This was hitherto true under German data protection law and must equally apply to Art. 6 (1) lit. f) GDPR, to prevent the principle of purpose limitation from being undermined. Hence it follows that the massive processing of data in Big Data scenarios for purposes other than the purpose for which the data were collected is difficult to justify under Art. 6 (1) lit. f) GDPR.

f) Possible solution: permission to anonymize data

One option to process data using Big Data technology while still complying with data protection law is to anonymize the data. According to Recital 26 GDPR, anonymizing data means manipulating the data in such a way that the data subject can no longer be identified.56 Using mere pseudonyms, which do not effectively prevent identification, is not sufficient. Taking into account the wide variety of methods for de-anonymizing data that are available already today, and those conceivable in the future, it cannot be said with any certainty whether it is actually possible to eliminate every chance of identifying the data subject.57 It could help to devise technical standards, or to declare existing standards legally binding, and to presume irrefutably that data are anonymized when these standards are complied with. Moreover, de-anonymizing data could be prohibited and possibly even made punishable. Nevertheless, anonymization can only take place once the data have been collected, not before, and it is usually not possible to anonymize data at the very moment of collection. In areas in which, for example, driving behaviour data are gathered for product improvement, as well as where smart devices are used, the data are inevitably personal at the moment they are collected. The difference between mere pseudonymization and anonymization in the sense of Recital 26, on which this section relies, is illustrated by the sketch below.
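To make that distinction concrete, consider the following minimal sketch in Python. It is an illustration only, not a legally validated anonymization procedure: the field names and the deliberately simple generalisation rules are hypothetical assumptions, not drawn from the article or the GDPR. It contrasts keyed-hash pseudonymization, under which the data remain personal because linkage and re-identification stay possible, with the removal of the direct identifier and the coarsening of quasi-identifiers.

import hashlib

record = {"name": "Jane Doe", "age": 34, "postcode": "53113", "heart_rate": 72}

def pseudonymize(rec, secret):
    # Replaces the direct identifier with a keyed hash. Whoever holds the
    # secret can recompute the mapping and link records, so in the sense of
    # Recital 26 GDPR the data remain personal data.
    out = dict(rec)
    out["name"] = hashlib.sha256((secret + rec["name"]).encode()).hexdigest()
    return out

def anonymize(rec):
    # Drops the direct identifier and coarsens the quasi-identifiers so that
    # the data subject is no longer identifiable -- assuming, as the article
    # stresses, that no auxiliary data allow re-identification.
    decade = (rec["age"] // 10) * 10
    return {
        "age_band": f"{decade}-{decade + 9}",   # e.g. "30-39"
        "region": rec["postcode"][:2] + "xxx",  # e.g. "53xxx"
        "heart_rate": rec["heart_rate"],
    }

print(pseudonymize(record, "secret-key"))  # still personal data
print(anonymize(record))                   # arguably anonymous

The sketch also shows why the article treats de-anonymization as the decisive risk: whether the coarsened record is truly anonymous depends entirely on what auxiliary data an attacker can link to it.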
It seems rather difficult, if not impossible, to collect such data in already anonymized form. Since data processing for the purpose of evaluation by Big Data analysis can also hardly be based on an existing permission or on consent, a de lege ferenda permission to collect and briefly store data for the purpose of immediately anonymizing them, similar to Art. 5 (1) InfoSoc Directive58 (§ 44a UrhG), could be a solution (see the sketch at the end of this section). Such an exception to the general prohibition of data processing would be a significant step towards legal certainty for data processing which lies in the interest of society, politics and science. However, this solution presupposes that the collection and storage of data for the purpose of immediate anonymization qualifies as a relevant act of data processing under data protection law at all. A decision of the German Federal Constitutional Court concerning automated licence plate recognition could cast some doubt on this assumption: according to the court, where data are automatically collected and stored only for a short term for the purpose of immediate comparison with registry data, and are immediately deleted after this comparison, there is no infringement of the right to informational self-determination.59 From a teleological point of view, deleting and anonymizing data are similar acts, because in either case the data subject is in less need of protection. It could therefore be concluded that the automated collection and short-term storage of personal data are not relevant acts under data protection law if the data are deleted or anonymized immediately after collection.60 Art. 4 No. 2 GDPR, on the other hand, at least in its wording, seems to define 'any' data collection and storage as an act relevant to data protection law, no matter for how long the data are stored without being anonymized. Assuming that even such short-term storage for the purpose of anonymization is a relevant act under data protection law, data from connected cars and smart devices, for example, could be collected in accordance with data protection law under an exception similar to Art. 5 (1) InfoSoc Directive (§ 44a UrhG), permitting short-term storage for the purpose of immediate anonymization. The right to informational self-determination would be sufficiently protected by the obligation to anonymize the data. Of course, a sufficient regulation of the anonymization process is additionally needed in order to prevent de-anonymization effectively; this is essential to safeguard the right to informational self-determination. As an alternative to an explicit permission rule similar to Art. 5 (1) InfoSoc Directive (§ 44a UrhG), which would need to be agreed upon at EU level, one could work out guidelines for the weighing of interests within the scope of Art. 6 (1) lit. f) GDPR, at least for the processing of non-sensitive data. These guidelines could declare how the different interests are to be weighted under Art. 6 (1) lit. f) GDPR where the data processing lies in the common interest. This requires, of course, that politics and society determine beforehand which data processing should be privileged. With such guidelines, a sufficient degree of legal certainty might also be achieved.
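How the proposed 'collect, store briefly, anonymize immediately, delete' permission could operate may be illustrated by a minimal sketch, again with hypothetical class, function and field names that are assumptions of this illustration rather than anything found in the article or in any statute. Raw connected-car telemetry is buffered only until the next anonymization pass, echoing the compare-and-delete logic of the licence-plate decision.

from dataclasses import dataclass, field

def strip_identifiers(record):
    # Keep only non-identifying measurements; drop the vehicle ID and GPS trace.
    return {"speed_kmh": record["speed_kmh"], "braking_force": record["braking_force"]}

@dataclass
class ShortTermBuffer:
    # Holds raw (personal) records only until the next anonymization pass.
    _raw: list = field(default_factory=list)

    def collect(self, record):
        # Step 1: collection and short-term storage -- permitted, under the
        # proposal, only because anonymization follows immediately.
        self._raw.append(record)

    def flush(self):
        # Step 2: immediate anonymization; step 3: deletion of the raw data.
        out = [strip_identifiers(r) for r in self._raw]
        self._raw.clear()
        return out

buf = ShortTermBuffer()
buf.collect({"vehicle_id": "D-AB 1234", "gps": (50.73, 7.09),
             "speed_kmh": 48, "braking_force": 0.2})
print(buf.flush())  # [{'speed_kmh': 48, 'braking_force': 0.2}]
print(buf._raw)     # []  -- no raw personal data retained

The design point of the sketch is that the raw buffer, not the anonymized output, is the only place where personal data ever exist, and it is emptied as part of the very operation that produces the anonymized records.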
IV. Conclusion

In summary, there are three options to design the relationship between the civil law treatment of data and data protection law. Data protection law might function as an instrument to assign data rights, or as a participatory instrument enabling the data subjects to participate in the profits generated with the use of their personal data. For both mechanisms, data protection law would have to be developed beyond the initial understanding of it as primarily an instrument of protection, an understanding that does not allow for a commercialization of data. In fact, data have a commercial value, and the draft directive on digital content seeks to recognize this commercial value in legal terms, too. Data protection law could not serve as an assignment instrument for non-personal data, and establishing different assignment criteria for personal and non-personal data does not seem reasonable, given the large variety of ways in which each category can be transformed into the other. Treating data protection law as a participatory instrument, on the other hand, proves problematic because of the difficulties of assessing the value of data. Yet this function deserves intensive consideration, all the more if the draft directive on digital content, which calls for data to be recognized as akin to consideration in contracts, is adopted. Adequate remuneration for giving away data becomes essential when data serve as consideration, and providing such remuneration would require a legal provision similar to § 32 UrhG. In any case, data protection law limits the civil law treatment of data. This limitation can undermine potential data rights or contractual agreements concerning data to such an extent that the legal position in the data which civil law is supposed to confer fades into insignificance. A significant limitation appears in relation to data collected for Big Data analysis for commercial purposes. But data processing for research and product development purposes also bears severe legal risks, which apply especially to data collection in the field of autonomous or automated driving. If a more data-driven economy is supposed to act on a secure legal foundation, a permission similar to Art. 5 (1) InfoSoc Directive (§ 44a UrhG), allowing short-term storage of data for the purpose of immediate anonymization, or guidelines on the weighing of interests under Art. 6 (1) lit. f) GDPR, would be of help. If the data-driven economy is given the possibility to evaluate anonymized data using Big Data technology, at least where this is in the interest of society, a fair balance between economic needs and an appropriate protection of the right to informational self-determination could be achieved.

Footnotes

1 Communication from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions, 'Building a European Data Economy', COM(2017) 9 final, p. 13 ff.
2 See for example: Schwartmann/Hentsch, PinG 2016, 117 ff.
3 See especially: Becker, GRUR 2017, 346 ff.; Wiebe, GRUR 2017, 338 ff.; Börding/Jülicher/Röttgen/Schönfeld, CR 2017, 134 ff.; Wandtke, MMR 2017, 6 ff.; Fezer, MMR 2017, 3 ff.; Drexl, Designing Competitive Markets for Industrial Data – Between Propertisation and Access, 2016, MPI for Innovation & Competition Research Paper No. 16-13, <https://ssrn.com/abstract=2862975> (accessed 5 April 2018); Kerber, GRUR Int.
2016, 989 ff.; Spindler, JZ 2016, 805 ff.; Heymann, CR 2016, 650 ff.; Härting, CR 2016, 646 ff.; Specht, CR 2016, 288 ff.; Dorner, CR 2014, 617; Hoeren, MMR 2013, 486 (488); Zech, CR 2015, 137; idem, Information als Schutzgegenstand, 2012.
4 Kerber, GRUR Int. 2016, 989 ff.
5 Wiebe/Schur, ZUM 2017, 461 ff.
6 See especially: Drexl (above fn. 3).
7 Most recently: Dreier, in: Weller et al., Tagungsband zum XI. Heidelberger Kunstrechtstag, forthcoming.
8 Specht, Ökonomisierung informationeller Selbstbestimmung – Die zivilrechtliche Erfassung des Datenhandels, 2012.
9 Specht, JZ 2017, 763 ff.; Metzger, AcP 2016, 817 ff.
10 Grassegger/Krogerus, Das Magazin No. 48/2016, 'Ich habe nur gezeigt, dass es die Bombe gibt', <https://www.dasmagazin.ch/2016/12/03/ich-habe-nur-gezeigt-dass-es-die-bombe-gibt/> (accessed 15 August 2017).
11 Grassegger/Krogerus (above fn. 10).
12 Mayer-Schönberger/Cukier, Big Data, 2013, p. 199; on behaviour prediction using Big Data analysis see also Boehme-Neßler, DuD 2016, 419 (421).
13 See for example: Specht/Rohmer, PinG 2016, 127.
14 For more information regarding the scope of personal data see CJEU, GRUR Int. 2016, 1169 – Breyer; Klabunde, in: Ehmann/Selmayr, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 5 ff.; Ziebarth, in: Sydow, Europäische Datenschutzgrundverordnung, 1st edn 2017, Art. 4 para. 7 ff.; Ernst, in: Paal/Pauly, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 3 ff.; Gola, in: Gola, Datenschutz-Grundverordnung, 1st edn 2017, Art. 4 para. 3 ff.
15 See for example: Schwartmann/Hentsch, PinG 2016, 117.
16 European Parliament and Council Directive Proposal, COM(2015) 634 final, 2015/0287 (COD), <http://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:52015PC0634> (accessed 5 April 2018).
17 See for example: Van Asbroeck/Debussche/César, Building the European Data Economy, Data Ownership, White Paper, 2017.
18 On potential uses of Big Data analysis, see: Orthmann/Schwierig, NJW 2014, 2984.
19 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
20 Dammann, ZD 2016, 307 (311); Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 para. 23: 'Dreh- und Angelpunkt' ('linchpin'); Schantz, in: BeckOK Datenschutzrecht, 20th edn 2017, Art. 5 DSGVO para. 13: 'beherrschendes Konstruktionsprinzip' ('dominant structural principle').
21 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 para. 27.
22 Art. 29 Working Party, Opinion 03/2013 on purpose limitation, 00569/13/EN WP 203, p. 20, <http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf> (accessed 9 September 2017); Monreal, ZD 2016, 507 (509).
23 See Helbig, K&R 2015, 145 (146).
24 Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 17.
25 BVerfG, NJW 1984, 419 (422) – Volkszählung; Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 13.
26 Culik/Döpke, ZD 2017, 226 (227); Bergmann/Möhrle/Herb, Datenschutzrecht, 50th suppl. 2016, § 4 para. 43.
27 Weichert, ZD 2013, 251 (256); Culik/Döpke, ZD 2017, 226 (230).
28 Boehme-Neßler (above fn. 12); Kring (above fn. 26), p. 553 with further references, <https://subs.emis.de/LNI/Proceedings/Proceedings232/551.pdf> (accessed 11 August 2017).
29 BVerfG, NJW 1984, 419 (423) – Volkszählung.
30 Richter, DuD 2015, 735 (738).
31 Likewise: Richter, DuD 2015, 735 (738); Culik/Döpke, ZD 2017, 226 (230); not excluding commercial purposes, at least not from the outset: Art. 29 Working Party, Opinion 03/2013 on purpose limitation, 00569/13/EN WP 203, p. 28, <http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2013/wp203_en.pdf> (accessed 5 April 2018).
32 See COM(92) 422 final, pp. 30, 43.
33 On the discussion see: Kring (above fn. 26), p. 553 with further references, <https://subs.emis.de/LNI/Proceedings/Proceedings232/551.pdf> (accessed 11 August 2017).
34 BVerfG, NJW 1984, 419 (424) – Volkszählung; BVerfG, NJW 1969, 1707 – Repräsentativstatistik; Roßnagel, ZD 2013, 562 (565); Schaar, RDV 2013, 223 (225).
35 Wittig, RDV 2000, 59 (61); Moos, MMR 2006, 718 (721).
36 Moos, MMR 2006, 718 (721); OLG Frankfurt/M., CR 2001, 294 (296); also discussing the question: Scholz, in: Roßnagel, Handbuch Datenschutzrecht, 2003, p. 1833 (1864).
37 See for example: Culik/Döpke, ZD 2017, 226 (228).
38 In more detail: Culik/Döpke, ZD 2017, 226 (228).
39 Recital 50 sentences 1 and 2 GDPR; Härting, Datenschutz-Grundverordnung, 2016, para. 515; Culik/Döpke, ZD 2017, 226 (230).
40 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 30; on the principle of compatible further processing see also: Monreal, ZD 2016, 507 ff.
41 Härting (above fn. 39), para. 516 ff.; Culik/Döpke, ZD 2017, 226 (229).
42 On the interpretation of statistical purposes see already above.
43 Grages, in: Plath, BDSG/DSGVO, 2nd edn 2017, Art. 89 DSGVO para. 2; see also: Paal/Hennemann, NJW 2017, 1679 (1700); on Big Data in medicine: Schwab/Becker, ZD 2015, 151 ff.; Spindler, MedR 2016, 691 ff.; Timm, MedR 2016, 681.
44 Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 30.
45 See already above.
46 Dammann, ZD 2016, 307 (313 ff.).
47 With an even more liberal view: Helbig, K&R 2015, 145 ff.
48 Reimer, in: Sydow (above fn. 14), Art. 6 para. 67; Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 34 ff.; Schantz, in: BeckOK Datenschutzrecht (above fn. 20), Art. 5 DSGVO para. 24 ff.
49 See: Frenzel, in: Paal/Pauly (above fn. 14), Art. 5 DSGVO para. 21.
50 Recital 71 GDPR; Dix, Stadtforschung und Statistik, 2016, pp. 59, 61, <https://www.eaid-berlin.de/wp-content/uploads/2016/05/StSt-1-2016_Dix.pdf> (accessed 8 September 2017).
51 See already above.
52 More specifically: Specht/Schröder/Bienemann, Die Chancen der Visualisierung, in: Handbuch Datenrecht in der Digitalisierung, forthcoming; Culik/Döpke, ZD 2017, 226 (229).
53 On the use of Big Data analysis in the area of sensitive data see: Schneider, ZD 2017, 303 ff.
54 See also: Frenzel, in: Paal/Pauly (above fn. 14), Art. 6 para. 28.
55 Simitis, in: Simitis, Bundesdatenschutzgesetz, 8th edn 2014, § 28 para. 99; Mattke, Adressenhandel, 1995, p. 191; Breinlinger, RDV 1997, 247 (249 ff.).
56 See Ernst, in: Paal/Pauly (above fn. 14), Art. 4 DSGVO para. 48.
57 See: Boehme-Neßler (above fn. 12), at 422.
58 European Parliament and Council Directive 2001/29/EC, OJ L 167, 22.6.2001, p. 10, <https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32001L0029&from=EN> (accessed 5 April 2018).
59 BVerfG, 11.3.2008, 1 BvR 2074/05, MMR 2008, 308 (309) – Automatisierte Erfassung von Kfz-Kennzeichen.
60 With personal reference to Prof. Hubertus Gersdorf; BVerfG, 11.3.2008, 1 BvR 2074/05, MMR 2008, 308 (309) – Automatisierte Erfassung von Kfz-Kennzeichen.

© The Author(s) 2018.
Published by Oxford University Press. All rights reserved. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/about_us/legal/notices)
