Klaus Wiedemann

Key Points

The enforcement of Art. 102 TFEU vis-à-vis excessive and discriminatory pricing in digital markets remains unlikely in the coming years.

The recent interim ruling of the Düsseldorf Higher Regional Court in the Facebook case shows the difficulties that a competition authority might face when it comes to sanctioning exploitative abuses in digital markets.

Rather than adopting infringement decisions, competition agencies should conclude behavioural commitments with the dominant online platform.

Competition authorities may borrow from the European data protection regime a number of behavioural remedies to tackle forms of privacy degradation unilaterally imposed by online platforms.

I. Introduction

Exploitative conducts are unilateral behaviours by dominant firms that distort competition in the market by directly harming customers, rather than excluding competitors. Art. 102 of the Treaty on the Functioning of the European Union (TFEU)1 lists a number of exploitative conducts as examples of abuses of a dominant position, such as ‘directly or indirectly imposing unfair purchase or selling prices’ (i.e. excessive pricing),2 ‘applying dissimilar conditions to equivalent transactions with other trading parties, thereby placing them at a competitive disadvantage’ (i.e. discriminatory pricing),3 as well as unilaterally imposing ‘other unfair trading conditions’.4 Traditionally, the European Commission has seldom investigated these types of abuses under Art.
102 TFEU, due to the high burden of proof and concerns over the risk of market regulation.5 However, this scenario has progressively changed in recent years, as a number of national competition authorities (NCAs) have sanctioned exploitative conducts in the energy sector,6 as well as cases of excessive pricing of generic drugs.7 Finally, the recent Facebook decision by the Bundeskartellamt represents the first case of exploitative conduct sanctioned by an NCA in the digital world.8 This decision represents an important development,9 which calls for a broader discussion on whether and to what extent Art. 102 TFEU should be relied on by the NCAs/the European Commission to sanction exploitative abuses by dominant digital platforms. First, the present paper analyses the challenges that a competition authority would face in sanctioning exploitative abuses in digital markets in light of the case law of the Court of Justice of the European Union (CJEU). Second, the paper discusses potential remedies that a competition law agency could adopt in such cases. In particular, the paper explores the idea that an NCA/the European Commission may borrow from the European data protection regime a number of behavioural remedies to tackle forms of privacy degradation unilaterally imposed by online platforms. In other words, one of the main issues discussed in the present paper is whether the rights provided in the EU General Data Protection Regulation (GDPR)10 to natural persons (e.g. access rights, the right to data portability, transparency obligations regarding the processing of personal data etc.) might be negotiated by the competent competition authority with the dominant online platform as behavioural remedies in the context of a competition law investigation. The paper discusses neither the definition of the relevant market nor market power in the digital world—i.e.
subjects that have already been extensively discussed in the literature.11 Therefore, we take for granted that the online platform has a substantial degree of market power in order to be considered dominant (e.g. the platform owns a large amount of personal data, while network effects discourage new entries into the market), and thus its market behaviour could fall within the scope of Art. 102 TFEU. In this paper, we analyse three categories of exploitative abuses by dominant online platforms that could directly harm consumers:

Excessive pricing: In data markets, this conduct could take the form of an ‘excessive’ amount of personal data that online platforms request final consumers to provide in exchange for ‘free’ access to an online service.12

Discriminatory pricing: Via an analysis of personal data and by means of predictive modelling (i.e. profiling), algorithms facilitate cases of price discrimination among different consumers who purchase goods and services from a dominant online platform.

Unfair trading conditions: Data protection terms and privacy policies could be unfair from a competition law point of view. Furthermore, by unilaterally imposing a change of these policies on final consumers, a dominant online platform could decrease the product quality of an online service. In other words, consumers would receive the same online service by ‘paying’ a higher price in terms of lower privacy standards.

In Section II, the paper analyses the enforcement challenges faced by an NCA/the European Commission in sanctioning an exploitative abuse in a digital market in the light of the existing case law of the CJEU concerning Art. 102 TFEU. Section III then discusses the role of behavioural commitments that an NCA/the European Commission could conclude with the dominant online platform to remedy a competition law violation. Finally, Section IV concludes and sets the stage for further discussion on this issue.

II.
Exploitative conducts in the data-driven economy: enforcement challenges

A. Excessive pricing in the data economy

A peculiarity of the data economy is that online users often receive ‘free’ services from online platforms: apps, videos, games, maps, search engines etc. are freely provided to Internet users ‘in exchange’ for their personal data.13 In other words, users agree to ‘reduce’ their privacy in exchange for some kind of service. The online platform uses the large amount of data collected to create detailed consumer profiles and, for instance, sell such valuable information to other firms or improve the quality of its products.14 It should be noted that the latter aspect also shows that it is sometimes difficult to draw the line between personal data that serve as counter-performance (since they will be commercially exploited by the platform), and personal data that users provide in their own interest (i.e. to help improve the quality of the services that they receive from the platform).15 Many online services, such as Facebook, certainly need to process personal data to a certain extent in order to improve the quality of their services—or to function in the first place.16 In this section, we focus on the first aspect, i.e. data as a means of payment in lieu of monetary compensation. This new business model that characterizes the data economy has also been acknowledged by a 2019 Directive ‘on certain aspects concerning contracts for the supply of digital content and digital services’.17 The scope of this new legislation is interesting, as it applies not only when a consumer pays (or undertakes to pay) a monetary price for digital content or a digital service, but also when ‘the consumer provides or undertakes to provide personal data to the trader’ (Art. 3(1)).
Yet, it should be noted that the lawmaker refrained from using the term ‘counter-performance’ when referring to (all kinds of) data provided by the consumer, as opposed to the wording of the initial draft of the Directive published in 2015.18 The debate on the nature of the counter-performance could inform the discussion on sanctioning excessive pricing in the data economy. In a world where platforms mainly provide ‘free’ services to final consumers, the traditional concept of excessive pricing requires an update. Since United Brands, the European Court of Justice (ECJ) has consistently defined excessive pricing as an ‘unreasonable’ price paid by the customers of the dominant firm in comparison to the economic value of the product purchased.19 If we relied on this definition in the context of the data economy, Art. 102 TFEU could sanction the excessive amount of personal data that a dominant online platform requests from Internet users. Further, in view of the ECJ ruling in Latvian Copyright Society, the benchmark to determine whether the amount of data requested by the dominant platform is ‘excessive’ could be the amount or type of personal data requested by other online platforms for the provision of a similar online service.20 In particular, the difference in terms of quality and amount of data requested by other online platforms would have to be ‘significant’ and ‘persistent’.21 Finally, it would be up to the online platform to put forward justifications.22 Of course, the dominant operator could argue that it provides a better service than its competitors, which in turn justifies a larger amount of personal data to be provided by final consumers as counter-performance. While sanctioning the excessive request of personal data under Art. 102 TFEU would be possible in theory, in practice an NCA/the European Commission would face a number of enforcement challenges.
First of all, privacy preferences are highly subjective and it would be quite difficult for a competition authority to determine when the amount or quality of personal data requested by the online platform is truly ‘excessive’.23 Secondly, Latvian Copyright Society refers to ‘significant’ and ‘persistent’ disparities. These are vague terms that would leave a broad margin of discretion to the authority, and they would imply a high risk that the decision would be annulled by a court on appeal. Finally, the online platform could put forward good arguments to justify its conduct. In particular, if consumers are willing to transfer certain personal data to the online platform, they implicitly accept the value of the service offered by the online platform in comparison to the amount of data requested as counter-performance. Against the backdrop of these considerations, no competition authority has ever sanctioned a case of excessive pricing in data markets. The Facebook decision recently adopted by the Bundeskartellamt, discussed in Section II.C, concerns unfair trading conditions in data markets, rather than a case involving excessive pricing. In view of the abovementioned high burden of proof, we will probably have to wait for some time before we see any competition law enforcement attempt in this area.

B. Price discrimination in the data economy

Economists traditionally identify three degrees of price discrimination:24

1) First-degree price discrimination takes place when a firm is able to perfectly discriminate among its customers, adjusting the price of the product to the individual consumer’s (exact) willingness to pay.

2) Second-degree price discrimination means that the firm discriminates among its customers by granting discounts once a specific purchase quota is achieved (‘non-linear pricing’). This includes two-part tariffs, but also extends to versioning (i.e. offering the ‘same’ product at different quality levels and prices).
3) Third-degree price discrimination takes place when the firm charges different prices to different groups of customers (‘group pricing’).

First-degree price discrimination has traditionally been considered de facto impossible, because the seller usually does not have enough information to accurately differentiate the price for each consumer.25 However, as found by a 2015 White House report, big data analytics facilitate the shift from second/third-degree price discrimination to first-degree price discrimination.26 Online platforms collect a large amount of personal data via Internet cookies, search engines, social platforms etc. Besides traditional personal data, such as gender and age, other information is essential for online platforms, too. Past online purchases, location, the websites previously visited, as well as search queries are key data that allow online platforms to understand the online ‘behaviour’ of their customers. In particular, these data are relied on by online platforms to create profiles of their customers, in order to understand what types of products consumers are currently searching for, and to estimate the maximum price that consumers would be willing to pay for a certain product (the so-called ‘reservation price’).27 Thus, on the basis of the users’ profiles, a platform is able, under certain circumstances, to develop customized offers for individual consumers. At the moment, the prediction accuracy of such techniques does not correspond to first-degree price discrimination due, inter alia, to a lack of sufficient data.28 Yet, prediction accuracy is on the rise and, in the coming years, price discrimination methods are expected to get closer to first-degree price discrimination.
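The profiling-based pricing mechanism described above can be sketched in a few lines of code. The following is a stylised illustration only, not a description of any actual platform’s system: the feature weights, user profiles and pricing rule are all invented for the example.

```python
# Stylised sketch of profiling-based ("personalised") pricing.
# All names, weights and user data below are hypothetical, for illustration only.

COST = 20.0  # marginal cost of the product

def predict_reservation_price(profile: dict) -> float:
    """Toy predictive model: infers willingness to pay from
    behavioural signals (past spend, recent searches, location)."""
    price = 30.0  # baseline willingness to pay
    price += 0.10 * profile["avg_past_spend"]              # big spenders tolerate higher prices
    price += 5.0 if profile["searched_recently"] else 0.0  # recent searches signal urgency
    price += 8.0 if profile["affluent_area"] else 0.0      # location as a proxy for income
    return price

def personalised_price(profile: dict) -> float:
    """First-degree discrimination: charge close to the predicted
    reservation price, but never below marginal cost."""
    return max(COST, round(0.95 * predict_reservation_price(profile), 2))

users = {
    "budget_conscious": {"avg_past_spend": 20, "searched_recently": False, "affluent_area": False},
    "affluent":         {"avg_past_spend": 200, "searched_recently": True, "affluent_area": True},
}

for name, profile in users.items():
    print(name, personalised_price(profile))
```

The sketch shows why richer behavioural data pushes pricing towards the first-degree end of the spectrum: each additional signal narrows the gap between the charged price and the individual reservation price.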
Economists traditionally differentiate forms of price discrimination that harm a rival (‘primary line injury’) from those that harm a direct customer of the firm (‘secondary line injury’).29 As recognized by Advocate General Wahl in MEO, secondary line injury is ‘extremely rare’:30 the dominant firm has limited incentives to discriminate among customers that are not its competitors in the market. However, the advent of personalised pricing facilitates forms of first-degree price discrimination and incentivizes a dominant online platform to implement such a strategy in order to maximize profits. Economists agree that price discrimination has ambiguous effects on consumers’ welfare.31 For example, a dominant firm could charge a lower price to ‘budget conscious’ consumers who have a lower reservation price, and who are also expected to be ‘poorer’ in terms of personal income. Hence, price discrimination could increase product affordability for a larger number of consumers and thus facilitate welfare re-distribution among different categories of consumers. However, it is worth remembering that the objective of price discrimination is to ‘capture as much consumer surplus as possible’, while welfare re-distribution is only a side effect of this strategy.32 In the end, the ‘optimal’ price for the online platform is the price at which the platform can maximize its profits—i.e. the (perceived) consumers’ reservation price.33 Therefore, the ‘optimal’ price shifts part of the consumers’ welfare to the online platform: without price discrimination, some consumers would pay a lower price for the product in comparison to their reservation price. In other words, even though price discrimination could facilitate welfare re-distribution among different categories of consumers, in certain cases this practice could also increase the firm’s welfare to the detriment of overall consumer welfare.
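The surplus shift described in this paragraph can be made concrete with a stylised two-consumer example (all figures are invented for illustration): under the profit-maximising uniform price, the ‘affluent’ consumer retains part of the surplus; under personalised pricing, the platform captures it.

```python
# Stylised welfare arithmetic for price discrimination (all numbers invented).
COST = 20.0
reservation = {"budget_conscious": 45.0, "affluent": 60.0}  # maximum willingness to pay

def welfare(prices: dict) -> tuple[float, float]:
    """Returns (consumer surplus, firm profit); a sale occurs only
    when the price charged does not exceed the reservation price."""
    cs = sum(r - prices[name] for name, r in reservation.items() if prices[name] <= r)
    profit = sum(prices[name] - COST for name, r in reservation.items() if prices[name] <= r)
    return cs, profit

# Uniform pricing: among the candidate prices, 45 maximizes profit here
# (selling to both consumers at 45 beats selling only to the affluent one at 60).
uniform = {"budget_conscious": 45.0, "affluent": 45.0}
# Personalised pricing: each consumer is charged his or her reservation price.
personalised = {"budget_conscious": 45.0, "affluent": 60.0}

print(welfare(uniform))       # → (15.0, 50.0): affluent consumer keeps a surplus of 15
print(welfare(personalised))  # → (0.0, 65.0): the platform captures the entire surplus
```

In this toy example total welfare is identical in both scenarios (65), but under personalised pricing the entire surplus accrues to the platform; with different invented numbers, discrimination could instead expand output and serve a budget conscious consumer who would otherwise be priced out, which is exactly the ambiguity the text describes.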
In view of the ambiguous effects of price discrimination on consumers’ welfare, there is no reason to rule out a priori the enforcement of Art. 102 TFEU vis-à-vis discriminatory pricing. However, price discrimination is not per se an abuse of dominance. An NCA/the European Commission would need to prove that the conditions under Art. 102(c) TFEU are satisfied. In particular, the authority would need to prove that the dominant online platform has applied discriminatory pricing to ‘equivalent transactions’ and that the discriminated customer has suffered a ‘competitive disadvantage’ in comparison to ‘other trading partners’ which also have a trading relationship with the dominant firm. In MEO, the ECJ recently provided some guidance on the interpretation of Art. 102(c) TFEU.34 In particular, in MEO the Court recognized that the competition authority would need to consider and analyse ‘all the relevant circumstances’ before concluding that the conduct causes a competitive disadvantage for the discriminated customer.35 For instance, the authority should assess the bargaining power of the customer, the duration of the conduct and whether or not there is a discriminatory strategy by the dominant firm.36 It would be quite challenging for an NCA/the European Commission to prove these criteria, especially the last one, in cases of dominant online platforms employing a personalised pricing strategy. First of all, the authority would need to prove that the price discrimination is a repeated conduct—i.e. a strategy which is systematically implemented by the dominant undertaking vis-à-vis certain customers. However, customers are often not aware of having been the subject of discrimination. In addition, in the case of price discrimination applied by a dominant online platform, the authority would need to analyse the functioning of the firm’s algorithm to understand whether it systematically discriminates between different categories of customers.
This is a very complex task for any authority. Last but not least, the dominant online platform could come up with a number of objective justifications.37 For instance, the platform could argue that its conduct leads to forms of optimal pricing that increase overall consumers’ welfare. As discussed above, price discrimination has a mixed effect on consumers’ welfare, and sometimes it can increase the welfare of ‘poorer’ consumers. Therefore, the competition authority would need to assess in great detail the impact of the specific form of price discrimination at hand on the welfare of both ‘budget conscious’ and ‘affluent’ consumers. In view of these considerations, it is not surprising that no competition authority has ever investigated a case of price discrimination in data markets. As with excessive pricing, it is quite unlikely that any agency will start enforcement action in this area in the near future.

C. Unfair trading conditions in the data economy

In Section II.A we discussed ‘how much’ users pay for online services in the form of personal data, and thus whether and to what extent the ‘excessive’ amount of personal data requested by a dominant online platform from its users could be considered an exploitative abuse of dominance. By contrast, in this section we discuss whether the conditions imposed by a dominant platform to obtain users’ consent to the processing of their personal data may be considered ‘too far-reaching’—i.e. unfair trading under Art. 102(a) TFEU. Put differently, it is not the quantity of data collected that is under scrutiny, but how the data are processed, who gets access to them, and how transparent the ‘data for services’ deal is. The most interesting case in this area is the 2019 Facebook decision by the Bundeskartellamt.
In March 2016, the German competition authority formally initiated proceedings against Facebook based on the suspicion that the social network abused its market power by violating data protection rules.38 In December 2017, a more detailed preliminary assessment and background information to the proceedings were published by the authority.39 Based on the assumption that Facebook is a dominant company on the market for social networks in Germany, the Bundeskartellamt held the view ‘that Facebook is abusing this dominant position by making the use of its social network conditional on its being allowed to limitlessly amass every kind of data generated by using third party websites and merge it with the user’s Facebook account’.40 In line with this reasoning, the authority issued a final administrative decision against Facebook in February 2019, prohibiting the social network from, inter alia, combining user data from different sources.41 In August 2019, the Düsseldorf Higher Regional Court (Oberlandesgericht Düsseldorf) rendered a decision in favour of Facebook in interim proceedings.42 The interim ruling will be further discussed below after an analysis of the Bundeskartellamt’s reasoning. In its decision, the Bundeskartellamt draws a clear line between the collection and use of data on the network itself, and from third party websites. Only the latter was the subject of the investigations43 and refers to those websites and apps that have an embedded application programming interface (API) with Facebook, which allows for data sharing with the online platform. 
The Bundeskartellamt further distinguishes between services owned by Facebook (most notably WhatsApp and Instagram) and other third party websites that, from a user’s point of view, are not always prima facie connected to the social network.44 These websites and apps transfer personal data relating to their users to Facebook, no matter whether they, for instance, make use of Facebook’s ‘Like Button’ or otherwise actively engage in data sharing. In this context, the ‘Facebook Business Tools’, provided by the company for free, play an important role. In its decision, the German NCA has prohibited the social network from using terms of service that force users to consent to Facebook collecting personal data from third party websites and apps (including Facebook-owned services) and assigning them to the individual user account.45 The Bundeskartellamt prohibited the abusive parts of the terms of service (including the data and cookie policies used by Facebook). Furthermore, it also prohibited the corresponding data processing itself. Facebook was given twelve months to implement the decision. In addition, the company is obliged to draft an implementation plan containing possible solutions regarding the abusive conduct and present it to the Bundeskartellamt within four months. The merging of data from these different sources will, after implementation of the decision, only be possible when users have given what the authority termed ‘voluntary consent’. Facebook is obliged to change its terms of service and the corresponding internal processing of user data. If users do not consent, the data sharing must be substantially restricted (e.g. by reducing the amount of data transferred or by implementing additional control options for users).46 As a rule of thumb, the data sets must be kept separate if no consent is given.
It should be noted that after the decision was rendered, Facebook announced online that ‘in the coming months’ it will grant its users more ‘control’ as regards off-Facebook data collection activities.47 At the time of writing, it is not clear what exact options Facebook is going to present to its users. Yet, the announcement makes clear that it should be possible to display to the user those data collected from third parties and, if desired, ‘disconnect’ them from his or her Facebook account. Based on the rather vague wording of the announcement, it is difficult to predict whether this involuntary self-regulatory move can be seen as a (small) step towards more data protection compliance and away from the need for competition law intervention. It is noteworthy that, in the working paper concerning the case, the Bundeskartellamt makes reference to two privacy-related market failures. The NCA maintains that Facebook reduces users’ ‘control’ as they ‘cannot perceive which data from which sources are combined for which purposes with data from Facebook accounts’.48 This makes clear that the authority sees a lack of transparency with respect to Facebook’s business model, in particular as the transfer of data takes place even when users choose to disable web tracking in the settings of their browsers or devices.49 The authority also states that ‘[t]he investigations have shown that users in Germany generally consider the terms and conditions for processing data to be important and that they are aware of the implications of data transfer. However, because of Facebook’s market power users have no option to avoid the combination of their data’.50 Thus, the privacy preferences of many users are not catered for by Facebook. It is unclear how the Facebook case will proceed after the ruling of the Düsseldorf Higher Regional Court. Injunctive relief was granted, and as a result the Bundeskartellamt’s decision may not be enforced pending a decision in the main proceedings.
Even though the court decision is preliminary, the Court made it surprisingly clear that it will eventually annul the NCA’s decision. Its reasoning is manifold and, above all, based on the assumption that Facebook’s conduct does not result in ‘damage to competition’. According to the Court, neither an exploitative nor an exclusionary abuse of dominance can be found. Inter alia, the Court argues that causality between Facebook’s dominant position and users agreeing to its terms of service cannot be proven, and that the excessive data collection leads neither to an abusive situation nor to a loss of control for consumers, as the latter knowingly and willingly consent to the data processing. Most likely, the German Federal Supreme Court (Bundesgerichtshof) will eventually decide the case on appeal in the interim and/or main proceedings. The Bundeskartellamt sanctioned Facebook under Art. 19(1) of the Gesetz gegen Wettbewerbsbeschränkungen (GWB, i.e. Act Against Restraints of Competition).51 The decision was based on national case law on the so-called ‘Konditionenmissbrauch’, which can be roughly translated as ‘use of abusive contractual conditions’.52 The latter is not an established theory of harm under EU competition law. Under Art. 3(1) Reg. 1/2003, NCAs are required to apply Arts. 101 and 102 TFEU when the anti-competitive conduct has an impact on ‘intra-community trade’.53 On the other hand, under Art. 3(2) Reg. 1/2003, EU Member States can rely on ‘stricter’ national competition law than Arts. 101 and 102 TFEU. This is the case, for instance, with Art. 20 GWB. The latter provision sanctions the abuse of ‘relative’ market power by a supplier vis-à-vis its customers. By contrast, the wording of Art. 19(1) GWB corresponds to Art. 102 TFEU. In the Facebook case, the cross-border dimension of the case is self-evident: Facebook operates throughout Europe via its Irish subsidiary. In addition, the Bundeskartellamt decided the case under Art. 19(1) GWB rather than Art.
20 GWB, and thus one might argue that the exception provided by Art. 3(2) Reg. 1/2003 is not applicable in this case. As a consequence, the Bundeskartellamt should have decided the case under Art. 19(1) GWB in conjunction with Art. 102 TFEU.54 Although the Facebook case is based on German rather than EU competition law, either the Düsseldorf Higher Regional Court or the Bundesgerichtshof might still refer a question for preliminary ruling to the ECJ. This would be in line with the latter’s ruling in Maxima Latvija.55 In that judgment, the Court confirmed that it has jurisdiction to answer questions for preliminary ruling even in cases fully based on national competition law, ‘in order to forestall future differences of interpretation, provisions or concepts’ between Arts. 101 and 102 TFEU and the corresponding national provision.56 In view of these considerations, an analysis of whether and to what extent the Facebook decision complies with the jurisprudence of the European courts on unfair trading conditions is worthwhile. The ECJ has sanctioned under Art. 102(a) TFEU a number of contractual clauses imposed by dominant companies on their customers. For instance, in United Brands the Court considered unfair the fact that the distributors of United Brands could not sell unripened bananas.57 Similarly, in Porto di Genova the Court considered unfair the fact that maritime companies were obliged to rely on the docking services provided by the firm appointed by the Genoa seaport authority, rather than being able to freely choose the service provider.58 In its jurisprudence, the Court has generally considered contractual clauses to be in breach of Art. 102(a) TFEU when they have been ‘unilaterally’ imposed by the dominant company, rather than negotiated between the dominant supplier and the customer. There are different ways in which contractual clauses could be considered ‘unfair’.
In one scenario, the dominant company could oblige its customers to purchase services either not requested or not closely related to the core subject of the contract. For instance, in Alsatel the ECJ considered unfair a contractual clause whereby a rental contract concluded between the dominant company and another firm would be automatically extended after its expiration, and the firm would automatically be required to pay a higher rent to the dominant firm.59 Similarly, in BRT the Court considered abusive the clause whereby artists were required to transfer the management of their copyright works to SABAM (the Belgian collecting society of copyright owners) even after the end of the contract.60 In particular, the Court ruled that SABAM breached Art. 102(a) TFEU by imposing ‘on its members obligations which are not absolutely necessary for the attainment of its object and which thus encroach unfairly upon a member’s freedom to exercise his copyright’.61 Finally, in their jurisprudence on unfair contractual clauses, the European courts have been open to analysing possible objective justifications put forward by the dominant company, even though they have, in the end, generally rejected such claims.62 If we analyse the findings of the Bundeskartellamt in view of the CJEU case law, we could conclude that Facebook’s terms of service can be considered unfair contractual clauses within the scope of the existing jurisprudence. In particular, Facebook unilaterally imposes on its users the transfer of personal data to third parties. Users are subject to a ‘take it or leave it’ requirement, and the transfer of personal data to and from third parties is not transparent. On the other hand, Facebook could put forward objective justifications related to the structure of its business model, whereby users receive a ‘free’ social media platform by implicitly accepting the transfer of their personal data to Facebook.
In addition, the overlap between data protection and competition law remains an open issue in this case, as one could argue that Facebook’s conduct should rather be prosecuted by the competent data protection authorities. The Facebook decision is now subject to lengthy and uncertain judicial proceedings. In the meantime, it is worth acknowledging this recent development, which may open the way to new investigations by other competition authorities in the near future.

III. EU competition law remedies vis-à-vis exploitative conducts in the data economy

A. Behavioural commitments vis-à-vis excessive and discriminatory pricing in the data economy

In the previous section, we discussed the challenges that NCAs and the European Commission would face in sanctioning under Art. 102 TFEU cases of exploitative abuses in the data economy. For instance, in a digital world characterized by ‘free’ services,63 it would be rather difficult to apply the traditional United Brands cost/price test described above to show that excessive pricing is taking place.64 Similarly, comparing the ‘price’ of a ‘free’ online service with the ‘price’ of a similar service provided by competing platforms in accordance with Latvian Copyright Society would be a very complex task.65 It would also be hard for a competition authority to show that a discriminated customer has suffered a ‘competitive disadvantage’ under Art. 102(c) TFEU and, in particular, to meet the rather high threshold that the ECJ formulated in the MEO ruling in this regard.66 In addition, in the case of both excessive and discriminatory pricing, the competition authority would need to show that the anti-competitive conduct is repeated rather than sporadic, and that it has a negative impact on consumers’ welfare. Finally, the authority would need to assess the arguments put forward by the dominant platform to justify its behaviour.
For these reasons, it is unlikely that any competition authority will open investigations into excessive and discriminatory prices charged by a dominant online platform. Nevertheless, if a competition authority was ‘brave enough’ to follow this unexplored route and found enough convincing evidence to sanction an online platform, the issue of defining suitable remedies would come up. A fine coupled with a cease-and-desist order would probably be an unwise solution, due to the lack of precedents in this area. Furthermore, especially in the digital world, anti-competitive behaviour can often not be ‘undone’. Once personal data have been transferred and further processed by the online platform, it is usually impossible to ‘turn back the clock’ and undo the privacy harm suffered by consumers. Yet, by working in cooperation with the dominant firm, the competition authority could design a number of behavioural commitments aimed at solving the contested practice ex ante. In particular, we argue that two types of remedies could be designed to tackle issues related to excessive or discriminatory pricing imposed by dominant online platforms:

(1) Limiting the amount of data collected by the platform: Online platforms can discriminate among their customers based on the huge amount of personal data they collect. Even when it comes to anonymized data, via data fusion and data analytics the platform can infer the individual reservation price and use this information to discriminate among its customers. As further discussed in the next pages, an NCA/the European Commission could impose a number of limitations on the types of data gathered by the platform. Such a remedy could borrow concepts from the relevant data protection legislation. However, when needed, it could go even further: it could solve a number of market failures arising in the data economy, such as the lack of transparency of data protection terms and the lack of effective data anonymization.
Data protection law is oftentimes not suitable to solve these issues in modern digital markets, since, for example, it still mostly relies on the concept of consent. Via its behavioural remedy, the NCA/the European Commission could overcome the issue of the lack of informed user consent, by indicating what types of personal data the platform would be allowed to collect, for how long, and for which purposes. This type of remedy would clearly have a ‘regulatory’ character, and it would overlap with the relevant data protection regime. However, when designing the applicable remedy, the competition authority could actively cooperate with the competent data protection authority in order to identify ‘gaps’ in the data protection regime which could be filled via the behavioural remedy. (2) Sharing the customers’ data with competing platforms: As discussed in Section II.B, online platforms can discriminate among their customers based on the large amount of personal data they collect and the ability to build user profiles via data analytics. Instead of asking the platform to reduce the amount of data collected, and thus hampering possible efficiencies generated by data analytics, the behavioural remedy could require the online platform to share certain customer data with competing platforms. A similar type of remedy has already been applied by the European Commission in cases concerning the airline industry. In this field, a number of mergers have been cleared by the European Commission subject to the condition that the merging parties open their frequent flyer programs to competing airlines.67 Travellers can now acquire and redeem miles/bonus points by travelling with competing airlines. This encourages flyers to switch to other operators. However, by making the frequent flyer programs of the merging parties compatible with those of their competitors, the European Commission de facto required the merging parties to share important data about their frequent flyers (i.e. 
the premium customers) with competitors. The latter were then able to target the frequent flyers with ad hoc offers. This type of remedy could also be considered in digital markets, by requiring the dominant online platform to share some information about its customers with its competitors. As a result, competitors would be able to target the customers of the dominant firm with ad hoc offers, and consumers would thus be encouraged to switch suppliers. As recently argued by Kathuria and Globočnik,68 a data-sharing remedy would pose a number of enforcement problems for the competition authority. First of all, it would be difficult to define exactly which data would be subject to this ‘sharing duty’.69 The ability to extract useful information via data analytics primarily depends on the algorithm used, rather than on the amount of accumulated data. Therefore, data sharing might not be sufficient to rebalance the competitive disadvantage suffered by the competitor of the dominant firm if the latter does not have access to the technology/algorithms to process the data shared. Secondly, data sometimes have a limited lifespan, depending on the context. Thus, sharing obligations might prove to be useless for the competitor of the dominant firm after a certain amount of time has passed.70 Finally, it would be hard for the authority to define a priori the price for the data sharing.71 The value of a dataset is rather subjective and strongly influenced by the possible outcomes of data analytics. For instance, the question of what other data sets a data controller has access to plays a significant role when it comes to determining a dataset’s value, as it oftentimes is the clever combination of data coming from different sources that opens up new knowledge. These challenges show the need for an NCA/the European Commission to enter into commitments with the parties, rather than unilaterally imposing a behavioural remedy. 
By making use of commitments, remedies can be designed to match the needs of both the dominant firm and the new entrants. Furthermore, a review clause should be included in the behavioural decision in order to adjust the remedy to changing market conditions (if necessary). So far, these types of remedies have never been applied by any competition authority. It remains to be seen whether and when any competition enforcer will be ‘brave enough’ to sanction a dominant platform under Art. 102 TFEU due to excessive or discriminatory pricing. However, in such a case, these types of remedies would probably be more suitable than a fine coupled with a cease-and-desist order to remedy the anti-competitive conduct at stake. B. Behavioural commitments vis-à-vis unfair trading conditions in the data economy Digital markets are characterized by a number of market failures which can oftentimes be linked to the ‘unfair’ nature of the terms of service used by online platforms. These market failures72 mostly stem from the so-called ‘privacy paradox’. Simply put, this expression stands for the phenomenon that users usually claim that they care a lot about data protection and privacy issues, but in practice do not act accordingly. Oftentimes the situation is as follows: Consumers are confronted with an online ‘take it or leave it’ lock-in situation, just like the one we have seen in the Facebook case discussed in Section II.C. Online platforms do not provide what users actually demand in terms of privacy-enhanced services. Paradoxically, consumers ‘give in’ and do not abstain from using those services that do not fulfil their actual privacy needs. As a result, online platforms do not always cater for the privacy preferences of their users. Apart from that, the terms of service are oftentimes characterized by a ‘lack of transparency’, which we consider to be a second type of market failure. 
In many online situations, privacy policies are so comprehensive, vague and hard to understand that lay people cannot realistically grasp these terms and conditions in full, even if they wanted to. As a result, consumers simply ‘consent’ without making a truly informed decision. In the following pages, we will show that these market failures might be solved—or at least mitigated—by behavioural commitments. In negotiating such commitments with the dominant online platform, the competition authority could take recourse to the rules and general principles contained in the GDPR. The approach presented here may seem unorthodox at first, since the primary goals of competition and data protection law are different in nature. Compliance with data protection law is obligatory for all undertakings, regardless of whether or not they are market-dominant. As such, the imposition of rules that are applicable anyway would not provide added value. Furthermore, one might argue that data protection infringements should be prosecuted by data protection authorities and not by means of behavioural commitments under EU competition law. Yet, for various reasons, implementing data protection rules via behavioural commitments might be a reasonable and efficient tool to foster competition in some situations. The idea behind our proposal is that the rules given under data protection law primarily aim at safeguarding privacy and other fundamental rights. As such, they might serve as a valuable starting point when it comes to solving an existing imbalance ‘between privacy and disclosure’73 (i.e. a market failure), which has tipped to the detriment of data subjects. The behavioural commitments discussed in the following sections could serve as a tool to change this imbalance in favour of users, as far as this is necessary from a competition policy perspective. 
Furthermore, and on a more general level, our approach might be viable when data protection rules also serve competition policy goals, as we will see with respect to the example of social networks and the data subjects’ right to data portability under Art. 20 GDPR. A commitment by a market-dominant undertaking might help to clarify cases of doubt as regards the lawfulness of a particular conduct. As omnibus legislation applicable in a variety of entirely different situations,74 the GDPR contains a number of rather open provisions that are subject to interpretation, such as the ‘principles relating to processing of personal data’ given in its Article 5, or the scope of the ‘right to data portability’ as given in its Article 20. Another example of newly arisen legal uncertainty is the ‘right to be forgotten’ as granted in Art. 17 GDPR, which needs to be interpreted and substantiated further by both the courts and academia.75 As such, uncertainty prevails in many situations as regards the legality of specific conducts, especially in those fields where the GDPR introduced significant changes to the law. Hence, a commitment between the European Commission (or an NCA) and a market-dominant undertaking can result in legal certainty for the latter, as regards compliance with both competition and data protection law. This might serve as a strong incentive for cooperation, in particular with a view to the (severe) administrative fines that can now be imposed under the GDPR.76 From this point of view, it might also be advisable to include the competent data protection authority in the process of negotiating a commitment in order to streamline the negotiations and ensure full compliance with both legal regimes. 
This idea was also embraced by the Bundeskartellamt, which closely cooperated with several data protection authorities from Germany and abroad in its Facebook investigations described above.77 Also, as part of behavioural commitments, obligations that reach further than the rules given under the GDPR can be imposed on undertakings and thus serve as tailor-made answers to existing market failures. One might apply a provision contained in the GDPR and, using this mandatory minimum threshold as a basis for the behavioural commitment, extend its scope of applicability. As a result, market-dominant undertakings are subject to stronger obligations than smaller ones, yet the obligations remain similar in nature. This flexible and contextual approach might be helpful in those situations where there is a market failure (in the form of an inadequate balance between privacy and commercial interests) because the regulatory minimum provided under the GDPR does not suffice to uphold effective competition. Competition law enforcement in the form of behavioural commitments would act as a ‘safety net’ in those cases where data protection regulation—which naturally aims at protecting privacy instead of competition—does not lead to satisfactory results. We have found that oftentimes markets do not cater for the actual privacy preferences users have, and this eventually leads to greater privacy losses than users would be willing to accept if they had a choice. Getting back to the Facebook decision by the Bundeskartellamt described above, the authority found that ‘[t]he damage for the users lies in a loss of control: They are no longer able to control how their personal data are used’.78 This finding points directly at the alleged abuse of dominance: users do not have a choice of which social network to use. 
At the same time, they have to accept the privacy policy in full when setting up an account and are restricted to the privacy options presented to them by Facebook when using the network. Here, an appropriate balance between the users’ privacy interests and Facebook’s commercial interests is de facto not reached. There are several conceivable solutions to this problem, all of which are ‘inspired’ by the role consent plays in the GDPR,79 and all of which could easily be integrated into a behavioural commitment. For instance, the social network could commit to offering more detailed privacy settings to its users, allowing them to ‘fine-tune’ their preferences. This could be more or less granular: One might give users the option to decide what kinds of personal data Facebook is allowed to collect from third-party websites. Alternatively, one might give users a choice of which third-party websites to allow to transfer data to (or from) Facebook, or whether they would prefer to block data access entirely.80 Hence, if being tracked is no problem for a user and they prefer personalised advertising over neutral ads, they can allow Facebook to collect the data. If they do not feel at ease with this kind of data collection, they can choose not to allow the data transfers but, for instance, pay a monthly fee to Facebook instead. Many more options are conceivable, and of course the behavioural commitment would need to be well-designed and based on a thorough assessment of all relevant aspects, such as the specific market conditions, observed user behaviour, etc. Still, the overall idea of tackling the market failure with recourse to the consent principle given under the GDPR and the underlying right to ‘informational self-determination’81 seems promising and fair for all parties involved. 
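For illustration only, the granular, per-source consent options described above could be modelled as follows. This is a minimal sketch under our own assumptions: the class, its method names, and the pay-a-fee alternative are hypothetical and not drawn from any real platform's API.

```python
class PrivacySettings:
    """Hypothetical per-user privacy settings of the kind a behavioural
    commitment might require a dominant platform to offer."""

    def __init__(self):
        # Default: no third-party website may transfer data to the network.
        self.allowed_sources = set()
        self.pays_fee = False

    def allow_source(self, site):
        """The user allows one specific third-party website to share data."""
        self.allowed_sources.add(site)

    def block_all_and_pay(self):
        """Alternative sketched in the text: block all third-party data
        transfers and pay a monthly fee instead."""
        self.allowed_sources.clear()
        self.pays_fee = True

    def may_collect_from(self, site):
        return site in self.allowed_sources


# Usage: a user who tolerates tracking by one site but not others.
settings = PrivacySettings()
settings.allow_source("newsportal.example")
assert settings.may_collect_from("newsportal.example")
assert not settings.may_collect_from("othershop.example")
```

The point of the sketch is the default: data collection is off unless the user actively opts in per source, mirroring the GDPR's opt-in logic discussed in the text.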
We have also found that a recurring problem of online business models is a lack of transparency: users oftentimes consent to online privacy policies without actually properly reading and fully understanding them. The reason behind this situation is that on the one hand, users do not really care, while on the other hand, they are simply not able to meaningfully grasp these terms and conditions. Again, the validity of the consent given is the Achilles’ heel, and again, the Facebook case serves as a good example to analyse. When drafting consent-related terms in a behavioural commitment, there is a certain degree of leeway as regards ‘how’ data subjects should be requested to give consent. In some situations, it might be desirable to make users well aware of what exactly they are consenting to, and to ‘force’ them to decide deliberately. In this respect, the system has already changed in favour of users, as the GDPR has abolished the ‘opt-out’ system: Under the GDPR, users always have to actively ‘opt-in’. This means that, for instance, a pre-ticked box cannot constitute valid consent anymore.82 Also, the controller must now be able to demonstrate that consent was given when it serves as the legal basis for the processing of personal data (i.e. the burden of proof now lies with the data controller, Art. 7(1) GDPR). Yet, as part of a behavioural commitment, one could go further than that if deemed necessary. For instance, when a data controller wants to engage in data processing that goes significantly further than what a regular user would legitimately expect, the introduction of a so-called ‘double opt-in’ could be required as part of a behavioural commitment (i.e. a two-step confirmation that consent is granted). ‘Double opt-in’ in this context could mean that users have to actively change their privacy preferences (step 1), and then confirm the new settings by clicking on a link sent via e-mail (step 2). 
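The two-step flow described above can be sketched in a few lines. This is a purely illustrative model, not any platform's actual implementation; the data structures and function names are our own assumptions.

```python
import secrets

pending = {}    # token -> (user_id, setting, value): consent requested, not yet effective
confirmed = {}  # user_id -> {setting: value}: consent that has taken effect


def request_change(user_id, setting, value):
    """Step 1: the user actively changes a privacy preference.
    The change is NOT applied yet; a confirmation token is issued
    (in practice it would be sent to the user by e-mail)."""
    token = secrets.token_urlsafe(16)
    pending[token] = (user_id, setting, value)
    return token


def confirm_change(token):
    """Step 2: the user clicks the e-mailed link. Only now does consent
    take effect, and the retained token doubles as the controller's
    proof that consent was given (cf. Art. 7(1) GDPR)."""
    if token not in pending:
        return False
    user_id, setting, value = pending.pop(token)
    confirmed.setdefault(user_id, {})[setting] = value
    return True


# Usage: step 1 alone changes nothing; only step 2 applies the change.
token = request_change("alice", "third_party_tracking", True)
assert "alice" not in confirmed
assert confirm_change(token)
assert confirmed["alice"]["third_party_tracking"] is True
```

The design point is that the intermediate `pending` state makes consent deliberate and auditable: an unconfirmed request never alters the user's effective settings.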
This would still allow Facebook and other social networks to pursue their business model, and at the same time safeguard the users’ abovementioned right to informational self-determination. Generally speaking, one could demand different levels of explicitness of the consent given based on the respective context. Alternatively, one could make consent ‘expire’ after a certain period of time. For instance, it might make sense to ask users to confirm their consent again after a 6-month period. This would raise their awareness of what is currently happening to ‘their’ personal data and would mitigate the lethargy that regularly results from the ‘privacy paradox’. Lastly, we will have a closer look at one kind of market failure that does not result from the ‘privacy paradox’ and is not directly connected to the issue of the validity of user consent. Social networks are prone to market concentration due to network effects, and thus might be critical from a competition law point of view.83 They are, like other data controllers, subject to Art. 20 GDPR, which grants data subjects a ‘right to data portability’. This means that under certain circumstances, data subjects have the right ‘to receive the personal data concerning him or her, which he or she has provided to a controller, in a structured, commonly used and machine-readable format and have the right to transmit those data to another controller without hindrance from the controller to which the personal data have been provided’. In addition, under Art. 20(1) and (2) GDPR, data subjects have the right to demand that ‘personal data are transmitted directly from one controller to another, where technically feasible’. This bundle of rights (i.e. 
the right to data portability) has two dimensions: a data protection dimension, as it serves to ‘further strengthen the [data subject’s] control over his or her own data’,84 and a competition policy dimension, as it aims at reducing lock-in effects and switching costs for users and thus stimulates competition by making it easier for users to switch to competitors.85 The right to data portability only refers to personal data which have been ‘provided’ to the controller by the data subject. As such, non-personal data (i.e. those that have been rendered anonymous and those that are anonymous per se) are excluded from its scope of applicability from the outset. Furthermore, all personal data referring to a data subject that have been uploaded to the social network by other users are excluded as well.86 In many situations, the right to data portability might not be problematic or controversial at all, such as when it comes to the transfer of personal data stored ‘in the cloud’ to another cloud service provider. Yet, in the context of social networks, the right to data portability is particularly cumbersome—even though social networks were on the lawmaker’s mind: they were named explicitly as a use case for data portability in the Commission’s original 2012 draft of the GDPR.87 In the context of social networks, in fact, data portability is difficult to implement for several reasons. For instance, technical hurdles must be overcome due to the networks’ very different software ‘architectures’, internal logic and functionality.88 Also, those parts of a social network which qualify as personal data—such as a picture of the data subject—but have not been uploaded by the data subject themselves would not be covered by the data portability claim, since they are ‘provided’ (as stipulated in Art. 20(1) GDPR) not by the data subject but by someone else. 
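The ‘provided by the data subject’ limitation can be made concrete with a small sketch of an Art. 20 export. The profile structure and field names are our own illustrative assumptions; the point is only that an export in a structured, commonly used and machine-readable format (here JSON) would, under Art. 20(1) GDPR, leave out content uploaded by other users.

```python
import json

# Hypothetical profile content of data subject "alice" on a social network.
profile = [
    {"type": "photo", "uploaded_by": "alice", "file": "beach.jpg"},
    {"type": "photo", "uploaded_by": "bob", "file": "alice_at_party.jpg"},
    {"type": "post", "uploaded_by": "alice", "text": "Hello!"},
]


def portable_export(profile, data_subject):
    """Keep only items the data subject 'provided' themselves
    (Art. 20(1) GDPR); data uploaded by other users falls outside
    the portability claim."""
    return [item for item in profile if item["uploaded_by"] == data_subject]


# Usage: Bob's photo of Alice, although personal data about Alice,
# is excluded from her export because she did not provide it.
export = portable_export(profile, "alice")
machine_readable = json.dumps(export, indent=2)
assert all(item["uploaded_by"] == "alice" for item in export)
```

The excluded item illustrates precisely the gap that, as discussed in the text, a behavioural commitment extending the scope of Art. 20 GDPR might close (subject to clearing third-party rights).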
Yet, those personal data uploaded by someone else might still be considered to be an integral part of a person’s profile. Looking at it from another angle, the latter aspect also has a legal dimension, as third party rights, such as those of other users of the social network (stemming from data protection and intellectual property law), must also be cleared before the porting of data to another social network can take place.89 In practice, these factors might de facto uphold the lock-in effects social networks have, and as such undermine both the positive data protection and competition effects originally envisaged by the drafters of the GDPR. In such an unsatisfactory situation, entering into a behavioural commitment with the social network provider based on the underlying rationale of Art. 20 GDPR might make sense. For instance, as part of such a commitment, one could think of extending the scope of applicability of Art. 20 GDPR to non-personal (e.g. anonymized) data or including third party data as far as necessary to meaningfully port profiles. This would make data portability more meaningful and effective when a social network is market-dominant. Of course, when it comes to porting data belonging to third parties, a solution would have to be found to ensure compliance with, for instance, their data protection and IP rights. A method would need to be implemented to ensure, for instance, that a legal basis for the processing of personal data of third parties is given up front, such as consent according to Art. 6(1)(a) GDPR, and that no copyrights are violated. In addition to modifying the data portability requirements given under the GDPR, one might also consider fostering interoperability between social networks. 
Yet, as regards the latter, a regulatory approach might be more effective, as only general interoperability requirements could ensure that new, small social networks also participate from the outset.90 In this section, we have argued that behavioural commitments between market-dominant undertakings and competition authorities should, in some situations, be drafted based on the rules and principles contained in the GDPR. This might serve to adequately remedy unfair contractual clauses by dominant online platforms, as these clauses often harm consumers’ privacy. We are aware that our approach is unconventional, and that some issues would need to be addressed and analysed further. A general problem—as is always the case with behavioural commitments—is that commitments hardly ever undergo judicial review and, as such, might foster a certain degree of legal uncertainty on an abstract level. Yet, from the point of view of the undertakings concerned, entering into such a commitment holds the promise of individual legal certainty, as they do not run the risk of being found in violation of competition or data protection provisions. This is even more attractive with a view to the newly established, severe administrative fines that can be imposed under the GDPR in case of data protection infringements (cf. Art. 83(5) GDPR). Furthermore, especially when close cooperation with data protection authorities is sought during the process of negotiating the commitments, this might even contribute to legal certainty for other undertakings as well. In data protection matters, it is quite common to seek informal advice from data protection authorities up front in case of uncertainty as regards the legality of specific conducts, as it is a field of law that regularly requires a significant amount of balancing of interests. As such, behavioural commitments might even provide general guidance on how to interpret the GDPR’s provisions. 
In sum, our approach might serve to inspire tailor-made and contextual remedies in a field that is still developing. The cautious combination of competition and data protection law that we have argued for might be a fruitful opportunity to appreciate the ‘family ties’ between these legal regimes. IV. Conclusions The data economy generates a number of opportunities to improve the production and marketing of existing products, and to provide new services to consumers. It thus creates new business opportunities that have the potential to increase consumers’ welfare. However, it also poses new questions and challenges as regards the enforcement of EU competition law. In this paper we have discussed the role that Art. 102 TFEU could have in sanctioning exploitative conducts by dominant online platforms, such as the imposition of excessive and discriminatory pricing, as well as the use of unfair contractual clauses. We have discussed in particular the challenges that an NCA/the European Commission would face in sanctioning these types of abuses in light of the jurisprudence of the CJEU, as well as the possible remedies that the competition agency could adopt to solve the competition infringement. Not only should the enforcement of EU competition law vis-à-vis exploitative conducts in data markets remain the exception; the competent competition authority would also face a number of challenges when it comes to applying the relevant legal standards to sanction these types of abuses under Art. 102 TFEU. As argued in Sections II.A and II.B, the competent competition authority would have a hard time when it comes to sanctioning excessive and discriminatory pricing by dominant online platforms in accordance with the standards recently defined by the ECJ in Latvian Copyright Society and MEO. Therefore, although possible in theory, the enforcement of Art. 
102 TFEU vis-à-vis excessive and discriminatory pricing in the data economy seems unlikely in the near future. By contrast, it would be easier to satisfy the European courts’ legal standards when it comes to sanctioning unfair contractual clauses by dominant online platforms under Art. 102 TFEU. The Facebook decision in Germany is an example of an emerging trend in this regard. However, the recent interim ruling of the Düsseldorf Higher Regional Court shows that national courts might be reluctant to support this enforcement trend. In the second part of the paper we discussed possible remedies that an NCA/the European Commission could take into consideration to sanction exploitative conducts by dominant online platforms. We have argued that due to the lack of precedents in this area and the specific characteristics of digital markets, fines coupled with cease-and-desist orders do not seem to be the most appropriate choice. Instead, the competition enforcer should guide the market players via behavioural remedies, i.e. tailor-made commitments agreed with the dominant firm. In particular, in Section III.A we discussed possible remedies in cases concerning excessive and discriminatory pricing in the digital economy, such as obliging the dominant platform either to reduce the amount of collected data or to share certain information with competitors. By contrast, in Section III.B we argued that when designing behavioural remedies in relation to unfair contractual clauses imposed by dominant online platforms, the competition authority should take recourse to the relevant data protection legislation. In particular, the behavioural commitments could either clarify unclear aspects of the GDPR or extend the scope of its obligations and application. In designing the applicable remedies, the NCA/the European Commission will have to actively cooperate with the relevant authorities, in particular the competent data protection authority. 
The recent Facebook decision represents the first case of exploitative abuse sanctioned in Europe in digital markets. This milestone decision calls for a broader discussion on the role (if any) that Art. 102 TFEU should play in sanctioning the business conduct of dominant digital platforms that harm final consumers. The present paper aims at contributing to this discussion—a discussion that is likely to keep academics and practitioners in Europe busy in the years to come. The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article. The authors received no financial support for the research, authorship, and/or publication of this article. Footnotes 1 Consolidated Version of the Treaty on the Functioning of the European Union, 26 October 2012, 2012 O.J. C 326/47. 2 Art. 102(a) TFEU. 3 Art. 102(c) TFEU. 4 Art. 102(a) TFEU. 5 MS Gal, ‘Monopoly Pricing as an Antitrust Offense in the U.S. and the EC: Two Systems of Belief about Monopoly?’ (2004) 49:1–2 Antitrust Bulletin 343–84. 6 R Karova and M Botta, ‘Sanctioning Excessive Energy Prices as Abuse of Dominance: Are the EU Commission and the National Competition Authorities on the Same Frequency?’ in PL Parcu, G Monti and M Botta (eds), Abuse of Dominance in EU Competition Law: Emerging Trends (Edward Elgar Publisher, Cheltenham, UK and Northampton, MA, USA, 2017) 169–84. 7 M Colangelo and C Desogus, ‘Antitrust Scrutiny of Excessive Prices in the Pharmaceutical Sector: A Comparative Study of the Italian and UK Experiences’ (2018) 41:2 World Competition 225–54. 8 The Bundeskartellamt adopted the Facebook decision on 7 February 2019. The decision was made public on 29 March 2019. It is available in English https://www.bundeskartellamt.de/SharedDocs/Entscheidung/EN/Entscheidungen/Missbrauchsaufsicht/2019/B6-22-16.pdf?__blob=publicationFile&v=5 accessed 23 September 2019. 
9 For a detailed discussion of the Facebook decision, see M Botta and K Wiedemann, ‘The Interaction of EU Competition, Consumer, and Data Protection Law in the Digital Economy: The Regulatory Dilemma in the Facebook Odyssey’ (2019) 64:3 Antitrust Bulletin 428, 437–42. 10 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation) 2016 O.J. L 119/1. 11 See, for instance, I Graef, ‘Market Definition and Market Power in Data: The Case of Online Platforms’ (2015) 38:4 World Competition 473–506; L Filistrucchi, D Geradin and E van Damme, ‘Identifying Two-Sided Markets’ (2013) 36:1 World Competition 33–59. 12 The expression online ‘free’ service is generally considered misleading. In fact, for consumers, the online services delivered ‘involve multiple non-pecuniary costs in the form of providing personal data, paying attention to ads, or the opportunity costs of reading privacy policies’, OECD Secretariat, Big Data: Bringing Competition Policy to the Digital Era (Report published on 27 October 2016), DAF/COMP(2016)14, 25 http://www.oecd.org/daf/competition/big-data-bringing-competition-policy-to-the-digital-era.htm accessed 23 September 2019. 13 C Langhanke and M Schmidt-Kessel, ‘Consumer Data as Consideration’ (2015) 4:6 Journal of European Consumer and Market Law 218–23. 14 Cf. MS Gal and DL Rubinfeld, ‘The Hidden Costs of Free Goods: Implications for Antitrust Enforcement’ (2016) 80:3 Antitrust Law Journal 521–24. 15 J Drexl, ‘Legal Challenges of the Changing Role of Personal and Non-Personal Data in the Data Economy’ (2018) Max Planck Institute for Innovation and Competition Research Paper No. 18–23, 27 https://ssrn.com/abstract=3274519 accessed 23 September 2019. 16 Ibid. 
17 At the moment of writing, the Directive has already been formally adopted, but not yet published in the Official Journal. The final text can be found at https://data.consilium.europa.eu/doc/document/PE-26-2019-INIT/en/pdf accessed 23 September 2019. 18 Cf. European Commission, Proposal for a Directive of the European Parliament and of the Council of 9 December 2015 on certain aspects concerning contracts for the supply of digital content, COM(2015) 634 final, Art. 3(1): ‘This Directive shall apply to any contract where the supplier supplies digital content to the consumer or undertakes to do so and, in exchange, a price is to be paid or the consumer actively provides counter-performance other than money in the form of personal data or any other data’. 19 Case C-27/76, United Brands Company v. Commission, ECLI:EU:C:1978:22, para. 250. 20 Case C-177/16, Autortiesību un komunicēšanās konsultāciju aģentūra/Latvijas Autoru apvienība v. Konkurences padome, ECLI:EU:C:2017:689 (Latvian Copyright Society). 21 Ibid., para. 55. 22 Ibid., para. 58. 23 S Kokolakis, ‘Privacy Attitudes and Privacy Behaviour: A Review of Current Research on the Privacy Paradox Phenomenon’ (2017) 64 Computers & Security 122–34. 24 For an economic analysis of the anti-competitive effects of price discrimination, see M Motta, Competition Policy—Theory and Practice (Cambridge University Press 2004) 493–94. 25 D Geradin and N Petit, ‘Price Discrimination under EC Competition Law: Another Antitrust Doctrine in Search of Limiting Principles?’ (2006) 2:3 Journal of Competition Law and Economics 479, 483. 26 Executive Office of the President of the United States, Big Data and Differential Pricing (2015), 19 https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/docs/Big_Data_Report_Nonembargo_v2.pdf accessed 23 September 2019. 27 Ibid., at 8. 
28 One should keep in mind that first-degree price discrimination means that the exact reservation price of a specific person is known. For this reason, it is primarily seen to be a hypothetical situation/model (cf. A Ezrachi and M Stucke, Virtual Competition (Harvard University Press, Cambridge, MA, USA and London, UK, 2016), 96–9). 29 Supra, Motta (2004), at 493. 30 Opinion of Advocate General Wahl delivered on 20 December 2017 in the Case C-525/16, MEO—Serviços de Comunicações e Multimédia SA v. Autoridade da Concorrência, ECLI:EU:C:2017:1020, para. 80. 31 Cf. D Bergemann, B Brooks and S Morris, ‘The Limits of Price Discrimination’ (2015) 105:3 American Economic Review 921–57. 32 D Carlton and J Perloff, Modern Industrial Organization (3rd ed., Addison-Wesley, Reading, MA, USA, 1999), 280. 33 Cf. supra, Ezrachi and Stucke (2016), at 88. 34 Case C-525/16, MEO—Serviços de Comunicações e Multimédia SA v. Autoridade da Concorrência, ECLI:EU:C:2018:270. 35 Ibid., para. 28. 36 Ibid., para. 31. 37 Case T-301/04, Clearstream Banking AG and Clearstream International SA v. Commission, ECLI:EU:T:2009:317, para. 185. 38 The original press release published on 2 March 2016 www.bundeskartellamt.de/SharedDocs/Meldung/EN/Pressemitteilungen/2016/02_03_2016_Facebook.html?nn=3591568 accessed 23 September 2019. 
39 On 19 December 2017, the Bundeskartellamt issued a press release, as well as a paper containing background information regarding the case: Bundeskartellamt, ‘Press Release of 19 December 2017: Preliminary Assessment in Facebook Proceeding: Facebook’s Collection and Use of Data from Third-Party Sources is Abusive’ https://www.bundeskartellamt.de/SharedDocs/Publikation/EN/Pressemitteilungen/2017/19_12_2017_Facebook.pdf?__blob=publicationFile&v=3 accessed 23 September 2019; Bundeskartellamt, ‘Background Information on the Facebook Proceeding of 19 December 2017’ https://www.bundeskartellamt.de/SharedDocs/Publikation/EN/Diskussions_Hintergrundpapiere/2017/Hintergrundpapier_Facebook.pdf;jsessionid=56A0B536F26D00068F29D7D2AF8F0A59.1_cid378?__blob=publicationFile&v=6 accessed 23 September 2019. 40 Ibid., 2017 Bundeskartellamt Press Release, at 1. 41 Supra, 2019 Bundeskartellamt Facebook Decision, at 2–6. 42 The decision published on 26 August 2019 is available (in German) at www.olg-duesseldorf.nrw.de/behoerde/presse/Presse_aktuell/20190826_PM_Facebook/20190826-Beschluss-VI-Kart-1-19-_V_.pdf accessed 23 September 2019. A shortened version was published in print in (2019) 7:9 Neue Zeitschrift für Kartellrecht 495–501. 43 This seems to be a means to streamline the investigations, since the authority leaves open ‘whether these terms can still result in a violation of data protection rules and how this would have to be assessed under competition law’, Bundeskartellamt, ‘Background Information on the Bundeskartellamt’s Facebook Proceeding of 7 February 2019: Bundeskartellamt Prohibits Facebook from Combining User Data from Different Sources’, 2 https://www.bundeskartellamt.de/SharedDocs/Publikation/EN/Pressemitteilungen/2019/07_02_2019_Facebook_FAQs.pdf?__blob=publicationFile&v=6 accessed 23 September 2019. 44 Bundeskartellamt, ‘Press Release of 7 February 2019: Bundeskartellamt Prohibits Facebook from Combining User Data from Different Sources’, 1. 
https://www.bundeskartellamt.de/SharedDocs/Publikation/EN/Pressemitteilungen/2019/07_02_2019_Facebook.pdf?__blob=publicationFile&v=2 accessed 23 September 2019. 45 Supra, 2019 Bundeskartellamt Background Information, at 2. 46 Ibid. 47 Cf. https://www.facebook.com/off-facebook-activity accessed 23 September 2019. 48 Supra, 2019 Bundeskartellamt Background Information, at 5. 49 Ibid., at 1. 50 Ibid., at 5. 51 An official English translation is available at https://www.gesetze-im-internet.de/englisch_gwb/index.html accessed 23 September 2019. 52 The Bundeskartellamt made particular reference to the ‘VBL Gegenwert II’ and the ‘Pechstein’ case law (Bundesgerichtshof, 24 January 2017, (2017) 67:5 Wirtschaft und Wettbewerb 283 (VBL Gegenwert II); Bundesgerichtshof, 7 June 2016, (2016) 69:31 Neue Juristische Wochenschrift 2266 (Pechstein)). 53 Council Regulation (EC) No 1/2003 of 16 December 2002 on the Implementation of the Rules on Competition Laid Down in Articles 81 and 82 of the Treaty, 2003 O.J. L 1/1. 54 Cf. supra, Botta and Wiedemann (2019), at 441. 55 Case C-345/14, SIA ‘Maxima Latvija’ v. Konkurences padome, ECLI:EU:C:2015:784. 56 Ibid., para. 12. 57 Supra, case C-27/76, para. 130–62. 58 Case C-179/90, Merci Convenzionali Porto di Genova SpA v. Siderurgica Gabrielli SpA, ECLI:EU:C:1991:464, para. 8–24. 59 Case C-247/86, Société Alsacienne et Lorraine de Télécommunications et d’Électronique (Alsatel) v. SA Novasam, ECLI:EU:C:1988:469. 60 Case C-127/73, Belgische Radio en Televisie and société belge des auteurs, compositeurs et éditeurs v. SV SABAM and NV Fonior, ECLI:EU:C:1974:25. 61 Ibid., para. 15. 62 For instance, in the AAMS case the Italian monopoly in charge of the distribution of cigarettes in the country tried to justify the contractual clauses limiting the ability of foreign suppliers to sell cigarettes in Italy. In particular, AAMS argued that these clauses were necessary in view of the limited capacity of its distribution network. 
The Court of First Instance ultimately did not accept these justifications, ruling that ‘AAMS has not proved to the requisite legal standard that the clauses mentioned above were necessary to protect its commercial interests and to avoid … the risk of its distribution network becoming overloaded’ (Case T-139/98, Amministrazione Autonoma dei Monopoli di Stato (AAMS) v. Commission, ECLI:EU:T:2001:272, para. 79). 63 In the paper, we have argued that online ‘free’ services actually have a ‘price’ in terms of the personal data provided by the user to the platform in exchange for the service. 64 Supra, case C-27/76. 65 Supra, case C-177/16. 66 Supra, case C-525/16, para. 35–7. 67 This type of remedy was imposed by the European Commission in the following merger decisions: Commission Decision of 14 July 2010, British Airways/American Airlines/Iberia, COMP/39.596, 2010 O.J. C 278/14; Commission Decision of 23 May 2013, Continental/United/Lufthansa/Air Canada, AT.39595, 2013 O.J. C 201/8; Commission Decision of 12 May 2015, Air France/KLM/Alitalia/Delta, AT.39964, 2015 O.J. C 212/5. 68 Cf. V Kathuria and J Globočnik, ‘Exclusionary Conduct in Data-Driven Markets: Limitations of Data Sharing Remedy’ (2019) Max Planck Institute for Innovation & Competition Research Paper No. 19-04, 4 https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3337524 accessed 23 September 2019. 69 G Colangelo and M Maggiolino, ‘Big Data as Misleading Facilities’ (2017) 13:2–3 European Competition Journal 249, 274. 70 Ibid., at 275. 71 Ibid. 72 Here, the term ‘market failure’ has a normative connotation and, as such, does not look primarily at economic efficiency. 73 A Acquisti, ‘Privacy and Market Failures: Three Reasons for Concern, and Three Reasons for Hope’ (2012) 10:2 Journal on Telecommunications and High Technology Law 227. 
74 On this term and on the difference between the EU’s ‘omnibus’ approach to privacy legislation as opposed to the sector-by-sector approach taken by the United States, see PM Schwartz, ‘The EU-U.S. Privacy Collision: A Turn to Institutions and Procedures’ (2013) 126:7 Harvard Law Review 1966, 1973–79. 75 F Di Ciommo, ‘Privacy in Europe after Regulation (EU) No 2016/679: What will Remain of the Right to be Forgotten?’ (2017) 3:2 The Italian Law Journal 623, 628–29. 76 Under the GDPR, ‘administrative fines up to 20,000,000 EUR, or in the case of an undertaking, up to 4% of the total worldwide annual turnover of the preceding financial year, whichever is higher’ can be imposed for infringements of (most of) the provisions contained in the Regulation: Art. 83(5) GDPR. The literature has heavily criticized the fact that infringements of provisions as vague and open to interpretation as Art. 5(1) GDPR are subject to such heavy fines; P Gola, Datenschutz-Grundverordnung: VO (EU) 2016/679—Kommentar (2nd ed., C.H.Beck, Munich, 2018), Art. 83 para. 26. 77 Bundeskartellamt, ‘Case Summary of 15 February 2019: Facebook, Exploitative Business Terms Pursuant to Section 19(1) GWB for Inadequate Data Processing’, 8–9 https://www.bundeskartellamt.de/SharedDocs/Entscheidung/EN/Fallberichte/Missbrauchsaufsicht/2019/B6-22-16.pdf?__blob=publicationFile&v=3 accessed 23 September 2019. 78 Supra, 2019 Bundeskartellamt Background Information, at 5. 79 Art. 4(11), 6(1)(a) and 7 GDPR. 80 Cf. supra, 2019 Bundeskartellamt Background Information, at 2: ‘Facebook is required to adapt its terms of service and data processing accordingly. If Facebook intends to continue collecting data from outside the social network and combining them in users’ accounts without the consent of users, the processing of these data must be substantially restricted. A number of different criteria are feasible (e.g. 
restrictions on the amount of data, purpose of use, type of data processing, additional control options for users, anonymisation, processing only upon instruction by third party providers, limitations on data storage periods, etc)’. 81 The term informational self-determination (‘Recht auf informationelle Selbstbestimmung’) was coined by the German Federal Constitutional Court (Bundesverfassungsgericht) in its judgment on a national census in 1983 and was expressly acknowledged by the Court as a fundamental right (Bundesverfassungsgericht, 15 December 1983, BVerfGE 65, 1). 82 ‘Silence, pre-ticked boxes or inactivity should not … constitute consent’ (Recital 32 GDPR). 83 I Graef, ‘Mandating Portability and Interoperability in Online Social Networks: Regulatory and Competition Law Issues in the European Union’ (2015) 39:6 Telecommunications Policy 502, 503–4. 84 Recital 68 GDPR. 85 Supra, Graef (2015, Telecommunications Policy), at 507–8. 86 Ibid., at 507. 87 ‘The data subject should also be allowed to transmit those data, which they have provided, from one automated application, such as a social network, into another one’. (European Commission, Proposal of 25 January 2012 for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final, Recital 55). 88 Supra, Graef (2015, Telecommunications Policy), at 507. 89 Ibid. 90 Ibid., at 510. © The Author(s) 2019. Published by Oxford University Press. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited. 
TI - Exploitative Conducts in Digital Markets: Time for a Discussion after the Facebook Decision
JO - Journal of European Competition Law & Practice
DO - 10.1093/jeclap/lpz064
DA - 2019-12-31
UR - https://www.deepdyve.com/lp/oxford-university-press/exploitative-conducts-in-digital-markets-time-for-a-discussion-after-1ch4XEKGGW
SP - 465
EP - 478
VL - 10
IS - 8
DP - DeepDyve
ER -