Big Data and Merger Control in the EU

Key Points

The Commission has an extensive track record examining data-related issues in its EU Merger Regulation decisions, but many of these have involved markets in which data or data analytics were the relevant product. In a number of cases, however, the Commission has examined the antitrust implications of merging parties collecting data that are not offered as a product on the market, the issue at the heart of the big-data debate. In these cases, the Commission’s attempts to apply its existing guidelines have raised a number of conceptual and practical issues. In future cases, it is to be hoped that the Commission will more clearly articulate its framework for analysing the competitive effect of big data where companies are collecting and using the data in their own businesses.

EU antitrust authorities have taken a leading role in exploring antitrust concerns around ‘big data.’ In 2017, the UK CMA announced the formation of a new technology team to keep pace with the use of algorithms, artificial intelligence and big data,1 and in 2016 the French and German authorities published a joint study on big data and antitrust.2 Compared to these national authorities, the European Commission has kept a relatively low profile, although Commissioner Vestager has commented on big data and algorithm issues in several speeches.3 In one area, however, the Commission has been perhaps the most active global antitrust authority in documenting its analysis of big-data issues: merger control.
The Commission has a long track record of assessing the effect of notified mergers on data-related markets,4 ranging from market research and marketing information services,5 to financial information,6 to navigable digital map databases,7 to online advertising.8 In these cases, the Commission’s traditional tools for assessing the effects of a combination on horizontally and vertically affected markets can normally be applied in a straightforward way. Recent debates on big data and antitrust, by contrast, focus on the antitrust implications of companies collecting and using big data that are not offered as a product on the market. The Franco/German Study opined that a combination of ‘data troves could raise competition concerns if the combination of data makes it impossible for competitors to replicate the information’ possessed by the merged entity.9 One early Commission decision, Google/DoubleClick,10 considered the competitive effects of combining data that were not being offered and sold, but the discussion was brief, and the issue did not arise in a similar case two years later.11 In three recent merger cases, however, Verizon/Yahoo,12 Microsoft/LinkedIn,13 and Facebook/WhatsApp,14 the Commission has elaborated on its approach to big data issues in merger review.
These decisions help to illustrate the Commission’s approach and its attempts to fit big data issues into its traditional analytical frameworks, as set out in its Guidelines (the ‘Horizontal Guidelines’) on the assessment of horizontal mergers under the Council Regulation on the control of concentrations between undertakings (the ‘EUMR’),15 its Guidelines (the ‘Non-Horizontal Guidelines’) on the assessment of non-horizontal mergers under the EUMR16 and its Notice (the ‘Remedies Notice’) on remedies acceptable under the EUMR.17 This article discusses the concept of big data and the characteristics of big data that are arguably most relevant from an antitrust perspective, before analysing the Commission’s recent decisions assessing the effects of combining merging parties’ data that are not being licensed or sold as products in their own right.

I. What is ‘Big Data’ and which characteristics are antitrust-relevant?

Although the term is widely used, there is no agreed definition of ‘big data.’ Gartner Research defines big data as ‘high-volume, high-velocity, and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.’18 ‘Volume,’ ‘velocity,’ and ‘variety’ are commonly referred to as the ‘three Vs’ of big data. A fourth ‘V’ – veracity – is sometimes added.19 Although big data are commonly defined by reference to these ‘Vs,’ these three (or four) characteristics are arguably not the most important for antitrust purposes.
The Franco/German Study discusses other approaches to categorising big data, including by reference to the subject of the information (individuals, economic entities or objects), whether the information is structured (such that it can more easily be processed for commercial purposes), and how it is gathered (collected internally, provided by third parties voluntarily, collected from public sources or by observing users’ behaviour, or inferred from existing data).20 The Franco/German Study refers to data made available by one company to others as ‘third-party data.’ Third-party data can be viewed for competition purposes as a product offered for consideration, and evaluated under traditional antitrust principles. By contrast, data collected or inferred directly by a company are referred to as ‘first-party data.’ As discussed in more detail below, the novel issues raised by antitrust authorities and commentators with respect to big data relate mainly to first-party data collected from other natural and legal persons. In a related discussion of big data’s propensity to create market power, the Franco/German Study discusses other characteristics that may be more relevant to the application of potential antitrust theories of harm: (i) the ‘non-rivalrous’ nature of data, (ii) the ‘ubiquity’ of data, (iii) the decreasing marginal value of additional data, and (iv) the tendency of data’s value to decline rapidly over time.
In relation to the non-rivalrous and ubiquitous nature of data, the Franco/German Study explains that big data can be described as non-rivalrous when ‘someone having and using a dataset does not prevent others, be they competitors or not, from having and using the same data as well (provided they can access them).’21 The non-rivalrous nature of big data is sometimes simply assumed,22 but some commentators contest this description.23 The Franco/German Study also notes that the high volumes of data generated by online users support the argument that big data cannot confer market power because ‘data is everywhere.’ If ‘the same kind of knowledge can be extracted from different datasets which may also be obtained through different mechanisms, the risk that an undertaking may not be able to have access to the knowledge enjoyed by his competitors could be low.’24 The Franco/German Study suggests that the validity of this argument may be undermined by limitations on the accessibility of data and the substitutability of one dataset for another. Again, some commentators stress the ubiquity of data,25 while others argue that the ubiquity of data may be exaggerated, in particular where companies collecting such data use exclusionary practices to limit competitors’ access to data.26 As regards the value of data, the Franco/German Study notes that ‘decreasing marginal returns to scale… would [limit] the competitive advantages resulting from large amounts of data.’27 The Franco/German Study notes further that the scope of a dataset may be as important as its scale; data collected by offering different services may allow the collecting entity to gather knowledge on multiple aspects of users’ behaviour and tastes, potentially improving the data holder’s ability to predict users’ interests.
Similarly, the Franco/German Study discusses the fact that ‘the value of data may decrease quite quickly in time,’28 but notes that some data, such as gender, names, address, date of birth, job, etc., may not lose value over time. According to the Franco/German Study, a company having such data at its disposal may have a lasting advantage over its competitors. The Franco/German Study concludes that ‘the advantages associated with access to a larger volume of data may be quite different from one market to another, and case-by-case assessments are required.’29 While it is hard to argue with this observation, the theoretical analysis can be taken a step further by considering the characteristics of different types of data and applications. For example, whether data are non-rivalrous and ubiquitous depends on the data in question and the purpose for which they are to be used. Absent contractual restrictions, third-party data offered as a product are normally available to anyone willing to pay for them, so they are non-rivalrous and ubiquitous by definition. The situation with first-party data is more complicated. First-party data collected by a company developing and testing its own products, for instance through the Internet of Things, would not normally be accessible to others and so are neither non-rivalrous nor ubiquitous. However, the exclusive nature of such data does not necessarily have any foreclosing effect; any company is free to collect the same or similar types of data on its own products. First-party data collected on others, such as data collected by an online retailer on its customers’ preferences, may not be available to competitors, but the same or similar data may be available from multiple sources. Whether those other data are sufficiently similar to be substitutable depends on their intended use.
Without the tool of market prices, however, antitrust authorities may find it difficult to define and measure the substitutability of first-party data for different purposes. Similarly, the marginal value of additional data, and how long-lasting that value is, depends on the type of data in question and the purpose for which it is used. For example, the principle of diminishing returns presumably applies less or not at all to data collected from third parties for use in time-sensitive applications, such as individually targeted online advertising. On the other hand, the more time-sensitive the application, the shorter-lived the data’s value is likely to be. Conversely, in some applications for which data’s value is longer-lasting, such as first-party data collected on a company’s own products or data collected from users of online services where the user’s identity and preferences may not be important (e.g., to test speech recognition software), incremental data seems more likely to have diminishing returns. Such data are also more likely to be available from a greater variety of sources, i.e. to be non-rivalrous and ubiquitous. Data also have different values to different users; first-party data collected by a company on its own products may have lasting value to that company, but no (legitimate) competitive value to another company. Again, absent market prices for first-party data, defining and measuring these differences will be challenging for antitrust authorities.

II. Big data and EU merger review

As mentioned, merger review is the area in which the Commission has considered big-data issues in concrete cases, including a number of major technology-sector transactions. The Commission has not laid out a theoretical framework for analysing big data issues as such, and the Horizontal and Non-Horizontal Guidelines do not specifically reference data (unsurprisingly in view of the dates they were published).
Nonetheless, the Commission’s recent decisions go a long way towards formalising its approach to assessing data-related aspects of notified mergers where the data are not themselves being offered to third parties as part of a product or service. This emerging approach is discussed below, first in relation to horizontal issues and then in relation to vertical issues.

A. Horizontal big data issues

As noted, according to the Franco/German Study, a combination of ‘data troves could raise competition concerns if the combination of data makes it impossible for competitors to replicate the information’ possessed by the merged entity. The Franco/German Study also noted that mergers ‘in data-related markets… could result in differentiated data access and increase the concentration of data related to this market if the newcomer has access to a large database (gained on another market for instance).’30 Although the Franco/German Study did not divide its merger-related concerns between horizontal and non-horizontal theories of harm, these seem to fall in the category of horizontal concerns, relating to potential data-related barriers to entry a merger could raise.

1. The Horizontal Guidelines and Remedies Notice

The Commission’s Horizontal Guidelines note that the anticompetitive effects of horizontal mergers can be divided into non-coordinated effects, where a merger eliminates ‘important competitive constraints on one or more firms, which consequently would have increased market power,’ and coordinated effects, ‘by changing the nature of competition in such a way that firms that previously were not coordinating their behaviour, are now significantly more likely to coordinate’ or by making ‘coordination easier, more stable or more effective for firms which were coordinating prior to the merger.’ The Horizontal Guidelines describe a number of possible barriers to entry or expansion that may be relevant in the big-data context, including ‘technical advantages, such as preferential access to essential facilities, innovation and R & D, or intellectual property rights,’ difficulties in obtaining ‘essential input materials,’ ‘economies of scale and scope,’ and ‘access to important technologies.’31 The Horizontal Guidelines do not discuss the possibility that a merger might significantly impede competition by raising barriers to entry or expansion, but the Remedies Notice observes that the Commission has accepted remedies to facilitate market entry by competitors, including the ‘granting of access to key infrastructure, networks, key technology, including patents, know-how or other intellectual property rights, and essential inputs… on a non-discriminatory and transparent basis.’32

2. Commission precedents

As discussed below, the Commission’s analysis of big data issues in recent merger cases has focused more on vertical than on horizontal issues. The Commission has, however, considered horizontal issues in both Verizon/Yahoo and Microsoft/LinkedIn.
In Verizon/Yahoo, the Commission noted33 that ‘[a]ssuming data combination is allowed under the applicable data protection legislation, there are two main ways in which a merger may raise horizontal issues as a result of the combination, under the ownership of the merged entity, of two datasets previously held by two independent firms.

‘First, the combination of two datasets post-merger may increase the merged entity’s market power in a hypothetical market for the supply of this data or increase barriers to entry/expansion in the market for actual or potential competitors, which may need this data to operate on this market. Competitors may be required to collect a larger dataset in order to compete effectively with the merged entity than absent the merger.

‘Second, even if there is no intention or technical possibility to combine the two datasets, it may be that pre-merger the two companies were competing with each other on the basis of the data they controlled and this competition would be eliminated by the merger.’

Concluding that the combination of the parties’ data did not raise antitrust concerns, the Commission noted the potential restrictions imposed by EU data protection rules and observed that the combination would not raise barriers to entry/expansion for other players, since there would ‘continue to be a large amount of internet user data that are valuable for advertising purposes and that are not within the Parties’ exclusive control.’ Indeed, the Commission’s market test confirmed that Yahoo’s and Verizon’s data were not ‘unique.’ In Microsoft/LinkedIn, the Commission used the same formulation to describe how a merger may raise horizontal issues as a result of a combination of datasets.
Again, the Commission found that the combination of the parties’ datasets did not raise barriers to entry/expansion for other players, because there would continue to be a ‘large amount of internet user data that are valuable for advertising purposes’ and that are not within Microsoft’s exclusive control. The Commission also noted that (with very limited exceptions) Microsoft and LinkedIn did not make their data available to third parties for advertising purposes and that the parties were small market players with little overlap in online advertising.

3. Comments

Verizon/Yahoo and Microsoft/LinkedIn, with their identical summary of competitive harms that can arise from a combination of big datasets, are the closest the Commission has come to providing a theoretical framework for the analysis of big data issues in the horizontal context. A closer look at the Commission’s approach, however, raises a number of questions. Unpacking the two points raised in these decisions, the Commission’s formulation seems to break down into four theories of harm, none of which is fully articulated. First, the combination of the merging parties’ datasets may increase market power in a ‘hypothetical’ market for the supply of the merged parties’ data, i.e. where the merging parties are not in fact offering their data to third parties. It is unclear how competition would be impeded in such a situation, or in what market. The theory might be that, absent the merger, one or both parties would be more likely to start offering their data to third parties, turning a ‘hypothetical’ market into a real one. However, it is difficult to see how the Commission could meet its burden of proof to object to a transaction on this basis, at least without clear evidence of an intent to enter or create a market for the data in question absent the transaction.
Second, the combination of the merging parties’ datasets may increase barriers to entry/expansion in a market for the supply of data by actual or potential competitors who need ‘this data to operate on this market.’ The Commission here seems to contemplate that a combination of the merging parties’ datasets would foreclose competition because actual or potential competitors would need access to the parties’ data, presumably because their data are unique for some reason. Again, the theory might be that, even if the merging parties would not offer their data to third parties themselves, one or both might be more likely absent the merger to license their data to a third party who would in turn market that dataset to end customers. If the data in question have not previously been made available to third parties, however, it would seem hard to show that competitors need them to compete. The concern over competitors’ need for the merging parties’ data as an input also seems to overlap with the vertical concerns discussed below. Third, the combination of the merging parties’ datasets may increase barriers to entry/expansion in a market for the supply of data, because the merger would result in actual or potential competitors being required to collect ‘a larger dataset to compete effectively with the merged entity.’ In this scenario, the merging parties’ data would not be uniquely required for other players to compete effectively, but those competitors would for some reason need more data than they would if they had access to the data of one or both of the merging parties. The focus on the quantity of data competitors would need to compete effectively with the merged entity seems simplistic; presumably the substitutability of alternative data, the cost of obtaining them, limitations on how the data can be used, etc., would also be relevant. Again, the concern over competitors’ access to the merging parties’ data seems to represent a vertical concern more than a horizontal one.
Fourth, competition ‘based on’ the merging parties’ datasets could be eliminated by the merger even if the datasets themselves could not or would not be merged. It is not clear what market or markets would be affected in this scenario or how the competitive effects would be assessed. For example, auto manufacturers compete (among other things) based on the data they have collected in the process of designing and testing their products, but if two auto manufacturers merge, it would not make sense to assess the effect of combining their datasets separately from the effects of combining their product lines. Perhaps, in this scenario, the Commission has in mind the role of data in innovation competition, which the Commission has otherwise measured mainly based on patents.34 In any event, the Commission was able to dismiss these theories of harm in both Verizon/Yahoo and Microsoft/LinkedIn on the basis that a ‘large amount’ of user data would be available for advertising purposes and the parties’ data were not ‘unique’. Strikingly, in neither case did the Commission attempt to link its horizontal analysis of big data issues to its Horizontal Guidelines or prior treatments of barriers to entry. For example, the Commission did not describe the relevant test as whether the data in question were ‘essential’ or ‘key’ for competitors and, if so, how those concepts would be tested (although the Commission did comment on the ‘uniqueness’ of the data in question). The Commission also left ambiguous which market or markets could be considered affected by the combination of the parties’ datasets and did not clarify how it determined that the ‘large amount’ of data available to third parties would be sufficient, and/or whether the Commission considered the substitutability of different available datasets. Since the Commission found no big-data-related ground for concern in these cases, of course, it was not required to discuss potential theories of harm in detail.
In future cases, the Commission will hopefully clarify its approach to horizontal theories of harm arising from the combination of big datasets. It would be helpful for the Commission’s analysis to consider the impact of the big data characteristics discussed above and in the Franco/German Study. Novel issues are most likely to arise in relation to combinations of first-party data (i.e., data collected or inferred by the merging parties themselves). Verizon/Yahoo and Microsoft/LinkedIn both concerned first-party data collected by companies on others, in particular consumers. As mentioned, such data are normally non-rivalrous and most commonly described as ‘ubiquitous’. Similarly, increasing the size of such datasets may yield diminishing returns, while the value of such data may decline relatively swiftly over time, though these features clearly vary depending on how companies use the data. By contrast, first-party data collected by a company on its own products and services may be unique and proprietary, but there would seem to be little ground for analysing the combination of data separately from the analysis of the combination of the products or services to which they relate.

B. Vertical big data issues

In general, although the Commission’s big-data cases have addressed both horizontal and non-horizontal concerns, the Commission’s analytical framework appears better suited to dealing with big-data concerns in a vertical context. Input foreclosure concerns are addressed in the Non-Horizontal Guidelines, and potential remedies for such concerns are discussed in the Remedies Notice. This is in line with the U.S. agencies’ approach in focussing on vertical foreclosure issues where big data represented an important input to a downstream market.35 The Franco/German Study also noted that a merger of companies holding ‘strong market positions in separate upstream or downstream markets can foreclose these markets for new competitors,’ for instance where an online service provider acquires ‘producers of computers, smartphones or softwares in order to make sure to continue to access important amounts of data through users of these services.’

1. The Non-Horizontal Guidelines and Remedies Notice

According to the Commission’s Non-Horizontal Guidelines, a ‘merger is said to result in foreclosure where actual or potential rivals’ access to supplies or markets is hampered or eliminated as a result of the merger, thereby reducing these companies’ ability and/or incentive to compete.’ According to the Commission, input foreclosure may occur in various forms. The merged entity may decide not to deal with its actual or potential downstream competitors, to restrict supplies, to raise prices to competitors and/or to otherwise make the conditions of supply less favourable (for example, by degrading the quality of the input). In assessing input foreclosure risks, the Commission examines, first, whether the merged entity would have the ability to substantially foreclose access to inputs; second, whether it would have the incentive to do so; and third, whether a foreclosure strategy would have a significant detrimental effect on competition downstream. Concerns arise only if the input in question is important for the downstream product, for instance because the input represents a significant cost factor or is important for other reasons.
According to the Commission, a merged entity would only have the ability to foreclose if, by reducing access to its own upstream products or services, it could negatively affect the overall availability of inputs for the downstream market in terms of price or quality (e.g., because the remaining upstream suppliers are less efficient, offer less preferred alternatives, or lack the ability to expand output in response to the supply restriction) or due to the presence of exclusive contracts between the merged entity and independent input providers. Input foreclosure is only a concern where the merged firm has significant market power in the upstream market, so that its conduct can be expected to significantly influence competition in the upstream market and thus, possibly, prices and supply conditions in the downstream market. The incentive to foreclose depends on the degree to which foreclosure would be profitable. The merged entity faces a trade-off between the profit lost in the upstream market due to a reduction of input sales to (actual or potential) rivals and the profit gain, in the short or longer term, from expanding sales downstream or, as the case may be, being able to raise prices to consumers. Other things being equal, a merged entity is more likely to have an incentive to foreclose where the upstream margins are low and the downstream margins are high. The merged firm’s incentives also depend on the share of demand that may be diverted from downstream rivals to the merged entity. In general, input foreclosure concerns may lead to a challenge when input foreclosure would lead to increased prices in the downstream market or raise barriers to entry to potential competitors, in particular if input foreclosure would entail for such potential competitors the need to enter at both the downstream and the upstream level in order to compete effectively. 
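The incentive analysis described above reduces to a profitability comparison: the margin forgone on upstream sales to rivals against the margin gained on downstream sales diverted to the merged entity. A minimal numerical sketch of that trade-off follows; all figures and the function name are hypothetical, chosen only to illustrate the logic the Non-Horizontal Guidelines describe, not drawn from any Commission decision.

```python
# Hypothetical sketch of the incentive-to-foreclose trade-off.
# All numbers are invented for illustration.

def foreclosure_profit_change(upstream_margin, lost_upstream_units,
                              downstream_margin, diverted_downstream_units):
    """Net profit change for the merged entity from refusing to supply rivals.

    upstream_margin          -- per-unit margin forgone on input sales to rivals
    lost_upstream_units      -- input sales to rivals given up
    downstream_margin        -- per-unit margin on the merged entity's downstream product
    diverted_downstream_units -- rivals' downstream sales diverted to the merged entity
    """
    profit_lost_upstream = upstream_margin * lost_upstream_units
    profit_gained_downstream = downstream_margin * diverted_downstream_units
    return profit_gained_downstream - profit_lost_upstream

# Low upstream margin, high downstream margin: foreclosure pays.
print(foreclosure_profit_change(2, 1000, 15, 400))   # 6000 gained - 2000 lost = 4000

# High upstream margin, low downstream margin: foreclosure does not pay.
print(foreclosure_profit_change(10, 1000, 5, 400))   # 2000 gained - 10000 lost = -8000
```

The two calls mirror the Guidelines' observation that, other things being equal, the incentive to foreclose is stronger where upstream margins are low and downstream margins are high, and weaker in the reverse case; in practice the diverted share of demand would itself depend on how important the input is to downstream rivals.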
Conversely, concerns may be rebutted if there are sufficient credible downstream competitors whose costs would likely not be raised, countervailing factors such as buyer power, the likelihood that entry upstream would maintain effective competition, or efficiencies. According to the Remedies Notice, concerns that competition may be significantly impeded by foreclosure can be addressed by ‘commitments granting non-discriminatory access to infrastructure or networks of the merging parties’ or ‘information necessary for the interoperability of different equipment’ where ‘the control of key technology or IP rights may lead to concerns of foreclosure of competitors which depend on the technology or IP rights as essential input for the activities in a downstream market.’36 Similarly, concerns that a merger may eliminate players’ incentives to license patents may be addressed by commitments to grant licenses on the same basis post-transaction to all third parties which depend on the IP rights or information for their activities.

2. Commission precedents

In a number of recent cases, the Commission has applied the analysis of input foreclosure effects to concerns about access to big data in the merger context, supplementing the discussions from earlier cases such as Telefónica UK/Vodafone UK/Everything Everywhere/JV and Google/DoubleClick. In Microsoft/LinkedIn, certain software solutions providers claimed that LinkedIn ‘full data’ would in future constitute an important input for the provision of advanced functionalities of CRM software. One complainant in particular argued that Microsoft could restrict access to LinkedIn full data, thereby making it harder for other providers to compete and to bring innovation in the market. The Commission noted that LinkedIn did not generally make available LinkedIn full data to third parties pre-transaction, and the complaint assumed that, absent the transaction, LinkedIn would have started monetising such data.
The Commission questioned that assumption, but nonetheless analysed whether Microsoft would have had the ability and incentive to reduce competition by restricting downstream competitors’ access to LinkedIn full data. Microsoft argued that LinkedIn full data were not an important input, and that alternative data were available from other vendors. The Commission considered that the transaction would not have given Microsoft the ability to foreclose competition, since reducing access to LinkedIn full data would be unlikely to negatively affect the overall availability of data needed by downstream competitors, in part because LinkedIn did not appear to have a significant degree of market power in any potential relevant upstream market, which in this case would be a hypothetical market or segment for the provision of data. In addition, the Commission considered that LinkedIn full data could not be characterised as (and would not be likely to become in the next two to three years) an important input, since all major CRM vendors had already started offering advanced functionalities or planned to do so in the next two to three years, without access to LinkedIn full data. Furthermore, even if LinkedIn full data were to be used for that purpose, it would constitute only one of the many types of data needed and many other possible sources of data were already available. Moreover, the Commission considered that it was unclear whether Microsoft would have had the incentive to foreclose competing providers of CRM software solutions by restricting access to LinkedIn full data. The Commission noted that the market investigation was inconclusive and that it was not clear to what extent any such foreclosing strategy would be profitable for Microsoft (since it was not possible to estimate what profits Microsoft would derive by licensing LinkedIn full data).
As regards the impact of a foreclosure strategy on the downstream market, the Commission considered that the transaction was unlikely to have an overall negative impact on effective competition in the market for CRM software solutions, and any potential restriction of access to LinkedIn full data would be unlikely to lead to consumer harm, since any impact would be felt in sub-segments representing less than 30 per cent of the entire CRM software solutions market, and LinkedIn was only one of multiple data sources required. Therefore, it was unlikely that competing CRM software providers would be hampered in their ability to compete and innovate, or that the mere likelihood that Microsoft would carry out a foreclosure strategy would raise barriers to entry for potential competitors. In Facebook/WhatsApp,37 the Commission considered whether a ‘potential data concentration’ would be likely to strengthen Facebook’s position in the market for online advertising, where there was no horizontal overlap. Moreover, since WhatsApp did not collect any user data that were valuable for advertising purposes, the transaction would not increase the amount of data available to Facebook for that purpose. Nonetheless, in response to concerns raised in its market test, the Commission considered Facebook’s potential use of WhatsApp as a source of data to improve the targeting of Facebook’s advertising activities. The Commission noted that, even if the merged entity were to start collecting and using data from WhatsApp users, the transaction would only raise competition concerns if the concentration of data within Facebook’s control were to allow it to strengthen its position in advertising.
The Commission’s market test showed that, post-transaction, a sufficient number of alternative providers of online advertising services would remain, and that a significant number of other market participants collect user data, including Google, which the Commission said ‘accounts for a significant portion of the Internet user data’, as well as many others. Accordingly, the Commission concluded that a ‘large amount’ of Internet user data that are valuable for advertising purposes would remain outside Facebook’s exclusive control. In Telefónica UK/Vodafone UK/Everything Everywhere/JV, a case involving the formation of a joint venture to offer mobile commerce services to UK businesses, the Commission thoroughly analysed concerns that the JV would foreclose competing providers of data analytics or advertising services by combining personal, location, response, social behaviour, and browsing data into a unique database that would become an essential input for targeted mobile advertising, one that no competing provider of mobile data analytics services or advertising customer would be able to replicate,38 although none of the JV parents was previously active in data analytics services. The Commission noted that customers generally tend to give their personal data to many market players, so that this type of data is considered a ‘commodity.’ This type of information was also already available to market players who were already providing targeted advertising or developing such activities. The Commission considered a claim that the JV parents would be uniquely placed to reach the consumer at a certain ‘right moment’ via geolocation data, but again concluded that multiple sources of such information would remain available. 
In sum, the Commission concluded that while the JV would indeed be able to collect a broad range of consumer information, which would be very valuable for its data analytics and advertising services, many other strong and established players could offer comparable solutions, so advertising service providers would not be foreclosed from an essential input. In Google/DoubleClick, the Commission described the theory of harm related to the combination of Google’s and DoubleClick’s customer data as a ‘foreclosure scenario’ in which the combination of the parties’ data could allow the merged entity to marginalise competitors by better targeting ads to users. The Commission concluded, however, that the merged entity would lack the ability and incentive to force advertisers to allow better targeting of ads. The Commission also noted that, even if Google’s and DoubleClick’s data were available as an input for DoubleClick, its competitiveness would not likely be enhanced in a way that could not be matched by competitors, taking account of the availability of data from other sources. The Commission concluded that the combination of Google’s and DoubleClick’s data would be very unlikely to enable the merged entity to charge higher prices for its intermediation services.39

3. Comments

The Commission’s analysis of big-data-related input foreclosure concerns has tended to track the Commission’s traditional approach as set out in its Guidelines more closely than its analysis of horizontal issues. In Microsoft/LinkedIn in particular, the Commission closely tracked the Non-Horizontal Guidelines analysis of the merged entity’s alleged incentive and ability to foreclose downstream competitors’ access to data as an input, taking account of the parties’ strength in the upstream and downstream markets. 
The Commission considered foreclosure effects in a ‘hypothetical’ market, since the concerns were based on an unproven assumption that, without the transaction, LinkedIn would have attempted to monetise its full data by selling them to downstream competitors. The Commission itself noted the theoretical difficulties with this approach; it is hard to see how the Commission could meet its burden to show that a notified transaction would create the incentive and ability to foreclose the supply of data to competitors where the data have not been supplied by either merging party in the past, there is no course of dealing by which to measure the profitability of such a strategy, and the data in question have by definition not previously been a key input for competitors. Future decisions may clarify how the Commission will apply its analytical framework for vertical issues to big data where the data are not already available as an input to competitors. The Commission should also consider potential anticompetitive effects in view of the characteristics of the data in question. Again, the cases in which these issues have arisen involve first-party data (often personal data) collected by companies on their customers for use (or potential use) in online advertising. As noted, such data are the type most commonly considered to be non-rivalrous and ubiquitous, as acknowledged by the Commission in calling such data a ‘commodity.’ Depending on the use case, such data are also more likely to suffer from diminishing returns and to decline rapidly in value. Where the value of particular data is likely to be short-lived, it may be more appropriate to consider the potential foreclosure impact of the merging parties’ sources for generating new data, as opposed to the combination of existing datasets. 
By contrast, where merging companies’ data may have longer-lasting value, such as for developing and improving a platform’s own online advertising algorithms or other products (e.g., spell-check programs or voice recognition software), incremental data may have decreasing marginal value, which would likely limit the competitive impact of combining the merging parties’ data. The Commission would also need to take account of non-data alternatives that address the relevant needs. For example, if competitors can license a suitable algorithm from a software developer, access to a particular dataset with which to develop their own software may be less relevant.

III. Conclusion

Although the European Commission has been less vocal on big-data issues than some national authorities, it has developed a significant track record in examining big-data-related concerns in the merger context. Most of the Commission’s older decisions concerned data-related markets, i.e., markets in which the data in question were themselves the relevant product. More recently, the Commission has started to explicate its approach to big-data concerns where the merging parties collect large amounts of data in their businesses but do not offer such data to third parties as a product. The Commission has considered such issues from both horizontal and vertical perspectives. Since the Commission has not yet objected to a merger based on big-data concerns, it has not been required to articulate a theory of harm in detail. Nonetheless, the Commission’s approach so far raises questions about the suitability of its existing guidelines for assessing concerns about data that companies collect and use internally, as opposed to data offered on the market. 
In the horizontal context, the Commission articulated an analytical framework in two recent cases, but this framework did not clearly track the Commission’s Horizontal Guidelines and raised a number of interpretive questions, apparently mingling horizontal and vertical elements. In the vertical context, the Commission has more clearly followed the analytical approach set out in its own Guidelines, considering the possibility that a merger would lead to foreclosure of downstream competitors’ access to data they need to compete effectively. Where merging parties are only collecting and using data in their own businesses, however, conclusions about how a merger might change the parties’ incentives to offer such data in the future seem speculative, and in any case showing that such data would be a ‘key’ or ‘important’ input for competitors seems difficult where the data were not available pre-merger. In future cases, it is to be hoped that the Commission will more clearly articulate its framework for analysing the competitive effect of big data where companies are collecting and using the data in their own businesses, as opposed to participating in data-related markets. In this process, it would be helpful for the Commission to consider the significance of big data in light of the factors discussed in the Franco/German Study and elsewhere: whether the data in question are non-rivalrous and ubiquitous, and whether their value reflects diminishing incremental returns from additional data and/or is likely to decline rapidly over time, in each case depending on the source and nature of the data and the purposes for which they are used. In particular, while big-data concerns can be expected to arise most often in relation to first-party data collected by online companies on consumers and used as an input in related markets such as online advertising, these are the types of data most commonly considered non-rivalrous and ubiquitous. 
By contrast, it should be possible in most cases to rule out big-data concerns in relation to first-party data companies collect on their own products and services. Big-data issues would not be expected to arise, as intimated in the Franco/German Study, simply because the data in question are ‘unique’ and valuable.

Footnotes

1 https://www.ft.com/content/349103ba-c631-11e7-b2bb-322b2cb39656.
2 The ‘Franco/German Study’, available at http://www.bundeskartellamt.de/SharedDocs/Publikation/DE/Berichte/Big%20Data%20Papier.html?nn=3591568. See also Monopolies Commission, Competition Policy: The Challenge of Digital Markets, Special Report No 68 (2015); the CMA’s report on the commercial use of consumer data, available at https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/435817/The_commercial_use_of_consumer_data.pdf; and the FCA’s feedback statement on big data in retail general insurance, available at https://www.fca.org.uk/publications/feedback-statements/fs16-5-call-inputs-big-data-retail-general-insurance.
3 See e.g. Big Data and Competition, 29 September 2017, available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/big-data-and-competition_en, and Clearing the Path for Innovation, 7 November 2017, available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/clearing-path-innovation_en.
4 For an overview of these cases, see Massimiliano Kadar and Mateusz Bogdan, ‘Big Data’ and EU Merger Control – A Case Review, JECLAP, 17 October 2017, p. 479, available at https://academic.oup.com/jeclap/article/8/8/479/3844574#. 
5 See e.g. VNU/AC Nielsen, COMP/M.2291, dated 12.02.2001, available at http://ec.europa.eu/competition/mergers/cases/decisions/m2291_en.pdf, and WPP/TNS, COMP/M.5232, dated 23.09.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m5232_20080923_20212_en.pdf.
6 See e.g. Thomson Corporation/Reuters, COMP/M.4726, dated 19.02.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4726_20080219_20600_en.pdf.
7 E.g. TomTom/TeleAtlas, COMP/M.4854, dated 14.05.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4854_20080514_20682_en.pdf, and Nokia/Navteq, COMP/M.4942, dated 02.07.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4942_20080702_20682_en.pdf.
8 See e.g. Google/DoubleClick, COMP/M.4731, dated 11.03.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4731_20080311_20682_en.pdf.
9 Franco/German Study, p. 16.
10 Google/DoubleClick, paras 359 et seq.
11 Microsoft/Yahoo! Search Business, COMP/M.5727, dated 18.02.2010, available at http://ec.europa.eu/competition/mergers/cases/decisions/M5727_20100218_20310_261202_EN.pdf.
12 COMP/M.8180, dated 21.12.2016, available at http://europa.eu/rapid/press-release_MEX-16-4491_en.htm.
13 COMP/M.8124, dated 6.12.2016, available at http://europa.eu/rapid/press-release_IP-16-4284_en.htm.
14 COMP/M.7217, dated 3.10.2014, available at http://ec.europa.eu/competition/mergers/cases/decisions/m7217_20141003_20310_3962132_EN.pdf. See also Telefónica UK/Vodafone UK/Everything Everywhere/JV, COMP/M.6314, dated 04.09.2012, available at http://ec.europa.eu/competition/mergers/cases/decisions/m6314_20120904_20682_2898627_EN.pdf.
15 OJ C 31/5, 5.2.2004 (the ‘Horizontal Guidelines’), available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52004XC0205(02)&from=EN.
16 OJ C 265/6, 18.10.2008, available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52008XC1018(03)&from=EN. 
17 OJ C 267/1, 22.10.2008, available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52008XC1022(01)&from=EN.
18 http://www.gartner.com/it-glossary/big-data/. See also Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 15–28.
19 See e.g. http://www.ibmbigdatahub.com/infographic/four-vs-big-data.
20 Franco/German Study, pp. 4–7. The Franco/German Study focuses mainly on personal data, whose collection and processing is subject to Regulation 2016/679, the General Data Protection Regulation.
21 Franco/German Study, p. 36. See also Andres V. Lerner, The Role of ‘Big Data’ in Online Platform Competition (2014), pp. 21–23.
22 See e.g. Anja Lambrecht and Catherine E. Tucker, Can Big Data Protect a Firm from Competition (Winter 2017) CPI Antitrust Chronicle, p. 1.
23 See e.g. Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 42–47 and 288–301; Maurice E. Stucke and Allen P. Grunes, Debunking the Myths over Big Data and Antitrust (Spring 2015) CPI Antitrust Chronicle; No Mistake about It: The Important Role of Antitrust in the Era of Big Data (April 2015) Antitrust Source 1, pp. 7–8; and Inge Graef, Market Definition and Market Power in Data: the Case of Online Platforms (2015) World Competition Vol. 38, No. 4, pp. 479–480.
24 Franco/German Study, p. 42. See also Darren Tucker and Hill Wellford, Big Mistakes Regarding Big Data (December 2014) Antitrust Source 1, pp. 3–4.
25 See e.g. Anja Lambrecht and Catherine E. Tucker, Can Big Data Protect a Firm from Competition (Winter 2017) CPI Antitrust Chronicle, pp. 2–3.
26 See e.g. Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 42–47 and 288–301; and Maurice E. Stucke and Allen P. Grunes, Debunking the Myths over Big Data and Antitrust (Spring 2015) CPI Antitrust Chronicle; No Mistake about It: The Important Role of Antitrust in the Era of Big Data (April 2015) Antitrust Source 1, pp. 7–8.
27 Franco/German Study, pp. 47–48. See also Inge Graef, Market Definition and Market Power in Data: the Case of Online Platforms (2015) World Competition Vol. 38, No. 4, pp. 483–487; Andres V. Lerner, The Role of ‘Big Data’ in Online Platform Competition (2014), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2482780, p. 35; and Darren Tucker and Hill Wellford, Big Mistakes Regarding Big Data (December 2014) Antitrust Source 1, p. 4.
28 Franco/German Study, p. 49.
29 Franco/German Study, pp. 50–52.
30 Franco/German Study, p. 16.
31 Horizontal Guidelines, para. 71 (citations omitted).
32 Remedies Notice, para. 62.
33 Verizon/Yahoo, paras. 81 et seq.
34 See e.g. Dow/DuPont, COMP/M.7932, dated 27.3.2017, available at http://ec.europa.eu/competition/mergers/cases/decisions/m7932_13668_3.pdf.
35 See e.g. Statement of the Fed. Trade Comm’n, In the Matter of Nielsen Holdings N.V. and Arbitron Inc., File No. 131-0058 at 1 (Sept. 20, 2013), https://www.ftc.gov/system/files/documents/cases/140228nielsenholdingsstatement.pdf. See also Terrell McSweeney and Brian O’Dea, Data, Innovation, and Potential Competition in Digital Markets – Looking Beyond Short-Term Price Effects in Merger Analysis, CPI Antitrust Chronicle, February 2018.
36 Remedies Notice, paras. 64–65.
37 Facebook/WhatsApp, paras 184 et seq.
38 Telefónica UK/Vodafone UK/Everything Everywhere/JV, paras. 528 et seq.
39 Google/DoubleClick, paras 359–366.

© The Author(s) 2018. Published by Oxford University Press. All rights reserved. 
For Permissions, please email: [email protected] This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model).

Journal of European Competition Law & Practice, Oxford University Press


Publisher: Oxford University Press
ISSN: 2041-7764
eISSN: 2041-7772
DOI: 10.1093/jeclap/lpy062

These decisions help to illustrate the Commission’s approach and its attempts to fit big-data issues into its traditional analytical frameworks, as set out in its Guidelines (the ‘Horizontal Guidelines’) on the assessment of horizontal mergers under the Council Regulation on the control of concentrations between undertakings (the ‘EUMR’),15 its Guidelines (the ‘Non-Horizontal Guidelines’) on the assessment of non-horizontal mergers under the EUMR,16 and its Notice (the ‘Remedies Notice’) on remedies acceptable under the EUMR.17 This article discusses the concept of big data and the characteristics of big data that are arguably most relevant from an antitrust perspective, before analysing the Commission’s recent decisions assessing the effects of combining merging parties’ data that are not being licensed or sold as products in their own right.

I. What is ‘Big Data’ and which characteristics are antitrust-relevant?

Although the term is widely used, there is no agreed definition of ‘big data.’ Gartner Research defines big data as ‘high-volume, high-velocity, and/or high-variety information assets that demand cost-effective, innovative forms of information processing that enable enhanced insight, decision making, and process automation.’18 ‘Volume,’ ‘velocity,’ and ‘variety’ are commonly referred to as the ‘three Vs’ of big data. A fourth ‘V,’ veracity, is sometimes added.19 Although big data are commonly defined by reference to these ‘Vs,’ these three (or four) characteristics are arguably not the most important for antitrust purposes. 
The Franco/German Study discusses other approaches to categorising big data, including by reference to the subject of the information (individuals, economic entities or objects), whether the information is structured (such that it can more easily be processed for commercial purposes), and how it is gathered (collected internally, provided by third parties voluntarily, collected from public sources or by observing users’ behaviour, or inferred from existing data).20 The Franco/German Study refers to data made available by one company to others as ‘third-party data.’ Third-party data can be viewed for competition purposes as a product offered for consideration, and evaluated under traditional antitrust principles. By contrast, data collected or inferred directly by a company are referred to as ‘first-party data.’ As discussed in more detail below, the novel issues raised by antitrust authorities and commentators with respect to big data relate mainly to first-party data collected from other natural and legal persons. In a related discussion of big data’s propensity to create market power, the Franco/German Study discusses other characteristics that may be more relevant to the application of potential antitrust theories of harm: (i) the ‘non-rivalrous’ nature of data, (ii) the ‘ubiquity’ of data, (iii) the decreasing marginal value of additional data, and (iv) the tendency of data’s value to decline rapidly over time. 
In relation to the non-rivalrous and ubiquitous nature of data, the Franco/German Study explains that big data can be described as non-rivalrous when ‘someone having and using a dataset does not prevent others, be they competitors or not, from having and using the same data as well (provided they can access them).’21 The non-rivalrous nature of big data is sometimes simply assumed,22 but some commentators contest this description.23 The Franco/German Study also notes that the high volumes of data generated by online users give rise to an argument that big data cannot give rise to market power because ‘data is everywhere.’ If ‘the same kind of knowledge can be extracted from different datasets which may also be obtained through different mechanisms, the risk that an undertaking may not be able to have access to the knowledge enjoyed by his competitors could be low.’24 The Franco/German Study suggests that the validity of this argument may be undermined by limitations on the accessibility of data and the substitutability of one dataset for another. Again, some commentators stress the ubiquity of data,25 while others argue that the ubiquity of data may be exaggerated, in particular where companies collecting such data use exclusionary practices to limit competitors’ access to data.26 As regards the value of data, the Franco/German Study notes that ‘decreasing marginal returns to scale… would [limit] the competitive advantages resulting from large amounts of data.’27 The Franco/German Study notes further that the scope of a dataset may be as important as its scale; data collected by offering different services may allow the collecting entity to gather knowledge on multiple aspects of users’ behaviour and tastes, potentially improving the data holder’s ability to predict users’ interests. 
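The decreasing-marginal-returns argument can be made concrete with a stylised functional form. The specification below is purely illustrative; the value function and its parameter are assumptions for exposition, not drawn from the Franco/German Study or any Commission decision. Suppose the commercial value a firm can extract from a dataset of $n$ observations is

```latex
V(n) = a \log(1 + n), \qquad a > 0,
\qquad\text{so that}\qquad
\frac{\partial V}{\partial n} = \frac{a}{1 + n} \;\longrightarrow\; 0
\quad\text{as } n \to \infty .
```

On this assumption, the marginal value of one further observation is positive but declines towards zero as the dataset grows: combining two already-large datasets adds far less value than the first data collected, which is one reason the competitive significance of a data combination cannot be inferred from dataset size alone.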
Similarly, the Franco/German Study discusses the fact that ‘the value of data may decrease quite quickly in time,’28 but notes that some data, such as gender, name, address, date of birth, job, etc., may not lose value over time. According to the Franco/German Study, a company having such data at its disposal may have a lasting advantage over its competitors. The Franco/German Study concludes that the advantages associated with access to a larger volume of data may be quite different from one market to another, and case-by-case assessments are required.29 While it is hard to argue with this observation, the theoretical analysis can be taken a step further by considering the characteristics of different types of data and applications. For example, whether data are non-rivalrous and ubiquitous depends on the data in question and the purpose for which they are to be used. Absent contractual restrictions, third-party data offered as a product are normally available to anyone willing to pay for them, so they are non-rivalrous and ubiquitous by definition. The situation with first-party data is more complicated. First-party data collected by a company developing and testing its own products, for instance through the Internet of Things, would not normally be accessible to others and so are neither non-rivalrous nor ubiquitous. However, the exclusive nature of such data does not necessarily have any foreclosing effect; any company is free to collect the same or similar types of data on its own products. First-party data collected on others, such as data collected by an online retailer on its customers’ preferences, may not be available to competitors, but the same or similar data may be available from multiple sources. Whether those other data are sufficiently similar to be substitutable depends on their intended use. 
Without the tool of market prices, however, antitrust authorities may find it difficult to define and measure the substitutability of first-party data for different purposes. Similarly, the marginal value of additional data, and how long-lasting that value is, depend on the type of data in question and the purpose for which they are used. For example, the principle of diminishing returns presumably applies less, or not at all, to data collected from third parties for use in time-sensitive applications, such as individually targeted online advertising. On the other hand, the more time-sensitive the application, the shorter-lived the data’s value is likely to be. Conversely, in some applications for which data’s value is more long-lasting, such as first-party data collected on a company’s own products or data collected from users of online services where the user’s identity and preferences may not be important (e.g., to test speech recognition software), incremental data seem more likely to have diminishing returns. Such data are also more likely to be available from a greater variety of sources, i.e. to be non-rivalrous and ubiquitous. Data also have different values to different users; first-party data collected by a company on its own products may have lasting value to that company, but no (legitimate) competitive value to another company. Again, absent market prices for first-party data, defining and measuring these differences will be challenging for antitrust authorities.

II. Big data and EU merger review

As mentioned, merger review is the area in which the Commission has considered big-data issues in concrete cases, including a number of major technology-sector transactions. The Commission has not laid out a theoretical framework for analysing big-data issues as such, and the Horizontal and Non-Horizontal Guidelines do not specifically reference data (unsurprisingly, in view of the dates on which they were published). 
Nonetheless, the Commission’s recent decisions go a long way towards formalising its approach to assessing data-related aspects of notified mergers where the data are not themselves being offered to third parties as part of a product or service. This emerging approach is discussed below, first in relation to horizontal issues and then in relation to vertical issues.

A. Horizontal big data issues

As noted, according to the Franco/German Study, a combination of ‘data troves could raise competition concerns if the combination of data makes it impossible for competitors to replicate the information’ possessed by the merged entity. The Franco/German Study also noted that mergers ‘in data-related markets… could result in differentiated data access and increase the concentration of data related to this market if the newcomer has access to a large database (gained on another market for instance).’30 Although the Franco/German Study did not divide its merger-related concerns between horizontal and non-horizontal theories of harm, these seem to fall into the category of horizontal concerns, relating to potential data-related barriers to entry that a merger could raise.

1. The Horizontal Guidelines and Remedies Notice

The Commission’s Horizontal Guidelines note that the anticompetitive effects of horizontal mergers can be divided into non-coordinated effects, where a merger eliminates ‘important competitive constraints on one or more firms, which consequently would have increased market power,’ and coordinated effects, ‘by changing the nature of competition in such a way that firms that previously were not coordinating their behaviour, are now significantly more likely to coordinate’ or by making ‘coordination easier, more stable or more effective for firms which were coordinating prior to the merger.’ The Horizontal Guidelines describe a number of possible barriers to entry or expansion that may be relevant in the big-data context, including ‘technical advantages, such as preferential access to essential facilities, innovation and R & D, or intellectual property rights,’ difficulties in obtaining ‘essential input materials,’ ‘economies of scale and scope,’ and ‘access to important technologies.’31 The Horizontal Guidelines do not discuss the possibility that a merger might significantly impede competition by raising barriers to entry or expansion, but the Remedies Notice observes that the Commission has accepted remedies to facilitate market entry by competitors, including the ‘granting of access to key infrastructure, networks, key technology, including patents, know-how or other intellectual property rights, and essential inputs… on a non-discriminatory and transparent basis.’32

2. Commission precedents

As discussed below, the Commission’s analysis of big-data issues in recent merger cases has focused more on vertical than on horizontal issues. The Commission has, however, considered horizontal issues in both Verizon/Yahoo and Microsoft/LinkedIn. 
In Verizon/Yahoo, the Commission noted33 that ‘[a]ssuming data combination is allowed under the applicable data protection legislation, there are two main ways in which a merger may raise horizontal issues as a result of the combination, under the ownership of the merged entity, of two datasets previously held by two independent firms.

‘First, the combination of two datasets post-merger may increase the merged entity’s market power in a hypothetical market for the supply of this data or increase barriers to entry/expansion in the market for actual or potential competitors, which may need this data to operate on this market. Competitors may be required to collect a larger dataset in order to compete effectively with the merged entity than absent the merger.

‘Second, even if there is no intention or technical possibility to combine the two datasets, it may be that pre-merger the two companies were competing with each other on the basis of the data they controlled and this competition would be eliminated by the merger.’

Concluding that the combination of the parties’ data did not raise antitrust concerns, the Commission noted the potential restrictions imposed by EU data protection rules and observed that the combination would not raise barriers to entry/expansion for other players, since there would ‘continue to be a large amount of internet user data that are valuable for advertising purposes and that are not within the Parties’ exclusive control.’ Indeed, the Commission’s market test confirmed that Yahoo’s and Verizon’s data were not ‘unique.’ In Microsoft/LinkedIn, the Commission used the same formulation to describe how a merger may raise horizontal issues as a result of a combination of datasets. 
Again, the Commission found that the combination of the parties’ datasets did not raise barriers to entry/expansion for other players, because there would continue to be a ‘large amount of internet user data that are valuable for advertising purposes’ and that are not within Microsoft’s exclusive control. The Commission also noted that (with very limited exceptions) Microsoft and LinkedIn did not make their data available to third parties for advertising purposes and that the parties were small market players with little overlap in online advertising.

3. Comments

Verizon/Yahoo and Microsoft/LinkedIn, with their identical summary of competitive harms that can arise from a combination of big datasets, are the closest the Commission has come to providing a theoretical framework for the analysis of big data issues in the horizontal context. A closer look at the Commission’s approach, however, raises a number of questions. Unpacking the two points raised in these decisions, the Commission’s formulation seems to break down into four theories of harm, none of which is fully articulated. First, the combination of the merging parties’ datasets may increase market power in a ‘hypothetical’ market for the supply of the merged parties’ data, i.e. where the merging parties are not in fact offering their data to third parties. It is unclear how competition would be impeded in such a situation, or in what market. The theory might be that, absent the merger, one or both parties would be more likely to start offering their data to third parties, turning a ‘hypothetical’ market into a real one. However, it is difficult to see how the Commission could meet its burden of proof to object to a transaction on this basis, at least without clear evidence of an intent to enter or create a market for the data in question absent the transaction.
Second, the combination of the merging parties’ datasets may increase barriers to entry/expansion in a market for the supply of data by actual or potential competitors who need ‘this data to operate on this market.’ The Commission here seems to contemplate that a combination of the merging parties’ datasets would foreclose competition because actual or potential competitors would need access to the parties’ data, presumably because their data are unique for some reason. Again, the theory might be that, even if the merging parties would not offer their data to third parties themselves, one or both might be more likely absent the merger to license their data to a third party who would in turn market that dataset to end customers. If the data in question have not previously been made available to third parties, however, it would seem hard to show that competitors need it to compete. The concern over competitors’ need for the merging parties’ data as an input also seems to overlap with the vertical concerns discussed below. Third, the combination of the merging parties’ datasets may increase barriers to entry/expansion in a market for the supply of data, because the merger would result in actual or potential competitors being required to collect ‘a larger dataset to compete effectively with the merged entity.’ In this scenario, the merging parties’ data would not be uniquely required for other players to compete effectively, but those competitors would for some reason need more data than they would if they had access to the data of one or both of the merging parties. The focus on the quantity of data competitors would need to compete effectively with the merged entity seems simplistic; presumably the substitutability of alternative data, the cost of obtaining it, limitations on how the data can be used, etc., would also be relevant. Again, the concern over competitors’ access to the merging parties’ data seems to represent a vertical concern more than a horizontal one. 
Fourth, competition ‘based on’ the merging parties’ datasets could be eliminated by the merger even if the datasets themselves could not or would not be merged. It is not clear what market or markets would be affected in this scenario or how the competitive effects would be assessed. For example, auto manufacturers compete (among other things) based on the data they have collected in the process of designing and testing their products, but if two auto manufacturers merge, it would not make sense to assess the effect of combining their datasets separately from the effects of combining their product lines. Perhaps, in this scenario, the Commission has in mind the role of data in innovation competition, which the Commission has otherwise measured mainly based on patents.34 In any event, the Commission was able to dismiss these theories of harm in both Verizon/Yahoo and Microsoft/LinkedIn on the basis that a ‘large amount’ of user data would be available for advertising purposes and the parties’ data were not ‘unique’. Strikingly, in neither case did the Commission attempt to link its horizontal analysis of big data issues to its Horizontal Guidelines or prior treatments of barriers to entry. For example, the Commission did not describe the relevant test as whether the data in question were ‘essential’ or ‘key’ for competitors, or explain how those concepts would be tested (although the Commission did comment on the ‘uniqueness’ of the data in question). The Commission also left ambiguous which market or markets could be considered affected by the combination of the parties’ datasets, did not clarify how it determined that the ‘large amount’ of data available to third parties would be sufficient, and did not indicate whether it considered the substitutability of different available datasets. Since the Commission found no big-data-related ground for concern in these cases, of course, it was not required to discuss potential theories of harm in detail.
In future cases, the Commission will hopefully clarify its approach to horizontal theories of harm arising from the combination of big datasets. It would be helpful for the Commission’s analysis to consider the impact of the big data characteristics discussed above and in the Franco/German Study. Novel issues are most likely to arise in relation to combinations of first-party data (i.e., data collected or inferred by the merging parties themselves). Verizon/Yahoo and Microsoft/LinkedIn both concerned first-party data collected by companies on others, in particular consumers. As mentioned, such data are normally non-rivalrous and most commonly described as ‘ubiquitous’. Similarly, increasing the size of such datasets may yield diminishing returns, while the value of such data may decline relatively swiftly over time, though these features clearly vary depending on how companies use the data. By contrast, first-party data collected by a company on its own products and services may be unique and proprietary, but there would seem to be little ground for analysing the combination of data separately from the analysis of the combination of the products or services to which they relate.

B. Vertical big data issues

In general, although the Commission’s big-data cases have addressed both horizontal and non-horizontal concerns, the Commission’s analytical framework appears better suited to dealing with big-data concerns in a vertical context. Input foreclosure concerns are addressed in the Non-Horizontal Guidelines, and potential remedies for such concerns are discussed in the Remedies Notice. This is in line with the U.S.
agencies’ approach in focussing on vertical foreclosure issues where big data represented an important input to a downstream market.35 The Franco/German Study also noted that a merger of companies holding ‘strong market positions in separate upstream or downstream markets can foreclose these markets for new competitors,’ for instance where an online service provider acquires ‘producers of computers, smartphones or softwares in order to make sure to continue to access important amounts of data through users of these services.’

1. The Non-Horizontal Guidelines and Remedies Notice

According to the Commission’s Non-Horizontal Guidelines, a ‘merger is said to result in foreclosure where actual or potential rivals’ access to supplies or markets is hampered or eliminated as a result of the merger, thereby reducing these companies’ ability and/or incentive to compete.’ According to the Commission, input foreclosure may occur in various forms. The merged entity may decide not to deal with its actual or potential downstream competitors, to restrict supplies, to raise prices to competitors and/or to otherwise make the conditions of supply less favourable (for example, by degrading the quality of the input). In assessing input foreclosure risks, the Commission examines, first, whether the merged entity would have the ability to substantially foreclose access to inputs; second, whether it would have the incentive to do so; and third, whether a foreclosure strategy would have a significant detrimental effect on competition downstream. Concerns arise only if the input in question is important for the downstream product, for instance because the input represents a significant cost factor or is important for other reasons.
According to the Commission, a merged entity would only have the ability to foreclose if, by reducing access to its own upstream products or services, it could negatively affect the overall availability of inputs for the downstream market in terms of price or quality (e.g., because the remaining upstream suppliers are less efficient, offer less preferred alternatives, or lack the ability to expand output in response to the supply restriction) or due to the presence of exclusive contracts between the merged entity and independent input providers. Input foreclosure is only a concern where the merged firm has significant market power in the upstream market, so that its conduct can be expected to significantly influence competition in the upstream market and thus, possibly, prices and supply conditions in the downstream market. The incentive to foreclose depends on the degree to which foreclosure would be profitable. The merged entity faces a trade-off between the profit lost in the upstream market due to a reduction of input sales to (actual or potential) rivals and the profit gain, in the short or longer term, from expanding sales downstream or, as the case may be, being able to raise prices to consumers. Other things being equal, a merged entity is more likely to have an incentive to foreclose where the upstream margins are low and the downstream margins are high. The merged firm’s incentives also depend on the share of demand that may be diverted from downstream rivals to the merged entity. In general, input foreclosure concerns may lead to a challenge when input foreclosure would lead to increased prices in the downstream market or raise barriers to entry to potential competitors, in particular if input foreclosure would entail for such potential competitors the need to enter at both the downstream and the upstream level in order to compete effectively. 
Conversely, concerns may be rebutted where there are sufficient credible downstream competitors whose costs would likely not be raised, countervailing factors such as buyer power, a likelihood that upstream entry would maintain effective competition, or efficiencies. According to the Remedies Notice, concerns that competition may be significantly impeded by foreclosure can be addressed by ‘commitments granting non-discriminatory access to infrastructure or networks of the merging parties’ or ‘information necessary for the interoperability of different equipment’ where ‘the control of key technology or IP rights may lead to concerns of foreclosure of competitors which depend on the technology or IP rights as essential input for the activities in a downstream market.’36 Similarly, concerns that a merger may eliminate players’ incentives to license patents may be addressed by commitments to grant licences on the same basis post-transaction to all third parties which depend on the IP rights or information for their activities.

2. Commission precedents

In a number of recent cases, the Commission has applied the analysis of input foreclosure effects to concerns about access to big data in the merger context, supplementing the discussions from earlier cases such as Telefónica UK/Vodafone UK/Everything Everywhere/JV and Google/DoubleClick. In Microsoft/LinkedIn, certain software solutions providers claimed that LinkedIn ‘full data’ would in future constitute an important input for the provision of advanced functionalities of CRM software. One complainant in particular argued that Microsoft could restrict access to LinkedIn full data, thereby making it harder for other providers to compete and to bring innovation to the market. The Commission noted that LinkedIn did not generally make LinkedIn full data available to third parties pre-transaction, and the complaint assumed that, absent the transaction, LinkedIn would have started monetising such data.
The Commission questioned that assumption, but nonetheless analysed whether Microsoft would have had the ability and incentive to reduce competition by restricting downstream competitors’ access to LinkedIn full data. Microsoft argued that LinkedIn full data were not an important input, and that alternative data were available from other vendors. The Commission considered that the transaction would not have given Microsoft the ability to foreclose competition, since reducing access to LinkedIn full data would be unlikely to negatively affect the overall availability of data needed by downstream competitors, in part because LinkedIn did not appear to have a significant degree of market power in any potential relevant upstream market, which in this case would be a hypothetical market or segment for the provision of data. In addition, the Commission considered that LinkedIn full data could not be characterised as (and would not be likely to become in the next two to three years) an important input, since all major CRM vendors had already started offering advanced functionalities, or planned to do so in the next two to three years, without access to LinkedIn full data. Furthermore, even if LinkedIn full data were to be used for that purpose, they would constitute only one of the many types of data needed, and many other possible sources of data were already available. Moreover, the Commission considered that it was unclear whether Microsoft would have had the incentive to foreclose competing providers of CRM software solutions by restricting access to LinkedIn full data. The Commission noted that the market investigation was inconclusive and that it was not clear to what extent any such foreclosure strategy would be profitable for Microsoft (since it was not possible to estimate what profits Microsoft would derive from licensing LinkedIn full data).
As regards the impact of a foreclosure strategy on the downstream market, the Commission considered that the transaction was unlikely to have an overall negative impact on effective competition in the market for CRM software solutions, and any potential restriction of access to LinkedIn full data would be unlikely to lead to consumer harm, since any impact would be felt in sub-segments representing less than 30 per cent of the entire CRM software solutions market, and LinkedIn was only one of multiple data sources required. Therefore, it was unlikely that competing CRM software providers would be hampered in their ability to compete and innovate, or that the mere likelihood that Microsoft would carry out a foreclosure strategy would raise barriers to entry for potential competitors. In Facebook/WhatsApp,37 the Commission considered whether a ‘potential data concentration’ would be likely to strengthen Facebook’s position in the market for online advertising, where there was no horizontal overlap. Since WhatsApp did not collect any user data that were valuable for advertising purposes, moreover, the transaction would not increase the amount of data available to Facebook for that purpose. Nonetheless, in response to concerns raised in its market test, the Commission considered whether Facebook could use WhatsApp as a source of data to improve the targeting of its advertising activities. The Commission noted that, even if the merged entity were to start collecting and using data from WhatsApp users, the transaction would only raise competition concerns if the concentration of data within Facebook’s control were to allow it to strengthen its position in advertising.
The Commission’s market test showed that, post-transaction, a sufficient number of alternative providers of online advertising services would remain, and that a significant number of other market participants collect user data, including Google, which the Commission said ‘accounts for a significant portion of the Internet user data’, as well as many others. Accordingly, the Commission concluded that a ‘large amount’ of Internet user data that is valuable for advertising purposes would remain outside Facebook’s exclusive control. In Telefónica UK/Vodafone UK/Everything Everywhere/JV, a case involving the formation of a joint venture to offer mobile commerce services to UK businesses, the Commission thoroughly analysed concerns that the JV would foreclose competing providers of data analytics or advertising services by combining personal, location, response, social behaviour, and browsing data into a unique database that would become an essential input for targeted mobile advertising, one that no competing provider of mobile data analytics services or advertising customer would be able to replicate,38 although none of the JV parents was previously active in data analytics services. The Commission noted that customers generally tend to give their personal data to many market players, so that this type of data is considered a ‘commodity.’ This type of information was also already available to market players who were already providing targeted advertising or developing these activities. The Commission considered a claim that the JV parents would be uniquely placed to reach the consumer at a certain ‘right moment’ via geolocation data, but again concluded that multiple sources of such information would remain available.
In sum, the Commission concluded that while the JV would indeed be able to collect a broad range of consumer information, which would be very valuable for its data analytics services and advertising services, many other strong and established players could offer comparable solutions, so advertising service providers would not be foreclosed from an essential input. In Google/DoubleClick, the Commission described the theory of harm related to the combination of Google’s and DoubleClick’s customer data as a ‘foreclosure scenario’ in which the combination of the parties’ data could allow the merged entity to marginalise competitors by better targeting ads to users. The Commission concluded, however, that the merged entity would lack the ability and incentive to force advertisers to allow better targeting of ads. The Commission also noted that, even if Google’s and DoubleClick’s data were available as an input for the merged entity, its competitiveness would not likely be enhanced in a way that could not be matched by competitors, taking account of the availability of data from other sources. The Commission concluded that the combination of Google’s and DoubleClick’s data would be very unlikely to enable the merged entity to charge higher prices for its intermediation services.39

3. Comments

The Commission’s analysis of big-data-related input foreclosure concerns has tended to track the Commission’s traditional approach as set out in its Guidelines more closely than its analysis of horizontal issues. In Microsoft/LinkedIn in particular, the Commission closely tracked the Non-Horizontal Guidelines analysis of the merged entity’s alleged incentive and ability to foreclose downstream competitors’ access to data as an input, taking account of the parties’ strength in upstream and downstream markets.
The Commission considered foreclosure effects in a ‘hypothetical’ market, since the concerns were based on an unproven assumption that without the transaction LinkedIn would have attempted to monetise its full data by selling it to downstream competitors. The Commission itself noted the theoretical difficulties with this approach; it is hard to see how the Commission could meet its burden to show that a notified transaction would create the incentive and ability to foreclose the supply of data to competitors where the data have not been supplied by either merging party in the past, there is no course of dealing to measure the profitability of such a strategy, and the data in question have by definition not previously been a key input for competitors. Future decisions may clarify how the Commission will apply its analytical framework for vertical issues to big data, where the data are not already available as an input to competitors. The Commission should also consider potential anticompetitive effects in view of the characteristics of the data in question. Again, the cases in which these issues have arisen involve first-party data (often personal data) collected by companies on their customers for use (or potential use) in online advertising. As noted, such data are the type most commonly considered to be non-rivalrous and ubiquitous, as acknowledged by the Commission in calling such data a ‘commodity.’ Such data are also more likely – depending on the use case – to suffer from diminishing returns and to decline rapidly in value. Where the value of particular data is likely to be short-lived, it may be more appropriate to consider the potential foreclosure impact of the merging parties’ sources for generating new data, as opposed to the combination of existing datasets.
By contrast, where merging companies’ data may have longer-lasting value, such as for developing and improving a platform’s own online advertising algorithms or other products (e.g., spell-check programs or voice recognition software), incremental data may have decreasing marginal value, which would likely limit the competitive impact of combining the merging parties’ data. The Commission would also need to take account of non-data alternatives to address relevant needs. For example, if competitors can license a suitable algorithm from a software developer, access to a particular dataset to develop their own software may be less relevant.

III. Conclusion

Although the European Commission has been less vocal on big-data issues than some national authorities, the Commission has developed a significant track record in examining big-data-related concerns in the merger context. Most of the Commission’s older decisions concerned data-related markets, i.e., markets in which the data in question were themselves the product. More recently, the Commission has started to explicate its approach to big-data concerns where the merging parties collect large amounts of data in their business but do not offer such data to third parties as a product. The Commission has considered such issues from both horizontal and vertical perspectives. Since the Commission has not yet objected to a merger based on big-data concerns, it has not been required to articulate a theory of harm in detail. Nonetheless, the Commission’s approach so far raises questions about the suitability of its existing guidelines for assessing concerns about data that companies collect and use internally, as opposed to data offered on the market.
In the horizontal context, the Commission articulated an analytical framework in two recent cases, but this framework did not clearly track the Commission’s Horizontal Guidelines and raised a number of interpretive questions, apparently mingling horizontal and vertical elements. In the vertical context, the Commission has more clearly followed the analytical approach set out in the Commission’s own Guidelines, considering the possibility that a merger would lead to foreclosure of access by downstream competitors to data they need to compete effectively. Where merging parties are only collecting and using data in their own businesses, however, conclusions about how a merger might change the parties’ incentives to offer such data in the future seem speculative, and in any case showing that such data would be a ‘key’ or ‘important’ input for competitors seems difficult where the data were not available pre-merger. In future cases, it is to be hoped that the Commission will more clearly articulate its framework for analysing the competitive effect of big data where companies are collecting and using the data in their own businesses, as opposed to participating in data-related markets. In this process, it would be helpful for the Commission to consider the significance of big data in light of the factors discussed in the Franco/German Study and elsewhere – whether the data in question are non-rivalrous and ubiquitous, and whether their value reflects diminishing incremental returns from additional data and/or is likely to decline rapidly over time, in each case depending on the source and nature of the data and the purposes for which it is used. In particular, while big data concerns can be expected to arise most often in relation to first-party data collected by online companies on consumers and used as an input in related markets such as online advertising, these are the types of data most commonly considered non-rivalrous and ubiquitous. 
By contrast, it should be possible in most cases to rule out big-data concerns in relation to first-party data companies collect on their own products and services. Big data issues would not be expected to arise, as intimated in the Franco/German Study, simply because the data in question are ‘unique’ and valuable.

Footnotes

1 https://www.ft.com/content/349103ba-c631-11e7-b2bb-322b2cb39656

2 The ‘Franco/German Study’, available at http://www.bundeskartellamt.de/SharedDocs/Publikation/DE/Berichte/Big%20Data%20Papier.html?nn=3591568. See also Monopolies Commission, Competition Policy: The Challenge of Digital Markets, Special Report No 68 [2015]; the CMA’s report on the commercial use of consumer data, available at https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/435817/The_commercial_use_of_consumer_data.pdf; and the feedback statement on big data in retail general insurance, available at https://www.fca.org.uk/publications/feedback-statements/fs16-5-call-inputs-big-data-retail-general-insurance.

3 See e.g., Big Data and Competition, September 29, 2017, available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/big-data-and-competition_en, and Clearing the Path for Innovation, November 7, 2017, available at https://ec.europa.eu/commission/commissioners/2014-2019/vestager/announcements/clearing-path-innovation_en.

4 For an overview of these cases, see Massimiliano Kadar and Mateusz Bogdan, ‘Big Data’ and EU Merger Control – A Case Review, JECLAP, 17 October 2017, p. 479, available at https://academic.oup.com/jeclap/article/8/8/479/3844574#.
5 See e.g., VNU/AC Nielsen, COMP/M.2291, dated 12.02.2001, available at http://ec.europa.eu/competition/mergers/cases/decisions/m2291_en.pdf, and WPP/TMS, COMP/M.5232, dated 23.09.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m5232_20080923_20212_en.pdf.

6 See e.g., Thomson Corporation/Reuters, COMP/M.4726, dated 19.02.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4726_20080219_20600_en.pdf.

7 E.g., TomTom/TeleAtlas, COMP/M.4854, dated 14.05.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4854_20080514_20682_en.pdf, and Nokia/Navteq, COMP/M.4942, dated 02.07.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4942_20080702_20682_en.pdf.

8 See e.g., Google/DoubleClick, COMP/M.4731, dated 11.03.2008, available at http://ec.europa.eu/competition/mergers/cases/decisions/m4731_20080311_20682_en.pdf.

9 Franco/German Study, p. 16.

10 Paras 359 et seq.

11 Microsoft/Yahoo! Search Business, COMP/M.5727, dated 18.02.2010, available at http://ec.europa.eu/competition/mergers/cases/decisions/M5727_20100218_20310_261202_EN.pdf.

12 COMP/M.8180, dated 21.12.2016, available at http://europa.eu/rapid/press-release_MEX-16-4491_en.htm.

13 COMP/M.8124, dated 6.12.2016, available at http://europa.eu/rapid/press-release_IP-16-4284_en.htm.

14 COMP/M.7217, dated 3.10.2014, available at http://ec.europa.eu/competition/mergers/cases/decisions/m7217_20141003_20310_3962132_EN.pdf. See also Telefónica UK/Vodafone UK/Everything Everywhere, COMP/M.6314, dated 04.09.2012, available at http://ec.europa.eu/competition/mergers/cases/decisions/m6314_20120904_20682_2898627_EN.pdf.

15 OJ C 31/5, 5.2.2004 (the ‘Horizontal Guidelines’), available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52004XC0205(02)&from=EN.

16 OJ C 265/6, 18.10.2008, available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52008XC1018(03)&from=EN.
17 OJ C 267/1, 22.10.2008, available at http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52008XC1022(01)&from=EN.

18 http://www.gartner.com/it-glossary/big-data/. See also Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 15–28.

19 See e.g., http://www.ibmbigdatahub.com/infographic/four-vs-big-data.

20 Franco/German Study, pp. 4–7. The Franco/German Study focuses mainly on personal data, whose collection and processing is subject to Regulation 2016/679, the General Data Protection Regulation.

21 Franco/German Study, p. 36. See also Lerner, The Role of ‘Big Data’ in Online Platform Competition (2014), pp. 21–23.

22 See e.g., Anja Lambrecht and Catherine E. Tucker, Can Big Data Protect a Firm from Competition (Winter 2017) CPI Antitrust Chronicle, p. 1.

23 See e.g., Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 42–47 and 288–301; Maurice E. Stucke and Allen P. Grunes, Debunking the Myths over Big Data and Antitrust (Spring 2015) CPI Antitrust Chronicle; No Mistake about It: The Important Role of Antitrust in the Era of Big Data (April 2015) Antitrust Source 1, pp. 7–8; and Inge Graef, Market Definition and Market Power in Data: the Case of Online Platforms (2015) World Competition Vol. 38, No. 4, pp. 479–480.

24 Franco/German Study, p. 42. See also Darren Tucker and Hill Wellford, Big Mistakes Regarding Big Data (December 2014) Antitrust Source 1, pp. 3–4.

25 See e.g., Anja Lambrecht and Catherine E. Tucker, Can Big Data Protect a Firm from Competition (Winter 2017) CPI Antitrust Chronicle, pp. 2–3.

26 See e.g., Maurice E. Stucke and Allen P. Grunes, Big Data and Competition Policy (Oxford University Press 2016), pp. 42–47 and 288–301; and Maurice E. Stucke and Allen P.
Grunes, Debunking the Myths over Big Data and Antitrust (Spring 2015) CPI Antitrust Chronicle; No Mistake about It: The Important Role of Antitrust in the Era of Big Data (April 2015) Antitrust Source 1, pp. 7–8.

27 Franco/German Study, pp. 47–48. See also Inge Graef, Market Definition and Market Power in Data: the Case of Online Platforms (2015) World Competition Vol. 38, No. 4, pp. 483–487; Andres V. Lerner, The Role of ‘Big Data’ in Online Platform Competition (2014), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2482780, p. 35; and Darren Tucker and Hill Wellford, Big Mistakes Regarding Big Data (December 2014) Antitrust Source 1, p. 4.

28 Franco/German Study, p. 49.

29 Franco/German Study, pp. 50–52.

30 Franco/German Study, p. 16.

31 Horizontal Guidelines, para. 71 (citations omitted).

32 Remedies Notice, para. 62.

33 Paras 81 et seq.

34 See e.g., Dow/DuPont, COMP/M.7932, dated 27.3.2017, available at http://ec.europa.eu/competition/mergers/cases/decisions/m7932_13668_3.pdf.

35 See e.g., Statement of the Fed. Trade Comm’n, In the Matter of Nielsen Holdings N.V. and Arbitron Inc., File No. 131-0058 at 1 (Sept. 20, 2013), https://www.ftc.gov/system/files/documents/cases/140228nielsenholdingsstatement.pdf. See also Terrell McSweeney and Brian O’Dea, Data, Innovation, and Potential Competition in Digital Markets – Looking Beyond Short-Term Price Effects in Merger Analysis (February 2018) CPI Antitrust Chronicle.

36 Remedies Notice, paras. 64–65.

37 Paras 184 et seq.

38 Telefónica UK/Vodafone UK/Everything Everywhere/JV, paras 528 et seq.

39 Google/DoubleClick, paras 359–366.

© The Author(s) 2018. Published by Oxford University Press. All rights reserved. For Permissions, please email: [email protected]. This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model).

Journal of European Competition Law & Practice, Oxford University Press

Published: Dec 1, 2018