April 21st 2016 saw the publication of the report on ‘Online platforms and the Digital Single Market’ by the House of Lords EU Internal Market Sub-Committee. This report presents the findings of the inquiry that was held from October 2015 until spring 2016, drawing on 85 written responses and 20 oral evidence sessions. Included in the written responses were two from Horizon Digital Economy Research, one by Prof. Rodden and one by myself, which we partially discussed on this blog in October 2015. The main driver for this inquiry was the publication in May 2015 by the European Commission (EC) of its ‘Digital Single Market Strategy for Europe’ (DSM), which drew attention to the growing role of online platforms as key players in social and economic interactions on the internet, and was followed on 24 September by the launch of an EC consultation ‘A fit for purpose regulatory environment for platforms and intermediaries’. For the purposes of both the EC consultation and the Lords’ inquiry, online platforms were considered to ‘represent a broad category of digital businesses that provide a meeting place for two or more different groups of users over the Internet, examples of which include search engines, online marketplaces, the collaborative or sharing economy, and social networks’. What follows is an incomplete summary of the findings in the report, with a focus on the issues related to fundamental rights of platform users (e.g. privacy), the role of algorithms and user consent, which are most closely related to our work at CaSMa.
In an effort to explore if current regulations are ‘fit-for-purpose’ when it comes to online platforms, most of the effort of the inquiry was naturally dedicated to issues where online platforms are potentially causing problems. In order to put the bigger picture somewhat in perspective, the start of the report sets aside a couple of pages to highlight some of the benefits that online platforms bring to business and to consumers. One example for business was the contribution by Experian that pointed out that “Platforms provide SMEs as well as large companies [with] a distribution channel, and in many ways can help level the competitive playing field between the two, ensuring small companies can get the same exposure to potential customers as the larger companies.” For consumers the German Monopolies Commission provided the example that online platforms that facilitate transactions offer “a large number of advantages for consumers, such as greater market transparency, a broader selection of products, overcoming confidence problems when shopping on the Internet, a reduction of transaction costs, as well as the ability to engage in cross-border transactions.” Beyond the economic sphere, online platforms like social media also play an important role in enabling new forms of communication, political activism and self-expression.
Despite these generally beneficial effects of online platforms, however, there were also many aspects that gave cause for concern. Some of the more high-profile cases include:
- Concerns about privacy in the wake of the revelations from Edward Snowden that US security services were scrutinising non-US citizens’ personal data held in the US, and that such data were being provided by Apple, Facebook and Google, among others. This led to the ruling by the Court of Justice of the European Union on 6 October 2015, shortly after the launch of this inquiry, against the EU-US ‘Safe Harbour’ agreement, which had previously provided the basis upon which personal data could be transferred between the EU and the US, on the grounds that the agreement did not protect EU citizens’ fundamental rights.
- Protests across Europe against the on-demand transport platform Uber on the assertion that Uber is using its ‘sharing economy’ type platform business model as a means to implement unfair competition practices by avoiding the need for expensive taxi licenses and side-stepping employment regulations.
- The ongoing antitrust dispute between the EU and Google over alleged anti-competitive practices whereby Google’s shopping service was given artificially higher rankings in search results than rival services. The dispute has been ongoing since late 2010, and the European competition commission recently decided to expand it to include charges concerning abuse of power with regard to the Android mobile phone operating system.
Market Power and Network Effects
The Commission said: “The market power of some online platforms potentially raises concerns, particularly in relation to the most powerful platforms whose importance for other market participants is becoming increasingly critical”. It noted that as a result of their market power “some platforms can control access to online markets and can exercise significant influence over how various players in the market are remunerated.”
One of the oft-discussed features that can determine which platform dominates a specific niche is network effects, which in the case of online platforms can lead to “exponential growth.” Mr Chisholm explained that network effects could lead to ‘tipping’, “whereby more and more people use [a platform] until it seems almost pointless to use any other platform because there is so much value in that … If, for example, your ability as a seller to be able to reach very large numbers of potential purchasers is so great on one platform …, why would you consider other platforms?”
Professors Ezrachi and Stucke said that search engines’ use of data allowed them to harness new types of direct network effect: “the more consumers who use the search engine and the more searches they run, the more trials the search engine has in predicting consumer preferences, the more feedback the search engine receives of any errors, and the quicker the search engine can respond with recalibrating its offerings. Naturally, the quality improvement attracts additional consumers to that search engine compared to competitor sites.” They also suggested that the “scope of data” collected about individual users’ preferences through the variety of “e-mail, geo-location data, social network and browser history” allowed them to better harness indirect network effects through the “targeting of users with specific sponsored ads”.
Professor Rodden agreed that it was “now common for a single provider to dominate a service sector (Facebook for social networks, Google for search)”.
Mr French said that some online platforms’ market shares across multiple markets meant that they represented a high share of all online activity, not just specific sectors: “When some Google services went offline in August 2013 for between 1 and 5 minutes, global Internet traffic shrunk by 40%.” ‘Switching costs’ are the barriers that platforms’ users may face when seeking to switch to another equivalent platform. Professor Gawer clarified that “it is not the network effect per se that may harm consumers”, but that consumers were harmed when switching costs were very high and they became “stuck with one provider and a lack of choice”. Consumers face particularly high switching costs with social networks because these display strong direct network effects. Dr Koene described the phenomenon: “Anecdotally, many people who would like to quit Facebook and move to a different platform ultimately continue to use Facebook because that is where their peers are.”
IMPALA, the Independent Music Companies Association, said that traders were [also] likely to become reliant on a platform “when the number of visitors accessing the platform greatly surpasses that of its competitors”, adding that in such cases “the online platforms’ business model places them in a position of indispensable trading partner, ‘essential facility’ or ‘gatekeeper’.” Many large businesses said that they were reliant on Google Search in precisely this way.
Professor Evans told us that a key challenge for emerging online platforms was getting a critical mass of users on both sides, so that ‘ignition’ [of network effects] could occur.
e-Conomics said that switching costs, lack of interoperability and lock-in due to the presence of rating systems “may strengthen the market power of the platform by raising entry barriers for competitors”.
Dr Anna Plodowski said that online platforms transformed “first-mover advantage into network-effect business models that lock-out the entrance of later competitors.”
The Association of Authors’ Agents said: “the rapid development and business models of early entrants into the market has led to monopolistic situations, creating an inherent danger whereby an individual marketplace becomes the main market stall, jeopardising healthy competition and controlling access to the consumer.” Getty Images agreed: “the adoption of one platform or technology may make switching to another more difficult … increasing barriers to entry for later players … This may mean that one player captures a market and then entrenches itself, with customers being denied the benefits of innovation over time.”
Professors Ezrachi and Stucke concluded that “network effects, absence of outside options, high switching costs and locked-in customers, may all give rise to market power at lower levels than in traditional markets.”
Dr Richard Hill, of the Association for Proper Internet Governance, summarized that “because of the economies of scale and network effects … online platforms have a tendency to be natural monopolies”.
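The tipping dynamic described in this section can be sketched with a toy simulation (a minimal illustration of the mechanism, not a model from the report): assume the value a newcomer gets from a platform grows superlinearly with its installed base, so each arriving user joins the larger platform with disproportionate probability.

```python
import random

def simulate_tipping(steps=10_000, seed=42):
    """Toy model of direct network effects: each arriving user joins
    one of two platforms with probability proportional to the *square*
    of its installed base (value grows superlinearly with users).
    Returns platform A's final market share."""
    random.seed(seed)
    users = [1, 1]  # both platforms start with a single user
    for _ in range(steps):
        weight_a = users[0] ** 2
        weight_b = users[1] ** 2
        if random.random() < weight_a / (weight_a + weight_b):
            users[0] += 1
        else:
            users[1] += 1
    return users[0] / sum(users)
```

Run this with different seeds and the final share almost always ends up near 0 or 1: whichever platform happens to gain an early lead ‘tips’ the market, mirroring Mr Chisholm’s description of why it becomes “almost pointless to use any other platform”.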
Professor Richard N. Langlois said that “Platform services have many of the characteristics of old-fashioned public utilities”, adding that some people found it “tempting to regulate platforms as if they were public utilities, controlling rates and terms of access”. Dr Jerry Ellig noted that “Various commentators have argued that some type of sharing or openness regulation is appropriate for Facebook, Google, eBay, Twitter, and Amazon because network externalities make them natural monopolies or close to it.”
In contrast to some public utilities, however, market power in online platforms is secured through innovation that succeeds in harnessing network effects, the investment in infrastructure required to enter these markets is lower, the risk of disruptive innovation is substantially higher, and ‘competition for the market’ may create competitive pressure even when one firm is dominant. Professor Richard N. Langlois explained that firms did not compete “just, or even primarily, within existing market structures but also to change markets’ structures”. This meant “completely redefining products and relationships with customers: in short … innovation.” Daniel Gordon, Senior Director of Markets at the Competition and Markets Authority, argued that it was “the competition to replace that is the dynamic incentive.” He referred to this as the “competition for the market” as opposed to “competition within the market”.
Nonetheless, the potential for dominant positions to emerge means that competition authorities must be vigilant in these markets, to ensure that market power is not abused. Protecting users in these markets also requires that consumer rights and data protection rights are effectively enforced.
Are competition agencies and competition law able to address concerns related to abuse of market power?
[The next part of the report focuses on Competition Law, which is less closely related to our work as CaSMa, so I summarize this section very briefly based on the section conclusions]
The increasing use of restrictive pricing practices by online platforms requires critical scrutiny by competition agencies. While some restraints may be justified to enable price comparison websites to work, these clauses may also, especially when broadly designed, enable firms to exploit suppliers and exclude competitors. A case by case analysis by competition authorities is therefore necessary.
Asymmetries of bargaining power, although not unique to online platforms and their trading partners, are nevertheless extremely pronounced in these markets: many large online platforms offer access to global markets to a dispersed multitude of much smaller businesses. This asymmetry gives rise to a variety of concerns about how online platforms use their bargaining power, including the imposition of unfair terms and conditions on their trading partners.
The dependency of some suppliers on large online platforms, and their fear of commercial retaliation by those platforms, may prevent complainants from approaching competition authorities. We recommend that the Competition and Markets Authority introduce new measures to protect complainants in these markets. These should include imposing substantial penalties upon online platforms that are found to have engaged in commercial retaliation.
We recommend that the Competition and Markets Authority use its market investigation tool to ascertain whether codes of practice are appropriate to specific sectors, starting with online travel agents.
During our inquiry the European Commission launched a new ‘dispute resolution’ platform that enables EU consumers and traders to settle their disputes for both domestic and cross-border online purchases. Disputes are registered through the EU’s online dispute resolution platform, and subsequently channelled to one of the Alternative Dispute Resolution bodies that connect to the platform. At present, around 117 such bodies from 17 Member States are connected to the platform: they can take the form of arbitration, mediation, ombudsmen, and complaints boards. The legal basis for the establishment of the platform is Regulation 524/2013 on Online Dispute Resolution for Consumer Disputes.
Professors Ezrachi and Stucke [said] that “competition authorities are sensitive to vertical integration by a dominant platform operator”, but added that it could be a problem with online platforms in particular, because platforms were able to “inhibit rivals on its platform or give preference to its own programs or services … to the detriment of rival sellers (and contrary to consumers’ wishes).” Charly Berthet, from the French Digital Council, agreed that: “When an online platform is vertically integrated, it might restrain competition by decreasing the feasibility of the offers of its competitors to the benefit of its own offers.”
The Booksellers Association told us that Amazon used data gathered from its Marketplace sellers to give itself a competitive advantage on its e-commerce website. The Booksellers Association and Association of Authors Agents also told us that Amazon had inhibited the interoperability between the Amazon Kindle and non-Amazon eBook formats, vertically leveraging its dominance in e-readers into an adjacent market by requiring publishers to use its proprietary Kindle e-book publishing format. DG Competition is currently investigating these claims.
[Due to the many submissions to the inquiry that discussed the concerns about vertical integration and leveraging concerning Google Search, at this point there follows a brief summary of the antitrust investigation DG Competition opened against Google in 2010]
Ms Jameson, from Skyscanner, said: “Google is so dominant that it is effectively the infrastructure of the market that we all operate in.”
Google’s search engine shows how the tendencies to concentration in these markets may result in a successful innovator becoming the main provider of a particular service. Google Search has become a gateway through which a large proportion of the world accesses information on the Internet, which many businesses consequently depend on in order to be visible and to compete online.
The Google case illustrates the way in which a platform may use a strong position in one sector (in this case, general search) to integrate a range of other services into its core offering, thereby entering into direct competition with the trading partners on its platform. Such integration can offer consumers benefits, such as increased convenience; however, it can also exclude competitors and harm consumers if they are not directed to the best service or if innovation is reduced.
Whether individual examples should be deemed an abuse must be ascertained through rigorous case by case analysis.
Mergers and acquisitions
We are concerned that mergers and acquisitions between large online platforms and less established digital tech businesses may escape scrutiny by competition authorities because the target company generates little or no revenue and so falls below the turnover threshold adopted by the European Commission’s Merger Regulation. Such transactions may be investigated in the UK under a “share of supply test”, but there is little consistency between Member States. This seems inconsistent with the principles of the single market.
We recommend that the Commission amend the Merger Regulation to include additional thresholds to take account of this dynamic, examples of which might include the price paid for the target or a version of the ‘share of supply’ test used in the UK.
Data and competition law
Professors Ezrachi and Stucke highlighted the OECD’s finding that big data was a “core economic asset”, which could create a “significant competitive advantage”. They also said that firms were increasingly turning to mergers to acquire a “data advantage” over rivals, noting that “according to one estimate, the number of Big Data-related mergers doubled between 2008 and 2013—from 55 to 134.”
The Professors cautioned against the assumption that Big Data was inherently anti-competitive, […] and said that data analysis could provide firms “with insights on how to use resources more efficiently and to outmanoeuvre dominant incumbents.” However, they also said that businesses had “strong incentives to limit their competitors’ access to these datasets, prevent others from sharing the datasets, and be adverse to data-portability policies that threaten their data-related competitive advantage.”
The Information Commissioner’s Office said: “it is fairly safe to deduce that very large amounts of data are being collected by the major online technology companies.”
The Competition and Markets Authority (CMA) wrote: “Data is often central to the activities of platforms, since so many of them are involved in matching disparate parties: if the platform does not know anything about the parties they are matching, they often cannot add value.”
Dr Weck said that data-driven insights could potentially blur the line between innovative and anti-competitive behaviour. He said: “The company may find out, “The markets I am in will develop in a certain direction. If I want to block arising competition, I have to expand into this or that market”, just based on the data the company has access to.” He asked: “Is this just innovative behaviour, because the company is following market developments and creating new products, or is it not really foreclosure, based on data access?”
Professor Rodden suggested that the opacity of decisions made by data-driven algorithms created problems of accountability: “Due to the large number of parameters that are used by the algorithms, even the engineers who constructed the system are often not able to explain why the algorithms made specific decisions”. Dr Koene said that one consequence of this lack of transparency was “that platform providers may not be able to guarantee that they are compliant with regulations.” Professor Rodden concluded that this lack of transparency “offers the potential for abusive manipulation”. He said that “the topic of data driven algorithms and their interpretability is currently in the realm of research, so through Research Councils and initiatives such as the Alan Turing Institute, this should be made a priority for “big data” related research funding.”
Professors Ezrachi and Stucke said that widespread use of sophisticated algorithms could result in ‘tacit collusion’, in which rival firms effectively coordinated strategies to reduce competition. They said: “Collusion may be facilitated when the firm programmes an algorithm, among other things, to monitor price changes and swiftly react to any competitor’s pricing.” They said that “industry-wide use of such pricing algorithms” was “likely to push markets which were just outside the realm of tacit collusion into interdependence” and to “support conscious parallelism”.
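A hypothetical sketch of the dynamic Professors Ezrachi and Stucke describe: if every firm runs a rule that instantly matches any rival price change (subject to a cost floor), prices lock together within a single round, so undercutting yields no lasting advantage and parallel pricing emerges without any explicit agreement. All names and numbers here are illustrative, not taken from the report.

```python
def match_rival(rival_price, cost_floor):
    """Monitor the rival's price and match it immediately,
    but never price below our own cost floor."""
    return max(cost_floor, rival_price)

def simulate_parallel_pricing(p_a, p_b, floor_a, floor_b, rounds=20):
    """Two firms each running the same matching rule.
    Returns the sequence of (price_a, price_b) pairs per round."""
    history = []
    for _ in range(rounds):
        p_a = match_rival(p_b, floor_a)  # A reacts to B's last price
        p_b = match_rival(p_a, floor_b)  # B reacts within the same period
        history.append((p_a, p_b))
    return history
```

Starting from different prices (say 10.0 and 8.0, with cost floors of 5.0), both firms converge to an identical price after one round and stay there: a simple instance of the “conscious parallelism” the quote refers to, reached by algorithm design alone.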
Professors Ezrachi and Stucke identified a number of possible data-driven abuses specific to search engines, which involved degrading the quality of the service for users in order to increase revenues from advertisers on the other side of the platform. They said: “a search engine, to incentivise users to click on sponsored advertisements or the results of its affiliated business, can promote, and rank higher, its sponsored results and provide fewer, and rank lower, its more relevant organic results.” They also described a “‘hold-up’ scenario” whereby “the search engine could lower the ranking of potential advertisers appearing in the organic search results to pressure the businesses to advertise with the search engine, namely to bid for keywords to get the attention of viewers who do not scroll down the list of search results.” In this way, Professor Ezrachi said, search engines “can actually degrade quality to some extent, because when they have to choose between the free side and the paid side—the side where they make the revenues from advertisements—their loyalty, or their interest, obviously lies with that side.”
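The quality-degradation mechanism described above can be illustrated with a toy ranking function (purely illustrative; it is not how any real search engine scores results): adding a tunable bonus for sponsored items lets the operator trade relevance on the free side against revenue on the paid side.

```python
def rank_results(results, sponsor_weight):
    """Order results by relevance plus a bonus for sponsored items.
    With sponsor_weight = 0 the ranking is purely organic; raising it
    pushes less relevant sponsored results above organic ones."""
    def score(r):
        return r["relevance"] + (sponsor_weight if r["sponsored"] else 0.0)
    return sorted(results, key=score, reverse=True)
```

For example, an organic result with relevance 0.9 outranks a sponsored one with relevance 0.4 when the weight is zero, but the order flips once the sponsorship bonus exceeds the relevance gap, which is exactly the “promote, and rank higher, its sponsored results” behaviour the Professors warn about.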
Degrading privacy standards as an abuse of dominance
Professors Ezrachi and Stucke said that platforms could “degrade other dimensions of quality, such as collecting more personal data and providing less privacy protection for the data, than consumers would otherwise prefer”.
Dr Weck said: “If consumers do not know how their data are used, if consumer rights are not respected and content provider rights are not respected, perhaps because the platform is so powerful that it does not need to heed those rights, then that harm to consumers is a competition related problem”, and added that this could amount to “an abuse of market power”.
Indeed, during the course of this inquiry Germany’s Bundeskartellamt opened an investigation against Facebook, stating that: “Facebook’s use of unlawful terms and conditions could represent an abusive imposition of unfair conditions on users.”
e-Conomics suggested that, despite competition law’s inability to adequately deal with questions of data protection, the centrality of data to these markets meant that competition authorities needed to integrate [the role of data] more fully into their analyses. They said that competition authorities should consider “whether data market(s) exist and can be defined for the purpose of competition law (because data is traded)” and “whether dominance is possible on such market.” In relation to investigations of possible abuses of dominance, they said that “access to data, possession of datasets and/or processing capabilities should be considered when assessing market power.”
Data is integral to the operation of many online platforms and the benefits they provide. For this reason, exclusive access to multiple sources of user data may confer an unmatchable advantage on individual online platforms, making it difficult for rival platforms to compete.
As well as providing new benefits, rapid developments in data collection and data analytics have created the potential for new welfare reducing and anti-competitive behaviours by online platforms, including subtle degradations of quality, acquiring datasets to exclude potential competitors, and new forms of collusion. While some of these abuses are hypothetical, they raise questions as to the adequacy of existing competition enforcement tools.
We recommend that the European Commission co-ordinate further research regarding the effects that algorithms have on the accountability of online platforms and the implications of this for enforcement. We also recommend that the Commission co-ordinate further research which investigates the extent to which data markets can be defined and dominant positions identified in these markets.
Adequacy of competition law and speed of enforcement
The sheer diversity of online platforms and the complexity of their business models raise obvious challenges for competition authorities. The lack of price signals on the consumer side, and the presence of multiple prices in multi-sided markets, pose problems for standard antitrust analysis. Quality is a key parameter of competition in these markets, but is not easily measured.
While these challenges are significant, we note that the flexible, principle-based framework of competition law, which can be customised to individual cases, is uniquely well-suited to dealing with the subtlety, complexity and variety of possible abuses that may arise in these markets. We cannot see how a less flexible regulatory approach could be more effective.
In order to speed up enforcement, Professor Zimmer said that the German Monopolies Commission had encouraged the European Commission to apply interim measures, which require a firm to amend any allegedly anti-competitive conduct pending the outcome of an investigation. Professor Zimmer explained that interim measures would enable competition authorities to say, “We order now and for the time of this proceeding that the firm has to refrain from certain behaviour”. He said that interim measures were therefore helpful when markets were anticipated to change quickly – within two years, he suggested – in order to ensure that the harm did not become permanent before the case concluded.
Competition law is perceived as being too slow to react to rapidly evolving digital markets. While the length of time taken to arrive at a decision in the Google case reflects its importance, it also highlights a wider problem. In such fast-moving markets a competitor harmed by anti-competitive conduct may suffer irreversible damage long before a competition case concludes. This undermines public confidence in the ability of regulators to hold large online platforms to account and may create political pressure for legislators to regulate unnecessarily.
We recommend that DG Competition make greater use of interim measures by lowering the threshold for their use, bringing it into line with that of the UK Competition and Markets Authority.
We recommend that the Competition and Markets Authority and DG Competition consider introducing time limits in commitment proceedings. Restricting the period for discussion of commitments should encourage parties to offer serious proposals at the outset and prevent them from delaying the process.
Data Protection law and online platforms
Professor Rodden said: “For many online platforms the default business model has become the ‘freemium’ / free to use model that is supported by advertising revenue.”
David Alexander, Chief Executive of MyDex, said that it was possible to put a value on users’ data: “If you are trying to value a tech start-up, the value of data is calculated to a very fine precision—$720 per person, per year is Google’s estimate when they are talking to investors over time.”
The Information Commissioner’s Office noted that the “collection and use of personal data is becoming more central to the business model of the online platforms, particularly to drive personalisation and tailored services, also linked to more sophisticated behavioural advertising.” Adam Cohen, from Google, provided confirmation: “our annual turnover last year, 2014, was $66 billion. We derived 89 per cent of that income from advertising.”
BEUC told us that “The misuse of personal data is perhaps the main source of concerns for consumers using platforms, particularly social networks. This is confirmed by recent data showing that 70% of EU consumers are worried about how their data is being collected and processed.”
The process through which online platforms use consumers’ personal data to generate revenues from advertising is complex and opaque, contributing to low consumer trust in online platforms. The Commission said: “only 22% of individuals have full trust in service providers such as search engines, social networking sites and e-mail services”. Citizens Advice told the Committee of “general unease” among consumers about how their personal data are collected and used online: “A recent survey of consumers … found that 69% describe the way companies use their data as ‘creepy’”.
Skyscanner described online platforms as collecting personal data in two main ways, either “actively” or “passively”. An example of the latter “would be where a user passively provides data via their web browser (e.g., their IP address) or the cookies that are placed on their device by the online platform, or through their incidental use of the online platform (i.e. what sections of the website did they access, at which point did they exit the website).”
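Skyscanner’s ‘passive’ category can be made concrete with a sketch of what a platform’s server receives on every page request, without the user actively entering anything (field names and structure are hypothetical; real logging pipelines differ):

```python
def passive_profile(client_ip, path, headers):
    """Assemble the data points a platform observes passively on a
    single page request (illustrative field names)."""
    return {
        "ip": client_ip,                          # rough geolocation
        "path": path,                             # which section was viewed
        "user_agent": headers.get("User-Agent"),  # device and browser
        "cookie": headers.get("Cookie"),          # cross-visit identifier
        "referer": headers.get("Referer"),        # where the user came from
    }
```

Logged across many requests and joined on the cookie identifier, records like this are what allow the “(anonymised) user profiles” the Commission refers to in the next paragraph to be assembled without any active input from the user.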
The Commission recognised that passively collected data “may also reveal information on websites visited and on users’ interests on the Internet”. Such data may be “combined to form (anonymised) user profiles.”
Professor Ezrachi told the Committee that an application for smartphones called Brightest Flashlight provided users with a flashlight app for free, but said that what many “did not know was that that application was tracking your location all the time, even when you were not using it, and that information was being sold to third parties as part of harvesting.” Professor Tom Rodden, Director of RCUK Horizon Digital Economy Hub, said that Facebook collected personal data through location tracking, which “was recently blamed as [the] possible cause for large power drain in iPhones.”
Michael Ross, from Dynamic Action, said when a consumer visited a retail website, the retailer would be “building profiles”, based on “what you click on, how you behave and what marketing you are looking at.” These profiles helped them “work out how they can target you with better offers and build a range [of services] that is more attractive to you.” He described that as “a fantastic thing”, because it helped retail businesses remain competitive. Steve Chester, from the Internet Advertising Bureau, said: “The advertisers will not be able to see who that person is or any personal details, but they are actually looking at passion points and interests and being able to sell advertising based on interest levels.”
However, some online platforms also sell the personal data they collect to third parties. Demos and Ipsos MORI said that in a qualitative survey of about 1,250 people, “while the majority of respondents were aware that advertising is targeted using their social media data (57% said this currently happen) … six in ten (60%) respondents felt that social media data should not be shared with third parties as happens currently under existing terms and conditions of social media sites.”
Dr Lynskey concluded that “Individuals are not data brokers”, and could not be expected to understand “the multitude of daily transactions which take place online.”
Competition on the basis of privacy
Consumers’ lack of awareness of how their personal data are collected and used means that there is limited competition between online platforms on the basis of privacy standards. The CMA said: “While, in theory, consumers should be able to discipline providers over the level of privacy or the extent to which data may be used … in practice, consumers may find it difficult because of a lack of awareness that data may be used for this purpose and/or the value of the data to the platforms.”
For some, this lack of competition on the basis of privacy standards reflected the market power of some online platforms. The Information Commissioner’s Office was concerned about “how free people are to offer consent to use a market-dominant search engine, for example. Nobody has to use search engines or social media services, but in reality they are [the] first port of call for many who want to access or share Internet content.” Dr Lynskey agreed that “In reality, most content and services offered by online platforms are offered on a ‘take it or leave it’ basis.”
The Information Commissioner’s Office [said]: “Platforms must find more effective means of explaining their complex information systems to ‘ordinary’ service users. This is important as transparency opens the way to the exercise of individuals’ rights, and choice and control over their personal data.” The CMA noted that “pressure on consumers is only set to increase. Developments such as the Internet of Things – like online devices we wear or carry and devices in the home or in our cars – will mean that data is collected and shared on a regular basis without the consumer having to make a conscious decision.”
General Data Protection Regulation (GDPR)
The General Data Protection Regulation (GDPR), which was agreed on 15 December 2015, will substantially change how the collection and processing of personal data is regulated in the EU.
Nonetheless, given the limitations of the consent-based model and industry’s reluctance to make consent more meaningful, we are concerned that the provisions that widen the definition of ‘personal data’ will be difficult to apply in practice. We recommend the Commission investigate how the requirement for all business to seek consent for the collection of personal data through online identifiers, device identifiers, cookie IDs and IP addresses can be applied to online platforms in a practical and risk-based way.
Citizens Advice said “approximately only a third of consumers report that they read terms and conditions”, but that “actually people are likely to be over-claiming”—according to the evidence of “actual time spent reading terms and conditions … the figure appears closer to 1%.” The German Monopolies Commission confirmed that “the collection of personal data without users’ explicit consent is likely to be not the exception, but in fact the rule.”
One problem with privacy notices is their length. Steve Wood, from the ICO, described many privacy notices as being “longer than Hamlet”, while Professor Rodden said they were “as long as Othello” and Mr Alexander said they were “longer than the Declaration of Independence”.
They are, though, much less readable. Professor Rodden highlighted research undertaken by Research Councils UK showing that the language of privacy notices was “overly complex and difficult to read” and that they were “written to be understood and used in [a] US court rather than by ordinary consumers.” A Eurobarometer survey found that, of those who did not fully read privacy statements, 67 per cent found them too long, while 38 per cent found them unclear or difficult to understand.
Accessible privacy notices
The ICO, the European Data Protection Supervisor and the CMA all said that online platforms had to improve the transparency of their privacy notices.
The Minister of State for Culture and the Digital Economy, the Rt. Hon. Ed Vaizey MP, agreed: “You get these very complex terms and conditions. I signed up to some this morning, to an unnamed provider, on my tablet in order to update my software—I do not have a clue what I signed up to. People have to be told, partly by government and partly by consumer rights organisations”.
Mr Buttarelli told the Committee that the GDPR would ensure that “the quality of notices in the new framework will be verifiable by regulators”, who would be able to object to unclear notices. The Information Commissioner’s Office also said the “GDPR will open the possibility of stronger sanctions for the breach of the transparency provisions.” The GDPR provides for a maximum fine of €20 million or 4 per cent of annual turnover in cases where an online platform fails to obtain explicit consent.
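The “€20 million or 4 per cent” cap is a greater-of rule, which can be illustrated with a short calculation (a sketch only; the two figures come from the GDPR provision above, while the example turnover values are invented):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound on a GDPR fine for the most serious breaches:
    the greater of EUR 20 million or 4% of annual turnover."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a large platform with EUR 10bn turnover, the 4% figure dominates.
print(max_gdpr_fine(10_000_000_000))  # 400000000.0
# For a small firm with EUR 5m turnover, the EUR 20m floor applies.
print(max_gdpr_fine(5_000_000))       # 20000000.0
```

The greater-of structure is what makes the sanction bite for the largest platforms, for whom a flat €20 million cap would be negligible.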
In order to address concerns about the length and accessibility of privacy notices, Professor Rodden recommended that privacy notices should be “supported by kite-marks”, to identify online platforms meeting EU standards on the handling and processing of personal data. Kite marks would provide a visual symbol for consumers to quickly understand the implications of any agreement they may make regarding data protection when engaging with an online platform. Kite marks have also been recommended by the House of Commons Science and Technology Committee in its report on Responsible Use of Data. In order to create an incentive to foster competition, rather than just compliance, on the basis of privacy standards, such kite marks should include a graded scale indicating levels of data protection, similar to the traffic light system used in labelling for food products.
The Information Commissioner’s Office said that the GDPR made provisions for Data Protection Authorities to support privacy seal schemes or stamps of approval to demonstrate good privacy practices “as a way of demonstrating data protection compliance”, and that the Commissioner was “developing a privacy seal programme that will enable data controllers to apply for a seal.” They said that this would work by allowing third party scheme operators to apply to the Information Commissioner for an endorsement that would enable them to use the seal. The Commissioner launched a call for applications in 2015 and expects the first scheme to be formally launched sometime in 2016.
In order to encourage competition on privacy standards, not just compliance with the law, we recommend that the Government and the Information Commissioner’s Office work with the European Commission to develop a kite mark or privacy seal that incorporates a graded scale or traffic light system, similar to that used in food labelling, which can be used on all websites and applications that collect and process the personal data of EU citizens.
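In its simplest form, the graded scale recommended here would map an assessed data-protection score onto the familiar food-label colours. The following sketch is purely illustrative: the 0–100 score and the thresholds are hypothetical, not drawn from the report or any existing scheme:

```python
def privacy_traffic_light(score: int) -> str:
    """Map a hypothetical 0-100 data-protection audit score
    onto a food-label-style traffic light."""
    if not 0 <= score <= 100:
        raise ValueError("score must be in 0..100")
    if score >= 75:
        return "green"   # strong data-protection practices
    if score >= 40:
        return "amber"   # partial compliance
    return "red"         # weak data protection
```

A graded label of this kind, unlike a pass/fail seal, gives platforms a reason to compete upwards on privacy rather than merely clearing the compliance bar.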
Notifying users of abuse
To discourage abuse of users’ personal data, we recommend that the European Commission take action to require all online platforms that are found to have breached EU data protection standards, or to have breached competition law by degrading privacy standards, to communicate this information clearly and directly to their users within the EU through a notice that is prominently displayed on the home screens of their desktop and mobile applications.
Improving control over personal data
The Commission said the GDPR would “equip individuals with a new set of rights fit for the digital age, such as the ‘right to be forgotten’, the right to data portability and the right to be notified when the security of personal data is breached.” These provisions respond to demands from consumers to have more control over their personal data. Citizens Advice said previous research from Demos in 2012 found that 70 per cent of consumers “would be more willing to share data if they had the ability to withdraw it and see what data was held on them.” In its report, ‘The commercial use of consumer data’, the CMA recommended more “control over how the data is used subsequently—so that consumers can manage the data they are sharing and choose how much, if any, data to share.”
Mr Alexander told us that data portability was particularly important in order to drive competition and enable consumers to switch to different providers: “if individuals themselves do not have the ability to move those around to different platforms, so that they can apply and share their browsing experience or their purchasing history with other platforms—it makes it incredibly hard for them to find new service providers.”
Mr Alexander [said] that data portability was “a minefield, particularly with things like Facebook’s download, where you are downloading posts and comments made by other people”.
Data portability could be one of the most significant changes brought in under the General Data Protection Regulation. It could promote quality-based competition and innovation by making it easier for consumers to switch between platforms for various services.
However, we are concerned that the principle of data portability may unravel in practice. If applied too rigidly, it could place onerous technological interoperability obligations on emerging businesses; if defined too loosely, it is of limited use. We recommend that the Commission match data portability requirements to the kinds of services provided by different businesses and online platforms, adopting a proportionate approach depending on the essentiality of the service in question. We recommend the Commission publish guidelines within the next year explaining how data portability requirements will apply to different types of online platforms.
Experiments using personal data on social networks
Richard French, Legal Director at the Digital Catapult, noted that online platforms “carry out research using personal data into the effects of their services on individuals’ behaviours and habits”. In so doing, and regardless of the impact upon consumers, “the online platform has total autonomy over the purposes and means and no obligation of transparency.”
Joe McNamee, from European Digital Rights (EDRi) referred to the experiment Facebook conducted for one week in 2012, which altered users’ news feeds to see how this affected their mood updates. Mr McNamee told us that “Facebook did this on the basis of a phrase in its 9,000-plus-word terms of service that states the company can use the data for research purposes.” Dr Ansgar Koene mentioned another Facebook experiment, during the 2012 US presidential election, “which showed that people who had been notified when their friends mentioned that they’d just voted were significantly more likely to have also voted during the election.”
The use of personal data as the basis of research, particularly on social media, goes beyond what most users would ordinarily expect or consider acceptable. We recommend that the Government and Information Commissioner’s Office publish guidelines in the next 12 months setting out best practice for research using personal data gathered through social media platforms.
Implementing the General Data Protection Regulation
In the past, US-based platforms were able to circumvent European data protection rules. This resulted in a weak data protection regime in which European citizens’ fundamental rights were breached, and in consumers losing trust in how online platforms collect and process their personal data. We are therefore concerned that industry remains sceptical about the forthcoming General Data Protection Regulation. Online platforms must accept that the Regulation will apply to them and will be enforced, and prepare to make the necessary adaptations.
Consumer protection and online platforms
The growth of online platforms and the collaborative economy raise important questions about the definitions of a “consumer” and a “trader” which form the cornerstone of consumer protection law. This creates uncertainty about the liability of online platforms and their users in instances where consumer protection concerns may arise.
The Commission should also publish guidance about the liability of online platforms on consumer protection issues in relation to their users, including their trading partners.
We also recommend that online platforms clearly inform consumers that their protection under consumer protection law is reduced when purchasing a good or service from an individual, as opposed to a registered trader.
Transparency in search results
Which? agreed that there were concerns over whether “the basis upon which … search results were generated is clear to the consumer”. It was important for consumers to know whether search results were “influenced by promotional spending on the part of sellers (to make their offer more prominent)”, or “affected by information about the consumer in ways that the consumer would not reasonably have been able to expect.”
The hotel chain that submitted evidence anonymously was concerned about a mismatch between how online travel agents (OTAs) presented search results and consumers’ expectations. They said that 82% of consumers used OTAs in order “to get the lowest price”. However, they noted that OTAs “do not sort hotel search results by price by default”. Instead, the hotel chain said that “hotels are told the more commission they pay, the higher they will appear in the sort results … the sort order … is entirely shaped by commercial factors”. Finally, they noted that these factors were “not made clear to the consumer”, and that the consumer “will rarely alter the default search on a website (ie, from the ‘our favourites’ or ‘recommended’ option)”.
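The mismatch the hotel chain describes can be made concrete with a toy ranking: a “recommended” default ordered by commission paid, against the price ordering consumers say they expect. Everything here is invented for illustration; it is not how any particular OTA computes its results:

```python
hotels = [
    {"name": "Hotel A", "price": 120, "commission_pct": 25},
    {"name": "Hotel B", "price": 90,  "commission_pct": 15},
    {"name": "Hotel C", "price": 150, "commission_pct": 30},
]

# The default 'recommended' sort: highest commission first.
by_commission = sorted(hotels, key=lambda h: -h["commission_pct"])

# What most consumers say they are looking for: lowest price first.
by_price = sorted(hotels, key=lambda h: h["price"])

print([h["name"] for h in by_commission])  # ['Hotel C', 'Hotel A', 'Hotel B']
print([h["name"] for h in by_price])       # ['Hotel B', 'Hotel A', 'Hotel C']
```

Even on three hotels the two orderings differ completely, which is why an undisclosed commission-driven default can mislead a consumer who assumes the list reflects price or quality.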
Skyscanner also recommended that price comparison platforms disclose “their business structure in order for consumers to fully understand the company that they are dealing with and the method of remuneration of an online platform.” Dr Anna Plodowski recommended that: “A clear graphic should be created for each digital platform to show the network of relationships it mediates in its business model, and [be] displayed on an easily accessible and explicitly named page on the website.”
Disclosing basis of algorithms to improve transparency
Professors Broughton and Tambini were cautious about forcing online platforms to disclose the workings of their algorithms because it raised numerous practical difficulties. They said: “Google estimated it carries out up to 20,000 experiments of changes in its search algorithms with 585 launching permanently. Would platforms be required to update regulators each time one of these changes was made?” They also raised concerns about how policymakers would “gain and maintain the technical literacy to understand the content and implications of often very complex algorithms and computer software”. They also highlighted the “commercial sensitivities” of disclosing information relating to algorithms which could be considered to be the intellectual property of online platforms and the risk that disclosing this information could lead to the “gaming” of algorithms.
As an alternative, witnesses proposed that online platforms be transparent about the aims and intentions of the algorithms powering their search results. Mr Alexander said it was important that there was sufficient transparency so that regulators could “audit algorithms for delivering the outcome they were intended to deliver”. He said this required greater transparency regarding the types of data used by the algorithm (“input parameters”) and regarding the “corporate objective of the algorithm”.
We find that concerns about the lack of transparency in how search and meta-search results are presented to consumers are well founded, especially in relation to price comparison websites, where the results of a search may be linked to a commercial deal between the website and a business, rather than to the best possible price.
However, we do not believe that this problem should be addressed by requiring online platforms to disclose their algorithms, which are rightfully considered to be their intellectual property. Instead, we believe that these concerns should be addressed through increased transparency.
We recommend that the Commission amend the Unfair Commercial Practices Directive so that online platforms that rank information and provide search and specialised results are required to be explicit on their website about the basis upon which they rank search results. We also recommend that the Commission amend the Directive to require online platforms to provide a clear explanation of their business models and relationships with suppliers, prominently displayed on their websites.
Which? raised a further concern, around so-called ‘personalised pricing’, whereby online platforms use “information provided by or revealed by the consumer” to determine prices. Professor Eric Clemons and the German Monopolies Commission also expressed concerns that online platforms’ use of personalised pricing was not transparent to the consumer.
Professor Ezrachi outlined how online platforms could use personal data to personalise pricing in a particularly effective way: “While perfect price discrimination may be unattainable, ‘almost perfect’ price discrimination may be within reach for dominant online platforms.” The practice worked in the following way: “If you are likely to spend more, you will just have to pay the display price, but if they know that you have some reservations—if your history, the data that were gathered on you, indicates not … you will immediately also get a coupon.” He believed the practice warranted greater attention, as it was likely to lead to “the transfer of wealth from the pockets of consumers to the pockets of operators”.
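Professor Ezrachi’s coupon mechanism amounts to simple conditional pricing: the display price for users predicted to buy anyway, and an automatic discount for those whose history signals hesitation. The sketch below is hypothetical; the likelihood score, threshold and discount rate are invented to show the shape of the practice, not any platform’s actual logic:

```python
def personalised_price(display_price: float, purchase_likelihood: float) -> float:
    """Charge the display price to users predicted to buy anyway;
    issue an automatic coupon to hesitant users.
    purchase_likelihood is a hypothetical 0..1 score inferred
    from the user's browsing and purchase history."""
    if purchase_likelihood >= 0.7:          # likely to buy: no discount
        return display_price
    return round(display_price * 0.85, 2)   # hesitant: 15% coupon applied

print(personalised_price(100.0, 0.9))  # 100.0 - pays the display price
print(personalised_price(100.0, 0.3))  # 85.0  - 'immediately also gets a coupon'
```

The better the behavioural data, the closer this conditional pricing gets to the “almost perfect” price discrimination Professor Ezrachi describes, since each user can be charged near their individual willingness to pay.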
We note concerns that online platforms can and do engage in personalised pricing, using personal data about consumers to determine an individual price for a particular good or service. We find this to be another concerning example of the opacity with which some online platforms operate. We recommend that DG Competition urgently build on the work of the Office of Fair Trading and investigate the prevalence and effects of personalised pricing in these markets.
Ratings and reviews
The rating and review systems used by online platforms are instrumental in creating the trust necessary for consumers to make transactions online, and are a vital building block of the digital economy. However, to ensure transparency, we believe that all online platforms should have publicly available policies for handling negative reviews, and clearly distinguish between user reviews and paid-for promotions. We recommend that the Commission publish guidance clarifying how the Unfair Commercial Practices Directive applies to the rating and review systems used by online platforms.
How to grow European platforms
Market scale is paramount for online platforms, whose value resides in the size of the networks they can create. The fragmentation of the European market in digital goods and services substantially limits growth and acts as an incentive for businesses to shift the locus of their operations to the US, to maximise their growth potential. We therefore strongly endorse the central aim of the Digital Single Market strategy, which is to reduce regulatory fragmentation and remove barriers to cross-border trade, and urge the Commission to retain a sharp focus on this overriding purpose.
First pillar initiatives in the Digital Single Market Strategy, particularly the greater harmonisation of contract law and consumer protection, are critically important to enabling digital tech start-ups and platforms to operate without friction across borders and to fully exploit a potential market of over 500 million consumers. We recommend that the Commission and the Government pursue an ambitious degree of integration in these areas, and resist a lowest common denominator approach.
Regulating online platforms
BEUC said that the benefits of digital technologies “must not come at the expense of fundamental rights and freedoms.” Mr Chisholm said: “I absolutely accept that certain fundamental rights should be protected up front in relation to things such as privacy and data protection. We should, if you like, be able to take that for granted as the regulatory framework.”
Professors Sally Broughton and Damian Tambini advocated the creation of a regulatory regime that leaned “in the direction of regulating for the protection of individual consumers (data protection, transparency of terms etc.) and not over-regulating the arenas in which freedom of expression and creation are at stake”.
The preceding evidence in this report highlights the need for the enforcement of consumer protection law, data protection law and competition law to be sufficiently robust to protect the public interest and deter abusive behaviour.
The rapid growth of online platforms has disrupted many traditional markets. The speed at which they have emerged has also resulted in uncertainty about how existing regulation, designed in a pre-digital age, applies to these new disruptive business models. As a consequence there is a perception that large online platforms are above the law.
Witnesses emphasised that digital disruption was unlikely to end any time soon. Professor Rodden said “current trends such as the ‘Internet of Things’ and ‘Smart cities’ will further expand the influence of online platforms” and continue to cause disruption. Witnesses therefore suggested that addressing regulatory disruption needed to be an ongoing process.
We do not think highly restrictive regulation that seeks to contain disruption is the right response. Nonetheless, we acknowledge the need to protect fundamental rights by ensuring that existing regulation is effective and up to date.
We recommend that the Commission, in concert with regulators at Member State level, critically review and refit existing regulation to ensure that its application to online platforms is clear. We believe that in many cases specific guidance from the Commission could provide this clarification.
We suggest that both the Commission and the Member States consider whether providing regulators with increased resources would be a more efficient way to address concerns about enforcement than introducing additional rules.
We recommend that regulators robustly enforce against online platforms they believe to be in breach of the law. Enforcement authorities should sometimes proceed even where there is a risk of losing the case or having the outcome appealed – such outcomes help to clarify how the law applies. For this reason we welcome Commissioner Vestager’s decision to proceed with the Google case, without prejudice to the outcome.
Digital Policy Alliance said: “The Commission should resist any political pressure to achieve quick results at the expense of well formulated, clearly targeted and effective remedies.”
We recommend that the European Commission appoint an independent panel of experts tasked with identifying priority areas for policy action in the digital economy and making specific policy proposals. This panel would gather the concerns of policymakers and regulators about emerging issues in digital markets. This report demonstrates that businesses and citizens also have many concerns in these markets, and the panel would also collect and prioritise these concerns – potentially through an online portal.
While the panel would set its own agenda, on the basis of this report we suggest three initial subjects that it would consider:
- Whether enforcement agencies have the necessary powers and resources to take effective action against abuse by the largest online platforms; and how enforcement could be better co-ordinated across national and EU enforcement authorities, and across different regulatory regimes;
- How the lack of competition between platforms on privacy standards can be best remedied;
- Beyond the Digital Single Market Strategy, how to smooth the way for emerging areas of disruptive innovation, such as the internet of things, driverless cars and the expansion of the collaborative economy.