Responses to the European Commission’s consultation on the Digital Services Act (DSA)
Published by
Interface
September 04, 2020
The following statement summarizes some of the key points from the response to the European Commission’s consultation on the planned Digital Services Act (DSA). The entire submission to the consultation, including this statement, is available for download as a PDF.
We welcome the European Commission’s plans to update the rules for digital services by proposing a Digital Services Act (DSA). An EU-wide legal framework is vital for a digital market that can function well across countries in order to provide the necessary scale for European competition and innovation, and to safeguard citizens’ fundamental rights. Since the E-Commerce Directive was passed, internet usage and online business models have changed considerably, and new, wide-ranging benefits as well as harms associated with the digital sphere have become apparent.
Gatekeeper platforms have produced significant benefits by lowering the barriers to enter the markets that they serve compared to the harder-to-search non-platform or even offline markets. These benefits should be preserved as much as possible, alongside positive externalities such as network effects.
However, gatekeeper platforms can also discourage consumers from making deliberative choices, which is harmful for consumer welfare, markets and, in some instances, democratic processes. It is highly problematic if consumers face barriers to seeing and considering options that a gatekeeper platform does not promote, while being led to believe they get a full and supposedly objective picture of “the market”. Because the owners of gatekeeper platforms are widely active on downstream or related markets, gatekeepers have an incentive not to provide the best outcome for the consumer (for example, showing them the product or content that best fits their needs), but rather outcomes favoring the gatekeepers’ own commercial interests. This has a strong potential to harm competition on those related markets and to make the gatekeepers’ position even less contestable. Furthermore, some gatekeeper platforms, especially social media and video portals offering algorithmic information and media spaces for citizens, have amplified risks for fair political and social debates. Disinformation and discriminatory content can be spread both widely and in a targeted manner in such online information spaces. This poses the danger of undermining democratic processes, as citizens find it harder to access trustworthy information and participate in democratic debates.
We appreciate the opportunity to participate in the Commission’s consultation on the DSA. In this summary, we highlight some of our key responses to the consultation. First, we provide considerations that should form part of the potential regulatory framework for gatekeeper platforms. Second, we emphasize that dealing with disinformation requires establishing procedural accountability. Third, we describe how online advertising should be made more transparent not just for advertisers, but also for authorities, civil society organizations and consumers.
Creating an EU regulatory framework for gatekeeper platforms
Ex-ante regulation for gatekeeper platforms is sensible to reduce potential adverse effects on the economy and society. Various expert reports have confirmed that while traditional competition policy has its merits, its limitations in digital markets include lengthy procedures (during which competitive harm becomes increasingly irreversible) and the requirement of culpability. Regulation has the benefit of setting the rules ex ante, such that procedures to ensure adherence can be shorter and such that firms do not have to be found guilty of anti-competitive conduct before certain rules can be applied to them. The Furman Report to the UK government provides helpful advice on how to set up such a regulation.
There are two types of obligations we consider particularly helpful that should be considered for inclusion on a list of special obligations for gatekeeper platforms:
First, the DSA should require gatekeeper platforms to provide meaningful transparency and data on their internal workings to authorities, researchers, and, where appropriate, the public. Their important role in the economy and democracy necessitates comprehensive transparency standards that enable society to understand the impact of gatekeeper platforms on markets as well as on political and social debates. For example, gatekeeper platforms should report on their algorithmic recommender systems and their content moderation policies and practices. The platforms’ interest in keeping business-sensitive information private needs to be balanced with the significant public interest in understanding their impact. Transparency is also a prerequisite for assessing the platforms’ compliance with EU and international human rights law.
Second, the DSA should mandate gatekeeper platforms to provide both their business and their personal users with data portability, the scope of which needs to go far beyond that stipulated in the EU’s General Data Protection Regulation (GDPR). Data portability should be continuous, should include a broad range of user-specific data, and users should be able to move data directly between platforms. Enabling users to port their data between services is important to reduce data-related lock-in, such as with app store ratings or location history. While more portability may be desirable for platforms without gatekeeper status in certain markets, lock-in is a greater concern for gatekeeper platforms, and they are more likely to have the relevant technical expertise to implement such portability.
There are two types of gatekeeper platform behavior we consider particularly harmful that should be considered for inclusion on a deny list:
First, the DSA should introduce higher hurdles for gatekeeper platforms with a conglomerate structure to merge data sets including personal data and to use personal data across services. The GDPR does not explicitly address the special dynamics associated with data-dominant firms. However, in dealing with these firms, users are deprived of any meaningful choice. Hence, gatekeeper platforms should be prevented from engaging in excessive data merging across services. The German Federal Supreme Court recently published its detailed judgment in the case Bundeskartellamt v. Facebook, which is instructive on the understanding of choice as an objective of competition policy.
Second, the DSA should establish clearer rules for gatekeeper platforms regarding how to treat their own services. Harm can arise especially if a gatekeeper platform gives preferential treatment to its own services if this is not based on criteria that benefit consumers. In the digital world, it is often difficult to distinguish between vertical and horizontal relationships between services because this may depend on the user group (for example, some may use Google as an entry point for product search and continue to Amazon, while others may start at Amazon directly). Hence, clear criteria need to be developed to distinguish when preferential treatment is problematic. More evidence is necessary to specify when and what kind of prohibition is useful to balance the harm to competition and the scope for companies to exploit synergies among their services.
A horizontal regulatory framework is compatible with, and should be complemented by, a market-specific approach to address structural concerns that persist despite regulation. The New Competition Tool (NCT) would be a suitable addition to ex-ante regulation. With a well-designed NCT – including a broad set of available remedies, appropriate checks and balances, and no need to establish culpability – ex-ante regulation can focus on the most problematic types of behavior across markets.
Tackling the spread of disinformation online
The need for EU-wide regulation is particularly evident with regard to information gatekeepers such as social media companies, search engines and video apps. They provide digital communication and media spaces, where citizens debate and form their opinions on social and political topics. While platforms can assist such democratic processes, serious dangers for democracy arise as well: Disinformation and discriminatory content spread online can considerably infringe upon citizens’ basic human right to form their political opinions without interference, and can furthermore negatively affect individual and public health, as is visible in the COVID-19 pandemic.
Continuing to rely only on national (criminal law) rules to tackle these challenges is misguided and not sufficient. Such rules largely focus on removing individual pieces of harmful/illegal content without addressing the overarching market failures that create the incentives to not tackle disinformation more effectively. Besides, neither governments nor companies should be left on their own to decide what content to delete, and thus to decide how to balance free speech concerns with potential harms stemming from disinformation and discrimination.
The DSA should provide clear legal guidance for platforms that does not focus on enforcing decisions on individual pieces of content, but instead focuses on the processes for accountable corporate decision-making. This could include a common EU framework for content moderation policies and practices, based on international human rights standards, mandatory transparency and accountability reporting, as well as independent oversight.
Establishing binding rules and oversight for online advertising
Most behavioral data should not be collected in the first place because its economic value is not proportionate to the harm it creates. The deep intrusion into privacy is not offset by the little added value created mainly for advertisers, most of which is extracted by the data-collecting platform. The current terms and conditions do not give individuals an effective way to opt out from those practices.
However, even where using certain behavioral data is in the interest of users and society, big ad-tech platforms do not provide users, researchers and regulators easy-to-access and easy-to-understand insights into how personal behavioral data is being used to target and deliver advertising. This is detrimental to consumer welfare, as users are left in the dark as to who is paying to reach them and how.
The DSA should include the following measures aimed at platforms that would establish more meaningful transparency for online advertising:
- Mandatory, expanded and vastly improved ad archives including information on targeting and engagement metrics, data sources, and ad financing
- Mandatory, expanded transparency reporting on processes for ad targeting and ad delivery
- Mandatory, improved ad disclaimers
- Mandatory advertiser verification
Compliance with these requirements should be checked by an independent oversight body that has the technical expertise as well as staff and budget resources to audit transparency reports.
Transparency, accountability and oversight mechanisms are especially crucial for online political advertising. In political advertising, candidates, political parties and other campaigners are not trying to sell products and services but pay to shape political debates and influence voting decisions. A lack of options for public interest scrutiny of online political ads can therefore weaken the legitimacy and integrity of elections and political campaigning more generally. In addition to the transparency standards mentioned above, further rules for political advertising should be established, including restrictions on behavioral microtargeting and expanded financial accountability reporting by platforms and political advertisers such as European parties and candidates. In sync with other Commission initiatives, especially the European Democracy Action Plan, the DSA should define the baseline requirements for transparency and accountability for paid online political messaging in Europe.
We thank the Commission for providing the opportunity to submit our responses to the consultation and look forward to engaging further on this important legislative proposal in the future, not only with the Commission and the European Parliament but all interested stakeholders.
Author
Dr. Julian Jaursch
Lead Platform Regulation