Opinion piece: The DSA also works “in the metaverse” – if it is enforced well
Published by
Tagesspiegel Background Digitalisierung & KI
December 14, 2022
The German government and the Bundestag are discussing the "metaverse" and the European Commission also has the topic on its agenda for 2023. Specific new laws are not officially planned for the time being. That's the right choice for now, says SNV project director Julian Jaursch. On the occasion of the December 2022 hearing in the Bundestag on "Web 3.0 and Metaverse", he writes in an opinion piece for Tagesspiegel Background that the focus should first be on enforcing existing law – especially the DSA, which has just come into force.
This is a translated version of the German original text.
The DSA also works “in the metaverse” – if it is enforced well
The German government and the Bundestag are currently discussing the “metaverse” and the European Commission also has the topic on its agenda for 2023. Specific laws are not officially planned for the time being. Julian Jaursch considers this to be the right approach for now. Instead of new laws, he says, the focus must be on enforcing existing rules – especially the DSA, which has just come into force.
“Metaverse” is one of the buzzwords of the hour: More companies than ever before refer to it in their earnings reports, Meta’s investments are as famous as they are infamous and virtual 3D games and entertainment worlds are attracting millions of visitors. The topic has also arrived in political debates. The European Commission wants to take a closer look at the metaverse at the beginning of 2023 and in Germany both the German government and – with a hearing today – the Bundestag are dealing with it. However, neither consumers nor political decision-makers such as the members of parliament should be blinded by the currently popular marketing term “metaverse”. Not everything labeled metaverse is a true internet revolution. And for those offers that are already metaverse-like, existing laws should first be applied.
One metaverse, several, or none?
Whether there is or will be a metaverse or several metaverses is disputed even among tech entrepreneurs. So far, there is no robust, generally accepted definition of the term metaverse. For some, it denotes a comprehensive revolution of the internet; for others, it is first and foremost an evolution of the way people move online – from “2D and stationary” (PC) to “2D and mobile” (smartphone) to “3D and mobile”. Currently, the term spans both long-established use cases and future offerings for which there is not even a technical foundation yet.
An attempt at a definition that illustrates this fuzziness well comes from Matthew Ball, who takes an enthusiastic but also realistic look at the topic in his book on the metaverse. He describes the metaverse as a vast 3D virtual world that can be experienced by an almost unlimited number of people synchronously and persistently, and in which data such as identities or payments are interoperable and continuously available. An important part of this definition already exists today: Virtual 3D worlds are commonplace in (online) games. Immersive “extended reality” (XR) technologies are also already being used in medical surgeries, model building in architecture or maintenance work on machines. Yet, for a long time, these applications were not referred to as “metaverse”, but as XR or “digital twinning”.
Other elements of the metaverse in Ball’s definition, however, are still in the distant future. Virtual 3D worlds are not yet usable by millions of people simultaneously: In online games, around 100 people usually share such a world. Even in the case of high-profile, huge virtual concerts like Travis Scott’s on Fortnite, the 12.5 million viewers were technically spread across 250,000 copies of the concert. To create a true metaverse, the infrastructure and engines needed for 3D rendering would, among other things, require improvements in bandwidth, hardware and computing power, not to mention standards for interoperable file formats and protocols.
Apart from industrial applications, therefore, the more fitting term at present is “virtual 3D worlds” in which people play, shop, are entertained, communicate and perhaps also work. This decidedly unvisionary view is in no way intended to belittle forward-looking, fascinating developments, and a sober look at the phenomenon does not mean that politics and society should ignore the long-term perspective. On the contrary, looking into the more distant future is important and helpful. For the nearer future, however, many metaverse-like offerings with all their opportunities and risks are already known, and companies and regulators should adapt their rules and practices accordingly. So it is more of an evolution than a revolution.
Many risks of the “metaverse” are familiar from experience
Virtual 3D worlds offer potential for participation, accessibility, new forms of entertainment and communication as well as work. These new types of opportunities are certainly accompanied by new types of risks, which are still difficult to assess today. Nonetheless, there is a long list of risks that are painfully familiar from the physical world and previous online experiences: cyberbullying, government and corporate surveillance, espionage, identity theft, fraud, disinformation, negative mental and physical health effects, addiction, demeaning employment conditions and lack of occupational health and safety, digital colonialist tendencies (in content moderation, for example), negative environmental effects, monopolization and barriers to market entry. There is no reason to assume that such risks do not also occur in virtual 3D worlds – there have long been examples of them.
Existing laws attempt to mitigate many of these risks: Games are subject to regulations on the protection of minors and media law, among other rules. Laws on consumer protection and IT security also apply online, as does the General Data Protection Regulation. In addition, there is now a new legislative heavyweight: The EU’s Digital Services Act (DSA) will take effect next year. It is mainly discussed in the context of platforms such as Facebook, Twitter and YouTube. Now, the declared goal of the DSA – to create a safe and transparent online environment – must be applied not only to 2D mobile internet services, but also to 3D virtual online worlds. Can this succeed?
The DSA can minimize risks – if it is implemented well
First, it must be clarified whether the DSA applies to virtual 3D worlds at all. The definitions in the law are very much geared towards existing online platforms and search engines. But it also covers games. While there has been displeasure in the industry over unclear terminology, there is no question that the DSA does apply to games. This suggests that, apart from games, other virtual 3D worlds are also covered by the DSA, at least as an “intermediary service” and often also as an “online platform”. Over the long term, it would be desirable for legislators, authorities and courts to define more clearly the extent to which virtual 3D worlds fall within the scope of the DSA. As a first step, oversight bodies could show that they are aware of the topic, for example by issuing guidelines.
Some of the most important requirements of the DSA can be transferred to virtual 3D worlds but might have to be adjusted. Examples of this can be found in the rules on content moderation, advertising and deceptive platform design.
The DSA does not specify rules for the moderation or deletion of individual pieces of content but instead takes the approach of creating transparency around terms and conditions and reporting mechanisms. In principle, this works well for virtual 3D worlds. On closer inspection, it quickly becomes clear that the DSA was written with static 2D content in mind, but there is hope that its provisions are formulated openly enough to apply beyond that.
An example of this is Article 16 on notice and action mechanisms for potentially illegal content. Users should be able to specify an “exact electronic location” such as a URL in their report. This is probably ill-suited to virtual 3D worlds, since there is no URL in the way there is for a post on a social network. But the DSA adds that other information “adapted to the type of content and to the specific type of hosting service” can be provided instead. This addition opens the door to applying the provision to 3D worlds. Clarifying this in the first evaluation of the DSA would be helpful. It remains open how useful a report is, for example, about an insult that happened “live” at a virtual 3D concert and, unlike a post, is no longer available after the event. But examples of this already exist outside any metaverse discussion, for instance in the moderation of comments on live streams.
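To make this difference concrete, the following is a minimal, purely hypothetical sketch in TypeScript of the kind of information a notice could carry in a virtual 3D world when there is no URL to point to. Neither the format nor the field names are taken from the DSA or from any existing reporting system; they are assumptions for illustration only.

    // Hypothetical notice format for reporting content in a virtual 3D world.
    // None of these fields are prescribed by the DSA; they merely illustrate what
    // "information adapted to the type of content and hosting service" might mean.
    interface VirtualWorldNotice {
      worldId: string;          // which virtual world or experience
      instanceId: string;       // which copy or shard of the world (e.g. one of 250,000 concert copies)
      position: { x: number; y: number; z: number }; // where in the world the incident occurred
      occurredAt: string;       // ISO 8601 timestamp of the incident
      reportedUserId?: string;  // avatar or account being reported, if known
      description: string;      // free-text explanation by the notifier
    }

    // Example: reporting an insult heard live during a virtual concert.
    const exampleNotice: VirtualWorldNotice = {
      worldId: "example-concert-world",
      instanceId: "shard-00017",
      position: { x: 12.4, y: 0.0, z: -3.1 },
      occurredAt: "2022-12-14T20:15:00Z",
      description: "Verbal harassment near the main stage during the live show.",
    };

The point is not this particular format, but that providers and oversight bodies would have to agree on what counts as a sufficiently precise “electronic location” for a report when there is no URL.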
An important concern of the DSA is transparency, especially in online advertising. The wording that online platforms must indicate in a “clear, concise and unambiguous manner and in real time” (Article 26(1)) whether content is advertising should also be applicable to 3D virtual worlds – if they qualify as online platforms. Companies would then have to find a solution on how to practically implement the requirement in virtual 3D environments. Likewise, databases on online advertising, as envisaged by the DSA for very large online platforms, would probably have to be adapted. Current versions of these are heavily geared towards static 2D advertising.
One area where the DSA would need to be adapted is the rules on deceptive platform design. The prohibition on deceptive design is weak as it is, since the DSA barely extends the rules beyond existing consumer protection and privacy law. In virtual 3D worlds, misleading design practices are likely to emerge that are currently unknown or at least not yet widespread. So far, deceptive design has mostly involved buttons, pop-ups or fonts. The virtual 3D environment, however, could give rise to further possibilities of deception by design. These need to be much more clearly defined and covered by the DSA or other consumer protection legislation. Article 25(3) of the DSA provides that the Commission may issue guidelines on deceptive design. This possibility should be used as much as possible to limit misleading design practices in virtual 3D worlds, and the guidelines should not be limited to the three examples listed in that article.
The Commission and DSCs must take action
Even this brief look at a few examples from the DSA shows: Any talk of the metaverse should not distract from the fact that quite a few rules for virtual 3D worlds already exist. A fairy tale of a “wild west in the metaverse” must not be allowed to spread, just as slogans like “platforms are the wild west” (i.e., lawless) were misguided. Such narratives only create incentives for companies to set the rules on their own. Instead, it is up to companies to adapt their content moderation practices, transparency measures and reporting channels for virtual 3D worlds so that they comply with existing rules. Meta, for example, is testing content moderation techniques in its virtual 3D world “Horizon Worlds” that differ from those on classic Facebook, such as “Safe Zones” to which a person can withdraw.
However, it should not be left to companies alone to develop or test the necessary adaptations to 3D online experiences (and especially not only after a product launch). The European Commission and the national Digital Services Coordinators (DSCs), which are jointly responsible for enforcing the DSA, must act as active oversight bodies on behalf of consumers. This requires, on the one hand, building up expertise and capacity. On the other hand, there must be a fundamental willingness to deal with technologies that may not be spelled out word for word in the DSA, but that can by no means be neglected if a “transparent and safe online environment” is to be created.
Precisely against this background, it becomes clear how important strong, independent supervisory authorities in the EU, well networked with academia and civil society, will be. Instead of relying solely on companies or years-long court cases, the Commission and DSCs should work with external experts to clarify to whom and how the DSA applies in virtual 3D worlds.
Author
Dr. Julian Jaursch
Lead Platform Regulation