Regulatory approaches to immersive worlds: an introduction to metaverse regulation

Matthias C. Kettemann
Department of Legal Theory and Future of Law, University of Innsbruck
Leibniz-Institute for Media Research | Hans-Bredow-Institut, Hamburg
Humboldt Institute for Internet and Society, Berlin

Martin Müller
Department of Legal Theory and Future of Law, University of Innsbruck




Caroline Böck
Department of Legal Theory and Future of Law, University of Innsbruck 




1 INTRODUCTION

Metaverses[1] are phenomenologically diverse and technically complex; they offer huge economic potential, challenge traditional concepts of democratic codetermination and still largely lack legal foundations. “In the metaverse, you’ll be able to do almost anything you can imagine – get together with friends and family, work, learn, play, shop, create – as well as completely new experiences that don’t really fit how we think […] today,” said Meta’s Mark Zuckerberg, outlining his vision. Wherever people shop, learn, play, make statements and enter into contracts, norms become relevant. Wherever we are active and express opinions, we come into contact and conflict with others. Norms in the metaverse – like norms in general – solve problems of distribution, coordination and cooperation; they have a shaping, pacifying and balancing function. But who makes the rules for governing the metaverse, and for governing in the metaverses?

These fundamental questions are not easy to answer from the perspective of democratic theory. But we can learn from history. In many respects, the metaverse is now where digital platforms were at the turn of the millennium: largely unregulated and offering huge potential. As those platforms emerged, the communicative infrastructures of democratic public spheres underwent significant changes. With the metaverse, these challenges are accelerating. As regards platforms, institutional solutions for democratic reconnection have come to the forefront; they have even made it into the current German government’s coalition agreement.[2] In scholarship, this process has become established as the “constitutionalisation” of social media (Celeste/Heldt/Keller 2022; Celeste 2022; De Gregorio 2022). These steps have yet to be taken for the metaverse. Examining selected legal aspects of the metaverse contributes to that effort.

The key challenges of regulating the metaverse – a virtual, immersive and interactive space created by merging the physical and digital worlds – therefore lie in the areas of data protection and privacy, security, content governance, interoperability, openness and democratic participation. This article focuses on some of these.

Following this introduction (1), two sections examine the regulation of the metaverse (2) and selected legal questions relating to the application of rules in the metaverse (3). Future developments are then considered (4).[3]

2 REGULATING THE METAVERSE

2.1 The basics

So far, neither national nor European Union legislators have developed a coherent regulatory approach for the emerging metaverses[4]; regulation currently remains at the stage of normative visions (European Commission 2023).[5] Nevertheless, individual features of the metaverse, its providers and the actants in it (the avatars) are governed by various regulations in the normative multilevel system of private rules, national law and EU law. The technical entry points to the metaverses are also regulated: VR headsets and similar connection devices are subject to existing product safety regulations.

The metaverse as a communication space is also bound by norms. These include norms under private law (what the platform allows) and national law (what the state allows) and, increasingly, European law, owing to the growing body of regulation on digital services, markets, data and algorithms.

2.2 European regulatory approaches

The Digital Services Act (DSA)[6] is an EU Regulation that came into force in November 2022. The DSA and the Digital Markets Act (DMA), which was negotiated at the same time, aim to create a safer digital space in which the fundamental rights of all users of digital services are protected and to establish a level playing field to foster innovation, growth, and competitiveness in the European Single Market.

As stated in Article 2, point (1), of the DSA, the Regulation applies to intermediary services offered to recipients in the European Union, irrespective of whether the providers of those services have their place of establishment in the EU. The scope of the DMA is regulated similarly in Article 1, point (2), of the DMA. If offered to users in the EU, metaverses therefore fall within the scope of the DSA. Of the three categories of intermediary services listed in the DSA, metaverses qualify as hosting services: to present the virtual world and enable interaction with it, information provided by users must be stored on their behalf by the operators, so that the definition of a hosting service in Article 3, point (g)(iii), of the DSA is met. Fully decentralised metaverses are a special case: here, there is not one hosting service operating the metaverse but a multitude of operators.

Firstly, the operators of metaverses must comply with the rules applicable to all intermediary services in Articles 11–15 of the DSA. These introduce obligations going beyond those of the Electronic Commerce Directive. For example, points of contact for authorities, the Commission and users must be designated (Articles 11 and 12 of the DSA); this obligation can be met through an appropriately designed legal notice as laid down in Section 5 of the German Telemedia Act. The terms and conditions of all intermediary services must meet certain requirements (Article 14 of the DSA) and, in particular, ensure that content moderation is legally sound and open to challenge.

In addition to the minimum requirements for the content of terms and conditions laid down in Article 14, point (1), of the DSA, Article 14, point (4), stipulates that the interests of users are to be taken into account when platforms moderate content and handle complaints. The fundamental rights of users, such as the right to freedom of expression, are explicitly mentioned. Contrary to previous decisions by the German Federal Court of Justice, this amounts to a direct horizontal binding of platforms to fundamental rights, irrespective of their size (Quintais/Appelman/Fahy 2022).[7] Metaverse operators must therefore clearly state when and why they moderate and what legal remedies exist. Article 15 on transparency obligations is also relevant with regard to content moderation.

The Digital Markets Act (DMA)[8] attempts to restrict the economic power of the big tech platforms on digital markets. As regards its application to metaverses, metaverses are not currently designated as core platform services, irrespective of whether the variable conditions – a significant impact on the internal market and an entrenched and durable position – might be met in the future.

The Data Governance Act (DGA)[9] is the first legislative act at Union level to address data sharing. While the GDPR deals with the protection of personal data, the DGA aims to regulate the commercial use of data in general – that is, of personal and non-personal data alike – and thus represents a reorientation of Union policy (Metzger/Schweitzer 2023).

The European Commission’s proposal for a Data Act[10] is the core component of the European data strategy. It aims to increase the amount of data available for use. Currently, devices in the Internet of Things (IoT) generate huge quantities of data that usually remain with the manufacturers and can be accessed only in exceptional cases. Developing more data altruism and data trust models would bring added value for metaverse operators, including small operators.

The European Commission’s proposal for an Artificial Intelligence Act[11] is a risk-based regulation (Ebers et al. 2021, 589 (589); De Gregorio/Dunn 2022, 473 (488 ff.)): the use of AI systems is divided into various risk categories, with stricter requirements the greater the risk to users’ fundamental rights.

It also appears possible for interoperability rules, which are currently geared primarily towards communication services, to bring about a certain standardisation of data formats for metaverses. The rules on data interoperability currently apply only to data intermediation services (Article 26, points 3 and 4, in conjunction with Article 29 of the Data Act) and operators of data spaces as defined in Article 28 ff. of the Data Act. However, if these services prove as successful as the Commission anticipates (European Commission, no date), large parts of the digital economy, including operators of metaverses, will use data intermediation services and data spaces in the near future and will therefore inevitably have to follow the standardisation rules.
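To illustrate what such a standardisation of data formats could mean in practice, the following sketch shows a minimal, hypothetical interchange record for a metaverse asset, written in TypeScript. It is purely illustrative: the field names, format options and identifiers are assumptions made for this example and do not correspond to any existing or proposed standard.

// Hypothetical sketch of an interoperable metaverse asset record.
// All field names and format options are illustrative assumptions,
// not part of any existing or proposed standard.
interface PortableAsset {
  id: string;              // globally unique identifier, e.g. a URN
  ownerId: string;         // reference to the legal entity behind the avatar
  format: "gltf" | "usd";  // common 3D interchange formats (assumed options)
  licence: string;         // machine-readable licence or usage terms
  contentUri: string;      // location from which the asset data can be retrieved
  provenance?: string[];   // optional list of previous platforms or operators
}

// If several operators exported and imported assets in such a shared format,
// portability would not depend on bilateral agreements about data structures.
const example: PortableAsset = {
  id: "urn:example:asset:123e4567",
  ownerId: "urn:example:user:42",
  format: "gltf",
  licence: "CC-BY-4.0",
  contentUri: "https://assets.example.org/123e4567.glb",
};

console.log(JSON.stringify(example, null, 2));

The point of such a shared format would not be its specific content but the fact that any operator, including small ones, could process assets created elsewhere without negotiating data structures bilaterally.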

2.3 General terms and conditions

It is apparent that the existing regulations address the main features of the metaverse – namely hardware, software and content – only marginally, and that their application depends very much on which metaverses prevail in the market and how they are specifically designed. The more significant rule-makers are the above-mentioned digital companies with their contract-based private regulations.

3 REGULATION IN METAVERSES

3.1 Communication space 

Private ordering on digital platforms such as social media today rests, on the one hand, on technical settings and, on the other, on the guidelines that the digital company itself has developed (Quintais/de Gregorio/Magalhães 2023). These rules are often called community guidelines. Community guidelines govern users’ relationships with each other as well as the relationship between the user and the platform.[12] These rules do indeed have an ordering effect because, as private communication rules, they constitute a kind of partial regime constitution. Systematically, however, they fall under civil law and take concrete effect through the private-law legal relationship (Quintais/de Gregorio/Magalhães 2023). The admissibility of such regulations can in turn be derived from the German Basic Law, specifically from the fundamental rights of private autonomy, occupational freedom and freedom of property, since these principles enable private individuals to organise private structures and systems within legal parameters (Teubner 2012, 36 ff.; Mast/Kettemann/Schulz 2023, forthcoming).

Use of a platform is only possible after consenting to the terms of use, which include the community guidelines; a platform usage agreement is thus typically concluded.[13] The terms of use themselves are regularly incorporated into the contract as standard business terms in accordance with Section 305 (1) sentence 1 of the German Civil Code (BGB).[14] This classification as standard business terms is unproblematic because the terms of use are unilaterally provided by the company operating the platform for a multitude of contracts, are pre-formulated and are fundamentally non-negotiable. The law on standard business terms therefore applies to the terms of use, and each individual clause must withstand a review of its content in its own right.

In addition to the specific prohibited clauses in Sections 308 f. of the Civil Code, Section 307 (1) sentence 1 of the Civil Code, with its principle of good faith, offers a civil-law opening clause through which constitutional value judgements can be taken into account.[15] Case law has taken advantage of this and, by way of the doctrine of indirect third-party effect,[16] has established that operators of online platforms are bound by fundamental rights,[17] which must be taken into account when terms of use are drafted. Fundamental rights do not bind the platforms directly because they are private companies and do not have any state or state-like role.[18] A state-like role is only to be assumed if a private actor in fact “grows into a comparable obligation or guarantor role that traditionally belongs to the state”[19]. In the field of communication, in which platforms – and in future sometimes also metaverses – operate, this is the case only if “private companies themselves take on the task of providing conditions for public communication and thus take on roles that – like ensuring post and telecommunication services – were formerly assigned to the state as a service for the public”[20]. Such provision of communication is, however, not (yet) taking place. It cannot be completely ruled out, since access to metaverses is regulated privately to a much greater extent.

3.2 Avatars

In metaverses, human actors act via non-physical actants: avatars. From a civil law perspective, the question is how the merging of the real world with the world of the metaverse via an avatar affects issues of attribution. It is clear that the avatar is the central point of contact for actions in the metaverse. Avatars represent the digital identity of a natural or legal person and are considered an extension of a legal entity (Kaulartz/Schmid/Müller-Eising 2022). The avatar will be the central object of attribution in the metaverse, performing all actions in the metaverse for or on behalf of the legal entity (Rippert/Weimer 2007). People ordering items on the Internet are also acting in a technology-mediated way – from this viewpoint, an avatar is no different from an e-mail that can be dressed up. Avatars can be used to make declarations of intent in the metaverse, e.g. to purchase concert tickets or other goods and services (Kaulartz/Schmid/Müller-Eising 2022). However, it is not the avatar that acts, but the person behind it.

Can avatars be liable to prosecution? No, but the people acting through them can be. A post or an e-mail cannot commit an offence, so whenever legal responsibility is to be attributed the focus must shift away from the actant – the avatar – to the person. However, this presupposes the applicability of national criminal law, such as the German Criminal Code (StGB). Under the territorial principle, sovereign powers of punishment are restricted to a state’s own territory (on the territorial principle: Mills 2006; foundationally: Schmalenbach/Bast 2017). When determining the location of criminal acts on the Internet, it is recognised that at least acts committed against or by a German national are subject to German criminal law (Schönke/Schröder/Eser/Weißer, no date). This principle can be transferred to the metaverse if it is possible to associate the avatar in the metaverse with a specific real person (concurring: Kaulartz/Schmid/Müller-Eising 2022, 521 (529 f.)).

Conversely: if an avatar is the victim of an act, e.g. an insult, does this always also affect the person behind it? That depends: an avatar is a medium of communication. One cannot insult a mobile phone, but in the case of an avatar a distinction must be drawn according to how strong the relationship between the avatar and the real person behind it is. If the relationship is very strong – for example, if the avatar depicts a person’s key characteristics so that the real person is identifiable – then an insult to the person behind the avatar can likely be assumed.

Some violations of rights, such as murder or conventional theft, cannot be committed against avatars. For other violations, different offences than in the physical world may apply. If an avatar is “kidnapped”, for example, the person responsible can be held accountable under computer-crime offences (hacking).

May avatars be copied or distorted? That, too, depends. Avatars have no personality rights of their own but, insofar as a real person is recognisable, that person’s rights prevail. Intellectual property rights may also be relevant for avatars that do not resemble anyone: not everyone may create an avatar in the form of a Disney character, for example.

4 A METAVERSE FOR EVERYONE?

Despite substantial investment, metaverse technology is neither market-ready nor widely deployed. National and European legislators can therefore still introduce appropriate rules to proactively protect individual freedoms and reduce negative social consequences. Whereas platform law (the DSA and DMA) emerged only around 20 years after platforms gained importance at the beginning of the 21st century, smart metaverse regulation can ensure the effective protection of legal rights from the outset. The constitutionalisation process observed for platforms – internal juridification and external attribution of responsibility, e.g. through procedural obligations, reviews of terms and conditions, transparency requirements and risk minimisation obligations – can also be deployed in a targeted manner at a much earlier stage for the metaverse, before its use becomes widespread.

Centralised metaverses present risks for democratic values, for open discourse and for processes of rational self-determination. Metaverse operators can abuse their special position of influence over rules and moderation practices. The opening-up of platforms and the growing criticism from the perspective of democratic theory clearly show the direction in which the regulatory wind is blowing (Hermann 2022).

With regard to the metaverse in particular, there are good arguments for developing innovative models for the institutional restriction of operators’ power and for improving the legitimacy of the metaverse’s social arrangements. This should happen in close collaboration with all stakeholders involved in the digital transformation. As the German National Academy of Sciences has argued, “innovative participation ideas must be specifically promoted because established platform operators and service providers are likely to be closely attached to their current business and participation models, and this could hinder the support of democracy-friendly, commercially less usable formats from private commercial sources” (German National Academy of Sciences Leopoldina/acatech – National Academy of Science and Engineering/Union of the German Academies of Sciences and Humanities, 2021, 56). Democracy in the metaverse could thus be supported by initiatives from below and from outside.

The EU has recognised this and has set up comprehensive, Union-wide citizens’ panels. In addition to guidance on good behaviour in the metaverse, these panels have produced further recommendations and eight basic principles intended to apply to the regulation of the metaverse and within the metaverse (Bürgerrat.de 2023). The protection of users has been highlighted as a particularly important basic principle. Such participatory processes – Meta itself has organised similar events on a global scale – are important preliminary stages in the development of legitimate rules for virtual worlds. Adapting regulation would also be effective against the backdrop of the “Brussels effect”, whereby the EU’s rules radiate beyond the Union and encourage other states to adopt them or to enact similar legislation of their own. Also pointing in this direction are the first preliminary studies by European institutions, including a document by the Committee on Culture and Education[21] and a longer study by the European Parliament Committee on Legal Affairs (2023).

An international legal framework governing access to and the development of metaverses would also be desirable, as the metaverse cannot be conceived of at a national level but must be considered global, especially since it is designed for global use. Nor should global regulation leave out questions of international solidarity: access to the metaverse is currently based on privilege. If the metaverse is to develop into a fair and open communication space that supports rights, pathways to access for all must also be developed.

BIBLIOGRAPHY

Bürgerrat.de, Virtual worlds for everyone, 24 April 2023, https://www.buergerrat.de/en/news/virtual-worlds-for-everyone/

Celeste/Heldt/Keller (eds.), Constitutionalising Social Media (2022)

Celeste, Digital Constitutionalism (2022)

Choi, Sam Jungyun et al. (2023). Regulating the Metaverse in Europe, https://www.globalpolicywatch.com/2023/04/regulating-the-metaverse-in-europe.

De Gregorio, Digital Constitutionalism in Europe (2022).

De Gregorio/Dunn (2022), Common Market Law Review, 473 (488 ff.).

European Commission (no date), European Data Strategy, https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/european-data-strategy_en (last accessed: 6 June 2023).

German National Academy of Sciences Leopoldina/acatech – National Academy of Science and Engineering/Union of the German Academies of Sciences and Humanities (2021). Digitalisierung und Demokratie, https://www.leopoldina.org/uploads/tx_leopublication/2021_Stellungnahme_Digitalisierung_und_Demokratie_web_01.pdf.

Ebers et al. (2021). Multidisciplinary Scientific Journal, 589.

European Commission (2023). Virtual worlds (metaverses) – a vision for openness, safety and respect, https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13757-Virtual-worlds-metaverses-a-vision-for-openness-safety-and-respect_en

European Parliament Committee on Legal Affairs (2023). Metaverse.

Hermann, Demokratische Werte nach europäischem Verständnis im Metaverse (2022).

Kettemann/Böck (2023, forthcoming). Regulierung des Metaverse. In Steege/Chibanguza (eds.), Metaverse.

Kettemann/Müller (2023, forthcoming). Plattformregulierung. In Steege/Chibanguza (eds.), Metaverse.

Mast/Kettemann/Schulz (2023, forthcoming). In Puppis/Mansell/van den Bulck (eds.), Handbook of Media and Communication Governance.

Metzger, Axel/Schweitzer, Heike (2022). Shaping Markets: A Critical Evaluation of the Draft Data Act. ZEuP, 01, 42.

Mills, International and Comparative Law Quarterly 55 (2006), 1 (13)

Müller/Kettemann (2023, forthcoming). European approaches to the regulation of digital technologies. In Werthner et al. (eds.), Introduction to Digital Humanism.

Quintais/Appelman/Fahy (2022). “Using Terms and Conditions to Apply Fundamental Rights to Content Moderation”, https://ssrn.com/abstract=4286147, 25.

Quintais, João Pedro/De Gregorio, Giovanni/Magalhães, João C. (2023). How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications. Computer Law & Security Review, 48 (Article 105792).

Schmalenbach/Bast, VVDStRL 2017, 245 (248).


[1] The metaverse is understood in the sense of the EU’s (non-binding) definition as “an immersive and constant virtual 3D world where people interact through an avatar to enjoy entertainment, make purchases and carry out transactions with crypto-assets, or work without leaving their seat” (European Commission’s Analysis and Research Team, Metaverse – Virtual World, Real Challenges, 9 March 2022, p. 3.). Various metaverses exist.

[2] The German National Academy of Sciences Leopoldina also recommends the increased democratic reconnection of platforms. See Leopoldina, the Union of the German Academies of Sciences and Humanities, acatech – National Academy of Science and Engineering (2021): Digitalisierung und Demokratie, https://www.leopoldina.org/uploads/tx_leopublication/2021_Stellungnahme_Digitalisierung_und_Demokratie_web_01.pdf, p. 46.

[3] More on the role of law in the metaverse, with further evidence: Kettemann/Böck (2023, forthcoming) and Kettemann/Müller (2023, forthcoming). More on platform regulation can be found in Müller/Kettemann (2023, forthcoming). This article builds on these explanations and summarises them.

[4] However, consultations have already been carried out, see Choi et al. 2023.

[5] “The European Commission will develop a vision for emerging virtual worlds (e.g. metaverses), based on respect for digital rights and EU laws and values. The aim is open, interoperable and innovative virtual worlds that can be used safely and with confidence by the public and businesses.” (European Commission 2023).

[6] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277, 1.

[7] Spindler, GRUR 2021, 545 (551); Mast, JZ 2023, 287 (289).

[8] Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act), OJ L 265, 1.

[9] Regulation (EU) 2022/868 of the European Parliament and of the Council of 30 May 2022 on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act), OJ L 152, 1.

[10] Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair access to and use of data (Data Act), COM(2022) 68 final.

[11] Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, COM(2021) 206 final.

[12] Comparisons: Instagram’s Community Guidelines, available at: https://help.instagram.com/477434105621119/?helpref=hc_fnav; more detail at: Mast/Kettemann/Schulz (2023, forthcoming).

[13] Friehe, NJW 2020, 1697 (1697); foundationally: OLG Munich NJW 2018, 3115 (3116); concurring: BGH ZUM 2021, 953 (957 f.).

[14] Spindler, CR 2019, 238 (240); Friehe, NJW 2020, 1697 (1697); OLG Munich NJW 2018, 3115 (3116).

[15] Munich commentary Civil Code/Wurmnest, Civil Code Section 307 margin no. 57.

[16] BVerfGE 7, 198 (205 ff.).

[17] BVerfG NJW 2019, 1935 (1936) relating to digital platforms; also: BGH NJW 2012, 148 (150 f.); BGH NJW 2016, 2106 (2107 ff.) relating to liability of hosting services.

[18] BGH ZUM 2021, 953 (961).

[19] BGH ZUM 2021, 953 (960); BVerfG ZUM 2020, 58 (70) with further references.

[20] BGH ZUM 2021, 953 (960); BVerfG ZUM 2020, 58 (70 f.) with further references.

[21] Draft Opinion of the Committee on Culture and Education for the Committee on the Internal Market and Consumer Protection on virtual worlds – opportunities, risks and policy implications for the single market (2022/2198(INI)), 27 April 2023, https://www.europarl.europa.eu/doceo/document/CULT-PA-746918_EN.pdf; with amendments: Amendments 1–64, Draft opinion by Laurence Farreng (PE746.918v01-00), Virtual worlds – opportunities, risks and policy implications for the single market (2022/2198(INI)), 5 June 2023, https://www.europarl.europa.eu/doceo/document/CULT-AM-749262_EN.pdf.
