1 Introduction

The power wielded by social media platforms (e.g., Facebook, Twitter, YouTube, TikTok) to direct, influence, and control public discourse has become a topic of vigorous debate in the past decade. Platform owners have been criticized for designing and implementing content moderation systems in ways that arbitrarily undermine users’ freedom of expression. Although typically under the private ownership of corporate entities, these platforms—as exemplified in Sect. 2—are essential infrastructures for public discourse and a core component of the contemporary digital sphere. Thus, it is essential that the regulation of social media platforms reflects this implicit “private-public partnership” and is designed and implemented in a manner that enables them to flourish as digital spaces for robust democratic discourse based on humanistic values of self-determination and inclusion. Ensuring effective protection of users’ fundamental right to freedom of expression is critical to achieving this aim. Accordingly, in recent years, States have come under increased pressure to introduce more effective legal, regulatory, and policy frameworks to ensure that users’ freedom of expression is adequately safeguarded on social media platforms.

The enactment of Article 17 of the EU Copyright in the Digital Single Market Directive (DSM)Footnote 1 in 2019 reignited this debate within the EU. Article 17 DSM introduced a more stringent and expansive approach to copyright enforcement on social media platforms, resulting in the EU being accused of “killing our democratic spaces using copyright as a Trojan Horse” (Avila et al., 2018). This chapter explores how Article 17 DSM actuates the power of social media platforms to arbitrarily limit users’ freedom of expression and outlines several proposals put forward by copyright law scholars to render the EU legal framework on online copyright enforcement more conducive to fostering democratic discourse on these digital spaces.

2 Why Should Social Media Platforms Be Governed in a Manner That Promotes Democratic Discourse?

Online social media platforms constitute digital spaces that provide tools and infrastructure for members of the public to dialogically interact across geographic boundaries. Given the high numbers of users they attract, the substantial amount of discourse taking place on these platforms, and the capacity of this discourse to influence public opinion, it is possible to define them as a core component of the contemporary digital public sphere and accordingly as essential infrastructures for public discourse in today’s world.

The term “digital public sphere” is of relatively recent origin and builds upon Habermas’s notion of the public sphere (Habermas, 1989). It has been described as:

[…] a communicative sphere provided or supported by online or social media—from websites to social network sites, weblogs and micro-blogs—where participation is open and freely available to everybody who is interested, where matters of common concern can be discussed, and where proceedings are visible to all. (Schäfer, 2015, p.322)

As per Habermas’ vision, the public sphere has a key function in fostering “democratic discourse.” A precise definition of the term “democratic discourse” is difficult to come by. However, Dahlberg’s concept of rational-critical citizen discourse provides a notion of public discourse that is autonomous from both state and corporate power and enables the development of public opinion that can rationally guide democratic decision-making (Dahlberg, 2001). This presupposes the ability of members of the public to engage in autonomous self-expression leading to a proliferation of diverse viewpoints that can be the subject of open, inclusive, and deliberative discussion (Dahlgren, 2005).

The value attributed to fostering democratic discourse within the European Union (EU) is underscored by the fundamental right to freedom of expression guaranteed under Article 10 of the European Convention on Human RightsFootnote 2 (ECHR 1953) and Article 11 of the EU Charter of Fundamental RightsFootnote 3 (EUCFR 2000), which safeguard the “[…] freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”

In comparison with traditional media, social media platforms offer greater scope for ordinary members of the public to exercise their freedom of expression by directly engaging in the creation of social, political, and cultural content.Footnote 4 A primary example of this is user-generated content (UGC), which refers to content shared by users on these platforms involving the reuse and reinterpretation of existing informational and cultural content (e.g., texts, images, music) in creative ways for purposes of social commentary and critique (e.g., parodies, memes, GIFs, commentaries). These typically comprise transformative uses of existing content for noncommercial purposes. UGC is a particularly powerful mode of dialogic interaction that enables the dissection of contemporary narratives to create new meaning by challenging established ideological assumptions and stereotypes (Peverini, 2015) and constitutes an important form of individual self-expression (Conti, 2015, 346). As observed by the CJEU in the Poland v CouncilFootnote 5 case, “User-generated expressive activity on the Internet provides an unprecedented platform for the exercise of freedom of expression.”Footnote 6

As UGC aims at critiquing and commenting on contemporary political, social, and cultural issues, the content it reuses and reinterprets also tends to be contemporary and is often under copyright protection. For example, memes and GIFs often use images and video clips from recent TV shows and movies. Creators of UGC typically reuse such copyright-protected content without the authorization of the copyright owner. This is inter alia because obtaining authorization from the relevant copyright owner can be an expensive and time-consuming process. However, it is important to note that the unauthorized use of copyright-protected content in UGC does not necessarily result in copyright infringement. This is because copyright law recognizes the importance of facilitating the reuse and reinterpretation of informational and cultural content for purposes of commentary, critique, and creative experimentation. Accordingly, it provides for exceptions and limitations (E&L) to copyright that permit any person to make use of copyright-protected content for these purposes, without the need to obtain authorization from the copyright owner. Such E&L to copyright have the character of user freedoms that enable members of the public to dialogically interact with copyright-protected content without fear of legal sanction. For instance, in the USA, the “fair use exception” provides a broad and open standard that can be flexibly interpreted for the purpose of protecting user freedoms to make transformative uses of copyright-protected content. The use of copyright-protected content in memes and GIFs has a high likelihood of coming within the “fair use exception” as transformative uses of copyright-protected content that are necessary for facilitating commentary, critique, and creative experimentation.

The EU copyright law framework does not include an equivalent broad and open exception. However, Article 5 of the EU Copyright Directive [2001]Footnote 7 provides specific E&L, which enable the quotation of copyright-protected content for purposes such as criticism or review (Article 5(3)(d)) and the use of copyright-protected content for the purpose of caricature, parody, and pastiche (Article 5(3)(k)). As stipulated by the CJEU in its determination in the Poland v Council case, these E&L qualify as user rights, which “(…) confer rights on the users of works and (…) seek to ensure a fair balance between the fundamental rights of those users and of rightholders.”Footnote 8 In a series of decisions,Footnote 9 the CJEU recognized the significance of these E&L for the protection of users’ freedom of expression. That view is reinforced by AG Saugmandsgaard Øe’s OpinionFootnote 10 in the Poland v Council case where he enunciates that the exceptions for quotation, criticism, review, parody, pastiche, and caricature ensure the safeguarding of users’ freedom of expression.Footnote 11 The AG also recognizes that a significant proportion of content uploaded by users on social media platforms will consist of uses that come within the scope of these E&L.Footnote 12

Thus, ensuring the effective protection of users’ ability to benefit from these copyright E&L is critical for the purpose of protecting users’ freedom of expression and consequently democratic discourse in the digital public sphere.

3 How Can Content Moderation on Social Media Platforms Undermine Freedom of Expression?

Although they fulfill a vital public function by providing essential infrastructures for public discourse, social media platforms are privately owned spaces and are therefore subject to private property rights. Based on these private property rights, owners of social media platforms have the right to determine the terms and conditions subject to which members of the public are allowed to access and use these digital spaces. Such terms and conditions are typically set out in the form of contractual terms of service (ToS), which users are required to accept prior to accessing and using the platforms. ToS inter alia include terms preventing the sharing of illegal content—including copyright-infringing content.Footnote 13 These terms and conditions are enforced through the process of content moderation. Content moderationFootnote 14 refers to the governance mechanism through which platform owners ensure that user-uploaded content complies with applicable legal rules and the platforms’ own ToS. It involves monitoring content shared on the platform (i.e., gathering information about content shared on the platform to identify content that is illegal or incompatible with the platforms’ ToS) and filtering offending content to prevent it from being shared on the platform (i.e., by removing or disabling access to illegal content and/or terminating or suspending the accounts of users who share it). In the context of copyright law, for instance, this would refer to platform owners’ ability to monitor user-uploaded content for the purpose of identifying copyright-infringing content and, if such content is identified, removing or disabling access to the infringing content (i.e., blocking) and/or terminating or suspending the account of the offending user.

Content moderation grants platforms the power to influence and control public discourse in two ways (Laidlaw, 2010). Firstly, it has an “enabling function” whereby the platform obtains power to influence and shape user perceptions and behavior through a process of norm-setting (e.g., shaping user perceptions on what constitutes lawful or unlawful speech based on the way in which the platform interprets applicable law). Secondly, it has a “restricting function” whereby platforms gain power to restrict users’ ability to participate in public discourse through the blocking/removal of speech or by terminating/suspending user accounts. Thus, content moderation grants owners of social media platforms, who are private actors, substantial power in regulating the public discourse taking place on their platforms.

Content moderation becomes problematic when platforms exercise this power in a manner that leads to the arbitrary limitation of users’ freedom of expression. In a copyright law context, this would be the case when the content moderation system is designed and implemented in a manner that interprets the scope of a copyright exception (e.g., the parody exception) more restrictively than what is actually provided by law. This would, firstly, create an incorrect perception in the minds of users as to the extent to which the copyright exception permits the reuse of copyright-protected content, resulting in users exercising self-censorship and avoiding legally permitted uses of copyright-protected content for fear of their UGC being removed by the platform. Secondly, it results in the suppression of lawful speech through wrongful blocking/removal of lawful UGC that de jure falls within the scope of that copyright exception.Footnote 15

Such instances of wrongful suppression of lawful speech are unfortunately not infrequent. A good example is the controversy that arose in relation to a UGC video uploaded onto YouTube entitled “Buffy vs Edward: Twilight Remixed.”Footnote 16 This approximately 6-minute video comprised a collage of audiovisual clips from the movie series The Twilight Saga (2008–2012) and the movie “Buffy the Vampire Slayer” (1992). In the words of its creator, Jonathan Mcintosh, the video was:

[…] an example of transformative storytelling serving as a pro-feminist visual critique of Edward’s character and generally creepy behavior […] some of the more sexist gender roles and patriarchal Hollywood themes embedded in the Twilight saga are exposed—in hilarious ways […]. It also doubles as a metaphor for the ongoing battle between two opposing visions of gender roles in the 21st century. (Mcintosh, 2013)

According to the creator, the public’s response to the video, uploaded in 2009, was “swift, enthusiastic and overwhelming”: within the first 11 days it was viewed over 3 million times, and its subtitles were translated by volunteers into 30 different languages. The video was also used in “media studies courses, and gender studies curricula across the country” and “ignited countless online debates over the troubling ways stalking-type behavior is often framed as deeply romantic in movie and television narratives.”

In 2012, YouTube removed the video, pursuant to a complaint by Lionsgate Entertainment—the copyright owner of The Twilight Saga movie series—that the video infringed upon their copyright by making unauthorized use of copyright-protected audiovisual content taken from their movies. The creator’s YouTube account was suspended and a copyright “strike” placed on it. The creator’s defense that the video came within the “fair use” exception was rejected. Finally, in the face of significant public protest on the Internet against the video’s removal, Lionsgate conceded that the use of the audiovisual content did in fact come within the scope of the fair use exception, and YouTube re-posted the video.

As private actors, social media platforms do not have positive obligations to protect fundamental rights. Therefore, it is essential that legal and regulatory frameworks step in to ensure that content moderation systems are designed and implemented in a manner that can adequately safeguard user freedoms to benefit from copyright E&L. However, as will be discussed in the following section, the current EU legal framework on online copyright enforcement actuates the arbitrary limitation of users’ freedom of expression, thereby increasing the risk of wrongful suppression of lawful uses of copyright-protected content that come within the scope of legally granted E&L.

4 How Can the EU Legal Framework on Online Copyright Enforcement Undermine Democratic Discourse on Online Platforms?

The seminal provision of the EU legal framework on online copyright enforcement is Article 17 of the Copyright in the Digital Single Market Directive (DSM) [2019]. Article 17 DSM constitutes a lex specialis to the general intermediary liability framework provided under the Digital Services Act [2022] and determines the intermediary liability of online content-sharing service providers (OCSSPs) for copyright infringement arising from UGC.

An OCSSP is defined in Article 2(6) of the DSM as being:

[…] a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes.

Accordingly, owners of social media platforms fall within the scope of Article 17 DSM.

The doctrine of intermediary liability imputes liability to providers of online hosting services (such as search engines, streaming services, and social media platforms) for illegal acts committed via those services. Such wrongful acts could involve criminal offences (e.g., child sexual abuse, dissemination of terrorist content) and civil wrongs such as defamation and copyright infringement.

There are two main forms of intermediary liability. The hosting services provider incurs primary (direct) liability for illegal acts that it deliberately commits. For example, if Netflix streamed copyright-infringing content on its service, it would incur primary liability for the copyright infringement since it is Netflix that determines which content is made available to the public on its service. On the other hand, when the illegal act is committed by a user of the service, the hosting service provider would typically incur only secondary (indirect) liability on the basis that it facilitated or contributed to the commission of that act through the provision of its service.

Prior to the enactment of Article 17 DSM, social media platform owners were typically imputed secondary (indirect) liability for copyright infringements arising from content shared by users on their platforms. Unlike content distributors (such as Netflix) who directly engage in the sharing of content (and are therefore able to determine and control the types of content that are streamed on their services), social media platforms were regarded as “mere conduits” that simply provided digital spaces and the technological infrastructure to facilitate the sharing of content by others. Thus, social media platform owners would be imputed secondary (indirect) liability, while the user who shared the infringing content incurred primary liability. Even then, the platform owner would be able to avoid being held secondarily liable for copyright infringement by fulfilling the criteria necessary for coming within the intermediary liability “safe harbor” provided in Article 6 of the EU Digital Services Act [2022].Footnote 17 This reflected the traditional “negligence-based” approach to intermediary liability whereby secondary liability was only imputed if the social media platform owner had knowledge of the infringing content shared by a user and, after obtaining such knowledge, failed to stop that infringement from continuing by removing/blocking the infringing content. Thus, the avoidance of liability merely required the platform to respond to copyright infringements after they arose (ex post copyright enforcement).

Article 17 DSM marks a radical shift in the general intermediary liability framework of the EU. Firstly, under Article 17(1), OCSSPs are assigned primary liability for copyright infringement arising from content shared by users. This is on the basis that, although it is the user who uploads the content, it is the OCSSP who carries out the unauthorized communication to the publicFootnote 18 via the platform by granting the public access to the infringing content. Under Article 17(3), OCSSPs are also prevented from relying on the Article 6 DSA “safe harbor,” thereby exposing them to a higher risk of incurring this enhanced degree of liability for copyright infringements arising through UGC.

In order to avoid primary liability for copyright infringement, OCSSPs are required to make best efforts to obtain licenses from copyright holders for content uploaded by their users [Article 17(4)(a)]. Where such licenses cannot be obtained, they are subject to positive obligations to:

  • Make “best efforts” in accordance with high industry standards of professional diligence to ensure the unavailability of specific works for which copyright holders have provided relevant and necessary information [Article 17(4)(b)]

  • Upon receiving a sufficiently substantiated notice from copyright holders, act expeditiously to disable access to, or remove from their websites (i.e., platforms), the notified works and make “best efforts” to prevent the future upload of that content [Article 17(4)(c)]

By imposing positive obligations to make best efforts to “ensure the unavailability” of specific content and to “prevent the future upload” of such content, Article 17 DSM compels OCSSPs to engage in preventive monitoring and filtering (prior review) of content shared by users, with the aim of preventing copyright infringement from taking place on their platform. Thus, it compels OCSSPs to engage in ex ante copyright enforcement, not just ex post.

The heightened degree of liability coupled with positive obligations to engage in preventive monitoring and filtering compels OCSSPs to adopt a more stringent approach toward copyright enforcement through expansive monitoring and filtering of UGC with the aid of automated content moderation systems (ACMS) (Frosio & Mendis, 2020, p. 563). This in turn enhances risks of “collateral censorship” and “chilling effects” on speech.Footnote 19 At the prevailing level of technological sophistication, ACMS tend to be context-blind and have a relatively low capacity for comprehending nuances, contextual variations, and cultural connotations in human speech (Cowls et al., 2020). This means that they are often unable to distinguish correctly between infringing uses of copyright-protected content and lawful uses that come within the scope of copyright E&L (e.g., the exception for parody), thereby resulting in the wrongful removal/blocking of lawful speech.Footnote 20 Although Article 17(9) DSM requires platforms to put in place effective and expeditious complaints and redress mechanisms to allow users to challenge such wrongful removal/blocking, the fact that the redress mechanism only comes into play after the suppression of lawful speech (ex post redress mechanism) limits its efficacy as a mechanism for safeguarding users’ freedom of expression.

In comparison, Article 17 DSM places significantly less emphasis on the protection of users’ ability to benefit from copyright E&L. Although Article 17(7) DSM enunciates that OCSSP efforts to suppress infringing content should not result in preventing the availability of non-infringing content (as in the case where the use of copyright-protected content falls within the scope of a copyright exception), no explicit liability is imposed on OCSSPs for the wrongful suppression of such lawful uses. Nor are OCSSPs subject to enforceable obligations to safeguard user freedoms. However, Article 17(7) DSM underscores the importance of ensuring that users are able to rely on existing E&L for quotation, criticism, reviewFootnote 21 and parody, caricature, and pastiche.Footnote 22 This responsibility is assigned to Member States as opposed to OCSSPs.

Given the absence of positive enforceable obligations to protect users’ ability to benefit from copyright E&L, pursuant to a simple cost-benefit analysis, it is less costly for platforms to remove/block UGC that reuses copyright-protected content than to invest resources in assessing whether such use comes within the scope of a copyright exception. Thus, Article 17 DSM incentivizes OCSSPs to design and implement their content moderation systems to suppress even potentially copyright-infringing content, thereby increasing the risks of collateral censorship.

Therefore, the online copyright enforcement regime introduced by Article 17 is skewed in favor of protecting the interests of copyright owners with less emphasis being placed on the protection of user freedoms. This reflects the primarily economic goal of Article 17 DSM, which is to ensure the ability of copyright holders to obtain appropriate remuneration for uses of their content on OCSSP platforms.Footnote 23 Accordingly, Article 17 DSM is deeply entrenched in the narrow-utilitarian viewpoint (based on the neoclassical economic approach) that conceptualizes copyright’s primary function as being to incentivize the production of creative content by granting copyright holders a means of obtaining an adequate return on their intellectual/entrepreneurial investment. Conversely, the preservation of user freedoms is rendered peripheral to this core economic aim.

5 Proposals for Reform

So how could the existing EU legal framework on online copyright enforcement be revisited in order to effect a fair balance between the interests of copyright owners and the users of social media platforms, with the aim of fostering robust democratic discourse on these online spaces? In recent years, several proposals for reform have been put forward by copyright law scholars, which reflect different legal and regulatory strategies. This section outlines some of these proposed strategies.

5.1 Enhanced Regulatory Supervision

The pervasiveness of digital platforms and their growing economic and societal impact has led to increasing calls for the introduction of enhanced regulatory supervision by public authorities (Rieder & Hofmann, 2020). This strategy has also been proposed as a means of providing better protection to users’ freedom of expression in the implementation of Article 17 of the DSM.

For example, Geiger and Mangal advocate for the establishment of an independent EU-level public regulator to oversee the implementation of Article 17 DSM in a manner that can ensure predictable and consistent application of user rights in copyright enforcement across the EU (Geiger & Mangal, 2022).

Similarly, Cowls et al. propose the establishment of an Ombudsperson at the EU level, who is vested with powers to supervise the safeguarding of the freedom of expression by platforms and to provide advice and guidance to platforms in determining whether specific UGC could come within the scope of copyright E&L (Cowls et al., 2020).

The EU DSA (2022) introduces a regulatory framework involving closer supervision of platforms (including social media platforms) and includes several progressive measures which are inter alia designed to secure more effective protection of users’ freedom of expression within the content moderation process. For instance, the DSA imposes obligations on social media platforms to (1) provide periodic reports on the use of automated systems for content moderationFootnote 24 (including indicators of their accuracy, the possible rate of error, and the safeguards applied to minimize such errors); (2) carry out periodic risk assessments of systemic risks for freedom of expression, stemming from the design or functioning of automated and non-automated content moderation systems;Footnote 25 and (3) put in place reasonable, proportionate, and effective measures to mitigate these systemic risks.Footnote 26

Given the lex specialis nature of Article 17 DSM, it is unclear whether the obligations provided in the DSA could apply to social media platforms in the specific case of content moderation designed to address copyright infringement. Article 2(4)(b) read with Recital 11 stipulates that the DSA is without prejudice to EU law on copyright and related rights, including the DSM Directive, which should remain unaffected. This is in accordance with the general principle of lex specialis derogat legi generali (the special/specific law prevails over the general law). Nevertheless, these DSA provisions offer interesting insights into how enhanced regulatory supervision could assist in achieving a more transparent and fair application of content moderation systems.

In addition, Article 14(4) DSA obliges platforms to have “due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression […]” in imposing any restrictions in relation to the use of their service.Footnote 27 According to Article 14(1) DSA, such restrictions include “policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.” The term “due regard” is vague and ambiguous, and whether Article 14(4) DSA can be interpreted to impose positive obligations on platforms to design and implement their content moderation systems in a manner that safeguards the fundamental right to freedom of expression remains to be seen. Even if this were the case, whether Article 14 DSA would apply to aspects of content moderation that are aimed at monitoring and filtering copyright-infringing content in fulfilment of the obligations imposed by Article 17 DSM is unclear, given the lex specialis nature of that law (Mendis, 2023).

5.2 Fundamental Rights as an External Balancing Mechanism

In recent years, invoking the freedom of expression as a legal basis for safeguarding user freedoms to benefit from copyright E&L has gained significant traction within the EU copyright law discourse. Pursuant to this approach, the fundamental rights regime is used as an external “safety valve” to achieve a fair balance between the fundamental right to copyright guaranteed in Article 17(2) of the EUCFR and the fundamental right to freedom of expression guaranteed in Article 11 of the EUCFR.

Article 52(1) of the EUCFR stipulates that fundamental rights guaranteed therein are not absolute and must be balanced against each other as per the principle of proportionality, which requires that any limitation on the exercise of a fundamental right must (1) be provided for by law; (2) respect the essence of the fundamental right, which is subject to the limitation; and (3) be suitable, necessary, and proportionate stricto sensuFootnote 28 for achieving the objective pursued by its limitation.

This strategy was employed by Poland when it sought the annulment of Article 17(4)(b) and Article 17(4)(c) of the DSM Directive before the CJEU in the Poland v Council case.Footnote 29 Poland claimed that the preventive monitoring and filtering obligations imposed by these provisions (with the objective of protecting the fundamental right to copyright) limited users’ fundamental right to freedom of expression in a manner that did not comply with the proportionality principle in Article 52(1) of the EUCFR. The CJEU acknowledged that the contested provisions did in fact limit the freedom of expression of users but did so in a manner that was necessary and proportionate for safeguarding the fundamental right to copyright under Article 17(2) of the EUCFR, which rendered them compliant with Article 52(1) of the EUCFR. What is significant is that in reaching this decision, the CJEU adopted a narrow-utilitarian perception of copyright and accordingly observed that:

[…] in the context of online content sharing services […] copyright protection must necessarily be accompanied, to a certain extent, by a limitation on the exercise of the right of users to freedom of expression […].Footnote 30

The CJEU determination in the Poland v Council case illustrates a fundamental drawback in using the freedom of expression as an external balancing mechanism for safeguarding user freedoms to benefit from copyright E&L. By seeking to protect copyright E&L under the fundamental right to freedom of expression and pitting them against the fundamental right to copyright, it perpetuates the misguided conception of these user freedoms as being something external and even antithetical to copyright. Thus, the ability to benefit from copyright E&L is once again relegated to a position of secondary importance within the copyright law discourse, while the protection of copyright holders’ interests is reinforced as being its primary function.

5.3 The Need for a Paradigm Shift?

The need for a fundamental shift in the theoretical framework of EU copyright law based on the communicational (a.k.a. social planning) theory of copyright law has been advocated with the aim of recognizing and giving effect to copyright law’s potential to serve as a legal tool for fostering robust democratic discourse in the digital public sphere (Mendis, 2021).

The communicational theory of copyright law has been advanced by scholars such as Netanel (1996, 1998), Fisher (2001), Elkin-Koren (1996, 2010), and Sunder (2012). While affirming the role of copyright in preserving the incentives of authors to produce and distribute creative content, the communicational theory envisions an overarching democratic function for copyright law, which is the promotion of the discursive foundations for democratic culture and civic association (Netanel, 1996).

Thus, the communicational theory prescribes that protecting the interests of copyright owners must be tempered by the overarching aspiration of sustaining a participatory culture (Fisher, 2001), which in turn necessitates the adequate preservation of user freedoms to engage with copyright-protected content for purposes of democratic discourse. As noted by Netanel:

Copyright-holder rights should be sufficiently robust to support copyright's democracy-enhancing functions, but not so broad and unbending as to chill expressive diversity and hinder the exchange of information and ideas. (Netanel, 1998, p.220)

Espousing the communicational theory as a theoretical framework for EU copyright law would bring about a paradigm shift that enables the protection of democratic discourse to be seen as something that is endogenous—and in fact fundamental—to copyright’s purpose. This paradigm shift would provide a solid normative basis for re-imagining the EU legal framework on online copyright enforcement to increase its fitness for preserving and promoting copyright law’s democracy-enhancing function.

Firstly, it would entail a re-affirmation that the protection of user freedoms to benefit from copyright E&L (particularly those E&L, such as quotation and parody, that are vital for safeguarding users’ freedom of expression) is central to copyright law’s purpose and as such should be granted the same weight and importance as the protection of the economic rights of copyright owners. This would provide a normative basis for courts to engage in a more expansive teleological interpretation of copyright E&L with a view to advancing the democracy-enhancing function of copyright law.

Secondly, in view of the character of social media platforms as a key component of the digital public sphere, it would pave the way for acknowledging the role played by OCSSPs as facilitators and enablers of democratic discourse and the potency of content moderation systems to direct and influence public discourse on social media platforms. This would provide a basis for imposing positive obligations on OCSSPs to ensure that content moderation systems are designed and implemented in a manner that adequately protects user freedoms.

6 Conclusions

The pervasiveness of social media platforms and their growing influence on every aspect of our lives mean that issues concerning their regulation will remain high on policy agendas for years to come. As digital technology continues to advance, it is likely that the nature of the dialogic interaction taking place in these digital spaces will also evolve and transform. Emerging technologies (e.g., artificial intelligence (AI) and virtual reality) and business models (e.g., live streaming on social media, influencer marketing) are already having a powerful influence on the nature and scope of discourse in the digital public sphere, thereby giving rise to a host of new regulatory problems. Copyright law is already struggling to address several such issues, for instance, the copyright law implications of AI-generated discourse and the need to re-think the private/public and commercial/non-commercial use distinctions in copyright law. Addressing these issues demands a holistic and interdisciplinary approach, as any legal or regulatory strategy or mechanism aimed at resolving them is likely to have far-reaching social, political, cultural, and economic implications, and a powerful impact on our digital sovereignty and on the protection of our fundamental rights and freedoms in the digital public sphere.

Discussion Questions for Students and Their Teachers

  1. Should social media platforms be designated as public utilities/infrastructures notwithstanding their private ownership? What would be the regulatory implications of such a designation?

  2. Should positive obligations to protect users’ fundamental rights be imposed on the owners of social media platforms? If so, does the existing EU law framework allow positive obligations to protect fundamental rights to be imputed to private actors?

  3. Should Article 14(4) of the EU DSA [2022] apply to copyright enforcement on social media platforms, notwithstanding the lex specialis nature of Article 17 DSM? To what extent would such an application contribute toward safeguarding users’ fundamental right to freedom of expression in copyright enforcement?

Learning Resources for Students

  1. Frosio, G. and Mendis, S. (2020). ‘Monitoring and Filtering: European Reform or Global Trend?’. In: G. Frosio ed., The Oxford Handbook of Intermediary Liability. Oxford: OUP (pp. 544–565). Available at: https://doi.org/10.1093/oxfordhb/9780198837138.013.28

    This chapter explores the evolution of the “Internet threat” discourse and demonstrates how Article 17 of the EU Copyright in the Digital Single Market (DSM) Directive [2019] is rooted within this discourse. It analyzes the impact of Article 17 DSM on users’ fundamental rights, particularly in the light of its propensity to motivate wider use of automated content moderation systems.

  2. Friedmann, D. (2023). ‘Digital Single Market, First Stop to The Metaverse: Counterlife of Copyright Protection Wanted’. In: K. Mathis and A. Tor eds., Law and Economics of the Digital Transformation. Springer. Available at: https://doi.org/10.1007/978-3-031-25059-0_8

This paper demonstrates how the legislative shift toward charging platform owners with a higher standard of liability (strict liability) for copyright infringement incentivizes the adoption of automated content moderation systems and analyzes its implications for user freedoms in virtual worlds (the metaverse). Building upon the “fair use by design” concept, it discusses strategies for designing automated content moderation systems in a way that would enable them to safeguard legitimate uses of copyright-protected content, such as uses falling within the scope of copyright exceptions and limitations.

  3. Geiger, C. and Jütte, B.J. (2021). ‘Platform Liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match’. GRUR International, 70(6), 517–543. Available at: https://doi.org/10.1093/grurint/ikab037

    This paper analyzes the impact of Article 17 of the EU Copyright in the Digital Single Market (DSM) Directive on fundamental rights to freedom of expression and information; freedom of the arts; freedom to conduct a business; data protection, privacy, and family life; right to property; and right to an effective remedy and to a fair trial. It also analyzes Article 17 DSM with regard to its compatibility with general principles of EU law such as proportionality and legal certainty. It demonstrates the difficulty of striking a fair balance between these different fundamental rights within the normative framework of Article 17 DSM.

  4. De Gregorio, G. (2022). ‘Digital Constitutionalism and Freedom of Expression’. In: De Gregorio, G., Digital Constitutionalism in Europe: Reframing Rights and Powers in the Algorithmic Society. Cambridge: CUP. Available at: https://www.cambridge.org/core/books/digital-constitutionalism-in-europe/digital-constitutionalism-and-freedom-of-expression/72ACEF48324D180E95BBD456E52E9C96

This chapter explores the challenges posed by the private enforcement of the fundamental right to freedom of expression by online platforms in the algorithmic public sphere. It outlines how EU legislators and the courts have entered a new phase of digital constitutionalism in reining in platform power and addressing the challenges of content moderation.

  5. Mendis, S. (2023, May 18). ‘The Magic Bullet That Isn’t! The Limited Efficacy of Article 14 DSA in Safeguarding Copyright Exceptions to Quotation and Parody on Social Media Platforms’. Verfassungsblog. Available at: https://verfassungsblog.de/no-magic-bullet/

    This blog article explores the potential of Article 14 of the EU Digital Service Act (DSA, 2022) in effectively safeguarding user rights to benefit from copyright exceptions to quotation and parody. It analyzes whether Article 14 DSA could apply to obligations imposed on online content-sharing service providers (OCSSPs) under Article 17 of the EU Copyright in the Digital Single Market (DSM) Directive and, if so, whether Article 14 DSA could lead to these exceptions being rendered enforceable as user rights in EU law.