
Regulating freedom of expression on online platforms? Poland’s action to annul Article 17 of the Directive on Copyright in the Digital Single Market


3 FEBRUARY 2021 / BY BERND JUSTIN JÜTTE AND CHRISTOPHE GEIGER



In an action for annulment concerning Article 17 of the Directive on copyright and related rights in the Digital Single Market (Directive (EU) 2019/790, CDSM Directive), currently pending before the Court of Justice of the European Union (CJEU), the Republic of Poland argues that an obligation imposed on specific online platforms, which effectively requires them to filter content uploaded by their users by automated means, infringes the right to freedom of expression. After the CJEU had rejected indirect challenges to existing copyright rules in C-476/17, Pelham (see here and here), C-516/17, Spiegel Online (see here and here) and C-469/17, Funke Medien (on the AG Opinions of all three cases see here), this action for annulment (C-401/19, Poland v Parliament and Council) constitutes another escalation in the struggle for a balanced European copyright law. The AG’s Opinion is expected in spring 2021.


Background


The CDSM Directive was highly contested and passed the vote in the Council in May 2019 with support from Germany, France (see here) and the UK (which has since withdrawn from the EU), but against strong opposition from Sweden, Finland, Italy, Poland and two of the Benelux countries (with Belgium abstaining). It is Poland that has challenged one of the most contested (and most heavily lobbied) provisions of the Directive (for an overview of the substantive provisions of the Directive see here and here). Article 17 (formerly Article 13 in the draft proposal) has been opposed by academics and public interest groups, and mobilized millions of European citizens to take to the streets under the slogan #SaveYourInternet. Prominent NGOs supported the protests, amongst them European Digital Rights, the Electronic Frontier Foundation, Communia and Creative Commons.


Article 17 aims to address what the content industry has called the ‘value gap’ on the internet: uploads of copyright-protected works to online platforms by their users compete with commercial offers (such as Spotify and Netflix) and reduce the revenues of artists and producers (see for an early critique here). In practice, according to rightholders, a ‘value gap’ exists if a protected work (for example a music video or a photograph) is uploaded by a user of an online service without authorization from the owner of the copyright. Those who benefit from unauthorized uploads, and indeed from uploads of any kind, are the users, whose experience on platforms such as Facebook, YouTube and Instagram improves, and the platforms themselves, whose business model is based on an ever-increasing amount of shared content, which generates interaction, data and, as a result, advertising revenue.



Article 17 is designed to give rightholders stronger control over the use of their works, and over the potential revenue created by streams and downloads of their works, on specific platforms, so-called online content-sharing service providers (OCSSPs). The notion of an OCSSP is defined in Article 2 of the CDSM Directive as ‘a provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes’.


Article 17(1) employs two mechanisms to ensure that rightholders can better control the uses of copyright works on the platforms of OCSSPs. First, it makes OCSSPs directly liable for the content uploaded by their users, by stating that OCSSPs perform an act of communication to the public when their users upload content without authorization from the rightholder. Prior to this ‘clarification’ (see Recital 64 CDSM Directive, see critically here, here, here and here), the scope of the right of communication to the public under Article 3 of the Information Society Directive (Directive 2001/29/EC, InfoSoc Directive) had already been interpreted extensively by the CJEU to include the sale of a media player with pre-installed third-party software that enables its users to access unlawful online-streaming offers (C-527/15, Filmspeler, see for example here), and the operation of a website which enabled its users to identify works unlawfully made available by other users and to download them with a torrent client software (C-610/15, Ziggo, see here; see also the AG Opinion in C-682/18 and C-683/18, YouTube). To avoid incurring liability for unauthorized content, OCSSPs must acquire authorization, for example a license, for hosting such content. Second, in the absence of such authorization, OCSSPs are liable for the content uploaded by their users unless they can demonstrate that they comply with certain conditions.


With its challenge under Article 263 TFEU, Poland seeks to annul, partially or in its entirety, Article 17 of the CDSM Directive. The Polish government claims that Article 17(4) infringes the right to freedom of expression by requiring OCSSPs to block and filter content posted online, including non-infringing material.


Intermediary liability before and after the CDSM Directive


Before the introduction of Article 17 CDSM Directive, OCSSPs were exempted from liability under the horizontal ‘safe harbour’ regime of the E-Commerce Directive (Directive 2000/31/EC, ECD) for certain acts committed by their users. Under Article 14 ECD, intermediary service providers (ISPs) do not incur liability for such acts of their users if they do not have actual knowledge of the illegal activity and, once they gain such knowledge, for example through notification, act expeditiously to remove or disable access to the infringing content (C-324/09, L’Oréal v eBay).


In relation to copyright, the CJEU has already ruled that ISPs, such as access providers, must effectively disable access to infringing content if such content has been clearly identified and certain additional conditions are met (C-314/12, UPC Telekabel, see also here). However, pursuant to Article 15 ECD, an online platform cannot be ordered, by way of an injunction, to block or filter content indiscriminately (C-360/10, SABAM v Netlog). The Court stressed the effect of such a general filtering obligation on the fundamental rights of rightholders and of users of such services: while rightholders of works protected by copyright (a property right) enjoy protection under Article 17(2) of the EU Charter of Fundamental Rights (EU Charter), users are particularly affected in their freedom to impart and receive information (Article 11 EU Charter).


The Polish Challenge


At the heart of the challenge lies Article 17(4) CDSM Directive, which stipulates that in the absence of prior authorization (e.g. a license), OCSSPs ‘shall be liable for unauthorised acts of communication to the public’. Article 17(4) states that, to escape this liability, OCSSPs must demonstrate that they have (and this is worth repeating):


“(a) made best efforts to obtain an authorisation, and


(b) made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event


(c) acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).” (emphasis added)


Either before or after infringements have taken place, rightholders can provide OCSSPs with information pertaining to their protected works; the latter will then have to ensure that these specific works cannot be uploaded in the future. This obligation is similar to that established in C-18/18, Glawischnig-Piesczek, where the Court ruled that a hosting provider can be required, by way of an injunction, to monitor and remove identical and equivalent content that has already been declared illegal (it is, however, important to stress that this case concerned defamation, not copyright).


To ensure the unavailability of repeat uploads of infringing content, hosting platforms must necessarily install some form of automated, algorithm-based filter, with the result that such content cannot be uploaded. These filters would apply to all uploads that are fully identical to the works for which rightholders have provided information, for example an upload of a particular song or video. They would also apply to uploads that contain (possibly very short) parts of a relevant work, or in which the work has been altered, as the simplified sketch below illustrates.
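How such matching works can be made concrete with a deliberately minimal sketch in Python. It is an illustration only, not any platform’s actual system: production tools use robust perceptual fingerprints over audio and video frames that also survive alterations, whereas this toy matcher catches only exact and partial copies. The chunk size, threshold and sample data are invented for the example.

```python
import hashlib

# Toy parameters; real systems tune these and use perceptual (not exact) hashes.
CHUNK = 16        # bytes per fingerprint chunk
THRESHOLD = 0.10  # flag the upload if 10% or more of its chunks match a reference

def fingerprint(data: bytes) -> set[str]:
    """Hash fixed-size, non-overlapping chunks of the content."""
    return {
        hashlib.sha256(data[i:i + CHUNK]).hexdigest()
        for i in range(0, len(data) - CHUNK + 1, CHUNK)
    }

def match_score(upload: bytes, reference_prints: set[str]) -> float:
    """Fraction of the upload's chunks that also occur in the reference work."""
    chunks = fingerprint(upload)
    return len(chunks & reference_prints) / len(chunks) if chunks else 0.0

# Stand-in for the 'relevant and necessary information' a rightholder
# provides under Article 17(4)(b): a fingerprint of the protected work.
work = b"melody-A melody-B melody-C melody-D " * 8
reference_prints = fingerprint(work)

uploads = {
    "identical re-upload": work,
    "short excerpt + commentary": work[:64] + b" ... critical commentary on the excerpt ...",
    "unrelated original work": b"an entirely different creation, no overlap " * 4,
}

for label, upload in uploads.items():
    score = match_score(upload, reference_prints)
    print(f"{label}: score={score:.2f} -> {'BLOCK' if score >= THRESHOLD else 'allow'}")

# Note: the matcher sees only overlapping bytes. It blocks the short excerpt
# without knowing whether it is an infringement or a lawful quotation or parody.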


Poland argues that an obligation requiring OCSSPs to filter out uploads that contain works, or parts of works, protected by copyright infringes the right to freedom of expression of the users of such services. Because of the volume of content that is uploaded to platforms that would fall within the scope of Article 17 (e.g. YouTube, Facebook and Dailymotion), the obligations arising under the provision will inevitably require the installation of automated filtering mechanisms (see on the limitations of this technology here).


Algorithmic content filtering and freedom of expression


The right to freedom of expression as laid down in Article 11 EU Charter includes the right to receive and impart information. Uploading content, even of an illegal nature, is covered by the freedom to impart information, and users, by accessing such content, exercise their right to receive information (see for example ECtHR, Appl. No. 40397/12, Neij and Sunde Kolmisoppi v Sweden). To strike a balance between the interests of users and those of rightholders, only content that actually infringes copyright should be filtered or monitored. However, by obliging OCSSPs to monitor and prevent uploads of unauthorized content (‘to ensure the unavailability of specific works’), this balance is altered.


The fundamental problem with the AI-powered technology employed to ensure the unavailability of content is that it is not able to distinguish between lawful and unlawful content (see here). As a consequence, the removal of lawful content becomes collateral damage of attempts to remove unlawful content. This phenomenon is referred to as over-blocking, and its result is that exercises of the right to freedom of expression are stifled. For example, one and the same picture, or a slightly altered version of it, could be used as a mere (infringing) reproduction or for purposes of quotation or parody. The latter two uses are expressly permitted by the InfoSoc Directive (Articles 5(3)(c) & (k)) and have been implemented in most national laws (see on the fundamental importance of such uses e.g. Funke Medien, para. 58).


A use of automated filters that ensures the unavailability of unlawful content but cannot guarantee the unfettered exercise of the right to freedom of expression through the upload of lawful content would, arguably, not reflect a fair balance. In order to mitigate the likely effects of over-blocking resulting from the obligations for OCSSPs arising under Article 17 CDSM Directive, Article 17(7) and (9) should ensure that the exercise by users of OCSSPs of certain exceptions and limitations to copyright is safeguarded and that appropriate complaint mechanisms are available to those users.


Article 17 constitutes a metamorphosis from a ‘knowledge and take-down’ regime, as partially operated under the E-Commerce rules, to a preventive regime. The difference lies in the distribution of the burden. Under the old E-Commerce system, infringing content would, unless prevented by a targeted and specific filtering mechanism, be uploaded and then removed after a notice from the rightholder, and future uploads of the same content could be prevented (‘notice and take down’ and ‘notice and stay down’). Under the regime of the CDSM Directive, OCSSPs have an interest in avoiding liability under Article 17(1) and are incentivized to filter, at a much larger scale, all notified content for which they have not acquired a license. Users who upload lawful content which nevertheless gets caught in the filtering net would have to notify the platform themselves. This could create chilling effects for users and therefore constitute a barrier to the unhindered exercise of freedom of expression.


Fundamental conflicts


The tensions that Article 17 creates are, however, far too complex to be reduced to the right to freedom of expression alone. Indeed, a number of recently published studies (see here, here and here) examine the variety of implications Article 17 has for the balance of fundamental rights.


The multitude of rights and (sometimes conflicting) obligations established by Article 17 (a provision with 10 paragraphs and unclear wording) translates into a complex web of fundamental rights conflicts, not all of which can be examined here (see for a more thorough assessment by the authors here). But because the CJEU cannot ignore the various fundamental rights, beyond Article 11 EU Charter, that will be affected, some of these conflicts should at least be highlighted.


First, the enforcement of copyright under Article 17 will be conducted largely by intermediaries in cooperation with rightholders (see Recital 66). Entrusting this delicate task to private companies could remove the control over the exercise of users’ rights from public scrutiny. Although Article 17(9) foresees that users should have recourse to dispute settlement mechanisms, including out-of-court mechanisms, crucial decisions (to block or not to block) will be made by private enterprises. This has significant repercussions on the right to a fair trial and, in general, the right to an effective remedy under Article 47 EU Charter. Article 17 neither sets out concrete procedural safeguards, nor does it guarantee, beyond the vague requirement that decisions to remove content be subject to ‘human review’, that these decisions will be made impartially.


Second, to fulfil the obligation to make certain content unavailable, but also to ensure that lawful uses can still be made on their services, OCSSPs will have to make difficult decisions and to invest significantly in infrastructure and human resources. The right to conduct a business under Article 16 EU Charter must therefore also be taken into consideration (for a more detailed analysis see here and here).


Third, the right to property of rightholders cannot be eliminated from this delicate equation. They have a vested interest in the protection of their works and related rights on hosting platforms. After all, the initial purpose of Article 17 CDSM Directive was to improve the position of rightholders and their interests protected by the fundamental right to (intellectual) property under Article 17(2) EU Charter.


A more general problem with Article 17 is that it is incredibly vague. During the process of implementation (the deadline for transposition of the Directive expires on 7 June 2021, and therefore before the final judgment in the Polish challenge will be handed down), it has become apparent that Member States have very different ideas on how Article 17, including all its elements, should be implemented. This will most likely result in differing national appreciations of the balance between the fundamental rights concerned (see here and here). However, the CJEU has most recently stated that the requirement ‘that any limitation on the exercise of fundamental rights must be provided for by law implies that the legal basis which permits the interference with those rights must itself define the scope of the limitation on the exercise of the right concerned’ (C-311/18, Schrems II, para. 175, see here, here and here).


Outlook


The hearing on this action for annulment was held in November 2020, and it will most likely take until late spring before the Opinion of Advocate General Saugmandsgaard Øe is published. However, the main rifts, and the questions that will have to be answered, became apparent during the hearing (see here). One strand of questions sought to clarify the normative balance within Article 17, in particular the relation between the interests of rightholders and those of users. The CDSM Directive is not clear on whether a possibly infringing upload should be blocked initially or whether content should remain online until it has been proved to be unlawful; or whether there is some middle ground that allows some content to be blocked initially and other content to be removed only after some sort of procedure. This is linked to the question of how far automated mechanisms are able to perform the tasks of judges, i.e. to determine whether a particular upload is infringing or not.


But these questions go to the heart of the fundamental rights conflict, and it is uncomfortable to imagine that these important decisions will be left, to a large extent, to private actors (the authors suggest delegating some of these tasks to an independent body to ensure fundamental rights compliance and oversight, see here).


A concern only raised implicitly is the vagueness of Article 17 and its inherent contradictions. Diverging national transpositions will make compliance and enforcement management significantly more difficult – and expensive – for platforms that operate in the entire EU and beyond. This arguably fails to achieve the objective of market harmonization (see Recital 2 CDSM Directive). Whether the Court will venture into this treacherous territory remains to be seen, but it should not overlook the implications of the lack of legal certainty on the fundamental rights of users and online platforms.


There are thus many reasons that should lead the Court to annul the contested provision and give the European legislator the opportunity to elaborate a revised liability regime for platforms and to implement it in a fundamental rights-compliant manner. This could be done without difficulty in the context of the ongoing discussion of the proposed Digital Services Act (see here), whose main purpose is in fact to regulate the activities and responsibilities of platforms. The proposed regulation would be a great opportunity to create a unified standard for illegal content, coupled with appropriate independent EU institutional control, an opportunity that has, so far, been overlooked by the EU legislator.


Source: https://europeanlawblog.eu/
