ECLI:EU:C:2021:613

OPINION OF ADVOCATE GENERAL

SAUGMANDSGAARD ØE

delivered on 15 July 2021 (1)

Case C‑401/19

Republic of Poland

v

European Parliament,

Council of the European Union

(Action for annulment – Directive (EU) 2019/790 – Copyright and related rights – Use of protected content by online content-sharing service providers – Communication to the public – Liability of those providers – Article 17 – Exemption from liability – Article 17(4)(b) and (c), in fine – Filtering of content uploaded by users – Freedom of expression and information – Charter of Fundamental Rights of the European Union – Article 11(1) – Compatibility – Safeguards governing such filtering)

I.      Introduction

1.        By the present action, brought on the basis of Article 263 TFEU, the Republic of Poland asks the Court, principally, to annul Article 17(4)(b) and (c), in fine, of Directive (EU) 2019/790 of the European Parliament and of the Council of 17 April 2019 on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (2) and, in the alternative, to annul Article 17 in its entirety.

2.        This action asks the Court to examine the question of the liability borne by providers of online sharing services when content that is protected by copyright or related rights is uploaded (3) by users of those services.

3.        This issue has already been brought to the Court’s attention in Joined Cases C‑682/18, YouTube, and C‑683/18, Cyando, from the perspective of the framework provided by Directive 2000/31/EC on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) (4) and by Directive 2001/29/EC on the harmonisation of certain aspects of copyright and related rights in the information society. (5) This case involves examining Article 17 of Directive 2019/790, which provides for a new liability regime applicable to online sharing service providers.

4.        As I will explain in this Opinion, that provision imposes on those providers obligations to monitor the content posted by the users of their services in order to prevent the uploading of protected works and subject matter which the rightholders do not wish to make accessible on those services. Such preventive monitoring will, as a general rule, take the form of filtering that content using software tools.

5.        That filtering raises complex questions, put forward by the applicant, with regard to the freedom of expression and information of users of sharing services, guaranteed in Article 11 of the Charter of Fundamental Rights of the European Union (‘the Charter’). Further to its judgments in Scarlet Extended, (6) SABAM (7) and Glawischnig-Piesczek, (8) the Court will have to determine whether and, as the case may be, in what circumstances such filtering is compatible with that freedom. It will have to take account of the advantages, but also the risks, of such filtering and, in that connection, ensure that a ‘fair balance’ is maintained between, on the one hand, the interest of rightholders in the effective protection of their intellectual property and, on the other, the interest of those users, and the general public, in the free flow of information online.

6.        In this Opinion, I shall explain that, in my view, the EU legislature may, while observing freedom of expression, impose certain monitoring and filtering obligations on certain online intermediaries, provided, however, that those obligations are circumscribed by sufficient safeguards to minimise the impact of such filtering on that freedom. Since Article 17 of Directive 2019/790 contains, in my view, such safeguards, I shall propose that the Court should rule that that provision is valid and, consequently, that it should dismiss the action brought by the Republic of Poland. (9)

II.    Legal framework

A.      Directive 2000/31

7.        Article 14 of Directive 2000/31, entitled ‘Hosting’, provides, in paragraph 1:

‘Where an information society service is provided that consists of the storage of information provided by a recipient of the service, Member States shall ensure that the service provider is not liable for the information stored at the request of a recipient of the service, on condition that:

(a)      the provider does not have actual knowledge of illegal activity or information and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or information is apparent; or

(b)      the provider, upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the information.’

8.        Article 15 of that directive, entitled ‘No general obligation to monitor’, provides, in paragraph 1:

‘Member States shall not impose a general obligation on providers, when providing the services covered by Articles 12, 13 and 14, to monitor the information which they transmit or store, nor a general obligation actively to seek facts or circumstances indicating illegal activity.’

B.      Directive 2001/29

9.        Article 3 of Directive 2001/29, entitled ‘Right of communication to the public of works and right of making available to the public other subject-matter’, provides, in paragraphs 1 and 2:

‘1.      Member States shall provide authors with the exclusive right to authorise or prohibit any communication to the public of their works, by wire or wireless means, including the making available to the public of their works in such a way that members of the public may access them from a place and at a time individually chosen by them.

2.      Member States shall provide for the exclusive right to authorise or prohibit the making available to the public, by wire or wireless means, in such a way that members of the public may access them from a place and at a time individually chosen by them:

(a)      for performers, of fixations of their performances;

(b)      for phonogram producers, of their phonograms;

(c)      for the producers of the first fixations of films, of the original and copies of their films;

(d)      for broadcasting organisations, of fixations of their broadcasts, whether those broadcasts are transmitted by wire or over the air, including by cable or satellite.’

10.      Article 5 of that directive, entitled ‘Exceptions and limitations’, provides, in paragraph 3:

‘Member States may provide for exceptions or limitations to the rights provided for in Articles 2 and 3 in the following cases:

(d)      quotations for purposes such as criticism or review …;

(k)      use for the purpose of caricature, parody or pastiche;

…’

C.      Directive 2019/790

11.      Article 17 of Directive 2019/790, entitled ‘Use of protected content by online content-sharing service providers’, provides:

‘1.      Member States shall provide that an online content-sharing service provider performs an act of communication to the public or an act of making available to the public for the purposes of this Directive when it gives the public access to copyright-protected works or other protected subject matter uploaded by its users.

An online content-sharing service provider shall therefore obtain an authorisation from the rightholders referred to in Article 3(1) and (2) of [Directive 2001/29], for instance by concluding a licensing agreement, in order to communicate to the public or make available to the public works or other subject matter.

2.      Member States shall provide that, where an online content-sharing service provider obtains an authorisation, for instance by concluding a licensing agreement, that authorisation shall also cover acts carried out by users of the services falling within the scope of Article 3 of [Directive 2001/29] when they are not acting on a commercial basis or where their activity does not generate significant revenues.

3.      When an online content-sharing service provider performs an act of communication to the public or an act of making available to the public under the conditions laid down in this Directive, the limitation of liability established in Article 14(1) of [Directive 2000/31] shall not apply to the situations covered by this Article.

The first subparagraph of this paragraph shall not affect the possible application of Article 14(1) of [Directive 2000/31] to those service providers for purposes falling outside the scope of this Directive.

4.      If no authorisation is granted, online content-sharing service providers shall be liable for unauthorised acts of communication to the public, including making available to the public, of copyright-protected works and other subject matter, unless the service providers demonstrate that they have:

(a)      made best efforts to obtain an authorisation, and

(b)      made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event

(c)      acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).

5.      In determining whether the service provider has complied with its obligations under paragraph 4, and in light of the principle of proportionality, the following elements, among others, shall be taken into account:

(a)      the type, the audience and the size of the service and the type of works or other subject matter uploaded by the users of the service; and

(b)      the availability of suitable and effective means and their cost for service providers.

7.      The cooperation between online content-sharing service providers and rightholders shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or limitation.

Member States shall ensure that users in each Member State are able to rely on any of the following existing exceptions or limitations when uploading and making available content generated by users on online content-sharing services:

(a)      quotation, criticism, review;

(b)      use for the purpose of caricature, parody or pastiche.

8.      The application of this Article shall not lead to any general monitoring obligation.

Member States shall provide that online content-sharing service providers provide rightholders, at their request, with adequate information on the functioning of their practices with regard to the cooperation referred to in paragraph 4 and, where licensing agreements are concluded between service providers and rightholders, information on the use of content covered by the agreements.

9.      Member States shall provide that online content-sharing service providers put in place an effective and expeditious complaint and redress mechanism that is available to users of their services in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them.

Where rightholders request to have access to their specific works or other subject matter disabled or to have those works or other subject matter removed, they shall duly justify the reasons for their requests. Complaints submitted under the mechanism provided for in the first subparagraph shall be processed without undue delay, and decisions to disable access to or remove uploaded content shall be subject to human review. Member States shall also ensure that out-of-court redress mechanisms are available for the settlement of disputes. Such mechanisms shall enable disputes to be settled impartially and shall not deprive the user of the legal protection afforded by national law, without prejudice to the rights of users to have recourse to efficient judicial remedies. In particular, Member States shall ensure that users have access to a court or another relevant judicial authority to assert the use of an exception or limitation to copyright and related rights.

This Directive shall in no way affect legitimate uses, such as uses under exceptions or limitations provided for in Union law …

Online content-sharing service providers shall inform their users in their terms and conditions that they can use works and other subject matter under exceptions or limitations to copyright and related rights provided for in Union law.

10.      As of 6 June 2019 the Commission, in cooperation with the Member States, shall organise stakeholder dialogues to discuss best practices for cooperation between online content-sharing service providers and rightholders. The Commission shall, in consultation with online content-sharing service providers, rightholders, users’ organisations and other relevant stakeholders, and taking into account the results of the stakeholder dialogues, issue guidance on the application of this Article, in particular regarding the cooperation referred to in paragraph 4. When discussing best practices, special account shall be taken, among other things, of the need to balance fundamental rights and of the use of exceptions and limitations. For the purpose of the stakeholder dialogues, users’ organisations shall have access to adequate information from online content-sharing service providers on the functioning of their practices with regard to paragraph 4.’

III. Facts giving rise to the present action

A.      The proposal for a directive on copyright in the Digital Single Market

12.      On 14 September 2016, the European Commission submitted a proposal for a directive on copyright in the Digital Single Market. (10) The aim of that proposal was to adapt the EU rules in the field of literary and artistic property – copyright and rights related to copyright – in particular Directive 2001/29, to the evolution of digital technologies. (11) It also sought further to harmonise that area in a way that, whilst continuing to guarantee a high level of protection of intellectual property, ensures that creative content is widely available throughout the European Union and maintains a ‘fair balance’ with other public interests in the digital environment.

13.      In that context, Article 13 of that proposal sought, more specifically, to remedy the ‘Value Gap’, namely the perceived gap between the value that online sharing service providers derive from protected works and subject matter and the revenue they distribute to rightholders. (12)

14.      In that regard, it should be recalled that the services in question, which are characteristic of interactive ‘Web 2.0’ services, and of which YouTube, (13) Soundcloud and Pinterest are the best-known examples, enable anyone automatically to upload the content they wish, without prior selection by their providers. The content uploaded by the users of those services – commonly referred to as ‘user-generated content’ or ‘user-uploaded content’ – can then be streamed from the websites or applications for smart devices associated with those services, that viewing being facilitated by indexing, search and recommendation functionalities which are generally found on those websites and applications – in most cases free of charge – as the providers of those services are usually remunerated from the sale of advertising space. A huge amount of content (14) is thus made available to the public on the Internet, including a significant proportion of works and other protected subject matter.

15.      Since 2015, rightholders, and in particular those in the music industry, have claimed that, while those sharing services are, de facto, an important part of the online distribution of protected works and other subject matter and their providers bring in considerable advertising revenue from them, they do not remunerate rightholders fairly. The revenue that those providers distribute to those same rightholders is insignificant compared to the amount that providers of music streaming services – such as Spotify – pay them, even though those two types of service are often seen by consumers as equivalent sources of access to that subject matter. This is also said to lead to unfair competition between those services. (15)

16.      In order to understand the ‘Value Gap’ argument properly, we must look back to the legal framework which applied before the adoption of Directive 2019/790 and the uncertainties surrounding it.

17.      First, Article 3(1) of Directive 2001/29 grants authors the exclusive right to authorise or prohibit any ‘communication to the public’ of their works, including the ‘making available to the public’ of those works in such a way that members of the public may access them from a place and at a time individually chosen by them. (16) Similar rights are granted to holders of related rights over their protected subject matter (17) under Article 3(2) of that directive. (18) In principle, (19) a third party cannot therefore ‘communicate to the public’ a work or subject matter without having, beforehand, obtained authorisation from the rightholder or rightholders of that work or subject matter, which generally takes the form of a licensing agreement, granted in return for remuneration. (20) While it has always been clear that the uploading, by a user, of a work or protected subject matter to a sharing service constitutes an act of ‘communication to the public’ requiring such prior authorisation, the question whether the providers of those services should themselves conclude licensing agreements and remunerate rightholders had become a subject of controversy between those providers and rightholders. (21)

18.      Secondly, Article 14 of Directive 2000/31 contains a ‘safe harbour’ for providers of an information society service consisting of the storage of information provided by third parties. That article provides, in essence, that the provider of such a service is exempt from any liability which may arise (22) from illegal content which it stores at the request of the users of that service, provided that it is unaware of it or, where it becomes aware, it removes it expeditiously. Again, there was controversy as to whether the providers of online sharing services could benefit from that exemption in the field of copyright. (23)

19.      That controversy was all the greater since the Court had not had occasion to settle those disputes until now. (24)

20.      In that context, some sharing service providers had simply refused to conclude licensing agreements with rightholders for the protected works and subject matter uploaded by users of their services, taking the view that they were not required to do so. Other providers had nevertheless agreed to enter into such agreements, but the terms of those agreements were not fair, according to rightholders, as they were unable to negotiate on an equal footing with those service providers. (25)

21.      Against that background, the proposal for a directive was aimed, first, at enabling rightholders to obtain better remuneration for the use of their works and other protected subject matter on online sharing services, by affirming the obligation, for the providers of those services, to conclude licensing agreements with those rightholders. (26)

22.      Secondly, that proposal aimed to enable rightholders to control more easily the use of their works and protected subject matter on the services in question. In that regard, Article 13 of the proposal for a directive required the providers of such services, in essence, to use automatic content recognition tools that had already been introduced, voluntarily, by some of them, that is to say, IT tools, the operation of which will be described below, (27) which may be used inter alia when a user uploads content – hence such tools commonly being called an ‘upload filter’ – in order to verify, through an automated process, whether that content includes a work or other protected subject matter and, if that is the case, to block its dissemination. (28)
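
Although Article 13 of the proposal did not prescribe any particular implementation, the logic of such an ‘upload filter’ may be illustrated schematically. The following minimal sketch, in Python, is purely illustrative: the function and the reference identifiers are hypothetical assumptions of mine, not a description of any actual provider’s tool.

```python
# Purely illustrative sketch of the "upload filter" logic described
# above. The identifiers and reference data are hypothetical, not any
# real provider's implementation.

def upload_filter(content_id: str, reference_ids: set[str]) -> str:
    # Automated check at upload time: if the content matches a work or
    # other subject matter identified by a rightholder, dissemination
    # is blocked; otherwise the content is published.
    if content_id in reference_ids:
        return "blocked"
    return "published"

# Reference identifiers supplied by rightholders (hypothetical).
references = {"work-0001", "work-0002"}

print(upload_filter("work-0001", references))     # blocked
print(upload_filter("cat-video-42", references))  # published
```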

23.      The proposal for a directive, in particular Article 13 thereof, gave rise to numerous debates within the Parliament and the Council during the legislative process. That process was also marked by intensive lobbying campaigns on the part of the economic operators concerned and by demonstrations of opposition from parts of civil society, academic circles and advocates of freedom of expression, who argued that the obligation for sharing service providers to install ‘upload filters’ would be likely to have harmful effects on that freedom. (29)

24.      The proposal for a directive was finally approved by the Parliament on 26 March 2019 and by the Council on 16 April 2019. (30) That proposal was officially adopted as Directive 2019/790 on 17 April 2019. It had to be transposed by the Member States by 7 June 2021 at the latest. (31)

B.      Article 17 of Directive 2019/790

25.      During that legislative process, Article 13 of the proposal for a directive underwent various amendments. It was adopted, in substantially different wording, as Article 17 of Directive 2019/790. It seems to me appropriate at this stage to set out some of its key aspects.

26.      First, Article 17 of Directive 2019/790 is targeted at ‘online content-sharing service providers’, as stated in its title. (32) That concept is defined in the first subparagraph of Article 2(6) of that directive as referring to any ‘provider of an information society service of which the main or one of the main purposes is to store and give the public access to a large amount of copyright-protected works or other protected subject matter uploaded by its users, which it organises and promotes for profit-making purposes’. Notwithstanding the open nature of the terms used, it is clear from that definition that Article 17 concerns the ‘large’ sharing service providers deemed to be linked to the ‘Value Gap’, (33) and whose operation that definition is clearly intended to reflect. (34)

27.      Secondly, the first subparagraph of Article 17(1) of Directive 2019/790 stipulates that a sharing service provider ‘performs an act of communication to the public or an act of making available to the public for the purposes of this Directive when it gives the public access to copyright-protected works or other protected subject matter uploaded by its users’. Therefore, as stated in the second subparagraph of that paragraph, those providers must, in principle, obtain an authorisation from the rightholders, for instance by concluding a licensing agreement, for the use on their service of protected content uploaded by users. (35) The EU legislature has, therefore, settled ex lege the dispute referred to in point 17 of this Opinion in favour of those rightholders. (36)

28.      That obligation is directly related to the general objective pursued by Article 17 of Directive 2019/790, namely ‘to achieve a well-functioning and fair marketplace for copyright’, (37) by fostering the development of ‘the licensing market between rightholders and … sharing service providers’. The aim is to strengthen the position of those rightholders during the negotiation (or renegotiation) of licensing agreements with those providers, in order to ensure that those agreements are ‘fair’ and keep a ‘reasonable balance between both parties’ (38) – and, in so doing, to remedy the ‘Value Gap’. The negotiating position of those rightholders is further strengthened by the fact that, in principle, they are not obliged to conclude such agreements with those providers. (39)

29.      Thirdly, Article 17 of Directive 2019/790 specifies, in paragraph 3, that, when a sharing service provider performs an act of ‘communication to the public’ or ‘making available to the public’ under the conditions laid down in paragraph 1 of that article, the exemption from liability provided for in Article 14 of Directive 2000/31 does not apply. (40)

30.      Fourthly, Article 17(4) states that, where sharing service providers have not obtained authorisation from the rightholders, they are liable for ‘unauthorised’ (41) acts of communication to the public performed through their services. This is a logical consequence of the foregoing: since those providers are now deemed to perform acts of ‘communication to the public’ when they ‘give access’ to works and other protected subject matter uploaded by the users of their services, they bear direct (or ‘primary’) liability in the event of unlawful ‘communication’.

31.      In principle, the direct liability of the person who performs an unlawful act of ‘communication to the public’ is strict. (42) Sharing service providers should therefore be automatically liable each time a work or protected subject matter is illegally uploaded to their services. In that regard, they could inter alia be ordered to pay potentially significant damages to the rightholders concerned. (43)

32.      Nevertheless, since, first, it is the users of sharing services who upload the content found on them, without their providers making a prior selection in that regard, (44) and, secondly, those providers will probably not be able to obtain authorisation from all rightholders for all of the protected works and other subject matter, present and future, which could therefore be uploaded to them, (45) such strict liability would have required those providers completely to change their economic model – and, in so doing, to abandon the very model of the interactive ‘Web 2.0’.

33.      Therefore, the EU legislature took the view that it was appropriate to provide for a specific liability mechanism for those providers. (46) In accordance with Article 17(4) of Directive 2019/790, they may, in the event of unlawful ‘communication to the public’ through their services, exempt themselves from all liability by demonstrating that they have:

‘(a)      made best efforts to obtain an authorisation, and

(b)      made, in accordance with high industry standards of professional diligence, best efforts to ensure the unavailability of specific works and other subject matter for which the rightholders have provided the service providers with the relevant and necessary information; and in any event

(c)      acted expeditiously, upon receiving a sufficiently substantiated notice from the rightholders, to disable access to, or to remove from their websites, the notified works or other subject matter, and made best efforts to prevent their future uploads in accordance with point (b).’

34.      Two of those cumulative conditions are at the heart of the present action. The other paragraphs of Article 17 of Directive 2019/790 will be presented in the course of the examination of this action. (47)

IV.    Procedure before the Court and forms of order sought

35.      By application lodged at the Registry of the Court of Justice on 24 May 2019, the Republic of Poland brought the present action.

36.      The Republic of Poland claims that the Court should:

–        annul Article 17(4)(b) and (c), in fine, of Directive 2019/790, that is to say, in so far as the wording ‘and made best efforts to prevent their future uploads in accordance with point (b)’ is concerned;

–        in the alternative, should the Court consider that the contested provisions cannot be separated from the remainder of Article 17 of that directive without changing its substance, annul that article in its entirety;

–        order the Parliament and the Council to pay the costs.

37.      The Parliament contends that the Court should:

–        dismiss the action as unfounded;

–        order the Republic of Poland to pay the costs.

38.      The Council contends that the Court should:

–        reject the principal claims as inadmissible;

–        in the alternative, dismiss the action as unfounded in its entirety;

–        order the Republic of Poland to pay the costs.

39.      By decision of the President of the Court of 17 October 2019, the Kingdom of Spain, the French Republic, the Portuguese Republic and the European Commission were granted leave to intervene in support of the forms of order sought by the Parliament and the Council. Statements in intervention were lodged by all of the interveners, with the exception of the Portuguese Republic.

40.      The parties and the interveners, with the exception of the Portuguese Republic, were represented at the hearing held on 10 November 2020.

V.      Analysis

41.      In support of its action, the Republic of Poland raises a single plea in law, alleging infringement of the right to freedom of expression and information guaranteed by Article 11(1) of the Charter. (48) Before examining the merits of that plea in law (section B), I shall briefly address the admissibility of the application (section A).

A.      Admissibility

42.      The Parliament, the Council, the French Government and the Commission submit that the principal form of order sought in the application, in so far as it seeks the annulment only of Article 17(4)(b) and (c), in fine, of Directive 2019/790, is inadmissible. I also take this view.

43.      In accordance with the Court’s settled case-law, partial annulment of an EU act is possible only if the elements the annulment of which is sought may be severed from the remainder of that act. That requirement is not satisfied where that partial annulment would have the effect of altering the substance of that act. (49)

44.      The annulment of only points (b) and (c), in fine, of paragraph 4 would clearly alter the substance of Article 17 of Directive 2019/790. As the Parliament, the Council, the French Government and the Commission submit, the various provisions of Article 17 constitute, as a whole, a ‘complex’ liability regime which reflects the balance sought by the EU legislature between the rights and interests of sharing service providers, the users of their services and rightholders. The annulment of only the contested provisions would have the consequence of replacing that liability regime with a regime that is both substantially different and significantly more favourable to those providers. In other words, a partial annulment of that kind would be tantamount to the Court revising Article 17, which it cannot do in annulment proceedings under Article 263 TFEU.

45.      However, it is common ground between the parties that the form of order sought by the applicant in the alternative, by which it requests the annulment of Article 17 of Directive 2019/790 in its entirety, is admissible. As important as that article may be, its annulment would not change the substance of that directive. The many articles of Directive 2019/790 have a variety of purposes and are divided into different titles and chapters. Accordingly, Article 17 of that directive may be severed from its other articles, which would remain fully in force if Article 17 were annulled. (50)

B.      Substance

46.      The single plea in law raised by the Republic of Poland can be summarised in a few words. In essence, the Republic of Poland submits that, in accordance with Article 17(4)(b) and (c), in fine, of Directive 2019/790, sharing service providers are obliged, in order to be exempt from any liability in the event of the unlawful ‘communication to the public’ of works or other protected subject matter on their services, to carry out preventive monitoring of the content users wish to upload. To do this, they must use software tools which enable the automatic filtering of such content. That preventive monitoring is said to constitute a limitation on the exercise of the right to freedom of expression, guaranteed in Article 11 of the Charter. That limitation is said to be incompatible with the Charter, since it undermines the ‘essence’ of that fundamental right or, at the very least, fails to comply with the principle of proportionality.

47.      In defence, the Parliament and the Council, supported by the Spanish and French Governments and the Commission, dispute each of those points. I shall therefore examine them in turn in the following sections. I shall consider, first, the scope of the contested provisions (section 1). Next, I shall address the issue of the limitation on the exercise of the right to freedom of expression and information (section 2) and, finally, that of the compatibility of that limitation with the Charter (section 3).

1.      The scope of the contested provisions

48.      In order properly to understand the scope of the conditions for exemption from liability provided for in Article 17(4)(b) and (c), in fine, of Directive 2019/790, it is useful to have in mind, as a point of comparison, those contained in Article 14 of Directive 2000/31. In essence, under that article, a provider is exempt from any liability which may arise from illegal information which it stores at the request of a user of its service provided that, first, it is unaware of it or that, secondly, where it becomes aware, it has expeditiously removed that information or has blocked access to it. In practice, such a provider is not expected to monitor the information on its servers and actively to search for illegal information on those servers. (51) However, where the existence and the location of such illegal information is brought to its knowledge, as a general rule by means of a notice sent by a third party, that provider must react by removing the information in question or blocking access to it – following a system of ‘notice and take down’. (52)

49.      By contrast, as the applicant submits, in order to satisfy the conditions laid down in the contested provisions, sharing service providers must carry out preventive monitoring of the content uploaded by the users of those services (section (a)). In order to carry out such monitoring, those providers will, in many situations, have to use software tools which enable the automatic filtering of that content (section (b)).

(a)    Preventive monitoring of content uploaded by users …

50.      In the first place, I note that, first, in accordance with Article 17(4)(b) of Directive 2019/790, sharing service providers must make ‘in accordance with high industry standards of professional diligence, best efforts’ to ‘ensure the unavailability’ of specific works and other protected subject matter for which the rightholders have provided the service providers with the relevant and necessary information.

51.      Secondly, in accordance with point (c) of that paragraph, where they receive a sufficiently substantiated notice from the rightholders, concerning the presence of works or other protected subject matter on their services, those providers must not only act expeditiously to disable access to that subject matter or remove it from their websites, (53) but must also make ‘best efforts’ to ‘prevent [its] future uploads’ – following, this time, the logic of ‘notice and stay down’.
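
The difference between those two logics may be illustrated schematically. The following sketch, in Python, contrasts ‘notice and take down’, under which a notified item is merely removed ex post, with ‘notice and stay down’, under which its future uploads must also be prevented. All identifiers are invented for illustration; no actual system is described.

```python
# Hypothetical sketch contrasting "notice and take down" with "notice
# and stay down". All identifiers are invented for illustration.

hosted: set[str] = {"clip-1", "clip-2", "clip-3"}  # content online
stay_down_list: set[str] = set()                   # notified content

def notice_and_take_down(notified: str) -> None:
    # Ex post reaction only: the notified item is removed, but nothing
    # prevents the same content from being uploaded again later.
    hosted.discard(notified)

def notice_and_stay_down(notified: str) -> None:
    # Removal plus prevention: the notified content is also recorded so
    # that its future uploads can be blocked ex ante.
    hosted.discard(notified)
    stay_down_list.add(notified)

def on_upload(item: str) -> str:
    if item in stay_down_list:
        return "rejected"   # re-upload prevented ("stay down")
    hosted.add(item)
    return "published"

notice_and_stay_down("clip-2")
print(on_upload("clip-2"))  # rejected
print(on_upload("clip-4"))  # published
```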

52.      In short, the contested provisions impose on sharing service providers obligations of diligence – or, in other words, obligations to use best endeavours (54) – with regard to the monitoring of their services. In order to ‘ensure the unavailability’ of the works and other protected subject matter identified by the rightholders and ‘prevent their future uploads’, those providers must take ‘all the steps that would be taken by a diligent operator’ (55) actively to detect and disable access to or remove, from the mass of content uploaded by users, content which reproduces the subject matter in question. (56)

53.      That interpretation is confirmed by the objective pursued in Article 17 of Directive 2019/790. Under Article 14 of Directive 2000/31, rightholders had to monitor sharing services and inform their providers, by means of notifications, of infringing content found on the service so that providers could remove it. As the Council has pointed out, the EU legislature considered, when adopting Article 17, that such a system placed too heavy a burden on rightholders and did not allow them to control effectively the use of their works and other protected subject matter on those services. (57) In particular, removed content was often re-uploaded soon afterwards, which forced rightholders to increase the number of notices. (58) In order to address the issue, the contested provisions transfer to the sharing service providers the responsibility for monitoring their services. (59)

54.      In the second place, as the Republic of Poland submits, in order to achieve the objectives set out in the contested provisions, sharing service providers must seek to prevent – ex ante – infringing content from being uploaded, and no longer simply remove such content ex post.

55.      In that regard, it is clear from recital 66 of Directive 2019/790 that, in accordance with Article 17(4)(b) of that directive, sharing service providers must seek to ‘avoid’ works and other protected subject matter identified by rightholders ‘[becoming] available’ on their services. Article 17(4)(c) is even more explicit as to the nature of the measures to be taken, since it states that those providers must seek to ‘prevent … future uploads’ of works or other subject matter which have been notified by rightholders. The phrase ‘in accordance with point (b)’ also emphasises that the same is expected from those providers in both points: they must seek to prevent the uploading – or the re-uploading, in the context of the ‘stay down’ system – of certain illegal content on their services.

56.      That interpretation is, again, confirmed by the objective pursued by Article 17 of Directive 2019/790 of enabling rightholders to monitor more easily the use of their works on sharing services. As the Council has submitted, that provision seeks to reaffirm the exclusive nature of the right of ‘communication to the public’ in the digital environment. The obligations of diligence imposed on sharing service providers by the contested provisions seek to ensure that those rightholders can effectively ‘intervene, between possible users of their work and the communication to the public which such users might contemplate making’ (60) on those services. As the Parliament and the Council have pointed out, providers must therefore endeavour to intervene before content is uploaded, that is to say, before the works or protected subject matter which that content may reproduce are actually ‘communicated to the public’ in breach of that exclusive right.

(b)    … which, in many cases, will require the use of filtering tools

57.      At this point in the Opinion, it is useful to explain that a number of software tools make it possible automatically to detect specific information that is uploaded to or present on a server. In particular, there are automatic content recognition (ACR) tools, which are based on various techniques, namely – from the simplest to the most complex – hashing, watermarking and fingerprinting. (61)
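
To give a sense of why such tools go beyond simple hashing, the following toy sketch, in Python, contrasts an exact hash, which detects only byte-for-byte copies of a known file, with a deliberately naive ‘fingerprint’ that tolerates small alterations. The functions are hypothetical simplifications of mine; real fingerprinting tools rely on perceptual audio or video features.

```python
# Toy contrast between exact hashing and a (deliberately naive)
# content "fingerprint". Hypothetical simplification: real tools use
# perceptual audio/video features, not byte averages.
import hashlib

def exact_hash(data: bytes) -> str:
    # Exact hashing: a single changed byte yields a different digest,
    # so only verbatim copies of a known file are detected.
    return hashlib.sha256(data).hexdigest()

def toy_fingerprint(data: bytes, block: int = 64) -> list[int]:
    # Average byte value per block: similar content yields a similar
    # fingerprint, so small edits do not defeat the comparison.
    return [sum(data[i:i + block]) // len(data[i:i + block])
            for i in range(0, len(data), block)]

def similarity(fp_a: list[int], fp_b: list[int]) -> float:
    # Fraction of blocks whose averages are close.
    pairs = list(zip(fp_a, fp_b))
    return sum(abs(a - b) <= 4 for a, b in pairs) / len(pairs)

original = b"some protected work, byte for byte" * 50
edited = original.replace(b"work", b"w0rk", 1)  # small alteration

print(exact_hash(original) == exact_hash(edited))   # False
print(similarity(toy_fingerprint(original),
                 toy_fingerprint(edited)) > 0.9)    # True
```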

58.      Since the second half of the 2000s, such tools, particularly using the latter technique, (62) have been introduced on a voluntary basis by some sharing service providers, in order in particular (63) actively to search for infringing content on their services. (64) ‘Digital fingerprinting’ recognition tools can automatically filter rightholders’ protected works and subject matter from the content uploaded to sharing services by comparing that content when it is uploaded or once it has been posted with reference information provided by those rightholders. (65) When that comparison identifies a match, those tools generally give the rightholders concerned the choice of deciding manually or automatically to block the content in question, to authorise its upload and track its popularity through viewer statistics or even to ‘monetise’ that content by inserting advertisements. (66)
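
The choice thus given to rightholders upon a match may be represented schematically as follows. The policy names and the match threshold in this sketch are hypothetical assumptions made for illustration, not those of any actual recognition tool.

```python
# Hypothetical sketch of the match-handling choice described above.
# Policy names and the match threshold are illustrative assumptions.
from dataclasses import dataclass
from enum import Enum

class Policy(Enum):
    BLOCK = "block"        # keep the matched upload off the service
    TRACK = "track"        # allow it and collect viewer statistics
    MONETISE = "monetise"  # allow it and attach advertising revenue

@dataclass
class ReferenceEntry:
    work_id: str
    rightholder: str
    policy: Policy

def handle_match(score: float, ref: ReferenceEntry,
                 threshold: float = 0.9) -> str:
    # Below the threshold the upload is published normally; above it,
    # the rightholder's pre-selected policy decides the outcome.
    if score < threshold:
        return "published"
    if ref.policy is Policy.BLOCK:
        return f"blocked (claimed by {ref.rightholder})"
    if ref.policy is Policy.TRACK:
        return f"published, usage tracked for {ref.rightholder}"
    return f"published, ad revenue shared with {ref.rightholder}"

entry = ReferenceEntry("work-123", "Label X", Policy.MONETISE)
print(handle_match(0.97, entry))  # revenue shared with Label X
print(handle_match(0.42, entry))  # published
```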

59.      The proposal for a directive embraced those technological developments. The Impact Assessment highlighted the effectiveness of ‘digital fingerprinting’ recognition tools in combating counterfeiting and their increased availability on the market. That proposal sought, I would reiterate, (67) to make it compulsory for sharing service providers to put those tools into place, the aim being to force those which had not yet done so to ‘catch up’ and to require the others to give rightholders transparent access to their recognition tools. (68)

60.      As has been pointed out by the Parliament, the Council and the Spanish Government, the final version of Directive 2019/790 no longer contains explicit references to automatic content recognition tools. Article 17(4)(b) and (c) of that directive are drafted in general terms. Those provisions do not formally require sharing service providers to adopt specific measures or techniques in order to attain the objectives they pursue. (69)

61.      According to the defendants and the interveners, the contested provisions do not therefore require those providers to use such tools. The latter are said to have ‘wiggle room’ in respect of the measures and techniques to be implemented in order to attain the objectives referred to in those provisions. In that context, those providers may ‘choose’ to use such tools – or to continue to do so, in the case of those who already use them – or even develop ‘innovative solutions’. (70) In any event, in accordance with Article 17(5) of Directive 2019/790, the measures required of those providers must be examined on a case-by-case basis, in the light of the principle of proportionality.

62.      That said, as the Republic of Poland submits, it seems to me that the contested provisions do force sharing service providers, in many situations, to use those content recognition tools. (71) In my view, the EU legislature has simply changed its approach between the proposal for a directive and its adoption as Directive 2019/790. Rather than directly providing for an obligation to put those tools into place, it has indirectly imposed their use by means of the conditions for exemption from liability laid down in those provisions.

63.      First, as the applicant has rightly pointed out, the factual context of the contested provisions must be borne in mind. Article 17 of Directive 2019/790 concerns service providers which store and give the public access to a ‘large amount of copyright-protected works or other protected subject matter’. In other words, these are operators which manage a significant, or even huge, volume of content. Moreover, those sharing services are provided on a continuous basis and are open to a considerable number of users, and therefore substantial volumes of new content can be uploaded at any time.

64.      In that context, it is clear to me that, as the applicant submits, the employees of sharing service providers would not be able to check all, or even most, of the content uploaded (72) – a fact which the Parliament also acknowledges. Therefore, I find it difficult to see by what means, other than the use of an automatic recognition tool enabling them to filter the content uploaded to their services, those providers could reasonably ‘ensure the unavailability’ of protected works and subject matter identified by rightholders and ‘prevent their future upload’ to their services, in accordance with the objectives set out in the contested provisions (73) – and the reference by the Parliament and the Council to possible ‘innovative solutions’ is not particularly helpful in that respect. (74) Furthermore, the defendants and the interveners acknowledged in a roundabout way at the hearing that those tools will often be, de facto, essential in that regard. (75)

65.      Secondly, I recall that sharing service providers, in order to comply with the obligations of diligence imposed on them, in accordance with the wording of Article 17(4)(b) of Directive 2019/790, have to take measures which meet ‘high industry standards of professional diligence’. As stated in the second paragraph of recital 66 of that directive, account must be taken in that regard of ‘best industry practices’ and the ‘state of the art’.

66.      As I explained in point 58 of this Opinion, recognition tools using ‘digital fingerprinting’ are already used by various sharing service providers, in respect of several types of content. (76) Other providers accepting such content on their services would therefore appear to be obliged, in order to comply with the obligations of diligence resulting from the contested provisions, to conform to ‘best industry practices’ and the ‘state of the art’ by putting those tools into place to filter those categories of content.

67.      Admittedly, as the defendants and the interveners have pointed out, in accordance with Article 17(5) of Directive 2019/790, the measures to be taken by sharing service providers must, in each case, comply with the principle of proportionality. In that regard, account must be taken of, first, ‘the type, the audience and the size of the service and the type of works or other protected subject matter uploaded by the users of the service’ and, secondly, ‘the availability of suitable and effective means and their cost for service providers’. (77) In that context, it cannot be ruled out that, in specific cases, it would be contrary to that principle to require certain providers to use a content recognition tool. It also appears that, in the current state of technology, those tools are neither suitable nor effective as regards certain specific types of protected works and subject matter. (78)

68.      However, those specific cases aside, it is clear, to me, that, in all situations in which various appropriate and effective tools are available on the market and are not unreasonably expensive, sharing service providers are a priori required to put them into place in order to demonstrate that they have made ‘best efforts’ to prevent the uploading of illegal content and, therefore, to comply with the contested provisions. (79) Where appropriate, in accordance with the principle of proportionality, they are able to choose from among the available tools those which are best adapted to their situations and the resources available to them (80) – or even, for the wealthiest among them, to develop such a tool in-house.

69.      In short, in order to demonstrate, in accordance with the contested provisions, that they have made ‘in accordance with high industry standards of professional diligence, best efforts’ to ‘ensure the unavailability’ of protected works and subject matter identified by rightholders and ‘prevent their future uploads’ to their services, sharing service providers must, in many cases, put into place automatic content recognition tools, in order to filter the content that users upload and, where appropriate, block certain content before it is uploaded. (81)

2.      The existence of a limitation on the exercise of the right to freedom of expression and information

70.      As the scope of the contested provisions has been clarified, I must now begin the examination of those provisions in the light of the right to freedom of expression and information.

71.      The right guaranteed in Article 11 of the Charter, which ‘shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers’, corresponds to that provided for in Article 10 of the European Convention for the Protection of Human Rights and Fundamental Freedoms, signed in Rome on 4 November 1950 (‘the ECHR’). (82) Under Article 52(3) of the Charter, those two rights therefore have the same meaning or, at the very least, the same scope. It follows that Article 11 of the Charter must be interpreted in the light of Article 10 of the ECHR and the related case-law of the European Court of Human Rights (the ‘ECtHR’).

72.      That fundamental right is undeniably relevant in the present case. As the Republic of Poland and the Commission stated in their respective observations, the sharing services referred to in Article 17 of Directive 2019/790 are of particular importance to the freedom to receive and impart information and ideas.

73.      As the applicant submits, the uploading of content to those services – whether it be videos, photographs, texts, and so forth – therefore falls within the scope of the exercise of the right to freedom of expression and information. (83) Such uploading is also likely to affect other related freedoms. In particular, where the content in question constitutes the artistic expression of the users who upload it, its posting falls within the scope of the exercise of the freedom of the arts, guaranteed in Article 13 of the Charter and Article 10 of the ECHR. (84)

74.      I would point out that that is the case irrespective of whether or not that content infringes copyright. The argument to the contrary put forward by the Parliament rests, in my view, on a legal conflation. The fact that information is protected by copyright does not have the effect of automatically excluding it from the scope of freedom of expression. (85) While, as a general rule, a restriction on the transmission of such information is justified, that is relevant only at the stage of examining the conditions for the admissibility of such a restriction on that freedom. (86)

75.      According to the applicant, the filtering measures which sharing service providers are obliged to put into place in order to comply with Article 17(4)(b) and (c), in fine, of Directive 2019/790 are, by nature, ‘preventive measures’ to monitor users’ information. Those measures would entail ‘prior restraints’ within the meaning of the case-law of the ECtHR on Article 10 of the ECHR. The contested provisions therefore entail the introduction, on sharing services, of ‘general and automated preventive censorship’ by the providers of those services. Those provisions therefore constitute a particularly serious ‘interference’ by the EU legislature with the freedom of expression and information of those users.

76.      By contrast, the defendants and the interveners dispute that the contested provisions entail such ‘censorship’ or any form of ‘interference’ with that freedom. In particular, according to the Council, the purpose of those provisions – or of Article 17 of Directive 2019/790 in general – is not to restrict ex ante the information that may be disseminated on those services. Users are said to remain free to upload whatever content they wish. It is simply that, in all situations where uploaded content is protected by copyright, those providers should obtain authorisation from the rightholders concerned and, failing that, they would be liable ex post.

77.      Like the applicant, I consider that the contested provisions do in fact entail an ‘interference’ with the freedom of expression of the users of sharing services. However, I would like to clarify the terminology from the outset. It is true that the term ‘censorship’ has multiple meanings. That said, it is clear from the applicant’s observations that it uses the term to refer to the idea of a prior review of information before it is disseminated. In that context, the arguments put forward by the Parliament, the Council and the Spanish Government that the term ‘censorship’ is not relevant in this case, on the ground that Article 17 of Directive 2019/790 does not imply any ‘political or moral’ control of the information uploaded to sharing services, are, in my view, beside the point. In order to avoid any further confusion, in this section I shall simply use the terms ‘preventive measures’ and ‘prior restraints’.

78.      That said, Article 17 of Directive 2019/790 does not simply stipulate, as the Council submits, that sharing service providers must obtain an authorisation for the protected content uploaded by users of their services or, otherwise, they are directly liable for it. As I explained in the previous section, the contested provisions also stipulate that those providers are exempted from that liability when they make ‘best efforts’ to prevent the uploading, by those users, of content which reproduces the works and other protected subject matter identified by rightholders. Those providers are therefore required preventively to filter and block the content in question.

79.      As the applicant submits, filtering, by nature, is a ‘preventive measure’ to monitor the information disseminated on those services, and the blocking measures which may result from it constitute ‘prior restraints’ within the meaning of the case-law of the ECtHR relating to Article 10 of the ECHR: (87) in order not to suppress, but to prevent any infringement of copyright, the information which users intend to upload is monitored, and information considered likely to lead to such an infringement is restricted prior to its dissemination. (88)

80.      In those situations, contrary to the Council’s submissions, users are therefore not ‘free’ to upload whatever content they wish to sharing services. The filtering and blocking measures put into place by sharing service providers will restrict the content they can upload. This results in an ‘interference’ with the exercise of those users’ freedom of communication. The filtering and blocking of content prior to its dissemination also entail an ‘interference’ with the public’s freedom to receive information. (89)

81.      The Parliament and the Council assert in reply that sharing service providers, as private operators, may freely choose the information they wish to see disseminated using their services and, therefore, they may decide to filter and block content. Even if this were to constitute an ‘interference’ with the freedom of expression of users, that interference, in any event, would not be attributable to the EU legislature.

82.      In my view, that argument confuses two situations. It is true that, in the exercise of the freedom to conduct a business and the freedom of contract guaranteed to them by Article 16 of the Charter, sharing service providers may, in the terms of use of their services or in ‘community standards’, devise a content policy and, on their own initiative, exercise a form of ‘self-regulation’ by filtering and blocking content which, in their view, contravenes those rules. In that situation, there is no ‘interference by public authority’, within the meaning of Article 10 of the ECHR and Article 11 of the Charter, in users’ freedom of expression. (90)

83.      However, in the present case, in my view, there is no question of ‘self-regulation’ by sharing service providers. Irrespective of whether the prohibition on posting infringing content is in their terms and conditions or their ‘community standards’, the filtering and blocking of content is carried out in order to comply with the contested provisions. (91)

84.      Therefore, in my view, the ‘interference’ with the freedom of expression of users is indeed attributable to the EU legislature. It has instigated that interference. Moreover, the Parliament and the Council themselves acknowledge that the contested provisions are intended, in essence, to make sharing service providers responsible for monitoring copyright infringements committed on their services. To some extent, the legislature has delegated to those providers the task of monitoring the proper application of copyright in the digital environment. The legislature cannot delegate such a task and at the same time shift all liability to those providers for the resulting interferences with the fundamental rights of users. (92)

85.      My view in that regard has not been changed by the Council’s argument that the contested provisions do not ‘oblige’ sharing service providers to filter and block content uploaded to their services by their users on the ground that Article 17(4) of Directive 2019/790 does not, strictly speaking, impose any ‘obligation’ on those providers, but merely provides for a liability exemption mechanism which they have the ‘option’ of using where they have not obtained authorisation from the rightholders.

86.      In my view, in order to assess the compatibility of Article 17 of Directive 2019/790 with Article 11 of the Charter, account must be taken not only of its wording, but also of its actual effects. In view of the fact that, on the one hand, sharing service providers will not be able to obtain authorisation from rightholders for a number of works and other protected subject matter (93) whereas, on the other, users could, potentially, still upload content which reproduces the subject matter in question, recourse to the exemption mechanism provided for in Article 17(4) of Directive 2019/790 will be a necessity rather than an ‘option’ for those providers, failing which they will bear a disproportionate risk of liability. Thus, in many cases, the conditions for exemption laid down in the contested provisions will, in practice, constitute genuine obligations for those providers. Moreover, I note that Article 17(5) refers to ‘obligations [on service providers] under paragraph 4’ (emphasis added).

87.      In my view, such a liability/exemption mechanism is just as effective a means of requiring the economic operators concerned to filter their users’ content preventively as a direct obligation would be. As I stated in point 62 of this Opinion, the EU legislature has simply changed its approach in this respect. However, those different methods have the same effects and must, for that reason, be considered in the same way with regard to fundamental rights. (94)

3.      The compatibility of that limitation with the Charter

88.      It is clear from the foregoing section that, as the Republic of Poland submits, the contested provisions entail a limitation on the exercise of the right to freedom of expression, as guaranteed by Article 11 of the Charter.

89.      However, freedom of expression is not an absolute right. In accordance with Article 52(1) of the Charter, limitations on the exercise of that freedom are permissible provided that they, first, are ‘provided for by law’, secondly, respect the ‘essence’ of that freedom and, thirdly, respect the principle of proportionality.

90.      Similarly, in accordance with Article 10(2) of the ECHR and the related case-law of the ECtHR, an interference with freedom of expression is permissible provided that it, first, is ‘prescribed by law’, secondly, pursues one or more legitimate aims defined in paragraph 2 and, thirdly, is ‘necessary in a democratic society’. (95) Although those conditions differ in part, in their wording, from those laid down in Article 52(1) of the Charter, they must, again, be regarded as having the same meaning or, at the very least, the same scope. (96)

91.      Therefore, in the following sections, I shall examine compliance with the three conditions laid down in Article 52(1) of the Charter, while interpreting them in the light of the relevant case-law of the ECtHR. In that context, I shall set out the reasons why the limitation at issue is ‘provided for by law’ (section (a)), why it respects the ‘essence’ of the right to freedom of expression (section (b)) and why, provided that Article 17 of Directive 2019/790 is interpreted correctly, it complies with the principle of proportionality (section (c)).

(a)    The limitation at issue is ‘provided for by law’

92.      In accordance with the Court’s settled case-law, the condition that any limitation on the exercise of fundamental rights must be ‘provided for by law’ within the meaning of Article 52(1) of the Charter, read in the light of the case-law of the ECtHR relating to the equivalent condition laid down in Article 10(2) of the ECHR, implies not only that that limitation must have a legal basis (‘existence of the law’), but also that that legal basis must have certain qualities of accessibility and foreseeability (‘quality of the law’). (97)

93.      In the present case, first, the limitation at issue clearly has a legal basis since it stems from provisions adopted by the EU legislature.

94.      As regards, secondly, the ‘quality’ of that legal basis, I would point out that, in accordance with the case-law of the Court (98) and that of the ECtHR, (99) the legal basis entailing a limitation on the exercise of a fundamental right must be adequately accessible and foreseeable in its effects, that is to say, formulated with sufficient clarity and precision to enable the persons concerned, if need be with appropriate advice, to regulate their conduct.

95.      I consider that the contested provisions are sufficiently clear and precise to meet that standard. It is true that the definition of ‘online content-sharing service provider’ provided for in Article 2(6) of Directive 2019/790 and the contested provisions contain several open concepts – ‘large amount of copyright-protected works or other protected subject matter’; ‘best efforts’; ‘high industry standards of professional diligence’, and so forth – which create a degree of uncertainty as to the economic operators concerned and the obligations imposed on them in each situation. However, according to the explanations provided by the Parliament and the Council, the use of those concepts is intended to ensure that those provisions can be adapted to different types of operators and situations, as well as to changes in practice and technological developments, in order to be future-proof. In accordance with the case-law of the ECtHR, the EU legislature may, without undermining the requirement of ‘foreseeability’, choose to endow the texts it adopts with a certain flexibility rather than absolute legal certainty. (100) Moreover, the clarifications provided in this Opinion, and those which the Court will provide in its forthcoming judgment and in future decisions, will help to clarify those concepts and dispel the doubts surrounding them – which, again, satisfies the requirement of ‘foreseeability’. (101)

96.      That said, I note that the Court (102) and the ECtHR (103) also link to the requirement of ‘foreseeability’ the question whether the legal basis for the interference offers sufficient safeguards against the risk of arbitrary or abusive interferences with fundamental rights (in accordance with the principle of the ‘supremacy of the law’). That aspect is disputed by the applicant in the present case.

97.      Nevertheless, the question whether the contested provisions offer sufficient safeguards to protect the freedom of expression of users of sharing services against excessive or arbitrary filtering and blocking measures also concerns the proportionality of the limitation stemming from those provisions. (104) Therefore, in order to avoid repetition, I shall reserve this question for the examination of the condition relating to compliance with the principle of proportionality. (105)

(b)    The limitation at issue respects the ‘essence’ of the right to freedom of expression

98.      It should be recalled that the condition, set out in Article 52(1) of the Charter, that any limitation on the exercise of the rights and freedoms recognised by that instrument must ‘respect the essence of those rights and freedoms’ means that, where a measure undermines that ‘essence’, it cannot be justified. That measure is then deemed to be contrary to the Charter and, in the case of an act of the European Union, it must be annulled or declared invalid without it being necessary to examine the condition relating to compliance with the principle of proportionality. (106)

99.      Indeed, the EU legislature may limit the exercise of certain fundamental rights in the common interest in order to protect other rights and interests. It may do so, in particular, in order to protect another fundamental right. In that context, it has a certain margin of discretion to weigh up and strike a ‘fair balance’ between the various rights and interests involved. (107) Nevertheless, there is an absolute limit to that margin of discretion. The ‘essence’ of a fundamental right is an ‘untouchable core’ which must remain free from any interference. Accordingly, no objective, however legitimate it may be, justifies certain – exceptionally serious – interferences with fundamental rights. In other words, the end does not justify all means.

100. In the present case, according to the Republic of Poland, the contested provisions undermine the ‘essence’ of the right to freedom of expression. Since, in accordance with those provisions, preventive monitoring of uploaded content must be carried out by sharing service providers, this is said to call into question that right as such, on the ground that it involves an interference with that content, and its possible blocking, even before it is disseminated.

101. Like the defendants and the interveners, I do not share that view.

102. It is true that preventive measures for monitoring information are generally regarded as particularly serious interferences with freedom of expression (108) on account of the excesses they may entail. Those preventive measures are, in principle, disapproved of in a democratic society, on the ground that, by restricting certain information even before its dissemination, they prevent any public debate on the content, thus depriving freedom of expression of its very function as a vehicle for pluralism. (109) For those reasons, as the applicant points out, many Member States prohibit the general prior control of information in their respective constitutions.

103. Those considerations are fully relevant with regard to the Internet. As the applicant submits, the Internet is of particular importance to the freedom to receive and impart information and ideas. (110) That is the case, more specifically, in respect of large social networks and platforms, which, by enabling anyone to upload the content they wish and the public to access it, are ‘unprecedented’ tools for exercising that freedom. (111) In that respect, those platforms play a role in a form of ‘democratisation’ of the production of information and, although managed by private operators, they have in fact become essential infrastructures for online expression. (112) In the current state of forms of communication, the right to freedom of expression therefore entails, in particular, the freedom to access those platforms and express oneself on them, in principle, without interference by public authority. (113)

104. If those authorities were to impose, directly or indirectly, (114) on intermediary service providers which control those infrastructures for expression the obligation preventively to monitor, in general, the content of users of their services in search of any kind of illegal, or even simply undesirable, information, that freedom of communication would be called into question as such. In my view, the ‘essence’ of the right to freedom of expression, as provided for in Article 11 of the Charter, would be affected.

105. In that context, Article 15 of Directive 2000/31 is, in my view, of fundamental importance. By providing that intermediary providers cannot be made subject to a ‘general obligation … to monitor the information which they transmit or store’, that provision prevents online information from being subject to general preventive monitoring, delegated to those intermediaries. In so doing, it ensures that the Internet remains a free and open domain. (115)

106. For that reason, I am inclined to regard the prohibition laid down in Article 15 of Directive 2000/31 as a general principle of law governing the Internet, in that it gives practical effect, in the digital environment, to the fundamental freedom of communication. (116) I note, moreover, that the Court has already brought together compliance with that freedom and that prohibition in its case-law. (117) One cannot exist without the other. It follows, in my view, that that prohibition goes beyond the scope of Article 15 of Directive 2000/31 and is binding not only on the Member States, but also on the EU legislature.

107. However, contrary to the applicant’s submissions, the fundamental right to freedom of expression, as embodied in the prohibition of ‘general monitoring obligations’, does not preclude all types of monitoring obligation.

108. As the Commission points out, in its case-law concerning injunctions which may be issued against online intermediaries, (118) the Court has acknowledged that it is possible to order an intermediary of that kind to ‘prevent’ certain offences, by carrying out a form of targeted monitoring of its service. (119) It has, accordingly, distinguished ‘general’ monitoring obligations from those which apply in ‘specific’ cases. (120) Similarly, the ECtHR does not consider preventive measures for monitoring information, including blocking orders, to be incompatible as such with Article 10 of the ECHR, provided that they fall within a specific legal framework. (121) That court even acknowledged, in its judgment in Delfi AS v. Estonia, that certain intermediaries could be expected actively to monitor their services to search for certain types of illegal information. (122)

109. The applicant asserts in reply that the monitoring obligation imposed on sharing service providers under the contested provisions is indeed ‘general’. In order to ‘ensure the unavailability’ of the works and other protected subject matter identified by the rightholders and to ‘prevent their future uploads’ onto their services, those providers, in practice, have to filter all content uploaded by all users.

110. However, like the defendants and the interveners, I consider that those provisions actually impose a ‘specific’ monitoring obligation. (123) I must nevertheless acknowledge that there has been a recent development in the case-law of the Court (124) with regard to the criterion for distinguishing between ‘general’ and ‘specific’ obligations.

111. Initially, the Court appeared to focus on the amount of information to be inspected. In the judgment in L’Oréal and Others, (125) the Court held that the operator of an online marketplace cannot be required to carry out ‘active monitoring of all the data of each of its customers in order to prevent any future infringement of intellectual property rights’. In the judgment in Scarlet Extended, it took the view that an Internet service provider could not be required, by means of an injunction, to install a filtering system which applies to ‘all electronic communications passing via its services’ and therefore ‘indiscriminately to all its customers’ in order to ‘identif[y] on that provider’s network the movement of electronic files containing a musical, cinematographic or audio-visual work in respect of which the applicant claims to hold intellectual property rights, with a view to blocking the transfer of files the sharing of which infringes copyright’. (126) In the judgment in SABAM, (127) the Court adopted the same reasoning with regard to the obligation for the operator of a social network platform to install a similar filtering system. Finally, in the judgment in Mc Fadden, (128) it took the view that a wireless local area network operator could not be required to monitor ‘all of the information transmitted’ by means of that network, even if it were a question of blocking copies of a single musical work identified by the rightholder. (129)

112. Now, the Court appears to focus on the specificity of what is being searched for. In that regard, in the judgment in Glawischnig-Piesczek, (130) which this time concerned the area of defamation, the Court considered that the obligation, on the owner of a social network platform, to monitor all information posted on that network (131) had to be regarded as ‘specific’ on the ground that it was a matter of searching for and blocking a ‘particular’ (132) piece of defamatory information, that the service provider was not required to carry out an ‘independent assessment’ of the lawfulness of the filtered information and that, on the contrary, it could have ‘recourse to automated search tools and technologies’. (133)

113. In my view, that development in the Court’s case-law (134) is justified. Although I will set out the limits below, (135) I should mention here that, to consider that a monitoring obligation is ‘general’ where it de facto obliges an intermediary provider to filter, using software tools, all of the information uploaded by the users of its service, even if it is a matter of searching for specific infringements, would regrettably amount to ignoring the technological developments which make such filtering possible and to depriving the EU legislature of a useful means of combating certain types of illegal content.

114. In the present case, in order to attain the objectives referred to in the contested provisions, sharing service providers must, admittedly, monitor all of the content uploaded by their users. However, it is a matter of searching, among that content, for ‘specific works or other subject matter’ for which the rightholders will have already communicated to them the ‘relevant and necessary information’ (Article 17(4)(b) of Directive 2019/790) or a ‘sufficiently substantiated notice’ (Article 17(4)(c)). I shall explain in more detail which content will have to be blocked in the remainder of this Opinion. (136) Nevertheless, at this stage of the analysis, those factors are sufficient, in my view, to demonstrate that those provisions do indeed lay down, indirectly, a ‘specific’ monitoring obligation and to rule out an infringement of the ‘essence’ of the right to freedom of expression. (137)

115. Finally, I would point out that, although the EU legislature cannot delegate to online intermediaries the task of carrying out general preventive monitoring of information shared or transmitted through their services, it may, in my opinion, without undermining the ‘essence’ of the freedom of expression, choose to impose, on certain online intermediaries, certain active surveillance measures concerning certain specific illegal information. I note, moreover, that Article 17 of Directive 2019/790 is, in that regard, in line with a series of communications and recommendations from the Commission (138) and new regulations (139) which seek, to that effect, to make certain intermediaries – in particular the large ‘platforms’ – contribute to tackling certain types of illegal content. Nevertheless, in each case, observance of the principle of proportionality must be ensured. This form of delegating the review of online legality (140) to certain intermediaries is accompanied, inter alia, by risks for the freedom of expression of the users of their services and cannot therefore be carried out without sufficient safeguards for those users. (141)

(c)    The limitation at issue complies with the principle of proportionality

116. It now remains to examine the condition relating to compliance with the principle of proportionality, which, according to Article 52(1) of the Charter, is subdivided into two sub-conditions: the limitation at issue must, first, be ‘necessary’ and, secondly, ‘genuinely meet objectives of general interest recognised by the Union or the need to protect the rights and freedoms of others’.

117. Compliance with the second sub-condition is not disputed by the parties. Having regard to the general objective pursued by Article 17 of Directive 2019/790, (142) the limitation at issue meets the ‘need to protect the rights and freedoms of others’, namely copyright and the related rights of the rightholders. I note that intellectual property is protected as a fundamental right, inter alia, (143) in Article 17(2) of the Charter and Article 1 of Protocol No 1 to the ECHR. (144) The contested provisions thus constitute ‘positive measures of protection’ adopted by the EU legislature in order to ensure that those rightholders can genuinely and effectively exercise their intellectual property rights in their relations with sharing service providers. (145)

118. However, the parties disagree as to whether the limitation at issue complies with the first sub-condition. In that regard, I should point out that the examination of whether a limitation on the exercise of a fundamental right guaranteed by the Charter is ‘necessary’ within the meaning of Article 52(1) of the Charter encompasses, in reality, a review of three cumulative requirements: it is necessary to ascertain whether that limitation is (1) ‘appropriate’, (2) ‘necessary’ and (3) ‘proportionate’ stricto sensu. (146) I shall examine those three requirements in turn in the following sections.

(1)    The limitation at issue is ‘appropriate’

119. The requirement that the limitation at issue is ‘appropriate’ does not appear to be disputed by the Republic of Poland. In any event, like the Parliament and the Council, I consider that requirement to be met.

120. In the context of the analysis of the appropriateness of a given measure, the Court must ascertain not whether that measure constitutes the best means of attaining the objective pursued, but whether it is appropriate for contributing to the achievement of that objective. (147)

121. In the present case, the monitoring obligations imposed on sharing service providers pursuant to the contested provisions are appropriate for contributing to the objective pursued by the EU legislature. By shifting to those providers the burden of monitoring their services and of actively combating infringing content that may be found on them, those provisions, first, strongly encourage those providers to conclude licensing agreements with rightholders (148) and, secondly, enable those rightholders to control more easily the use of their works and subject matter on those services. (149)

(2)    The limitation at issue is ‘necessary’

122. The Republic of Poland submits, however, that the limitation on the exercise of the right to freedom of expression resulting from Article 17(4)(b) and (c) of Directive 2019/790, in fine, goes beyond what is ‘necessary’ to attain the objective pursued by the EU legislature. According to the Republic of Poland, the obligations laid down in points (a) and (c), in principio, of that paragraph are sufficient in that regard. First, the obligation on sharing service providers, in accordance with point (a), to make their ‘best efforts’ to obtain an authorisation from the rightholders strengthens the rightholders’ negotiating position. Secondly, the obligation on those providers, under point (c), in principio, to act expeditiously, upon receiving a sufficiently substantiated notice, to disable access to, or to remove from their websites, the notified works or other protected subject matter ensures that those rightholders are protected effectively.

123. I do not agree.

124. In that regard, I recall that the ‘necessity’ test amounts to verifying whether alternative measures exist which would be as effective as the measure chosen to attain the objective pursued whilst being less restrictive. (150)

125. As the Parliament and the Council submit, in essence, a liability regime which imposes only the obligations laid down in Article 17(4)(a) and (c), in principio, of Directive 2019/790 is clearly not as effective in attaining the objective pursued by the EU legislature as a regime which, in addition, provides for the obligations under points (b) and (c), in fine, of that paragraph – even though the former obligations are admittedly less restrictive in respect of the right to freedom of expression than the latter. (151)

126. First, although, as the applicant submits, the obligation for sharing service providers to make ‘best efforts’ to obtain an authorisation from the rightholders already, in itself, strengthens the position of those rightholders in negotiating licensing agreements with those providers, Article 17 of Directive 2019/790 is not intended solely to ensure that those rightholders receive equitable remuneration for the use of their works and other protected subject matter on those services. The broader issue is to ensure that those same rightholders can effectively control such use and, in particular, if they so wish, prevent that subject matter from being available on such services.

127. In that respect, secondly, it cannot be denied that, as the defendants point out, a system of notice and take down, such as that resulting from Article 14 of Directive 2000/31 and reproduced, in essence, in Article 17(4)(c) of Directive 2019/790, in principio, does not allow the rightholders concerned to oppose the illegal use of their works on sharing services as effectively as a system such as that resulting from the contested provisions, which, in addition, imposes monitoring obligations on the providers of those services.

(3)    The limitation at issue is ‘proportionate’ stricto sensu

128. According to the settled case-law of the Court, a limitation on the exercise of a fundamental right guaranteed by the Charter is regarded as ‘proportionate’, in the strict sense of the term, if the disadvantages caused by the measure in question are not disproportionate to the aims pursued. (152)

129. In the present case, the contested provisions bring into conflict, on the one hand, the right to freedom of expression guaranteed in Article 11 of the Charter and, on the other, the right to intellectual property, protected in Article 17(2) of the Charter. As the Parliament, the Council and the Spanish Government recall, the first right has no ‘automatic priority’ over the second. (153) The assessment of the proportionality of those provisions must be carried out, in the words of the Court, ‘in accordance with the need to reconcile the requirements of the protection of those various [fundamental] rights’ and with the need to strike a ‘fair balance’ between them. (154) Moreover, in the field of copyright law, the Court has placed particular emphasis on the need, in the digital environment, to safeguard that ‘fair balance’. (155)

130. The Republic of Poland submits that the EU legislature has in fact not safeguarded that balance in Article 17 of Directive 2019/790. In its view, the harm that the contested provisions cause to freedom of expression is disproportionate to the advantages that they are likely to bring in terms of protecting intellectual property rights.

131. For my part, I consider, like the Parliament, the Council and the Commission, that the EU legislature could choose to reconsider the balance inherent in the liability regime applicable to sharing service providers (subsection (i)). The new liability regime adopted nevertheless entails significant risks for freedom of expression (subsection (ii)), requiring the provision of sufficient safeguards to minimise those risks (subsection (iii)), which, in my view, the EU legislature has done (subsection (iv)).

(i)    The EU legislature could legitimately substitute a new balance for the one it had originally implemented

132. The exemption from liability, for intermediary providers, provided for in Article 14 of Directive 2000/31 reflects a balance between, inter alia, freedom of expression and intellectual property rights, as desired by the EU legislature when that directive was adopted. At that time, the EU legislature intended to promote the development of those providers in order to stimulate more generally the growth of electronic commerce and ‘information society services’ in the internal market. It was therefore important not to impose on those providers a liability which could jeopardise their activity. The interests of rightholders had to be safeguarded and balanced against the freedom of expression of Internet users in the context of the ‘notice and take down’ system and in the context of injunctions that could be issued against those providers. (156)

133. As the Council submits, circumstances have undoubtedly changed since then. The emergence of ‘Web 2.0’ services has brought advantages as well as new economic and social risks, affecting the various interests at stake. In that respect, the EU legislature was entitled to review the choices it had made almost 20 years earlier, assess those changes in circumstances and evaluate those advantages and risks. (157)

134. In that regard, as the Parliament, the Council and the French Government have noted, the EU legislature has a broad discretion in areas in which its action involves political, economic and social choices and in which it is called upon to undertake complex assessments and evaluations. (158) Adapting copyright to the digital environment and establishing, in this field, a liability regime for online sharing services which ensures a fair balance between all of the rights and interests at stake is, undoubtedly, a ‘complex’ task. (159)

135. Similarly, the ECtHR recognises that public authorities have a broad margin of discretion when they have to strike a balance between different rights protected by the ECHR. (160) That discretion was all the more important in the present case since the EU legislature had to regulate, in principle, not political speeches, but the use of works and other protected subject matter. (161)

136. In a context which has been widely debated, (162) the EU legislature has made a policy choice in favour of the creative industries. It has taken the view that the previous balance between the rights and interests at stake was no longer satisfactory and that, in order to continue to ensure a high level of protection for rightholders, (163) it was necessary to adopt a new liability regime for certain ‘Web 2.0’ service providers, imposing on them certain obligations to monitor the content uploaded by the users of their services. In view of the broad discretion enjoyed by the legislature, I consider that such a choice was not, in principle, disproportionate.

137. More specifically, the proportionality of the contested provisions lies, in my view, in the combination of the factors put forward by the defendants and the interveners, namely, first, the extent of the economic harm to rightholders caused by their works being uploaded illegally to online sharing services, having regard to the huge amount of content uploaded to those services and the speed of the exchange of information on the Internet, (164) secondly, the fact that, for the same reasons, the ‘notice and take down’ system makes it difficult for rightholders to control the use of their works on such services, thirdly, the difficulties they face in prosecuting the users responsible and, fourthly, the fact that the monitoring obligations concern specific intermediary providers. On the latter point, I note that sharing service providers, by the content promotion that they carry out, (165) have some influence on the information accessed by the public. Those aspects tend, to a certain extent, (166) to place those providers on a similar footing to traditional intermediaries such as publishers, and therefore it may be proportionate, so far as they are concerned, to adopt a specific liability regime which is different from that applicable to other host providers. (167)

138. Moreover, as the Spanish and French Governments submit, the ECtHR, in its judgment in Delfi AS v. Estonia, held that it was not disproportionate, in the context of striking a balance between freedom of expression, within the meaning of Article 10 of the ECHR, and the right to honour guaranteed in Article 8 of that convention, to hold a large online news portal liable for failing to prevent the publication of certain types of unlawful comments left by users on its website below an article or, at the very least, for failing to remove them on its own initiative within a short period of time.

139. In that judgment, the ECtHR focused on, first, the extent of the harm caused by such comments, given the speed at which information is circulated online (168) and, secondly, the fact that, although the ‘notice and take down’ system may in many cases be an appropriate tool for balancing the rights and interests of all those involved, it was insufficient to put an end to the serious harm resulting from such comments. (169) The ECtHR also pointed out, thirdly, that it would have been difficult for the victim to prosecute the authors of comments and, fourthly, that the operator of the news portal had a certain influence over the comments posted by users, and therefore the adoption of a specific approach to liability in respect of such an intermediary could be justified. (170) A certain analogy may therefore be drawn with the present case. (171)

(ii) The risks inherent in a liability regime such as that resulting from the contested provisions

140. As the Parliament submits, in essence, in so far as the filtering to be carried out by sharing service providers pursuant to the contested provisions will prevent the dissemination on those services of content which infringes copyright or related rights, the limitation on the exercise of the right to freedom of expression resulting from those provisions is justified in respect of that content.

141. Nevertheless, the link that the EU legislature established, in those provisions, between the liability of sharing service providers and the effectiveness of such filtering entails a significant risk to freedom of expression, namely the risk of ‘over-blocking’ lawful content.

142. Such a risk of ‘over-blocking’ exists, generally, where public authorities hold intermediary providers liable for illegal information provided by users of their services. In order to avoid any risk of liability, those intermediaries may tend to be overzealous and excessively block such information where there is the slightest doubt as to its lawfulness. (172)

143. In the present case, the risk is, more specifically, that, in order to avoid any risk of liability vis-à-vis rightholders, the sharing service providers systematically prevent the making available, on their services, of all content which reproduces works and other protected subject matter for which they have received the ‘relevant and necessary information’ or a ‘sufficiently substantiated notice’ from those rightholders, including content which does not infringe their rights. (173)

144. In addition to the fact that some users wishing to upload the content concerned may have a licence for the works and subject matter in question, the rightholders do not have an absolute monopoly over the use of their protected subject matter. In that regard, Article 5(3) of Directive 2001/29 contains a list of exceptions and limitations to the exclusive right of ‘communication to the public’. Those exceptions and limitations ensure, in principle, a ‘fair balance’ between, on the one hand, the interest of those rightholders in the protection of their intellectual property and, on the other, the protection of the interests and fundamental rights of users of protected subject matter, as well as the public interest (174) – inter alia public access to culture. In particular, a number of those exceptions and limitations, including those relating to quotations, criticism and review (175) and those relating to caricature, parody and pastiche, (176) in their respective fields of application, accord users’ rights to freedom of expression and creation precedence over the interests of those rightholders.

145. Specifically, a significant proportion of the content uploaded by users to sharing services consists of uses, or even creative reappropriations, of works and other protected subject matter which may be covered by those exceptions and limitations. (177)

146. Nevertheless, the question as to whether such an exception or limitation is applicable to particular content depends on the context and requires some analysis. (178) The line between legitimate use and infringement may, in different cases, be debatable. (179) In all of those ambiguous situations, sharing service providers may find it easier to prevent the content concerned from being made available rather than having to claim themselves, in the context of a possible action for liability brought by the rightholders, that those exceptions or limitations apply. (180)

147. The risk of ‘over-blocking’ which I have just described is increased, in the present case, by the fact that the conditions for exemption laid down in Article 17(4)(b) and (c), in fine, of Directive 2019/790 in fact require, in many cases, sharing service providers to use automatic content recognition tools.

148. In that regard, it is important not to lose sight of the inherent limitations of the tools in question, limitations which the applicant duly pointed out and which, moreover, have already been noted by the Court in the judgments in Scarlet Extended and SABAM. (181) Automatic content recognition tools detect content, not copyright infringements. Such tools, especially those which use ‘digital fingerprinting’ technology, are capable of detecting matches: that is, they can recognise that the content of a given file reproduces, in whole or in part, that of a reference file. (182) However, as the Republic of Poland submits, those tools are currently not capable of assessing the context in which the reproduced work is used and, in particular, of identifying the application of an exception or limitation to copyright. (183) The risk of ‘over-blocking’ is all the more significant as the ability of such tools to recognise matches in ever shorter extracts (for example, a few seconds for a phonogram) increases. Their use therefore entails the risk of depriving users of a space for expression and creation which is permitted by those exceptions and limitations. (184) Moreover, the ability of automatic recognition tools to identify infringing content depends on the accuracy and veracity of the information provided by rightholders. The use of those tools may therefore lead to unjustified claims concerning, for example, works in the public domain, (185) on the basis of incorrect or improper reference information (the so-called risk of ‘over-claiming’). (186)
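For readers unfamiliar with how such matching operates, the following is a deliberately naive illustrative sketch – not any actual recognition system, and all names and the byte-hashing approach are my own assumptions; real tools compare perceptual audio or video features rather than raw bytes. It makes concrete the limitation described in point 148: the tool reports only that content matches a reference file, and nothing in the computation can distinguish an infringement from a lawful quotation, parody or review.

```python
# Illustrative sketch only: a toy "digital fingerprinting" matcher.
# Real content recognition systems use perceptual features, not raw bytes.
import hashlib

WINDOW = 16  # size, in bytes, of each overlapping window that is fingerprinted


def fingerprints(data: bytes) -> set[str]:
    """Hash every overlapping window of the input into a set of fingerprints."""
    return {
        hashlib.sha256(data[i:i + WINDOW]).hexdigest()
        for i in range(len(data) - WINDOW + 1)
    }


def match_ratio(upload: bytes, reference: bytes) -> float:
    """Fraction of the upload's windows that also occur in the reference work."""
    up = fingerprints(upload)
    return len(up & fingerprints(reference)) / len(up) if up else 0.0


reference_work = b"all the world's a stage, and all the men and women merely players"
quotation = b"as the bard wrote, 'all the men and women merely players' - discuss"

# The matcher flags the quotation because its bytes reproduce part of the
# reference work; the lawful, quoting context of the reuse is invisible to it.
print(f"match ratio: {match_ratio(quotation, reference_work):.0%}")
```

As the sketch shows, a higher match ratio only ever signals reproduction; assessing whether an exception or limitation applies would require contextual judgment that lies wholly outside such a computation.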

(iii) The need to provide sufficient safeguards to minimise those risks

149. In view of the risks of ‘over-blocking’ described in the subsection above, a liability regime such as that resulting from the contested provisions must, in my view, be accompanied by sufficient safeguards to minimise those risks and, therefore, ensure that the extent of the interference with freedom of expression is precisely circumscribed. (187) Generally speaking, any kind of delegation, by public authorities, of the review of online legality to intermediary providers, (188) in the form of monitoring obligations which are imposed directly or indirectly on those intermediaries, must be accompanied by such safeguards.

150. More specifically, I consider that such a regime must form part of a legal framework laying down clear and precise rules governing the scope and application of the filtering measures to be implemented by the service providers concerned, so as to ensure that the users of those services have effective protection against the improper or arbitrary blocking of information they wish to upload. (189)

151. I would also point out that, where the limitation of fundamental rights stems from the EU legislation itself, and is therefore attributable to the EU legislature, as is the case here, (190) the EU legislature bears a significant share of the responsibility in that regard. It cannot, in such a case, leave to the Member States – or, a fortiori, the service providers responsible for implementing that legislation – the task of establishing such safeguards. On the contrary, it must, at the very least, define the substance of those safeguards. (191) That said, since the present case concerns a directive which, moreover, concerns a technical field, certain detailed rules for its application will have to be specified by the Member States – and by the Commission. (192)

152. I would add that it is essential for the EU legislature to set out the substance of those safeguards in order to ensure the uniform application of EU legislation in all the Member States – such uniformity being all the more necessary since what is at issue in the present case is a harmonisation directive adopted on the basis of Article 114 TFEU. Sharing service providers, which operate internationally, should not be subject to 27 national liability regimes which may differ as to the extent of the filtering imposed on them. Above all, the users of those services should enjoy substantially identical protection against abusive or arbitrary blocking measures, irrespective of the Member State in which they are located.

153. To sum up, although the EU legislature has a wide margin of discretion in deciding on the principle of a liability regime such as that laid down in the contested provisions, it may not, however, dispense with sufficient safeguards to minimise the resulting risks to freedom of expression. In my view, it is for the Court to scrutinise compliance with that requirement. (193)

(iv) The safeguards provided for in the present case

154. The Republic of Poland submits that the EU legislature has not met this requirement in the present case. In its view, the contested provisions are not accompanied by any safeguards capable of circumscribing the extent of the interference with the freedom of expression of users of sharing services.

155. By contrast, the defendants and the interveners submit that Article 17 of Directive 2019/790 contains a ‘complete system of safeguards’. They state that the contested provisions are, in fact, inseparable from paragraphs 5, 7, 8 and 9 of that article. Those paragraphs establish clear and precise rules defining the scope and application of the measures which must be implemented by sharing service providers and, in so doing, maintain a ‘fair balance’ between intellectual property rights and freedom of expression.

156. Article 17(5) of Directive 2019/790, which, I reiterate, states that the measures to be taken by each provider must be assessed, in the light of the principle of proportionality, with regard to factors such as the ‘size of the service’ or the ‘cost’ of available tools, seems to me to be more relevant to the question of compliance with the freedom to conduct a business, which is not the subject of the present case, than to freedom of expression. I therefore do not consider it necessary to revisit that paragraph.

157. However, paragraphs 7, 8 and 9 of that article do indeed, in my view, contain meaningful safeguards to protect the users of sharing services against measures involving the improper or arbitrary blocking of their content. I shall therefore examine them in the following subsections.

–       The right to legitimate uses of protected subject matter (paragraph 7) and the complaint mechanism (paragraph 9)

158. The defendants and the interveners have rightly pointed out that one of the main safeguards intended to limit the risk of sharing service providers preventing, pursuant to the contested provisions, the availability on their services of content which lawfully reproduces the works and other protected subject matter identified by rightholders is contained in Article 17(7) of Directive 2019/790.

159. First, the first subparagraph of that paragraph provides that ‘the cooperation between … sharing service providers and rightholders (194) shall not result in the prevention of the availability of works or other subject matter uploaded by users, which do not infringe copyright and related rights, including where such works or other subject matter are covered by an exception or limitation’. (195)

160. Secondly, in accordance with the second subparagraph of that paragraph, Member States must ensure that users are able to rely on the exceptions and limitations relating to (a) quotation, criticism and review and (b) use for the purpose of caricature, parody or pastiche (196) when they upload content to sharing services.

161. It follows that the EU legislature has expressly recognised that users of sharing services have subjective rights under copyright law. Those users now have the right, which is enforceable against the providers of those services and rightholders, to make legitimate use, on those services, of protected subject matter, including the right to rely on exceptions and limitations to copyright and related rights. (197) That acknowledgement, by the legislature, of the importance of those exceptions and limitations for users is in line with the case-law of the Court which, itself, has recently recognised that those exceptions and limitations ‘confer rights’ on users. (198)

162. I would point out that, in accordance with Article 17(7) of Directive 2019/790, users may rely on all of the exceptions and limitations provided for in Union law, (199) and in particular those set out in Article 5 of Directive 2001/29 – in so far, however, as they are included in the applicable national law. While Article 5 of Directive 2001/29 gives Member States the option to transpose the exceptions and limitations listed therein, (200) Article 17(7) of Directive 2019/790 now requires Member States to provide, at the very least, for the exceptions and limitations relating to quotation and parody in their domestic law, (201) given their particular importance for freedom of expression.

163. It follows, specifically, that sharing service providers are not legally authorised to block or remove content which makes lawful use of works or other protected subject matter on the ground that that content infringes copyright. (202) In particular, they may no longer exclude the application of exceptions and limitations in their terms and conditions or in contractual agreements with rightholders by providing, for example, that a mere allegation by rightholders of infringement of copyright will be sufficient to justify such blocking or removal. (203) On the contrary, those providers must inform their users, in those terms and conditions, that they may use works and other protected subject matter under those exceptions and limitations. (204)

164. In my view, in adopting Article 17(7) of Directive 2019/790, the EU legislature, aware of the risks of ‘over-blocking’ (205) which may result from the liability regime it has established, and in order to ensure a ‘fair balance’ between the rights and interests at stake and protect the freedom of expression of sharing service users, (206) has provided for a clear and precise limit on the filtering and blocking measures which must be implemented by the providers of those services in accordance with paragraph 4 of that article.

165. In that regard, the Parliament, the Council and the Commission have rightly pointed out that, in view of the mandatory nature of the words used in its first subparagraph – ‘shall not result in’ (207) – Article 17(7) imposes on sharing service providers an obligation to achieve a specific result: they must not prevent the making available on their services of content that legitimately reproduces works and other protected subject matter, even if such works and subject matter have been identified by the rightholders. The limit of permissible filtering and blocking measures is therefore clearly defined: they must not have the objective or the effect of preventing such legitimate uses. That provision therefore helps to counteract the tendency of those providers to be ‘overzealous’ and, accordingly, to circumscribe the extent of the interference with freedom of expression so that it is limited to the dissemination of content which infringes copyright rules.

166. The Republic of Poland asserts in reply, however, that, given the inherent limitations of the use of content recognition tools, referred to in point 148 of this Opinion, and in particular their inability to identify the application of exceptions and limitations to copyright, Article 17(7) of Directive 2019/790 is more wishful thinking than an effective safeguard. In practice, content falling within the scope of those exceptions and limitations will be blocked automatically by those tools. That provision is therefore not capable of affording the users of sharing services effective protection against the improper or arbitrary blocking of their content.

167. The applicant’s arguments on that point reflect a fundamental difference of opinion between the parties and the interveners with regard to the scope of Article 17(7) and the specific way in which the rights of users must be respected in practice. Two different interpretations of that provision have been discussed before the Court in this regard.

168. According to the first interpretation, on which the Republic of Poland bases its action, and which is also put forward by the Spanish and French Governments, the (only) mechanism (208) ensuring, in practice, that the filtering and blocking measures taken by sharing service providers, pursuant to the contested provisions, do not prevent the making available, on their services, of legitimate uses of protected works and other subject matter is the ‘complaint and redress mechanism’ which, in accordance with Article 17(9) of Directive 2019/790, those providers must make available to users of their services ‘in the event of disputes over the disabling of access to, or the removal of, works or other subject matter uploaded by them’.

169. Specifically, sharing service providers must, in accordance with the wishes of rightholders, block ex ante all content reproducing in whole or in part the works and other protected subject matter identified by those rightholders – irrespective of whether it infringes their rights – the onus being on a user who believes that he or she is making legitimate use of such subject matter, for example in the context of an exception or limitation, to make a complaint to that effect. If that complaint were well founded, the content concerned would be restored, ex post, after being examined. I would point out that, although the applicant and the Spanish and French Governments have a common understanding of Article 17(7) of Directive 2019/790, their views as to the conclusions to be drawn from it are radically different. (209)

170. According to the second interpretation, put forward by the Parliament, the Council and the Commission, the right of users of sharing services to make legitimate use of protected subject matter, provided for in Article 17(7) of Directive 2019/790, should be taken into account ex ante by the providers of those services in the filtering process itself. The contested provisions and Article 17(7) should be read together and the obligations which they lay down apply ‘simultaneously’. The ‘best efforts’ which those providers must make, in accordance with those provisions, to prevent the uploading of works and protected subject matter identified by rightholders cannot therefore result, in practice, in the preventive and systematic blocking of those legitimate uses. The complaint and redress mechanism envisaged in Article 17(9) constitutes an additional and final safeguard for situations where, despite the obligation in paragraph 7, those providers nevertheless block such legitimate content mistakenly.

171. I support that latter interpretation, which, in my view, follows from a literal, systematic and historical analysis of Article 17 of Directive 2019/790.

172. First of all, from a textual point of view, I would note that, according to Article 17(7) of Directive 2019/790, the cooperation between rightholders and sharing service providers must not result in ‘the prevention of the availability’ of content that legitimately reproduces works or other protected subject matter. The interpretation that that content could be systematically blocked ex ante, provided that users could obtain its reinstatement ex post, is, in my opinion, far from being the most natural way of understanding that wording. (210)

173. Next, from a systematic point of view, as the Commission submits, the contested provisions and paragraph 7 must be read in the light of the third subparagraph of Article 17(9), in accordance with which that directive ‘shall in no way affect’ legitimate uses of protected works and subject matter. If the content concerned were to be systematically blocked ex ante, the onus being on users to make a complaint in order to be able to upload it, those legitimate uses would clearly be ‘affected’ to a certain extent.

174. I also note that the matter of legitimate uses of protected subject matter is addressed not only in recital 70 of Directive 2019/790, which refers to the complaint mechanism, but also in the first paragraph of recital 66, (211) concerning the preventive measures to be implemented by sharing service providers under the contested provisions. In addition, according to the first paragraph of recital 70, that mechanism is intended to ‘support’ – and not ‘enable’ – such legitimate uses.

175. Lastly, the travaux préparatoires would appear to confirm that interpretation. In that regard, I note that Article 17(9) of Directive 2019/790 can be traced back to Article 13(2) of the proposal for a directive. That proposal did not contain a provision on the legitimate uses of works and other protected subject matter. Such a provision was added by way of amendments during the first reading of the text within the Parliament and the Council. In those amendments, the complaint and redress mechanism was specifically intended to enable such legitimate uses. (212) Following the first rejection of the text by the Parliament on 5 July 2018, in the subsequent versions of the text and in the version finally adopted, users’ rights and the complaint and redress mechanism were separated into two distinct provisions.

176. That legislative process also demonstrates, in my view, that the intention of the EU legislature has evolved in that regard. Although Article 13 of the proposal for a directive was weighted solely in favour of rightholders, that article, when adopted as Article 17 of Directive 2019/790, evolved into a complex provision which attempts to recognise and balance the various interests at stake. As the Council has submitted, the legislature chose to protect both rightholders and users in that provision. As the Parliament points out, Article 17 reflects a delicate compromise in that regard. That development cannot be ignored when interpreting the article. (213)

177. The interpretation put forward by the Parliament, the Council and the Commission, to the effect that the rights of users under Article 17(7) of Directive 2019/790 must be taken into account ex ante, and not only ex post, ensures, moreover, the proportionality of the limitation on the exercise of the right to freedom of expression resulting from the contested provisions. (214)

178. In that regard, it is true that the complaint and redress mechanism envisaged in Article 17(9) of Directive 2019/790 is both an essential safeguard and a major step forward in relation to Directive 2000/31. (215) It is a necessary component of any filtering system, given the resulting risk of ‘over-blocking’. The EU legislature has also provided for procedural ‘sub-safeguards’ to accompany that mechanism. That mechanism must be ‘effective and expeditious’ and complaints submitted under it must be processed ‘without undue delay’. In other words, sharing service providers are required to act, in this respect, with the same promptness as they must display upon receiving notices from rightholders, under Article 17(4)(c) of Directive 2019/790. (216) In addition, rightholders must ‘duly’ justify their requests to have access to content disabled, and complaints must be examined by a natural person.

179. Furthermore, in accordance with Article 17(9), Member States must also ensure that out-of-court redress mechanisms are available for the settlement of disputes between users and rightholders. Such mechanisms are useful for the impartial resolution of those disputes. What is even more important, in my view, is that Member States are required to provide for ‘efficient judicial remedies’ in that area. In that regard, in its judgment in UPC Telekabel Wien, (217) the Court emphasised, in essence, that that right to an effective judicial remedy is essential in order to ensure the exercise of the right to freedom of expression online.

180. However, although those procedural safeguards are important, they are not sufficient on their own to ensure a ‘fair balance’ between copyright and users’ freedom of expression.

181. In the first place, in accordance with the case-law of the Court and of the ECtHR, the existence of such procedural safeguards does not exempt the public authorities from ensuring that the collateral effect of a filtering and blocking measure is minimised. These are separate and cumulative requirements.

182. Those two courts have repeatedly held that any filtering and blocking measure must be ‘strictly targeted’ in the sense that it must be aimed at illegal content and not have an arbitrary or excessive effect on lawful content. (218) In its judgment in L’Oréal and Others, (219) the Court held, to the same effect, that the surveillance measures imposed on an intermediary must not create obstacles to lawful uses of its service. Finally, in its judgment in UPC Telekabel Wien, (220) it held that a blocking measure must not ‘unnecessarily deprive’ Internet users of the possibility of lawfully sharing and accessing information.

183. That case-law does not mean that the right to freedom of expression precludes such measures if they are likely to result in any blocking of lawful content. The term ‘unnecessarily’, used by the Court, reflects, in my view, the idea that the effectiveness of the protection of the rights of rightholders may justify certain cases of ‘over-blocking’.

184. Nevertheless, there must again be a ‘fair balance’ between the effectiveness of filtering and its collateral effect. As is apparent, in essence, from the case-law of the ECtHR, in a democratic society it is not possible to require absolute effectiveness – and thus ‘zero risk’ of copyright infringement – where this would have the effect of blocking a significant amount of lawful content. (221)

185. The French Government contends that, according to its interpretation of Article 17 of Directive 2019/790, the filtering measures which sharing service providers must take under the contested provisions meet that requirement since they are ‘strictly targeted’ at content reproducing all or part of the works and other protected subject matter identified by rightholders.

186. That argument cannot be accepted. It is clear from the judgments in Scarlet Extended and SABAM that a filtering system which systematically blocks content that makes legitimate use of protected subject matter would disproportionately undermine freedom of expression and information. (222) This is the case, in my opinion, precisely because the collateral effect of such filtering is too great to be compatible with that freedom, irrespective of whether injured users have a right of appeal against the blocking of their information, something which the Court did not even mention in those judgments.

187. There are good reasons for this. In the present case, first, the preventive blocking of all content reproducing the works and other protected subject matter identified by rightholders would have the effect of systematically imposing the burden of taking action on users, since legitimate content could not be disseminated unless those users made a successful complaint. If those users had systematically to assert their rights under the complaint mechanism, it is highly likely that a significant proportion of them would refrain from doing so because, in particular, of a lack of sufficient knowledge to assess whether their use of that subject matter is legitimate and whether, therefore, there are grounds for making such a complaint. (223) The preventive ‘over-blocking’ of all of those legitimate uses, and the systematic reversal onto users of the burden of demonstrating that legitimacy, could therefore lead, in the short or long term, to a ‘chilling effect’ on the freedom of expression and creation, resulting in a decrease in the activity of those users. (224)

188. Secondly, the exchange of information online is characterised, in particular, by its speed. The public will search for certain types of content uploaded to sharing services only for a short period of time, in particular content relating to current events. (225) Such content thus often becomes obsolete within a few days. Delaying the posting of such content by its systematic blocking ex ante would risk rendering it irrelevant and of no interest to the public. Therefore, unlike the Spanish and French Governments, I consider that such systematic blocking would be particularly problematic, even if it were only ‘temporary’, since the possible restoration of content following the examination of users’ complaints is not capable of remedying the damage caused to those users’ freedom of expression. (226)

189. In the second place, I note that, in its recent case-law, the Court emphasises the need to ‘safeguard the effectiveness’ of exceptions and limitations to copyright, given their importance in order to maintain a ‘fair balance’ between the rights and interests at stake, in particular where they are aimed at ensuring that freedom of expression is observed – as is the case with regard to use for the purpose of quotation, criticism or review and use for the purpose of caricature, parody or pastiche. (227)

190. Specifically, in order to ‘safeguard the effectiveness’ of those exceptions and limitations, it is important, in my view, to ensure that the preventive measures taken pursuant to the contested provisions do not systematically undermine the right of users to make use of them. If, in the digital environment, rightholders have options for monitoring their protected subject matter which have no equivalent in the ‘real world’ – since content recognition tools give them virtually the means to prevent all uses of that subject matter, including uses not covered by their monopoly, such as parody – those exceptions and limitations must be protected just as effectively. The danger in that regard is that maximum protection of certain forms of intellectual creativity would be to the detriment of other forms of creativity which are also positive for society. (228)

191. It follows from all the foregoing, in my view, that, in accordance with a combined reading of the contested provisions and Article 17(7) of Directive 2019/790, the filtering measures which sharing service providers are required to implement must comply with two cumulative obligations: they must seek to prevent the uploading of content which unlawfully reproduces the works and other protected subject matter identified by rightholders while not preventing the making available of content which lawfully reproduces that subject matter.

192. Contrary to the applicant’s submissions, sharing service providers cannot therefore ‘apply any available measure’ to protect the intellectual property rights of rightholders. (229) The ‘best efforts’ and ‘professional diligence’ which they must exercise in that regard must be read in the light of Article 17(7) of Directive 2019/790. Since those providers supply their services both to users and to rightholders, they must act ‘diligently’ in relation to both categories.

193. Article 17(7) of Directive 2019/790 therefore obliges those providers – and also the administrative and judicial authorities of the Member States when supervising the implementation of that article (230) – to consider the collateral effect of the filtering measures they implement. (231) Therefore, they cannot preventively and systematically block content falling within, inter alia, the scope of the exceptions and limitations to copyright. They must take into account, ex ante, respect for users’ rights. I invite the Court to confirm unequivocally in its forthcoming judgment that this is the correct interpretation of Article 17.

–       The prohibition of general monitoring obligations (paragraph 8)

194. Article 17(8) of Directive 2019/790 provides that ‘the application of [that] Article shall not lead to any general monitoring obligation’. Accordingly, the contested provisions must also be read in the light of that paragraph.

195. By reaffirming the prohibition of such an ‘obligation’, (232) the EU legislature has, in my view, laid down another significant safeguard for freedom of expression. That prohibition delimits the scope of the filtering measures that may be expected of any intermediary provider and, in the present case, of sharing service providers.

196. In that regard, lessons can be drawn from the judgment in Glawischnig-Piesczek, which I mentioned above. (233) In that judgment, the Court, interpreting that prohibition as laid down in Article 15 of Directive 2000/31, took the view that the operator of a social network could be required, by means of a judicial injunction, to search for and block, among the information posted on that network, ‘a particular piece of information … the content of which was examined and assessed by a court having jurisdiction … which, following its assessment, declared it to be illegal’. (234) The court could thus require that operator to block access to all information identical to that declared illegal. The injunction could even extend to equivalent information, in so far as the operator would not be obliged to carry out an ‘independent assessment’ of its lawfulness and could, by contrast, have ‘recourse to automated search tools and technologies’. (235)

197. It follows, in general, that, although intermediary providers are technically well placed to combat the presence of certain illegal information disseminated through their services, (236) they cannot be expected to make ‘independent assessments’ of the lawfulness of the information in question. Those intermediary providers do not generally have the necessary expertise and, above all, the necessary independence to do so – particularly when they face the threat of heavy liability. (237) They cannot therefore be turned into judges of online legality, who are responsible for coming to decisions on legally complex questions. (238)

198. Consequently, in order to minimise the risk of ‘over-blocking’ and, therefore, to ensure compliance with the right to freedom of expression, an intermediary provider may, in my view, only be required to filter and block information which has first been established by a court as being illegal or, failing that, information the unlawfulness of which is obvious from the outset, that is to say, manifest, without, inter alia, the need for contextualisation. (239)

199. I observe, moreover, that the monitoring obligations which the ECtHR considered to be justified in its judgment in Delfi AS v. Estonia concerned information that was clearly unlawful. (240) In its subsequent case-law, the ECtHR has clarified that, in the case of information which is not immediately apparent as being unlawful and requires a contextual analysis, such monitoring cannot be required. (241) For the latter type of information, a duly reasoned notification, providing the contextual elements likely to make the unlawfulness apparent, or even, where such a notification is not sufficient in this respect, an injunction order, is necessary to obtain its removal.

200. Specifically, as I explained in my Opinion in Joined Cases YouTube and Cyando, (242) transposed to the field of copyright, it is clear from the judgment in Glawischnig-Piesczek that, although, in accordance with Article 15 of Directive 2000/31, an intermediary provider cannot be required to undertake general filtering of the information it stores in order to seek any infringement, that provision does not, a priori, prevent that provider from being compelled to block a specific file that makes an illicit use of a protected work, previously established by a court. That provision does not, in that context, preclude the provider from being obliged to detect and block not only identical copies of that file, but also other equivalent files, namely those that use the work in question in the same way.

201. That interpretation, in my view, can be transposed, mutatis mutandis, to Article 17(8) of Directive 2019/790. In so far as, within the scheme of Article 17, the unlawful nature of the content to be filtered has not been established at the outset by a court, it can only be a question, as explained in point 198 of this Opinion, of looking for content which, in the light of the information provided by rightholders, seems manifestly infringing. The filtering measures which sharing service providers are required to take under the contested provisions, read in the light of Article 17(8), must therefore, in my view, be limited to content which is ‘identical’ or ‘equivalent’ to works and other protected subject matter identified by rightholders. (243)

202. The first category referred to in the previous point specifically concerns identical reproductions, without additional elements or added value, of works and other protected subject matter identified by rightholders. The second concerns content which reproduces that subject matter in the same way, but with insignificant alterations, with the result that the public would not distinguish it from the original subject matter (for example in the case of simple technical alterations intended to circumvent the filtering system, such as a change in format, reversing the image or changing its speed, and so forth). (244) Detection of those two categories of content will not require sharing service providers to make an ‘independent assessment’ of their lawfulness – the infringement will seem manifest in the light of the ‘relevant and necessary’ information provided by rightholders – and may be carried out using ‘automated search tools and technologies’. (245)

203. However, sharing service providers cannot be required also to filter preventively content which, while it reproduces works and protected subject matter identified by rightholders, is significantly different from those works and that subject matter, as is the case when extracts of works are re-used in other contexts, or with ‘transformative’ content, and so forth, which may be covered by exceptions and limitations to copyright. Identifying possible infringements in that content would require ‘independent assessments’ on the part of those providers as they would have to evaluate the context of those uses. As the Republic of Poland submits, complex issues of copyright relating, inter alia, to the exact scope of the exceptions and limitations cannot be left to those providers. It is not for those providers to decide on the limits of online creativity, for example by examining themselves whether the content a user intends to upload meets the requirements of parody. Such delegation would give rise to an unacceptable risk of ‘over-blocking’. Those questions must be left to the court.

–       The consequences which arise from the foregoing

204. It is clear from the foregoing sections, in my view, that Article 17 of Directive 2019/790 contains sufficient safeguards to delimit the scope of the limitation on the exercise of the right to freedom of expression resulting from the contested provisions.

205. First, in accordance with paragraph 7 of that article, sharing service providers are not authorised preventively to block, pursuant to the contested provisions, all content which reproduces the works and other protected subject matter identified by rightholders, including content which may be lawful. Secondly, under paragraph 8 of that article, those providers may be obliged to detect and block only content which is ‘identical’ or ‘equivalent’ to that subject matter, that is to say, content the unlawfulness of which seems manifest in the light of the ‘relevant and necessary’ information provided by the rightholders. In such cases, since an infringement is highly probable, that content may be presumed to be illegal. It is therefore proportionate to block it preventively, with the onus being on the users concerned to demonstrate its lawfulness – for example, that they have a licence, or that the work is in fact in the public domain (246) – in the context of the complaint mechanism. In short, the ‘best efforts’ imposed on sharing service providers under the contested provisions consist of blocking those manifest infringements. (247)

206. Conversely, in all ambiguous situations – short extracts from works included in longer content, ‘transformative’ works, and so forth – in which, in particular, the application of exceptions and limitations to copyright is reasonably conceivable, the content concerned cannot be the subject of a preventive blocking measure.

207. As the Parliament, the Council and the Commission have pointed out, the obligation to achieve a result, laid down in the first subparagraph of Article 17(7) of Directive 2019/790, consisting of not preventing legitimate content from being uploaded is, in that regard, more binding than the obligations to make ‘best efforts’ arising from the contested provisions, which are obligations to use best endeavours. (248) This means that the intention of the EU legislature, quite rightly in my view, was to ensure that, in such a case, sharing service providers give priority to freedom of expression. In other words, the legislature considered that ‘false positives’, consisting of blocking legal content, were more serious than ‘false negatives’, which would mean letting some illegal content through.

208. Therefore, as the Parliament, the Council and the Commission have submitted, in those equivocal situations, the content concerned must be presumed to be lawful and, consequently, its uploading cannot be hindered.

209. The difficulty lies in defining practical solutions to enforce that dichotomy using automatic content recognition tools which, in many situations, sharing service providers will have to use. The applicant has also submitted that the EU legislature did not provide any concrete solution in that regard in Directive 2019/790.

210. That said, in my view, it was for the EU legislature, as I have indicated, to set out the substance of the safeguards necessary to minimise the risks posed to freedom of expression resulting from the contested provisions. However, as the Council has submitted, in an area which involves the adoption of technological measures, such as that at issue in the present case, and in view of the fact that Article 17 of Directive 2019/790 will apply to different types of providers, services and protected subject matter, it is for the Member States and the Commission to determine the detailed rules for such measures. (249)

211. In practice, those solutions will consist of incorporating, in content recognition tools, parameters which help to distinguish between what seems manifest and what is ambiguous. Those parameters may vary according to the types of protected subject matter and the exceptions in question. Account will have to be taken, for example, of the match rates detected by those tools, and thresholds will have to be determined above which the automatic blocking of content is justified and below which the application of an exception, such as quotation, is reasonably conceivable. (250) Such a solution could be coupled with a mechanism allowing users to flag, at the time of or immediately after uploading content, whether, in their view, they benefit from an exception or limitation, which would require the provider concerned to review the content in question manually in order to verify whether the application of that exception or limitation is manifestly precluded or, on the contrary, reasonably conceivable. (251)
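By way of illustration only, the following minimal sketch shows how such a decision logic might be parameterised. The threshold value, the names and the single match-rate criterion are hypothetical assumptions made for the purposes of exposition; they do not reflect the workings of any actual tool or any requirement of Directive 2019/790.

```python
# A purely illustrative sketch: the threshold, field names and decision rules are
# hypothetical assumptions, not the terms of Directive 2019/790 or of any real tool.
from dataclasses import dataclass
from enum import Enum

class Decision(Enum):
    BLOCK = "block ex ante (infringement seems manifest)"
    MANUAL_REVIEW = "manual review (user has invoked an exception)"
    ALLOW = "allow upload (equivocal content is presumed lawful)"

@dataclass
class Upload:
    match_rate: float        # share of the upload matching rightholder fingerprints (0.0 to 1.0)
    exception_claimed: bool  # user flagged quotation, parody, etc. at the time of upload

MANIFEST_THRESHOLD = 0.9     # hypothetical value; would vary by type of subject matter

def decide(upload: Upload) -> Decision:
    if upload.match_rate >= MANIFEST_THRESHOLD:
        # Near-complete matches: infringement is presumed, unless an exception is invoked,
        # in which case a natural person verifies whether that exception is manifestly precluded.
        return Decision.MANUAL_REVIEW if upload.exception_claimed else Decision.BLOCK
    # Below the threshold, an exception or limitation is reasonably conceivable:
    # the content must not be blocked preventively.
    return Decision.ALLOW

# Example: a 95% match with no exception claimed is blocked ex ante,
# whereas a 40% match (a conceivable quotation) is allowed.
assert decide(Upload(0.95, False)) is Decision.BLOCK
assert decide(Upload(0.40, False)) is Decision.ALLOW
```

On that logic, only content exceeding the threshold without any exception being invoked is treated as a manifest infringement; everything else benefits from the presumption of lawfulness described in points 207 and 208 of this Opinion.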

212. Generally speaking, in respect of the different types of providers, services and protected subject matter, the definition of those practical solutions can neither be left to those providers nor, contrary to the French Government’s submissions, be left entirely to rightholders. (252) In view of the importance of those solutions for users’ freedom of expression, they must not be defined opaquely by those private parties alone; rather, the process of defining them must be transparent and carried out under the supervision of public authorities.

213. In my view, this is precisely where the dialogue between stakeholders envisaged by the EU legislature in Article 17(10) of Directive 2019/790 is useful. That provision imposes the obligation on the Commission, in cooperation with the Member States, to organise dialogues between sharing service providers, rightholders, users’ organisations and other relevant stakeholders in order to examine ‘best practices for cooperation between … sharing service providers and rightholders’. On that basis, the Commission must issue guidance on the application of Article 17, in particular as regards the manner in which the contested provisions are to be implemented. In that process, ‘special account’ must be taken of ‘the need to balance fundamental rights and of the use of exceptions and limitations’. Therefore, the Commission, with the assistance of stakeholders, must propose practical solutions to enable the contested provisions to be implemented, in compliance with paragraphs 7 and 8 of Article 17. (253)

214. Lastly, I would point out that, as the Commission submits and in line with what I stated in point 183 of this Opinion, the obligation laid down in Article 17(7) of Directive 2019/790 does not mean that the mechanisms which lead to a negligible number of cases of ‘false positives’ are automatically contrary to that provision. Nevertheless, the error rate should be as low as possible. It follows that, in situations in which it is not possible, in the current state of technology, for example as regards certain types of works and protected subject matter, to use an automatic filtering tool without resulting in a ‘false positive’ rate that is significant, the use of such a tool should, in my view, be precluded under paragraph 7. (254)

215. The interpretation suggested in this Opinion is not called into question by the argument put forward by the Spanish and French Governments that it is imperative preventively to block all content which reproduces in whole or in part the protected subject matter identified by rightholders in order to eliminate any risk of the dissemination of illegal content on a sharing service, such dissemination being likely to cause them ‘irreparable’ harm, given the speed with which information is exchanged on the Internet.

216. In my view, although the risk of serious and imminent harm caused by an attempt to upload manifestly infringing content is such as to justify a measure which preventively blocks that content, (255) those rightholders cannot require ‘zero risk’ as regards possible infringements of their rights, as I have indicated in point 184 of this Opinion. It would be disproportionate to apply such measures in all the – more questionable – situations of potential harm caused, for example, by ‘transformative’ content which may or may not fall within the scope of the exceptions and limitations to copyright and which is not in direct competition with the original protected subject matter. (256) In those situations, adopting such preventive measures would, conversely, risk causing ‘irreparable’ damage to freedom of expression, for the reasons I have explained in point 188 of this Opinion.

217. Moreover, the Court has repeatedly held that ‘there is nothing whatsoever in the wording of Article 17(2) of the Charter to suggest that the right to intellectual property is inviolable and must for that reason be absolutely protected’. (257)

218. Furthermore, the interpretation suggested in this Opinion does not leave rightholders unprotected in respect of such equivocal content. In particular, it is not a question of revisiting the scope of the right of communication to the public as such. (258) The fact that certain content that unlawfully reproduces their works and other protected subject matter is not blocked when it is uploaded does not prevent those rightholders, inter alia, (259) from requesting the removal or permanent blocking of the content in question by means of a notice, in accordance with Article 17(4)(c) of Directive 2019/790, (260) containing reasonable explanations as to the reasons why, for example, the application of an exception should be precluded. (261) The provider concerned, for its part, will have to examine that notice diligently and decide whether, in the light of that new information, the unlawfulness is apparent. (262) Assuming that to be the case, the provider concerned must, if it is not to incur liability, promptly block access to the content or remove it from its website. As the Commission points out, it is clear from the second paragraph of recital 66 of Directive 2019/790 (263) that the EU legislature envisaged that, in some cases, that approach is the only way to ensure the unavailability of particular content. In the event that the unlawfulness is not apparent from those explanations, on the ground that the content in question raises complex and/or new legal questions concerning copyright, the intervention of the court, which alone is competent to decide such questions, will in principle be necessary. It will then be the responsibility of the rightholders to refer the matter to a judicial authority, in particular on the basis of Article 8(3) of Directive 2001/29, so that that authority can rule on the content and, if it is unlawful, order its blocking.

219. As the Parliament has rightly pointed out, that ensures a ‘fair balance’ between the measures imposed on users, in some cases, to be able to upload their content and those required of rightholders, in other cases, to have content removed. (264)

4.      Conclusion as to the compatibility of the limitation in question with the Charter

220. It follows from all of the foregoing considerations that the limitation on the exercise of the right to freedom of expression and information resulting from the contested provisions, as interpreted in this Opinion, satisfies all of the conditions laid down in Article 52(1) of the Charter. In my view, that limitation is therefore compatible with the Charter. Consequently, the action brought by the Republic of Poland must, in my view, be dismissed. (265)

C.      Postscript

221. Subsequent to the drafting of this Opinion, in the course of its translation by the services of the Court, two important documents have been published.

222. First, the judgment in YouTube and Cyando (266) has been delivered. The reasoning adopted by the Court in that judgment with regard to Directives 2000/31 and 2001/29, which I cannot examine in detail here, does not, in my view, call into question the considerations developed in this Opinion. (267)

223. Secondly, the Commission has published its guidance on the application of Article 17 of Directive 2019/790. (268) In essence, that guidance sets out what the Commission had submitted before the Court and reflects the explanations given in points 158 to 219 of this Opinion. However, that guidance also states, in an unprecedented fashion, that rightholders should have the possibility to ‘earmark’ subject matter the unauthorised uploading of which ‘could cause significant economic harm to them’. Sharing service providers should exercise particular diligence with regard to such subject matter. It is further stated that those providers would not be fulfilling their ‘best efforts’ obligations if they allowed content reproducing that same subject matter to be uploaded despite such ‘earmarking’. If that is to be understood as meaning that those providers should block content ex ante merely because rightholders have asserted a risk of significant economic harm – the guidance containing no other criterion objectively limiting the ‘earmarking’ mechanism to specific cases (269) – even where that content is not manifestly infringing, I cannot agree with it without altering all of the considerations set out in this Opinion.

VI.    Costs

224. Under Article 138(1) of the Rules of Procedure of the Court of Justice, the unsuccessful party is to be ordered to pay the costs if they have been applied for in the successful party’s pleadings. Since, in my view, the action brought by the Republic of Poland must be dismissed and the Parliament and the Council have applied for costs, that Member State should be ordered to pay the costs. Nevertheless, the Spanish and French Governments and the Commission, which have intervened in the proceedings, should bear their own costs, in accordance with Article 140(1) of those Rules of Procedure.

VII. Conclusion

225. In the light of all of the foregoing considerations, I propose that the Court should:

–        dismiss the action brought by the Republic of Poland;

–        order that Member State to pay the costs; and

–        order the Kingdom of Spain, the French Republic and the European Commission to bear their own costs.


1      Original language: French.


2      OJ 2019 L 130, p. 92.


3      I shall use the terms ‘post’ and ‘upload’ without distinction to refer to the process by which digital content is made available to the public on websites or applications for smart devices associated with those sharing services.


4      Directive of the European Parliament and of the Council of 8 June 2000 (OJ 2000 L 178, p. 1).


5      Directive of the European Parliament and of the Council of 22 May 2001 (OJ 2001 L 167, p. 10).


6      Judgment of 24 November 2011 (C‑70/10, EU:C:2011:771; ‘the judgment in Scarlet Extended’).


7      Judgment of 16 February 2012 (C‑360/10, EU:C:2012:85; ‘the judgment in SABAM’).


8      Judgment of 3 October 2019 (C‑18/18, EU:C:2019:821; ‘the judgment in Glawischnig-Piesczek’).


9      I should point out at this point that, after the drafting of this Opinion, in the course of its translation, first, the judgment of 22 June 2021, YouTube and Cyando (C‑682/18 and C‑683/18, EU:C:2021:503), was delivered and, secondly, the Commission published its guidance on the application of Article 17 of Directive 2019/790 (communication from the Commission to the European Parliament and the Council, ‘Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market’, 4 June 2021 (COM(2021) 288 final)). In view of the advanced stage of this Opinion, I have confined myself to examining those two documents in a postscript, which the reader will find in point 221 et seq. of this Opinion.


10      Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market (COM(2016) 593 final) (‘the proposal for a directive’).


11      See Explanatory Memorandum of the proposal for a directive, pp. 2 and 3.


12      See the proposal for a directive, p. 3, and ‘Commission Staff Working Document, Impact Assessment on the modernisation of EU copyright rules’ (SWD(2016) 301 final) (‘Impact Assessment’), Part 1/3, pp. 137-141.


13      See, in relation to the YouTube platform, my Opinion in Joined Cases YouTube and Cyando (C‑682/18 and C‑683/18, EU:C:2020:586; ‘Opinion in Joined Cases YouTube and Cyando’; points 14 to 18).


14      For example, several hundred thousand videos are published every day on YouTube by users of that platform, who, if Google is to be believed, number more than 1.9 thousand million. See my Opinion in Joined Cases YouTube and Cyando (point 43).


15      See Impact Assessment, Part 1/3, pp. 137, 139 and 142, and Part 3/3, Annex 12B.


16      Article 3(1) of Directive 2001/29 therefore contains, strictly speaking, a ‘right of communication to the public’ and a ‘right of making available to the public’. Nevertheless, since the former encompasses the latter, for convenience I shall use ‘communication to the public’ to designate those two rights without distinction.


17      See the list in Article 3(2) of Directive 2001/29, reproduced in point 9 of this Opinion.


18      Strictly speaking, Article 3(2) of Directive 2001/29 grants only holders of related rights an exclusive right to make their protected subject matter ‘available to the public’. The right of ‘communication to the public’ in the strict sense is conferred on them by Article 8 of Directive 2006/115/EC of the European Parliament and of the Council of 12 December 2006 on rental right and lending right and on certain rights related to copyright in the field of intellectual property (OJ 2006 L 376, p. 28). For some holders of related rights, the latter right is an exclusive right; for others it is only a right to remuneration. That said, those nuances are irrelevant to the present case. I shall therefore confine my references to Article 3 of Directive 2001/29.


19      Subject to the exceptions and limitations to copyright (see point 144 of this Opinion).


20      See, inter alia, judgment of 14 November 2019, Spedidam (C‑484/18, EU:C:2019:970, paragraph 38 and the case-law cited).


21      See, in that regard, my Opinion in Joined Cases YouTube and Cyando (points 53 to 93).


22      Article 14(1) of Directive 2000/31 applies horizontally to all forms of content and liability, irrespective of the area of law concerned (intellectual property, defamation, online hate, and so forth). See my Opinion in Joined Cases YouTube and Cyando (point 138 and footnote 128).


23      See, in that regard, my Opinion in Joined Cases YouTube and Cyando (points 132 to 168).


24      It has just done so, to some extent, in its judgment of 22 June 2021, YouTube and Cyando (C‑682/18 and C‑683/18, EU:C:2021:503). See, on this judgment, point 222 of this Opinion.


25      See Impact Assessment, Part 1/3, p. 140.


26      See Explanatory Memorandum of the proposal for a directive, p. 3.


27      See point 57 of this Opinion.


28      See the third paragraph of recital 38, recital 39 and Article 13(1) of the proposal for a directive.


29      See, inter alia, the petition ‘Stop the censorship-machinery! Save the Internet!’ (available at https://www.change.org/p/european-parliament-stop-the-censorship-machinery-save-the-internet). See, also, Kaye, D., ‘Mandate of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression’, 13 June 2018, and ‘Open Letter to Members of the European Parliament and the Council of the European Union: The Copyright Directive is failing’, 26 April 2018, United Nations.


30      Directive 2019/790 was not adopted unanimously. In the final vote in the Council, six Member States (the Italian Republic, the Grand Duchy of Luxembourg, the Kingdom of the Netherlands, the Republic of Poland, the Republic of Finland and the Kingdom of Sweden) opposed the text, while three Member States (the Kingdom of Belgium, the Republic of Estonia and the Republic of Slovenia) abstained (see document 8612/19 of 16 April 2019, ‘Voting result, Directive of the European Parliament and of the Council on copyright and related rights in the Digital Single Market and amending Directives 96/9/EC and 2001/29/EC (first reading)’, available at https://data.consilium.europa.eu/doc/document/ST‑8612-2019-INIT/en/pdf). Moreover, several Member States have indicated in various statements their concerns about the effects of that directive on users’ rights (see the joint statement by the Netherlands, Luxembourg, Poland, Italy and Finland, the statement by Estonia and the statement by Germany, available at http://data.consilium.europa.eu/doc/document/ST‑7986-2019-ADD-1-REV-2/EN/pdf).


31      Directive 2019/790 was published in the Official Journal of the European Union on 17 May 2019. It entered into force on 7 June of that year (see Articles 29 and 31 of that directive).


32      For convenience, I shall refer to ‘sharing service providers’.


33      Recital 62 of Directive 2019/790 states, moreover, that the concept of an ‘online content-sharing service provider’ covers services that ‘play an important role on the online content market by competing with other online content services, such as online audio and video streaming services, for the same audiences’, which reflects the argument summarised in point 15 of this Opinion.


34      The second subparagraph of Article 2(6) of Directive 2019/790 contains a non-exhaustive list of service providers to whom Article 17 of that directive should not apply.


35      This liability does not replace, but is additional to, the liability of users who upload content, who themselves perform separate acts of ‘communication to the public’. See, however, footnote 265 of this Opinion.


36      Recital 64 of Directive 2019/790 states that this is a ‘clarification’. In fact, in my view, the EU legislature has redefined the scope of the right of ‘communication to the public’ within the meaning of Article 3 of Directive 2001/29 for the (sole) purpose of the application of that Article 17. See my Opinion in Joined Cases YouTube and Cyando (points 250 to 255).


37      Recital 3 of Directive 2019/790.


38      See recital 61 of Directive 2019/790.


39      See recital 61 of Directive 2019/790, which states that ‘as contractual freedom should not be affected by those provisions, rightholders should not be obliged to give an authorisation or to conclude licensing agreements’.


40      As stated in the second subparagraph of Article 17(3) of Directive 2019/790, that must not affect the application of Article 14 to sharing service providers for purposes falling outside the scope of that directive. As I explained in my Opinion in Joined Cases YouTube and Cyando (points 141 to 168), the providers in question benefit, in my view, in other situations, from the exemption laid down in Article 14 of Directive 2000/31. As the Commission submits, Article 17(3) of Directive 2019/790 is therefore a lex specialis in relation to Article 14 of Directive 2000/31.


41      This must be understood as meaning that providers are liable for ‘unlawful’ acts of communication to the public, that is to say, unauthorised acts in respect of which no exception or limitation applies (see point 143 et seq. of this Opinion).


42      See my Opinion in Joined Cases YouTube and Cyando (points 100 and 101).


43      See Article 13 of Directive 2004/48/EC of the European Parliament and of the Council of 29 April 2004 on the enforcement of intellectual property rights (OJ 2004 L 157, p. 45).


44      See my Opinion in Joined Cases YouTube and Cyando (points 73 to 78).


45      First of all, not all rightholders will want to authorise the use of their protected works and subject matter on those services. Next, while it will be relatively easy for sharing service providers to conclude licences, where appropriate, with the ‘heavyweights’ or with collective management organisations, this will be more complex with regard to the myriad of ‘small’ rightholders and individual authors. Finally, that complexity is compounded by the fact that the content uploaded to sharing services is likely to involve many different types of rights and that copyright and related rights are subject to the principle of territoriality. Licences therefore operate on a ‘country-by-country’ basis, which multiplies the number of authorisations which must be obtained.


46      See recital 66 of Directive 2019/790.


47      With the exception of Article 17(6) of Directive 2019/790 which lays down specific conditions for exemption from liability for ‘new’ sharing service providers, and goes beyond the scope of the present action.


48      I shall therefore confine my examination to that fundamental right, irrespective of the questions that Article 17 of Directive 2019/790 may raise in relation to other fundamental rights guaranteed by the Charter, such as the freedom to conduct a business (Article 16).


49      See, inter alia, judgment of 8 December 2020, Poland v Parliament and Council (C‑626/18, EU:C:2020:1000, paragraphs 28 and the case-law cited).


50      With the exception of the definition of ‘online content-sharing service provider’ in Article 2(6) of Directive 2019/790, which would lose its rationale.


51      In accordance with Article 15 of Directive 2000/31 (see point 105 of this Opinion). That said, certain monitoring obligations may be imposed on providers, independently of that exemption from liability, by means of injunctions (see, inter alia, Article 14(3) of Directive 2000/31 and Article 8(3) of Directive 2001/29).


52      See, for more details, my Opinion in Joined Cases YouTube and Cyando (points 173 to 196).


53      The condition provided for in Article 17(4)(c), in principio, is, therefore, similar to those laid down in Article 14 of Directive 2000/31.


54      An obligation to use best endeavours requires the debtor to make best efforts to achieve a result without being obliged to achieve it. See, to that effect, judgments of 4 June 2009, Commission v Greece (C‑250/07, EU:C:2009:338, paragraph 68), and of 27 March 2014, UPC Telekabel Wien (C‑314/12, EU:C:2014:192, paragraph 53).


55      See the second paragraph of recital 66 of Directive 2019/790.


56      Accordingly, Article 17(4) of Directive 2019/790 is, in essence, a system of liability for negligence: sharing service providers will be held liable, in accordance with that provision, where they have not taken sufficient care to prevent the uploading of illegal content by users of their services. Article 17 is, in that respect, a sort of ‘hybrid’ construction between the direct liability borne by those who commit unlawful acts and the indirect (or ‘secondary’) liability borne by intermediaries for the acts of third parties. See, for that distinction, my Opinion in Joined Cases YouTube and Cyando (points 64, 65, 102 and 103).


57      See Impact Assessment, Part 1/3, p. 140.


58      See my Opinion in Joined Cases YouTube and Cyando (point 193).


59      See Impact Assessment, Part 1/3, p. 137, and recital 61 of Directive 2019/790.


60      See, inter alia, judgment of 8 September 2016, GS Media (C‑160/15, EU:C:2016:644, paragraph 28 and the case-law cited).


61      ‘Hashing’ consists in using a dedicated tool to represent a computer file digitally by a unique alphanumeric character string, known as a ‘hashcode’. By comparing that hashcode with those of files uploaded to a server, it is possible automatically to detect all identical copies of the original file contained on that server. ‘Watermarking’ consists of embedding into content, using a dedicated tool, a specific ‘marker’ which may be visible or invisible to the naked eye, which can then be traced in order to identify the original content and the copies made of it. Lastly, ‘fingerprinting’ involves using a dedicated tool to generate a unique digital representation (‘fingerprint’) of particular content – an image, a phonogram, a video, and so forth – by reducing it to some of its characteristic elements. By comparing that ‘fingerprint’ with those of the files on a server, it is possible to identify all files which, in essence, have matching content. For more details, see Mochon, J.‑P., ‘Rapport de mission – Une application effective du droit d’auteur sur les plateformes numériques de partage: État de l’art et propositions sur les outils de reconnaissance des contenus’, Conseil supérieur de la propriété littéraire et artistique, 29 January 2020.
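As a purely indicative illustration of the ‘hashing’ technique just described, the following sketch uses SHA-256 from the Python standard library as a stand-in for the dedicated tools referred to above; the byte strings are placeholders for real file contents. It also shows why, as noted in the following footnote, the slightest alteration defeats detection by hashcode.

```python
# An illustrative sketch only: SHA-256 stands in for the dedicated hashing tools
# described above, and the byte strings are placeholders for real file contents.
import hashlib

def hashcode(data: bytes) -> str:
    """Represent a file digitally by a unique alphanumeric character string."""
    return hashlib.sha256(data).hexdigest()

# Hashcodes of works identified by rightholders, held in the server-side index.
server_index = {hashcode(b"bytes of the original protected work")}

# An identical copy produces the same hashcode and is detected automatically...
print(hashcode(b"bytes of the original protected work") in server_index)   # True

# ...but the slightest alteration yields an entirely different hashcode (see footnote 62).
print(hashcode(b"bytes of the original protected work!") in server_index)  # False
```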


62      The tools that use the ‘hashing’ technique are of limited efficacy when it comes to content recognition, since that technique merely makes it possible, as I stated in the previous footnote, to detect identical copies of a particular computer file. The slightest alteration compared to the original file (a change in image pixel values, and so forth) will prevent automatic detection, even though the files being compared are, in essence, identical in content. Similarly, the technique of ‘watermarking’ can only detect copies of a marked file and can be easily circumvented. See Mochon, J.‑P., op. cit.


63      These tools are used to detect other types of illegal content (child pornography, insulting content, and so forth). See Mochon, J.‑P., op. cit.


64      The most famous in this respect is undoubtedly the ‘Content ID’ software developed by Google for YouTube. See my Opinion in Joined Cases YouTube and Cyando (point 22).


65      In practice, the system involves generating ‘digital fingerprints’ of works and other protected subject matter identified by rightholders and entering those ‘fingerprints’ into a database associated with the recognition tool. Then, using an algorithm, all uploaded files are automatically scanned and their own ‘fingerprints’ are compared to those in that database in order to detect matches. Recognition tools using ‘digital fingerprinting’ are capable of identifying such matches even over a short duration (for example, several seconds in the case of a phonogram), or where the content has been altered in order to avoid automatic detection (for example, the film image has been reversed, accelerated, and so forth). Certain tools, such as Content ID, are even capable of recognising not only phonograms, but also the melodies of underlying works. See Mochon, J.‑P., op. cit.
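To make the comparison step concrete, the following deliberately simplified sketch implements an average-brightness ‘fingerprint’ over a small grayscale thumbnail, matched by Hamming distance, with a mirrored variant checked so that a horizontally reversed image is still caught. The thumbnail size, the distance threshold and the flip check are illustrative assumptions and bear no relation to the internals of real tools such as Content ID.

```python
# A toy sketch of 'digital fingerprinting': the thumbnail size, the Hamming
# threshold of 10 and the flip check are illustrative assumptions only; they do
# not describe the internals of any real tool such as Content ID.
from typing import List

Image = List[List[int]]  # grayscale pixel values of a small, fixed-size thumbnail

def fingerprint(img: Image) -> List[int]:
    """Reduce content to characteristic elements: one bit per pixel (above/below average)."""
    pixels = [p for row in img for p in row]
    average = sum(pixels) / len(pixels)
    return [1 if p > average else 0 for p in pixels]

def hamming(a: List[int], b: List[int]) -> int:
    """Count the differing bits between two fingerprints of equal length."""
    return sum(x != y for x, y in zip(a, b))

def matches(candidate: Image, reference: Image, threshold: int = 10) -> bool:
    """Report a match despite small alterations, including a horizontally reversed image."""
    ref = fingerprint(reference)
    direct = hamming(fingerprint(candidate), ref)
    mirrored = hamming(fingerprint([row[::-1] for row in candidate]), ref)
    return min(direct, mirrored) <= threshold
```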


66      For more details on these tools and their providers, see Impact Assessment, Part 3/3, pp. 164-172, and Mochon, J.‑P., op. cit.


67      See point 22 of this Opinion.


68      See Impact Assessment, Part 1/3, pp. 140-144. See, to the same effect, Communication from the Commission, ‘Tackling illegal content online. Towards an enhanced responsibility of online platforms’ (COM(2017) 555 final), 28 September 2017, pp. 13 and 14.


69      Recital 68 of Directive 2019/790 merely states that ‘various actions could be undertaken’ by sharing service providers.


70      See, to the same effect, Parliament press release, 27 March 2019, ‘Questions and answers on issues about the digital copyright directive’: ‘Is the directive creating automatic filters on online platforms? No. The draft directive sets a goal to be achieved … The draft directive however does not specify or list what tools, human resources or infrastructures may be needed to prevent unremunerated material appearing on the site. There is therefore no requirement for upload filters. However, if large platforms do not come up with any innovative solutions, they may end up opting for filters …’ (available at https://www.europarl.europa.eu/news/en/press-room/20190111IPR23225/questions-and-answers-on-issues-about-the-digital-copyright-directive).


71      That is also the opinion of many experts in the field. See, in particular, Grisse, K., ‘After the storm – examining the final version of Article 17 of the new Directive (EU) 2019/790’, Journal of Intellectual Property Law & Practice, 2019, vol. 14, No 11, pp. 887-899, in particular pp. 894 and 895; Leitsner, M., ‘European Copyright Licensing and Infringement Liability Under Art. 17 DSM-Directive – Can We Make the New European System a Global Opportunity Instead of a Local Challenge?’, Zeitschrift für geistiges Eigentum, 2020, vol. 12, No 2, pp. 123-214, in particular pp. 141 and 143; Lambrecht, M., ‘Free speech by design – Algorithmic protection of exceptions and limitations in the Copyright DSM directive’, JIPITEC, vol. 11, 2020, pp. 68-94, in particular p. 71; Dusollier, S., ‘The 2019 Directive on Copyright in the Digital Single Market: Some Progress, a Few Bad Choices, and an Overall Failed Ambition’, Common Market Law Review, vol. 57, 2020, pp. 979-1030, in particular p. 1016; Mochon, J.‑P., op. cit., p. 106; and Frosio, G., and Mendis, S., ‘Monitoring and Filtering: European Reform or Global Trend?’, Oxford Handbook of Online Intermediary Liability, Frosio, G. (ed.), Oxford University Press, 2020, in particular p. 562.


72      See Lambrecht, M., op. cit., p. 71: ‘… if YouTube wanted to ensure a human review of the 432 000 hours of video uploaded daily, it would have to hire roughly 70 000 full time (very efficient) employees’.
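As a rough plausibility check of that figure (my own arithmetic, on the hypothetical assumption that a reviewer could watch about six hours of footage in an eight-hour working day): 432 000 hours ÷ 6 hours per reviewer ≈ 72 000 reviewers, of the same order as the 70 000 cited.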


73      That does not mean that sharing service providers could not have certain content checked by their employees at all. Nevertheless, those providers will have to use automatic tools in order, at the very least, to reduce the mass of content which must undergo such verification (see point 211 of this Opinion).


74      Some operators have apparently started to develop and use tools which can identify content using artificial intelligence. See, inter alia, Mochon, J.‑P., op. cit., p. 35. In any event, this, by its very nature, is still an automatic content recognition technique.


75      In particular, the French Government acknowledged that ‘in the current state of technology, the use of automatic filtering mechanisms appears to be the most effective means of identifying rapidly the unauthorised upload of protected content, given the mass of content which is uploaded continuously to the platforms covered by Article 17’. The Council also acknowledged that ‘large’ sharing service providers might ‘feel obliged’ to use such tools.


76      ‘Digital fingerprinting’ recognition tools can be used on audio, photo and video content. See Mochon, J.‑P., op. cit. I would point out that Article 17 of Directive 2019/790 applies, in the absence of any limitation in that regard in the wording of that article or in the definition of an ‘online content-sharing service provider’ contained in Article 2(6) of that directive, to all types of protected subject matter (visual, musical, cinematographic, textual, but also lines of code, video games, and so forth).


77      See also the second paragraph of recital 66 of Directive 2019/790 (‘… Different means to avoid the availability of unauthorised copyright-protected content could be appropriate and proportionate depending on the type of content …’).


78      See inter alia, Mochon, J.‑P., op. cit., p. 12. This seems to be the case, inter alia, for gameplay footage of video games. In the absence of software tools which enable certain types of protected works and subject matter to be filtered automatically and effectively, it cannot be ruled out that the scope of the obligations of diligence imposed on sharing service providers is, in that regard, significantly reduced, pursuant to the principle of proportionality. See, to that effect, the second paragraph of recital 66 of Directive 2019/790 according to which ‘it cannot be excluded that in some cases availability of unauthorised content can only be avoided upon notification of rightholders’.


79      See Grisse, K., op. cit., p. 895, and Frosio, G., and Mendis, S., op. cit., p. 562.


80      See, by analogy, judgment of 27 March 2014, UPC Telekabel Wien (C‑314/12, EU:C:2014:192, paragraphs 51 to 53).


81      Furthermore, content recognition tools will, in many cases, enable sharing service providers to fulfil the transparency obligation imposed on them by the second subparagraph of Article 17(8) of Directive 2019/790. In accordance with that provision, those providers must provide rightholders with information on the use of content covered by any licensing agreements concluded with them. Those tools in fact make it possible to gather statistics, which are often very precise, regarding the audience of content present on those services (see Impact Assessment, Part 3/3, p. 165, and point 58 of this Opinion).


82      See, inter alia, judgment of 29 July 2019, Funke Medien NRW (C‑469/17, EU:C:2019:623, paragraph 73 and the case-law cited). See, also, Article 19 of the Universal Declaration of Human Rights, adopted on 10 December 1948 by the United Nations General Assembly (Resolution 217 A(III)), and Article 19 of the International Covenant on Civil and Political Rights, adopted on 16 December 1966 by the United Nations General Assembly.


83      See, by analogy, ECtHR, 19 February 2013, Neij and Sunde Kolmisoppi v. Sweden, CE:ECHR:2013:0219DEC004039712 (‘ECtHR, Neij and Others v. Sweden’), pp. 9 and 10; ECtHR, 10 April 2013, Ashby Donald and Others v. France, CE:ECHR:2013:0110JUD003676908 (‘ECtHR, Ashby Donald and Others v. France’), § 34; and Opinion of Advocate General Jääskinen in L’Oréal and Others (C‑324/09, EU:C:2010:757, points 49 and 157).


84      Although the ECHR does not recognise such a freedom as an autonomous right, the ‘freedom of artistic expression’ falls within the scope of Article 10 of that convention. See, inter alia, ECtHR, 24 May 1988, Müller and Others v. Switzerland, CE:ECHR:1988:0524JUD001073784, § 27, and ECtHR, 8 July 1999, Karataş v. Turkey, CE:ECHR:1999:0708JUD002316894, § 49.


85      To my knowledge, the only opinions, information or ideas which are excluded automatically from the protection conferred by Article 10 of the ECHR are hate speech, on the ground that it is incompatible with the values proclaimed and guaranteed by that convention (see, inter alia, ECtHR, Gunduz v. Turkey, CE:ECHR:2003:1204JUD003507197, § 41).


86      See, inter alia, ECtHR, Neij and Others v. Sweden, pp. 10 and 12, and ECtHR, Ashby Donald and Others v. France, §§ 35 and 44. See, also, Smith, G., ‘Copyright and freedom of expression in the online world’, Journal of Intellectual Property Law & Practice, 2010, vol. 5, No 2, pp. 88-95, and Michaux, B., ‘Chapitre 13. Diffusion du savoir. Droit d’auteur et Internet’, L’Europe des droits de l’homme à l’heure d’Internet, Van Enis, Q. (ed.), Bruylant, 2019, pp. 491-526. See also point 117 of this Opinion.


87      See ECtHR, 18 December 2012, Ahmet Yildirim v. Turkey, CE:ECHR:2012:1218JUD000311110 (‘ECtHR, Ahmet Yildirim v. Turkey’), § 55; ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 36; and ECtHR, 30 April 2019, Kablis v. Russia, CE:ECHR:2019:0430JUD004831016 (‘ECtHR, Kablis v. Russia’), § 90.


88      See, by analogy, the judgments in Scarlet Extended (paragraphs 29, 36, 37 and 40) and SABAM (paragraphs 26, 35, 37 and 38). This may be contrasted with the ‘repressive’ measures which have existed until now, such as the removal, following notification or ordered by a court, of information which has already been posted, the unlawfulness of which is manifest and/or has been assessed by that court.


89      See, by analogy, ECtHR, Ahmet Yildirim v. Turkey, § 55, and ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 36 (concerning measures to block websites), and Opinion of Advocate General Cruz Villalón in Scarlet Extended (C‑70/10, EU:C:2011:255, point 85). The fact, as I shall explain in detail below, that blocked content may, if appropriate, be re-uploaded where the users concerned demonstrate, in the context of the complaint mechanism envisaged in Article 17(9) of Directive 2019/790, that that content does not infringe copyright is not, in my view, capable of calling into question that finding of ‘interference’. The same applies to the question of whether or not such filtering is strictly targeted at illegal content. However, those factors will be analysed as part of the examination of the proportionality of that interference.


90      Under those circumstances, the question which arises in respect of that freedom is to what extent, in view of the importance acquired by those services – which have become essential infrastructures for the exercise of the freedom of online communication (see point 103 of this Opinion) – must their providers respect the fundamental rights of users and to what extent are public authorities required, under the ‘positive obligations’ deriving from those articles, to adopt ‘positive measures of protection’, guaranteeing the effective enjoyment of that freedom in relations between the users and providers of sharing services. There is no need to answer those questions in the present case. See, with regard to the ‘positive obligations’ under Article 10 of the ECHR, in particular, ECtHR, 6 May 2003, Appleby v. the United Kingdom, CE:ECHR:2003:0506JUD004430698, § 39, and ECtHR, 16 December 2008, Khurshid Mustafa v. Sweden, CE:ECHR:2008:1216JUD002388306, § 31.


91      See, by analogy, ECtHR, 28 June 2001, VgT Verein gegen Tierfabriken v. Switzerland, CE:ECHR:2001:0628JUD002469994, §§ 44 to 47. In that case, the ECtHR held that the refusal, by a private television channel, to broadcast a television commercial produced by an animal protection association constituted an ‘interference’ with that association’s freedom of expression that was attributable to the respondent State since that refusal sought to comply with the national law on television and radio which prohibited political advertising. Therefore, unlike the Republic of Poland, I do not consider the theory of ‘positive obligations’ to be relevant in the present case. In any event, that point is not decisive in the reasoning. In that regard, the ECtHR has repeatedly held that ‘the boundaries between the State’s positive and negative obligations under the [ECHR] do not lend themselves to precise definition’ and that the principles that apply in either case are essentially the same (see, inter alia, ECtHR, 13 July 2012, Mouvement Raëlien Suisse v. Switzerland, CE:ECHR:2011:0113JUD001635406, § 50 and the case-law cited).


92      See, by analogy, ECtHR, 25 March 1993, Costello-Roberts v. the United Kingdom, CE:ECHR:1993:0325JUD001313487, § 27: ‘the State cannot absolve itself from responsibility by delegating its obligations to private bodies or individuals’.


93      See footnote 45 of this Opinion.


94      I would add, in conclusion, that the very fact that sharing service providers are required indirectly to carry out such monitoring of their services constitutes, in itself, an ‘interference’ by the EU legislature with the freedom of expression of those providers. Since they provide everyone with the means to receive or impart information, their activity falls within the scope of Article 11 of the Charter and Article 10 of the ECHR. See, by analogy, ECtHR, Neij and Others v. Sweden, pp. 9 and 10; ECtHR, 2 February 2016, Magyar Tartalomzolgáltatók Egyesület and Index.hu zrt v. Hungary, CE:ECHR:2016:0202JUD002294713, § 45; and ECtHR, 4 June 2020, Jezior v. Poland, CE:ECHR:2020:0604JUD003195511, § 41.


95      See, inter alia, ECtHR, 14 March 2002, Gaweda v. Poland, CE:ECHR:2002:0314JUD002622995, § 37; ECtHR, Ahmet Yildirim v. Turkey, § 56; and ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 36.


96      See point 71 of this Opinion. Moreover, under Article 53 of the Charter, the level of protection afforded by that instrument can never be lower than that guaranteed by the ECHR. To that end, the Court must adopt an interpretation of the conditions laid down in Article 52(1) of the Charter which is at least as strict as the ECtHR’s interpretation of the conditions set out in Article 10(2) of the ECHR.


97      See, inter alia, judgment of 17 December 2015, WebMindLicenses (C‑419/14, EU:C:2015:832, paragraph 81); Opinion 1/15 (EU-Canada PNR Agreement), of 26 July 2017 (EU:C:2017:592, paragraph 146); and my Opinion in Facebook Ireland and Schrems (C‑311/18, EU:C:2019:1145, point 263).


98      See, inter alia, judgment of 17 December 2015, WebMindLicenses (C‑419/14, EU:C:2015:832, paragraph 81).


99      See, inter alia, ECtHR, 26 April 1979, Sunday Times v. the United Kingdom, CE:ECHR:1979:0426JUD000653874, § 49; ECtHR, 14 March 2002, Gaweda v. Poland, CE:ECHR:2002:0314JUD002622995, § 39; and ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 37.


100      To that effect, the ECtHR has repeatedly held that ‘whilst certainty is desirable, it may bring in its train excessive rigidity, and the law must be able to keep pace with changing circumstances. Accordingly, many laws are inevitably couched in terms which, to a greater or lesser extent, are vague, and whose interpretation and application are questions of practice’. See, inter alia, ECtHR, 16 June 2015, Delfi AS v. Estonia, CE:ECHR:2015:0616JUD006456909 (‘ECtHR, Delfi AS v. Estonia’), § 121 and the case-law cited.


101      The mere fact that, in the present case, the parties and the interveners have put forward different interpretations of Article 17 of Directive 2019/790 (see points 168 and 170 of this Opinion) does not mean that the requirement of ‘foreseeability’ has not been met (see, inter alia, ECtHR, 17 February 2004, Gorzelik and Others v. Poland, CE:ECHR:2004:0217JUD004415898, § 65). It will be for the Court to clarify the correct interpretation of that provision.


102      See, inter alia, judgment of 17 December 2015, WebMindLicenses (C‑419/14, EU:C:2015:832, paragraph 81).


103      See, inter alia, ECtHR, Ahmet Yildirim v. Turkey, §§ 59 and 64, and ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 37.


104      See, by analogy, judgment of 6 October 2020, La Quadrature du Net and Others (C‑511/18, C‑512/18 and C‑520/18, EU:C:2020:791, paragraph 132).


105      See point 128 et seq. of this Opinion.


106      See my Opinion in Facebook Ireland and Schrems (C‑311/18, EU:C:2019:1145, point 272).


107      See points 117 and 129 of this Opinion.


108      See, inter alia, ECtHR, 26 November 1991, Observer and Guardian v. the United Kingdom, CE:ECHR:1991:1126JUD001358588, § 60; ECtHR, 14 March 2002, Gaweda v. Poland, CE:ECHR:2002:0314JUD002622995, § 35; and ECtHR, Ahmet Yildirim v. Turkey, § 64 and the references cited.


109      I would point out that, for that reason, freedom of expression is an essential foundation of any democratic society. See, inter alia, judgment of 23 April 2020, Associazione Avvocatura per i diritti LGBTI (C‑507/18, EU:C:2020:289, paragraph 48 and the case-law cited); ECtHR, 7 December 1976, Handyside, CE:ECHR:1976:1207JUD000549372, § 49; and ECtHR, 26 November 1991, Observer and Guardian v. the United Kingdom, CE:ECHR:1991:1126JUD001358588, § 59.


110      See, first, judgment of 8 September 2016, GS Media (C‑160/15, EU:C:2016:644, paragraph 45). See, secondly, ECtHR, 10 March 2009, Times Newspapers Ltd v. the United Kingdom, CE:ECHR:2009:0310JUD000300203, § 27; ECtHR, Ahmet Yildirim v. Turkey, §§ 48 and 54; and ECtHR, 1 December 2015, Cengiz and Others v. Turkey, CE:ECHR:2015:1201JUD004822610, §§ 49 and 52.


111      See, inter alia, ECtHR, 1 December 2015, Cengiz and Others v. Turkey, CE:ECHR:2015:1201JUD004822610, §§ 51 and 52, and ECtHR, Kablis v. Russia, § 81.


112      See, inter alia, Recommendation CM/Rec(2018)2 of the Committee of Ministers to Member States on the roles and responsibilities of Internet intermediaries, adopted by the Committee of Ministers on 7 March 2018 at the 1309th meeting of the Ministers’ Deputies, and Balkin, J.M., ‘Old-School/New-School Speech Regulation’, Harvard Law Review, vol. 127, No 8, 2014, pp. 2296-2342, in particular p. 2304.


113      See Conseil constitutionnel (Constitutional Council, France), decision No 2020-801 DC of 18 June 2020, ‘Loi visant à lutter contre les contenus haineux sur internet’, § 4.


114      That is to say, by means of a liability exemption mechanism, as in the present case (see point 62 of this Opinion).


115      See Opinion of Advocate General Poiares Maduro in Google France and Google (C‑236/08 to C‑238/08, EU:C:2009:569, points 142 and 143). See also Recommendation CM/Rec(2007)16 of the Committee of Ministers to member states on measures to promote the public service value of the Internet, adopted by the Committee of Ministers on 7 November 2007 at the 1010th meeting of the Ministers’ Deputies, Annex, Part III(a), and Recommendation CM/Rec(2018)2 on the roles and responsibilities of Internet intermediaries, Appendix, point 1.3.5.


116      See, to the same effect, Smith, G., ‘Time to speak up for Article 15’, Cyberleagle Blog, 21 May 2017 (available at https://www.cyberleagle.com/2017/05/time-to-speak-up-for-article-15.html).


117      See the judgments in Scarlet Extended (paragraphs 40 and 52) and SABAM (paragraphs 38 and 50).


118      The Court has repeatedly held that the national rules governing the operation of such injunctions, and likewise their application by the national courts, must observe the prohibition of ‘general monitoring obligations’ provided for in Article 15 of Directive 2000/31. See, inter alia, the judgments in Scarlet Extended (paragraphs 32 to 35) and SABAM (paragraphs 30 to 33).


119      See recital 45 and Article 14(3) of Directive 2000/31.


120      See recital 47 of Directive 2000/31 (‘Member States are prevented from imposing a monitoring obligation …; this does not concern monitoring obligations in a specific case …’).


121      See ECtHR, Ahmet Yildirim v. Turkey, § 64; ECtHR, 1 December 2015, Cengiz and Others v. Turkey, CE:ECHR:2015:1201JUD004822610, § 62; and ECtHR, Kablis v. Russia, § 97.


122      See point 138 of this Opinion.


123      See, to the same effect, Grisse, K., op. cit., p. 897; Spindler, G., ‘The liability system of Art. 17 DSMD and national implementation – Contravening prohibition of general monitoring duties?’, JIPITEC, vol. 10, 2020, pp. 350 and 353-359; and Cabay, J., ‘Lecture prospective de l’article 17 de la directive sur le droit d’auteur dans le marché unique numérique: Vers une obligation de filtrage limitée par la CJUE, garante du “juste équilibre”’, Propriété intellectuelle à l’ère du big data et de la blockchain, Schulthess, De Werra, J., and Benhamou, Y. (eds), Geneva, 2021, pp. 225-237.


124      I would point out that, contrary to what the Parliament suggests, that case-law is relevant in the present case even though it concerns injunctions issued by national courts and not an act adopted by the EU legislature. The concept of ‘general monitoring obligation’ must be interpreted in the same way, irrespective of the origin of such an obligation (see, to that effect, my Opinion in Joined Cases YouTube and Cyando (footnote 104)).


125      Judgment of 12 July 2011 (C‑324/09, EU:C:2011:474, paragraph 139).


126      The judgment in Scarlet Extended (paragraphs 29 and 38 to 40). According to my reading, the Court attached decisive weight to the fact that ‘preventive monitoring of this kind would … require active observation of all electronic communications conducted on the network of the [Internet service provider] concerned and, consequently, would encompass all information to be transmitted and all customers using that network’ (paragraph 39) (emphasis added).


127      See paragraphs 35 to 38 of that judgment.


128      Judgment of 15 September 2016 (C‑484/14, EU:C:2016:689, paragraphs 25 and 88).


129      It may seriously be asked whether the contested provisions entail a ‘general monitoring obligation’, as that concept is understood in those judgments. In particular, the differences between the obligations resulting from those provisions and the filtering system at issue in the case which gave rise to the judgment in Scarlet Extended are far from obvious. In that case, the rightholders requested that the intermediary be ordered to ‘identify … the files containing works in respect of which [those rightholders] claim to hold rights’ (paragraph 38 of that judgment), specifically using the ‘Audible Magic’ tool (see Opinion of Advocate General Cruz Villalón in Scarlet Extended (C‑70/10, EU:C:2011:255, points 21 and 24)). This is a recognition tool that uses ‘digital fingerprinting’ and that works on the basis of reference files provided by those rightholders. Moreover, it is mentioned in the Impact Assessment (Part 3/3, p. 55).
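By way of illustration only, the matching logic of such a ‘fingerprinting’ tool may be sketched in the following simplified form. Every name, type and threshold below is a hypothetical assumption; actual tools such as Audible Magic or Content ID rely on proprietary acoustic and visual features which this sketch does not purport to describe:

    # Illustrative sketch only: simplified fingerprint matching against
    # reference files supplied by rightholders. All names and the 0.8
    # threshold are hypothetical assumptions, not a description of any
    # actual tool.
    from dataclasses import dataclass

    @dataclass
    class ReferenceFile:
        rightholder: str
        work_title: str
        fingerprint: frozenset  # toy signature: a set of feature hashes

    def fingerprint(features: list) -> frozenset:
        """Reduce extracted audio/video features to a compact signature."""
        return frozenset(hash(f) % 2**32 for f in features)

    def matches(upload_fp: frozenset, reference: ReferenceFile,
                threshold: float = 0.8) -> bool:
        """Flag an upload when enough of the reference signature recurs in it."""
        if not reference.fingerprint:
            return False
        overlap = len(upload_fp & reference.fingerprint) / len(reference.fingerprint)
        return overlap >= threshold

    ref = ReferenceFile('Label X', 'Song Y', fingerprint([1, 2, 3, 4, 5]))
    print(matches(fingerprint([1, 2, 3, 4, 99]), ref))  # True: 4/5 of the signature recurs

The point salient to the legal analysis is that such a tool compares uploads against a finite list of reference files; it does not, of itself, assess the lawfulness of the matched content in context.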


130      I would point out that that judgment was delivered after the present action was brought.


131      See Opinion of Advocate General Szpunar in Glawischnig-Piesczek (C‑18/18, EU:C:2019:458, points 25, 26 and 59).


132      The judgment in Glawischnig-Piesczek (paragraph 35).


133      The judgment in Glawischnig-Piesczek (paragraph 46).


134      I would point out that, although the judgment in Glawischnig-Piesczek concerns defamation law, the lessons learned from it go beyond that sphere. The concept of ‘general monitoring obligation’ applies horizontally, irrespective of the type of infringement for which the intermediary must search. See Opinion of Advocate General Szpunar in Glawischnig-Piesczek (C‑18/18, EU:C:2019:458, point 43).


135      See points 194 to 199 of this Opinion.


136      See points 200 to 203 of this Opinion.


137      I note, in addition, that the EU legislature reaffirmed the prohibition of ‘general monitoring obligations’ in Article 17(8) of Directive 2019/790 (see points 194 to 203 of this Opinion).


138      See, inter alia, communications from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 6 May 2015, ‘A Digital Single Market Strategy for Europe’ (COM(2015) 192 final), pp. 4, 8 and 12-14; of 25 May 2016, ‘Online Platforms and the Digital Single Market – Opportunities and Challenges for Europe’ (COM(2016) 288 final), pp. 8-11; and of 28 September 2017, ‘Tackling Illegal Content Online. Towards an enhanced responsibility of online platforms’ (COM(2017) 555 final). See also Commission Recommendation (EU) 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online (OJ 2018 L 63, p. 50), recitals 1 to 5, 24 and 36, and paragraphs 18, 36 and 37.


139      See, inter alia, Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (OJ 2021 L 172, p. 79), in particular Article 5, and Proposal for a Regulation of the European Parliament and of the Council of 15 December 2020 on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC (COM(2020) 825 final), in particular Article 27. I note that the principle that online intermediaries cannot be subject to a general monitoring obligation is reaffirmed in Article 7 of the latter proposal.


140      See point 84 of this Opinion.


141      See points 140 to 153 of this Opinion. See also, to the same effect, Recommendation 2018/334 (recitals 24, 27, 36 and paragraphs 19 to 21).


142      See point 28 of this Opinion.


143      See also Article 27(2) of the Universal Declaration of Human Rights and Article 15 of the International Covenant on Economic, Social and Cultural Rights.


144      See, inter alia, ECtHR, 11 January 2007, Anheuser-Busch Inc. v. Portugal, CE:ECHR:2007:0111JUD007304901, § 72; ECtHR, Ashby Donald and Others v. France, § 40; and ECtHR, Neij and Others v. Sweden, p. 11.


145      See, by analogy, judgments of 29 January 2008, Promusicae (C‑275/06, EU:C:2008:54, paragraph 53), and of 15 September 2016, Mc Fadden (C‑484/14, EU:C:2016:689, paragraph 81); ECtHR, Ashby Donald and Others v. France, § 36; and ECtHR, 19 February 2013, Neij and Others v. Sweden, pp. 10 and 11. As the Parliament and the Council submit, the limitation at issue meets at the same time an objective of general interest which is not only ‘recognised’ but also ‘pursued’ by the European Union, namely the promotion of cultural diversity (see recital 2 of Directive 2019/790). The protection of copyright is intended, inter alia, to support the creation, production and dissemination of information, knowledge and culture (see, inter alia, recitals 9 to 11 and 14 of Directive 2001/29). The European Union has set itself the objective, in accordance with Article 3(3) TEU, of ensuring that ‘Europe’s cultural heritage is safeguarded and enhanced’.


146      See, inter alia, judgments of 22 January 2013, Sky Österreich (C‑283/11, EU:C:2013:28, paragraph 50); of 15 February 2016, N. (C‑601/15 PPU, EU:C:2016:84, paragraph 54); and of 17 December 2020, Centraal Israëlitisch Consistorie van België and Others (C‑336/19, EU:C:2020:1031, paragraph 64).


147      See, inter alia, judgment of 17 December 2020, Centraal Israëlitisch Consistorie van België and Others (C‑336/19, EU:C:2020:1031, paragraph 66 and the case-law cited).


148      Since, I recall, the contested provisions apply only in respect of the protected works and subject matter identified by rightholders for which the sharing service providers did not obtain authorisation from those rightholders.


149      See point 53 of this Opinion. The fact that any filtering may be circumvented by malicious users (see, however, with regard to the robustness of ‘fingerprint’ filtering tools, footnote 65 of this Opinion) and is necessarily accompanied by some margin of error may reduce the ability of those obligations to attain the objective pursued, but does not make them inappropriate (see judgment of 27 March 2014, UPC Telekabel Wien (C‑314/12, EU:C:2014:192, paragraph 63)).


150      See, inter alia, judgments of 22 January 2013, Sky Österreich (C‑283/11, EU:C:2013:28, paragraphs 54 and 55); of 17 October 2013, Schwarz (C‑291/12, EU:C:2013:670, paragraphs 46, 52 and 53); and of 4 May 2016, Philip Morris Brands and Others (C‑547/14, EU:C:2016:325, paragraph 160).


151      See footnote 172 of this Opinion.


152      See, inter alia, judgment of 17 December 2020, Centraal Israëlitisch Consistorie van België and Others (C‑336/19, EU:C:2020:1031, paragraph 64 and the case-law cited).


153      See, by analogy, judgment of 9 November 2010, Volker und Markus Schecke and Eifert (C‑92/09 and C‑93/09, EU:C:2010:662, paragraph 85).


154      See, inter alia, judgment of 17 December 2020, Centraal Israëlitisch Consistorie van België and Others (C‑336/19, EU:C:2020:1031, paragraph 65 and the case-law cited).


155      See, inter alia, judgment of 9 March 2021, VG Bild-Kunst (C‑392/19, EU:C:2021:181, paragraph 54 and the case-law cited).


156      See recitals 1 to 7, 40, 41 and 45 to 49 of Directive 2000/31, and my Opinion in Joined Cases YouTube and Cyando (point 245).


157      See my Opinion in Joined Cases YouTube and Cyando (point 246).


158      See, inter alia, judgment of 17 December 2015, Neptune Distribution (C‑157/14, EU:C:2015:823, paragraph 76 and the case-law cited).


159      See Recommendation CM/Rec(2018)2 on the roles and responsibilities of Internet intermediaries, preamble, point 9.


160      See, inter alia, ECtHR, Ashby Donald and Others v. France, § 40 and the case-law cited. See also ECtHR, Neij and Others v. Sweden, p. 11, and ECtHR, 11 March 2014, Akdeniz v. Turkey, CE:ECHR:2014:0311DEC002087710, § 28. See, to the same effect, judgment of 3 February 2021, Fussl Modestraße Mayr (C‑555/19, EU:C:2021:89, paragraphs 91 to 93).


161      See, by analogy, ECtHR, Neij and Others v. Sweden, p. 11.


162      Some legal writers maintain that there is no empirical evidence that the ‘Value Gap’ exists. See, inter alia, Frosio, G., ‘From horizontal to vertical: an intermediary liability earthquake in Europe’, Journal of Intellectual Property Law & Practice, 2016, vol. 12, No 7, pp. 565-575, in particular pp. 567-569. See, for an opposing view, Bensamoun, A., ‘Le value gap ou le partage de la valeur dans la proposition de directive sur le droit d’auteur dans le marché unique numérique’, Entertainment, Bruylant, No 2018-4, pp. 278-287.


163      I would point out that it is, of course, for the EU legislature to decide on the level of protection it wishes to afford to copyright and related rights in the European Union.


164      See Impact Assessment, Part 1/3, pp. 137-144, 175. See, to the same effect, judgments of 13 May 2014, Google Spain and Google (C‑131/12, EU:C:2014:317, paragraph 80), and in Glawischnig-Piesczek (paragraphs 36 and 37).


165      See point 14 of this Opinion.


166      Nevertheless, the fact that those providers do not pre-select the information uploaded to their services (see point 32 of this Opinion) is, in my view, a decisive difference which prevents them from being treated in the same way as editors.


167      See, to that effect, Recommendation CM/Rec(2018)2 on the roles and responsibilities of Internet intermediaries, preamble, points 4 and 5, and Appendix, point 1.3.9. For ‘traditional’ host providers, such a liability regime would not be proportionate in my opinion. The same applies, a fortiori, to other types of intermediaries, such as the providers of ‘mere conduit’ services (see Article 12 of Directive 2000/31).


168      See ECtHR, Delfi AS v. Estonia, § 133.


169      See ECtHR, Delfi AS v. Estonia, §§ 151, 155, 158 and 159.


170      See ECtHR, Delfi AS v. Estonia, §§ 113, 115, 117, 128 and 145.


171      I would point out that, in that case, the operator of the news portal had to monitor that portal to search for ‘clearly unlawful’ information. As I shall clarify in points 194 to 218 of this Opinion, that is also the case here.


172      See, inter alia, Balkin, J.M., op. cit., p. 2309, and ECtHR, 4 June 2020, Jezior v. Poland, CE:ECHR:2020:0604JUD003195511, § 60 (‘The attribution [to an intermediary provider] of liability for comments from third parties may … have a chilling effect on the freedom of expression on the Internet’). See also ECtHR, 2 February 2016, Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, CE:ECHR:2016:0202JUD002294713, § 86, and ECtHR, 7 February 2017, Pihl v. Sweden, CE:ECHR:2017:0207DEC007474214, § 35. By contrast, in the context of the exemption from liability provided for in Article 14 of Directive 2000/31, that risk is reduced since intermediary providers must remove only information the unlawfulness of which has been established or is ‘apparent’.


173      Particularly since, on the one hand, in accordance with Article 17(4) of Directive 2019/790, sharing service providers bear the burden of proof to demonstrate that they have made ‘best efforts’ to prevent infringing content being uploaded and, on the other, they bear a considerable risk of liability, having regard to the ‘large amount’ of content to which those services provide access.


174      See, inter alia, judgments of 3 September 2014, Deckmyn and Vrijheidsfonds (C‑201/13, EU:C:2014:2132, paragraph 26); of 7 August 2018, Renckhoff (C‑161/17, EU:C:2018:634, paragraph 43); and of 29 July 2019, Spiegel Online (C‑516/17, EU:C:2019:625, paragraphs 38, 42, 43 and 54). See also recital 6 of Directive 2019/790.


175      See Article 5(3)(d) of Directive 2001/29 and judgments of 29 July 2019, Funke Medien NRW (C‑469/17, EU:C:2019:623, paragraph 71), and of 29 July 2019, Spiegel Online (C‑516/17, EU:C:2019:625, paragraph 57).


176      See Article 5(3)(k) of Directive 2001/29 and judgment of 3 September 2014, Deckmyn and Vrijheidsfonds (C‑201/13, EU:C:2014:2132, paragraph 25).


177      I am thinking here of memes, film reviews, misappropriation and a whole raft of other types of content for entertainment or education which are abundant on those services and which, moreover, may in themselves constitute works which, often, are ‘transformative’.


178      For example, as regards the exception provided for in Article 5(3)(d) of Directive 2001/29, it is necessary to ascertain whether the user has ‘[established] a direct and close link between the quoted work and his own reflections, thereby allowing for an intellectual comparison to be made with the work of another’ (judgment of 29 July 2019, Spiegel Online (C‑516/17, EU:C:2019:625, paragraph 79)). The exception in point (k) of that paragraph involves ascertaining whether the content, first, ‘[evokes] an existing work while being noticeably different from it’ and, secondly, ‘[constitutes] an expression of humour or mockery’ (judgment of 3 September 2014, Deckmyn and Vrijheidsfonds (C‑201/13, EU:C:2014:2132, paragraph 20)).


179      The analysis is further complicated by the fact that the applicable exceptions and limitations and their scope are likely to vary from one national law to another. Although the list in Article 5 of Directive 2001/29 is exhaustive, it gives each Member State the option of transposing the exceptions or limitations it wishes into its domestic law. Moreover, those States, depending on the case, have discretion in their implementation. See my Opinion in Joined Cases YouTube and Cyando (point 188).


180      See, by analogy, my Opinion in Joined Cases YouTube and Cyando (point 189).


181      See, to that effect, paragraph 52 of the first judgment and paragraph 50 of the second judgment. See also my Opinion in Joined Cases YouTube and Cyando (point 243).


182      See point 58 of this Opinion.


183      See, inter alia, Commission document, ‘Targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the Directive on Copyright in the Digital Single Market’, p. 15.


184      See, inter alia, Grisse, K., op. cit., p. 887; Dusollier, S., op. cit., p. 1018; and Lambrecht, M., op. cit., p. 73. See also Jacques, S., Garstka, K., Hviid, M., and Street, J., ‘The impact on cultural diversity of Automated Anti-Piracy Systems as copyright enforcement mechanisms: an empirical study of YouTube’s Content ID digital fingerprinting technology’, 2017.


185      The related right of phonogram producers also entails specific risks in that regard. For example, such a producer may identify, for the purposes of blocking, the recording of a performance of a Chopin nocturne (a work which is also in the public domain) over which it holds rights. Since some tools, such as Content ID, are able to recognise not only content which reproduces excerpts of that phonogram, but also content which reproduces the same melody (see footnote 65 of this Opinion), they are likely automatically to block, for example, videos of users filming themselves as they themselves perform the nocturne in question.


186      In particular, for that reason, Content ID has, it seems, already mistaken innocent content for protected works. See, for various examples, Garstka, K., ‘Guiding the Blind Bloodhounds: How to Mitigate the Risks art. 17 of Directive 2019/790 Poses to the Freedom of Expression’, Intellectual Property and Human Rights,  Wolters Kluwer Law & Business, 4th ed., Torremans, P. (ed.), 2020, pp. 327-352, in particular pp. 332-334.


187      See, by analogy, judgments of 22 January 2013, Sky Österreich (C‑283/11, EU:C:2013:28, paragraph 61), and of 8 April 2014, Digital Rights Ireland and Others (C‑293/12 and C‑594/12, EU:C:2014:238, paragraph 65).


188      See points 84 and 115 of this Opinion.


189      See, to that effect, ECtHR, Ahmet Yildirim v. Turkey, § 64; ECtHR, 8 October 2013, Cumhuriyet Vakfi and Others v. Turkey, CE:ECHR:2013:1008JUD002825507, § 61; ECtHR, 1 December 2015, Cengiz and Others v. Turkey, CE:ECHR:2015:1201JUD004822610, § 62; and ECtHR, Kablis v. Russia, § 97. See also, by analogy, judgments of 8 April 2014, Digital Rights Ireland and Others (C‑293/12 and C‑594/12, EU:C:2014:238, paragraphs 54, 55 and 65); of 21 December 2016, Tele2 Sverige and Watson and Others (C‑203/15 and C‑698/15, EU:C:2016:970, paragraph 117); and of 2 March 2021, Prokuratuur (Conditions of access to data relating to electronic communications) (C‑746/18, EU:C:2021:152, paragraph 48).


190      See point 84 of this Opinion.


191      See judgment of 8 April 2014, Digital Rights Ireland and Others (C‑293/12 and C‑594/12, EU:C:2014:238, paragraphs 60 to 67), and Opinion of Advocate General Cruz Villalón in Digital Rights Ireland and Others (C‑293/12 and C‑594/12, EU:C:2013:845, points 117 and 120).


192      See points 210 to 213 of this Opinion.


193      In that regard, I note that the ECtHR has repeatedly held that the ‘dangers’ which the ‘prior restraints’ resulting from blocking measures pose to freedom of expression call for ‘the most careful scrutiny’. See, inter alia, ECtHR, Ahmet Yildirim v. Turkey, § 47.


194      These terms must be understood as meaning the filtering and blocking measures which those providers must take pursuant to the contested provisions. See the first paragraph of recital 66 of Directive 2019/790.


195      Also covered are, inter alia, the use of works and other protected subject matter covered by licensing agreements concluded by users (see the first paragraph of recital 66 of Directive 2019/790) and the use of works in the public domain.


196      The exceptions and limitations in question are also provided for, as I stated in footnotes 175 and 176 of this Opinion, in Article 5(3)(d) and (k) of Directive 2001/29.


197      They may do so inter alia before national courts (see the second subparagraph of Article 17(9) of Directive 2019/790: ‘… users [may] have access to a court or another relevant judicial authority to assert the use of an exception or limitation to copyright and related rights’).


198      See judgments of 29 July 2019, Funke Medien NRW (C‑469/17, EU:C:2019:623, paragraph 70), and of 29 July 2019, Spiegel Online (C‑516/17, EU:C:2019:625, paragraph 54).


199      Although paragraph 7 is ambiguous in that regard, this interpretation is clear from recital 70 of Directive 2019/790 (‘The steps taken by … sharing service providers in cooperation with rightholders should be without prejudice to the application of exceptions or limitations to copyright …’) and paragraph 9 of that article, in particular the fourth subparagraph thereof (‘… they can use works and other subject matter under exceptions or limitations … provided for in Union law’) (emphasis added).


200      See footnote 179 of this Opinion.


201      At least as far as the use of protected subject matter on sharing services is concerned.


202      To that extent, Article 17(7) of Directive 2019/790 limits the freedom of sharing service providers to conduct a business in order to ensure freedom of expression for users. Nevertheless, those providers remain free to remove content which falls within the scope of exceptions or limitations on grounds other than copyright issues, for example if it is insulting or contravenes their nudity policy. That provision does not therefore impose on those providers, as such, an obligation to disseminate (‘must carry’) such content.


203      See, to the same effect, Leistner, M., op. cit., pp. 165 and 166. Any provision to the contrary in those terms and conditions or in such contractual agreements would therefore, in my eyes, be incompatible with Article 17(7) of Directive 2019/790.


204      See the fourth subparagraph of Article 17(9) of Directive 2019/790. The information given to users about their right to use protected subject matter under exceptions or limitations, and about the limits of that right, is essential in order to support the exercise by those users of their freedom of expression and creation, whilst reducing the risk of accidental infringements of copyright.


205      See Impact Assessment, Part 1/3, pp. 140 and 141, and footnote 422.


206      See the first paragraph of recital 70 of Directive 2019/790.


207      ‘må ikke føre’, in the Danish version of Directive 2019/790.


208      The French Government also refers to possible voluntary measures taken by rightholders (see footnote 252 of this Opinion).


209      For the former, as I mentioned in point 166 of this Opinion, the fact that the rights of users of sharing services are taken into account only ex post, in the event of a complaint by those users, is said to demonstrate that the limitation on the exercise of the right to freedom of expression is disproportionate. By contrast, for the latter, that interpretation is said to maintain a ‘fair balance’ between the rights and interests at stake, since it ensures that rightholders are able to control a priori the use of their works and other subject matter while causing only a ‘temporary’ inconvenience to those users.


210      This is all the more obvious in the English-language version of Directive 2019/790 (‘…shall not result in the prevention of the availability…’) (emphasis added). Moreover, in my view, paragraph 7 gives concrete expression to the right to freedom of expression, and therefore its wording lends itself to a broad interpretation. See, by analogy, judgment of 14 February 2019, Buivids (C‑345/17, EU:C:2019:122, paragraph 51 and the case-law cited).


211      ‘… Steps taken by such service providers should … not affect users who are using the … sharing services in order to lawfully upload … information on such services’.


212      See, first, ‘Report on the proposal for a directive of the European Parliament and of the Council on copyright in the Digital Single Market’, 29 June 2019, Parliament, Committee on Legal Affairs, document A8-0245/2018 (available at https://www.europarl.europa.eu/doceo/document/A-8-2018-0245_EN.html?redirect), Amendment 77: ‘… To prevent misuses or limitations in the exercise of exceptions and limitations to copyright, Member States shall ensure that the service providers referred to in paragraph 1 put in place effective and expeditious complaints and redress mechanisms …’. See, secondly, Council, document 12254/16 + ADD1 + ADD2 + ADD3 + ADD4, 25 May 2018 (available at https://www.consilium.europa.eu/media/35373/st09134-en18.pdf): ‘Member States shall ensure that the measures referred to in paragraph 4 are implemented by the online content sharing service provider without prejudice to the possibility for their users to benefit from exceptions or limitations to copyright. For that purpose, the service provider shall put in place a complaint and redress mechanism …’ (emphasis added).


213      See, by analogy, the Opinion of Advocate General Wathelet in Karen Millen Fashions (C‑345/13, EU:C:2014:206, point 82), and my Opinion in Joined Cases Acacia and D’Amato (C‑397/16 and C‑435/16, EU:C:2017:730, points 53 and 63 to 65).


214      In that regard, I recall that, in accordance with the Court’s settled case-law, ‘the wording of secondary EU legislation must be interpreted, as far as possible, in such a way as not to affect its validity and in conformity with primary law as a whole and, in particular, with the provisions of the Charter’ (see, inter alia, judgment of 2 February 2021, Consob (C‑481/19, EU:C:2021:84, paragraph 50 and the case-law cited)).


215      Directive 2000/31 does not contain any obligation for intermediary service providers to provide for a ‘counter-notification’ procedure enabling users to challenge the ‘over-removal’ of their information.


216      Moreover, in the English-language version of Directive 2019/790, the same adjective is used in Article 17(4)(c) and the first subparagraph of Article 17(9) of that directive (‘acted expeditiously, upon receiving a … notice …’ and ‘… an effective and expeditious complaint and redress mechanism …’) (emphasis added).


217      Judgment of 27 March 2014 (C‑314/12, EU:C:2014:192, paragraph 57).


218      See, first, judgments of 27 March 2014, UPC Telekabel Wien (C‑314/12, EU:C:2014:192, paragraph 56), and of 15 September 2016, Mc Fadden (C‑484/14, EU:C:2016:689, paragraph 93). See, secondly, ECtHR, 23 June 2020, Vladimir Kharitonov v. Russia, CE:ECHR:2020:0623JUD001079514, § 46: ‘… When exceptional circumstances justify the blocking of illegal content, a State agency making the blocking order must ensure that the measure strictly targets the illegal content and has no arbitrary or excessive effects … Any indiscriminate blocking measure which interferes with lawful content or websites as a collateral effect of a measure aimed at illegal content or websites amounts to arbitrary interference with [freedom of expression]. …’ See also Recommendation CM/Rec(2018)2 on the roles and responsibilities of Internet intermediaries, Appendix, point 2.3.2.


219      Judgment of 12 July 2011 (C‑324/09, EU:C:2011:474, paragraph 131).


220      Judgment of 27 March 2014 (C‑314/12, EU:C:2014:192, paragraph 63).


221      See, inter alia, ECtHR, Ahmet Yildirim v. Turkey, § 66; ECtHR, 23 June 2020, Vladimir Kharitonov, CE:ECHR:2020:0623JUD001079514, § 45; and ECtHR, Kablis v. Russia, § 94.


222      See, to that effect, the judgment in Scarlet Extended (paragraph 52: ‘[the filtering injunction] could potentially undermine freedom of information since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications. Indeed, it is not contested that the reply to the question whether a transmission is lawful also depends on the application of statutory exceptions to copyright which vary from one Member State to another. Moreover, in some Member States certain works fall within the public domain or can be posted online free of charge by the authors concerned’) (emphasis added). See, for the same reasoning, the judgment in SABAM (paragraph 50).


223      That tendency on the part of users not to assert their rights has been documented in Europe and the United States. See, inter alia, Urban, J.‑M., Karaganis, J., and Schofield, B., ‘Notice and Takedown in Everyday Practice’, UC Berkeley Public Law Research Paper no 2755628, 2017; Fiala, L., and Husovec, M., ‘Using Experimental Evidence to Design Optimal Notice and Takedown Process’, TILEC Discussion Paper No 2018-028,  2018, p. 3.


224      See, to the same effect, Spindler, G., op. cit., p. 355. This would also risk depriving the public of their right of access to legitimate content which has been unfairly blocked.


225      For example, a reaction video to a trailer for a video game or a film is searched for around the time when that trailer is released. Similarly, a parody video linked to a recent political scandal is, as a general rule, watched immediately after the scandal. See, in that regard, Garstka, K., op. cit., p. 339.


226      Moreover, the ECtHR has repeatedly held that ‘news is a perishable commodity and to delay its publication, even for a short period, may well deprive it of all its value and interest’ (see, inter alia, ECtHR, 26 November 1991, Observer and Guardian v. the United Kingdom, CE:ECHR:1991:1126JUD001358588, § 60; ECtHR, Ahmet Yildirim v. Turkey, § 47; and ECtHR, Kablis v. Russia, § 91). The Spanish and French Governments assert in reply that the speed of the exchange of information online would justify, on the contrary, preventively blocking all content which reproduces the works and other protected subject matter identified by rightholders, in order to avoid any risk of unlawful content being uploaded and thus giving rise to ‘irreparable’ harm to those rightholders. I will return to this argument in points 215 and 216 of this Opinion.


227      See, inter alia, judgments of 29 July 2019, Funke Medien NRW (C‑469/17, EU:C:2019:623, paragraphs 51 and 57), and of 29 July 2019, Spiegel Online (C‑516/17, EU:C:2019:625, paragraphs 36, 55 and 72).


228      See my Opinion in Joined Cases YouTube and Cyando (point 243).


229      I am well aware of the fact that recital 66 of Directive 2019/790 states, in its third paragraph, in the French version of that directive, that sharing service providers may be exempted from liability, where protected subject matter is unlawfully uploaded to their service, only by demonstrating that they have ‘tout mis en œuvre pour éviter cette situation’ (which could be translated as ‘taken every measure to avoid that situation’). This, in my opinion, is bad drafting and cannot call into question what I have just explained.


230      See footnote 249 of this Opinion.


231      See, by contrast, ECtHR, Ahmet Yildirim v. Turkey, § 66.


232      I have explained the fundamental nature of the prohibition of ‘general monitoring obligations’ for the freedom of online communication in the section on the ‘essence’ of freedom of expression (point 98 et seq. of this Opinion).


233      See point 112 of this Opinion.


234      The judgment in Glawischnig-Piesczek (paragraph 35).


235      The judgment in Glawischnig-Piesczek (paragraph 46).


236      See recital 59 of Directive 2001/29, in accordance with which ‘in many cases such intermediaries are best placed to bring … infringing activities [committed through their services] to an end’.


237      See points 142, 143 and 146 of this Opinion.


238      See my Opinion in Joined Cases YouTube and Cyando (points 187 and 188).


239      See, to that effect, Recommendation 2018/334 (recital 25: ‘It can, in particular, be appropriate to take such proactive measures where the illegal character of the content has already been established or where the type of content is such that contextualisation is not essential’), and Recommendation CM/Rec(2018)2 on the roles and responsibilities of Internet intermediaries (Appendix, point 1.3.2: ‘State authorities should obtain an order by a judicial authority … when demanding intermediaries to restrict access to content. This does not apply in cases concerning content that is illegal irrespective of context, such as content involving child sexual abuse material, or in cases where expedited measures are required in accordance with the conditions prescribed in Article 10 of the Convention’) (emphasis added).


240      The ECtHR emphasised that at issue were ‘clearly unlawful’ comments, which, ‘viewed on their face, [were] tantamount to an incitement to hatred or to violence’, and therefore ‘the establishment of their unlawful nature did not require any linguistic or legal analysis’ (see in particular §§ 110, 114, 115, 117, 140 and 155 of that judgment).


241      See ECtHR, 7 February 2017, Pihl v. Sweden, CE:ECHR:2017:0207DEC007474214, § 25; ECtHR, 19 March 2019, Høiness v. Norway, CE:ECHR:2019:0319JUD004362414, § 68; ECtHR, 2 February 2016, Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, CE:ECHR:2016:0202JUD002294713, §§ 63 and 64; and ECtHR, 4 June 2020, Jezior v. Poland, CE:ECHR:2020:0604JUD003195511, §§ 54 and 58.


242      See point 221 of that Opinion.


243      See, to the same effect, Quintais, J., Frosio, G., van Gompel, S., et al., ‘Safeguarding User Freedoms in Implementing Article 17 of the Copyright in the Digital Single Market Directive: Recommendations From European Academics’, JIPITEC, 2019, vol. 10, No 3; Lambrecht, M., op. cit., pp. 88-90; Cabay, J., op. cit., pp. 237-273; and Dusollier, S., op. cit., p. 1020. That interpretation follows, moreover, from a natural reading of the wording of Article 17(4)(b) and (c) of Directive 2019/790, which requires sharing service providers to ensure the unavailability of ‘works’ and other ‘subject matter’, and not of any infringement involving in any way the subject matter in question (see, to the same effect, Lambrecht, M., op. cit., p. 89).


244      In other words, I am referring to what are sometimes described as ‘slavish’ and ‘quasi-slavish’ copies. In that regard, I note that some of the content on sharing services consists precisely of slavish or quasi-slavish copies of works or other protected subject matter, such as cinematographic works or phonograms.


245      The judgment in Glawischnig-Piesczek (paragraph 46). I would point out that, in my view, in that judgment, the Court did not intend to exclude the need for certain checks by natural persons. I shall return to this in point 211 of this Opinion.


246      In my view, the ‘relevant and necessary’ information provided by rightholders should include evidence demonstrating that they hold rights over the works or other protected subject matter which they are seeking to block, in order to limit the risk of such ‘over-complaining’ (see point 148 of this Opinion).


247      That interpretation cannot be called into question by the argument put forward by the applicant and the French Government that the word ‘manifest’ does not appear in the text of Directive 2019/790. It is not a substantive concept, but merely indicates the extent of the monitoring which the sharing service providers must undertake. The extent of that monitoring can be inferred from Article 17(8) of that directive and the need to implement the contested provisions in a manner that complies with paragraph 7 of that article. Moreover, contrary to the French Government’s submissions, the difference between information the unlawfulness of which is apparent at first glance and that which requires further examination is not a new idea. In that regard, I will simply recall that, in the context of Article 14 of Directive 2000/31, an intermediary provider must remove information where it has ‘actual knowledge’ that it is unlawful or where its unlawfulness is ‘apparent’ in the light of the evidence available to it. The same applies in the present case. I would also refer to the reasoning adopted by the ECtHR in its judgment in Delfi AS v. Estonia. Finally, the idea of manifest infringement, understood as ‘slavish’ or ‘quasi-slavish’ copying, is already known.


248      See point 52 of this Opinion. See, to the same effect, Geiger, C., Jütte, B.‑J., ‘Platform liability under Article 17 of the Copyright in the Digital Single Market Directive, Automated Filtering and Fundamental Rights: An Impossible Match’, SSRN Papers (available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3776267), p. 44.


249      In general, it is for the administrative and judicial authorities of those States to supervise how the contested provisions are applied by sharing service providers and to ensure that users are able effectively to avail themselves of their right to legitimate uses, in accordance with Article 17(7) of Directive 2019/790. On that point, the Republic of Poland has submitted that that directive does not contain any provision on the liability of sharing service providers vis-à-vis users in the event of infringement of paragraph 7. In my view, such liability must be found in the Member States’ national law, in accordance with the principle of procedural autonomy. Effective, dissuasive and proportionate sanctions should be provided for therein. Moreover, since observance of users’ rights is incorporated in the ‘best efforts’ which must be made by the sharing service providers, it follows that, if such a provider did not observe the rights in question, it should lose the benefit of the exemption mechanism provided for in paragraph 4 of that article.


250      ‘Fingerprinting’ recognition tools are capable of drawing distinctions on the basis of the quantity of protected content that is reused in uploaded content, in particular with regard to audio and video content. Admittedly, as the French Government submits, it is not sufficient, for example, for a musical extract to be shorter than a certain duration in order for the exception for quotations to be applicable, since that depends on the user’s intention (see footnote 178 of this Opinion). However, it is simply a matter of providing a margin in the settings of the filtering tool within which the application of that exception is not certain, but merely reasonably conceivable.
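Purely by way of illustration, that ‘margin’ can be expressed as a simple decision rule; the figures and category names below are hypothetical assumptions intended only to depict the idea, not parameters drawn from Directive 2019/790 or from any existing tool:

    # Illustrative sketch only: a hypothetical margin in the settings of a
    # filtering tool. Where the reuse is borderline, an exception (quotation,
    # parody) is reasonably conceivable, so the tool does not block ex ante.
    def filtering_decision(matched_seconds: float, upload_seconds: float) -> str:
        reuse_ratio = matched_seconds / upload_seconds if upload_seconds else 0.0
        # Near-complete ('slavish' or 'quasi-slavish') reproduction: block ex ante.
        if reuse_ratio > 0.9 and matched_seconds > 60:
            return 'block'
        # Short or marginal reuse: leave available; rightholders may complain ex post.
        if matched_seconds < 15 or reuse_ratio < 0.2:
            return 'allow'
        # Ambiguous zone within the margin: human review rather than automated blocking.
        return 'human review'

No threshold of that kind can itself establish that an exception applies; it merely confines automated blocking to cases in which the application of an exception is not reasonably conceivable.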


251      In that regard, the approach explained by the Commission in its document ‘Targeted consultation addressed to the participants to the stakeholder dialogue on Article 17 of the Directive on Copyright in the Digital Single Market’ is, in my eyes, a good way to proceed (see pp. 15 and 16 of that document). See, for similar proposals, Quintais, J., Frosio, G., van Gompel, S., et al., op. cit.; Lambrecht, M., op. cit., pp. 79-94; and Leistner, M., op. cit., pp. 193-208. Anti-abuse measures could also be provided for by the Member States. It is not necessary, in the present case, to adopt a more precise position on those various proposals.


252      The French Government submits that the implementation of recognition tools is based on management rules set by the rightholders and that, for example, in the field of cinema, those rightholders would generally tolerate, in exchange for a proportion of the revenue linked with the ‘monetisation’ of the video concerned (see point 58 of this Opinion), the posting of extracts of their films which are several minutes long. It is true that the idea that compliance with the exceptions and limitations could be ensured by voluntary measures taken by rightholders is not unknown in EU law (see Article 6(4) of Directive 2001/29 on technological protection measures). Nevertheless, I consider that in a context of filtering such as that resulting from the contested provisions, given the risks which I have described, the protection of users’ rights should not rest solely on the willingness of those rightholders.


253      In my view, by defining those ‘best practices’, that guidance will help to establish the ‘high industry standards of professional diligence’ which providers must meet, in accordance with the contested provisions. Where appropriate, that guidance will have to be updated in order to keep pace with ‘the evolving state of the art’.


254      See, to the same effect, Grisse, K., op. cit., p. 898.


255      I am thinking of, for example, the illegal posting on a sharing service of a film which has recently or even not yet been released.


256      See, to the same effect, Lambrecht, M., op. cit., pp. 89 and 90.


257      See, inter alia, judgment of 27 March 2014, UPC Telekabel Wien (C‑314/12, EU:C:2014:192, paragraph 61 and the case-law cited).


258      In particular, it is not a question of laying down a de minimis threshold below which rightholders would lose all possibility of asserting their rights. The existence of such a threshold has always been rejected by the Court. See, in that regard, Opinion of Advocate General Szpunar in Pelham and Others (C‑476/17, EU:C:2018:1002, points 28 to 33).


259      That is, moreover, without prejudice to the possibility for rightholders to obtain a court injunction (see the first paragraph of recital 66 of Directive 2019/790), or even bring an action against the user responsible.


260      See, to the same effect, Cabay, J., op. cit., p. 221, and Lambrecht, M., op. cit., p. 90. In that connection, a practical solution could be to inform rightholders as soon as content that reproduces their protected works is uploaded, so that they can, where appropriate, quickly prepare a reasoned request for access to be disabled (see the second subparagraph of Article 17(9) of Directive 2019/790). Those rightholders, therefore, in any event, would not be responsible for monitoring the sharing services themselves in order to discover and locate the content in question, which would be detected automatically for them using the recognition tool.
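For illustration, that practical solution might be sketched as follows; the function names are hypothetical and merely depict the sequence described above:

    # Illustrative sketch only: content matching a reference work is left
    # available, the rightholder is alerted at once, and access is disabled
    # only upon a duly substantiated request (second subparagraph of
    # Article 17(9) of Directive 2019/790). All names are hypothetical.
    from typing import Callable

    def on_upload(upload_id: str, matched_works: list,
                  notify_rightholder: Callable[[str, str], None]) -> None:
        """Keep the upload online and alert each rightholder concerned."""
        for work_id in matched_works:
            notify_rightholder(work_id, upload_id)

    def on_disabling_request(upload_id: str, statement_of_reasons: str,
                             disable_access: Callable[[str], None]) -> None:
        """Disable access only where the request is duly reasoned."""
        if statement_of_reasons.strip():
            disable_access(upload_id)

The design choice this depicts is simply that the burden of the initial detection rests on the recognition tool, while the decision to seek blocking rests with the rightholder.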


261      I note, moreover, that, for content the unlawfulness of which cannot be regarded as manifest, the ECtHR has considered the ‘notice and take down’ system to be an appropriate tool to maintain a ‘fair balance’ between the rights and interests at stake, since such notices make it possible, precisely, to provide service providers with the contextual information necessary to establish that the content is unlawful. See, inter alia, ECtHR, Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, CE:ECHR:2016:0202JUD002294713, § 91.


262      See, by analogy, my Opinion in Joined Cases YouTube and Cyando (point 190). Sharing service providers will therefore not be able simply to approve every blocking request they receive from rightholders without verification.


263      ‘… it cannot be excluded that in some cases availability of unauthorised content can only be avoided upon notification of rightholders.’


264      And, as I stated in point 178 of this Opinion, sharing service providers will have to examine those complaints and notifications with the same promptness – and the same diligence.


265      I wish to make one final point. The defendants and the interveners have emphasised that the ‘main’ way for a sharing service provider to avoid all liability for works and other protected subject matter uploaded to its service is, in accordance with Article 17(1) of Directive 2019/790, to obtain an authorisation from the rightholders. In that context, the liability exemption mechanism provided for in paragraph 4 of that article – and the filtering obligations arising therefrom – will, in any event, concern only works and other protected subject matter for which such authorisation has not been obtained. To me, although this is not, strictly speaking, a ‘safeguard’ governing such filtering, this point is important for the freedom of expression of the users of those services, particularly since, in accordance with Article 17(2) of Directive 2019/790, those authorisations will also cover, in certain circumstances, acts of ‘communication to the public’ carried out by those users. I therefore share the Commission’s view that the Member States should provide, in their national law, for mechanisms to facilitate the grant of such authorisations. The more such authorisations can be obtained by sharing service providers, the more rightholders will obtain appropriate remuneration for the use of their protected subject matter, and the less users will have to undergo filtering and blocking measures for their content.


266      Judgment of 22 June 2021 (C‑682/18 and C‑683/18, EU:C:2021:503).


267      See judgment of 22 June 2021, YouTube and Cyando (C‑682/18 and C‑683/18, EU:C:2021:503, paragraph 59).


268      See Communication from the Commission, ‘Guidance on Article 17 of Directive 2019/790 on Copyright in the Digital Single Market’, in particular pp. 18-24.


269      The Commission refers to subject matter which has a particular economic value during a certain period. However, the ‘earmarking’ mechanism does not seem to be limited to such subject matter. Nor does the guidance define the notion of ‘significant economic harm’. I recall that, according to the French Government, in the event of infringing content being uploaded, the economic harm suffered by rightholders would always be ‘irreparable’.