

Copyright - Realities & Myths

Press Release, published on 19 June 2018

OVERVIEW: ARTICLE 13 – WHAT IS IT ABOUT?

 

Context

  • Since the adoption of the eCommerce Directive in 2000, technology has evolved and platforms have gained ever more control over the content available on their services and the revenues attached to it
  • Such an increase in control and market dominance calls for a rebalancing and an increase in responsibility
  • Without payment there is no investment. Rightholders also have moral rights to decide how and by whom their works should be used (part of the fundamental right to property)

Goals

  • Provide legal certainty for all stakeholders
  • Ensure eCommerce liability limitations are not abused by active services profiting from unauthorised use of copyright protected content
  • Provide for an environment where re-investment is encouraged
  • Safeguard in equal measure the fundamental freedoms of individual users, rightholders and online services

Focus on Active Platforms

 

  • The Rapporteur’s compromise amendments refer to the fact that a general obligation to monitor is forbidden “where applicable” (Art. 13 para. 1.b), as the eCommerce liability exemptions, including Article 15, apply ONLY TO PASSIVE services, i.e. services that do not have any influence on the content they give access to, and NOT TO ACTIVE services, which are the target of the legislative proposal
  • One has to bear in mind that this ban concerns not only copyright infringement but all sorts of illegal content, including terrorist content and child pornography
  • The distinction between active and passive services means the proposal does not contradict the eCommerce liability exemptions or the ban on a general obligation to monitor, and as such upholds and balances fundamental rights and preserves data protection

DETAIL: ARTICLE 13 – WHAT IS IT NOT ABOUT?

 

Not violating the fundamental freedoms of the Charter of Fundamental Rights

  • In the case of Article 13, the Council Legal Service has concluded that the fundamental freedom of expression is not breached as: 1) only content identified by rightholders is to be covered by the measures; and 2) complaint mechanisms must be available to uploaders of content to allow them to contest a take-down.
  • The Charter of Fundamental Rights protects the fundamental freedom of expression, the freedom to conduct a business and the fundamental right to property, including intellectual property (Art. 17(2) of the Charter)
  • The fundamental freedom to conduct a business protects not only internet services but also creative businesses, which should be able to obtain remuneration from business entities profiting from creations they do not own and have no permission to exploit
  • The aim is not to limit citizens’ freedom of expression, and this freedom should not be set against the economic and moral rights of creators
  • The Court of Justice of the EU has consistently ruled that all fundamental freedoms and rights must be balanced against each other, with none taking precedence over another
  • In reality, the number of erroneous take-downs of content is extremely low: reported rates are 0.0002 % for audiovisual content and 0.04 % for music

 

Not violating personal data protection

  • Article 13 concerns exclusively uploaded (and not downloaded or streamed) copyright protected content identified at the moment of the upload (on the basis, for instance, of the fingerprints provided by the rightholders)
  • The envisaged measures would not entail the processing of information on all users or the systematic analysis of their profiles
  • There is no requirement whatsoever to check other data related to the uploaded content such as the identity or IP address of individual “uploaders” or the date, time or location

 

Not in contradiction with the eCommerce liability exemptions & ban on general obligation to monitor

  • There is no hierarchy between secondary acts of legislation. As stated by the Council Legal Service, the use of qualifications such as “not in accordance with”, “incompatible” or even “infringing” in relation to the eCommerce Directive is not, as a matter of law, correct. Indeed, a rule in a more recent act governing a specific subject matter overrides a rule in an existing act governing the relevant subject matter in a general manner.
  • Safe harbour clauses are relevant only for passive platforms. The ban on a general obligation to monitor does not apply to services that are active, that perform a copyright-relevant act and that, as a result, should conclude a license with rightholders. Such a license could provide for a filtering measure, as this is allowed by the eCommerce Directive. Only in the case of purely passive platforms does the ban apply.
  • The Court of Justice has recently ruled in cases such as Stichting Brein vs Ziggo (C-610/15, para. 38) that when a service provides a search engine, indexing and categorising content, it cannot be considered to be making a ‘mere provision’ of physical facilities, i.e. it is not a passive platform service (see rec. 42 of the eCommerce Directive)
  • In the case of passive services, the technical measures referred to in the draft directive will not amount to a general obligation to monitor (see also the Council Legal Service opinion) as:
    ◦ the measures are neither costly nor burdensome (they are already used in practice and vary in nature and cost so as to fit the size and type of every platform, big or small);
    ◦ online services will not have to determine themselves whether specific content is uploaded unlawfully, but only to identify protected content at the moment of upload on the basis of the identification data provided by the rightholders; and
    ◦ since rightholders will have to produce the necessary data themselves, they will share in the cost of the measures taken by the online services, which in turn reduces the logistical and financial burden on those services.
  • The Court cases in which filtering measures were rejected (C-70/10 Scarlet Extended, C-360/10 SABAM v. Netlog) concerned respectively an internet service provider and a social networking platform, but in no event an online content-sharing service, whose main purpose is precisely to provide access to content.

 

 

Press Contact

Grégoire Polad
ACT Director General
Email: gp@acte.be