ACT Position Paper on the Digital Services Act


INITIAL REMARKS

The DSA addresses issues that broadcasters face on a daily basis, as players standing at the nexus of media, technology, news and data policy. The DSA has a specific media dimension. ACT stresses the importance of seeing these proposals in light of fostering pluralism and safeguarding the rule of law whilst delivering innovative digital services, quality entertainment and trusted news.

The guiding mantra against which to benchmark the DSA – ensuring that what is illegal offline is also illegal online – needs to be further and fully reflected in the text and in the discussions amongst co-legislators. This principle is at the core of broadcasters’ view of how the proposals can achieve an effective level playing field for creative industries and protections for viewers.

Broadcasters, and media pluralism at large, require a strong liability regime that can deliver a safe and trustworthy online environment and effective enforcement, while ensuring that the Internet continues to fulfil its role as the vibrant and engaging place we all enjoy. While some players may continue to benefit from liability exemptions, the DSA should not by any means grant additional liability privileges.

The proposed regulation should reflect the present realities of the market, where several players have emerged whose activities go beyond a “mere technical, automatic and passive nature”.

Enlarging the liability exemptions to accommodate this new type of intermediary would prove detrimental to the content creation sector, which needs more robust instruments to fight the illegal dissemination of its content online, and would fall short of answering market demand.

The proposed notice and action procedures will have to be analysed in light of existing copyright laws to ensure that processes leading to rapid take-down and stay-down measures can continue to be applied and improved.

Co-legislators may wish to assess the lost opportunities to crack down on online TV piracy. The role and scope of the trusted flagger and Know Your Business Customer provisions are too narrow to effectively target and suspend abusive behaviour. Unless more is done in this respect, the broadcasting industry will suffer from online piracy for many years to come.

Similarly, there is no logical reason for digital platforms to avoid liability for advertising content which they select, place, promote and ultimately profit from.  An effective regime should ensure that digital platforms are directly liable for all advertising content on their services and are held to account for content that falls short of generally accepted standards – as is the case for broadcasters.

We welcome the proposal’s ambition to achieve more accountability for harmful content, particularly as regards disinformation. Stringent codes of conduct will be required to achieve tangible and verifiable results, commitments and oversight. The mandatory independent audits imposed on very large platforms – essential to assess whether these platforms effectively fight illegal and harmful content and protect fundamental freedoms online – are a first, but not sufficient, step towards the much greater algorithmic transparency that is needed. In sum, while certain measures are in line with the needs of media pluralism and cultural sovereignty in Europe, others will need to be revised to ensure the DSA represents a real upgrade for Europe’s media ecosystem.

ACT and its members look forward to engaging with the European institutions on these proposals. The diagnosis delivered by the EC is accurate; now we must make sure the cure is effective. We will continue to advocate for fair competition and a liability landscape that is fit for the digital age, in order to drive Europe’s media strategy and support a robust, responsible and reliable media landscape.

EXECUTIVE SUMMARY – KEY AREAS OF FOCUS FOR BROADCASTERS

SECTION I: CONDITIONAL LIABILITY EXEMPTIONS (Chapter II, Articles 3-9)

1.1          Active/Passive distinction (Articles 3-5; 18, 20, 23)

  • The requalification of Articles 3-5 of the eCommerce Directive creates ambiguity and needs to better reflect the rich jurisprudence of the CJEU and national courts
  • Online intermediaries that take active measures to maximise profit and consumer attention should be held liable based on the criteria of “optimising the presentation or promoting the content”, in line with CJEU case law (L’Oréal/eBay), regardless of size
  • There should be no special regime for small players (small broadcasters have no such benefits)

1.2          Ensuring that the due diligence obligations capture the right players

  • Some actors play a strategic role in the piracy ecosystem and could, through their actions, contribute to limiting the phenomenon: providers of dedicated server services and leased servers facilitate piracy by providing hosting solutions to illegal streaming sites, and providers of “reverse proxy” services are an essential link in the web woven by pirate sites to organise their anonymity
  • To be fully effective, the DSA should permit the technical intermediaries mentioned above to be expressly qualified as hosting service providers under Article 2(f) of the DSA. The same reasons underlie the need to include CDN services in the definition of hosting: these services are increasingly used by operators of online platforms that are clearly and almost exclusively dedicated to the distribution of counterfeit products

1.3          Make the exemption of liability conditional on compliance with due diligence obligations (Article 5; Recital 18)

  • Mandatory compliance with due diligence obligations should be a necessary precondition of eligibility for liability exemptions

1.4          “Good Samaritan” Principle (Article 6)

  • The “Good Samaritan” principle goes against established EU doctrine and would create a weaker system, to the detriment of the European interest and the online safety of European citizens; legislators should refrain from creating new liability exemptions
  • The rationale for the Good Samaritan principle – removing alleged disincentives for platforms to act proactively against illegal content – is not supported by factual evidence and disregards the duties of care already applicable to passive hosts under the eCommerce Directive
  • It is not acceptable for online intermediaries to decide for themselves which kinds of illegal content they will or will not track

1.5          Orders to act against illegal content / Catalogue-wide injunctions (Article 8; Recitals 29-30)

  • Preserving the standing of national orders is important, yet Member States need greater standing to issue injunctions
  • Neither the applicable DSA recital nor the 2017 Communication elaborates a practical basis for tackling new forms of piracy such as illegal IPTV and illegal live streaming
  • The DSA Article should reflect practical arrangements to terminate or prevent an infringement, allowing courts to issue forward-looking, catalogue-wide and dynamic injunctions

1.6          Orders to provide information (Article 9; Recitals 31-33)

  • To ensure information requests are effective, the language of Article 15(2) of the eCommerce Directive should be mirrored in Article 9 of the DSA; namely, requests by competent authorities enabling the identification of recipients of the service with whom information society service providers have storage agreements
  • It is essential that the scope of these articles is explicitly limited to cross-border orders in order to avoid unnecessary overregulation and interference in Member States’ judicial systems

1.7          Content moderation (Article 12; Recital 38)

  • We welcome the introduction of an obligation for all providers of intermediary services to clearly describe in their terms and conditions, and to enforce in a diligent manner, any policies, procedures, measures and tools used for content moderation and recommender systems

SECTION II: DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT & SAFE ONLINE ENVIRONMENT (Chapter III)

2.1          Notice and Action Procedures (Articles 5, 14; Recitals 40, 42)

  • In practice, broadcasters face organisations that tend to escape their expeditious removal obligations; ACT suggests expanding the definition of hosting service providers and simplifying procedures
  • The proposed requirements diminish the nature and effectiveness of the existing notice-and-takedown procedures and need future-proofing to ensure they are not obsolete upon publication
  • The title of the copyrighted content and the logo of the broadcaster are, and should remain, sufficient to establish the validity of a notice, as court rulings have already confirmed

2.2          Trusted flaggers (Articles 15, 19; Recital 46)

  • The trusted flagger system should become a standard for all hosting service providers; excluding micro and small enterprises misses sources of specific, prevalent and damaging types of pirated content
  • The status should be refined in the proposal to recognise that the scope of eligible entities needs to be wider than representatives of collective interests, so that IP rightholders and their partners can effectively tackle illegal use of their content and continue to rely on and develop existing best practices
  • An obligation for hosting providers to treat notices from trusted flaggers with priority – and immediately for live content – should be combined with a fast-track take-down procedure

2.3          Repeat infringer policy (Article 20; Recital 47)

  • Repeat infringer counter-measures are welcome and, to be effective, need to capture micro and small entities hosting repeat infringers
  • The duration of account suspensions (“for a reasonable period of time”) would benefit from further specification to avoid disparities in interpretation and subsequent transposition
  • The scope of suspensive measures should be widened to tackle networks of dynamic online pirate accounts, with stay-down measures and termination of service for repeat infringers across all their accounts
  • Illegal content that is repeatedly uploaded should stay down

2.4          Know Your Business Customer (Article 22 NEW; Recitals 48-50)

  • KYBC obligations should apply to the providers of information society services on which piracy services and other illegal operators rely
  • Requiring commercial entities to reveal their identity on the internet would, in itself, reduce illegal or harmful content online

2.5          Transparency reporting obligations for providers of online platforms & online advertising (Articles 13, 16, 23-24)

  • There should not be any distinction between illegal content and manifestly illegal content
  • Compliance with the due diligence obligations for a transparent and safe online environment should not be seen as burdensome
  • Adapting the reach of the law to only parts of the market (structurally advantaging digital SMEs over physical SMEs) sets a dangerous precedent and should be avoided


SECTION III: ADDITIONAL OBLIGATIONS FOR VERY LARGE ONLINE PLATFORMS TO MANAGE SYSTEMIC RISKS OF ILLEGAL AND HARMFUL CONTENT

3.1.         Risk assessment (Article 26)

  • The thresholds foreseen by the Commission to qualify a risk as (significantly) systemic are high. The assessment should instead be made in light of the prejudicial effect the risk has on a given sector.
  • The dissemination of illegal content infringing property rights – fully protected by Article 17 of the Charter of Fundamental Rights – should be considered a sufficiently prejudicial risk
  • Safeguards are required to preserve media integrity and to avoid platforms assuming an oversight role over broadcasters’ pre-vetted and regulated content

3.2.        Mitigation of risks (Article 27; Recitals 56-58)

  • Regulators should have a greater role and the means to compel commitments; voluntary “Codes of Conduct” and “Crisis Protocols” should be more robust to qualify as effective mitigation measures

3.3.        Transparency measures for very large online platforms (Articles 28-29)

  • Content providers should be informed, ideally in advance, about any modification to the algorithm and the foreseeable consequences for the visibility of third-party content

3.4.        Additional online advertising transparency (Article 30)

  • We welcome the obligations foreseen in Articles 24 and 30, as the very large online platforms monetise their business through online advertising. These obligations would help create a trusted and transparent online environment. Broadcasters already comply with a comprehensive set of legal and self-regulatory rules for their online and offline offerings. Personalised advertising, which meets the same high standards, is an increasingly crucial source of revenue for media companies that do not have the reach and massive data collection of the dominant online platforms.
  • Meaningful transparency measures require verifiability and open data access for regulators
  • To fully assess the flows of illegal and harmful content on ad networks, a self-declarative approach cannot be a substitute for independent oversight and national regulatory approaches

3.5.         Data access and scrutiny (Article 31; Recital 64)

  • Supervision of VLOPs’ recommendation and moderation algorithms, upon request of the Digital Services Coordinator, to address biases in favour of illegal or harmful content should be the norm
  • The principle of compliance should prevail over trade secrets in order to prevent the dissemination of illegal content online
  • VLOPs should not be able to invoke trade secrets against the Digital Services Coordinator, and obligations such as explainability, transparency by design and active collaboration with the Digital Services Coordinator (DSC) on the purposes of algorithms should be included in the DSA
  • The DSC should be entitled to access all data and algorithms needed for its investigations to ensure that VLOPs are DSA-compliant. Vetted researchers should be able to conduct studies on the DSA and should therefore be able to request the relevant data from VLOPs.

3.6.        Codes of conduct (Article 35; Recitals 67-68)

  • To deliver a true regulatory backstop, the DSA will need to be bolstered with complementary measures
  • For harmful content, and the associated Code of Practice on Disinformation, there is a pressing need for guidance that delivers a step change in commitments and gives regulators the power to compel platforms to adhere in good faith to a high-standard co-regulatory framework, with binding commitments and enforcement backed by penalties

SECTION IV: IMPLEMENTATION, COOPERATION, SANCTIONS AND ENFORCEMENT (CHAPTER IV)

  • The viral spread of illegal and harmful content has a dramatic impact and needs immediate attention; procedures need to be streamlined to ensure the Commission can take the lead
  • Relevant authorities should have the power to request and suggest commitments by VLOPs