INGE vote confirms ACT views on failures of Code of Practice on Disinformation review & need to move past self-regulation
Brussels, 25 January 2022. The Association of Commercial Television and Video on Demand Services in Europe (ACT) welcomes today’s vote in the European Parliament’s INGE Committee on its report on Foreign interference in all democratic processes in the European Union, including disinformation.
We share INGE’s deep concerns over the current self-regulatory approach, which continues to fall far short of expectations even as Europe faces enormous challenges to its public health policy, aggravated by anti-vaccination movements that thrive on video-sharing platforms, social networks and messaging channels. Tech platforms have failed to deal effectively with this issue and continue to delay important commitments to ensure public health and safety.
ACT agrees with the conclusions of the Report, notably, “Deplores the continued self-regulatory nature of the Code since self-regulation is insufficient when it comes to protecting the public from attempts of interference and manipulation, is worried that the updated Code of Practice on Disinformation may not be able to provide an answer to the challenges ahead; is concerned by the strong reliance of the Guidance to strengthen the Code of Practice on the Commission’s DSA proposal; calls for swift action to ensure that the Code of Practice incorporates binding commitments for the platforms to ensure EU’s readiness before the next local, regional, national and European elections;”
We call on the European Commission to set a firm date by which the commitments will be made public. We continue to deplore the lack of transparency in this process and the absence of sufficient safeguards to vet the outcome of the revised Code and guarantee proper oversight.
The secrecy around the drafting of the Code is unacceptable: it undermines democratic scrutiny and erodes trust in both the process and its outcome. There should be transparency about the discussions, access to the meetings, and access to the relevant papers and reports produced by the group. Confidentiality clauses forbidding parties involved in the revision from disclosing any information regarding the process and the outcome of the discussions are particularly outrageous given the Commission’s own guidelines on self-regulation, and they must be lifted.
We continue to believe that any resulting commitments should be subject to a binding Opinion from the Audiovisual Regulators (ERGA) and the Special INGE Committee. As we previously argued before INGE, the taskforce foreseen in the European Commission’s guidance should be created before the adoption of the Code and be given an oversight role in its drafting. Representatives of the Parliament and other affected stakeholders should take part in this taskforce. We remain concerned that neither the Special Committee nor the regulators’ group has been able to gain meaningful insight into the state of the discussions.
“Every European is affected by disinformation, and particularly disinformation related to Covid. Antivax groups continue to thrive online. We cannot wait forever for platforms to make meaningful commitments. The Commission needs to set a clear deadline, share the status with the public and start developing regulation to protect Europeans now, not tomorrow.”
Grégoire Polad, ACT Director General
ANNEX – KEY PROVISIONS IN THE INGE REPORT
52. Underlines that the updated Code of Practice on Disinformation, the Digital Services Act, the Digital Markets Act and other measures linked to the European Democracy Action Plan will require an effective overview, assessment and sanction mechanism after their adoption, in order to evaluate their implementation at national and EU level on a regular basis and identify and remedy loopholes without delay and to sanction misapplication of the commitments as well as inactions; calls in this respect for strong and resourceful Digital Services Coordinators in each Member State, as well as sufficient resources for the enforcement arm of the European Commission to execute the tasks it has been awarded by the DSA; furthermore, stresses the importance for online platforms to be subject to independent audits certified by the Commission; notes that to ensure independence of the auditor, the auditor cannot be funded by the individual platforms;
52a. Calls in this respect for key performance indicators (KPIs) to be defined, by means of co-regulation, in order to have objective indicators to ensure verifiability of the actions taken by the platforms as well as their effects; underlines that these KPIs should include country-specific metrics such as the audience of disinformation, engagement (click-through rate, etc.), funding of in-country fact-checking or research activities, and indicators of the prevalence and strength of in-country civil society relationships;
52b. Is deeply concerned by the lack of transparency in the revision of the Code of Practice on Disinformation, as the discussion remained largely preserved for the private sector and the European Commission; regrets that the European Parliament, in particular the INGE special committee, and some other key stakeholders were not properly consulted during the drafting of the review of the Code of Practice;
52c. Deplores the continued self-regulatory nature of the Code since self-regulation is insufficient when it comes to protecting the public from attempts of interference and manipulation, is worried that the updated Code of Practice on Disinformation may not be able to provide an answer to the challenges ahead; is concerned by the strong reliance of the Guidance to strengthen the Code of Practice on the Commission’s DSA proposal; calls for swift action to ensure that the Code of Practice incorporates binding commitments for the platforms to ensure EU’s readiness before the next local, regional, national and European elections;
 Special Committee on Foreign Interference in all Democratic Processes in the European Union, including Disinformation (INGE)
 Paragraph 52c of the adopted report, see Annex for full text
 See Principles for Better Self- and Co-Regulation, page 130 of the Better Regulation Toolbox 2021 https://ec.europa.eu/info/sites/default/files/br_toolbox-nov_2021_en_0.pdf