
The control of online communication: a threat to media pluralism, freedom of information and human dignity

Doc. 15537: compendium of written amendments | 23/06/2022 | Final version


A. Draft Resolution

1. The Parliamentary Assembly holds that communication policies must be open, transparent and pluralistic, and that they must build on unhindered access to information of public interest and on the responsibility of those disseminating information to society. It notes that online communication has become an essential part of people’s daily lives and is concerned that a handful of internet intermediaries de facto control online information flows. This concentration in the hands of a few private corporations gives them huge economic and technological power, as well as the possibility to influence almost every aspect of people’s private and social lives.

23 June 2022

Tabled by Mr Gianni MARILOTTI, Mr Roberto RAMPI, Ms Ada MARRA, Mr Gerardo GIOVAGNOLI, Mr Andrej HUNKO, Mr Gianluca PERILLI

Votes: 29 in favour, 1 against, 1 abstention

In the draft resolution, after paragraph 1, insert the following paragraph:

"In contrast, a virtuous model of interrelation between users and intermediaries was constituted by Wikileaks. Therefore, it is unacceptable that States tend to punish this horizontal and democratic information communication model instead of encouraging it with a view to gradually implementing a comprehensive legal declassification mechanism."

In amendment 2, replace the words "paragraph 1" with the following words:

"paragraph 3"

2. There are questions on the capacity and willingness of an economic-technological oligopoly to ensure diversity of information sources and pluralism of ideas and opinions online; on the expediency of entrusting artificial intelligence with the task of monitoring online pluralism; and on the real capacity of legal frameworks and democratic institutions in place to prevent the concentration of economic-technological-informational power from being converted into non-democratic political power. Indeed, as electoral communication shifts to the digital sphere, whoever controls online communication during election campaigns may become a formidable political force. Voters may be seriously encumbered in their decisions by misleading, manipulative and false information.
3. The main risk factors in this context are: the lack of transparency of new forms of online advertising, which can too easily escape the restrictions applicable to advertising on traditional media, such as those intended to protect children, public morals or other social values; the fact that journalists, whose behaviour is guided by sound editorial practices and ethical obligations, are no longer the ones playing the gatekeeper role; and the growing amount of disinformation available online, in particular when it is strategically disseminated with the intent to influence election results.
4. From an economic point of view, network effects and economies of scale create a strong tendency towards market concentration. In the context of oligopolistic competition driven by technology, inefficiencies and market failures may stem from the use of market power to discourage the entry of new competitors, from the creation of barriers to switching services or from information asymmetries. Therefore, to address the dominance of a few internet intermediaries in the digital marketplace, member States should use anti-trust legislation. This may give citizens greater choice, enabling them, to the extent possible, to opt for platforms that are likely to better protect their privacy and dignity.
5. A few innovative remedies to mitigate the power of internet intermediaries include giving users, where possible, the option of accessing, consulting and receiving services from third-party providers of their choice, which would rank and/or deliver content according to a classification previously made by the users themselves and could alert them to violent, shocking or otherwise dangerous content.
6. Beyond the business model, crucial issues for internet intermediaries and for the public are the quality and variety of information and the plurality of sources available online. Internet intermediaries increasingly use algorithmic systems, which are helpful for searching the internet, automatically creating and distributing content, identifying potentially illegal content, verifying information published online and moderating online communication. However, algorithmic systems can be abused or used dishonestly to shape information, knowledge, the formation of individual and collective opinions, and even emotions and actions. Coupled with the economic and technological power of big platforms, this risk becomes particularly serious.
7. With the emergence of internet intermediaries, harmful content is spreading at a very high speed on the web. Internet intermediaries should be particularly mindful of their duty of care where they produce or manage the content available on their platforms, or where they play a curatorial or editorial role, while avoiding taking down third-party content, except for clearly illegal content.
8. The use of artificial intelligence and automated filters for content moderation is neither reliable nor effective. Big platforms already have a long record of mistaken or harmful content moderation decisions in areas such as terrorist or extremist content. Solutions to policy challenges such as hate speech, terrorist propaganda and disinformation are often multifactorial; therefore, mandating automated moderation by law is an inappropriate and incomplete solution. It is important to acknowledge and properly articulate the role and necessary presence of human decision makers, as well as the participation of users in the establishment and assessment of content moderation policies.
9. Today there is a trend towards the regulation of social media platforms. Whilst increased democratic oversight is necessary, regulation enacted in practice often entails overbroad power and the discretion of government authorities over information flows, which endanger freedom of expression. Lawmakers should aim at reinforcing transparency and focus on companies’ due processes and operations, rather than on the content itself. Moreover, legislation should deal with “illegal content” and avoid using broader notions such as “harmful content”.
10. If lawmakers choose to impose very heavy regulations on all internet intermediaries, including new smaller companies, this might consolidate the position of big actors which are already in the market. In such a case, new actors would have little chance of entering the market. Therefore, there is a need for a gradual approach, to accommodate different types of regulations on different types of platforms.
11. The Parliamentary Assembly recalls that, in its Recommendation CM/Rec(2018)2 on the roles and responsibilities of internet intermediaries, the Committee of Ministers of the Council of Europe indicates that any legislation should clearly define the powers granted to public authorities as they relate to internet intermediaries, and that Recommendation CM/Rec(2020)1 on the human rights impacts of algorithmic systems confirms that rule of law standards must be maintained in the context of algorithmic systems.
12. Internet intermediaries must ensure a certain degree of transparency of the algorithmic systems they use, because these may have an impact on freedom of expression. At the same time, in their capacity as private companies, they should enjoy, without prejudice to effective transparency and to human rights, their legitimate right to commercial secrecy. Member States must strike a balance between the freedom of private economic undertakings, with their right to develop their own commercial strategies, including the use of algorithmic systems, and the right of the public to communicate freely online, with access to a wide range of information sources. They should also recognise that content removal is not in itself a solution to societal harms, as more rigorous content moderation may displace the problem of online hate speech to less popular platforms rather than address its causes.

23 June 2022

Tabled by Mr Gianni MARILOTTI, Mr Roberto RAMPI, Ms Ada MARRA, Mr Gerardo GIOVAGNOLI, Mr Andrej HUNKO, Mr Gianluca PERILLI

Votes: 29 in favour, 2 against, 1 abstention

In the draft resolution, at the end of paragraph 12, insert the following sentences:

"The right balance is enhanced by adherence to the professional rules and ethics of the journalist, who with cross-checking subjects the sources to scrutiny. This was scrupulously respected by Assange and member States must act in line with Resolution 2300 (2019) "Improving the protection of whistle-blowers all over Europe". Each member State must recognise, and ensure respect of, the right of journalists to protect their sources, and develop an appropriate normative, judicial and institutional framework to protect whistleblowers and whistleblowing facilitators. It is unfair that this right was not taken into account in the extradition judgment brought against him. In accordance with Resolution 2317 (2020) "Threats to media freedom and journalists' security in Europe", the detention and criminal prosecution of Mr Julian Assange constitute a dangerous precedent for journalists. As the UN Special Rapporteur on Torture and Other Cruel, Inhuman or Degrading Treatment or Punishment declared on 1 November 2019, Mr Assange's extradition to the United States must be barred and he must be promptly released."

In amendment 3, replace the words "at the end of paragraph 12, insert the following sentences: The right balance" with the following words:

"after paragraph 14, insert the following sentences: The right to free and pluralist information"

13. Internet intermediaries have the responsibility to ensure the protection of users’ rights, including freedom of expression. Therefore, member States should ensure internet intermediaries’ accountability for the algorithmic systems they develop and use in the automated production and distribution of information, as well as for their lines of funding and the policies they implement for creating information flows and dealing with illegal content.
14. In particular, internet intermediaries should assume specific responsibilities based on international standards and national legislation regarding users’ protection against manipulation, disinformation, harassment, hate speech and any expression which infringes privacy and human dignity. The functioning of internet intermediaries and technological developments behind their operation must be guided by high ethical principles. It is from both a legal and ethical perspective that internet intermediaries must assume their responsibility to ensure a free and pluralistic flow of information online which is respectful of human rights.

23 June 2022

Tabled by Mr Gianni MARILOTTI, Mr Roberto RAMPI, Ms Ada MARRA, Mr Gerardo GIOVAGNOLI, Mr Andrej HUNKO, Mr Gianluca PERILLI

Votes: 30 in favour, 1 against, 1 abstention

In the draft resolution, after paragraph 14, insert the following paragraph:

"When they deal with information, Internet intermediaries have to act in accordance with the principles set out in Resolution 2382 (2021) "Media freedom, public trust and the people's right to know". The "chilling effect" against journalists is an example of major threat to the people's right to know. As the Commissioner for Human Rights puts it in her letter to Priti Patel on 10 May 2022, "the broad and vague nature of the allegations against Mr Assange, and of the offences listed in the indictment, are troubling as many of them concern activities at the core of investigative journalism in Europe and beyond" (CommHR/DM/sf012/2022)."

In amendment 1, delete the second and third sentences.


15. Consequently, the Assembly calls on Council of Europe member States to:
15.1. bring their legislation and practice into line with Recommendation CM/Rec(2020)1 on the human rights impacts of algorithmic systems, and Recommendation CM/Rec(2018)2 on the roles and responsibilities of internet intermediaries;
15.2. consider whether the concentration of economic and technological power in the hands of a few internet intermediaries can be properly dealt with via existing general competition regulations and tools;
15.3. use anti-trust legislation to force monopolies to divest part of their assets and reduce their dominance in digital markets;
15.4. develop a gradual regulatory approach, applying different types of regulation to different types of internet intermediaries, with the aim of enabling new actors to enter the market rather than pushing them out of it;
15.5. address anticompetitive conduct in digital markets by strengthening the enforcement of regulations on mergers and the abuse of monopolistic positions;
15.6. guarantee that any legislation imposing duties and restrictions on internet intermediaries with an impact on users’ freedom of expression is exclusively aimed at dealing with “illegal content”, avoiding broader notions such as “harmful content”;
15.7. ensure that legislation does not permit purely automated content moderation; in this connection, encourage internet intermediaries, via legal and policy measures, to:
15.7.1. allow users to choose means of direct and efficient communication which do not rely solely on automated tools;
15.7.2. ensure that, where automated means are used, the technology is sufficiently reliable to limit the rate of errors whereby content is wrongly considered illegal;
15.8. guarantee that legally mandated content moderation provides for the necessary presence of human decision makers, and incorporates sufficient safeguards so that freedom of expression is not hampered;
15.9. encourage, via legal and policy measures, the participation of users in the establishment and assessment of content moderation policies;
15.10. ensure that regulation enacted to ensure transparency of automated content moderation systems is based on a clear definition both of the information that it is necessary and useful to disclose and of the public interest that legitimises the disclosure;
15.11. support the elaboration of, and respect for, a general framework of internet intermediaries’ ethics, including the principles of transparency, justice, non-maleficence, responsibility, privacy, and the rights and freedoms of users;
15.12. encourage internet intermediaries, via legal and policy measures, to counteract hate speech online by issuing warning messages to persons who spread hate speech online or by inviting users to review messages before sending them, and encourage internet intermediaries to add such guidelines to their codes of conduct dealing with hate speech;
15.13. consider adapting election legislation and policies to the new digital environment by reviewing provisions on electoral communication; in this respect, reinforce the accountability of internet intermediaries in terms of transparency and access to data, promote quality journalism, empower voters to evaluate electoral communication critically, and develop media literacy.