
Online media and journalism: challenges and accountability

Committee Opinion | Doc. 14240 | 24 January 2017

Committee
Committee on Legal Affairs and Human Rights
Rapporteur :
Mr Boriss CILEVIČS, Latvia, SOC
Origin
Reference to committee: Doc. 13589, Reference 4082 of 3 October 2014. Reporting committee: Committee on Culture, Science, Education and Media. See Doc. 14228. Opinion approved by the committee on 23 January 2017. 2017 - First part-session

A Conclusions of the committee

1. The Committee on Legal Affairs and Human Rights congratulates the rapporteur of the Committee on Culture, Science, Education and Media, Ms Adele Gambaro (Italy, ALDE), on her comprehensive report and broadly supports the draft resolution.
2. Owing to its rapid and constant development, online media and journalism is a complex field which raises difficult legal issues, especially concerning the liability of internet news portals. The committee therefore proposes some amendments aimed at strengthening the legal aspects of the resolution.

B Proposed amendments

Amendment A (to the draft resolution)

After paragraph 2, insert the following paragraph:

“The Assembly wishes to emphasise in this context the public broadcasters’ special responsibility to adequately reflect the entire diversity of views present in society and recalls Committee of Ministers’ Recommendation CM/Rec(2012)1 on public service media governance. Since public broadcasters are increasingly involved in the online media market, they could be instrumental in achieving the goals of the present resolution.”

Amendment B (to the draft resolution)

After paragraph 3, insert the following paragraph:

“As constantly stressed by the European Court of Human Rights in its case law, the press plays a vital role in a democratic society in imparting information on matters of public interest. It acts as a ‘public watchdog’, allowing members of the public to discover and form opinions about the attitudes and actions of political figures.”

Amendment C (to the draft resolution)

After paragraph 5, insert the following paragraph:

“The Assembly stresses the importance of the case law of the European Court of Human Rights and especially of its Grand Chamber judgment in the case of Delfi AS v. Estonia (Application No. 64569/09). This landmark decision has clarified the duties and responsibilities of internet news portals when they provide, on a commercial basis, a platform for user-generated comments on previously published content.”

Amendment D (to the draft resolution)

After paragraph 7, insert the following paragraph:

“Referring to Recommendation CM/Rec(2014)7 of the Committee of Ministers on the protection of whistleblowers, and recalling its own Resolutions 1729 (2010) and 2060 (2015) on this subject, the Assembly reminds member States that they should have in place a normative, institutional and judicial framework to protect individuals who, in good faith, report or disclose information on threats or harm to the public interest. This is particularly relevant in the context of online media and journalism as the internet is one of the channels typically used by whistle-blowers to make wrongdoings public.”

Amendment E (to the draft resolution)

After paragraph 8.1.6, add the following paragraph:

“who have not yet done so, sign and ratify the Council of Europe Convention on Cybercrime (ETS No. 185) as well as its Additional Protocol concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems (ETS No. 189).”

Amendment F (to the draft resolution)

After paragraph 8.1.6, add the following paragraph:

“should co-operate with online media and internet service providers in order to set up codes of conduct which are inspired by the code of conduct countering illegal hate speech online agreed upon by the European Commission and major internet companies on 31 May 2016.”

Amendment G (to the draft resolution)

After paragraph 8.1.6, add the following paragraph:

“should develop clearer rules on liability of internet site owners for content posted by third parties, taking in particular into account the landmark judgment of the European Court of Human Rights in the case of Delfi AS v. Estonia.”

C Explanatory memorandum by Mr Boriss Cilevičs, rapporteur for opinion

1. I can only congratulate Ms Adele Gambaro on her report, which rightly highlights the challenges posed by the growing presence and usage of online media and the effect this has on journalism. Whilst internet journalism provides easy access to a huge range of information and news that would be otherwise unavailable, it also increases the possibility that such information may be wrongly reported, manipulated or altered. The variety in the types of internet news platforms presents complex considerations for member States in deciding how to tackle these issues. I should, therefore, like to propose some amendments to the draft resolution, with a view to completing it from a legal perspective.
2. It seems to me useful to complement the report presented by the Committee on Culture, Science, Education and Media with some definitions as well as with further reference to the relevant recent case law of the European Court of Human Rights (“the Court”).

1 Definitions

3. Throughout the report, the term “service provider” is used. It is useful to recall that Article 1.c of the Council of Europe Convention on Cybercrime (ETS No. 185) defines it as follows:
“‘service provider’ means:
i. any public or private entity that provides to users of its service the ability to communicate by means of a computer system, and
ii. any other entity that processes or stores computer data on behalf of such communication service or users of such service.”
4. Furthermore, the report and the draft resolution often refer to “professional journalists”. For obvious reasons, it would be difficult, and perhaps not advisable, to define a “professional” journalist precisely. However, I would like to recall an existing broad definition which ought to be taken into account in this context.
5. Recommendation No. R (2000) 7 of the Committee of Ministers on the rights of journalists not to disclose their sources of information defines the term “journalist” as follows: “the term ‘journalist’ means any natural or legal person who is regularly or professionally engaged in the collection and dissemination of information to the public via any means of mass communication” (see Appendix to Recommendation No. R (2000) 7). While recalling that the Court has not specified the requirements for being considered a journalist under Article 10 of the European Convention on Human Rights (ETS No. 5, “the Convention”), the Explanatory memorandum to Recommendation No. R (2000) 7 adds that “the Recommendation uses the terms ‘regularly or professionally engaged’. This must not exclude, however, journalists who work freelance or part-time, are at the beginning of their professional career, or work on an independent investigation over some time. Professional accreditation or membership is not necessary. Nevertheless, individuals who otherwise would not regard themselves as being journalists shall not qualify as journalists for the purposes of this Recommendation”.

2 The Strasbourg Court’s relevant case law

6. The European Court of Human Rights has found in a number of cases that the right of the public to receive information falls under the protection of Article 10 in the same way as the right to impart information. Furthermore, in the case Times Newspapers Ltd v. the United Kingdom, the Court emphasised the importance of the internet and its vital role “in enhancing the public’s access to news and facilitating the dissemination of information in general”. It described the internet as a one-of-a-kind medium because of its capacity to store and communicate vast amounts of information and its general accessibility to the public. Internet archives and their maintenance also fall under the protection of Article 10. Moreover, the Court set out in detail how far the scope of Article 10 reaches in the case Ahmet Yıldırım v. Turkey, noting that Google Sites, as a service that facilitates the creation and sharing of websites within a group, constitutes a means of exercising freedom of expression.
7. In this context, the question arises as to how to define when liability can be imposed on intermediary websites for user-generated content, and to what extent they can remove offensive or defamatory material without infringing on the right to freedom of expression, as enshrined in Article 10 of the Convention. Currently, there exist differing legal regimes as to what extent owners of internet sites should monitor the content and information that is posted or provided by users.
8. Directive 2000/31/EC of the European Parliament and of the Council on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (“Directive on electronic commerce”) excludes, under certain conditions, the liability of service providers for information stored on their websites, and prevents member States from imposing on them a general obligation to monitor the information that they transmit or store. The European Court of Human Rights, on the other hand, ruled in Delfi AS v. Estonia and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary that website owners may be liable for hate speech or incitement to violence posted by third parties, even if they have no notice of it. Data protection considerations also come into play: as noted by Ms Gambaro in her report, the Court of Justice of the European Union, referring to Data Protection Directive 95/46/EC of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data, recently ruled that personal information about users cannot be stored without a time limit and that all users have the “right to be forgotten”.
9. In the case Delfi AS v. Estonia, the Court was able to provide certain clarifications. It had to examine a complaint about liability for comments made by users on an internet news portal. Delfi, one of the biggest news portals in Estonia, ran a story that generated some offensive responses, including threats directed against an individual. Even though Delfi removed the offending comments, the individual facing threats was awarded damages. The question before the Court was whether Delfi’s freedom to impart information had been breached by its being held liable for comments posted by third parties. Confronted with this novel issue for the first time, the Court emphasised the important role the internet plays in society: it not only provides the public with a range of benefits, especially with regard to freedom of expression, but also involves a whole series of dangers. As the Court stressed, “[d]efamatory and other types of clearly unlawful speech, including hate speech and speech inciting violence, can be disseminated like never before, worldwide, in a matter of seconds, and sometimes remain persistently available online” (paragraph 110). The Court had to balance two competing interests in this case: the protection of personality rights under Article 8 and the internet as a medium facilitating freedom of expression under Article 10. It found that holding Delfi liable was a justified and proportionate restriction on the freedom of expression exercised through its portal, taking into account the commercial nature of the internet news portal and, most importantly, the fact that the incriminated comments amounted to hate speech or incitement to violence.
10. In the case Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, the Court again addressed the liability of a self-regulatory body of internet content providers and an internet news portal for comments posted on their websites. The Court drew a direct comparison between the two cases and followed the same approach as in Delfi. However, taking into account the economic interest of the applicants and the consequences of the comments, it concluded that this case was in fact different. It held that there had been a violation of freedom of expression under Article 10, noting that “although offensive and vulgar, the incriminated comments did not constitute clearly unlawful speech; and they certainly did not amount to hate speech or incitement to violence” (paragraph 64). Most importantly, however, the Court reiterated that internet portals assume duties and responsibilities under Article 10.2 and can be held liable for user-generated comments containing clearly unlawful expressions amounting to hate speech and incitement to violence.

3 Amendments

3.1 Amendment A (to the draft resolution)

Explanatory note:

This amendment emphasises the importance and special responsibility of public media and its obligations in a democratic society, referring to Committee of Ministers’ Recommendation CM/Rec(2012)1 on public service media governance, which states that “public service media play a specific role with regard to [the respect of the right to seek and receive information] and the provision of a varied and high-quality content, contributing to the reinforcement of democracy and social cohesion, and promoting intercultural dialogue and mutual understanding”.

3.2 Amendments B, C and G (to the draft resolution)

Explanatory note:

These amendments aim to recall the role of the press in a democratic society, as described by the European Court of Human Rights in its case law, and to add a specific reference to the Court’s landmark decision in Delfi AS v. Estonia, whose relevance is described above.

3.3 Amendment D (to the draft resolution)

The amendment is self-explanatory.

3.4 Amendment E (to the draft resolution)

Explanatory note:

The Council of Europe Convention on Cybercrime is the first international treaty on crimes committed via the internet and other computer networks, dealing particularly with infringements of copyright, computer-related fraud, child pornography and violations of network security. Its main objective, set out in the preamble, is “to pursue … a common criminal policy aimed at the protection of society against cybercrime, inter alia, by adopting appropriate legislation and fostering international co-operation”.

The scope of the Cybercrime Convention has been extended via its Additional Protocol concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems (ETS No. 189). As is pointed out in the Explanatory Report to the Additional Protocol, “[t]he emergence of international communication networks like the internet provide certain persons with modern and powerful means to support racism and xenophobia and enables them to disseminate easily and widely expressions containing such ideas”.

Both instruments are therefore relevant in this context, especially with regard to defining terms such as “service provider” or legally unacceptable content, and deserve to be mentioned in the resolution.

3.5 Amendment F (to the draft resolution)

Explanatory note

Codes of conduct can be instrumental in curbing illegal hate speech, which spreads so easily and quickly via the internet. The code of conduct countering illegal hate speech online agreed upon by the European Commission and major internet companies on 31 May 2016 should serve as an inspiration.
