Committee Opinion | Doc. 15056 | 30 January 2020
Democracy hacked? How to respond?
Committee on Legal Affairs and Human Rights
A. Conclusions of the Committee


B. Proposed amendments
Amendment A (to the draft resolution)
At the end of paragraph 1, add the following words: “, affecting the right to freedom of expression, including the right to receive information, and the right to free elections”.
Amendment B (to the draft resolution)
Replace paragraph 2 by the following: “The Assembly recalls that in its Resolution 2217 (2018) on legal challenges related to hybrid war and human rights obligations, it raised concerns over non-military techniques of “hybrid war”, such as cyberattacks, mass disinformation campaigns and interference in the election processes, as well as over the adequacy of existing legal norms. It was particularly concerned about numerous cases of mass disinformation campaigns intended to undermine security, public order and peaceful democratic processes, and stressed the vital need to develop tools to protect democracy from “information weapons”.”
Amendment C (to the draft resolution)
Add a new paragraph 3 with the following wording: “The Assembly is concerned that the phenomenon of systematic influencing of public opinion, in particular before elections, has reached an unprecedented scale and that it is often associated with foreign interference, whether by State or non-State actors. It recalls that such interference by a State in another State’s internal affairs contravenes the well-established international legal principle of non-intervention and that States have a duty to investigate and hold to account any non-State actors that interfere in the democratic processes in other States.”
Amendment D (to the draft resolution)
At the end of paragraph 3, add the following sentence: “It calls on Council of Europe member States to refrain from taking any measures aimed at imposing State control over the Internet and is worried about the growing trend, in non-democratic States, of governments asserting authority over society by controlling the Internet.”
Amendment E (to the draft resolution)
In paragraph 4, after sub-paragraph 4.3, add a new sub-paragraph 4.4 as follows: “invites Council of Europe member States to sign and/or ratify, where this is not already the case, and fully implement the Council of Europe Convention on Cybercrime (ETS No. 185) and its Additional Protocol concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems (ETS No. 189)”.
Amendment F (to the draft resolution)
In paragraph 5, add a new sub-paragraph 5.4. with the following wording: “consider updating national legislation in order to counter disinformation campaigns more effectively;”.
Amendment G (to the draft resolution)
At the end of paragraph 5, add a new sub-paragraph with the following wording: “provide, in their respective civil, administrative and/or criminal legislation, for sanctions against natural and legal persons involved in “troll factories” disseminating disinformation;”.
Amendment H (to the draft resolution)
At the end of paragraph 7 add: “It also invites the European Commission and the Strategic Communication Task Forces of the European External Action Service to ensure greater participation of relevant NGOs, acting within the European Union, in the leadership and consultation of their relevant bodies countering disinformation, in order to better detect, analyse and expose disinformation. These bodies should work closely together, in a more transparent way, and regularly exchange information, for the sake of the common good”.
Amendment I (to the draft resolution)
After paragraph 7, add a new paragraph: “The Assembly also calls on the member States of the European Union to considerably increase the European Union’s budget allocated to the Strategic Communication Task Forces of the European External Action Service in order to strengthen the European Union’s capability to combat disinformation”.
Amendment J (to the draft resolution)
After paragraph 7, add a new paragraph: “The Assembly also welcomes the work of the intergovernmental European Centre of Excellence for Countering Hybrid Threats, which co-operates closely with the European Union and the North Atlantic Treaty Organization (NATO) and assists the participating States, the European Union and NATO in better understanding and countering hybrid threats, including the influence of disinformation”.
C. Explanatory memorandum by Mr Zingeris, rapporteur for opinion
1. Explanatory notes
1.1. Amendment A (to the draft resolution)
This amendment is intended to stress that digital disinformation campaigns aimed at shaping public opinion, and other phenomena and behaviours mentioned in paragraph 1 of the draft resolution, represent a challenge not only for democracy, but also for the right to freedom of expression, enshrined in Article 10 of the European Convention on Human Rights, and the right to free elections, guaranteed by Article 3 of Protocol No. 1 to the Convention.
1.2. Amendment B (to the draft resolution)
This amendment is intended to reformulate paragraph 2 of the draft resolution in line with paragraphs 3 and 4 of Resolution 2217 (2018); the current version of this paragraph refers only to paragraph 4 of the said resolution. In paragraph 3 of Resolution 2217 (2018), the Assembly referred to the notion of “hybrid war” as a combination of military and non-military techniques; the latter may consist, in particular, of cyberattacks, mass disinformation campaigns and interference in electoral processes. The Assembly also expressed concern as to the adequacy of the existing legal norms that apply to the phenomenon of “hybrid war”. Moreover, I would propose deleting the words “as regards cyberattacks” at the beginning of the paragraph (since cyberattacks do not necessarily cover the issue of disinformation), as well as the reference to Recommendation 2130 (2018) (which contains other proposals).
1.3. Amendment C (to the draft resolution)
While paragraph 1 of the draft resolution mentions “trends in foreign electoral interference”, this problem is not mentioned elsewhere in the text. Foreign interference in elections is a very worrying issue, which should have been given much more emphasis in this report. According to experts, misinformation campaigns, “sometimes backed by governments”, have influenced major events in Europe, such as the Brexit vote and the debates around Catalan independence in Spain and immigration in Italy. The fact that the 2016 elections in the United States of America and the 2017 presidential elections in France were or may have been influenced by actors traced back to Russian locations is mentioned in Mr Schmidt’s report (paragraph 26).
These incidents, as well as other cases of Russian actors’ involvement in misinformation campaigns, were also addressed in the above-mentioned report by our committee colleague Mr Cilevičs. Moreover, Mr Schmidt’s report refers to the 2019 EU report on the implementation of the Action Plan Against Disinformation, according to which Russian sources carried out a widespread disinformation campaign intended to suppress voter turnout and influence voters’ preferences during the latest European elections. According to this report, such a campaign, deployed by State and non-State actors, poses “a hybrid threat to the EU”.
The European Parliament’s resolution of 10 October 2019 (mentioned in Mr Schmidt’s report, but in a general context) refers many times to disinformation campaigns originating from Russia and to incidents of foreign electoral interference. In this resolution, the European Parliament “expresses deep concern over the highly dangerous nature of Russian propaganda in particular (…) and notes with concern that the number of disinformation cases attributed to Russian sources and documented by the EU East Strategic Communication Task Force more than doubled since January 2019”.
The said resolution also strongly condemned “the increasingly aggressive actions of state and non-state actors from third countries seeking to undermine or suspend the normative foundations and principles of European democracies and the sovereignty of all EU accession countries in the Western Balkans and Eastern Partnership countries”. Moreover, the European Parliament stressed that “(…) foreign interference in elections undermines the right of people to have their say in the governance of their country, directly or through freely chosen representatives, as enshrined in the Universal Declaration of Human Rights, and that such interference by other states constitutes a violation of international law, even when there is no use of military force, threat to territorial integrity or threat to political independence”. The principle of non-interference in other States’ domestic affairs is well established in international law and derives from the principle of the sovereign equality of States; both principles were reaffirmed by the United Nations General Assembly in its declaration of 24 October 1970.
Moreover, in the Helsinki Final Act of the Conference on Security and Cooperation in Europe of 1 August 1975, the participating States stated that they would “(…) refrain from any intervention, direct or indirect, individual or collective, in the internal or external affairs falling within the domestic jurisdiction of another participating State, regardless of their mutual relations.”
Therefore, I am of the opinion that the draft resolution should put more emphasis on the issue of foreign interference in political processes and on its implications for States’ sovereignty.
1.4. Amendment D (to the draft resolution)
This amendment is intended to reaffirm the Assembly’s attachment to the right to freedom of expression, including freedom of expression on the Internet, and to protect this freedom from undue State interference. The right to “receive and impart information and ideas without interference by public authority and regardless of frontiers” is enshrined in Article 10 of the European Convention on Human Rights.
1.5. Amendment E (to the draft resolution)
This amendment aims at adding a sub-paragraph to promote implementation at national level of the Council of Europe Convention on Cybercrime (ETS No. 185), which is the only binding international instrument in this field, and its Additional Protocol concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems (ETS No. 189), which requires that “distributing, or otherwise making available to the public”, through a computer system, “racist and xenophobic material” and “material which denies, grossly minimises, approves or justifies acts constituting genocide or crimes against humanity” be criminalised in national laws (Articles 3, paragraph 1, and 6, paragraph 1). Member States of the Council of Europe which have not yet adhered to these instruments should sign and/or ratify them.
1.6. Amendment F (to the draft resolution)
This amendment is aimed at stressing the need to amend national legislation, if necessary, in order to make it better adapted to the objective of countering disinformation campaigns. As explained in Mr Schmidt’s report, Germany and France have adopted special laws to tackle this phenomenon. The German Network Enforcement Act (NetzDG, also known as the “Facebook law”) of June 2017, although criticised as a possible threat to freedom of expression, entered into force on 1 January 2018. It obliges social media companies such as Facebook and Twitter to introduce complaint procedures for users to report “illegal content” (meaning hate speech, defamation, incitement to violence and other offences defined in the Criminal Code) posted on their platforms. A bill inspired by the German law was announced by the Croatian government in January 2018.
In France, the “Law on combating information manipulation” was passed in December 2018 and is now in force. The law defines “information manipulation” as “inexact or misleading allegations or imputations of a fact that could alter the sincerity of an upcoming vote and that is spread deliberately, artificially or automatically and massively to the online public through a communication service”. It imposes several obligations (including reporting and transparency) on digital platform operators during electoral campaigns, especially during the three months preceding any electoral vote. During this period, a candidate, a political party or a political group, the public prosecutor or any person having an interest may bring the case before a judge (juge des référés) in order to counter disinformation during the electoral period. The law also gives the authorities (the Conseil Supérieur de l’Audiovisuel) the power to suspend the distribution of a service provided by a digital platform operator until the end of the vote, if it finds that the service is under the influence of a foreign State and is deliberately disseminating false information in order to influence the vote. Despite the controversies raised even before its adoption, it is important to note that this law specifically addresses the issue of disinformation campaigns aimed at influencing the result of elections.
1.7. Amendment G (to the draft resolution)
As mentioned in Mr Schmidt’s report (paragraph 81), “trolling” is one of the symptoms of “democracy hacking”. For example, the existence of a “troll factory” based in Saint Petersburg has often been reported by the media. Therefore, States should do more to combat the actions of “troll factories”, in particular by providing, in their relevant legislation, for adequate sanctions against natural and legal persons involved in such activities.
1.8. Amendment H (to the draft resolution)
In carrying out their various activities against disinformation, the European Commission and the European External Action Service should co-operate with relevant NGOs, which could assist them in detecting fake news. Their relevant internal bodies should also co-operate closely.
1.9. Amendment I (to the draft resolution)
The European Union bodies charged with combating disinformation have been allocated insufficient resources. As indicated in the European Commission’s Joint Communication of 5 December 2018, the budget for strategic communication was expected to more than double in 2019, from 1.9 million euros to 5 million euros. However, this is still very little compared with the sums allegedly spent by some countries to spread their propaganda (for example, it seems that Russia spends more than 1 billion euros for this purpose).
1.10. Amendment J (to the draft resolution)
While the report rightly examines (in section 5) the measures taken by the European Union to combat disinformation, this amendment is aimed at putting emphasis on the work of the European Centre of Excellence for Countering Hybrid Threats (“Hybrid CoE”), which is based in Helsinki and was established in 2017. The role of this intergovernmental think-tank consists of assisting the participating States, the European Union and NATO in better understanding and countering hybrid threats, including the influence of disinformation.