‘Democracy hacked’

“The Internet is not only a place of democracy and open debate but also a place where criminal views are expressed, where discourse is deliberately manipulated through the hidden use of social bots, where fake news is knowingly disseminated on a large scale, where major cyberattacks are increasing in number and where individuals’ personal data are traded before being used for the purposes of manipulation,” PACE rapporteur Frithjof Schmidt (Germany, SOC) stressed in his opening speech at a hearing on “Democracy hacked? How to respond?”.

The hearing was organised by the PACE Political Affairs Committee in Paris on 11 September with the participation of Professor Divina Frau-Meigs, sociologist and media researcher at the Sorbonne Nouvelle University in Paris, and Ben Scott, member of the management board of the think-tank Stiftung Neue Verantwortung and former technology and innovation policy adviser for the Hillary Clinton campaign in 2016.

Participants agreed that there have been attempts to influence opinion-forming processes in democratic states and, consequently, decision-making processes, with a high risk of destabilising political institutions. They stressed that it is high time for states to assume greater responsibility with regard to the Internet and to work to strengthen the principle of accountability on the part of social media companies.

For Professor Frau-Meigs, the European market is at least partly responsible, as it has been unable to create a European equivalent of Facebook or to promote a European search engine. On a daily basis, she said, most of us use tools with US-embedded values, such as Google, to which none of our European rules, whether tax obligations, advertising constraints, public service obligations or anti-trust laws, apply. “We need to create a new paradigm that corresponds to the European idea of value and we need to build societal resilience through fact-checking, the integrity of information and media literacy,” she said, recommending the creation of a European institution on information disorder and digital citizenship.

Ben Scott warned that the greatest threat to democracy comes not from outside but from inside. Social media products can serve to split audiences by simply reinforcing existing biases with divisive political messages about issues like race, migration and religion, and can amplify domestic political movements. At the same time, he said, it was getting ever easier to make extreme points of view appear credible to larger parts of society. “This toxic combination is not only a reality in the US, but also in Europe and poses a serious challenge for democracy,” he said, advocating a public sector initiative for a ‘Charter for Digital Democracy’. Election security, he said, is clearly one of the most pressing issues “if we want to stop public distrust turning into an existential threat”.

Another priority is protection against illegal content, such as hate speech, without endangering freedom of expression. For rapporteur Schmidt, the trend in discourse on the web and social media, and the changes that have affected the role of information there, must indeed be taken into account in international conventions that govern digital activities.

Other aspects such a Charter for Digital Democracy should focus on, according to the experts, include a monitoring function, with tools that help civil society better understand ‘online research results’; transparency, for example labelling robot-run websites as such; social accountability for automated decision-making; transparency of resources; and, most importantly, media literacy.

Indeed, “cocooning”, i.e. the tendency of connected groups of individuals to keep to themselves and follow only “news”, whether true or false, that confirms their point of view, combined with a lack of media literacy, are issues that urgently need to be addressed, participants concluded. Even “digital natives” apply very low standards to the sources they accept as accurate, and have proven inept at judging credibility.

Finally, the experts explained the functioning of the current business model of the Internet and social media which encourages the spread of false information and, in so doing, threatens representative democracies.

Digital marketing, which in essence aims to identify the message best suited to selling a product or idea, deciphers our behaviours, captures our attention and, over time, turns us into loyal buyers or followers. Ads ginning up fear and outrage on topics such as religion and migration can benefit from Google’s and Facebook’s machine-learning algorithms, which scan vast amounts of data and test multitudes of political messages to determine the best way to find and engage an audience, participants were told.

Since the economic interests of advertisers and social media companies are essentially aligned, the latter need to think about how to disentangle themselves from the goals they implicitly share with disinformation actors. Experts and politicians should therefore call for limits on the vast amounts of data available to a digital advertising industry dominated by social media and internet-platform companies. They should enforce comprehensive privacy reforms and raise public awareness of how data are collected and used. In the long run, they predicted, only plummeting advertising income and tumbling follower numbers will trigger policy changes.