Motion for a recommendation | Doc. 14628 | 26 September 2018

Justice by algorithm – the role of artificial intelligence in policing and criminal justice systems

Signatories: Mr Boriss CILEVIČS, Latvia, SOC ; Ms Thorhildur Sunna ÆVARSDÓTTIR, Iceland, SOC ; Ms Doris BURES, Austria, SOC ; Mr Ervin BUSHATI, Albania, SOC ; Ms Klotilda BUSHKA, Albania, SOC ; Mr Algirdas BUTKEVIČIUS, Lithuania, SOC ; Mr Hendrik DAEMS, Belgium, ALDE ; Ms Maria GUZENINA, Finland, SOC ; Mr Valeri JABLIANOV, Bulgaria, SOC ; Ms Ioanneta KAVVADIA, Greece, UEL ; Mr Tiny KOX, Netherlands, UEL ; Ms Kerstin LUNDGREN, Sweden, ALDE ; Mr Dirk Van der MAELEN, Belgium, SOC ; Mr Pieter OMTZIGT, Netherlands, EPP/CD ; Mr Frithjof SCHMIDT, Germany, SOC ; Ms Ingjerd SCHOU, Norway, EPP/CD ; Mr Frank SCHWABE, Germany, SOC ; Mr Gheorghe-Dinu SOCOTAR, Romania, SOC ; Ms Olena SOTNYK, Ukraine, ALDE ; Ms Petra De SUTTER, Belgium, SOC ; Mr Sergiy VLASENKO, Ukraine, EPP/CD ; Mr Phil WILSON, United Kingdom, SOC ; Mr Leonid YEMETS, Ukraine, EPP/CD ; Mr Emanuelis ZINGERIS, Lithuania, EPP/CD

This motion has not been discussed in the Assembly and commits only those who have signed it.

The criminal justice system represents one of the central areas of state activity, ensuring public order, preventing violations of various fundamental rights and detecting, investigating, prosecuting and punishing criminal offences. It gives the authorities significant intrusive or coercive powers including surveillance, arrest, search and seizure, detention, and use of physical and even lethal force.

Data processing tools are increasingly being used in criminal justice systems. The most advanced systems use predictive algorithms to inform decision-making in areas including policing patterns, bail and sentencing. They have in many ways proved effective and are often valued by the authorities that use them.

There are, however, grounds for concern. These systems are usually provided by private companies, in which case the algorithms are commercial secrets – “black boxes” that cannot be subjected to public scrutiny. The quality of an algorithm’s output depends on the quality of its input data: if the input data inadvertently reflect, for example, racial bias, so will the output, despite the algorithm’s apparent neutrality and objectivity. Decision makers may be reluctant to depart from recommendations generated by algorithms, at the expense of the important role of individual judgment and discretion. Police departments may lose control over their own data, making them dependent on the private companies that have acquired it, with little choice but to maintain contractual relations whatever the cost.

The Parliamentary Assembly should examine the role of algorithms and artificial intelligence in criminal justice systems from the perspective of Council of Europe standards on human rights and the rule of law, with a view to making recommendations, as appropriate, to member States and to the Committee of Ministers for further action.