1. Introduction
1. Over the past two years, data
collection and processing through digital public health technologies,
such as contact tracing applications (CTAs), have been promoted
worldwide by governments as well as private companies to mitigate
the Covid-19 pandemic, identify people at risk of infection
or ensure compliance with confinement rules. The widespread use
of these novel digital tools seemed an appealing solution to limit infection,
including to the World Health Organisation (WHO), which swiftly issued
guidelines on their use and the related ethical considerations in May 2020.
In
many countries, however, their introduction was met with hesitation due,
inter alia, to privacy issues.
2. Nowadays, countless other health, lifestyle and wellness applications
are also available in app stores, to help people quit smoking, count
steps, eat healthier, guide lifestyle changes, etc. The ethical
and legal frameworks governing them remain unclear to this day, and a global debate
has emerged around the risks associated with such tools. These risks go
beyond the legal challenges posed by unlawful interference with
individuals’ right to a private life: such tools can threaten human
rights and fundamental freedoms during and after a crisis, and cross the
blurred line between disease surveillance and population surveillance.
3. Furthermore, the effectiveness of such digital technologies
largely depends on their technical design, the way they are implemented
and the level of public trust in those tools. Thus, there might
be a need for appropriate legislation and policies, including in
education, to frame and ensure a privacy-friendly use of such digital
tracking technologies.
4. These considerations prompted our Committee to table a motion
for resolution
in October 2020.
5. My report aims to identify the problem and put forward
relevant recommendations, not only to safeguard human rights and
fundamental freedoms but also to examine the balance that must be
struck between data protection concerns and the need to assess the
effectiveness of these tools in protecting public health.
6. My analysis builds on the background report by Professor Wolfgang
E. Ebbers, whom I thank for his work, and on the valuable contribution
by the experts who participated in our committee hearings.
The discussions
pointed to the need for the scientific community to assess the effectiveness
and impact of CTAs to improve public health policies, while complying
with data protection standards. I have also taken account of contributions
by other scientific experts
and by members
of the committee.
2. The position of Council of Europe bodies
on contact tracing applications
7. The Council of Europe Consultative
Committee established by the Convention for the protection of individuals
with regard to automatic processing of personal data (Convention
108), the Data Protection Commissioner as well as the Committee
on Bioethics (DH-BIO) issued several statements on contact tracing
applications in 2020-2021, raising concerns and providing useful
guidelines. Indeed, these applications allow for massive surveillance
and collection of personal data (for example by monitoring people’s
movements and vaccinations, evidence of past infection, symptoms,
test results, compliance with confinement measures or
digital tracing of contacts), the storage, analysis and use of which
raise serious questions regarding the protection of privacy. Data
protection standards laid down by Convention 108 and its modernised version,
Convention 108+, must be implemented when adopting any extraordinary
measures.
8. In order to assist Parties to Convention 108 in addressing
privacy and data protection issues when setting up and implementing
measures to fight against the Covid-19 pandemic, two joint declarations
by the Chair of the Committee of Convention 108 and the Data Protection
Commissioner of the Council of Europe were issued in 2020. Those
declarations recall that the general principles and rules of data
protection are fully compatible and reconcilable with other fundamental
rights and relevant public interests, such as public health. It
is essential to ensure that data protection frameworks continue
protecting individuals and that the necessary privacy and data protection
safeguards are incorporated in extraordinary measures that are taken
to protect public health.
9. According to the first joint statement of 30 March 2020 on
the right to data protection in the context of the Covid-19 pandemic,
States must only take
temporary measures that are necessary and proportionate to the legitimate
aim pursued and which respect democracy, the rule of law and human
rights, including the rights to privacy and to protection of personal
data. Special attention is required in certain sectors such as public health,
employment, telecommunications and education.
10. The second Joint Statement on Digital Contact Tracing of 28
April 2020
recalls that “large-scale personal
data processing can only be performed when, on the basis of scientific
evidence, the potential public health benefits of such digital epidemic
surveillance (e.g. contact tracing), including their accuracy, override
the benefits of other alternative solutions which would be less
intrusive”. The need for these measures to constitute an integral
part of a national epidemiologic strategy should be underlined,
and the importance of the choice of the tracing model and its inclusive
nature highlighted. The joint statement also points to conditions
of acceptability, and therefore of effectiveness, of such a system,
among which trust and voluntariness are first and foremost.
11. As for the legitimacy of the data processing, the joint statement
advocates a legal basis directly provided by law, while keeping
the use of the application voluntary. Carrying out a
privacy impact assessment and applying the “privacy-by-design” principle
figure among the guarantees to be used in relation to such processing
operations. The need to apply the purpose specification principle
thoroughly is highlighted. The practical applicability of important
concepts must also be made clear (such as data sensitivity, quality
and minimisation, the rights of data subjects with respect to automated
decision making, the need to de-identify data, data security requirements,
the link between the choice of digital architecture and the
protection of privacy, and the importance of interoperability, transparency,
the temporary nature of databases, oversight and audit).
3. Key
aspects of contact tracing applications
12. The following are some key
features which should be taken into account when developing CTAs, highlighting
some of the major recommendations and guidelines issued by WHO as
well as relevant Council of Europe bodies. They mainly pertain to
data protection, ethics, and risks of discrimination. It should
also be noted that the acceptability of these systems depends on
the understanding and trust that they inspire in the population.
3.1. Efficacy
and accuracy issues
13. Despite the widespread deployment of contact tracing applications
around the globe, there is still a lack of scientific evidence about
their efficacy, and often no risk assessment studies. Thus, their accuracy
and effectiveness remain uncertain, and even if they can contribute
to the pandemic response, the data quality and integrity risks may
outweigh their advantages. This uncertainty and the lack of knowledge
about their scientific efficacy in turn make a risk assessment of
these technologies extremely challenging.
14. The development and deployment of a novel technology usually
require a great deal of investment and time before it becomes effective.
These technologies may only be effective in countries with an adequate technological
infrastructure: a widespread use of smartphones or other appropriate
devices as well as easy access to the internet are indispensable.
A digital proximity tracking technology must also be adopted by a sufficiently
large proportion of the population to be effective for contact identification.
However, reaching the targeted usage rate while safeguarding freedom
of choice is not easy, and public attitudes may vary significantly.
15. Their effectiveness also depends on users’ speedy response
to any alerts. Furthermore, some applications are not compatible
with older smartphones, and applications can generate false alarms.
The risk of over-reporting interactions can also lead to an excessive
number of “false positives”. This, in turn, could result in needless
self-isolation or might even lead to users’ distrust in the reliability
of warnings.
16. As stated by the Council of Europe data protection authorities,
the key question is: “considering the absence of evidence of their
efficacy, are the promises worth the predictable societal and legal
risks?” If the benefits override those of alternative and less intrusive
solutions, then the legal and technical safeguards to mitigate the
risks at stake must be in place.
I will further elaborate
on the current impact assessment challenges in the next chapter.
3.2. Data
protection and privacy
17. Deployment of contact tracing
applications requires data protection and privacy laws to be in
place in order to provide a legal basis and define limits for data
processing and usage. All digital public health tools breach individual
privacy through access to information on the individual’s health
status, behaviour or location.
18. A possible risk is often referred to as “surveillance creep”,
whereby a crisis is used as an opportunity to establish tracking
of citizens and to retain it during and after the crisis. Consequently,
governments might continue tracking citizens and retain their data,
which could then be used in other contexts such as law enforcement.
19. Several regulatory instruments offer safeguards to the right
to privacy and data protection, in particular Convention 108 and
at European Union level, the General Data Protection Regulation,
the e-Privacy Directive, as well as the European Union Charter of
Fundamental Rights. These instruments also allow rights to be restricted
in specific circumstances, including public health crises. Such restrictions
must be proportionate to the public benefits and must pave the way
for the swift restoration of other suspended rights and freedoms,
such as freedom of movement and assembly.
20. The processing of personal data gathered by such applications
has also raised considerable fundamental rights concerns. Some countries
have developed bio-surveillance programmes that share some characteristics
of both a pandemic response and counter-terrorism programmes.
For instance,
an Alibaba-backed government-run application that supports decisions
on who should be quarantined for Covid-19 in China also shared information
with the police.
21. It is crucial to distinguish digital public health technologies
that allow third-party sharing of information for non-health-related
purposes from those that do not. Additionally, contact tracing applications
must clarify upfront the type of data collected and the duration
of information storage. Heightened surveillance empowered
by digital public health technologies must stop once the
emergency is over. Finally, information collected should reside on a
user’s own device, rather than on servers run by the application developer
or a public health entity.
3.3. Voluntariness
and autonomy
22. Digital technologies have the
potential to threaten not only privacy but also personal autonomy.
The most obvious form of violation of personal autonomy is making
their use mandatory. All workers in India have been obliged to use
a government-backed Covid-19 tracking application.
Even when such technologies are not mandatory,
some companies might require their use to gain access to their services.
23. Some mobile phone applications include permissions to collect
data beyond the stated purpose of the application, undermining people’s
ability to consent to being tracked or having their information
shared. For instance, Bluetooth-based proximity tracking applications
require users to keep Bluetooth turned on, creating additional
risks. Allegedly, a Polish app forced patients to take selfies as
proof of their confinement or else face a police visit.
24. For the use of those applications to be voluntary, individuals
should decide whether to carry a smartphone, download and install
the applications, leave them operating in the background all the
time, react to alerts, and share the contact logs after testing
positive.
In addition,
users should be free to uninstall those applications at any time
and remove any data that has already been collected. Potential users
should be able to make informed decisions on the application’s functioning
and on the transmission, access and use of data.
25. Those who decide not to use or to remove the application should
not be sanctioned or restricted in any way. In China, people were
required to use contact tracing applications to access public areas,
such as subways, malls and markets. Even though the use of the applications
is nominally voluntary, such restrictions make it
de facto compulsory.
Voluntariness
can also be impeded through indirect coercion if a government threatens
to impose a second lockdown if not enough people download the application.
In France, for instance, the use of tracing applications was discussed
in parliament alongside easing lockdown measures. Peer pressure
and societal expectations can engender an environment in which people
feel compelled to act.
3.4. Inequities
and discrimination
26. There is a manifest inequality
in terms of access to contact tracing applications. For instance,
high-risk groups, such as the elderly, may not have access to smartphones
and may therefore be excluded from the use of this technology.
Thus, these tools could deepen the
digital divide and cause an unequal distribution of burdens and
benefits amongst the population.
27. Disadvantaged workers and their families may be less likely
to work from home and run a higher risk of infection.
Thus, in the case of a selective quarantine
due to tracking applications, a larger proportion of disadvantaged
workers could be quarantined and be more likely to bear the social,
economic, and psychological ill effects of quarantine.
Voluntary access to such technologies in resource-limited settings
should therefore be encouraged, for example through lower mobile data
costs or low-cost devices.
28. Since these technologies collect data, the data gathered may
include race, ethnicity, gender, political affiliation and
socio-economic status. Some of these sensitive data are not necessarily
related to public health and can lead to the stigmatisation of ethnic
or socio-economic groups.
29. The stratification of the population might deepen existing
divides, leaving some groups more vulnerable to the crisis. At the
same time, data collection should also capture epidemiological factors
such as social and economic differences that are known to drive
disparities in infection rates. Such efforts, especially when taking place
in low trust environments, need to be supplemented by robust safeguards,
including analytical capacities to contextualise the data to avoid
further stigmatisation of underserved groups and provide evidence-based action
against persistent health inequalities.
4. Difficulty
in assessing CTAs’ impact on public health
30. Many countries in and outside
Europe accompanied the introduction of CTAs with efforts to monitor
and assess their effectiveness and impact on public health. The
European Centre for Disease Prevention and Control, together with
WHO, developed an indicator framework to evaluate the public health
effectiveness of digital proximity tracing solutions, with a list
of indicators that can be used as a basis for a standardised evaluation
of the public health effectiveness of CTAs.
31. Scientists all over the world have collected scientific evidence
as to why and how citizens were downloading and activating CTAs
and when they anonymously notified others after testing positive.
Experts call the first two steps “adoption” of CTAs, while the latter
reflects one aspect of users' “adherence” to CTAs’ recommendations.
32. Initial scientific reports focused on what makes users follow
the recommendations. CTAs send anonymous notifications informing
the user that he or she has been in contact with an individual who has
subsequently tested positive. Many of these notifications are accompanied by
public health recommendations, such as messages containing instructions
to “get tested”, “quarantine” and “not receive any visitors”. Do
people follow those recommendations? If not, deploying CTAs to protect
public health would not achieve its goal of preventing onward transmission.
33. So far, substantial scientific evidence on the public health
impact of CTAs seems to be relatively limited. The paucity of data
collected has made evaluating their effectiveness very challenging,
with few reports on even basic metrics such as the number of active
users, number of positive tests entered, number of notifications
sent and received, and data that would enable the inference of both
the accuracy and the effectiveness of the notifications.
34. At the hearings organised by the committee, some experts highlighted
how strictly data protection standards had been implemented.
35. CTAs using the Google-Apple Exposure Notification system are
designed in such a way that they do not collect identifiable health
data, at least not without explicit consent. Manual contact tracing,
via labs and testing centres for instance, may lead to the accumulation
of very sensitive, health-related information. These data cannot
be shared with third parties, including the scientific community,
without consent, or only in an aggregated and anonymised way.
36. This principle also applies to data collected via touchpoints
of CTA users with the health system, for example when getting tested
or traced by manual contact tracing. These data are vital for CTA
effectiveness analyses but have been virtually unavailable for technical
and data protection reasons.
5. Insights
into CTAs’ adoption and adherence
37. Clearly, the higher the CTA
adoption and adherence rate, the higher the chances of effectively
breaking chains of infection.
The numbers and rates of adoption vary by country and time period
and are generally quite low according to recent studies.
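To illustrate why adoption matters so much, the sketch below sets out a simple back-of-the-envelope relationship (an illustrative assumption of random mixing, not a figure drawn from the studies cited): a contact can only be traced digitally if both people involved are running the CTA, so the share of contacts covered falls roughly with the square of the adoption rate.

```python
# Illustrative sketch only, under a simple random-mixing assumption: a contact
# can be traced digitally only if both parties run the CTA, so coverage of
# contacts scales roughly with the square of the adoption rate.
def contact_coverage(adoption_rate: float) -> float:
    """Approximate fraction of contacts in which both parties use the app."""
    return adoption_rate ** 2

for adoption in (0.15, 0.30, 0.60):
    print(f"adoption {adoption:.0%} -> ~{contact_coverage(adoption):.0%} of contacts covered")
```

Under that assumption, even moderate drops in adoption translate into sharp drops in the proportion of transmission chains that a CTA can detect at all.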
38. In 2022, many countries started to lift isolation and quarantine
requirements and several apps are no longer in place, at least temporarily,
for instance the Dutch, the Swiss and the Icelandic apps.
39. To assess the adoption rate, the related numbers are often
expressed as proportions of a country’s entire population. An alternative
is to look at only the eligible target population based on compatible
phones and age range.
40. Data sources include surveys and numbers of downloads from
the main app stores as a proxy for the proportion of users.
However, this method can lead to overestimating the proportion of
CTA users, as users may have more than one smartphone, delete the
app or change smartphones.
41. An alternative is to take the number of active users as a
basis, while complying with data protection standards. The UK app,
for example, pings a small, anonymous “status” packet back to
a server, whereas Switzerland used a more indirect method.
Some app stores also provide metrics which allow for a better estimation
of the number of active users than the number of downloads does.
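A minimal sketch of these estimation approaches follows, using purely hypothetical figures: download counts tend to overestimate the number of users, anonymous active-user signals give a lower but more realistic figure, and the choice of denominator (entire population versus eligible target population) also changes the reported rate.

```python
# Minimal sketch of the adoption-rate estimates described above, using purely
# hypothetical numbers (no real country data). Downloads overestimate users
# (multiple devices, deletions, phone changes); anonymous "active user" signals
# are lower but more realistic; the denominator also matters.
downloads = 3_500_000            # hypothetical app-store download count
active_users = 2_100_000         # hypothetical count of anonymous status pings
total_population = 10_000_000    # hypothetical country population
eligible_population = 7_000_000  # hypothetical: compatible phone and age range

def rate(numerator: int, denominator: int) -> str:
    return f"{numerator / denominator:.1%}"

print("Adoption (downloads / total population):   ", rate(downloads, total_population))
print("Adoption (downloads / eligible population):", rate(downloads, eligible_population))
print("Adoption (active users / eligible pop.):   ", rate(active_users, eligible_population))
```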
42. Likewise, the more users adhere to the recommendations, the
higher the chances of effectively breaking chains of infection.
Rates vary across countries, time periods and calculation methods,
which is why this report does not present country-based statistics.
43. For instance, until 2022, users in the Netherlands could only
share a user code in co-operation with public health authorities
and every instance was registered and used for epidemiological analyses
of routine contact tracing data.
Ireland issued codes as part of a structured
series of phone calls performed by contact tracers, and in 2022
it moved to a more automated approach. Switzerland used self-report
questionnaires.
44. Germany, on the other hand, uses data collected directly from
the CTA itself, with the user’s explicit consent, in order to evaluate
the effectiveness of the German Corona-Warn-App.
45. Alternatively, countries can use voluntary, anonymous self-report
surveys on adherence to recommendations, as was the case in Germany,
Switzerland,
the Netherlands,
and the United Kingdom.
This would comply with data protection
standards. However, the data might be biased and non-representative
due to the uncertainty of self-reporting.
46. Recent scientific reports pointed to several factors that
facilitate or hamper CTA adoption and adherence.
Higher education and income levels
for instance relate to higher levels of adoption.
Perceived effectiveness, benefit,
social influence, peer pressure and trust in a technological innovation
and in a government’s action are also relevant indicators. The higher
the level of these factors, the higher the chances of adoption and
adherence, which in turn leads to better CTA performance.
47. CTAs can be perceived as not directly benefiting one’s own
health once infected, but only as protecting others and society as
a whole by reducing contagion. However, in the case of new variants
or entirely new viruses, one could personally benefit from early
detection and timely treatment, and CTAs may have a key role to
play both for the individual and for society.
48. These observations show the importance of a timely, substantial
and thorough assessment of the public health impact of this digital
technology; if that impact proves negative, the population may reject
or abandon the innovation. A government’s action in honestly and
accurately informing the population about public health measures
may increase trust in government.
49. Furthermore, voluntary access to CTAs in resource-limited
settings should be encouraged, for example by minimising mobile
data costs, promoting low-cost devices, and facilitating conditions
such as a help function, a tutorial, or testimonials of other individuals
who use the technology.
50. Another important aspect to consider is related to public
attitudes towards public health measures which may restrict individual
freedoms for the sake of public interest. Low CTA adoption can also
be explained by a limited sense of individual and collective responsibility
for one’s own health as well as other people’s health, including
that of vulnerable groups. This phenomenon is similar to anti-vaccination
attitudes, which were discussed in
Resolution 2455 (2022) “Fighting vaccine-preventable diseases through quality
services and anti-vaccine myth-busting”, adopted by the Assembly
on 24 June 2022.
51. Negative attitudes or low interest can be tackled through
systematic, targeted information campaigns, both through the media
and with civic initiatives in schools, that are context-specific,
based on science, address doubts and concerns raised, debunk disinformation and
highlight individual and collective responsibility for one’s own
health as well as other people’s health.
6. The
need for public debate and parliamentary scrutiny
52. The evaluation of the Dutch
app and the review of the Swiss app
highlighted the importance of public debate
in relation to two important issues, namely location tracking and
adoption rates.
53. In early 2020, policymakers and scientists urged caution and
expressed concern related to data protection issues.
In the UK
and the Netherlands,
scientists wrote open letters to
express their concern about the risks of privacy infringement.
54. This was understandable. Regarding location tracking, US developers
of Covid-19 apps said that it was vital they be allowed to use GPS
location data in conjunction with the new contact tracing system
to track how outbreaks move and to identify hotspots.
The UK National Health Service also initially considered, then abandoned,
the idea of centrally recording the de-anonymised ID of someone who
is infected and his or her contacts.
This feature would indeed have enabled a form of surveillance.
55. Apple and Google, whose operating systems power a vast majority
of all smartphones, started implementation work in their operating
systems on a decentralised, privacy-preserving proximity tracing protocol
for notifying people who have been in contact with people who have
tested positive for Covid-19, which was eventually adopted by many
countries. Both companies stressed that privacy, and preventing governments
from using the system to compile data on citizens, were primary goals.
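The decentralised logic can be illustrated with a very simplified sketch (an illustrative model only, not the actual Google/Apple Exposure Notification protocol or its API): phones broadcast short-lived random identifiers over Bluetooth and remember the identifiers they hear; a user who tests positive uploads only their own identifiers, and matching happens locally on each device, so no central server learns who met whom.

```python
# Highly simplified illustration of the decentralised matching idea.
import secrets

class Phone:
    def __init__(self):
        self.own_ids = []       # identifiers this phone has broadcast
        self.heard_ids = set()  # identifiers heard from nearby phones

    def broadcast(self):
        rolling_id = secrets.token_bytes(16)  # random, short-lived identifier
        self.own_ids.append(rolling_id)
        return rolling_id

    def hear(self, rolling_id):
        self.heard_ids.add(rolling_id)

    def check_exposure(self, published_infected_ids):
        # Matching happens on the device; the server never learns who met whom.
        return any(i in self.heard_ids for i in published_infected_ids)

alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())      # Alice and Bob were in proximity
uploaded = list(alice.own_ids)   # Alice tests positive and uploads only her own IDs
print("Bob notified:", bob.check_exposure(uploaded))  # -> True
```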
56. The Dutch evaluation team found that, up until October
2021, the majority of Dutch citizens still believed that the CTA
was tracking their location, which possibly hampered the app’s uptake.
57. Furthermore, the findings of a March 2020 Oxford University
study were misinterpreted by the media, which claimed
that at least 60% of a country’s population needed to install a
CTA to make it work, something that was never stated in the original study.
The researchers themselves complained that their work had been profoundly misinterpreted,
and that in fact much lower levels of app adoption could still be
vitally important for tackling Covid-19.
58. In the Netherlands, the majority of Dutch citizens up until
October 2021 still believed that at least 50% of the population
needed to install the app to make it work. At the same time, the
majority of the Dutch population also indicated that the actual
adoption ratio was below 50%.
Likewise, in Switzerland, the comparatively low proportion of
SwissCovid app users was met with disappointment by the public and the media.
59. These examples highlight the need for public debate and parliamentary
scrutiny from the initial stages on matters like apps’ effectiveness
or data protection concerns, also in order to avoid misinformation in the
media, which undermines people’s trust and limits their uptake of novel
technology.
60. Ms Emilija Gargcin, a representative of the Council of Europe
Advisory Council on Youth, speaking before our committee on 4 March
2022, also warned against the risks for young people growing up in
a “datafied” society with little control over their data, as
many commercial apps change privacy settings with software updates.
Collecting data should not be an easy way for governments to circumvent
engagement with citizens, in particular young people. Governments
need instead to clearly communicate how health-related data collection
and processing are different from commercial collection and use.
61. Public debate, including but not only at parliamentary level
and following a proper assessment in each country, is also needed
in the later stages to decide whether and how to use this technology
and communicate with the public, with a view to fighting a future
pandemic or other threats to public health.
7. Balancing
data protection principles and the need for scientific evidence
and impact assessment
62. Over the past two years, concerns
about privacy protection and security were indeed justified. Several gaps
were found in Google and Apple’s exposure notification framework.
A substantial privacy flaw was uncovered in April 2021, when the
Android version of Google's
exposure notification framework let other preinstalled apps potentially
view sensitive data, including whether someone had been in contact
with someone who tested positive for Covid-19.
Also, the Google Play Services component
has frequent contacts with Google servers, potentially enabling
location tracking.
63. These weaknesses show that data protection must remain a priority,
and they may also have a negative impact on CTA adoption; nevertheless,
according to experts, the available CTAs are in general well engineered.
However, to protect public health,
potentially sensitive data are needed for the scientific community
to be able to prove CTAs’ effectiveness and impact.
Furthermore, as
discussed above, high CTA adoption is influenced by proven effectiveness.
64. The public must always be honestly and accurately informed
about public health interventions. According to the scientific community
we consulted, transparency on whether CTAs are effective can however be
at odds with data protection regulations. For a proper assessment
of CTAs’ health impact via sound data analyses and modelling, CTA
data are needed, notably those related to the number of exposure
notifications, to the risk scoring function (probability of notified
contacts then testing positive), to testing and voluntary or mandatory
quarantine.
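By way of illustration, the sketch below shows the kind of aggregate metrics such an assessment would rely on, using entirely hypothetical figures (none of these numbers come from any national CTA).

```python
# Illustrative sketch only, with hypothetical aggregate figures, of metrics
# that could feed an assessment of CTA public health impact.
notifications_sent = 120_000             # hypothetical exposure notifications
notified_then_tested = 45_000            # hypothetical: notified users who got tested
notified_tested_positive = 6_300         # hypothetical: of those, who tested positive
positive_users_uploading_keys = 30_000   # hypothetical: positives sharing their result

notifications_per_upload = notifications_sent / positive_users_uploading_keys
follow_up_testing_rate = notified_then_tested / notifications_sent
positivity_among_notified = notified_tested_positive / notified_then_tested

print(f"Notifications per shared positive test:  {notifications_per_upload:.1f}")
print(f"Notified users who then got tested:      {follow_up_testing_rate:.1%}")
print(f"Test positivity among notified contacts: {positivity_among_notified:.1%}")
```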
65. Many if not all European CTAs are designed, secured, and protected
to such an extent that it is virtually impossible for anyone, including
governments and independent scientists, to access the data.
66. Continuous quality improvement of public policy interventions
and processes is essential to public health.
CTAs in particular
must remain responsive to an evolving situation with changing transmission
and immune-evasion patterns and other new scientific evidence. For
instance, time and distance determine whether being in contact with
infected users constitutes such a high risk that one is notified
and advised to test for the coronavirus.
67. From the start, many CTAs had a time parameter set within
a range from five to fifteen minutes and a distance parameter set
to about one and a half metres. What if a much more infectious variant
emerges and spreads, like the Omicron variant? Should the parameters
be changed then? To answer these questions, CTAs’ datasets must
be combined and analysed. However, in most cases the users’ consent
required for such processing is lacking.
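As a minimal sketch, the following shows how such time and distance parameters could be expressed and adjusted, for instance for a markedly more infectious variant; real CTAs estimate proximity from Bluetooth signal attenuation and compute weighted risk scores, so this is an illustration of the principle only.

```python
# Minimal sketch of a configurable exposure rule using the indicative
# parameters mentioned above; an illustration, not an actual CTA algorithm.
from dataclasses import dataclass

@dataclass
class ExposureConfig:
    min_duration_min: float = 10.0  # within the 5-15 minute range cited above
    max_distance_m: float = 1.5     # roughly one and a half metres

def is_risky(duration_min: float, distance_m: float, cfg: ExposureConfig) -> bool:
    """Flag a contact only if it exceeds both thresholds."""
    return duration_min >= cfg.min_duration_min and distance_m <= cfg.max_distance_m

default_cfg = ExposureConfig()
# Hypothetical tightening of the rule for a markedly more infectious variant:
variant_cfg = ExposureConfig(min_duration_min=5.0, max_distance_m=2.0)

print(is_risky(duration_min=8, distance_m=1.2, cfg=default_cfg))  # False
print(is_risky(duration_min=8, distance_m=1.2, cfg=variant_cfg))  # True
```

Deciding whether such parameters should be changed, and with what effect, is precisely the kind of question that cannot be answered without combining and analysing CTA datasets.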
68. An early study on CTAs based on Google/Apple Exposure Notification
showed that the European Union General Data Protection Regulation
is not to be seen as “a hindrance, but rather as an advantage in
conditions of uncertainty such as a pandemic”,
as it offers a functional blueprint
for a system design that is compatible with fundamental rights.
69. A more recent study underlined that the lack of detailed and
centralised data limits evaluations of CTA effectiveness.
The effectiveness of CTAs was often
doubted, especially in the early days. Yet concrete data for comparing
the effects of digital contact tracing with manual contact tracing
are largely missing.
70. Nevertheless, a balance between the need for data protection
on the one hand and the need for health impact assessments on the
other hand should be struck. This is key not only for Covid-19 CTAs
but also for future technology to be designed to fight future health
crises.
71. At the hearing organised by the committee on 4 March 2022
in Paris, Ms Alessandra Pierucci, Chairperson of the Committee of
Convention 108, stressed that privacy and the right to the protection
of personal data were not, and should not be depicted as, an obstacle
either to saving lives or to the achievement of other fundamental
rights and public interests. I share her views that Convention 108
as well as its modernised version, Convention 108+, which have been
drawn up precisely to reply to the manifold challenges raised by
new technologies, are flexible enough to ensure the balancing of
the protection of personal data with other rights and public interests.
72. For example, the monitoring of a life-threatening epidemic
is explicitly mentioned (paragraph 47 of the Explanatory Report
of Convention 108+) among those interests for which the processing
of personal data is legitimate. Further processing for scientific
research is also compatible with that purpose, subject to appropriate safeguards
(Article 5). Restrictions on certain data protection principles
are allowed in respect of scientific research purposes, but only
if provided for by law and where there is no recognisable risk of
infringement of the rights of data subjects (Article 11.2).
73. Ms Pierucci also warned that personal data, in particular
health-related data, could end up in the hands of unwanted actors
such as insurance companies, employers, or even other public actors,
for unforeseen purposes. This should certainly be avoided.
74. The intervention of data protection authorities in the aftermath
of the launch of contact tracing apps did not consist in a ban of
such digital tools, but rather in a call to ensure the application
of appropriate data protection standards, for example: 1) that a
data protection impact assessment is carried out before starting the
processing; 2) that the technological set-up is aimed at avoiding
the processing of unnecessary data; 3) that data processing is not
carried out for unforeseeable purposes; 4) that a high level of
data security is granted, as well as data quality considering that
the implications of the processing could be serious (self-isolation,
testing) for the individuals identified as potential contacts of
someone infected; 5) that the data subject has the right not to
be subject to automated decisions without a clear facility to challenge
the consequences of such decisions, particularly in the light of
inaccuracies which may occur in the systems.
75. Data protection rules are essential to ensure trust. Just as the
sharing of health information with one’s personal doctor is based
on trust, the same should apply to any kind of processing
carried out by authorities, even with the help of private actors,
in the interest of public health.
76. There might be a margin to render certain processing legitimate
for scientific research, but this should be provided for by the
law and with the appropriate safeguards. Convention 108 and Convention
108+ allow for exceptions in extraordinary circumstances, if well
defined by legislation, preserving the essence of data protection
principles and re-expanding them once the emergency is over.
77. Furthermore, recourse to CTAs must be efficient and based
on a comprehensive national epidemiologic strategy articulated around
different tools. Technology can contribute greatly to the achievement
of public interests, but problem-solving cannot be reduced to an
uncritical delegation to technology without a careful balancing
of all interests at stake and an appropriate evaluation of its effects
and efficiency.
78. Those points were also highlighted by Mr Walter, Council of
Europe Data Protection Commissioner, speaking before our committee
on 21 June 2022. In particular, he stressed that opting for the
blind use of technology without evaluating its impact and effectiveness
is based on the false perception that technology as such is a panacea
for all problems. Technology can make a significant contribution
to the promotion of public interests only by ensuring a careful
balance of all interests at stake and by carrying out an in-depth
assessment of the risks posed to human rights and fundamental freedoms
in a democratic society.
8. Interoperability
79. An important aspect to consider is communication between apps
and the implementation of Covid-19 apps that operate across different
countries. This is key to containing the pandemic and to preparing
for future threats.
80. Within the European Union, interoperability guidelines for
CTAs were adopted by consensus by the eHealth Network in May 2020.
The European Commission set up an
EU-wide system to ensure interoperability, a so-called “gateway”,
ensuring that apps work across borders.
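Conceptually, the gateway can be pictured as a relay between national backends, as in the simplified sketch below (an illustration of the idea only, not the actual European Federation Gateway Service or its interfaces): each country pushes the diagnosis keys uploaded by its users and pulls the keys published by the other participating countries, so that national apps can also match cross-border contacts.

```python
# Conceptual sketch of a cross-border "gateway" relaying diagnosis keys
# between national backends; an illustration, not the real EU gateway.
class Gateway:
    def __init__(self):
        self.keys_by_country = {}  # country code -> list of uploaded diagnosis keys

    def push(self, country, keys):
        self.keys_by_country.setdefault(country, []).extend(keys)

    def pull(self, requesting_country):
        # Return keys published by all other participating countries.
        return [k for c, keys in self.keys_by_country.items()
                if c != requesting_country for k in keys]

gateway = Gateway()
gateway.push("DE", ["key-de-1", "key-de-2"])  # hypothetical key identifiers
gateway.push("IE", ["key-ie-1"])
gateway.push("IT", ["key-it-1"])
print(gateway.pull("IE"))  # Irish backend receives the German and Italian keys
```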
81. Three national apps (Germany, Ireland, and Italy) were first
linked in 2020. In total, 20 apps are based on decentralised systems
and can become interoperable, and I was recently in touch with
European Union officials to understand the state of play. Two important
studies have recently been procured by the European Commission.
82. The first one on “Lessons learned, best practices and epidemiological
impact of the common European Union approach on digital contact
tracing apps” has three objectives:
- to provide an up-to-date and comprehensive overview of
the approach and lessons learned regarding EU-level actions on cross-border
interoperability, co-ordination, implementation and epidemiological impact
of digital contact tracing;
- to propose a monitoring framework and methodology to gather
and evaluate evidence on the use and performance of digital proximity
tracing solutions in the European Union;
- to provide an up-to-date and comprehensive assessment
of the impact of digital contact tracing across the European Union
member States based on the developed monitoring framework and methodology.
83. This study is conducted in close co-operation with all European
Union member States that developed and used national digital contact
tracing apps during the pandemic and results are expected by the
end of 2022.
84. The European Commission is also analysing cross-border contact
tracing via a “Feasibility study on whether the contact tracing
tools and applications used at national and EU level could be integrated
and interoperable within Early Warning and Response System (EWRS),
selective exchange module”. This is not specifically on digital
proximity tracing apps and has two main objectives:
- the assessment of the lessons
learnt with the Covid-19 large-scale cross-border contact tracing
practice at national, EU and international levels;
- the assessment of the feasibility of the linkage between
the contact tracing applications data with the Early Warning and
Response System.
85. The study will analyse the benefits, difficulties and legal
barriers, as well as the parameters required to support the integration
within the Early Warning and Response System (EWRS) and the selective
exchange module. It will identify new features and structures to
ensure that the different applications for digital contact tracing
become interoperable, thereby improving the effectiveness of national
and international contact tracing efforts.
86. I believe that our Assembly should emphasise the need for
developing co-ordinated solutions at international level to promote
safe international travel and global control of the Covid-19 pandemic
as well as future threats to public health.
9. Concluding
remarks and recommendations
87. CTAs are relatively new and largely untested in many countries.
Technology has the power to amplify society’s efforts to tackle
complex problems. However, governments and private entities are
also capable of deploying harmful tracking technologies. The word
“crisis” must not be used as a pretext to limit people’s freedoms
through surveillance.
88. The Council of Europe 2020 Data Protection Report also highlighted
that by adopting widely diverging systems, countries have limited
the efficiency of the measures taken and the influence they could
have exercised on actors in the digital market. According to its
findings, very few applications used in States Parties to Convention
108 stood the lawfulness test.
89. Governments should be encouraged to evaluate the technology
retrospectively and to monitor its implementation and compliance with
data protection standards. The collection and processing of personal
and health data should be justified by legitimate public health
objectives and be suitable and proportionate to achieving the intended
goal.
90. Users’ trust in new technology is instrumental to the level
of adoption and adherence, and thus the effectiveness of the system.
The lack of citizens’ involvement in the debate may explain the
ineffectiveness of CTAs and the low adoption rates of the available
applications.
91. Data gathered via these applications should not be further disseminated
or made accessible to third parties that are not involved
in public health management, such as other government departments
or agencies, private companies, etc. Data collection and processing
should be transparent, and concise, reader-friendly information
on the purpose of data collection, data storage and sharing should
be easily available. Decisions on downloading and using applications
should remain voluntary and respect personal autonomy, also to avoid discrimination
due to the digital divide.
92. Monitoring and surveillance should be temporary and only be
pursued to tackle a crisis. Last but not least, data protection
authorities should be involved in the development, oversight, and
audit of digital contact tracing systems.
93. At the same time, the ultimate goal of this technology must
be to prevent forward transmission and break the chains of infection.
Adoption of this technology via smartphones and adherence to its
recommendations (such as notifications to others when testing positive
or testing after receiving a notification) are essential to its effectiveness
and the higher the levels of adoption and adherence, the more effectively
CTAs can break the chains of infection.
94. In 2020, the Chairperson of the Committee of Convention 108
and the Data Protection Commissioner of the Council of Europe stressed
that large-scale personal data processing can only be performed
when, on the basis of scientific evidence, the potential public
health benefits of such digital epidemic surveillance override the
benefits of alternative solutions.
95. To date, substantial scientific evidence of CTAs’ impact and
effectiveness remains relatively limited, and a key point of my report
is that a strict interpretation of data protection standards could
be an obstacle to gathering such evidence.
96. CTAs using the Google/Apple Exposure Notification framework
are designed in such a way that they do not collect identifiable
health data, at least not without explicit consent. On the other
hand, manual contact tracing (testing centres, etc.) may lead to
the accumulation of sensitive health-related information, which
cannot be shared with third parties, including the scientific community,
without consent, or only in an aggregated, anonymised way.
These data are vital for CTA effectiveness analyses.
97. CTAs with limited data collection which are based on decentralised
systems to protect privacy, as most European systems are, may hinder
the ability of governments to analyse aggregated data, including
user demographics, temporal and spatial trends, and the public health
impact of CTA usage and exposure notifications. Contact tracing
and testing datasets cannot be processed and combined without citizens’
consent.
98. As already stated, effective CTAs drive the adoption
of this technology and adherence to its recommendations. In turn,
higher adoption and adherence rates lead to better CTA performance;
a timely, honest and accurate assessment of CTAs’ public
health impact is therefore a key prerequisite of an effective public
health policy.
99. A continuous adaptation to changing circumstances and quality
improvement of public health processes and interventions are essential
to public health. In particular, CTAs must respond to an evolving
situation, taking into account the changing transmission and immune-evasion
properties of a virus.
100. Easing the tension between data protection standards and health
impact assessments would not only help fight the current pandemic
but also design future technology aimed at tackling other health
crises.
101. When technology is designed and implemented under time pressure,
without full knowledge of its real-world effects and health impact,
data protection standards should be considered an advantage in conditions
of uncertainty such as a pandemic. However, those standards must
be interpreted in a way that allows for detailed data collection
via the technology itself or via the healthcare system, especially
in times of global health crises.
102. Building on these considerations, I have drawn up a draft resolution,
including a series of actions which our governments should take
concerning the use of CTAs or similar future technology. In particular,
CTAs should be part of a comprehensive national epidemiologic strategy,
remain voluntary and safeguard privacy. Public authorities should
be proactive in delivering accurate information and raising citizens’
awareness of the benefits of these tools and their proper use, provide
strict guarantees for users’ right to privacy, build trust and
ensure that the effectiveness of CTAs is properly assessed.
103. Governments have started to turn their attention to medium
and longer-term reforms, focusing on building capacity to anticipate
and manage current and future crises such as new pandemics or health
threats and climate shocks.
This report is meant to be a contribution
to the governments’ reflection on what could help in preparing a
long-term response in the field of CTAs.