1. Introduction
1.1. Preparation of
this report
1. Having tabled the motion on
privacy
and the management of private information on the Internet and other online
media,
I was appointed rapporteur by the
Committee on Culture, Science and Education on 8 December 2009.
Having been Slovenian Minister for Culture from 2000 to 2004, media
policy and new media are subjects close to my heart.
2. The Sub-Committee on the Media organised the Council of Europe’s
Open Forum on Privacy and Internet Freedom at the United Nations
Internet Governance Forum in Vilnius (Lithuania) on 15 September 2010.
This Open Forum allowed invited experts as well as all Forum stakeholders
to express their views on this subject at the beginning of the preparation
of my report.
3. I am particularly grateful for the thematic contributions
by Ms Maud de Boer-Buquicchio, Deputy Secretary General of the Council
of Europe; Ms Catherine Pozzo di Borgo, Vice-Chair of the Consultative Committee
of the Convention for the Protection of Individuals with regard
to Automatic Processing of Personal Data (ETS No. 108, hereafter
“Convention No. 108”) and Deputy Government Commissioner to the
National Commission on Information Technologies and Liberties (CNIL,
Paris); Mr Richard Allan, Director of European Public Policy with
Facebook (London); Ms Katitza Rodríguez, International Rights Director
at the Electronic Frontier Foundation and Member of the Advisory
Board of Privacy International (San Francisco); and Professor Peng
Hwa Ang from the Nanyang Technological University in Singapore.
4. Professor Cécile de Terwangne and Professor Jean-Noël Colin
from the University of Namur (Belgium) were commissioned to prepare,
along the thematic lines agreed in consultation with me, a background
report on the legal and technical challenges to privacy and data
protection in today’s cyberspace. They presented their joint report
to the Committee on Culture, Science and Education in Paris on 18
March 2011. This explanatory memorandum relies largely on the background
report, and I am very grateful to its authors.
5. On 18 March 2011, the committee also organised a hearing on
this subject with Professor de Terwangne and Professor Colin, as
well as Ms Pozzo di Borgo speaking on behalf of the Consultative
Committee of Convention No. 108; Mr Michael Donohue, Senior Policy
Analyst, Information Security and Privacy, OECD; and Mr Olivier
Matter, Lawyer, European and International Affairs, National Commission
on Information Technologies and Liberties (CNIL), Paris. The presentations
made during the hearing have enriched my report.
6. Representatives of the European Commission and the International
Chamber of Commerce were unable to attend the hearing on 18 March
2011. I therefore contacted these two institutions, sending them
the preliminary draft resolution for their comments.
7. The Data Protection Commissioner of the Council of Europe,
Dr Karel Neuwirt, also provided an opinion on the draft report.
I am grateful for this supportive opinion and took account of his
suggestions in my final report.
8. In view of the strong international initiatives taken by independent
data protection authorities, I also sent the preliminary draft resolution
to CNIL and to the Federal Institute of Access to Information and
Data Protection of Mexico, which will host the 2011 Conference of
Data Protection and Privacy Commissioners.
9. I thank everybody who helped with this report, and I hope
it will be useful for achieving common standards on the protection
of privacy and personal data in Europe and beyond in the age of
cyberspace.
1.2. Basic concepts
10. The spectacular growth in information and communication
technologies (ICT) has offered considerable opportunities and multiple
benefits. Communication networks, especially the Internet, have
enabled previously unimaginable services to be introduced, while
the efficiency and accessibility of existing services have been improved.
11. However, use of these technologies also presents new dangers
for privacy and individual freedoms. There are many dark sides to
the fate of personal data on the Internet: data gathered without
an individual’s knowledge, data reused for unacknowledged purposes,
data kept for months or even years, data passed on to third parties,
confidential data circulated and so on. Individuals using the Internet
and the whole range of online services that now exist have to a
large extent lost control of their personal information. They do
not know what happens to it and have no means of checking from afar
who accesses it. A string of Internet and new media stakeholders,
on the other hand, are familiar with their tastes, interests, movements,
the places they frequent, the people they associate with, etc. These
facts call into question the right to privacy and to data protection.
12. Privacy, in this context, should be understood not in the
traditional sense of a private sphere to be protected and containing
a set of private, or even confidential, information to be kept secret
but rather as the right to self-determination and autonomy and an
individual’s capacity to make existential choices.
Here
we are talking, more specifically, about informational self-determination,
that is, individuals’ rights to “know what is known about them”,
be aware of information stored about them, control how it is communicated
and prevent its abuse. Privacy is thus not confined to a pursuit
of confidentiality; it is an individual’s control over his or her image
in terms of information.
13. Data protection derives from the right to privacy via the
associated right to self-determination. Every individual has the
right to control his or her own data, whether private, public or
professional.
2. Protection
of privacy and personal data in Europe
2.1. Legal standards
14. This section looks first at the binding legal provisions
adopted by the Council of Europe. Next, reference is made to the
International Covenant on Civil and Political Rights (ICCPR) as
the only binding universal instrument in the field of privacy. Lastly,
European Union provisions are considered, as 27 of the 47 Council
of Europe member states are also member states of the European Union.
The same approach is used for the list of policy standards below.
2.1.1. Article 8 of the
European Convention on Human Rights
15. Article 8 of the European Convention on Human Rights
(ETS No. 5, hereafter “the Convention”) guarantees everyone the
right to respect for his or her private and family life. Exceptions
to this right are allowed if they are in accordance with the law
and necessary in a democratic society (that is, they respect the
principle of proportionality laid down in the case law of the European
Court of Human Rights (“the Court”)) in order to protect the legitimate
interests listed in Article 8, paragraph 2.
16. The Court has expressly extended the scope of privacy to cover
data protection, thus signalling that protection of personal data
is fundamental to the right to respect for private life enshrined
in Article 8. The Court holds that Article 8 requires domestic law
to afford appropriate safeguards to prevent any misuse or abuse
of personal data. Domestic law must also ensure that such data are
relevant and not excessive in relation to the purposes for which
they are stored and that they are preserved in a form that permits
identification of the data subjects for no longer than is required
for the purposes for which those data are stored.
2.1.2. Convention No.
108
17. Born out of concern to strengthen the protection
of privacy and other personal rights in the context of developments
in the field of information technology, the Convention for the Protection
of Individuals with regard to Automatic Processing of Personal Data
was adopted on 28 January 1981.
18. This convention contains the basic principles of data protection,
principles which have been adopted by most national and international
instruments in this field and are still relevant today, even if
they probably need to be supplemented. They are as follows:
- fair and lawful data collection;
- specified purpose (data stored for specified and legitimate
purposes and not used in a way incompatible with those purposes);
- data quality (relevant, appropriate, up to date, stored
for a limited period);
- special arrangements for sensitive data;
- data security;
- right of access, rectification and remedy;
- possible exceptions for the sake of paramount public or
private interests.
19. In 2001, Convention No. 108 was supplemented by an Additional
Protocol regarding supervisory authorities and trans-border data
flows (ETS No. 181).
20. The Consultative Committee established under Convention No.
108 is currently examining possible legal loopholes in the convention
caused by the rapid technological progress of ICT worldwide. The
analytical report prepared for the Consultative Committee by Dr
Jean-Marc Dinant, Professor de Terwangne and Mr Jean-Philippe Moiny
of the University of Namur (Belgium) focuses on technological challenges
such as geo-localisation, cookies and traceability with regard to
the legal standards under Convention No. 108.
21. Until March 2011, the Consultative Committee also conducted
a public consultation on the possible modernisation of Convention
No. 108.
2.1.3. Convention on Cybercrime
22. The Convention on Cybercrime (ETS No. 185) of 23
November 2001 was drawn up by the Council of Europe but is open
for signature by any state in the world.
It requires states parties to establish
as criminal offences the breaching of data confidentiality through unauthorised
access or illegal interception, interference with the integrity
of data through their alteration or suppression, and interference
with the integrity of the system. States parties also have to establish
the offences of computer-related forgery and computer-related fraud
in order to prevent malicious tampering with data.
23. In addition, parties must enable their authorities to obtain
expedited preservation of data, including traffic data, in order
to make them available for investigations. A signatory state can
be required to preserve and disclose data under mutual assistance
arrangements.
2.1.4. Convention on Human
Rights and Biomedicine
24. Personal health data are among the most sensitive
personal data. The protection of privacy and personal data is therefore
regulated in Article 10 of the Convention on Human Rights and Biomedicine
(ETS No. 164) and Article 16 of the Additional Protocol to this
convention concerning Genetic Testing for Health Purposes (CETS
No. 203).
25. The right of everyone to the protection of personal health
data must include the right to be informed of, and to consent to
or not, any collection and processing of such data.
2.1.5. Article 17 of the
International Covenant on Civil and Political Rights
26. Article 17 of the International Covenant on Civil
and Political Rights, signed in New York on 16 December 1966, provides
that: “(1) No one shall be subjected to arbitrary or unlawful interference
with his privacy, family, home or correspondence, nor to unlawful
attacks on his honour and reputation. (2) Everyone has the right
to the protection of the law against such interference or attacks.”
This is the only binding provision protecting privacy at global
level.
2.1.6. Articles 7 and
8 of the European Union Charter of Fundamental Rights
27. The Charter of Fundamental Rights of the European
Union
has been legally binding since
the Treaty of Lisbon came into force on 1 December 2009. While Article 7 of this Charter
enshrines the right to privacy in the usual terms, Article 8 departs
from tradition by securing a separate right to protection of personal
data within the general catalogue of fundamental rights. Article
8 provides that everyone has the right to the protection of personal
data concerning himself or herself; the data must be processed fairly
for specified purposes and on a legitimate basis (consent or some
other basis laid down by law); and everyone has the right of access
to his or her data and the right to rectify them. Compliance with
these rules must be subject to control by an independent authority.
2.1.7. European Union
Data Protection Directive 95/46/EC
28. Directive 95/46/EC of 24 October 1995 on the protection
of individuals with regard to the processing of personal data and
on the free movement of such data
took up, in greater
detail, the principles contained in Convention No. 108. However,
it offers improved protection rules on a variety of points. It has
established criteria for making data processing legitimate. The
catalogue of the data subject’s rights has been extended. Right
of access includes the right to know the source of the data and
the logic used to process them. The directive establishes the right
to object to processing of personal data and the right not to be
subject to wholly automated decisions. In addition, the controller
is required to give certain information to the data subject. Currently,
the European Union data protection law is undergoing a reform.
2.1.8. European Union
Privacy and Electronic Communications Directive 2002/58/EC
29. Directive 2002/58/EC of 12 July 2002 concerning the
processing of personal data and the protection of privacy in the
electronic communications sector
is a specific directive
complementing the general directive (95/46/EC) for regulating data
protection in the electronic communications sector. It lays down
an obligation to ensure confidentiality of electronic communications
as well as traffic data and location data, with some exceptions.
It introduces a duty to ensure data security, now coupled with a
requirement to notify of any serious risks to data. However, this
requirement applies only to providers of publicly available electronic communications
services. The directive also deals with the use of cookies and sending
of spam.
2.1.9. European Union
Data Retention Directive 2006/24/EC
30. Directive 2006/24/EC of 15 March 2006 on data retention
requires providers of communication services (Internet, fixed and
mobile telephony, fax) to retain systematically everyone’s traffic
and location data for periods of between six months and two years.
This directive is currently
being revised.
31. Data retention and access to such data by law enforcement
authorities has become a politically important aspect in fighting
crime and terrorism. In this respect, the Convention on Laundering,
Search, Seizure and Confiscation of the Proceeds from Crime and
on the Financing of Terrorism (CETS No. 198) and the Convention
on Cybercrime could serve as international legal references, as
well as the Convention on Mutual Administrative Assistance in Tax
Matters (ETS No. 127 and CETS No. 208).
2.2. Policy standards
Assembly Resolution 428 (1970) containing a declaration on mass communication media
and human rights
32. In its
Resolution
428 (1970) containing a declaration on mass communication media
and human rights, the Parliamentary Assembly established
more than 40 years ago that, “where regional, national or international
computer-data banks are instituted, the individual must not become
completely exposed and transparent by the accumulation of information
referring even to his (or her) private life. Data banks should be restricted
to the necessary minimum of information required”.
33. This resolution followed up
Recommendation 509 (1968) on human rights and modern scientific and technological
developments, which had called on the Committee of Ministers “to
study and report on the question whether, having regard to Article
8 of the [European] Convention on Human Rights, the national legislation
in the member States adequately protects the right to privacy against
violations which may be committed by the use of modern scientific
and technical methods”.
34. The resolution was adopted together with
Recommendation 582 (1970) on mass communication media and human rights, which
reaffirmed the earlier appeal by recommending that the Committee
of Ministers consider “the establishment of an agreed interpretation
of the right to privacy provided for in Article 8 of the European
Convention on Human Rights, by the conclusion of a protocol or otherwise,
so as to make it clear that this right is effectively protected
against interference not only by public authorities but also by
private persons or the mass media”.
Assembly Resolution 1165 (1998) on the right to privacy
35. In its “Declaration on mass communication media and
human rights”, in
Resolution
428 (1970), the Assembly defined the right to privacy as “the right
to live one’s own life with a minimum of interference”. Almost 30
years later, the Assembly specified in
Resolution 1165 (1998) that, “in view of the new communication technologies
which make it possible to store and use personal data, the right
to control one’s own data should be added to this definition”.
36. The 1998 resolution contains guidelines intended to supplement
national privacy provisions and covering various legal proceedings
and penalties to be made available to individuals whose privacy
has been infringed.
Assembly Resolution 1797 (2011) on the need for a global consideration of the human
rights implications of biometrics
37. The Assembly recently adopted
Resolution 1797 (2011) on the need for a global consideration of the human
rights implications of biometrics, which calls on member states
to,
inter alia, “promote proportionality
in dealing with biometric data, in particular by limiting their
evaluation, processing and storage to cases of clear necessity,
namely when the gain in security or in the protection of public
health or of the rights of others clearly outweighs a possible interference
with human rights and if the use of other, less intrusive techniques
does not suffice”.
Committee of Ministers Resolution
(73) 22 on the protection of privacy of individuals vis-à-vis electronic
data banks in the private sector
38. The Committee of Ministers developed the first set
of political principles through its Resolution (73) 22 on the protection
of privacy of individuals vis-à-vis electronic data banks in the
private sector.
Through the adoption of this resolution,
on 26 September 1973, the Committee of Ministers considered that
it was urgent, pending the possible elaboration of an international
agreement, to take steps to prevent further divergences between
the laws of member states in this field.
39. Resolution (73) 22 defined, inter
alia, that “information relating to the intimate private
life of persons or information which might lead to unfair discrimination
should not be recorded or, if recorded, should not be disseminated;
rules should be laid down to specify the periods beyond which certain
categories of information should no longer be kept or used; without
appropriate authorisation, information should not be used for purposes
other than those for which it has been stored, nor communicated
to third parties; the person concerned should have the right to
know the information stored about him, the purpose for which it
has been recorded, and particulars of each release of this information;
every care should be taken to correct inaccurate information and
to erase obsolete information or information obtained in an unlawful
way; electronic data banks should be equipped with security systems
which bar access to the data held by them to persons not entitled
to obtain such information, and which provide for the detection
of misdirection of information, whether intentional or not”. Such
standards seem still pertinent in the current ICT era.
Committee of Ministers Resolution
(74) 29 on the protection of privacy of individuals vis-à-vis electronic
data banks in the public sector
40. Resolution (73) 22 was complemented a year later
by Resolution (74) 29 on the protection of privacy of individuals
vis-à-vis electronic data banks in the public sector. Besides a
comparable set of standards, Resolution (74) 29 stated that, “especially
when electronic data banks process information relating to the intimate
private life of individuals or when the processing of information
might lead to unfair discrimination, (a) their existence must have
been provided for by law, or by special regulation or have been
made public in a statement or document, in accordance with the legal
system of each member state; (b) such law, regulation, statement
or document must clearly state the purpose of storage and use of
such information, as well as the conditions under which it may be
communicated either within the public administration or to private
persons or bodies; (c) that data stored must not be used for purposes
other than those which have been defined unless exception is explicitly
permitted by law, is granted by a competent authority or the rules
for the use of the electronic data bank are amended”.
Committee of Ministers Recommendation
No. R (99) 5 for the protection of privacy on the Internet
41. Recommendation (99) 5 is aimed at Internet service
users and providers. It contains “guidelines for the protection
of individuals with regard to the collection and processing of personal
data on information highways” to be incorporated in codes of conduct.
These guidelines set out principles of fair practice for privacy
and data protection in Internet communications and exchanges.
Committee of Ministers Recommendation
CM/Rec(2010)13 on the protection of individuals with regard to automatic
processing of personal data in the context of profiling
42. Adopted on 23 November 2010, Recommendation CM/Rec(2010)13
proposes supervision of the widespread phenomenon of profiling (see
sections 3.2 and 4.2 below). The appendix to the recommendation contains
principles that should ensure fair and lawful profiling. A list
of cases in which profiling is lawful is provided. The controller
is required to limit the risk of error, take security measures and
provide data subjects with information about its profiling
activities. With certain exceptions, individuals are entitled to
access their data, correct them, be informed of the purpose of the
profiling and the logic used to attribute their profiles and, last
but not least, object to use of their data or to a decision taken
solely on the basis of profiling.
Resolution No. 3 of the European
Ministers of Justice on data protection and privacy in the third
millennium
43. In Resolution No. 3, adopted on 26 November 2010
at their 30th ministerial conference in Istanbul, the Council of
Europe Ministers of Justice show their support for bringing Convention
No. 108 up to date in order to find ways of guaranteeing protection
of personal rights in the face of the new challenges posed by technology and
the globalisation of information. This updating should meet ministers’
concerns with regard to issues such as transparency, effective exercise
of rights, data security breaches, jurisdiction and applicable law
in respect of virtual and trans-border relationships (for example,
cloud computing and social networks) and liability.
44. The ministers note that Convention No. 108 is currently the
only potentially universal binding legal instrument in the field
of data protection. It could therefore become the universal instrument
demanded by national data protection authorities. The ministers
consequently invite parties outside the Council of Europe to take
part in the updating process.
United Nations guidelines concerning
computerised personal data files, adopted by the United Nations
General Assembly on 14 December 1990
45. The United Nations General Assembly adopted on 14
December 1990 guidelines concerning computerised personal data files.
46. These guidelines are the only political standard established
by the United Nations since the entry into force of Article 17 of
the ICCPR. They state that information about persons should not
be used “for ends contrary to the purposes and principles of the
Charter of the United Nations”.
OECD Guidelines on the Protection
of Privacy and Transborder Flows of Personal Data of 1980
47. The OECD Guidelines of 1980
contain what are known as “fair
information principles”. These basic principles of data protection
are almost identical to those in Convention No. 108. But, unlike
the latter, they are not legally binding.
48. The OECD is currently reviewing its data protection guidelines
with a view to modernising them. Most of the OECD member states
are legally bound by European Union law and/or Convention No. 108.
Madrid Resolution of 2009 adopted
by data protection and privacy commissioners
49. The 2009 Madrid Resolution
was the result of joint work by
data protection authorities in 50 countries under the leadership
of the Spanish Data Protection Agency. It is intended to provide
a model incorporating universal standards of data protection, thus
covering data protection values and principles that are safeguarded
on five continents.
50. In addition to standard aspects of data protection, this resolution
contains new elements such as proactive measures (procedures to
detect and prevent security breaches, appointment of a data protection officer,
privacy impact assessments, etc.) and the accountability principle,
which calls for internal arrangements that can be used to show that
the controller is complying with data protection rules.
Jerusalem Resolution of 2010
adopted by data protection and privacy commissioners
51. At their 32nd International Conference in Jerusalem,
from 27 to 29 October 2010, data protection authorities adopted
a resolution calling for the organisation of an intergovernmental
conference with a view to developing a binding international instrument
on privacy and the protection of personal data.
52. The 30th conference of the Council of Europe Ministers of
Justice responded to this initiative in Resolution No. 3 on data
protection and privacy in the third millennium (see above),
which invited non-European states to accede to Convention No. 108
and supported its modernisation.
3. Technology-related
challenges to privacy and data protection
53. Ever greater computing power and storage capacity
and ever-increasing connectivity are making possible the development
of new technologies and applications that constitute a genuine challenge
to privacy and data protection. They often entail bulk collection
of personal data on citizens, online buyers, social network users
and so on, sometimes without their knowledge, and an increasingly
common use of identifiers (such as IP addresses, RFID tag identifiers
and session identifiers) enabling a user to be linked with his or
her actions, geographical position or personal data. Moreover, these
data can be analysed and correlated to infer further information,
for profiling purposes, for example. Lastly, the storage and passing
on of collected or inferred data increasingly tend to be beyond
the control of the data subject, who is powerless with regard to
the use, and sometimes misuse, of these data.
54. The consequence is an increased risk of data leakage and tracing
of individuals, thus infringing their privacy. It is therefore necessary
to look at a number of emerging or changing technologies, their
effects and operation, as well as the potential dangers that they
pose to privacy and data protection.
55. The European Commission commissioned a comparative study on
different approaches to new privacy challenges in the light of technological
developments, which was presented by its authors LRDP Kantor Ltd. (Cambridge/London/Oxford)
on 20 January 2010. This study proposed some modernisation of the
European Union
Directive
95/46/EC on the protection of individuals with regard to
the processing of personal data and on the free movement of such
data.
3.1. Convergence of
communication systems
56. Changes in communication systems and information-sharing/information-delivery
services are leading to ever greater convergence between these various
systems, resulting in less and less transparency regarding the actual
tools used and, above all, a loss of control over the dissemination
of information that circulates or is aggregated, reformatted or
forwarded.
57. Thus, the telephone, equipped with computing power and storage
capacity, has become “smart” (smart phones); a computer can be used
to telephone; videoconferencing is available on MP3 players; behind
a fax number may lie an e-mail address; and calls to a mobile phone
can be rerouted to a landline before arriving in the voice mailbox
of a VoIP (Voice over Internet Protocol) service consulted on a
PC. These examples show how very complicated it is for a user to
determine the type of communication system being used and especially where
the data sent or received are going to or coming from.
58. Mention may also be made here of developments such as Microsoft’s
Outlook Social Connector, which can show a person receiving an e-mail
the sender’s Facebook status. This illustrates the ever-increasing
blurring of hitherto clearly separate spheres and the risk of unwanted
disclosure of data that this entails.
3.2. Examples of new
technologies
Geo-location
59. Increasingly sophisticated and accurate methods exist
to establish a user’s geographical position, whether directly through
data from his or her terminal (using a GPS chip, increasingly common
in mobile phones) or through the network to which the user is connected
(by triangulating GSM stations or using databases containing the
location of WiFi networks – consider the data collected by Google
Street View cars in this connection).
60. Electronic management of public transport also makes it possible
to track users’ movements, for example, when they swipe their travel
cards. A user’s position is sometimes stored or notified to third
parties without informing the user or obtaining his or her consent,
opening the way to movement tracking, profiling of absences from
home, etc. This calls into question the freedom to move around anonymously.
61. Even more perniciously, geo-location data from photographs
(such as those taken with a mobile phone), combined with face recognition
technologies as found in software such as Apple iPhoto® and Google
Picasa®, enable the location of a person in a photo to be determined
without his or her knowledge.
User traceability
62. Contrary to popular belief, browsing the Web leaves far
more traces than our movements and actions in real life do.
Surfing the Internet leaves various parties with a record of what
has been done (IP address, service provider, page from which the user
has come, browsing history, etc.). Tools such as IPv6 addressing
and cookies (see below) make it possible to identify individual
computers and therefore their users. Unlike in real life, there
is no question of strolling along the information highways, visiting
virtual shops, reading a newspaper or showing an interest in an
advertisement without this being known. One has to wonder about
such constant transparency, which would surely not be accepted in
the real world.
IPv6 addressing
63. Because of the proliferation of systems connected
to the Internet, the address space defined by the IPv4 standard
has
been exhausted, jeopardising the growth of the web. The IPv6 standard
has been created in response, providing a much larger number of
separate addresses.
By
way of illustration, IPv6 would allow every individual on earth
to have several dozen billions of addresses for his or her own personal
use.
64. An IPv6 address can be allocated to a device in a variety
of ways, one of which uses the device’s physical address (MAC address)
to generate an IPv6 address, thus linking traffic to the device
and even directing it to an individual. Other methods avoid this
situation by generating addresses semi-randomly or using an address
server that assigns them automatically.
65. Whether or not an IPv6 address can be used for identification
will therefore depend on either the default settings of the system
used or the expertise of the user.
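The MAC-based derivation described above, known as the modified EUI-64 scheme, can be sketched in a few lines of Python; the MAC address used here is invented for illustration:

```python
# A minimal sketch of the modified EUI-64 scheme: a device's fixed 48-bit
# MAC address is expanded into the 64-bit IPv6 interface identifier,
# which is why such addresses can link traffic back to a specific device.

def mac_to_eui64_suffix(mac: str) -> str:
    """Derive the IPv6 interface identifier from a MAC address."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                               # flip the universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]  # insert FF:FE in the middle
    groups = [f"{eui64[i] << 8 | eui64[i + 1]:x}" for i in range(0, 8, 2)]
    return ":".join(groups)

# The same MAC always yields the same identifier, whatever network
# prefix the device roams to - hence the traceability concern.
print(mac_to_eui64_suffix("00:1a:2b:3c:4d:5e"))  # -> 21a:2bff:fe3c:4d5e
```

The semi-random and server-assigned methods mentioned in paragraph 64 exist precisely to break this stable link between device and address.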
Cookies
66. The arrangements concerning cookies are defined by the web browsing protocol (HTTP). They enable a web server to send a user’s browser a series of data, which the browser will send back (only to that particular site) upon subsequent visits. A cookie has a limited lifespan, determined either by the closing of the browser or by an expiry date. Cookies are thus stored locally by the browser, usually on the user’s hard disk.
67. Cookies are used by web servers for session management and
personalisation, but they can also be used for tracing. Moreover,
it should be noted that, upon visiting a site, a browser may receive
cookies from third-party sites because the original site includes
content from those third-party sites. This technology is often used
for audience monitoring and profiling for advertising purposes.
68. Although the most common browsers allow users to manage and
even block cookies, these functions are seldom used, either because
of ignorance or simply because blocking cookies would make Internet browsing
unfeasible.
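The mechanism can be illustrated with Python’s standard http.cookies module, which parses the same Set-Cookie headers a browser receives; the header value below is invented for the example.

```python
from http.cookies import SimpleCookie

# A server response carries a Set-Cookie header; the browser stores the
# cookie and returns it on every later request to the same site.
header = "uid=abc123; Expires=Wed, 01 Jan 2031 00:00:00 GMT; Path=/"

jar = SimpleCookie()
jar.load(header)

print(jar["uid"].value)        # the identifier the site will see again
print(jar["uid"]["expires"])   # the cookie's lifespan
```

The long expiry date is what makes such an identifier suitable for tracking a user across months of visits.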
Smart grids
69. Power distribution grids are now migrating towards
a smart model (smart grids) that incorporates information technology
in order to optimise generation and distribution, the aim being
to match generation and consumption as closely as possible, thus
leading to energy savings, avoidance of power cuts, etc. A smart
grid is based on smart meters fitted with sensors and linked through
a network to a system that collects, collates and analyses consumption
data.
70. Smart meters send the operator real-time consumption data,
enabling consumers to be profiled: absence or presence in the building,
use of appliances with an energy signature and so on.
71. Moreover, within the building itself, appliances can also
be connected to the smart meter, not only reporting consumption
instantaneously but also allowing the meter to influence this consumption,
for example, by automatically adjusting the temperature of a thermostat
or turning off air-conditioning at times of peak consumption.
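A trivial sketch, with invented meter readings and an assumed standby threshold, shows how easily presence in a building can be inferred from raw consumption data:

```python
# Hypothetical smart-meter readings in watts, one per 15-minute interval
readings = [120, 130, 1800, 2100, 125, 2500, 140, 135]
STANDBY_THRESHOLD = 300  # assumed baseline load of the empty building

# Any reading well above the baseline suggests someone is at home
occupied = [watts > STANDBY_THRESHOLD for watts in readings]
print(occupied)  # a coarse presence profile of the household
```

Real profiling goes much further, matching the distinctive consumption signatures of individual appliances, but even this crude comparison already reveals a household’s daily rhythm.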
72. Here again we are witnessing bulk collection of data that
can be linked to a person or group of people, enabling their attributes
and behaviour to be pinpointed. The risk of uncontrolled disclosure
is even greater when this information is collected by third parties,
as in the case of Google’s PowerMeter system.
RFID and the Internet of Things
73. RFID (radio frequency identification) is an identification
technology relying on three components:
- a tag, which is attached to or incorporated in the object
to be identified;
- a reader, used to read the tag when it is within range;
- an information system, which receives data from the reader
and processes it.
74. The tag consists of an antenna and an electronic chip containing, as a minimum, an identifier. When a tag is read by a reader (using electromagnetic waves), it sends the reader its identifier. The tag structure is very simple, allowing mass production at a cost of usually just a few pence, which permits use on a massive scale. Contact with the reader is not required to scan the tag; the scanning distance can vary between a few centimetres and a few dozen centimetres, or even further, depending on the type of tag.
75. RFID tags are used for stock and supply control, collecting
road tolls, management of supermarket stocks, check-outs and after-sales
service, luggage tracking at airports and as a means of identifying
animals. In some cases, tags may be implanted in human beings –
for example, to ensure the safety of children or the elderly, or,
on a lighter note, to monitor access to and manage drinks orders
in discotheques. Since the identifier is specific to a particular
tag, the latter’s movements – and therefore the movements of the
person or object to which it is attached – can thus be tracked in
relation to readers. Since tags are read from a distance, users
are not necessarily aware when they are being scanned, which may
lead to data leakage or tracking without their knowledge. Simultaneous
scanning of a large number of tags allows tagged objects or persons
to be identified almost instantaneously in the immediate environment,
thus again facilitating profiling of tagged individuals.
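The tracking risk described above can be sketched as follows; the tag identifiers, reader locations and times are all invented:

```python
from collections import defaultdict

# Hypothetical read events reported by fixed readers: (tag, reader, time)
reads = [
    ("TAG-42", "entrance", "09:01"),
    ("TAG-99", "entrance", "09:02"),
    ("TAG-42", "aisle-3", "09:05"),
    ("TAG-42", "checkout", "09:20"),
]

# Grouping reads by tag identifier yields a movement trace per tag,
# and therefore per person or object carrying it.
traces = defaultdict(list)
for tag, reader, time in reads:
    traces[tag].append((time, reader))

print(traces["TAG-42"])  # the tag's path through the readers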
76. Various technical solutions now exist (and more continue to
be developed) to limit the scope for misuse of RFID technology.
But they often add significantly to the manufacturing cost, hindering
large-scale deployment. Recently, the Article 29 Data Protection
Working Party of the European Union endorsed a Framework Privacy
Impact Assessment for RFID applications.
77. The Internet of Things takes the idea of the Internet and
identification a (big) step further, with the vision of a world
in which everything is interconnected, not just people but also
things. The Internet thus emerges from the strictly virtual world
to include objects from the real physical world using technologies
such as RFID, near field communication (NFC), geo-location and sensor
networks. Here, the connected objects function largely independently,
being able to acquire and transmit information collected through
sensors, process it and interact with users and their environment.
78. Although the Internet of Things is a young field, where scientific
and commercial applications are still in their infancy, it is clearly
based on bulk collection and processing of information, most of
which can be either directly or indirectly linked to individuals
and, by this very fact, present a threat to their privacy.
Web crawlers
79. A Web crawler (also known as a Web spider) is software
written to scan the Web automatically and index content visited,
thus providing data for search engines in order to make searching
more effective and facilitate access to information. It works by
analysing the pages visited, following hyperlinks recursively.
80. Some malicious crawlers analyse pages to harvest e-mail addresses
for spam. Others may also browse pages to aggregate and correlate
data collected and infer further information.
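A minimal crawler over an in-memory set of pages (stand-ins for real HTTP fetches) illustrates the recursive link-following described above; the page names and contents are invented:

```python
from html.parser import HTMLParser

# A toy "web": page name -> HTML body (in reality fetched over HTTP)
PAGES = {
    "/": '<a href="/a">A</a> <a href="/b">B</a>',
    "/a": '<a href="/b">B again</a>',
    "/b": 'no links here',
}

class LinkParser(HTMLParser):
    """Collect the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += [value for name, value in attrs if name == "href"]

def crawl(start):
    seen, frontier = set(), [start]
    while frontier:
        page = frontier.pop()
        if page in seen or page not in PAGES:
            continue
        seen.add(page)                 # "index" the visited page
        parser = LinkParser()
        parser.feed(PAGES[page])
        frontier.extend(parser.links)  # follow hyperlinks recursively
    return seen

print(sorted(crawl("/")))  # every reachable page ends up indexed
```

A malicious crawler works the same way; only the purpose of the indexing step changes, for instance harvesting e-mail addresses instead of recording page content.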
Biometric data
81. Biometric methods, based on an individual’s physiological
attributes such as fingerprints, retina, voiceprint and DNA, are
increasingly being used to authenticate individuals (verify their
identity), whether for electronic payments, border control, access
control, face recognition or other purposes.
82. Biometric data must first be collected before they can be compared with a stored template to confirm identity. This requires storage of a large amount of personal data, some of which is particularly intrusive: DNA, for example, reveals information not only about the individual but also about his or her ancestors and descendants.
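The enrolment and verification steps can be sketched as follows. This is a deliberate simplification: real biometric matching compares extracted feature vectors within a tolerance, whereas exact hashing is used here only to keep the illustration short, and the sample values are invented.

```python
import hashlib

def enrol(sample: bytes) -> str:
    """Store a template derived from the captured biometric sample."""
    return hashlib.sha256(sample).hexdigest()

def verify(sample: bytes, template: str) -> bool:
    """Compare a freshly captured sample against the stored template."""
    return enrol(sample) == template

template = enrol(b"alice-fingerprint-features")
print(verify(b"alice-fingerprint-features", template))  # True
print(verify(b"someone-else", template))                # False
```

Even in this toy form, the privacy issue is visible: the system only works if templates derived from the body are collected and retained somewhere.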
83. The Assembly recently adopted
Resolution 1797 (2011) and
Recommendation
1960 (2011) on the need for a global consideration of the human
rights implications of biometrics, which specifically addressed
the protection of personal data and privacy.
Privacy by design
84. The term “privacy by design” refers to a set of principles
drawn up for the design, development and operation of information
systems in order to ensure that privacy and data protection aspects
are properly taken into account at the design stage and that these
systems therefore comply with statutory and regulatory requirements
in this field.
85. Ms Ann Cavoukian, the Information and Privacy Commissioner of Ontario (Canada), is the person behind this initiative, which is based on respect for the user, transparency regarding data collection and processing, and refusal to compromise by sacrificing privacy to other goals. Its basic principles are proactive security measures; data protection by default (any exception requiring the data subject’s approval); and data protection embedded in information systems rather than added on, extending throughout the life cycle of the data collected.
86. These principles are applicable not just to information technology
but also to business practices and physical infrastructure. There
is a website on this topic,
which, besides providing a general
introduction, illustrates the applicability of the approach through
numerous case studies, thus demonstrating that it is possible to
design efficient systems satisfying business requirements without
necessarily sacrificing data protection.
Cloud computing
87. Cloud computing is a recent paradigm of IT. The term
covers both the services accessed and delivered through the Internet
and the information systems and hardware/software providing these
services.
88. The “cloud” allows considerable flexibility in managing and allocating resources, with an investment model based largely on usage-related billing, and in integrating services within and between organisations, irrespective of geographical location.
89. Various tiers of cloud services are available, the following
being the three main ones:
- Infrastructure
as a Service (IaaS): The services provided are infrastructure services
– principally system hardware and software – and connectivity; infrastructure
management is left to the customer;
- Platform as a Service (PaaS): The services provided take
the form of a platform, consisting of infrastructure but also a
software environment enabling customers to develop or run their
own applications; overall management is therefore shared between
the service provider and the customer;
- Software as a Service (SaaS): The provider here supplies
the customer with a complete applications solution, covering both
infrastructure and application software. Such services are offered,
for example, by salesforce.com for business management and by Google
through Google Mail, Google Documents, Google Calendar, etc.
90. Cloud computing is thus an extension of the security perimeter
towards the Internet, where it is extremely hard to exercise effective
control. The user entrusts data storage to a third party, the service
provider, who hosts the data and processes them in a manner often
unknown to the user. This necessitates a genuine relationship of
trust between user and service provider – trust that may be reinforced
by contractual guarantees. The main challenges concern protecting
cloud data, preserving their integrity and maintaining appropriate
access control.
Deep packet inspection
91. Data circulating on a network are usually transmitted
in the form of packets comprising header and content; the header
contains the information needed to allow the network infrastructure
to deliver the packet to its destination.
92. Filtering of network traffic to authorise transit – done mostly
by firewalls – is based on routing data in the packet header, usually
the message source and destination. Deep packet inspection is based
on content-related criteria, analysing not just the header but also
the body of the message, that is, its content.
93. This method is obviously more costly in terms of time and
resources. It allows information system security to be improved
by detecting and filtering out malicious content. However, it can
also be used for surveillance or censorship purposes.
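The difference between header-based filtering and deep packet inspection can be sketched with toy packets; the addresses, payloads and rules below are invented:

```python
# Toy packets: (source, destination, payload) stand in for real traffic.
packets = [
    ("10.0.0.5", "93.184.216.34", "GET /index.html"),
    ("10.0.0.9", "93.184.216.34", "GET /news.html"),
    ("10.0.0.6", "93.184.216.34", "password=hunter2"),
]

BLOCKED_SOURCES = {"10.0.0.9"}   # header-based rule, as in a firewall
SENSITIVE = ("password=",)       # content-based rule, as in DPI

def header_filter(packet):
    src, dst, payload = packet
    return src not in BLOCKED_SOURCES        # looks only at routing data

def deep_inspect(packet):
    src, dst, payload = packet
    return any(s in payload for s in SENSITIVE)  # reads the content itself

for p in packets:
    print(header_filter(p), deep_inspect(p))
```

The second function is the sensitive one: the same capability that detects malicious content also reads everything else the user transmits.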
4. Usage-related challenges
4.1. Data processing
By government authorities
94. Growth of ICT-based e-government is leading to networking
of government authorities. This change rests mainly on data sharing
between authorities, the creation of lookup files and huge data
warehouses and the interconnection of previously separate databases.
This model raises serious questions concerning privacy. The previous
“silo” model of government, in which each body had its own separate
data for its specific statutory mission, was presented as a safeguard
against an omniscient state for which its citizens would be totally transparent.
“Practical obscurity” was the key to balance in relations between
government and the public. This safeguard has disappeared in the
name of efficiency. Questions now have to be asked about individuals’ control
over the data collected on them, the transparency of data interchange
and the proportionality of processing.
95. Use of unique identifiers for the purposes of interconnection
and transverse access to an individual’s data has further increased
the risks of loss of control and failure to respect proportionality.
Concerns relating to processing of personal data by public authorities
have been heightened by the fact that such processing is used as
the basis for decisions such as granting of a pension, recognition
of special status, assessment for tax purposes or opening of a criminal
investigation.
96. Martin Scheinin, the Special Rapporteur of the United Nations
on the Promotion and Protection of Human Rights and Fundamental
Freedoms while Countering Terrorism, condemns the erosion of privacy
by state counter-terrorism measures in his report to the 13th session
of the United Nations Human Rights Council.
Many states have dramatically expanded
their powers in the name of national security and public safety, including
overt and secret surveillance, the interception of communications
and profiling of persons.
97. The Council of Europe Convention on Access to Official Documents
(CETS No. 205) tries to balance the right to information with the
protection of privacy and personal data.
By commercial bodies
98. Personal data have an economic value. This value
is significant in three respects:
- for providers of Internet-based services: knowing the
profile of Internet users interested in products or services and
being able to break down this interest in detail (web pages visited,
links clicked on, frequency of visits, etc.) allows them to choose
the best way of presenting what they offer;
- for commercial users of databases containing personal
information: gathering data from every possible source allows them
to build up extremely full databases that can be used and sold on
for marketing and mailing;
- for actual operation of the Web: most services offered
on the Internet are free in appearance only. It is user exposure
to advertising that finances them. The business model is based on
marketing. The latter will be all the more profitable if recipients’
profiles are accurate and allow advertisements to be targeted effectively.
99. In each of these three examples, collection and collation
of data to create user profiles have become vital operations. Yet
in all too many cases these operations occur without the knowledge
of the people concerned. They often entail use of data for purposes
other than the original one. And the amount of data collected inevitably
raises the question of proportionality. For example, is it necessary,
or simply customary, for search engines (such as Google) to keep
all the words entered by a particular individual (identified by
a cookie) for months on end? Such a set of words is usually incredibly
revealing of the user’s interests, activities, plans, etc.
100. The National Commission for Information Technologies and Liberties (CNIL) has been monitoring and punishing violations of privacy and data protection in France since 1978. While in the past, 90% of the cases pursued by CNIL involved violations committed by the public sector, today 90% of all cases concern violations by the private sector. The number of cases tripled between 2005 and 2009. In its decision 2011-35 of 17 March 2011, CNIL imposed a fine of 100 000 euros on Google in France for having secretly collected and processed a large amount of personal data derived from Wi-Fi networks and the Google Location Server while its vehicles were taking pictures for Google Maps and Google Street View.
By employers
101. ICT has given employers surveillance tools that would have been unimaginable in the past. Magnetic access cards tell the network operator who is where at what time, whereas ordinary keys gave no such indications. Closed-circuit television (CCTV) allows both visitors and staff to be watched. Staff surveillance also occurs through monitoring of Internet browsing and e-mail use on company systems. For employees working off site, geographical location and tracking systems allow fleets of taxis, breakdown vehicles or other vehicles on the road to be managed remotely and their movements to be monitored in real time.
102. ICT also provides investigative tools. Many employers use the Internet to find information about prospective employees. Google and Facebook, in particular, thus play the part of informers, revealing to a future employer many aspects of an applicant not included in his or her CV.
103. Among well-known privacy violations by employers, the case of Deutsche Telekom seems particularly serious. In 2005 and 2006, top managers of Deutsche Telekom (Bonn, Germany) hired a private detective to investigate alleged information leaks within the company. Deutsche Telekom secretly used its access to the mobile phone and computer data of staff members and journalists. Following the revelation of this scandal by the media in 2008, the German Parliament and the public prosecutor in Bonn started investigations. Deutsche Telekom subsequently established the post of company data protection commissioner and sought damages from the top managers, who had been dismissed in the meantime.
104. The Consultative Committee on Convention No. 108 is currently
analysing the need to revise Committee of Ministers Recommendation
No. R (89) 2 concerning the protection of personal data used for
employment purposes. A report prepared for the Consultative Committee
by Mr Giovanni Buttarelli, Assistant European Data Protection Supervisor
of the European Union, formulates proposals for the revision of
this Council of Europe recommendation. He suggests that “it would
be necessary to prohibit more explicitly activities which consist,
even occasionally, in the processing of personal data for the direct
and primary purpose of remote monitoring (physical or logical) of
work and other personal conduct. Employers should abstain from using
the results of this unlawful processing, even when employees are
aware of it”.
By individuals themselves
105. In many cases, individuals are not fully aware of the impact of their actions on the Internet. Web 2.0 has given them an opportunity to interact, comment, publish content themselves and continuously share knowledge, photos, videos, news, moods, etc. However, the spread of information on the Internet sometimes considerably exceeds what might be expected. The example of information drawn from public Facebook pages and automatically attached to e-mails by e-mail software without the senders’ knowledge has already been mentioned above. The power of the Web robots that feed the search engines allows information published for what was believed to be a limited audience to be retrieved from scattered sites. Something written for a specific circle (a comment on a forum, for example) is therefore likely to reappear, removed from its context and juxtaposed with other information.
106. Once information (text, pictures or video) has been published,
it is no longer possible to control what happens to it. Deleting
it from the original site will not prevent it from continuing to
exist wherever it was copied or downloaded prior to deletion. And
there is really no chance of ensuring that the use made of the information (for
example, on the other side of the world or by persons unknown) will
respect the purpose for which it was originally published.
107. This loss of control is particularly disturbing because it is coupled with an “eternity effect”. Unlike human memory, electronic memory erases nothing involuntarily. Data can perpetually be retrieved from the past unless someone decides to spend the time and energy deleting them (where deletion is actually possible).
108. Individual acts of spite are also a cause for concern. Publishing
defamatory or confidential information on Facebook, posting a private
or humiliating video on YouTube or placing a false article about
somebody on Wikipedia can cause immeasurable damage in off-line
life.
4.2. Profiling
109. Last year, the Committee of Ministers adopted Recommendation CM/Rec(2010)13 on the protection of individuals with regard to automatic processing of personal data in the context of profiling. Profiling consists of applying algorithms to aggregated data in order to reveal correlations and develop profiles. These profiles are then applied to individuals to determine how they are to be treated (as tax evaders or not, as marketing targets for a particular product, as possible terrorists on the move, etc.). Profiling for business purposes (see above), security reasons or other motives is easy to do, using widely available data (traces, search engine queries, etc.) and cookies, for example.
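The correlation step at the heart of profiling can be sketched in a few lines; the log entries and cookie identifiers below are invented:

```python
from collections import Counter

# Hypothetical search-engine log: (cookie identifier, query string)
log = [
    ("c1", "cheap flights rome"),
    ("c1", "rome hotels"),
    ("c1", "flight tracker"),
    ("c2", "garden tools"),
]

# Aggregating queries per cookie yields a crude interest profile
# without the profiler ever knowing the user's name.
profiles = {}
for cookie_id, query in log:
    profiles.setdefault(cookie_id, Counter()).update(query.split())

print(profiles["c1"].most_common(1))  # the user's dominant interest
```

Even this toy aggregation suggests that the person behind cookie "c1" is planning a trip to Rome, which is precisely the kind of inference advertisers pay for.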
110. Profiling meets legitimate social needs and interests: risk
analysis, fraud identification, market segmentation, matching supply
and demand, etc. However, it may unjustifiably deprive some individuals
of access to certain services. The existence of profiles leads to
information being filtered, sorted and selected according to the
recipient. This is overwhelmingly true of marketing information
today. Will it be the case for all information tomorrow? Profiling
may also be a tool of discrimination. How is it possible to challenge
a profile or its inappropriate use? Most of the time, the individuals
concerned are unaware that these profiles exist, and the people
using them have no understanding of the basis on which they have
been developed. Lastly, profiling raises serious concerns as to
proportionality. The amount of data collected and the periods for
which they are kept are in many cases out of all proportion.
4.3. Data retention
111. Data relating to Internet and new-media use constitute
a gold mine of information for police investigations and crime prevention.
Since the 11 September 2001 attacks, provisions have been adopted
at the European level to harmonise the circumstances in which content
data, traffic data and location data can be kept for the criminal
authorities. These data cover the duration, date, recipients and
location of all communications, the volume of text messages and
e-mails, etc.
112. It is interesting to see the changes in these provisions.
The 2001 Convention on Cybercrime provides that states may, at an
authority’s request, require expedited preservation of specified
data for a maximum period of 90 days, whereas the Data Retention
Directive 2006/24/EC of 15 March 2006 requires providers of communication
services (Internet, fixed and mobile telephony, fax) systematically
to retain everyone’s traffic and location data for periods of between
six months and two years. The evaluation of this directive is ongoing.
5. Conclusions
5.1. Self-regulation
113. While technology raises concerns, it also provides
answers. The technical design of tools can ensure that data collection
is minimised. The exercise of rights (of access, rectification and
objection) can be facilitated through online procedures. The default
settings for data disclosure options can be limited rather than
set at full. Thus, the private sector can meet the concerns raised
in this report by applying the principle of “privacy by design”.
It can also adopt or encourage Internet users to use privacy-enhancing
technologies (PETs). However, private-sector regulation should not
be limited to technology but should also cover existing customs and
practices in this sector, as described above.
114. A study on the economic benefits of privacy-enhancing technologies,
which was prepared by the private consultancy firm London Economics
in July 2010 for the European Commission, analyses the opportunities
of privacy-enhancing technologies applied by companies and individuals.
115. One of the weaknesses of self-regulation is that it depends
on the initiative and goodwill of the people concerned. It is clear
that collective awareness raising inevitably puts pressure on them
and can increase their motivation for reasons relating to their
image or that of an entire industry. Another weakness lies in the
fact that, unlike legislation, self-regulation is not the result
of bringing together different points of view in order to find a balance.
Since the rules are mostly laid down by a single category of people,
they reflect only the concerns taken into consideration by those
people and their own view of a socially and economically acceptable
balance.
116. Self-regulatory measures should support and supplement statutory rules. They would undoubtedly strengthen the latter’s effectiveness. They ought to be widely encouraged but, given their weaknesses, should not take the place of national or international legal standards. In order to be successful, a good self-regulation programme should:
- provide added value and contribute to proper practical application of the principles and rules enshrined in the legal framework, taking into account the specific features of the various sectors;
- involve all stakeholders concerned, including data protection authorities, in the preparation phase, in a transparent way;
- provide for robust monitoring and enforcement mechanisms, which would foster the trust of individuals;
- create a mechanism for its regular review and improvement.
117. However, improved effectiveness will inevitably depend also
on raising and extending users’ own awareness.
5.2. International law
118. Existing legislation is too ineffectual and has too
many shortcomings in its data protection rules. Convention No. 108
and the general European Union Data Protection Directive were drawn
up before the advent of the Internet. It was impossible to take
account of the global dimension of information services and any
virtual and trans-border context when these data protection rules
were drawn up. The frighteningly widespread lack of clarity in the
system and the pernicious opportunities for surveillance could not
have been anticipated.
119. The relevant provisions most certainly need to be updated,
and this should entail the incorporation of new principles such
as data minimisation, increased liability and better security (including
requirements relating to breaches of data security). Individuals’
rights should be strengthened (right to object, including objection
to automated decisions; right to data deletion). Transparency requirements
should be established or redesigned.
120. Compliance with legislation could be improved by, amongst
other means, strengthening the powers of regulatory authorities
and introducing the right to take legal class action. Machinery
could also be set up to check national legislation before ratification
of Convention No. 108.
121. Convention No. 108 is the only existing and advanced set of
standards in this sector under public international law. Therefore,
it is necessary to promote accession to Convention No. 108 by as
many states as possible globally and to start the drafting of a
new protocol to this convention in order to adapt the established standards
to new challenges.