
5.2 Privacy, A Fundamental Human Right



attached to this distinction and it was limited to the exclusive domain of an individual, a kind of inner circle granting him or her the privilege to make choices that exclude others. The case law of the Court has nevertheless separated it into two senses. On the one hand it reaffirmed Article 8 as a way to protect privacy in the negative sense mentioned above, and on the other it amplified the notion of private life through a constructive or dynamic interpretation of the Convention. Judicial interpretation of private life has proven to be less a question of drawing specific borders than of presenting new meanings. The role of the Court in defining the concept of private life goes hand in hand with the argument that laws must avoid anachronism or, as the ECtHR says, that the Convention must be interpreted in the light of present-day conditions. In this sense the Court has repeatedly affirmed that the notion of private life is non-exhaustive.

Dimensions of Private Life While revealing the extension of the concept of private life, the Court has affirmed numerous dimensions of the protection afforded by Article 8 of the Convention. Some of these are: (a) Human integrity: both physical and psychological integrity fall within the scope of private life, as upheld by the Court in cases related to unpunished sexual abuse32 and compulsory gynecological exams.33 (b) Identity: identity has been considered a privacy-relevant issue in cases related to the prerogative of parents to name their children34 and to the choice of surnames by married people.35 Identity has also been protected as the individual prerogative to self-identify, as in cases where national law refused to recognize a change of gender in public records systems, such interference being considered a cause of alienation and stress.36 In a case where the applicant, who had been boarded out with various foster parents, claimed access to confidential public records, the Court maintained that the vital interest someone has in knowing his or her origins also comes under the scope of private life.37 In another case it was held that private life

32 X and Y v. the Netherlands, no. 8978/80, 26 March 1985, Series A no. 91.

33 Y.F. v. Turkey, no. 24209/94, ECHR 2003-IX. The now defunct European Commission of Human Rights (ECommHR) had already pronounced in the sense that cases involving compulsory medical intervention affected human integrity and, therefore, respect for private life, as in X v. Austria, Commission decision of 13 December 1979, Decisions and Reports 18, p. 154 and Acmanne and others v. Belgium, Commission decision of 10 December 1984, Decisions and Reports 40, p. 255.

34 Guillot v. France, no. 22500/93, 24 October 1996, Reports 1996-V.

35 Burghartz v. Switzerland, no. 16213/90, 22 February 1994, Series A no. 280-B. A similar question was raised more recently in Ünal Tekeli v. Turkey, the Court having recognized a violation of Articles 8 and 14. Ünal Tekeli v. Turkey, no. 29865/96, ECHR 2004-X.

36 "The stress and alienation arising from a discordance between the position in society assumed by a post-operative transsexual and the status imposed by law which refuses to recognise the change of gender cannot, in the Court's view, be regarded as a minor inconvenience arising from a formality. A conflict between social reality and law arises which places the transsexual in an anomalous position, in which he or she may experience feelings of vulnerability, humiliation and anxiety" (par. 77). Christine Goodwin v. the United Kingdom [GC], no. 28957/95, ECHR 2002-VI.

37 Gaskin v. the United Kingdom, no. 10454/83, 7 July 1989, Series A no. 160. The same argument, i.e., the vital interest for someone to know his or her origins, is present in Mikulić v. Croatia, a case involving a person seeking to establish paternity and where the lengthiness of State procedures prolonged uncertainty with regard to the applicant's civil status. Mikulić v. Croatia, no. 53176/99, ECHR 2002-I. On the other hand, the same vital interest was not recognized in a case where the applicant claimed access to a name kept secret under the system of anonymous births. Odièvre v. France [GC], no. 42326/98, ECHR 2003-III.

5 Privacy and Human Flourishing

involves identity and the possibility for someone to live in accordance with a tradition of nomadic existence.38 (c) Sexual life is one of the "most intimate aspects of private life" and considered an outcome of the "right to make choices about one's own body".39 The Court has considered that consensual, non-violent sexual relations between adults are genuinely private activities and that no circumstances justified their incrimination40 or the deprivation of rights.41 (d) The fourth category is fundamental life choices, such as those related to abortion,42 suicide43 and assisted suicide.44

So private life is relevant to a relatively wide array of dimensions of life, a breadth partially explained by the amplification of the concept of private life described above. Two of the dimensions of privacy mentioned are objects of particular attention among legal and philosophy scholars, namely identity and integrity. I will briefly review these below.

5.2.1.1 Identity: From Individuals to Dividuals

Self-identity and Identification Relevant from a privacy perspective, as the case law of the ECtHR makes evident, identity encompasses two possible meanings. First, as the prerogative people have to define and present themselves to the world; here identity is subjective and defined by each individual, an ensemble of personality traits. Second, identity is taken as a synonym for identification, i.e., a set of attributes – such as name, filiation, nationality, home, date of birth and gender – that makes it a priori possible to distinguish one person from another.45 Our information

38 Chapman v. the United Kingdom [GC], no. 27238/95, ECHR 2001-I.

39 K.A. and A.D. v. Belgium, nos. 42758/99 and 45558/99, 17 February 2005, par. 83.

40 Dudgeon v. the United Kingdom, no. 7525/76, 22 October 1981, Series A no. 45, par. 52; Norris v. Ireland, no. 10581/83, 26 October 1988, Series A no. 142, par. 46; A.D.T. v. the United Kingdom, no. 35765/97, ECHR 2000-IX, pars. 36 and 37.

41 Lustig-Prean and Beckett v. the United Kingdom, nos. 31417/96 and 32377/96, 27 September 1999, par. 104; Beck, Copp and Bazeley v. the United Kingdom, nos. 48535/99, 48536/99 and 48537/99, 22 October 2002, par. 51.

42 Though affirming that Article 8 does not confer a right to abortion, the Court has held that the interruption of pregnancy "touches upon the sphere of the private life of the woman". National legislation concerning abortion may regard the subject in different ways as long as it fairly balances the different rights involved, namely those of the woman and those of the unborn child. A, B and C v. Ireland [GC], no. 25579/05, ECHR 2010.

43 Pretty v. the United Kingdom, no. 2346/02, ECHR 2002-III.

44 Haas v. Switzerland, no. 31322/07, ECHR 2011.

45 Pfitzmann and Hansen suggest that the elements that may compose an identity set are multiple; the matter is thus one of identities rather than of identity (Pfitzmann and Hansen 2010, 30). See Ahonen et al. in the same sense (Ahonen et al. 2008, 146).






societies have significantly changed the way in which people are identified: access to information systems requires complex identification mechanisms.46 In a world of AmI, identity in the second sense (identity as identification) is marked by quantitative and qualitative trends. The quantitative trends include the growth of identifiers and of cross-referencing (Accenture 2010), while the qualitative ones encompass developments such as machine-governance of the identification process (Ahonen et al. 2008, 147). These trends are at the core of scholars' attention to identity (in the first sense) in AmI and similar technological visions (de Mul and van den Berg 2011; de Vries 2010; Rodotá 2011; van den Berg 2010). I highlight three concerns pointed out by works in this field, namely uncontrolled visibility, difficulty in conveying meaning, and normativity.

Uncontrolled Visibility First, as AmI technologies have a substantial capacity to make visible a wide range of information concerning our lives, they challenge people's mastery of what they want to make visible and invisible. Since identity is about how a person presents him or herself to the world, this might imply, to some degree, the possibility of choosing what information to make visible or not according to context. For instance, individuals may choose what to reveal concerning different affiliations (e.g. family member, friend, worker). Such capacity is also temporal in the sense that it connects to the ways in which a person builds his or her identity throughout his or her life.47 By dispersing the individual into profiles – which dismiss the relevance of context, selfhood and time – automated profiling endangers precisely this freedom of identity.

Conveying Meaning Second, a world of automated profiling affects people's capability to convey meaning. Profiling combines pieces of data that are non-meaningful in themselves – for instance, the tag on a person's watch or their facial expression – to indicate whether they are a potentially good consumer, and this affects how he or she experiences his or her identity.48 Opaque profiling algorithms fail to lead people to self-identify with the created categories to which they are subjected, as these make no sense to individuals (de Vries 2010). In other words, how can an individual make sense of models of whose very existence he or she is unaware? The idea of a digital trace as a mark left by a person is problematic, since automated



46 Dinant observes that while this type of information was initially collected for debugging purposes, with time commercial applications of this information became current (Dinant 2009, 112).

47 This is related to what McStay refers to as the temporal dimension of being: "[b]eing has much to do with stable properties and structures but rather is a process or event. It occurs when being 'historicizes itself and when we historicize ourselves'" [internal quotes omitted] (McStay 2014, 109).

48 Talking about identity, de Vries refers to shibboleths, devices which decide "who is in and who is out; who is us and who is them; who is likely to be a good customer and who is not; who is allowed to pass the border and who is not". IP addresses, credit card numbers and a wide array of other marks are examples of shibboleths, arbitrary marks that will influence one's experience of identity (de Vries 2010, 76).






profiling operates in a rationale of attribution rather than of collection.49 At the root of these identity crises, it seems, lies the very rationale that apprehends identity as something atomized in large amounts of fragmented personal data, dealing with dividuals rather than with individuals.

Normativity People's lack of ability to master their individual visibility and to convey meaning about data is a sign of AmI normativity. If normativity may be taken in the sense of determining the action of people – as with incentive, inhibition, prediction and preemption50 – one may also reason that it interferes with the constitution of identity. In a sense, AmI technologies make up people, as pointed out by Rouvroy, given that such technologies create

new interactions and behaviors involving subjects, objects, and (public and private) organizations and, through an elaborate interplay of statistics and correlations, producing, or, more probably, reinforcing the norms, the criteria of normality and desirability against which individual lifestyles, preferences, choices and behaviours will be evaluated, with gratifications for compliant individuals, and sanctions for deviant ones, in the form of increased surveillance and monitoring, or of a reduction of access to specific places, goods, services, activities or other opportunities (Rouvroy 2008, 14).



For now, besides keeping in mind the three points mentioned above, I propose a useful first application of the capability approach. The capability approach provides a language to accommodate valuable beings and doings, the freedoms that matter. In relation to identity, one example would be what Hildebrandt calls "double contingency". Hildebrandt builds on Parsons' theorem of double contingency, which takes the interaction between two people and considers the way the communication one makes will be received by the other.51 She points out that "[i]f the construction of identity depends on our capability to anticipate how others anticipate us, we must learn how to figure out the way our computational environment figures us out" (Hildebrandt 2013, 233). Here, transparency plays an important role as a means to "enable us to foresee what we are 'in for'", says Hildebrandt.52 Double contingency's

49 Building on a distinction made by Lévinas between trace (an entity that possesses materiality, which may be exemplified with logs, terminal numbers, etc.) and sign (a status that makes reference to something else; consider the process that links these traces to a specific person), Durante observes that what autonomic computing does is to autonomously transform traces into signs. Identity is affected by the computational power to refer to something, i.e., to transform traces into signs (Durante 2011).

50 See section 3.1.

51 Holmes's discussion of conversation illustrates the kind of interaction that double contingency implies. He says that any conversation involves at least six people. A conversation between John and Thomas, for instance, includes three different Johns – the real John, John's ideal John and Thomas's ideal John – and, similarly, three Thomases (Holmes 1906). I thank Peter Burke for the example.

52 The way to achieve this in computer engineering is through both the front end – for instance, developing interfaces that allow people to contest how they are being profiled – and the back end – promoting collaborative efforts between engineers, designers and users, providing a plurality of mining strategies, public reports of trial experiments and transparency about the data and methods used (Hildebrandt 2013, 239–240).






functioning – being able to anticipate how others and systems anticipate us – seems to agree with the concept of capability. Similarly, what Rouvroy and Berns call the meta-right to give account (of oneself) («le droit de (se) rendre compte») may also be a capability in a world of AmI. If knowledge is equated to a digital memory that records everything, nothing else needs to be explained or interpreted, since the sense is given. The capability to explain and interpret is precisely what makes people able to give accounts of their actions, something that is fundamental in a democracy. Such capability allows someone to assert – by language and advocacy, for example – his or her disagreement with norms he or she considers unfair, or to express the reasons that justify his or her actions (Rouvroy and Berns 2010, 102).



5.2.1.2 Human Integrity



Mastering Our Bodies and Minds A human right on its own,53 the protection of human integrity is connected to Article 8 when the ECtHR affirms, for instance, that bodily integrity "concerns the most intimate aspect of private life"54 and that mental health is "a crucial part of private life associated with the aspect of moral integrity".55 Privacy is connected with human integrity particularly where the mastery of our bodies and minds is concerned. The example of ICT implants raises some difficulties for this mastery. Where experiences of consciousness and emotions are externally signified, processed by a third party and even receive external inputs,56 one may question what place is left to the individual as the master of his or her body and mind. Two relevant trends are noteworthy from this perspective. The first is the instrumentalization of bodies – meaning biological features are used as instruments of identification and authentication, for instance to access services. The second is informatization – meaning human attributes are digitized and processed across systems and networks (Venier and Mordini 2011). Referring to the instrumentalization trend, Rodotá points out that "the body is […] becoming a source of new information and is exploited by unrelenting data mining activities – it is a veritable open-air mine from which data can be extracted uninterruptedly. The body as such is becoming a password – physicality is replacing abstract passwords […]" (Rodotá 2011, 184).



53 The Universal Declaration of Human Rights (UDHR) gives an early account of the protection of human integrity that reflects a conception of rights in the negative sense, i.e., as interdictions to violate: "[n]o one shall be held in slavery or servitude; slavery and the slave trade shall be prohibited in all their forms" (Article 4 of the UDHR) and "[n]o one shall be subjected to torture or to cruel, inhuman or degrading treatment or punishment" (Article 5 of the UDHR). However, in the wording of the EU Charter the right to integrity is the right to respect for the body and mind: "[r]ight to the integrity of the person […] Everyone has the right to respect for his or her physical and mental integrity […]" (Article 3 of the EU Charter).

54 Y.F. v. Turkey, no. 24209/94, ECHR 2003-IX, par. 33.

55 Bensaid v. the United Kingdom, no. 44599/98, ECHR 2001-I, par. 47.

56 See section 2.2.1.






The use of body image scanners is an example of the first trend and illustrates how bodies may be exposed, searched, and explored with no need for physical contact. In this sense, NGOs in Europe and in the US have been calling attention to the excessive exposure of people to body scanners as a condition for using different modes of transport (Finn et al. 2013, 12). Here, Rodotá points out, bodies are becoming a kind of password. ICT implants illustrate the challenges of the second trend and are quite representative of the challenges AmI technologies bring to the comprehension of what human integrity is about. Human bodies extended with ICT implants are still bodies, and human minds extended with cognitive capabilities are still minds. However, beyond being still bodies and minds, they are also new bodies and minds. This means that these extensions become part of human beings. From a legal point of view the relevance of the issue resides precisely in the fact that human integrity as a principle – or, more precisely, as a human right – will involve these new, enhanced human beings as a whole, both materially and electronically speaking.57

Two connections are noteworthy at this point. The first is more general and consists of the link between the instrumentalization trend and the elimination of the social effects of uncertainty, the former being a symptom of the latter. The "Automated Target System-Persons", to which I referred at the beginning of this study,58 is illustrative of this connection. Here the assumption that the "body does not lie" sets in motion a system where processing data related to body measurements appears to be a panacea for security threats. The second connection is similar to the point I made with regard to identity: the capability approach is valuable for naming human integrity-related capabilities. We may imagine that in the future, when the use of ICT implants and the processing of corporeal and mental data are widespread, there will be situations where, for some reason, the fruition of these extensions will be managed by third parties, at a distance. Let us say, for instance, that someone has had delivery of updated implant management software suspended due to payment failure. More than not being able to profit from a service, the discontinuity would imply the suppression of physical or mental capabilities such as walking, hearing and seeing. Here, the lexicon of capabilities keeps the focus on human integrity-related beings and doings that would not necessarily be evident in perspectives focused on the protection of bodies and minds as such.



57 The roots of these challenges may perhaps be traced back to the theoretical "separation" between body and mind, a duality that seems to be at the origin of a certain alienation regarding the body: body as an object that we have, body as something to be worked out, body to be medicated, body to be transformed by information, body as an instrument of identification and authentication, as if body and mind were radically different substances. See Hayles, for whom not identifying the body with the self is a condition "to claim for the liberal subject its notorious universality, a claim that depends on erasing markers of bodily difference, including sex, race, and ethnicity" (Hayles 1999, 4–5).

58 See section 1.1.



5.2.1.3 A Focus on Data Protection



At this point we have several clues suggesting a wide relevance of privacy in a world of AmI. Between the lines one may see a concern with the processing of data tout court, i.e., a tension around the processing of data independently of any qualification of such data. It is in this sense that, through the examples of identity and integrity issues, one may read an informational dimension of privacy. Such a dimension also touches on a stricter subject, namely data protection. By this I mean that the relevance of privacy in a world of AmI is partially expressed through the language of data protection or protection of personal data, to which I will briefly refer in the following paragraphs, primarily through the connection of data protection to the protection of private life as held by the ECtHR.

The Rise of Data Protection Legislation In the 1970s, Europe and the US saw a prolific period of investigative and legislative initiatives concerning computerized personal data systems. Concern with the impact of these systems on privacy became a common issue. From a technological point of view these challenges were marked by the transition from the record-keeping model to the data processing model, meaning that non-state and state actors were progressively profiting from automated data processing. These systems were used not only to maintain the records necessary for their activities but also to systematically manage their operations – such as keeping track of transactions, measuring performance and planning. These operations required the processing of a significant amount of information on people, which is at the root of concerns around adverse social effects that the US Secretary's Advisory Committee on Automated Personal Data Systems described, in 1973, as "loss of individuality, loss of control over information, the possibility of linking data banks to create dossiers [and] rigid decision making by powerful, centralized bureaucracies" (United States Secretary's Advisory Committee on Automated Personal Data Systems 1973).

While in Europe these concerns were at the root of the adoption of data protection laws throughout the decade,59 in the US they gave rise to the creation of codes of practice and a law to safeguard individual privacy vis-à-vis the processing of personal information by federal agencies.60 In the context of the adverse social effects of data banks, these instruments reflect a protective attitude towards people,

59 As did Sweden (Law n° 1973-289 of 11 May 1973), Germany (Data Protection Act of 27 January 1977), France (Law n° 78-17 of 6 January 1978), Austria (Data Protection Act of 18 October 1978), Denmark (Laws n° 293 and 294 of 8 June 1978), Norway (Law of 9 June 1978) and Luxembourg (Law of 11 April 1979). Two important supranational initiatives at that time were the OECD Recommendation of the Council concerning Guidelines governing the Protection of Privacy and Transborder Flows of Personal Data (2013) (hereafter "OECD Guidelines") and the Council of Europe Convention for the Protection of Individuals with Regard to Automatic Processing of Personal Data (Convention 108) (hereafter "Convention 108").

60 The Privacy Act of 1974 deals with the protection of privacy in the context of the collection, maintenance, use, and dissemination of personally identifiable information exclusively within the systems of federal agencies.






expressed through the establishment of principles such as loyalty and of rights such as access and rectification.61 The developments of data processing in the following years, as well as the advent of the Internet and other communication networks, have been followed by other regulatory initiatives to which I will turn later.

Impacting on Private Life In parallel with the advent of data protection legislation, the ECtHR has connected data protection to respect for private life, for instance concerning data protection rights such as access and rectification.62 Two outcomes of the ECtHR case law are worth noting.

First, in the case law concerning surveillance one sees the outline of circumstances where data processing is assumed to impact on private life. Some examples where the applicability of Article 8 was recognized pointed to (a) the fact that authorities have recorded images of people,63 (b) the fact that authorities have disclosed video recordings identifying the person,64 (c) the intrusiveness or covert character of investigation methods65 and (d) the foreseeable use of images, in a case where normal use of security cameras generated video footage further used in criminal proceedings.66

Second, the Court departs from the exclusively "private as privacy" approach, meaning the processing of personal data is relevant under Article 8 even when it is related to the public domain.67 Here, in the case law there is a trend towards protecting

61 For instance, the French Law n° 78-17 of 6 January 1978 (Loi Informatique et Libertés), as enacted in 1978, forbade human-related decision-making processes from being based on automated data processing (Article 2), established the right to know and contest information and reasons used in automated data processing, created a national data protection authority to ensure the enforcement of data protection law (Article 3 and ff.), and established the loyalty principle in the collection of personal data (Article 25), the right of people to oppose the processing of personal data (Article 26), the obligation of organizations that collect data to provide information to the persons concerned by the collection (Article 27), and so on.

62 For instance the right to access personal files concerning the applicant's childhood (Gaskin v. the United Kingdom, no. 10454/83, 7 July 1989, Series A no. 160) and the possibility to rectify personal data (Leander v. Sweden, no. 9248/81, 26 March 1987, Series A no. 116 and Rotaru v. Romania [GC], no. 28341/95, ECHR 2000-V).

63 Herbecq and the Association Ligue des Droits de l'Homme v. Belgium, nos. 32200/96 and 32201/96, Commission decision of 14 January 1998, Decisions and Reports no. 92-B, p. 92.

64 Peck v. the United Kingdom, no. 44647/98, ECHR 2003-I, par. 59.

65 P.G. and J.H. v. the United Kingdom, no. 44787/98, ECHR 2001-IX, par. 53.

66 Perry v. the United Kingdom, no. 63737/00, ECHR 2003-IX, par. 40.

67 In P.G. and J.H. v. the United Kingdom the Court affirms: "[t]here are a number of elements relevant to a consideration of whether a person's private life is concerned by measures effected outside a person's home or private premises. Since there are occasions when people knowingly or intentionally involve themselves in activities which are or may be recorded or reported in a public manner, a person's reasonable expectations as to privacy may be a significant, although not necessarily conclusive, factor. A person who walks down the street will, inevitably, be visible to any member of the public who is also present. Monitoring by technological means of the same public scene (for example, a security guard viewing through closed-circuit television) is of a similar character". P.G. and J.H. v. the United Kingdom, no. 44787/98, ECHR 2001-IX, par. 57.






privacy in public as long as "any systematic or permanent record comes into existence of such material from the public domain", as in cases related to the recording of voices and the maintenance of secret files containing information about the applicants' lives.68

Meanwhile at the Court of Justice of the European Union (CJEU)69 A particular connection between privacy and data protection is also present in the case law of the CJEU, marked by two trends. On the one hand, the right to data protection – having been singularized in the EU Charter70 – is at the core of decisions that put it per se in balance with other rights, as in two cases mentioned by Fuster (Fuster 2013, 245–248).71 On the other hand, the Court has used various formulas to connect privacy and data protection. Examples of this include affirming that the protection of private life requires the application of data processing rules72; reference to the fundamental right that protects personal data and hence73 private life; and reference to the right to privacy with respect to the processing of personal data.74 This second



68 P.G. and J.H. v. the United Kingdom, no. 44787/98, ECHR 2001-IX, par. 57. See also Leander v. Sweden, no. 9248/81, 26 March 1987, Series A no. 116, par. 48, Rotaru v. Romania [GC], no. 28341/95, ECHR 2000-V, par. 46, Kopp v. Switzerland, no. 23224/94, ECHR 1998-II, par. 53, and Segerstedt-Wiberg and others v. Sweden, no. 62332/00, ECHR 2006-VII, par. 73.

69 The ECtHR and the CJEU are not the same court; though the remark may sound obvious, there is apparently still much confusion, as we can deduce from the ECtHR website, where the Court is presented through the formula "not to be confused with" the CJEU. While the mission of the ECtHR is to ensure the observance of the engagements of its contracting parties in the ECHR, the CJEU reviews the legality of the acts of the EU institutions, ensures that Member States comply with their obligations under the EU Treaties and interprets EU law at the request of national courts and tribunals. The ECtHR is the human rights court of the Council of Europe; the CJEU is the judicial arm of the European Union (European Court of Human Rights 2016).

70 The EU Charter refers to privacy and data protection in Articles 7 and 8: "Respect for private and family life. Everyone has the right to respect for his or her private and family life, home and communications" (Article 7 of the EU Charter) and "Protection of personal data. (1) Everyone has the right to the protection of personal data concerning him or her. (2) Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified. (3) Compliance with these rules shall be subject to control by an independent authority" (Article 8 of the EU Charter).

71 Both cases are related to the legality analysis of network traffic filtering with the purpose of fighting copyright infringement, where the CJEU affirmed the right to the protection of personal data independently from the right to privacy. In Scarlet Extended the Court affirmed "[…] the contested filtering system may also infringe the fundamental rights of that ISP's customers, namely their right to protection of their personal data and their freedom to receive or impart information, which are rights safeguarded by Articles 8 and 11 of the Charter respectively" and in Sabam that "[…] a fair balance be struck between the right to intellectual property, on the one hand, and the freedom to conduct business, the right to protection of personal data and the freedom to receive or impart information, on the other". C-70/10 Scarlet Extended [2011] ECR I-11959 and C-360/10 Sabam [2012], par. 51.

72

C-101/01 Bodil Lindqvist [2003] ECR I-12971, par. 88.

73

C-275/06 Promusicae [2008] ECR I-271, par. 63.

74

C-73/07 Satakunnan Markkinapörssi and Satamedia [2008] ECR I-9831, par. 52.






5 Privacy and Human Flourishing



trend has been apparent in the case law of the CJEU even after the adoption of the EU Charter – which enshrines the right to protection of private life and the right to protection of personal data as separate fundamental rights. For instance, the Court has declared the invalidity of the EU Regulation on the publication of personal data related to the receipt of agricultural subsidies75 and, more recently, of the Data Retention Directive,76 both decisions being anchored in the close connection between privacy and data protection.

The sense in which I take privacy here cannot but suggest the relevance of its role in a world of AmI, particularly because AmI technologies imply an ever-present, permanent processing of data. If I wanted to cause a sensation, I would invoke the case law of the ECtHR – which, as seen above, affirms that where there is systematic or permanent data processing there are issues related to private life – to say that in a world of AmI, privacy is always legally relevant. I will not go that far, but rather clarify in the following paragraphs how I attempt to make sense of privacy, following scholars and several case law findings. At present, and with the Court, I recapitulate that privacy is not reducible to the domain of the private,77 which is also valid with regard to the processing of data.78



5.2.2 Making Some Sense of Privacy: Our Starting Points



Even if privacy is open enough to manifest itself through such different dimensions, some definition of what privacy is about is nevertheless necessary. Having glimpsed several dimensions of privacy, in the following paragraphs I attempt to make sense of the grounds or fundamentals of privacy. I will do this through a review of common ECtHR references on the subject, relating them to the



75

C-92/09 Volker und Markus Schecke GbR and Eifert [2010] ECR I-11063, par. 47.

76

C-293/12 Digital Rights Ireland and Seitlinger and Others [2014], par. 48.

77

See section 5.1.2.

78

A different approach sustains that privacy would refer to opacity rights and data protection to transparency rights (Ahonen et al. 2008, 238; De Hert and Gutwirth 2006, 2; Hildebrandt 2008, 67; Lynskey 2013, 76). Rouvroy and Poullet consistently point out the impairments the “privacy as opacity” interpretation could bring to the effectiveness of human rights; I refer to their work for this purpose (Rouvroy and Poullet 2009, 69–75). A further remark is nevertheless noteworthy. From the point of view of case law – particularly that of the Court of Justice of the European Union (CJEU) – it seems hasty to assume that privacy and data protection are meant to operate exclusively in an independent manner. The recent decision of the CJEU on the invalidation of the Data Retention Directive illustrates this argument; here the Court considered the protection of personal data in the light of the protection of privacy, suggesting an ascendancy of the latter over the former: “[i]n the present case, in view of the important role played by the protection of personal data in the light of the fundamental right to respect for private life and the extent and seriousness of the interference with that right caused by Directive 2006/24, the EU legislature’s discretion is reduced, with the result that review of that discretion should be strict”. C-293/12 Digital Rights Ireland and Seitlinger and Others [2014], par. 48.






findings of the German Federal Constitutional Court in its well-known “census decision”. This exercise has the purpose of maintaining a provisional approach to privacy, to which I will relate the double theoretical approach – capabilities and algorithmic governmentality – that I have explored.

A Word on the Census Decision In addition to the ECtHR case law concerning Article 8, I will refer to the 1983 decision of the German Federal Constitutional Court (BVerfG)79 on the census of 1983.80 In 1981 the German Federal Government introduced a census Act containing provisions about the latest population count, the demographic and social structure of the population, and the economic conditions of citizens in general. The census was considered necessary to, inter alia, establish the number of voters in each Land in the Council of Constituent States. A case was brought before the BVerfG, in which the complainants argued violation of basic constitutional rights and of the principle of the rule of law. The complaints related to, inter alia, the possibility of re-identification of data, the use of vague terminology that might lead to unconstitutional transmission of data, and the complexity of state-networked data systems, which made it difficult for people to withhold or retrieve personal information. Why is the German case relevant? Generally, because the census decision brings to light issues that are not merely local; specifically, because of the evident bridges between the census decision and the case law of the ECtHR. Moreover, its outcomes are valuable in revealing the broad informational dimension of freedoms closely connected to privacy.

Privacy as Autonomy The fundamentals of privacy can be glimpsed in the lines where the ECtHR refers to the rights of “personal development”, “to establish and develop relationships”, “personal autonomy” or “self-determination”, to which the Court refers in an irregular manner. For instance, it has referred to (a) the applicant’s “freedom to define herself as a female person, one of the most basic essentials of self-determination”,81 (b) to personal autonomy as both an aspect of the right to personal development and as a notion that may be extended to the right to make choices over one’s own body,82 (c) to private life as involving “the right to personal autonomy, personal development and to establish and develop relationships with other human beings and the outside world”,83 and (d) to self-development and personal autonomy as different aspects of the right to the respect of private life.84 These fundamentals can hardly be classified as subjective rights – meaning the exclusive and specific prerogative attributed to a person to guarantee their interests against



79

BVerfG for Das Bundesverfassungsgericht (German Federal Constitutional Court).

80

Mention of the census decision refers to the English version edited by Eibe Riedel (Riedel 1984).

81

Van Kück v. Germany, no. 35968/97, ECHR 2003-VII, par. 73.

82

K.A. and A.D. v. Belgium, nos. 42758/99 and 45558/99, 17 February 2005, par. 83.

83

Tysiąc v. Poland, no. 5410/03, ECHR 2007-I, par. 107.

84

Bigaeva v. Greece, no. 26713/05, 28 May 2009, par. 22.


