
How much privacy do clouds provide? An Australian perspective

Angela Adrian

School of Law & Justice, Southern Cross University, Australia

Keywords:

Cloud computing

Privacy and cloud computing

Privacy Act 1988 (Cth)

Internet privacy

Law and social networking

Abstract

Cloud computing is becoming the standard operating process, communications system

and underlying infrastructure of the Internet. This is of paradigm-shifting significance to

the law. Multinationals, such as Google, Amazon, Apple, Facebook, and Microsoft, own and

operate the cloud computing infrastructure of the Internet and influence its

culture. They have been called the Four Horsemen of Technology and consider Microsoft

their inspiration.1 Business can now be transacted at the speed of thought. The digital

nervous system that Bill Gates envisioned is blossoming as cloud computing. However,

sovereign nations can no longer effectively regulate the telecommunications systems

within their borders without the tacit compliance of these cloud operating multinationals.

The aim of this paper is to determine whether or not cloud computing infrastructure can

support privacy regulation yet remain practical.

© 2013 Angela Adrian. Published by Elsevier Ltd. All rights reserved.

1. Introduction

Cloud computing is becoming the standard operating process,

communications system and underlying infrastructure of the

Internet. This is of paradigm-shifting significance to the law.

Multinationals, such as Google, Amazon, Apple, Facebook,

and Microsoft, have built a cloud computing infrastructure

and determined its culture. They have been called the Four

Horsemen of Technology and consider Microsoft their inspi-

ration.1 Business can now be transacted at the speed of

thought.2 The digital nervous system that Bill Gates envi-

sioned is blossoming as cloud computing. However, sovereign

nations can no longer effectively regulate the telecommuni-

cations systems within their borders without the tacit

compliance of these cloud operating multinationals. The aim

of this paper is to determine whether or not cloud computing

infrastructure can support privacy regulation yet remain

practical.3 First, privacy and personal information will be

considered, then privacy and the Internet, and finally, privacy

and cloud computing.

A precise definition of privacy is elusive, as there

are many shades of meaning involved. Warren and Brandeis

suggested that privacy is a right of ‘the individual to be let-

alone’ so as to protect an ‘inviolate personality’. It is not

proprietorial in nature.4 Due to its elusive nature, privacy is

best recognised in its breach or loss. To some extent, the issue

of privacy gives rise to an ‘intermediate state’ in that a person

needs to provide sufficient information to take a meaningful

role in society and fulfil their duties (interaction), while at the

same time not having their personal information used in

1 Levy, Stephen (2011) CEO of the Internet, Wired Magazine.
2 Gates, Bill (1999) Business @ the Speed of Thought: Using a Digital Nervous System, London: Penguin Books.
3 The research for this paper was funded by the NSW Legal Scholarship Support Fund. The author is very grateful for their assistance. The initial conclusions were presented at the Legal, Security, and Privacy Issues Conference sponsored by the International Association of Information Technology Lawyers in Athens, Greece, October 2012.
4 Warren, Samuel & Brandeis, Louis (1890) The Right to Privacy, 5 Harv L Rev 193.


a way that impinges upon their personal integrity and dignity

in an unnecessary way.5 Lack of privacy is ‘full and immediate

access, full and immediate knowledge, and constant obser-

vation of an individual … everything an individual did and

thought would immediately become known to others’.6 The

concept of privacy is multifaceted and “inherently contingent

and political, sensitive to changes in society and in tech-

nology.”7 The Electronic Privacy Information Centre and

Privacy International developed useful categories for the

various distinct but related forms of privacy:

• Information Privacy: involves the establishment of rules

governing the collection and handling of personal data;

otherwise known as ‘data protection’.

• Bodily Privacy: concerns the protection of people’s physical

selves against invasive procedures.

• Privacy of Communication: covers the security and privacy of

mail, telephones and other forms of communication.

• Territorial Privacy: concerns the setting of limits on intrusion

into domestic and other environments. This includes

searches, video surveillance and ID checks.8

This paper will focus on ‘information privacy’ and to some

degree ‘communications and surveillance privacy’. Informa-

tion privacy is based on the notion that all of the information

pertaining to a person is his or her ‘own’ and thus it should be

for him or her to control its distribution. This control helps to

protect the individual’s integrity and dignity by preventing the

information being used in ways which are damaging or

embarrassing to the person. Even if the individual discloses

the information when compelled to do so by law, he or she has

a continuing interest in what happens to that information and

who else may eventually gain access to it. It is not a proprie-

tary right, but a ‘right’ advanced to help protect the individual

from arbitrary interference by others. This interest is to some

degree protected by existing laws, but is under great threat by

developments in the information technology industry.

A child born in 2008 will have many of the minor and major

events of their life recorded in digital form. This will be the

natural default of the future. However, few of us born earlier

realise just how much information is held about us by

government departments, employers, universities, compa-

nies, doctors, banks and various others with whom we may

deal. For example, when we apply for a service or benefit, such

as a credit card, we must fill in a number of forms and give

a range of details about ourselves. Sometimes the information

does not seem to be directly related to the service or benefit for

which we are applying. In turn, this information can be put to

a variety of uses, some of which were not contemplated at the

time we gave the information. For instance, it could be used to

check our credit rating, be used to gather statistical informa-

tion about us as a group, or could be sold to other organisa-

tions as part of a marketing campaign. The need to provide

personal information, and the uses to which this information

can be put, have increased dramatically in the past century. As

the role government plays in people’s daily lives has

expanded, so has the need for personal information. Thus,

government administration is required to handle an enor-

mous amount of information and to do so efficiently and

effectively. Similarly, business and the private sector must

also process a great deal of information and provide fast and

efficient service to their customers.

The information technology industry has provided

a timely solution to this problem, as it offers

a fast, cost-efficient and effective way to manage

large amounts of information. By the same token, the privacy

of individuals is unduly compromised by these burgeoning

digital databases. They have changed and intensified the way

in which personal information can be manipulated, collated,

stored, managed and controlled. This potential can pose

significant risks to our information privacy. Thus, in cyber-

space users can be said to have informational value since their

data is commercially useful to marketers. Unfortunately, the

legal infrastructure is inadequate to secure privacy. As Daniel

Solove remarks, “The problem is caused in significant part by

the law, which has allowed the construction and use of digital

dossiers without adequately regulating the practices by which

companies keep them secure.”9

2. Privacy and personal information

What then amounts to personal information? In the Privacy

Act 1988 (Cth), personal information is defined in s 6(1) as:

“information or an opinion (including information or an

opinion forming part of a database), whether true or not, and

whether recorded in a material form or not, about an indi-

vidual whose identity is apparent, or can reasonably be

ascertained, from the information or opinion.”10

Altman conceptualised privacy as the ‘selective control of

access to the self’ regulated as dialectic and dynamic

processes that include multi-mechanistic optimising behav-

iours. He regarded privacy as a boundary regulation process.11

The ability of information technology to disrupt or destabilise

the regulation of these boundaries is a key issue in privacy

management. In Privacy in Context, Helen Nissenbaum defines

privacy in terms of expected flows of personal information,

modelled with the construct of context-relative information

norms.12 When the flow of information adheres to entrenched

norms, all is well. When these have been violated, protest and

5 Westin, A (1967) Privacy and Freedom. New York: Athenaeum.
6 Gavison, R (1980) Privacy and the Limits of Law, 89 Yale L Rev 421.
7 The Royal Academy of Engineering (2007) Dilemmas of Privacy and Surveillance: Challenges of Technological Change at http://www.raeng.org.uk/societygov/policy/current_issues/privacy_surveillance/pdf/dilemmas_of_privacy_and_surveillance_report.pdf.
8 Banisar, D (2000) Privacy and Human Rights 2000: An International Survey of Privacy Law and Developments, Electronic Privacy Information Centre and Privacy International available at http://www.privacyinternational.org/survey/phr2000/overview.html.
9 Solove, D (2008) “The New Vulnerability: Data Security and Personal Information” in ed. Chander, A., et al. Securing Privacy in the Internet Age, Berkeley, CA: Stanford University Press.
10 Privacy Act 1988 (Cth) s 6(1).
11 Altman, Irwin (1977) Privacy Regulation: Culturally Universal or Culturally Specific? Journal of Social Issues 33 (3).
12 Nissenbaum, Helen (2010) Privacy in Context: Technology, Policy and the Integrity of Social Life, Berkeley, CA: Stanford University Press.


complaint result. “Through the establishment of a civil society

each individual is protected by the whole of the community,

thereby which each individual should be granted with the

same rights and obligations and the same chance to develop.

This relates in particular to the use of freedom via the social

contract, which secures the self-determination of all individ-

uals.”13 Hence, the first challenge is to manage the persis-

tence, replicability, scalability, and searchability of the self.

The right to privacy has been seen primarily as a human or

social right arising from the nature of the relationship

between the individual and society. This view derives from

the empiricist and liberal philosophy of thinkers such as

Hobbes and Locke. Privacy is acknowledged as a human right,

under Article 12 of the Universal Declaration of Human Rights

(1948), and Article 17 of the International Covenant on Civil

and Political Rights (1976); both use the same form of words:

“No one shall be subjected to arbitrary interference with his

privacy, family, home or correspondence, nor to attacks upon

his honour and reputation. Everyone has the right to the

protection of the law against such interference or attacks.” It

is not however the only approach possible. De Boni and Prig-

more, for example, have shown how it is possible to give

a solid theoretical foundation to the right to privacy from the

point of view of an idealistic, neo-Hegelian philosophy, seeing

privacy not as a “human right” preceding society but as the

logical consequence of the Hegelian idea of free will.14 The

important consequence of these definitions of privacy as an

interest is that privacy has to be balanced against many other,

often competing, interests. It must balance the interests of the

individuals themselves, of other individuals, of groups and of

society as a whole. This balancing process is political in

nature, involving the exercise of power deriving from

authority, markets or any other available source.15

As such, privacy is recognised in the international

community as a ‘human right’. Tay describes human rights as

‘usually derived from the inherent worth, dignity and poten-

tialities of human beings and their essential kinship and

responsibility for each other’.16 Privacy can be regarded as part

of the claims that ‘each individual has the right to be treated

as an autonomous human person, not just as an object or

a statistic’.17 Australia is a signatory (subject to certain

reservations) to several international covenants which

recognise the right of privacy. These are:

• the International Covenant on Civil and Political Rights (‘the

ICCPR’)

• the Universal Declaration of Human Rights 1948, and

• the Guidelines of the OECD, of which Australia is

a member.18

The ICCPR provides in Art. 17:

1. No one shall be subject to arbitrary or unlawful interference with

his privacy, family, home or correspondence, nor to unlawful

attacks on his honour and reputation.

2. Everyone has the right to protection of the law against such

interference or attacks.19

Australia ratified the ICCPR in 1980 and is thus legally

bound to observe it, but has reserved the right to breach these

provisions in cases of national security or for the protection of

other people’s rights and freedoms. The ICCPR also includes

a way of making complaints (contained in the First Optional

Protocol to the ICCPR) to the United Nations’ Human Rights

Committee, concerning breaches of the Covenant. The

Universal Declaration of Human Rights also provides protec-

tion for privacy, but it is not legally binding on Australia.

Australia adopted the initial OECD Guidelines in 1984 and the

principles contained within them were incorporated, after

some modification, into the Privacy Act 1988 (Cth). An under-

lying tenet of these international covenants is the recognition

of and promotion of the integrity of the individual, and privacy

is regarded as necessary to preserve that integrity.

However, our ability to rely on physical, psychological

and social mechanisms for regulating privacy is reduced

and altered by information technology. Although speech is

framed by space/time coordinates of dramatic action and

writing is framed by space/time coordinates of books and

paper, electronic language does not lend itself to being

framed. It is everywhere and nowhere, always and never,

material and immaterial.20 Data is increasingly collected and

personalized in hypermedia systems. Storage technology

ensures that it remains available. Database technologies

make it discoverable. Telecommunications enable its rapid

reticulation. This new persistent identity causes a loss of

privacy. Our existence is understood via representations of

information which we have disclosed either explicitly or

implicitly, both within our direct control and outside of it.

Thus, identity management and privacy are intrinsically

linked.21 These personalized hypermedia systems conflict

with privacy concerns of individuals and with privacy laws

that are in effect in many countries. Organizations are only

faintly restrained by professional and industry association

13 Weber, Rolf H. & Weber, Romana (2009) Social Contract for the Internet Community?: Historical and Philosophical Theories as Basis for the Inclusion of Civil Society in Internet Governance? Scripted 6(1).
14 De Boni, Marco & Prigmore, Martyn (2001) “A Hegelian basis for information privacy as an economic right”, in M. Roberts, M. Moulton, S. Hand, & C. Adams (eds) Information systems in the digital world – Proceedings of the 6th UKAIS conference.
15 Clarke, Roger (1999) Internet Privacy Concerns Confirm the Case for Intervention, 42 Communications of the ACM 2, Special Issue on Internet Privacy.
16 Tay, AES (1986) Human Rights for Australia: a Survey of Literature and Developments, and a Select and Annotated Bibliography of Recent Literature in Australia and Abroad, Canberra: Australian Government Publishing Service.
17 Victorian Law Reform Commission (2002) Defining Privacy: Occasional Paper available at http://www.lawreform.vic.gov.au/CA256902000FE154/Lookup/Privacy/$file/Defining_Privacy.pdf.
18 OECD (2012) Information Security and Privacy available at http://www.oecd.org/findDocument/0,2350,en_2649_34255_1_119820_1_1_1,00.html.
19 Office of the United Nations High Commissioner for Human Rights (1976) International Covenant on Civil and Political Rights available at http://www.ohchr.org/english/law/ccpr.htm.
20 Poster, Mark (1990) The Mode of Information, Chicago: University of Chicago Press.
21 Harrison, John & Bramall, Pete (2007) New Approaches to Identity Management and Privacy: A Guide Prepared for the Information Commissioner, Information Commissioner’s Office UK.


codes and governmental rules. For example, Australian

privacy law extends to overseas sites if the user involved is an

Australian citizen or permanent resident, and the site

conducts business with users in Australia and collects the

personal data in Australia. Nonetheless, conflicting foreign

law may override Australian law. Further, new technologies

make jurisdictional issues trickier.

For example, user-adaptive (or “personalized”) hypermedia

systems cater to individuals more effectively the more infor-

mation they possess about them. The kinds of adapta-

tions that will be necessary are usually not known at the time

when different pieces of information about the individuals

become available. Therefore, personalized hypermedia

systems must keep the collected data “in stock” for possible

future usage. As such, data gathering is usually performed in

an unobtrusive manner and even without an individual’s

awareness so as not to distract them from their tasks and in

consideration of the fact that most people are very reluctant to

perform actions that are not directed towards their immediate

goals (like providing data about themselves) if they do not

receive immediate benefits, even when they would profit in

the long run.

Poster argues that “electronic writing” or, here more

specifically hypermedia systems, “disperses the subject so

that it no longer functions as a centre in the way it did in pre-

electronic writing.” He bases this on Derrida’s calls for ‘vigi-

lance’ against a media which threatened to undermine ‘crit-

ical capacities for evaluation’ by the ‘control, manipulation,

diversion or co-optation of discourse.’ Today, the balance

between the publicly private and the privately public is called

sociality. Personal information is traded like a commodity.

“Byte by byte, personal information is exchanged as currency

to gain digital access to friends. In this manner, personal

information is commercialized into the public realm, with

little input from the individual in the process.”

Cyberspace by its nature facilitates interaction which is

independent of geography, physical space or even physical

place. It changes how we engage in social relations.22 It

changes the development of our identities. Social cooperation

relies on trust in any medium.23 “The very possibility of

achieving stable mutual cooperation depends upon there

being a good chance of a continuing interaction” because it is

through repeat play that trust is developed.24 Signals of

commitment are needed to support cooperative behaviour.

We usually rely on face-to-face mechanisms for creating these

signals and trust.25 Virtual environments are the domain of

liquid identity. This identity question causes insecurities.

Who is the puppeteer hidden behind this little mass of bits

and bytes displayed on my computer screen? Can I trust this

person? Are they who they say they are? Are they really rep-

resenting what they say they represent?

Technology, thus, defines the scope of social relationships

and our online social interaction has different characteristics.

The most important characteristic is that identity is

becoming enriched with more persistent forms of reputation.

Reputation is of course tied to an identity. They are two sides

of the same coin. Reputation, however, is earned over time. As

such, identity without reputation is nearly meaningless.26 It is

a measure of reputation allowing us to assess the risk of doing

business with someone. What needs to be known and

determined is reputation at the moment of “transaction”

(however that is defined). So reputation devices like credit scores,

domain name systems or eBay ratings have been created. A

reputation is the “estimation in which a person or thing is

commonly held.”27 Reputation is a fundamental part of your

virtual self. Conversations in social networks can be stored,

and who you are becomes more a function of the community’s

view of you, your behaviour and your contributions to

a particular piece of that social network. In this social software

environment of collaborative creativity and interaction,

representation becomes malleable and reputation becomes

community-created. Online reputation needs to recognize the

interests of the collective as well as the individual in the

manner in which identity is constructed online.

If any of these social networks arbitrarily altered or deleted

a user’s reputation despite the fact that the community had

created it, there is little assurance that robust and persistent

identities would be developed. Reputation scores and collab-

orative filtering devices are signalling mechanisms for

successful collective action. The mere fact that reputation

depends on software tools for its articulation should not

produce an exclusive property right for the platform owner

without regard for the needs of the group.28 How do we

maintain our right to privacy in the face of this technology?

3. Privacy and the Internet

The Internet exacerbates the potential threat of information

technology to our information privacy, by increasing the range

of users who may access and abuse information about us. It

also increases the potential for us to be subject to data

surveillance, as our ‘movements’ on the net can also be

tracked and monitored. A current trend in information tech-

nology is to link the databases of government agencies into

jurisdiction-wide networks that allow for the common

holding of data.29 These ‘common information repositories’

may be housed at various levels of the network, such that

certain information may be accessible at the state level, other

information can be obtained at a more limited enterprise

level, and other information would be restricted to an agency-

22 Noveck, Beth Simone (2005) Trademark law and the social construction of trust: creating the legal framework for online identity, 83 Wash U L Q 1733.
23 Axelrod, Robert (1984) The evolution of cooperation, New York: Basic Books.
24 Ibid.
25 Moringello, Juliet (2005) Signals, assent, and Internet contracting, 57 Rutgers L Rev 1307.
26 Resnick, Paul; Zeckhauser, Richard; Swanson, John; and Lockwood, Kate (2006) The value of reputation on eBay: a controlled experiment, 9 Exp Econ 2.
27 The Pocket Oxford Dictionary (1975) Oxford: Oxford University Press.
28 Clarke, Roger (1999) Internet Privacy Concerns Confirm the Case for Intervention, 42 Communications of the ACM 42, Special Issue on Internet Privacy.
29 Carlson, S & Miller, E (1999) Public Data and Personal Privacy, 16 Santa Clara Computer & High Tech. L.J. 83.


specific level.30 Information is organized at state, enterprise,

or agency levels as dictated by concerns about security,

management responsibility, and access requirements.31 The

transformation to an integrated, jurisdiction-wide data

network or ‘infostructure’ requires that value judgments be

made about the privacy concerns of individual bits of infor-

mation. By assigning data to a more general level of access,

the government can reduce the redundancy of data collection

as well as the associated costs of maintaining separate data-

bases. Simultaneously, the government makes personal

information more available to officials, as well as to the public.

Governments do collect a vast array of personal data. Some

data, such as tax and medical records, are extremely sensitive

and have long been recognized as sources of concern for the

privacy interests of individuals. Other sources of personal

information are quite innocuous when considered in discrete

amounts, although they can be compiled and matched to

create broader profiles of individuals that are invasive of

personal privacy. Because governments hold a spectrum of

data about their citizens, public databases are especially

attractive sources of information for people seeking to

generate information profiles of others.32

One may think that ‘surfing the net’ is an anonymous

activity but that is far from the reality. There are a variety of

ways in which information can be collected without knowl-

edge or consent. Most personal information in cyberspace is

collected in one of two ways. An organization may directly

solicit and collect information from individuals who contact

the organization and provide information voluntarily.33

Alternatively, and increasingly commonly, the organiza-

tion might surreptitiously track and record an individual’s

surfing activity on the Internet.34 For instance, if we were to

visit an Internet site which contained objectionable or

unlawful material, our visit could be recorded and used

against us at a later time. What would not be recorded at that

time is our purpose in visiting that site, which may well have

a legitimate or innocent character.

Direct solicitation of information has been with us for

years in various forms. We have all completed job or credit

applications or filled in surveys. Many consumers have

completed and returned warranty registration cards to the

manufacturer, which volunteer valuable data that can be used

for marketing purposes. In the modern age, more information

is directly solicited online as an increasing number of websites

require registration and the disclosure of personal informa-

tion before a user can access the site’s content. Amazon.com,

for example, uses registration information to help keep track

of its customers’ purchases of books, CDs, electronics, toys,

and other items.35

Surreptitious collection of information from web users is

even more common. Many websites secretly track

a customer’s surfing practices through the use of ‘cookies’ and

similar technologies.36 When a user explores a site, the user

leaves electronic footprints behind. By following the foot-

prints, the site can record information about the user, such as

the Internet service provider used, and the type of hardware

and software the user employed. The site can also record

some behavioural information about the user’s Internet

habits, such as the website previously visited, the amount of

time spent on each web page, and the length of time spent

visiting different parts of the site.37 Another tool is Globally

Unique Identifiers (‘GUIDs’) which are alphanumeric identi-

fiers for the unique installation of software. Such devices

similarly yield information about the user, in terms of the

software or other files created or downloaded on the user’s

hard drive. By utilizing ‘cookies’ commercial websites can

collect personal information about visitors to particular

30 Ibid.
31 Ibid.
32 For example, actress Rebecca Shaeffer was shot and murdered when a lunatic fan acquired her address through the motor vehicle records held by the State of California. See Davis, W. Kent (1997) Drivers’ Licenses: Comply with the Provisions of the Federal Driver’s Privacy Protection Act; Provide Strict Guidelines for the Release of Personal Information from Drivers’ Licenses and Other Records of the Department of Public Safety, 14 Ga. St. U. L. Rev. 196. Similarly, pro-choice workers are often targeted for violence by anti-abortion activists who take down their license plate numbers and find their home addresses through the registries held by state motor vehicle departments. See Estes, Andrea (28 February 1997) Feds Probe Abortion Foes’ Mailing; Planned Parenthood Workers Were Targeted at Their Homes, Boston Herald, at 1. In addition, IRS workers have repeatedly been found to sell confidential tax data. See Hershey Jr., Robert D. (9 April 1997) Snooping by I.R.S. Employees Has Not Stopped, Report Finds, N.Y. Times at A16.
33 Solove, Daniel J. (2002) Digital Dossiers and the Dissipation of Fourth Amendment Privacy, 75 S. Cal. L. Rev. 1083, citing Raymond, Margaret (1998) Rejecting Totalitarianism: Translating the Guarantees of Constitutional Criminal Procedure, 76 N.C. L. Rev. 1193, noting that throughout history, totalitarian governments have instilled fear by creating elaborate systems for collecting data about people’s private lives.
34 Reidenberg, Joel (1992) Privacy in the Information Economy: A Fortress or Frontier for Individual Rights?, 44 Fed. Comm. L.J. 195.
35 Shen, Andrew (visited 2012) Online Profiling Project, EPIC, at http://www.epic.org/privacy/internet/Online_Profiling_Workshop.PDF citing an article from The Economist (14 September 1997, editorial) in which Jeff Bezos, CEO of Amazon.com, describes Amazon as an ‘information broker’, acting as the connection between consumers looking for books and publishers looking for consumers; according to Bezos, Amazon’s vast database of customers’ preferences and buying patterns is tied to their e-mail and postal addresses; Murray, Alan (19 July 1999) Net Effect: Is Service Getting Too Personal?, Wall St. J. at A1: “The next wave of Internet innovation is in the area of personalized marketing and services. Companies such as Amazon.com are eagerly assembling and sorting massive amounts of information on customer preferences. Their aim is to know what book, record or other product you want before you know it, and then market it directly to you.”
36 A ‘cookie’ is a small file of codes that is dispatched to a user’s computer when a web page is viewed. The site puts an identification mark in the file, and the cookie is stored on the user’s hard drive. When the user visits the site again, the site locates the cookie and matches the file code with information previously collected about the user’s surfing activity. While privacy advocates object to the use of cookies, the problem with banning them is that they have practical uses other than secretly collecting information about surfing activity. They can store passwords, for example, which speeds access to frequently used websites. See generally, Mayer-Schönberger, V (1997) The Internet and Privacy Legislation: Cookies for a Treat?, 1 W. Va. J.L. & Tech. 1; Eichelberger, L (1998) The Cookie Controversy, Cookie Central, at http://www.cookiecentral.com/ccstory/cc2.htm.
37 Solove (2002) supra.


websites. Cookies essentially consist of small data files sent to

the user’s browser when she visits a site, and typically include

the IP address of the user’s online provider, the type of the

user’s browser, and the user’s operating system. Cookies may

also include data that the user furnished, such as the user’s

name and email address. The devices are mainly used by

marketers as follows:

• to target advertising to users who have previously visited

related sites, and thus are presumed to have an interest in

related merchandise.

• cookies may also be used for marketing purposes by

tracking users’ buying habits and preferences.

• cookie technology enables commercial websites and

advertisers to deliver individually tailored banner adver-

tisements to their browsers.

The cookies device, when used without the aid of other

data sources, generally enables the website server to gather

data about the user without ascertaining the user’s identity.

When a user also registers with the site, such as by furnishing

his name and other information, the registration information

may be associated with the cookie to personally identify the

user to the host server.
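
To make the mechanism described above (and in footnote 36) concrete, the following is a minimal Python sketch of a server that brands a first-time browser with an identifying cookie and recognises it on return visits. It is illustrative only: the cookie name ‘site_visitor’, the port and the messages are invented for this example and are not drawn from any site discussed in this article.

```python
# Minimal sketch (illustrative only) of the cookie mechanism described above.
# Names such as "site_visitor" and the port are invented for this example.
from http.server import BaseHTTPRequestHandler, HTTPServer
from http.cookies import SimpleCookie
import uuid

class TrackingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        cookie = SimpleCookie(self.headers.get("Cookie", ""))
        self.send_response(200)
        if "site_visitor" in cookie:
            # Returning visitor: the stored identifier lets the server link this
            # request to everything previously recorded about the same browser.
            visitor_id = cookie["site_visitor"].value
            body = f"Welcome back, visitor {visitor_id}\n".encode()
        else:
            # First visit: brand the browser with an identifier that it will
            # send back automatically on every later request to this site.
            visitor_id = uuid.uuid4().hex
            body = b"First visit: cookie set\n"
            self.send_header("Set-Cookie", f"site_visitor={visitor_id}; Path=/")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)
        # A real site would now log visitor_id against the page requested,
        # the time, the browser type, and so on.

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), TrackingHandler).serve_forever()
```

Everything the server later logs against that identifier (pages requested, times, browser type) becomes part of the profile discussed below.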

To make this information more useful, the website might

connect the ‘clickstream’ data to particular Internet users.38

This can be done by either requiring users to register or

branding them with cookies that will report identifying

information back to the website the next time the user visits.39

Using either method, the site can compile a profile of indi-

vidual interests, concerns, and general web surfing habits.

Savvy online marketing firms can even draw inferences about

how we respond to web page presentations. For example, an

online travel service could keep track of every destination to

which a person requested a fare or every city in which hotel

information was sought. A medical information site could

track the number of times a user linked to pages providing

information on osteopathic remedies. Clickstream data can

thus reveal lots of useful marketing information about all who

use the Internet.40
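
As a rough illustration of how such clickstream records might be turned into a marketing profile, the sketch below aggregates page-view events by category, weighted by time on page. The event fields, categories and values are invented for this example; real profiling systems are far more elaborate.

```python
# Sketch of clickstream profiling: each page view is an event, and events are
# aggregated into a per-visitor interest profile. All field names, categories
# and values are invented for illustration.
from collections import Counter, defaultdict

clickstream = [
    {"visitor": "abc123", "url": "/fares/paris", "category": "travel", "seconds": 40},
    {"visitor": "abc123", "url": "/hotels/paris", "category": "travel", "seconds": 95},
    {"visitor": "abc123", "url": "/health/osteopathy", "category": "health", "seconds": 210},
    {"visitor": "def456", "url": "/fares/tokyo", "category": "travel", "seconds": 30},
]

profiles = defaultdict(Counter)
for event in clickstream:
    # Weight each category by time spent on the page, a crude proxy for interest.
    profiles[event["visitor"]][event["category"]] += event["seconds"]

for visitor, interests in profiles.items():
    print(visitor, interests.most_common())
# abc123 [('health', 210), ('travel', 135)]
# def456 [('travel', 30)]
```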

Another form of user-tracking technology is the ‘web bug,’

also known as a web beacon or clear graphic image file (GIF)

tag. Web bugs are image files secretly imbedded in a web page

and are invisible to the person browsing the page.41 The bug

sends information about the user’s browsing habits and

interaction with the page back to the home server. Internet

advertisers also can capture the search terms a person uses to

find websites on a subject of interest. The process, known as

‘banner ad leakage,’ allows an advertiser to record search

terms as the user submits them to the search engine.42 Banner

ad leakage allows the advertiser to collect an enormous

amount of potential marketing data and to tailor ads to the

user’s specific interests more quickly and accurately than

cookie technology would permit.43 The information cookies

provide is in fact shared by thousands of websites through

advertising-network companies. The biggest of these, Dou-

bleClick, has agreements with over 11,000 websites and

maintains cookies on 100 million users. These can be linked to

hundreds of pieces of information about each user’s browsing

behaviour. In addition, users can be tracked through other

methods by Internet service providers, website hosts and

email services.
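
The following sketch shows, in simplified form, how a web bug of the kind described in footnote 41 operates: an invisible one-pixel image is embedded in a page or HTML e-mail, and the request for that image hands the tracking server the reader's IP address, browser type, cookies and, via the Referer header, the page or search query that led there. The domain names, query parameters and port are invented for illustration.

```python
# Sketch of a "web bug": an invisible one-pixel image whose retrieval reports
# the reader's details back to a tracking server. Domain names, query
# parameters and the port are invented for illustration.
from urllib.parse import parse_qs

html_email = """
<p>Monthly newsletter ...</p>
<img src="https://tracker.example/bug.gif?msg=newsletter-42&user=abc123"
     width="1" height="1" alt="">
"""

def tracker_app(environ, start_response):
    # Minimal WSGI app standing in for the advertising network's server.
    record = {
        "ip": environ.get("REMOTE_ADDR"),
        "browser": environ.get("HTTP_USER_AGENT"),
        "cookie": environ.get("HTTP_COOKIE"),
        # The Referer header can leak the page, or even the search query,
        # that brought the user here ("banner ad leakage").
        "referer": environ.get("HTTP_REFERER"),
        "params": parse_qs(environ.get("QUERY_STRING", "")),
    }
    print("web bug hit:", record)   # a real tracker would store this record
    start_response("200 OK", [("Content-Type", "image/gif")])
    return [b"GIF89a"]  # placeholder bytes; a real bug returns a valid 1x1 GIF

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("localhost", 8001, tracker_app).serve_forever()
```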

Recently there have also been serious questions raised as

to the retention of data by search engines. It is not widely

known that browsers can have security bugs which allow

hackers and website operators to access a user’s personal

information. Usually these bugs are attended to as soon as

they are discovered; however, it is up to the user to keep up to

date with the software. Software agents pose concerns qual-

itatively different to cookies. Agents are programs acting on

behalf of a person or organisation instigating them. They

scour the Internet to perform ‘information gathering or

processing … in the background’.44 Acting for their creator,

they seek out information about others (acting as a data

collector) while, at the same time as they are acting for the

principal, they can use personal information the principal has

invested in them. An example would be an agent bidding for

airline tickets. Such an agent is used by a consumer to make

bids. When collecting information the agent would be acting

as a data collector on behalf of the consumer principal.

Conversely, airlines that optimise prices by using personal

data from users to establish demand level at any given point

38 Berman, Jerry & Mulligan, Deirdre (1999) Privacy in the Digital Age: Work in Progress, 23 Nova L. Rev. 551, explaining “clickstream” data are series of detailed transactional information that improve targeted online advertising; some firms, such as Adfinity, combine clickstream data, or ‘mouse-droppings’, with personal information collected from other sources to create profiles of a person’s Web browsing behaviour.
39 Kennedy, John B. & Meade, Mathew H. (2001) Privacy Policies and Fair Information Practices: A Look at Current Issues Regarding Online Consumer Privacy and Business Practices, 632A PLI/Pat 321; Nehf, James (2003) Recognizing the Societal Value in Information Privacy, 78 Wash. L. Rev. 1.
40 Solove (2002) supra.
41 A web bug is invisible because it is only one pixel square in size and blends into the background on a web page or HTML e-mail message. The only way to detect a web bug is to locate the source code for the web page or e-mail message and discover that the web bug image comes from a different server than everything else. The server sending the bug might belong to an advertising network that uses it to obtain information, including the Internet Protocol (IP) address of the computer that accepted the web bug, the URL of the page on which the web bug appears, the time the web bug was viewed, the type of browser that accepted the web bug image, and any previously set cookie data. (The cookie can link the bug and the information it has obtained back to the online profile associated with that cookie.) Web bugs are common in HTML e-mail and are used to tell if an e-mail has been read or forwarded to another person. See Smith, R (visited 2012) FAQ: Web Bugs, at http://www.privacyfoundation.org/resources/webbug.asp; O’Harrow Jr., Robert (13 November 1999) Fearing a Plague of ‘Web Bugs’: Invisible Fact-Gathering Code Raises Privacy Concerns, Wash. Post at E01.
42 Berghel, H (2001) Caustic Cookies, 44(5) Communications of the ACM 19; Berghel, H (2002) Hijacking the Web, 45(4) Communications of the ACM 23. For an excellent legal description of the technical aspects of cookies see In re Doubleclick Inc. Privacy Litigation, 2001 U.S. Dist. Lexis 3498 at [16].
43 Ibid.
44 Federal Trade Commission (2000) Online Profiling: A Report to Congress at http://www.ftc.gov/os/2000/06/onlineprofilingreportjune2000.pdf.


in the market time frame would also be acting as data

collectors.45 Collector and use roles are more complex than for

a cookie where a consumer has a passive role in supplying

data. Agents can make decisions and go to sites unknown to

the principal.
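
A minimal sketch of such an agent is given below: it carries personal data invested in it by its principal, queries a hypothetical airline fare service, and in doing so both collects data for the principal and discloses data about the principal to the airline. The endpoint, query parameters and response fields are assumptions made for this example only.

```python
# Sketch of a software agent acting for a principal while also gathering data.
# The endpoint, query parameters and response fields are assumptions made for
# this example; no real airline API is being described.
import json
import urllib.request

class FareAgent:
    def __init__(self, traveller_profile):
        # Personal information the principal has "invested" in the agent.
        self.profile = traveller_profile

    def check_fares(self, origin, destination):
        query = f"origin={origin}&dest={destination}&pax={self.profile['passengers']}"
        url = f"https://fares.example/api/search?{query}"   # hypothetical service
        with urllib.request.urlopen(url) as resp:           # agent as data collector
            offers = json.load(resp)
        # The airline, in turn, can log the query itself (route, party size,
        # time of request) to estimate demand: data collection runs both ways.
        return [o for o in offers if o["price"] <= self.profile["budget"]]

agent = FareAgent({"passengers": 2, "budget": 400})
# affordable = agent.check_fares("SYD", "MEL")   # would call the hypothetical API
```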

Australia has made an effort to combat this problem with

the Spam Act 2003 (Cth) whose primary purpose is to prevent

unsolicited electronic messages. However, it also protects

privacy. First, it prohibits the use of address-harvesting soft-

ware.46 These are computer programs designed to scour the

Internet for electronic addresses. Their prohibition consti-

tutes a privacy protection because it limits the use of personal

information for purposes other than those for which consent

has been given. Second, the Act requires that express or

inferred consent be given for the sending of electronic

messages to a user.47 Again, this gives the user control over

how their information is used.

The increasing functionality of the Internet is decreasing

the role of the personal computer.48 This shift is being led by

the growth of ‘cloud computing’ or the ability to run applica-

tions and store data on a service provider’s computers over

the Internet, rather than on a person’s desktop computer.49

Scott McNealy, the Chairman and former CEO of Sun Micro-

systems, caused an uproar in 1999 when he dismissed online

privacy concerns and proclaimed, “You have zero privacy

anyway. Get over it.”50 Was he right? Within the realm of

cloud computing, he may have been uncomfortably close to

the truth.

4. Privacy and cloud computing

Cloud computing is the delivery of computing as a service

rather than a product, whereby shared resources, software,

and information are provided to computers and other devices

as a metered service over a network (typically the Internet).51

Computing clouds provide computation, software, data

access, and storage resources without requiring cloud users to

know the location and other details of the computing infra-

structure. End users access cloud based applications through

a web browser or a lightweight desktop or mobile app while the

business software and data are stored on servers at a remote

location. At the foundation of cloud computing is the broader

concept of infrastructure convergence (or Converged Infra-

structure) and shared services. Cloud infrastructure services,

also known as ‘infrastructure as a service’ (IaaS), deliver

computer infrastructure – typically a platform virtualization

environment – as a service, along with raw (block) storage and

networking. Rather than purchasing servers, software, data-

centre space or network equipment, clients instead buy

those resources as a fully outsourced service.52
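
As one concrete illustration (not drawn from the article), the short sketch below uses the AWS SDK for Python (boto3) to rent a virtual machine and place a client file on the provider's storage service; the machine image, bucket name and object key are placeholders, and configured provider credentials are assumed. The point is simply that a few API calls replace purchased hardware, and the customer has no ready way of knowing where the data physically resides.

```python
# Illustrative sketch only: a few SDK calls stand in for purchased hardware.
# Uses the AWS SDK for Python (boto3); the image ID, bucket name and object
# key are placeholders, and configured AWS credentials are assumed.
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-2")
s3 = boto3.client("s3", region_name="ap-southeast-2")

# "Rent" a virtual server: infrastructure as a service, no data centre needed.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

# Store a client file on the provider's servers rather than a local disk.
s3.put_object(
    Bucket="example-client-records",   # placeholder bucket name
    Key="clients/records-2013.csv",
    Body=b"name,matter,notes\n",
)
```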

Cloud computing is posing significant challenges to legal

adaptation.53 Because cloud computing is so new and still

developing, little research has been done in the area. This

technological innovation forces the need for legal innovation

in two ways. First, it creates an entirely different mode of

communication, which has led to universal surveillance and

infrastructural imperialism. The question here is: How should

legal rules change to accommodate the new communication

technology? If cloud computing does not alter our funda-

mental values, how should legal rules adapt and change in

order to maintain our current values? What should the

substance of our rules be in light of the changing environment

for the actors in the cloud?

Second, cloud computing allows communication at even

greater speed than has been possible.54 The issue here is:

Which mechanism and method for legal change is more suited

to respond quickly in this new environment, recognizing that

some existing mechanisms and methods of legal adaptation

simply cannot operate at such speed? Police using a horse and

buggy cannot match nor catch a speeding car, let alone a plane.

They are chasing rockets now. An examination of the funda-

mental policies underlying the law is needed, as is the adoption

of a new format for changing and enforcing law.55

Karl Popper said that while all new scientific theories

change at least parts of former theories, showing that the

former theories are either wrong or incomplete, new theories

encompass the (partial) truth of the theories they contradict.56

Thus, Einstein’s theory of relativity differs from Newton’s

theory by showing that it is not true in certain environments,

yet subsumes Newton’s theory for the limited environment as

to which it is true. In law, the relationships between new and

45 For use of intelligent agents see Morris, J, Rae, P and Maes, P (2000) Sardine: Dynamic Seller Strategies in an Auction Marketplace, Proceedings of the Conference on Electronic Commerce (EC ’00) at Minneapolis.
46 Spam Act 2003 (Cth) ss 19–22.
47 Spam Act 2003 (Cth) s 16.
48 Robison, W (2010) Free at What Cost?: Cloud Computing Privacy Under the (US) Stored Communications Act, 98 Geo. L.J. 1195.
49 Ibid.
50 Schwartz, John (5 September 2001) As Big PC Brother Watches, Users Encounter Frustration, N.Y. Times at C6.
51 National Institute of Standards and Technology (2012) available at http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf.
52 See, e.g., Rearden LLC v. Rearden Commerce, 597 F. Supp. 2d 1006, 1021 (N.D. Cal. 2009) (finding that cloud computing is “a term used to describe a software-as-a-service (SAAS) platform for the online delivery of products and services”); FTC Complaint of Electronic Privacy Information Center at 4, In re Google, Inc. & Cloud Computing Servs. (17 March 2009) (“Cloud Computing Services are an emerging network architecture by which data and applications reside on third party servers, managed by private firms, that provide remote access through web-based devices.”), available at http://epic.org/privacy/cloudcomputing/googleiftc031709.pdf; Gellman, R (2009) World Privacy Forum, Privacy in the Clouds: Risks to Privacy and Confidentiality from Cloud Computing: “[C]loud computing involves the sharing or storage by users of their own information on remote servers owned or operated by others and accessed through the Internet or other connections.”; Posting of Bob Boorstin to Google Public Policy Blog (20 March 2009) What policymakers should know about cloud computing available at http://googlepublicpolicy.blogspot.com/2009/03/what-policymakers-should-know-about.html (defining cloud computing as “the movement of computer applications and data storage from the desktop to remote servers”).
53 Lastowka, Greg (2008) Google’s Law, 73 Brooklyn L. Rev. 1327.
54 Gates, supra.
55 Goldstein, J, Kahler, M, Keohane, R & Slaughter, A (2001) Legalization and World Politics, Cambridge, MA: MIT Press.
56 Popper, Karl (1996) The Myth of The Framework, London: Routledge.


traditional policies and rules seem different than in science.

Law can be tested by norms of right and wrong as well as by

truth and falsity. Theoretically, norm setting seems to allow

lawmakers more discretion to change existing laws than

scientific theorizers would have; lawmakers can introduce

new fundamental policies and values that fully trump and

deviate from their predecessors rather than subsume them.

Yet, in reality, the way in which the law changes is

astoundingly similar to the way in which new theories in

science are fashioned. Most new legal rules and underlying

policies conflict with parts of their predecessors but contain

and reaffirm part of their predecessors. Generally, like most

new scientific theories, new adaptive laws subsume most

prior laws and only ‘tweak’ them in certain areas.57

There can be a number of reasons for the conservative

attitude of lawmakers to adapting and modifying law. In fact,

these reasons are similar (and some are identical) to the reasons

for the doctrine of stare decisis. New laws and regulations can

be risky and costly to both the regulators and the regulated.

They impose learning costs on the regulators, legal profession,

the regulated and the public. Cloud computing marks a new

stage in the cultural divide between the self-regulating world of

technology and the command-and-control world of govern-

ment. Can the technology industry limit the size and influence

of government or will privacy laws prevail to curb their exten-

sive influence? What is happening to all of the private infor-

mation that is being utilized to make all of this possible?

5. How does Australia respond?

Although privacy is broadly recognized as a concern for the

development of novel interactive technologies, our ability to

reason analytically about privacy in real settings is limited. A

lack of conceptual interpretive frameworks makes it difficult

to unpack interrelated privacy issues in settings where infor-

mation technology is also present. As noted earlier, the Privacy

Act 1988 (Cth) protects personal information.58 Your personal

hypermedia system’s database should, therefore, be protect-

able. However, there is currently no statutory action for

invasion of privacy in any Australian jurisdiction and there is

scant common law, with no appellate court recognising a tort

of invasion of privacy. In Australia it has long been held that

absent a confidential relationship or a breach of contract, the

common law holds that a person ‘has no obligation towards

the other’ in regards to privacy.59 However, a recent

case that held that corporations have no right to privacy

regarding personal information has perhaps opened the door

slightly for recognition of an individual’s personal privacy.60

The court saw privacy as ‘being a principle protecting the

interests of the individual in leading, to some reasonable

extent, a secluded and private life’.61

However, where an action for breach of confidence lies, the

courts have recognised a right to privacy. In other words,

where one person imparts information to another in confi-

dence and the latter uses that information for her own

purposes or discloses it to third parties without permission,

then an action for breach of confidence may be available to the

person who provided the information. The classic case is

Prince Albert v Strange (1849) 64 ER 293 (Chancery Division),

in which Prince Albert took some etchings done by his

wife, Queen Victoria, to be printed. An employee of the printer

sold copies to the defendant, who advertised a public exhibi-

tion of the etchings and produced a catalogue giving details of

them. The court held that the defendant was enjoined from

displaying the etchings or distributing the catalogue in breach

of confidence. Although the defendant was a third party to the

plaintiff, he was aware that the information came to him as

the result of an abuse of equitable obligation. The law in

Australia relating to confidential information is drawn from

several bases: chiefly equity and contract. Under the terms of

a contract, rights may be derived that give rise to an action for

breach of an express or implied term of confidentiality, or for

breach of confidence. In addition, there is also some implicit

recognition of the right to privacy in the law of torts, repre-

sented in actions such as defamation.

The absence of common law development in this area can be traced

back to 1937, when the High Court found in Victoria Park

Racing and Recreation Grounds Co Ltd v Taylor (1937) 58 CLR 479,

that breach of privacy was not recognised in Australian law.

This precedent was maintained by Australian courts until

2001 when the High Court, in Australian Broadcasting Corpora-

tion v Lenah Game Meats Pty Ltd, (2001) 208 CLR 199, departed

from it, clearly indicating that the decision in Victoria Park at

[107] does not stand in the path of the development of a cause

of action for invasion of privacy.

In Lenah Game Meats, the High Court’s decision suggested

two possible bases on which equity may intervene to enjoin

publication of non-confidential material obtained by trespass.

1. The equitable action for breach of confidence could be

extended to apply to the protection of privacy interests, or

a new equitable basis developed for the protection of

private material upon analogy with the action for breach of

confidence, or

2. By means of a constructive trust. On this basis, the occupier

of private premises would acquire a beneficial interest in

material recorded there if there was an unlawful entry. The

constructive trust would arise either in relation to the

copyright in the material or in the tangible property in which

the material is embodied, including copies of the material.62

As Lindsay points out, if Australian law was to develop in

this direction, it would confer an extremely high level of

protection against electronic intrusions on activities con-

ducted on private premises, equating to the protection given to

private premises via the law of trespass. Such a development

57 Frankel, Tamar (1998) The Internet, Securities Regulation, and Theory of Law, 73 Chi-Kent L. Rev. 1319.
58 Privacy Act 1988 (Cth) s 6(1).
59 Victoria Park Racing and Recreation Grounds Company Limited v Taylor (1936) 37 SR (NSW) 322 at [330].
60 Australian Broadcasting Corporation v Lenah Game Meats Pty Ltd [2001] HCA 63 per Gummow and Hayne JJ.
61 Ibid.
62 Lindsay, D (2002) Playing possum? Privacy, Freedom of Speech and the Media Following ABC v Lenah Game Meats Pty Ltd: Part II: The Future of Australian Privacy and Free Speech Law and the Implications for the Media, 7 Media & Arts Law Review 3.


could potentially pose problems in relation to freedom of

information and expression. To summarise, the decision in this

case is generally seen as an instance of Australian law’s failure

to address the question concerning the extent to which the law

should protect privacy in circumstances that are neither

confidential nor defamatory.63

Unfortunately, the High Court did not determine whether

a cause of action exists, nor has it clearly articulated what the

scope of such a cause of action might be. Since the High Court

considered this case, the common law has remained unde-

veloped. Only two lower court cases, Grosse v Purvis, [2003]

QDC 151 and Doe v Australian Broadcasting Corporation, [2007]

VCC 281, have expressly recognised a common law right to an

action for invasion of privacy.

Legislative action has been necessary given the lack of

specific recognition of the right of privacy by the common law.

Indeed, much of this legislation is a direct response to Aus-

tralia’s international obligations to protect privacy, particu-

larly to protect information privacy. This recognition has been

effected both by State, Territory and federal legislation. In 1986,

the federal parliament enacted the Human Rights and Equal

Opportunity Commission Act 1986, which established the Human

Rights and Equal Opportunity Commission (‘HREOC’).64 This

body has a responsibility to monitor Australia’s observance of

the rights guaranteed by the International Covenant on Civil

and Political Rights (amongst others), which includes the right

to privacy in Art 17. However, the HREOC Act does not include

any specific reference to a right of privacy which is enforceable

within Australian domestic law. Nonetheless, the Act does

establish the right of a citizen to make a complaint to the

HREOC about a breach of their right of privacy, as it would

amount to a breach of Art 17 of the ICCPR. The complaint,

however, would not lead to any enforceable remedy in favour

of the person complaining. The Commonwealth has enacted

a federal Privacy Act 1988 which regulates privacy in the public

and private sectors and enacts the Information Privacy

Principles.65 The Act also creates the position of the Privacy

Commissioner, who has the function of monitoring the

protection of information privacy.

In Review of Australian Privacy Law, the ALRC noted the

major concern about individuals handling private information

but chose not to propose an extension to the Privacy Act 1988

(Cth) which would regulate individuals acting in a

non-commercial capacity.66 Permanence of personal information

on the Internet was also considered a major concern. Google

Australia suggested a privacy take-down notice scheme

modelled on the copyright infringement take-down scheme in

the Copyright Act 1968 (Cth).67 Instead, the ALRC proposed

a statutory cause of action for serious invasion of privacy.68

They felt that a take-down scheme would require a decision

maker to balance the right of freedom of expression against

the right to individual privacy. This determination was more

appropriately made by a court than by a regulator.69
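
To make the shape of such a scheme concrete, the short Python sketch below imagines the record a content host might receive under a copyright-style privacy take-down regime and the crude balancing that any automated or regulatory triage would have to perform. Every class, field, weighting and threshold here is invented purely for illustration; nothing in it reflects Google Australia's submission, the ALRC's proposal or any statutory text.

from dataclasses import dataclass

# Purely hypothetical sketch of a copyright-style privacy take-down request
# and a first-pass triage. All names, fields and weights are invented.

@dataclass
class TakeDownRequest:
    complainant: str        # person whose privacy is allegedly invaded
    material_url: str       # location of the contested material
    public_figure: bool     # bears on the public-interest side of the balance
    public_interest: int    # 0-10: asserted newsworthiness of the material
    privacy_harm: int       # 0-10: asserted seriousness of the intrusion

def triage(request: TakeDownRequest) -> str:
    """Mechanically weigh privacy harm against public interest."""
    score = request.privacy_harm - request.public_interest
    if request.public_figure:
        score -= 2  # hypothetical discount where public scrutiny is expected
    if score >= 3:
        return "remove pending review (privacy interest clearly dominant)"
    if score <= -3:
        return "decline (freedom of expression clearly dominant)"
    return "escalate for judicial determination (interests finely balanced)"

# Example: a private individual filmed inside their own home.
request = TakeDownRequest(
    complainant="A. Citizen",
    material_url="https://example.com/clip",
    public_figure=False,
    public_interest=2,
    privacy_harm=8,
)
print(triage(request))  # prints the "remove pending review" branch

The point of the sketch is the final branch: wherever the two interests are finely balanced, a mechanical rule can only defer the question, which is precisely why the ALRC regarded a court, rather than a regulator, as the appropriate decision maker.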

Nonetheless, these suggestions are still merely suggestions.

Due to this lack of real-world protection, one feels the

need to look to digital social norms. This new digital society

challenges how we see the world and derive value from it.

Return to Altman’s definition of privacy as a dialectic and

dynamic process.70 “As a dialectic process, privacy regulation

is conditioned by our own expectations and experiences, and

those of others with whom we interact. As a dynamic process,

privacy is understood to be under continuous negotiation and

management, with the boundary that distinguishes privacy

and publicity refined according to circumstances.”71 Digital

civil society is developing its own methods of privacy

management, which strike a balance among individuals and

groups and between technical and social entities.

6. Conclusion

Another obstacle to strengthening online privacy protections

is the changing societal attitude towards online privacy.72

Security is a process, not a product.73 Younger generations

have much less concern about online privacy than older

generations.74 This divergence is partially attributable to the

different ways that each generation uses the Internet. Older

users generally rely on the Internet for transactional

63 Ibid.
64 This Commission replaced the Human Rights Commission, which had no specific function in relation to privacy.
65 Privacy Amendment (Enhancing Privacy Protection) Bill 2012 (Cth). The 10 National Privacy Principles are located in Schedule 3 and cover the areas of: Principle 1 – Collection; Principle 2 – Use and disclosure; Principle 3 – Data quality; Principle 4 – Data security; Principle 5 – Openness; Principle 6 – Access and correction; Principle 7 – Identifiers; Principle 8 – Anonymity; Principle 9 – Transborder data flows; Principle 10 – Sensitive information. An excellent summary can be found at Office of the Privacy Commissioner, National Privacy Principles (extracted from the Privacy Amendment (Private Sector) Act 2000) http://www.privacy.gov.au/publications/npps01.html.
66 Australian Law Reform Commission, Australian Privacy Law and Practice, Report 108, Review of Australian Privacy Law: Section 74, Recommendation 74.1.
67 Australian Law Reform Commission, Australian Privacy Law and Practice, Report 108, Review of Australian Privacy Law: Section 11, Recommendations 11.10–11.13.
68 Australian Law Reform Commission, Australian Privacy Law and Practice, Report 108, Review of Australian Privacy Law: Section 74, Recommendation 74.1.
69 Australian Law Reform Commission, Australian Privacy Law and Practice, Report 108, Review of Australian Privacy Law: Section 11, Recommendation 11.23.
70 Palen & Dourish, supra.
71 Ibid.
72 Economist (30 January 2010) A Special Report on Social Networking: Privacy 2.0 at 12–13 (summarizing recent comments by Mark Zuckerberg, the Chief Executive Officer of Facebook, arguing “that social norms ha[ve] shifted and that people ha[ve] become willing to share information about themselves more widely”).
73 Smedinghoff, T (2008) “Defining the Legal Standard for Information Security” in Chander, A., et al. (eds) Securing Privacy in the Internet Age, Berkeley, CA: Stanford University Press.
74 Palfrey, J & Gasser, U (2008) Born Digital: Understanding the First Generation of Digital Natives, New York: Basic Books. “Digital Natives, who live so much of their lives in networked publics, are unlikely to come to see privacy in the same terms that previous generations have, by and large.” They define a member of the ‘Digital Natives’ generation as a “person born into the digital age (after 1980) who has access to networked digital technologies and strong computer skills and knowledge.”


encounters, such as gathering information from websites,

exchanging direct communications via e-mail, managing

personal finances, and purchasing goods.75 In contrast,

younger users are more likely to embrace the Internet’s

interconnectedness and convenience by participating in

social networking, sharing digital content, and using cloud

services.76 The generational differences in Internet usage are

shifting societal calculations about the value of online privacy.

Privacy involves a trade-off with other competing values,

such as cost, convenience, efficiency, and networking.77 The

widespread use of cloud computing services by younger

generations is driven extensively by these latter values.

Popular social networking sites, such as Facebook and

MySpace, necessarily involve the public (or semi-public) sharing

of personal information and content with a network of other

users.78 For users of these services, the value of networking and

communicating with others outweighs the intangible costs to

their personal privacy.79 Older Internet users have fewer

incentives to bargain away their privacy. Furthermore, older

users may have a better appreciation for privacy’s benefits and

the consequences that might follow from allowing too much

personal information to circulate in the digital realm.80

Security consists of an ongoing process of identifying

threats and vulnerabilities and taking appropriate responses.

A firewall or a password is not a one-size-fits-all solution. The

process of establishing privacy security must be

multidimensional. However, the price of privacy security should

not be the loss of innovation or inordinate constraints on

business. The generational gap in privacy expectations and the

embracing of free services from cloud providers suggest little

opportunity to generate societal momentum for greater online

privacy protections. Younger generations are less concerned

with personal privacy than older generations and are likely to

carry those views forward as they gradually assume society’s

reins in the future. The expanding business model of

exchanging privacy for free access to cloud providers’

offerings will also continue to reduce the perceived market price of

individual privacy. Thus, the likelihood of building a societal

consensus about the need for heightened online privacy

protections is gradually slipping away.
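
To give the earlier point that security is a process rather than a product a concrete form, the minimal Python sketch below models one pass of a recurring, multi-layered assess-and-respond cycle. The layers, example findings and responses are all hypothetical and are not drawn from any real product, standard or statute; the sketch only illustrates that protection comes from repeating the cycle across several dimensions, not from any single control.

import datetime

# Hypothetical sketch: security as a repeating assess-and-respond cycle
# across several layers, rather than a single product such as a firewall
# or a password. All layers, findings and responses are invented examples.

LAYERS = ["network", "application", "data handling", "staff awareness"]

def assess(layer):
    """Stand-in for a real assessment; returns example findings per layer."""
    example_findings = {
        "network": ["unpatched gateway"],
        "application": [],
        "data handling": ["personal data retained beyond stated period"],
        "staff awareness": ["phishing training overdue"],
    }
    return example_findings.get(layer, [])

def respond(layer, finding):
    """Record a proportionate response for a single finding."""
    return layer + ": mitigate '" + finding + "' and schedule re-assessment"

def review_cycle():
    """One pass of the continuous cycle: identify vulnerabilities in every
    layer, then decide what to do about each of them."""
    actions = []
    for layer in LAYERS:
        for finding in assess(layer):
            actions.append(respond(layer, finding))
    return actions

if __name__ == "__main__":
    print("Review run", datetime.date.today())
    for action in review_cycle():
        print(" -", action)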

Dr Angela Adrian ([email protected]), Senior Lecturer,

School of Law & Justice, Southern Cross University, East Lismore,

NSW, Australia.

75 Robison, supra.
76 Zittrain, J (2008) The Future of the Internet and How to Stop It, New Haven, CT: Yale University Press. “People [born after 1985] routinely set up pages on social networking sites – in the United States, more than 85 per cent of university students are said to have an entry on Facebook – and they impart reams of photographs, views, and status reports about their lives, updated to the minute.”
77 Cate, F (1997) Privacy in the Information Age, Washington, D.C.: Brookings Institution Press, arguing that “privacy conflicts with important values, including society’s interest in free expression, preventing and punishing crime, protection of private property, and the efficient operation of government.”
78 Zittrain, supra; Moreno v. Hanford Sentinel, Inc., 91 Cal. Rptr. 3d 858, 862–63 (Ct. App. 2009) (noting that an individual had no reasonable expectation of privacy when she posted material on MySpace, even if she “expected a limited audience,” because the material is “opened . . . to the public at large” and the “potential audience was vast”).
79 Asay, M. (5 November 2009) Google Privacy Controls: Most People Won’t Care, CNET at http://news.cnet.com/8301-13505_3-10390456-16.html: “[F]or all our hand-wringing over privacy – and for good reason – the reality is that most of us, most of the time, really don’t care. Or, rather, if accessing useful services or getting work done more efficiently requires some privacy concessions, we gladly concede.”
80 Palfrey and Gasser, supra.
