
Social Trust and Cyber-trust

Denise Anthony, Sociology

ISTS

Outline

• What is trust?
• Why is trust relevant for cyber-security?
  – Problems of trust
    • Periods of social change
    • Collective goods
  – Internet: new institutional environment
• Defining trust
• Trust online: 2 experimental studies
  – Trust in exchange
  – Trust in distributed groups
• Implications for trustable systems and reliable networks

What is trust?

Sociology of Trust

Bob asks to borrow $10 from Alice.

If Alice trusts Bob…
• Alice expects Bob to repay the $10
• Alice lends $10 to Bob
• Alice risks losing $10 (or more); whether Alice gets the $10 back depends on Bob’s behavior
• Alice does not know for sure if or when Bob will repay the $10 (though trust makes Alice feel certain)

Sociology of Trust

• Expectations by one actor about another actor’s (future) behavior

• Three-part relation: A trusts B to do X (Hardin 1991, 2000)

Sociology of Trust

When A trusts B to do X (e.g., repay $10), A takes action Y (e.g., lending $10):

1. A is vulnerable
   • Risk of losing Y (or more)
   • A’s outcome depends on B’s behavior
2. A is uncertain about B doing X (an information problem)
   • Reliability: will B do X? For A?
   • Capability: can B do X?

Sociology of Trust

Bob asks to borrow $10 from Alice. If Alice trusts Bob…
• Alice expects Bob to repay the $10
• Alice lends $10 to Bob
• Vulnerability: Alice risks losing $10 (or more); whether Alice gets the $10 back depends on Bob’s behavior
• Uncertainty: Alice does not know for sure that Bob will repay (though trust makes Alice feel certain)
  – Capability: can Bob repay? Does Bob have an income?
  – Reliability: will Bob repay? (For Alice?)

Why is trust relevant for cyber-security?

Trust (and related mechanisms) is necessary* for cooperation and exchange under conditions of uncertainty and vulnerability.

* Strictly speaking, trust is not necessary, but it is sufficient, for cooperation/exchange.

Social Change = problems of trust

• Industrial Revolution
  – Demographic shifts: immigration, movement to cities
    • Interaction with unknown individuals
  – New forms of organization: factories (wages), bureaucracy
  – New mechanisms to facilitate exchange (Zucker, Shapiro): a new type of trust, beyond interpersonal trust
    • Credit scoring by banks
    • Licensing, regulation, etc.

Collective Goods = problem of trust/coop

• Common-pool resource systems (Ostrom 1990)
  – Fisheries, water resources
  – Tragedy of the commons (Hardin 1968)
• Collective action problem (Olson 1965)
  – Produce a collective good all value (e.g., clean air)
  – Free-rider problem: common interests ≠ collective action
  – Collective action requires selective incentives

Collective Goods = problems of trust

• Collective/public goods
  – Non-rival
    • My use does not impede your use
    • Fuel (rival) vs. a newspaper (non-rival)
  – Non-excludable
    • Once produced, all can access
    • Clean air, live music, roadways
  – Cannot be produced by an individual
    • Costly (space travel) or impossible (group discussion)

Why is trust relevant for cyber-security?

• The Internet is a new environment
  – Communication, exchange, cooperation
    • eBay, Amazon, Facebook, MySpace, Wikipedia
• Cannot rely on existing mechanisms for reliable interaction
  – Who is trustworthy?
  – How do you know whom you are interacting with?
  – What signals reliability? Capability?
• Evidence of problems, fraud, crime

Why is trust relevant for cyber-security?

The Internet is a collective good
• Costly (impossible?) to produce individually
• Infrastructure owned/maintained by many diverse private and public entities
• Vast resource investments in the infrastructure by diverse public and private entities (states, corporations, organizations, individuals)
• Non-rival: given expanding bandwidth, my use does not inhibit your use
• Non-excludable (more or less): if the network is open, it is available to all
• Individual actions affect the integrity of the system
  – Viruses; unauthorized access; non-reporting of problems
• Cooperation is necessary to ensure integrity and deal with problems

T4T Experimental Study

Trust on the Internet
• What information matters for “trusting” online vendors regarding secure transactions?
• How can uncertainty about “B” doing “X” be reduced?
  – A is uncertain about B doing X (an information problem)
    • Reliability: will B do X? For A?
    • Capability: can B do X?

Three types of Trust

1. Interpersonal trust (Fine, Gambetta, Hardin)

Trust in a specific actor, based on reliability:
• Past experience
• Relationship and/or social ties
• Reputation via social networks

(Capability of B to do X is assumed to be 100%)

Three types of Trust

2. Institutional trust (Zucker, Heimer)

Third-party ‘assurance’ mechanisms (Yamagishi) assure capability, reliability, or both.

Capability:
• Certification
• Licensing / accreditation
• Audits
• Organizational position, role, situation

Reliability:
• Incentives for B to do X: laws, contracts, insurance
• BBB seal: past behavior
• Reputation: history

Types of Trust: Dimensions of Information for Reducing Uncertainty in Trust Dilemmas

CONTENT of information × SOURCE of information

Reliability (motivation)
• Interpersonal source: direct experience; reputation (history of past reliable behavior); assurance mechanisms (norms, threat of peer sanctions)
• Institutional source: record of past behavior; assurance mechanisms (contracts, laws, criminal and civil penalties)

Capability
• Interpersonal source: observed evidence of ability; reputation (performance history)
• Institutional source: licensure and accreditation bodies; certification

T4T Experimental Study

Institutional trust on the Internet
• What information matters for “trusting” online vendors regarding secure transactions?
  – Content: reliability or capability?
    • History: reputation systems; feedback
    • Capability: certifications; technical systems; oversight
  – Source: institutional third party or other consumers?
    • Independent third party, non-profit
    • Consumer ratings from customers
• Subjects make purchase decisions from a series of vendors on a simulated website: What’sThePrice.com
• For each vendor, subjects decide whether or not to make a purchase at a given price (EXIT game)
  – Content and source of information about the vendor
  – Other factors: price of good, rating of vendor
• Not real purchases or actual products

T4T Experimental Study

Price information about the item for sale:
1) Vendor Suggested Price: the vendor claims that this price is the fair market value.
2) Actual Value Range: a verified estimate that the value of the item is within this range.

Vendor rating based on varying information:
INFORMATION ON THE VENDOR, rated on a scale from 1 (LOW) to 5 (HIGH)

CONTENT: Capability versus Reliability
Vendor has the capability to conduct secure online transactions vs. vendor has a history of conducting secure online transactions.

SOURCE: Peers versus Institutional Third Party
Information about the vendor (capability vs. reliability) comes from peers (other consumers) versus an institutionalized third party.

The four conditions:
• Reliability × Institutional 3rd party: Center for Online Purchase Reporting, www.COPR.org (“Your independent source for reliable information!”)
• Reliability × Peers: BuyReliable.org (“Reliable information from consumers like you!”)
• Capability × Institutional 3rd party: Center for Secure Online Transactions, www.CSOT.org (“Your source for independent security information!”)
• Capability × Peers: BuySecure.org (“Use the power of consumer feedback for online security!”)

T4T Experimental Setting

• Between-subjects design (R1, n=73 subjects) [and within-subjects design (R2, n=61)]
  – Subjects paid $5–20 (mean = $12); 12 minutes
• Additional factors (the full set of conditions is sketched below)
  – Price of item: cheap ($15) vs. expensive ($88)
  – Quality rating of vendor on a 1–5 scale: low (3) vs. medium (4) vs. high (5)
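To make the factorial structure concrete, here is a minimal Python sketch that enumerates the vendor conditions implied by the design above. It is illustrative only (not part of the study materials); the labels and variable names are assumptions drawn from the slides.

    # Enumerate the vendor conditions: information CONTENT x SOURCE x item price x vendor rating.
    from itertools import product

    contents = ["reliability", "capability"]           # CONTENT of vendor information
    sources = ["institutional_3rd_party", "peers"]     # SOURCE of vendor information
    prices = [15, 88]                                  # cheap vs. expensive item
    ratings = [3, 4, 5]                                # low, medium, high vendor rating

    conditions = list(product(contents, sources, prices, ratings))
    print(len(conditions), "distinct vendor conditions")   # 2 * 2 * 2 * 3 = 24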

T4T Experimental Setting

• R1: between-subjects design
  – 12 rounds: 73 × 12 = 876 observations
  – 68% women (n=50); 60% white (n=44)
• Influence of information
  – CONTENT (reliability vs. capability) and
  – SOURCE (institutional 3rd party vs. consumer rating)
  – Controlling for price, rating level, and individual characteristics
  – On making a purchase (i.e., trusting the vendor); a model sketch appears below
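A minimal sketch of the kind of analysis described above: a logistic regression of the purchase decision on information CONTENT and SOURCE, controlling for price and vendor rating. This is an illustrative reconstruction, not the authors' code; the file name and column names (purchase, capability, institutional_source, expensive, rating) are hypothetical placeholders for the factors named in the slides.

    # Logit model of purchase (1 = buy, 0 = exit) on the experimental factors.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("t4t_r1_observations.csv")   # hypothetical file: 876 purchase decisions

    model = smf.logit(
        "purchase ~ capability + institutional_source + expensive + C(rating)",
        data=df,
    ).fit()
    print(model.summary())

Individual characteristics (e.g., gender, race) could be added as further covariates in the same formula.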

T4T Exit Game R1

[Figure: likelihood of purchase (0–1) by rating of vendor (low, medium, high), for cheap vs. expensive items. Rating p<.01; Price p<.01; Price × Rating p<.05]

Role of Information Content (Vendors Rated 4 or 5)

[Figure: likelihood of purchase (0.5–1) by information content (reliability vs. capability), for cheap vs. expensive items. Price p<.01; Content n.s.; Price × Reliability n.s.]

Role of Information Source (Vendors Rated 4 or 5)

[Figure: likelihood of purchase (0.5–1) by information source (institutional source vs. consumers), for cheap vs. expensive items. Price p<.01; Source p<.05; Price × Institution n.s.]

Implications for Technology and Trust

• Users want ‘assurance’ that the system is trustworthy
  – Third-party assurance, not other consumers
  – No difference between capability and reliability
• Many users already ‘trust’ the infrastructure
  – Rely on the reputation of the company
  – Familiarity with the system increases ‘trust’
  – Expectation that the technology is secure

Limitations of the Experiment

• Other aspects of the vendor
• More information about actual products
• Still to do:
  – More subject characteristics (e.g., experience)
  – Within-subjects comparisons

Three types of Trust

3. System-level trust (Giddens)
   – Multiple, overlapping mechanisms:
     • Institutional assurance mechanisms: laws, regulations, contracts
     • Institutional organizations and roles: professional groups, accrediting agencies
     • Economic incentives and reputation
     • Social norms and cultural values: situational expectations, assumptions
     • Experience: individual knowledge and experience

PLACE: Privacy in Location-Aware Computing Environments Study

New IT enables:
• Distributed groups and shared resources (commons)
  – Wikis
  – Sensor networks in community spaces
• How to ensure/manage privacy and security?
  – Sensors in a room, but actors have different preferences for privacy
  – Group wiki with private information

PLACE Experimental Study

1. Does an individual’s own privacy behavior affect behavior toward group privacy?

2. Do people use others’ privacy behavior as a signal of trustworthiness? (i.e., does others’ privacy affect behavior toward group privacy?)

PLACE Experimental Study

• Members of geographically distributed work teams
• Secure project wiki with valuable information
  – Rewards for finishing the project and maintaining the password
  – Incentives to sell the password
• Subjects have information regarding:
  – Their own privacy
  – Teammates’ privacy
• 6 rounds (different team configurations); in each, subjects decide whether or not to sell the password (n = 110 × 6 = 660 observations)

PLACE Experimental Study

• Subject privacy level
  – Based on questions regarding privacy practices (locking doors; Facebook practices; willingness to share private information)
  – Rated as Private, Moderate, or Open
• Teammates’ privacy
  – Paired with 2 different teammates in each round; each teammate’s privacy level rated as Private, Moderate, or Open

Personal Privacy and Trustworthiness

Privacy behavior   % willing to sell password
Private            48.1
Moderate           40.2
Open               35.6

F = 2.56, p<.10; Bonferroni: Private > Open, p<.10

Impact of others’ privacy on % willing to sell

Teammate 3 \ Teammate 2   Open   Moderate   Private
Open                      61%    46%        ---
Moderate                  ---    39%        ---
Private                   47%    33%        21%

P-values: Teammate 2 p<.001; Teammate 3 p<.001.
Logistic regression model, adjusted for own privacy level and size of the incentive to sell, with robust standard errors (a model sketch appears below).
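A minimal sketch of the kind of model described in the table note: a logistic regression of the sell/keep decision on teammates’ privacy levels, adjusted for the subject’s own privacy level and incentive size, with robust standard errors. This is an illustrative reconstruction, not the study’s code; the file name and column names are hypothetical placeholders.

    # Logit model of selling the password (1 = sell, 0 = keep) on teammates' privacy,
    # adjusting for own privacy and incentive size; heteroskedasticity-robust SEs.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("place_rounds.csv")   # hypothetical file: 660 subject-round decisions

    model = smf.logit(
        "sell_password ~ C(teammate1_privacy) + C(teammate2_privacy)"
        " + C(own_privacy) + incentive_size",
        data=df,
    ).fit(cov_type="HC1")                  # robust standard errors
    print(model.summary())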

Interaction of Subject Privacy and Teammates’ Privacy

[Figure: % willing to sell the password (0–60%) when teammates are Open vs. Private, plotted separately for subjects with Open vs. Private privacy preferences]

Implications for Technology and Trust

• Users’ own privacy preferences matter for group privacy behavior
  – More private, more likely to sell
• Others’ privacy preferences affect trust
  – More private, more trusted
• Interaction between subjects’ privacy and teammates’ privacy
  – Private users are seen by all as more trustworthy
  – Private users are less trustworthy than Open users
  – Private users distrust teammates much more than Open users do
• Users will use privacy preferences as “signals” of trustworthy behavior in the group
• BUT these signals are not associated with actual behavior
• Managing privacy in an online group/commons may be more difficult than expected
• Social context matters as much as (or more than) technology


The Case of Wikipedia

• Wikipedia is a collective good
• What motivates contributions?
  – Collective identity; selective incentives (reputation; sanctions)
• What are the implications of motivations for contributing to Wikipedia for the nature of its content?
  – Number of contributions a contributor makes
  – Quality of content
    • Survivability: the extent of a contributor’s content retained in Wikipedia (sketched below)
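As a rough illustration of how survivability could be operationalized, the sketch below computes the share of a contributor’s added tokens that still appear in a later revision. This is an assumed token-retention measure for illustration, not necessarily the study’s exact algorithm.

    # Survivability sketch: fraction of contributed tokens retained in a later revision.
    def reliability(contributed_tokens, later_revision):
        """Return the share of contributed tokens still present in the later revision text."""
        later_tokens = set(later_revision.split())
        if not contributed_tokens:
            return 0.0
        retained = sum(1 for tok in contributed_tokens if tok in later_tokens)
        return retained / len(contributed_tokens)

    # Example: 3 of 4 contributed words survive, so survivability = 0.75.
    print(reliability(["trust", "requires", "vulnerability", "risk"],
                      "trust requires vulnerability and uncertainty"))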

Wikipedia contributor motivations and content

1. Registered users will make more contributions than non-registered users.
2. Registered users with many contributions will have higher reliability than:
   a) Registered users with fewer contributions
   b) Non-registered (anonymous) users
4. Anonymous users will contribute less content per edit than registered users.
5. Most anonymous users will contribute one time only.
7. Reliability will decrease with the number of contributions for anonymous users.

Table 1. Population and Sample of Wikipedia Contributors by User Type and Language

Language                Registered   Anonymous    Total
French     Population        5,690      48,211   53,901
           Sample            1,763       1,729    3,492
Dutch      Population        2,895      30,322   33,217
           Sample            1,819       1,747    3,566
Total      Population        8,585      78,533   87,118
           Sample            3,582       3,476    7,058

Wikipedia Contribution Characteristics by Type of User (unweighted)

Measure                  Registered User   Anonymous User   F (df = 1, 7,056)
Reliability              70.3 (28.4)       74.0** (29.5)    29.7**
Log edits                1.9** (1.4)       0.60 (0.83)      2,058.0**
Log contribution size    6.9** (2.3)       4.5 (2.1)        1,955**
Log article size         7.8 (1.1)         7.8 (1.3)        0.89
French language          .49 (.50)         .50 (.50)        0.19

Reliability by Wikipedia Contributors

[Figure: reliability (0.65–0.75) for contributors with 1 edit vs. 2+ edits; p<.01 and p<.10]

Reliability of Anonymous versus Registered Users by Number of Contributions

[Figure: quality (% retained, .66–.76) by number of contributions (log edits, 0–4), plotted separately for registered and anonymous users]

The Case of Wikipedia

• What are the implications of motivations for contributing to Wikipedia for the nature of its content?
  – Number of contributions
    • Most anonymous users contribute once
  – Quality of content: reliability (the extent of a contributor’s content retained in Wikipedia)
    • Reliability increases with the number of contributions for registered users
    • Reliability decreases with the number of contributions for anonymous users
    • Good Samaritans (anonymous one-time contributors) have the highest reliability

Implications

• Wikipedia can provide high-quality information
• The Internet enables open-source production
  – Critical mass of contributors
  – Quantity affects quality
• The Internet plus collective action mechanisms
• Other goods…
