CS526: Information Security Chris Clifton October 7, 2003 Other Policy Models

Page 1

CS526: Information Security
Chris Clifton

October 7, 2003

Other Policy Models

Page 2

2

Tranquility

• Classification changes make things difficult
  – Declassification violates properties
  – What about increasing the classification of an object?

• Principle of Strong Tranquility
  – Security levels do not change

• Principle of Weak Tranquility
  – Security level changes cannot violate policy

Page 3

3

Follow-on Work: McLean

• Problems with Bell-LaPadula
• Basically, Bell-LaPadula is trivial
  – Definitions capture the policy
  – The only interesting part is showing the induction

• McLean proposed a very similar policy
  – Provably bad
  – But it is not easy to see why it is not okay by Bell-LaPadula

• Key: axiomatic vs. "models the real world" definitions of security

• Read the discussion

Page 4

5

Integrity Policy

• Principles:
  – Separation of duty: a single person can't mess up the system
    • No coding on the live system
  – Separation of function
    • No development on production data
  – Auditing
    • Controlled/audited process for updating code on the production system

• This enables validated code to maintain integrity
  – But how do we ensure we've accomplished these?
  – Is this overkill?

Page 5

6

Biba’s Integrity Policy Model

• Based on Bell-LaPadula
  – Subjects, objects
  – Ordered set of integrity levels
    • Higher levels are more reliable/trustworthy

• Information transfer path: a sequence of subjects and objects where
  – si r oi
  – si w oi+1

Page 6

7

Policies

• Ring Policy
  – s r o (any subject may read any object)
  – s w o ⇒ i(o) ≤ i(s)
  – s1 x s2 ⇒ i(s2) ≤ i(s1)

• Low-Water-Mark Policy
  – s r o ⇒ i′(s) = min(i(s), i(o))
  – s w o ⇒ i(o) ≤ i(s)
  – s1 x s2 ⇒ i(s2) ≤ i(s1)

• Biba's Model: Strict Integrity Policy
  – s r o ⇒ i(s) ≤ i(o)
  – s w o ⇒ i(o) ≤ i(s)
  – s1 x s2 ⇒ i(s2) ≤ i(s1)

• Theorem for induction similar to Bell-LaPadula
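The Strict Integrity Policy rules above can be written as three small predicates. This is a minimal illustrative sketch only, with integrity levels modeled as integers (higher = more trustworthy); the function names are not part of the model:

```python
# Sketch of Biba's Strict Integrity Policy checks (illustrative only).
# Integrity levels are integers; higher means more trustworthy.

def can_read(subject_level: int, object_level: int) -> bool:
    """Strict integrity: s may read o only if i(s) <= i(o) (no read down)."""
    return subject_level <= object_level

def can_write(subject_level: int, object_level: int) -> bool:
    """Strict integrity: s may write o only if i(o) <= i(s) (no write up)."""
    return object_level <= subject_level

def can_execute(s1_level: int, s2_level: int) -> bool:
    """s1 may invoke s2 only if i(s2) <= i(s1)."""
    return s2_level <= s1_level

# A low-integrity subject can read, but not write, a high-integrity object:
print(can_read(1, 3))   # True
print(can_write(1, 3))  # False
```

Note how the read rule is the dual of Bell-LaPadula's: integrity flows down, not up.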

Page 7

8

Lipner: Integrity Matrix

• Security Levels
  – Audit: AM
    • Audit/management functions
  – System Low: SL
    • Everything else

• Categories
  – Development (D)
  – Production Code (PC)
  – Production Data (PD)
  – System Development (SD)
  – Software Tools (T)

• Not related to sensitive/protected data

• Follow Bell-LaPadula security properties

Page 8

9

Lipner: Integrity Matrix

• Users:
  – Ordinary (SL, {PC, PD})
  – Developers (SL, {D, T})
  – System Programmers (SL, {SD, T})
  – Managers (AM, {D, PC, PD, SD, T})
  – Controllers (SL, {D, PC, PD, SD, T})

• Objects:
  – Development code/data (SL, {D, T})
  – Production code (SL, {PC})
  – Production data (SL, {PC, PD})
  – Tools (SL, {T})
  – System programs (SL, ∅)
  – System program updates (SL, {SD, T})
  – Logs (AM, {…})
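Since the matrix follows the Bell-LaPadula properties, whether one label dominates another is the usual rule: higher-or-equal level and a superset of categories. A minimal sketch, where the `(level, categories)` tuple encoding is an assumption made for illustration:

```python
# Hypothetical encoding of Lipner labels as (level, set of categories).
# AM dominates SL, so levels map to integers for comparison.

LEVELS = {"SL": 0, "AM": 1}

def dominates(a, b):
    """a dominates b iff level(a) >= level(b) and categories(a) ⊇ categories(b)."""
    (la, ca), (lb, cb) = a, b
    return LEVELS[la] >= LEVELS[lb] and set(cb) <= set(ca)

controllers = ("SL", {"D", "PC", "PD", "SD", "T"})
production_code = ("SL", {"PC"})

print(dominates(controllers, production_code))  # True
print(dominates(production_code, controllers))  # False
```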

Page 9

10

Clark/Wilson

• Transaction based
  – State before/after the transaction

• Consistency definitions
  – What states of the system are acceptable

• Well-Formed Transaction
  – State before transaction consistent ⇒ state after transaction consistent

• Components
  – Constrained Data Items (CDIs)
  – Unconstrained Data Items (UDIs)
  – Integrity Verification Procedures (IVPs)
  – Transformation Procedures (TPs)

Page 10

11

Clark/Wilson:Certification Rules

• When any IVP is run, it must ensure all CDIs are in a valid state

• A TP transforms a set of CDIs from a valid state to another valid state
  – Must have no effect on CDIs not in the set

• Relations between (user, TP, {CDI}) must support separation of duty

• All TPs must log undo information to an append-only CDI

• A TP taking a UDI as input must either reject it or transform it into a CDI
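The IVP/TP interplay can be sketched with a toy account system: the balances are CDIs, the IVP checks the consistency invariant, and a transfer is a well-formed TP because it moves a valid state to a valid state. The names `ivp_balanced` and `tp_transfer` are illustrative, not part of the model:

```python
# Toy Clark/Wilson sketch: CDIs are account balances, and the
# consistency condition is that balances always sum to a fixed total.

accounts = {"a": 100, "b": 50}   # the CDIs
TOTAL = 150                      # the consistency invariant

def ivp_balanced(cdis) -> bool:
    """IVP: the CDI set is in a valid state iff balances sum to TOTAL."""
    return sum(cdis.values()) == TOTAL

def tp_transfer(cdis, src, dst, amount):
    """A well-formed TP: checks validity before and after the transformation."""
    assert ivp_balanced(cdis)
    cdis[src] -= amount
    cdis[dst] += amount
    assert ivp_balanced(cdis)

tp_transfer(accounts, "a", "b", 30)
print(accounts)  # {'a': 70, 'b': 80}
```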

Page 11

12

Clark/Wilson:Enforcement Rules

• System must maintain certified relations– TP/CDI sets enforced

• System must control users– user/TP/CDI mappings enforced

• Users must be authenticated to execute TP

• Only certifier of a TP may change associated CDI set

Page 12

13

Chinese Wall Model

• Supports confidentiality and integrity
• Models conflict of interest
  – Object sets: company datasets (CD)
  – Conflict of interest sets (COI)

• Principle: information can't flow between items in a COI set
  – S can read O ⇔ one of the following holds:
    • ∃O′ ∈ PreviousRead(S) such that CD(O′) = CD(O)
    • ∀O′ ∈ PR(S): COI(O′) ≠ COI(O), or
    • O has been "sanitized"

See reading for more details
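The unsanitized part of the read rule can be sketched directly: a subject may read from a company dataset if it has already read from that dataset, or if no previously read dataset is in the same conflict-of-interest class. The company names and `COI` table below are made up for illustration:

```python
# Sketch of the Chinese Wall read rule (unsanitized objects only).
# COI maps each company dataset (CD) to its conflict-of-interest class;
# pr is PR(S), the set of CDs subject S has previously read from.

COI = {"BankA": "banks", "BankB": "banks", "OilX": "oil"}

def can_read(pr: set, cd: str) -> bool:
    """S may read an object in cd if cd was already read, or no
    previously read CD shares cd's COI class."""
    if cd in pr:
        return True
    return all(COI[p] != COI[cd] for p in pr)

pr = {"BankA"}
print(can_read(pr, "BankB"))  # False: same COI class as BankA
print(can_read(pr, "OilX"))   # True: different COI class
```

Note the rule is history-based: what S may read depends on what S has read before, unlike Bell-LaPadula's static labels.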

Page 13

14

Domain-specific Policy Models

• Military Confidentiality– Bell-LaPadula

• Database Integrity– Clark/Wilson

• Corporate Anti-Trust– Chinese Wall

• Clinical Information Systems

• Others?

Page 14

16

Problem: Consistent Policies

• Policies defined by different organizations
  – Different needs
  – But sometimes subjects/objects overlap

• Can all policies be met?
  – Different categories
    • Build a lattice combining them
  – Different security levels
    • Need to be levels, thus must be able to order them

Page 15

17

What is Consistent?

• Principle of autonomy:
  – Access allowed by the security policy of a component must be allowed by the composition

• Principle of security:
  – Access denied by the security policy of a component must be denied by the composition

• Must prove the new "composed" policy meets these principles

Page 16

18

Interference

• Expanded notion of "write"
  – Noninterference takes a "single subject" view of the system
  – Any evidence of another subject acting corresponds to a write

• Noninterference definition, A,G :| G′:
  – G, G′ ⊆ subjects, A ⊆ C (the commands); ∀cs ∈ C*, s ∈ G′:
    • proj(s, cs, σi) = proj(s, πG,A(cs), σi)

• Security policy: a set of noninterference assertions
  – Example: How do you prevent write-down?

Page 17

19

Key Theorem: Unwinding

• Induction property for interference: for policy r, the system is noninterference-secure if it is:
  – Output consistent: the output seen under r, for subjects not granted rights under r, for any command c, is the same for any initial states that appear the same to that subject
  – Transition consistent: if the views of two states presented to subjects under r are the same, then the views of the system states after the same command is applied to both are the same
  – Locally respects r: the view of the state after a command is the same as the view before the command

Page 18

CS526: Information Security
Chris Clifton

October 9, 2003

Information Flow

Page 19

21

What is the point? Information Flow

• Policy governs the flow of information
  – How do we ensure information flows only through governed channels?

• State transition attempts to capture this
  – We may return to this later

• Next: How do we measure/capture flow?
  – Entropy-based analysis
    • Change in entropy ⇒ flow
  – Confinement
    • "Cells" that information does not leave
  – Language/compiler-based mechanisms?
    • E.g., work of Steve Zdancewic
  – Guards

Page 20

22

Information Flow

• Information flow: where information can move in the system

• How does this relate to confidentiality policy?
  – Confidentiality: what subjects can see what objects
  – Flow: controls what subjects actually see

• Variable x holds information classified S
  – x, the information flow class of x, is S

• Confidentiality specifies what is allowed
• Information flow describes how this is enforced

Page 21

23

Formal Definition

• Problem: capturing all information flow
  – Files
  – Memory
  – Page faults
  – CPU use
  – ?

• Definition: based on entropy
  – Flow from x to y (times s to t) if H(xs | yt) < H(xs | ys)

Page 22

24

What is Entropy?

• Idea: entropy captures uncertainty
  – H(X) = -Σj P(X=xj) lg P(X=xj)

• Entropy of a coin flip
  – H(X) = -Σj∈{heads,tails} P(X=xj) lg P(X=xj)
  – = -(P(heads) lg P(heads) + P(tails) lg P(tails))
  – = -(.5 lg .5 + .5 lg .5) = -(.5 × -1 + .5 × -1) = 1
  – Complete uncertainty!

• Conditional entropy:
  – H(X|Y) = -Σj P(Y=yj) [Σi P(X=xi|Y=yj) lg P(X=xi|Y=yj)]
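The coin-flip calculation above is easy to check numerically with base-2 logarithms:

```python
# Numerical check of the entropy formula H(X) = -sum_j p_j lg p_j.
import math

def entropy(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0  (a fair coin: complete uncertainty)
print(entropy([1.0]))       # 0.0  (a certain outcome: no uncertainty)
print(entropy([0.9, 0.1]))  # ~0.469 (a biased coin: less uncertainty)
```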

Page 23

25

Formal Definition

• Flow from x to y if H(xs | yt) < H(xs | ys)
  – -Σj P(yt=yj) [Σi P(xs=xi | yt=yj) lg P(xs=xi | yt=yj)] <
    -Σj P(ys=yj) [Σi P(xs=xi | ys=yj) lg P(xs=xi | ys=yj)]

• Has the uncertainty of xs gone down from knowing yt?
• Examples showing possible flow from x to y:
  – y := x
    • No uncertainty: H(x|y) = 0
  – y := x / z
    • Greater uncertainty (we only know x for some values of y)
  – Why only "possible"?
  – Does information flow from y to x?

• What if ys is not defined?
  – Flow if H(xs | yt) < H(xs)
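The two examples can be checked empirically by estimating H(x|y) from samples. As a stand-in for `y := x / z`, the sketch below uses integer division by a constant (an assumption for illustration), which likewise leaves residual uncertainty about x:

```python
# Empirical conditional entropy H(X|Y) from equally likely (x, y) samples.
import math
from collections import Counter

def cond_entropy(pairs):
    """H(X|Y) = -sum p(x,y) lg p(x|y), estimated from a sample list."""
    n = len(pairs)
    y_counts = Counter(y for _, y in pairs)
    h = 0.0
    for (x, y), c in Counter(pairs).items():
        p_xy = c / n               # joint probability of (x, y)
        p_x_given_y = c / y_counts[y]
        h -= p_xy * math.log2(p_x_given_y)
    return h

# y := x with x uniform on {0,1,2,3}: observing y removes all uncertainty.
print(cond_entropy([(x, x) for x in range(4)]))       # 0.0

# y := x // 2: each y value is consistent with two x values,
# so one bit of uncertainty about x remains.
print(cond_entropy([(x, x // 2) for x in range(4)]))  # 1.0
```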

Page 24

26

Implicit flow

• Implicit flow: flow of information without an assignment

• Example:
  – if (x = 1) then y := 0 else y := 1

• This is why the entropy definition is necessary!

Page 25

27

How do we Manage Information Flow?

• Information flow policy
  – Captures security levels
  – Often based on confinement
  – Principles: reflexivity, transitivity

• Compiler-based mechanisms
  – Track potential flow
  – Enforce legality of flows

• Execution-based mechanisms
  – Track flow at runtime
  – Validate that it is correct

Page 26

28

Confinement Flow Model

• (I, O, confine, →)
  – I = (SCI, ≤I, joinI): lattice-based policy
  – O: set of entities
  – → ⊆ O × O indicates possible flows
  – confine(o) = (oL, oU) ∈ SCI × SCI: allowed flow levels

• Security requirement: ∀a, b ∈ O: a → b ⇒ aL ≤I bU

• Similar definitions are possible for more general levels
  – Non-lattice
  – Non-transitive
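The security requirement can be sketched as a one-line check. In this illustration the lattice SCI is simplified to integers, and the `confine` table is made up:

```python
# Sketch of the confinement flow security requirement: each entity o has
# an allowed interval confine(o) = (o_L, o_U) of security classes, and a
# flow a -> b is permitted only if a_L <= b_U.

confine = {"a": (1, 3), "b": (2, 4), "c": (0, 0)}

def flow_ok(a: str, b: str) -> bool:
    a_low, _ = confine[a]
    _, b_high = confine[b]
    return a_low <= b_high

print(flow_ok("a", "b"))  # True:  1 <= 4
print(flow_ok("b", "c"))  # False: 2 > 0
```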

Page 27

29

Compiler Mechanisms

• Declaration approach
  – x: integer class { A, B }
  – Specifies what security classes of information are allowed in x
  – Function parameter: class = class of the argument
  – Function result: class = join of the parameter classes
    • Unless the function is verified to be stricter

• Rules for statements
  – Assignment: the LHS must be able to receive all classes in the RHS
  – Conditional/iterator: the then/else branches must be able to contain the class of the if condition
  – Composition

• Verifying that a program is secure becomes type checking!
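The assignment rule can be phrased as a tiny "type check". The variable/class table below is a made-up example, with classes as sets of labels:

```python
# Sketch of the compiler assignment rule: lhs := f(rhs_vars) is legal
# only if the union of the RHS classes is a subset of the LHS's classes.

classes = {"x": {"A", "B"}, "y": {"A"}, "z": {"A", "B", "C"}}

def assignment_ok(lhs: str, rhs_vars: list) -> bool:
    """Check whether lhs can receive every class appearing on the RHS."""
    needed = set().union(*(classes[v] for v in rhs_vars))
    return needed <= classes[lhs]

print(assignment_ok("z", ["x", "y"]))  # True:  {A,B} fits in {A,B,C}
print(assignment_ok("y", ["x"]))       # False: y cannot hold class B
```

Because the check depends only on declarations, it runs entirely at compile time, which is exactly why verification reduces to type checking.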

Page 28

30

Execution Mechanisms

• Problem with compiler-based mechanisms
  – May be too strict
  – Valid executions not allowed

• Solution: run-time checking
• Difficulty: implicit flows
  – if x = 1 then y := 0;
  – When x = 2, does information flow to y?

• Solution: data mark machine
  – Tag variables
  – Tag the program counter (PC)
  – Any branching statement affects the PC security level
    • The effect ends when "non-branched" execution resumes

Page 29

31

Data Mark: Example

• Statement involving only variables x
  – If PC ≤ x then execute the statement

• Conditional involving x:
  – Push PC, set PC = lub(PC, x), execute the body
  – When done with the conditional statement, pop PC

• Call: push PC
• Return: pop PC
• Halt:
  – If the stack is empty then halt execution

Page 30

33

Flow Control: Specialized Processor

• Security Pipeline Interface
  – Independent entity that checks flow
  – Could this manage confidentiality?
  – Useful for integrity!

Page 31

CS526: Information Security
Chris Clifton

October 16, 2003

Covert Channels

Page 32

35

Confinement

• Confinement Problem– Prevent a server from leaking confidential information

• Covert Channel– Path of communication not designed as

communication path

• Transitive Confinement– If a confined process invokes a second process,

invokee must be as confined as invoker

Page 33

36

Isolation

• Virtual machine
  – Simulates the hardware of an (abstract?) machine
  – Process confined to the virtual machine
    • The simulator ensures confinement to the VM
  – Real example: IBM VM/SP
    • Each user gets "their own" IBM 370

• Sandbox
  – Environment where actions are restricted to those allowed by policy

Page 34

37

Covert Channels

• Storage channel
  – Uses an attribute of a shared resource

• Timing channel
  – Uses a temporal/ordering relationship of accesses to a shared resource

• Noise in covert channels
  – Noiseless: the resource is available only to sender/receiver
  – Noisy: other subjects can affect the resource

Page 35

38

Modeling Covert Channels

• Noninterference
  – Bell-LaPadula approach
  – All shared resources modeled as subjects/objects
  – Let σ ∈ Σ range over states. The system is noninterference-secure for s at level l(s) if ∃ ≡ ⊆ Σ × Σ such that:
    • σ1 ≡ σ2 ⇒ view(σ1) = view(σ2)
    • σ1 ≡ σ2 ⇒ execution(i, σ1) ≡ execution(i, σ2)
    • If i contains only instructions from subjects dominating s, then view(execution(i, σ)) = view(σ)

• Information flow analysis
  – Again, model all shared resources

Page 36

39

Covert Channel Mitigation

• Can covert channels be eliminated?
  – Eliminate the shared resource?

• Severely limit flexibility in using the resource
  – Otherwise we get the halting problem
  – Example: assign a fixed time for use of the resource
    • Closes the timing channel

• Not always realistic
  – Do we really need to close every channel?

Page 37

40

Covert Channel Analysis

• Solution: accept the covert channel
  – But analyze its capacity
    • How many bits/second can be "leaked"?

• Allows a cost/benefit tradeoff
  – The risk exists
  – But its limits are known

• Example: assume the data is time-critical
  – A ship's location is classified only until the next commercial satellite flies overhead
  – Can the covert channel transmit the location before this?
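The cost/benefit reasoning is simple arithmetic once a capacity has been measured. The numbers below are made up purely for illustration:

```python
# Back-of-the-envelope covert-channel capacity analysis
# (all figures are hypothetical).

position_bits = 48        # size of the ship's location data
capacity_bps = 0.01       # measured covert-channel capacity, bits/second

seconds_needed = position_bits / capacity_bps
print(seconds_needed / 60)  # minutes needed to leak the full location
```

If the location is declassified (the satellite flies overhead) well before the 80 minutes this leak would take, the channel may be an acceptable risk.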

Page 38

41

Example: Covert Channel Analysis