A Type System for Expressive Security Policies

David Walker, Cornell University
PoPL ’00
Jan 13, 2016
Extensible Systems
• Extensible systems are everywhere:
  – web browsers, extensible operating systems, servers and databases
• Critical Problem: Security

[Diagram: untrusted Code is downloaded, linked, and executed against the System Interface]
Certified Code
• Attach annotations (types, proofs, ...) to untrusted code
• Annotations make verification of security properties feasible
[Diagram: Untrusted Code → Download & Verify Annotations → Secure Code → Link & Execute against the System Interface]
Certifying Compilation
• An Advantage – Increased Trustworthiness
  • verification occurs after compilation
  • compiler bugs will not result in security holes
• A Disadvantage – Certificates may be difficult to produce
Producing Certified Code
[Diagram: High-level Program (+ user annotations) → Compile, Optimize → Annotated Program → Transmit]

• Certificate production must be automated
• Necessary components:
  1) a source-level programming language
  2) a compiler to compile, annotate, and optimize source programs
  3) a transmission language that certifies security properties
So Far ...

[Diagram: Type-Safe High-level Program (+ types) → Compile, Optimize → Typed Program → Transmit]

1) a strongly typed source-level programming language
2) a type-preserving compiler to compile, annotate, and optimize source programs
3) a transmission language that certifies type-safety properties
Examples

• Proof-Carrying Code [Necula & Lee]
  – compilers produce type safety proofs
• Typed Assembly Language [Morrisett, Walker, et al.]
  – guarantees type safety properties
• Efficient Code Certification [Kozen]
  – uses typing information to guarantee control-flow and memory safety properties
• Proof-Carrying Code [Appel & Felty]
  – construct types from low-level primitives
Conventional Type Safety
• Conventional types ensure basic safety:
  – basic operations performed correctly
  – abstraction/interfaces hide data representations and system code
• Conventional types don't describe complex policies
  – e.g.: policies that depend upon history
    • the Melissa virus reads the Outlook contacts list and then sends 50 emails
Security in Practice
• Security via code instrumentation
  – insert security state and check dynamically
  – use static analysis to minimize run-time overhead
  – SFI [Wahbe et al.], SASI [Erlingsson & Schneider], Naccio [Evans & Twyman], [Colcombet & Fradet], …
This Paper
• Combines two ideas:
  – certifying compilation
  – security via code instrumentation
• The Result:
  – a system for secure certified code
    • high-level security policy specifications
    • an automatic translation into low-level code
    • security enforced by static & dynamic checking
Strategy
• Security automata specify security properties [Erlingsson & Schneider]
• Compilation inserts typing annotations & dynamic checks where necessary
• A dependently-typed target language provides a framework for verification
  – can express & enforce any security automaton policy
  – provably sound
Security Architecture
[Diagram: a Security Automaton and a High-level Program are compiled, annotated, and optimized into a Secure Typed Program; after Transmit, it is Type Checked against a Secure Typed Interface and linked with the System Interface to produce a Secure Executable]
Security Automata

• A general mechanism for specifying security policies
• Enforce any safety property
  – access control policies: “cannot access file foo”
  – resource bound policies: “allocate no more than 1M of memory”
  – the Melissa policy: “no network send after file read”
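Each of these policies is an automaton whose state records the security-relevant history. As a minimal sketch (in Python rather than the paper's target language; `LIMIT` and `transition_alloc` are illustrative names, not from the paper), the resource bound policy can track bytes allocated so far:

```python
# Sketch: the "allocate no more than 1M of memory" policy as a security
# automaton. The state is the byte count allocated so far; exceeding the
# limit drives the automaton into the bad state.
LIMIT = 1_000_000   # illustrative 1M bound
BAD = "bad"

def transition_alloc(state, nbytes):
    """Transition on an allocation of nbytes; once bad, always bad."""
    if state == BAD or state + nbytes > LIMIT:
        return BAD
    return state + nbytes

state = 0                                   # start state: nothing allocated
state = transition_alloc(state, 600_000)    # ok, within the bound
state = transition_alloc(state, 600_000)    # would exceed 1M -> bad
```

Any monitor of this shape enforces a safety property: a violation is detected as soon as the prefix of the execution that causes it has occurred.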
Example
• Policy: no send operation after a read operation
• States: start, has read, bad
• Inputs (program operations): send, read
• Transitions (state × input -> state):
  – start × read(f) -> has read
  – start × send -> start
  – has read × read(f) -> has read
  – has read × send -> bad

[Diagram: start --read(f)--> has read; start --send--> start; has read --read(f)--> has read; has read --send--> bad]
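The automaton above can be sketched directly as a transition function (a Python illustration; `step` and the state/input strings are names chosen for the example, not the paper's syntax):

```python
# The no-send-after-read automaton from the slide.
START, HAS_READ, BAD = "start", "has read", "bad"

def step(state, op):
    """Transition function: state x input -> state."""
    if op == "read":
        return HAS_READ if state in (START, HAS_READ) else BAD
    if op == "send":
        return START if state == START else BAD
    return state  # security-irrelevant operations leave the state unchanged

# The slide's trace: send(); read(f); send()
trace = ["send", "read", "send"]
state = START
for op in trace:
    state = step(state, op)
# the third operation drives the automaton into the bad state
```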
Example Cont’d
[Diagram: start --read(f)--> has read; start --send--> start; has read --read(f)--> has read; has read --send--> bad]

% untrusted program    % s.a.: start state
send();                % ok -> start
read(f);               % ok -> has read
send();                % bad, security violation

• Security automata monitor program execution
• Entering the bad state = security violation
Enforcing S.A. Specs

• Every security-relevant operation has an associated function: check_op
• Trusted, provided by policy writer
• check_op implements the s.a. transition function

check_send(state) = if state = start then start else bad
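A runnable rendering of this check function (a Python sketch, not the paper's target language; `check_read` is an assumed analogue for read(f) that the slide does not show):

```python
# Check functions implementing the automaton's transition function,
# one per security-relevant operation.
START, HAS_READ, BAD = "start", "has read", "bad"

def check_send(state):
    """The slide's definition: send is ok (state stays start) only from start."""
    return START if state == START else BAD

def check_read(state):
    """Assumed analogue for read(f): any non-bad state moves to 'has read'."""
    return HAS_READ if state != BAD else BAD
```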
Enforcing S.A. Specs
• Rewrite programs: each call

  send()

becomes

  let next_state = check_send(current_state) in
  if next_state = bad then
    halt
  else  % next state is ok
    send()
Questions

• How do we verify instrumented code?
  – is this safe?

  let next_state = check_send(other_state) in
  if next_state = bad then
    halt
  else  % next state is ok
    send()

• Can we optimize certified code?
Verification

• Basic types ensure standard type safety
  – functions and data used as intended and cannot be confused
  – security checks can’t be circumvented
• Introduce a logic into the type system to express complex invariants
• Use the logic to encode the s.a. policy
• Use the logic to prove checks unnecessary
Target Language Types
• Predicates:
  – describe security states
  – describe automaton transitions
  – describe dependencies between values
• Function types include predicates so they can specify preconditions:
  – foo: ∀[α1, α2, P1(α1,α2), P2(α1)]. α1 -> α2
Secure Functions

• Each security-relevant function has a type specifying 3 additional preconditions
• e.g.: the send function:
  – P1: in_state(current_state)
  – P2: transition_send(current_state, next_state)
  – P3: next_state ≠ bad

  Pre: P1 & P2 & P3
  Post: in_state(next_state)

• The precondition ensures calling send won’t result in a security violation
Run-time Security Checks
• Dynamic checks propagate information into the type system
• e.g.: check_send(state)
  Post: ∃next_state. transition_send(state, next_state) & result = next_state
• conditional tests:

  if state = bad then
    % assume state = bad
    ...
  else
    % assume state ≠ bad
    ...
Example
% P1: in_state(current_state)
let next_state = check_send(current_state) in
% P2: transition_send(current_state, next_state)
if next_state = bad then
  halt
else
  % P3: next_state ≠ bad
  send()  % P1 & P2 & P3 imply send is ok
Optimization
• Analysis of s.a. structure makes redundant check elimination possible
  – e.g.: supply the type checker with the fact transition_send(start, start) and verify:

    if current = start then
      send(); send(); send(); …

[Diagram: start --read(f)--> has read; start --send--> start; has read --read(f)--> has read; has read --send--> bad]
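The effect of the optimization can be sketched by comparing per-call checking against the hoisted test (a Python illustration; the function names are made up for the example, and the static reasoning is simulated here at run time):

```python
# Because transition_send(start, start) holds, a single test
# `current = start` justifies a whole run of sends with no further checks.
START, HAS_READ, BAD = "start", "has read", "bad"

def check_send(state):
    return START if state == START else BAD

def three_sends_checked(state):
    """Naive instrumentation: a check before every send."""
    for _ in range(3):
        state = check_send(state)
        if state == BAD:
            return BAD
    return state

def three_sends_optimized(state):
    """One hoisted test: send from start always stays in start."""
    if state == START:
        return START   # send(); send(); send(); -- no per-call checks needed
    return BAD

# The two versions agree on every state, which is what the type checker
# verifies statically from the fact transition_send(start, start).
```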
Related Work
• Program verification
  – abstract interpretation, data flow & control flow analysis, model checking, soft typing, verification condition generation & theorem proving, ...
• Dependent types in compiler ILs
  – Xi & Pfenning, Crary & Weirich, ...
• Security properties of typed languages
  – Leroy & Rouaix, ...
Summary
• A recipe for secure certified code:– types
• ensure basic safety• prevent dynamic checks from being circumvented• provide a framework for reasoning about programs
– security automata• specify expressive policies• dynamic checking when policies can’t be proven
statically