Perfect and Statistical Secrecy, probabilistic algorithms, Definitions of Easy and Hard, 1-Way FN -- formal definition
Private-key Encryption, revisited
• Key generation algorithm tosses coins → key k
• ENC_k(m, r) → c
• DEC_k′(c′) → m′
• Correctness: for all k, r, m: DEC_k(ENC_k(m, r)) = m
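The (GEN, ENC, DEC) syntax and the correctness condition can be sketched in a few lines, assuming a one-time pad over n-bit strings (the function names are illustrative, not from the slides):

```python
import secrets

def gen(n):
    """Key generation: toss coins to produce a uniform n-bit key k."""
    return secrets.randbits(n)

def enc(k, m):
    """ENC_k(m) -> c.  (The one-time pad needs no extra randomness r.)"""
    return m ^ k

def dec(k, c):
    """DEC_k(c) -> m."""
    return c ^ k

# Correctness: for all k, m: DEC_k(ENC_k(m)) = m
n = 16
k = gen(n)
m = 0b1010101010101010
assert dec(k, enc(k, m)) == m
```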
Perfect secrecy, revisited
An encryption scheme is perfectly secret if for every m1, m2 and every c:
Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c],
where the probability is over the key k from key generation and the coin tosses of ENC.
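For a small one-time pad the perfect-secrecy equation can be verified by brute force; the 2-bit example below is our own illustration, not from the slides:

```python
from collections import Counter

KEYS = range(4)          # uniform 2-bit key from key generation
MSGS = range(4)

def enc(k, m):
    # One-time pad: ENC_k(m) = m XOR k
    return m ^ k

def cipher_dist(m):
    """Distribution of ENC_k(m) over a uniform key k."""
    counts = Counter(enc(k, m) for k in KEYS)
    return {c: counts[c] / len(KEYS) for c in counts}

# Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c] for every m1, m2 (and every c)
for m1 in MSGS:
    for m2 in MSGS:
        assert cipher_dist(m1) == cipher_dist(m2)
```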
Shannon’s secrecy
Let D be a distribution on M. An encryption scheme satisfies Shannon's secrecy with respect to D if for every m in the support of D and every c:
Pr[D = m | ENC_k(D) = c] = Pr[D = m],
where the probability is taken over D, key generation, and the coin tosses of ENC.
Claim: perfect secrecy holds if and only if Shannon’s secrecy holds
We prove that perfect secrecy implies Shannon's secrecy; the converse is similar (prove it!).
Proof
Recall Bayes' law: let E and F be events, then
Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F].
Applying it here:
Pr[D = m | ENC_k(D) = c] = Pr[ENC_k(D) = c | D = m] · Pr[D = m] / Pr[ENC_k(D) = c].
Proof(cont)
So, we need to show that:
Pr[ENC_k(D) = c | D = m] = Pr[ENC_k(D) = c].
In other words, we need to show that:
Pr[ENC_k(m) = c] = Pr[ENC_k(D) = c].
Proof(cont)
Pr[ENC_k(D) = c]
  = Σ_{m_i ∈ D} Pr[ENC_k(m_i) = c] · Pr[D = m_i]
  = Pr[ENC_k(m) = c] · Σ_{m_i ∈ D} Pr[D = m_i]    (by perfect secrecy)
  = Pr[ENC_k(m) = c] · 1.
QED.
CLAIM: If an encryption scheme is perfectly secure, then the number of keys is at least
the size of the message space.
Proof: consider a ciphertext c of some message m. By perfect secrecy, c could be an encryption of every message: for every m′ in the message space, there must exist a key k′ that decrypts c to m′. Thus c is decryptable to any message, using different keys. But the number of distinct decryptions of c is at most the number of keys.
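A toy illustration of the claim (our own example, not from the slides): a shift cipher with 3 messages but only 2 keys. Some message cannot be "explained" by any key for a given ciphertext, so perfect secrecy fails:

```python
MSGS = range(3)
KEYS = range(2)          # fewer keys than messages

def enc(k, m):
    # Shift cipher mod 3
    return (m + k) % 3

c = 0
# Messages that decrypt from c under some key:
explained = {(c - k) % 3 for k in KEYS}
assert explained == {0, 2}       # message 1 is never mapped to c

# Hence the distributions ENC_k(0) and ENC_k(1) differ at c:
p0 = sum(enc(k, 0) == c for k in KEYS) / len(KEYS)   # 1/2
p1 = sum(enc(k, 1) == c for k in KEYS) / len(KEYS)   # 0
assert p0 != p1                  # perfect secrecy is violated
```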
This is bad news – we need to relax the notion of secrecy to be able to use smaller keys
• How do we relax the notion?
A first attempt
• We will relax the notion of perfect secrecy.
• What about “statistical” security?
Let X and Y be random variables over a finite set S. X and Y are called ε-close if for every event T ⊆ S:
|Pr[X ∈ T] − Pr[Y ∈ T]| ≤ ε.
T is called a statistical test.
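For finite distributions, the best statistical test gives exactly half the L1 distance between the distributions, so ε-closeness can be computed directly; a small helper sketch (our own, not from the slides):

```python
def stat_distance(p, q):
    """max over events T of |Pr[X in T] - Pr[Y in T]|,
    which equals (1/2) * sum over s of |p(s) - q(s)|.
    p and q are dicts mapping outcomes to probabilities."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(s, 0) - q.get(s, 0)) for s in support)

# Example: a fair coin and a slightly biased coin are 0.01-close.
fair   = {0: 0.50, 1: 0.50}
biased = {0: 0.51, 1: 0.49}
assert abs(stat_distance(fair, biased) - 0.01) < 1e-12
```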
Statistical Secrecy
• DEFINITION: An encryption scheme is statistically ε-secure if for every two messages m1 and m2, the random variables (over the choice of key) ENC_k(m1) and ENC_k(m2) are ε-close.
• Still cannot go much below Shannon's bound
• HW: show this!
Computational assumptions
• Probabilistic TM: a TM with an additional RANDOM tape, which is read-once
• P – languages that can be decided in time n^c for some constant c
• NP – languages that can be verified in polynomial time: given a poly-size witness w, we can check x in time n^c; i.e., there exists a relation R(x, w) decidable in P such that x ∈ L iff there is a witness w with R(x, w) = 1.
For us, every algorithm is public
• (no “security from obscurity”)
• What do we have as secrets? (randomness!)
• The need to define PPT (probabilistic polynomial time)
How do we deal with coin-toss?
• RP (“randomized poly”) – languages recognizable in polynomial time with 1-sided error
∀x ∈ L: Pr[M(x) accepts] > 1/2
∀x ∉ L: Pr[M(x) accepts] = 0
Dealing with RP
FACT: P ⊆ RP ⊆ NP
WHY?
First part: Can ignore coins
Second part: Can “guess” the correct coins
Dealing with RP: amplification
Suppose we are given a machine M s.t.:
∀x ∈ L: Pr[M(x) accepts] > ε(n)
∀x ∉ L: Pr[M(x) accepts] = 0
where ε(n) = 1/poly(n).
How do we amplify?
Design new machine M’ that runs machine M k times with new coin-flips each time. M’ accepts iff any run of M accepts, otherwise M’ rejects.
What is the probability of acceptance of M’ for x in L?
Computing prob of succ of M’
M errs on x ∈ L with probability at most 1 − ε(n), so:
Pr[M′(x) rejects | x ∈ L] ≤ [Pr[M(x) rejects | x ∈ L]]^k
  ≤ (1 − ε(n))^k
  ≤ e^(−ε(n)·k)
(Recall that e^x ≥ 1 + x, thus e^(−ε(n)) ≥ 1 − ε(n).)
So, make k sufficiently big!
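The amplification bound can be checked numerically; the parameters below are illustrative (ε chosen as a power of two so ε·k is exact in floating point):

```python
import math

eps = 1 / 1024           # eps(n) = 1/poly(n)
k = 10_240               # number of independent runs of M, so eps*k = 10

# M' accepts iff some run of M accepts, so for x in L it errs only if
# every one of the k independent runs rejects:
err = (1 - eps) ** k     # Pr[M' rejects x in L]
bound = math.exp(-eps * k)

assert err <= bound                   # (1 - eps)^k <= e^(-eps*k)
assert bound == math.exp(-10)         # already tiny, ~4.5e-5
```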
2-sided error
• BPP (“Bounded probabilistic poly-time”) – languages recognizable in polynomial time with 2-sided error
∀x ∈ L: Pr[M(x) accepts] ≥ 2/3
∀x ∉ L: Pr[M(x) accepts] ≤ 1/3
BPP Amplification
We wish to design new machine M’ s.t.:
∀x ∈ L: Pr[M′(x) accepts] ≥ 1 − 2^(−p(n))
∀x ∉ L: Pr[M′(x) accepts] ≤ 2^(−p(n))
What do we do?
• Machine M’ runs machine M many times with independent coin-flips each time, and takes a MAJORITY vote.
• If M’ runs M k times, what is the probability that M’ makes a mistake?
Chernoff Bound
Let X1 … Xm be independent 0/1 random variables with the same probability distribution, and let E = E[Σ X_i]. Then:
Pr[Σ_{i ∈ 1..m} X_i ≥ (1 + β)E] ≤ e^(−(1/2)β²E)
Back to BPP
Let X_i = 1 if M makes a mistake on the i-th run.
Pr[X_i = 1] ≤ 1/3 (by definition of BPP)
Let's be pessimistic and take it equal to 1/3, so
E = k/3.
Plugging in
If we take a majority vote, we fail if
Σ X_i ≥ k/2.
Take β = 1/2 and plug into the Chernoff bound:
Pr[Σ X_i ≥ (3/2) · (k/3)] ≤ e^(−(1/8) · (k/3)) = e^(−k/24).
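The majority-vote analysis can be sanity-checked against the exact binomial tail; k = 99 below is our own illustrative choice:

```python
import math
from math import comb

k = 99
p = 1 / 3    # worst-case per-run error probability (X_i = 1 with prob 1/3)

# Exact failure probability of the majority vote:
# Pr[Binomial(k, 1/3) >= k/2], i.e. at least 50 of 99 runs err.
fail = sum(comb(k, i) * p**i * (1 - p)**(k - i) for i in range(50, k + 1))

# Chernoff bound with beta = 1/2, E = k/3 gives e^(-k/24).
bound = math.exp(-k / 24)
assert fail <= bound
```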
Non-uniformity, called P/poly
• Sometimes we need to allow a different machine for each input length (circuits, or a TM with an "advice" string for every input length)
SOME QUESTIONS:
• Does it belong to NP?
• How easy is it?
• Does it need coin-flips?
P/poly
• Does P/poly belong to NP?
• Attempt: guess an advice string that makes x accept; if such advice exists, accept.
• Why does it not work?
P/poly (cont)
• How powerful is this class?
• It even includes undecidable languages.
• Consider some standard enumeration of TMs. For each input length i, let the advice encode whether the i-th machine halts; this decides an undecidable (unary) language.
Proof of Adleman's theorem (BPP ⊆ P/poly).
We assume a BPP machine M (amplified as above) s.t.:
∀x ∈ L: Pr[M(x) accepts] ≥ 1 − 2^(−(n+1))
∀x ∉ L: Pr[M(x) accepts] ≤ 2^(−(n+1))
Proof that we don’t need coins
For every x: Pr_r[M makes an error on x] ≤ 2^(−(n+1)).
We want to show that there always exists a single random string r of length r(n) that works for all inputs of length n.
How do we show it???
Proof (cont)
Say M uses r(n) random bits. Consider a table with a row for each input x_1, …, x_{2^n} and a column for each random string r_1, …, r_{2^{r(n)}}; the entry is 1 if M makes a mistake on that input with those coins, and 0 if M is correct.
If there is a column which is good (all 0's) for all x, we are done!
Proof – the punch line
Number of 1's ≤ (number of rows) · (number of 1's per row) ≤ 2^n · (2^(−(n+1)) · 2^(r(n))).
Thus, the average number of 1's per column is at most
2^n · (2^(−(n+1)) · 2^(r(n))) / 2^(r(n)) = 1/2.
Since the average is below 1, some column must have all 0's.
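The counting argument can be checked with small toy numbers (n = 3 inputs, r(n) = 4 random bits; our own choice of parameters):

```python
n, r = 3, 4
rows, cols = 2**n, 2**r                 # 8 inputs, 16 random strings
ones_per_row = int(2**-(n + 1) * cols)  # at most a 2^-(n+1) fraction: 1 mistake per row

# Worst case: every row uses its full error budget.
total_ones = rows * ones_per_row        # 8
avg_per_col = total_ones / cols         # 8/16 = 1/2

assert avg_per_col == 0.5
# Since fewer 1's than columns, some column has zero 1's: that random
# string is correct for every input of length n.
assert total_ones < cols
```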
Terminology
• For us, "easy" means polynomial time with coin-flips
• Fast – poly-time
• Probably fast – expected poly-time
Terminology (cont)
• Las Vegas – always correct, probably fast
• Monte Carlo – always fast, probably correct [1-sided error (i.e. RP or co-RP)]
• Atlantic City – always fast, probably correct [2-sided error (i.e. BPP)]
Back to Hardness
• For us, BPP is easy.
• But in BPP we can amplify the decision of any language that is bounded away from one half by 1/poly
• So we must define "hard" languages as ones that cannot be decided even with a 1/poly advantage.
• So, how do we define "less" than 1/poly?
Try 1
∀c ∀n: g(n) < 1/n^c
Not good: the quantifier over all n is too strong. Consider:
g(n) = 100/2^n, f(n) = 1/n^5.
At n = 2: g(2) = 25 but f(2) = 1/32, so g(2) > f(2), even though g is eventually far smaller than f. The fix: g is negligible if for every c there exists N such that g(n) < 1/n^c for all n > N.
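The crossover is easy to see numerically; a quick sketch of the slide's example:

```python
def g(n):
    # Intuitively negligible: drops exponentially.
    return 100 / 2**n

def f(n):
    # An inverse polynomial.
    return 1 / n**5

assert g(2) > f(2)        # 25 > 1/32: the "for all n" definition rejects g
assert g(40) < f(40)      # but for large n, g is far below every 1/poly
```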
Some facts
• If g(n) is negligible then g(n)*poly(n) is still negligible! (HW: prove it!)
• We sometimes call any function ≥ 1/poly "noticeable" (since a noticeable advantage can be amplified)
• There are functions that are NEITHER negligible NOR noticeable. Then we need to deal with “families” of input length, and typically resort to non-uniform proofs.