Perfect and Statistical Secrecy, probabilistic algorithms, Definitions of Easy and Hard, 1-Way FN -- formal definition

Dec 19, 2015

Transcript
Page 1

Perfect and Statistical Secrecy, probabilistic algorithms,

Definitions of Easy and Hard, 1-Way FN -- formal definition

Page 2

Private-key Encryption, revisited

• Key generation algorithm tosses coins → key k

• ENC_k(m, r) → c

• DEC_k'(c') → m'

• Correctness: for all k, r, m: DEC_k(ENC_k(m, r)) = m
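As an illustration of this interface, here is a minimal sketch (my addition, not from the slides) of a scheme with this syntax — a one-time pad — in Python. The function names, the fixed message length, and the use of a one-time pad are assumptions of the example.

```python
import secrets

N_BYTES = 16  # assumed fixed message length for this illustrative sketch

def keygen() -> bytes:
    # Key generation tosses coins: a uniformly random key k.
    return secrets.token_bytes(N_BYTES)

def enc(k: bytes, m: bytes) -> bytes:
    # One-time pad: c = m XOR k (this scheme needs no extra randomness r).
    return bytes(mi ^ ki for mi, ki in zip(m, k))

def dec(k: bytes, c: bytes) -> bytes:
    # XOR is its own inverse, so DEC_k(ENC_k(m)) = m.
    return bytes(ci ^ ki for ci, ki in zip(c, k))

# Correctness check: for all k, m, DEC_k(ENC_k(m)) = m.
k = keygen()
m = b"attack at dawn!!"  # 16 bytes
assert dec(k, enc(k, m)) == m
```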

Page 3

Perfect secrecy, revisited

An encryption scheme is perfectly secure if for every m1 and m2 and every c:

Pr[ENC_k(m1) = c] = Pr[ENC_k(m2) = c]

where the probability is over the key k from key generation and the coin tosses of ENC.
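As a quick sanity check (my addition, not from the slides), here is the standard calculation showing that a one-time pad on n-bit messages with a uniform key meets this definition; the message space {0,1}^n and the uniform key are assumptions of the example.

```latex
% One-time pad on n-bit messages with a uniformly random key k:
% ENC_k(m) = m \oplus k, so ENC_k(m) = c exactly when k = m \oplus c.
\Pr_k[\mathrm{ENC}_k(m_1) = c] = \Pr_k[k = m_1 \oplus c] = 2^{-n}
                               = \Pr_k[k = m_2 \oplus c] = \Pr_k[\mathrm{ENC}_k(m_2) = c].
```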

Page 4

Shannon’s secrecy

Let D be a distribution on M. An encryption scheme satisfies Shannon's secrecy with respect to D if for every m in D and every c:

Pr[D = m | ENC_k(D) = c] = Pr[D = m]

where the probability is taken over D, key generation, and the coin tosses of ENC.

Page 5

Claim: perfect secrecy holds if and only if Shannon’s secrecy holds

We prove that perfect secrecy implies Shannon's secrecy; the converse is similar (prove it!).

Page 6

Proof

Proof: recall Bayes's law: let E and F be events. Then

Pr[E | F] = Pr[F | E] · Pr[E] / Pr[F]

Applying this here:

Pr[D = m | ENC_k(D) = c] = Pr[ENC_k(D) = c | D = m] · Pr[D = m] / Pr[ENC_k(D) = c]

Page 7

Proof (cont.)

So, we need to show that:

Pr[ENC_k(D) = c | D = m] = Pr[ENC_k(D) = c]

In other words, we need to show that:

Pr[ENC_k(m) = c] = Pr[ENC_k(D) = c]

Page 8

Proof (cont.)

Pr[ENC_k(D) = c] = Σ_{m_i ∈ D} Pr[ENC_k(m_i) = c] · Pr[D = m_i]

= Pr[ENC_k(m) = c] · Σ_{m_i ∈ D} Pr[D = m_i]   (by perfect secrecy)

= Pr[ENC_k(m) = c] · 1.

QED.

Page 9

CLAIM: If an encryption scheme is perfectly secure, then the number of keys is at least the size of the message space.

Proof: consider a ciphertext c for some message m. By perfect secrecy, c could be an encryption of every message: for every m' in the message space, there exists a key k' under which c decrypts to m'. Thus c is decryptable to any message, using different keys. But the number of distinct decryptions of c is at most the number of keys.

Page 10

This is bad news – we need to relax the notion of secrecy to be able to use smaller keys

• How do we relax the notion?

Page 11

A first attempt

• We will relax the notion of perfect secrecy.

• What about “statistical” security?

Let X and Y be random variables over a finite set S. X and Y are called ε-close if for every event T ⊆ S:

|Pr[X ∈ T] − Pr[Y ∈ T]| ≤ ε

T is called a statistical test.
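A small illustrative sketch (my addition, not from the slides): for distributions given explicitly, the largest advantage of any statistical test T equals half the L1 distance between the probability vectors, so ε-closeness can be checked directly. The function name and the dictionary representation are assumptions of the example.

```python
def statistical_distance(p: dict, q: dict) -> float:
    """Max over events T of |Pr[X in T] - Pr[Y in T]|, which equals
    half the L1 distance between the two probability vectors.
    (Dict-based distributions are an assumption of this sketch.)"""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(s, 0.0) - q.get(s, 0.0)) for s in support)

# Example: two distributions on {0, 1, 2}; they are 0.1-close.
p = {0: 0.5, 1: 0.3, 2: 0.2}
q = {0: 0.5, 1: 0.4, 2: 0.1}
print(statistical_distance(p, q))  # 0.1
```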

Page 12

Statistical Secrecy

• DEFINITION: An encryption scheme is statistically ε-secure if for every two messages m1 and m2, the random variables (over the choice of key) ENC_k(m1) and ENC_k(m2) are ε-close.

• Still cannot go much below Shannon's bound

• HW: show this!

Page 13

Computational assumptions

• Probabilistic TM definition: a TM with an additional RANDOM tape, which is read-once

• P – languages that can be decided in time n^c for some constant c

• NP – languages that can be verified in polynomial time: given a poly-size witness w, we can check in time n^c that w certifies x ∈ L; equivalently, there exists a relation R(x, w) that is in P
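To make the witness-checking view concrete, here is a small sketch (my addition, not from the slides) of a poly-time verifier relation R(x, w) for SUBSET-SUM; the instance encoding and the names are assumptions of the example.

```python
def R(x, w) -> bool:
    """Poly-time verifier for SUBSET-SUM (encoding assumed for illustration).
    Instance x = (numbers, target); witness w = indices of a subset."""
    numbers, target = x
    # The witness must be a set of valid, distinct indices...
    if len(set(w)) != len(w) or any(i < 0 or i >= len(numbers) for i in w):
        return False
    # ...and the selected numbers must sum to the target.
    return sum(numbers[i] for i in w) == target

# Example: witness [0, 2] certifies that a subset of [3, 9, 5, 7] sums to 8.
print(R(([3, 9, 5, 7], 8), [0, 2]))  # True
```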

Page 14

For us, every algorithm is public

• (no “security from obscurity”)

• What do we have as secrets? (randomness!)

• The need to define PPT (probabilistic polynomial time)

Page 15

How do we deal with coin-toss?

• RP (“randomized poly”) – languages recognizable in polynomial time with 1-sided error

∀x ∈ L: Pr[M(x) accepts] > 1/2

∀x ∉ L: Pr[M(x) accepts] = 0

Page 16

Dealing with RP

FACT: P ⊆ RP ⊆ NP

WHY?

First part: Can ignore coins

Second part: Can “guess” the correct coins

Page 17

Dealing with RP: amplification

Suppose we are given a machine M s.t.:

∀x ∈ L: Pr[M(x) accepts] > ε(n)

∀x ∉ L: Pr[M(x) accepts] = 0

where ε(n) = 1/poly(n)

Page 18

How do we amplify?

Design new machine M’ that runs machine M k times with new coin-flips each time. M’ accepts iff any run of M accepts, otherwise M’ rejects.

What is the probability of acceptance of M’ for x in L?
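A minimal sketch of this amplification (my addition, not from the slides), assuming M is exposed to us as a function M(x, coins) that takes its random tape explicitly; that interface and the names are assumptions of the example.

```python
import secrets

def amplified(M, x, k: int, coin_bytes: int = 32) -> bool:
    """Run the one-sided-error machine M on x k times with fresh coins;
    accept iff at least one run accepts. If x is not in L, M never
    accepts, so the amplified machine never accepts either.
    (The M(x, coins) interface is an assumption of this sketch.)"""
    for _ in range(k):
        coins = secrets.token_bytes(coin_bytes)  # fresh random tape for each run
        if M(x, coins):
            return True
    return False
```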

Page 19

Computing prob of succ of M’

For x ∈ L, M errs (rejects) with probability at most 1 − ε(n).

Recall that e^x ≥ 1 + x; thus e^{−ε(n)} ≥ 1 − ε(n).

For x ∈ L: Pr[M'(x) rejects] ≤ (Pr[M(x) rejects])^k ≤ (1 − ε(n))^k ≤ e^{−ε(n)k}

So, make k sufficiently big!
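A short worked consequence (my addition), under the slide's assumption ε(n) = 1/poly(n): taking k = t/ε(n) repetitions already drives the error below e^{−t}, so polynomially many runs give exponentially small error.

```latex
\Pr[M'(x)\ \text{rejects}] \;\le\; e^{-\varepsilon(n)\,k} \;=\; e^{-t}
\quad\text{for } k = \frac{t}{\varepsilon(n)} = t\cdot\mathrm{poly}(n).
```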

Page 20

2-sided error

• BPP (“Bounded probabilistic poly-time”) – languages recognizable in polynomial time with 2-sided error

∀x ∈ L: Pr[M(x) accepts] ≥ 2/3

∀x ∉ L: Pr[M(x) accepts] ≤ 1/3

Page 21

BPP Amplification

We wish to design new machine M’ s.t.:

∀x ∈ L: Pr[M'(x) accepts] ≥ 1 − 2^{−p(n)}

∀x ∉ L: Pr[M'(x) accepts] ≤ 2^{−p(n)}

Page 22

What do we do?

• Machine M’ runs machine M many times with independent coin-flips each time, and takes a MAJORITY vote.

• If M’ runs M k times, what is the probability that M’ makes a mistake?
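A minimal sketch of the majority-vote amplifier (my addition, not from the slides), again assuming M is exposed as a function M(x, coins) taking its random tape explicitly.

```python
import secrets

def bpp_amplified(M, x, k: int, coin_bytes: int = 32) -> bool:
    """Run the two-sided-error machine M on x k times with independent
    coins and take a majority vote over the k answers.
    (The M(x, coins) interface is an assumption of this sketch.)"""
    accepts = sum(1 for _ in range(k) if M(x, secrets.token_bytes(coin_bytes)))
    return 2 * accepts > k  # strict majority
```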

Page 23

Chernoff Bound

Let X1 … Xm be independent 0,1 random variables with the same probability distribution. Then:

Pr[ Σ_{i ∈ 1..m} X_i ≥ (1 + β)·E ] ≤ e^{−(1/2)·β²·E}

(here E denotes the expectation of the sum Σ_i X_i)

Page 24

Back to BPP

Let X_i = 1 if M makes a mistake on the i-th run.

Pr[X_i = 1] ≤ 1/3 (by def of BPP)

Let's be pessimistic and make it equal to 1/3; then E = k/3.

Page 25

Plugging in

If we take a majority vote, we fail only if

Σ X_i ≥ k/2

Take β = 1/2 and plug into the Chernoff bound:

Pr[ Σ X_i ≥ (3/2)·(k/3) ] ≤ e^{−(1/8)·(k/3)} = e^{−k/24}
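A short worked follow-up (my addition): solving this bound for k shows how many repetitions achieve the 2^{−p(n)} error target from the previous slide, namely k = O(p(n)).

```latex
e^{-k/24} \;\le\; 2^{-p(n)}
\quad\Longleftrightarrow\quad \frac{k}{24} \ge p(n)\ln 2
\quad\Longleftrightarrow\quad k \ge 24\ln 2 \cdot p(n) \approx 16.6\, p(n).
```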

Page 26

What do we know?

P ⊆ RP ⊆ BPP

What about BPP vs NP?

Page 27

Non-uniformity, called P/poly

• Sometimes we need to talk about different input lengths (circuits, or a TM with an "advice" string for every input length)

SOME QUESTIONS:

• Does it belong to NP?

• How easy is it?

• Does it need coin-flips?

Page 28

P/poly

• Does P/poly belong to NP?

• Attempt: guess an advice string that makes x accept; if there is such an advice string, accept.

• Why does it not work?

Page 29

P/poly (cont)

• How powerful is this class?

• It includes undecidable languages.

• Consider some standard enumeration of TMs, and define the poly-size advice for input length i to encode the answer about the i-th machine (e.g., whether it halts).

Page 30

Do we need randomness for P/poly?

Adleman: BPP ⊆ P/poly

Page 31

Proof of Adleman's theorem

We assume a BPP machine M s.t.:

∀x ∈ L: Pr[M(x) accepts] ≥ 1 − 2^{−(n+1)}

∀x ∉ L: Pr[M(x) accepts] ≤ 2^{−(n+1)}

Page 32

Proof that we don’t need coins

Pr[M makes an error] ≤ 2^{−(n+1)}

We want to show that there always exists a single random string (of length r(n)) that works for all inputs of length n.

How do we show it???

Page 33

Proof (cont)

M uses r(n) random bits. Picture a 2^n × 2^{r(n)} table: rows are the inputs x_1, …, x_{2^n}, columns are the random strings r_1, …, r_{2^{r(n)}}; an entry is 1 if M makes a mistake on that input with that random string, and 0 if it is correct.

If there is a column which is good for all x, we are done!

Page 34

Proof – the punch line

Number of 1's ≤ (number of rows) · (number of 1's per row) ≤ 2^n · (2^{−(n+1)} · 2^{r(n)})

Thus, the average number of 1's per column is at most

2^n · (2^{−(n+1)} · 2^{r(n)}) / 2^{r(n)} = 1/2

Since the average is below 1, some column must have all 0's.
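A toy numerical illustration (my addition, not from the slides): with n = 4 and r(n) = 10, each row has at most 2^{r−(n+1)} = 32 ones, so at most 2^4 · 32 = 512 ones sit in 1024 columns, and an all-zero column must exist. The sketch below builds such a random table and confirms this; the parameters are chosen only for illustration.

```python
import random

n, r = 4, 10                     # toy sizes: 2^n inputs, 2^r random strings
rows, cols = 2 ** n, 2 ** r
max_ones_per_row = 2 ** (r - (n + 1))   # error prob <= 2^-(n+1) per input

# Mark the "bad" random strings of each input (at most 2^{r-(n+1)} per row).
bad = [set(random.sample(range(cols), max_ones_per_row)) for _ in range(rows)]

# A column (random string) is good if it is bad for no input at all.
good_columns = [c for c in range(cols) if all(c not in bad_row for bad_row in bad)]

# The counting argument guarantees at least half the columns remain good.
print(len(good_columns) >= cols // 2)   # True
```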

Page 35

Terminology

• For us, easy means polynomial time with coin-flips

• Fast: poly-time

• Probably fast: expected poly-time

Page 36

Terminology (cont)

• Las Vegas: always correct, probably fast

• Monte Carlo: always fast, probably correct [1-sided error (i.e. RP or co-RP)]

• Atlantic City: always fast, probably correct [2-sided error (i.e. BPP)]

Page 37

Back to Hardness

• For us, BPP is easy.

• But within BPP we can amplify the decision of any language whose acceptance probability is bounded away from one half by 1/poly

• So we must define "hard" languages as those that cannot be decided even with a 1/poly advantage.

• So, how do we define "less" than 1/poly?

Page 38

Examples of less than 1/poly

g(n) = 1/n^{log n}

g(n) = 100/2^n

Page 39

Try 1 – “negligible function”

∀c ∀n: g(n) < 1/n^c

Page 40

Try 1

∀c ∀n: g(n) < 1/n^c

Not good. Consider:

g(n) = 100/2^n and f(n) = 1/n^5

At n = 2: f(2) = 1/2^5 = 1/32, while g(2) = 100/4 is far larger, so the condition fails for small n even though g(n) eventually drops below every 1/n^c.

Page 41

Try 2

• After n gets sufficiently big, it works!

∀c, once n gets big enough: g(n) < 1/n^c

Page 42

Formally, now

g(·) is negligible if:

∀c > 0 ∃N_c ∀n > N_c: g(n) < 1/n^c
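A small numerical illustration (my addition): for g(n) = 100/2^n and the yardstick c = 5, the comparison fails for small n but holds once n passes the threshold N_c, matching the "for all sufficiently large n" quantifier. The function names and sample values are assumptions of the example.

```python
def g(n: int) -> float:
    return 100 / 2 ** n          # example function, eventually negligible

def poly_bound(n: int, c: int = 5) -> float:
    return 1 / n ** c            # the 1/n^c yardstick for c = 5

# g exceeds 1/n^5 for small n, but stays below it once n passes N_c.
for n in [2, 10, 20, 31, 32, 40]:
    print(n, g(n) < poly_bound(n))
# prints False, False, False, False, True, True: the threshold here is N_5 = 31
```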

Page 43

Some facts

• If g(n) is negligible then g(n)*poly(n) is still negligible! (HW: prove it!)

• We sometimes call a function that is at least 1/poly "noticeable" (since then we can amplify)

• There are functions that are NEITHER negligible NOR noticeable. Then we need to deal with "families" of input lengths, and typically resort to non-uniform proofs.