2-source Dispersers for n^{o(1)} entropy and Ramsey graphs beating the Frankl-Wilson construction. Boaz Barak, Anup Rao, Ronen Shaltiel, Avi Wigderson.

Jan 10, 2016

Transcript
Page 1: 2-source Dispersers for n^{o(1)} entropy and Ramsey graphs beating the Frankl-Wilson construction

2-source Dispersers for n^{o(1)} entropy and Ramsey graphs beating the Frankl-Wilson construction

Boaz Barak, Anup Rao, Ronen Shaltiel, Avi Wigderson

Page 2

Plan for this talk

Introduction: Ramsey graphs, randomness extractors, 2-source extractors/dispersers and their relation to Ramsey graphs. High-level overview of our construction.

Page 3

Ramsey Graphs

K-Ramsey graphs are graphs which contain no cliques or anti-cliques of size K.

[Erdős 1947]: There exists a graph on N vertices with no cliques or anti-cliques of size (2+o(1))·log N.

One of the first applications of the probabilistic method!

Erdős also asked whether such graphs can be explicitly constructed.

Best explicit construction [Frankl and Wilson]: K = 2^{Θ(√(log N · log log N))}.
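Erdős's bound can be checked numerically: a union bound over all K-subsets shows the expected number of monochromatic K-sets in a random 2-coloring of the edges is C(N,K)·2^{1-C(K,2)}, which drops below 1 at K = 2·log₂N + 1, so some graph has none. A minimal sketch (the function name is mine):

```python
from math import comb

def expected_mono_sets(N, K):
    """Union-bound quantity: expected number of size-K cliques or
    anti-cliques in a uniformly random graph on N vertices."""
    return comb(N, K) * 2 ** (1 - comb(K, 2))

# At K = 2*log2(N) + 1 the expectation is below 1, so a K-Ramsey
# graph on N vertices exists (Erdos 1947); for much smaller K it is not.
N = 2 ** 20
assert expected_mono_sets(N, 2 * 20 + 1) < 1
assert expected_mono_sets(N, 10) > 1
```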

Page 4

Ramsey Graphs (viewed as adjacency matrices)

Ramsey graph: no large monochromatic rectangles of the form X x X.
Bipartite Ramsey graph: no large monochromatic rectangles of the form X x Y.
Every matrix of a bipartite Ramsey graph is a matrix of a Ramsey graph.

Nonexplicit result: O(log N). Known explicit [CG85]: √N. [PR04]: o(√N). [BKSSW05]: N^δ for every δ>0. Our result: exp(log^δ N) for every δ>0.

[Figure: N x N 0/1 adjacency matrix with monochromatic rectangles X x X and X x Y highlighted]

Page 5

A new construction of Ramsey Graphs beating Frankl-Wilson

Convention: N = 2^n; identify {1..N} ≈ {0,1}^n.

Theorem: There is a polynomial-time computable function R: {0,1}^n x {0,1}^n -> {0,1} such that for every δ>0 and every X, Y ⊆ {0,1}^n of size K = exp(log^δ N) = exp(n^δ): R(X,Y) = {0,1}.

(A strongly explicit construction.)

Page 6

Yet another slide on motivation for extractors

"Daddy, how do computers get random bits?"

"Do we really have to tell that old story again?"

Page 7

Randomness Extractors: How can computers get random bits?

Computers have access to sources of randomness: electric noise, key strokes of the user, timing of past events. These distributions are "somewhat random" but not "truly random".

Solution: Randomness Extractors.

[Figure: a somewhat-random source feeds a Randomness Extractor, which supplies the random coins of a probabilistic algorithm mapping input to output]

Page 8

Notion of entropy

Somewhat random distributions must "contain randomness". Right notion (min-entropy): the min-entropy of a distribution X is the largest k such that Pr[X=x] ≤ 2^{-k} for every x.

Unjustified assumption for this talk (with loss of generality): entropy = min-entropy.

Notation: rate = entropy / length = k/n. Flat distributions are uniform over some subset; for them, entropy = log(set size).
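For concreteness, min-entropy, flat sources, and rate can be computed directly; a small sketch (the function names are mine, not the talk's):

```python
from math import log2

def min_entropy(dist):
    """Largest k such that Pr[X=x] <= 2^-k for all x:
    -log2 of the heaviest point of the distribution."""
    return -log2(max(dist.values()))

def flat_source(subset):
    """Flat distribution: uniform over the given subset."""
    return {x: 1 / len(subset) for x in subset}

n = 8                                # 8-bit strings
X = flat_source(range(2 ** 4))       # uniform over 16 of the 256 strings
k = min_entropy(X)                   # entropy = log(set size) = 4
assert k == 4.0
assert k / n == 0.5                  # rate = entropy / length
```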

Page 9

The dream: a 1-source extractor

[Figure: ext maps a sample x from an n-bit source X to an m-bit output]

We want of ext: whenever H(X) > m, ext(X) ~ U_m.

Problem: no such thing exists! (For any fixed ext into one bit, some output value has a preimage of size at least 2^{n-1}; the flat source on that preimage has entropy n - 1, yet ext is constant on it.)

Page 10

Seeded extractor [NZ93]

[Figure: ext maps the source sample x from an n-bit source X, together with a short random seed I=(011…010), to an m-bit output]

Page 11

Seeded extractor [NZ93]

[Figure: as before, but enumerating all seed values yields outputs 1..5; for a source of sufficient entropy, most of the poly(n) outputs are random]

Good for: simulating BPP using weak sources.

Problems: doesn't work for cryptography.

Page 12

2-source extractor [SV86]

[Figure: ext maps samples x from X and y from Y, two independent n-bit sources, to an m-bit output]

Whenever X, Y are independent and H(X), H(Y) ≥ k: ext(X,Y) ~ U_m.

Such things exist!

Page 13

2-source extractors and bipartite Ramsey graphs

Consider 2-source extractors for independent distributions X and Y with entropy k, namely a function Ext(x,y) (say into one bit).

Requirement: no unbalanced X x Y rectangles of size K = 2^k.

2-source extractor ⇒ bipartite Ramsey graph ⇒ Ramsey graph.

[Figure: N x N extractor matrix; rows x ∊ X, columns y ∊ Y, entry ext(x,y) ∊ {0,1}]
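The [CG85] extractor can be taken to be the inner product mod 2 (the Hadamard matrix), which works when the two entropy rates sum to more than 1. A small empirical sketch, not the paper's analysis; the sizes and threshold are illustrative (Lindsey's lemma guarantees bias at most 1/8 at these parameters, and random flat sources do far better):

```python
import itertools, random

def ip_ext(x, y):
    """Inner product mod 2 of two bit-tuples (Hadamard / [CG85] style)."""
    return sum(a & b for a, b in zip(x, y)) & 1

n = 10
random.seed(0)
strings = list(itertools.product((0, 1), repeat=n))
# Two independent flat sources of min-entropy 7 (rate 0.7 each).
X = random.sample(strings, 2 ** 7)
Y = random.sample(strings, 2 ** 7)
ones = sum(ip_ext(x, y) for x in X for y in Y)
bias = abs(ones / (len(X) * len(Y)) - 0.5)
assert bias <= 0.125   # the rectangle X x Y is close to balanced
```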

Page 14

Definitions of 2-source extractors and dispersers

A 2-source extractor for entropy k is a function Ext(x,y) such that for any two independent distributions X, Y with entropy > k, the output distribution Ext(X,Y) is close to uniform.

Bipartite Ramsey graph = 2-source disperser: a 2-source disperser for entropy k is a function Dis(x,y) ∊ {0,1} such that for any two independent distributions X, Y with entropy > k, the output Dis(X,Y) = {0,1} (both values are attained).

Our main result: a disperser for entropy k = n^δ for every δ>0.

Page 15

Summary and plan

Main result: an explicit 2-source disperser for entropy k = n^δ for every δ>0. (This gives Ramsey graphs that beat the Frankl-Wilson construction, which achieves δ = ½.)

The construction and its analysis are quite involved. Disclaimer: I will oversimplify in order to try to highlight the main ideas.

Plan: somewhere-random 2-source extractors; block-wise sources; testing entropy; recursive construction of somewhere-random 2-source extractors. (If we run out of time: construction of the TestBlock procedure; something about the final disperser.)

Page 16

2-source somewhere extractor [BKSSW05]

[Figure: SE maps samples x from X and y from Y, independent n-bit sources, to several m-bit outputs 1..5]

Important step: a somewhere extractor for entropy k = n^δ with n^ε outputs (for 0 < ε < δ): whenever X, Y are independent and H(X), H(Y) ≥ k, ∃i: SE(X,Y)_i ~ U_m.

Remainder of talk: high-level description of our construction of a somewhere extractor. More ideas are needed to get a disperser.

Page 17

Block-wise sources [CG88]

A block-wise source with C blocks and entropy k is a distribution X1,…,XC such that for all i: H(Xi | X1,…,Xi-1) > k. (Strictly this is conditional min-entropy; recall the talk conflates entropy and min-entropy.)

It's often easier to extract from block-wise sources than from general sources. There is an extractor for 2 independent block-wise sources with entropy k and C = O(log n / log k) blocks [Rao06]; a constant number C of blocks suffices for k = n^δ.

Our result: achieve the same with one block-wise source and one general source (we refer to it as basic-ext). An important building block; relies on [Rao06, Raz05, Bou05].
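Since the talk conflates entropy with min-entropy, the block-wise condition can be illustrated with Shannon entropies on a toy two-block source (all names below are mine):

```python
from collections import Counter
from math import log2

def H(xs):
    """Shannon entropy of the empirical distribution of xs."""
    c, n = Counter(xs), len(xs)
    return -sum(v / n * log2(v / n) for v in c.values())

def cond_H(pairs):
    """H(X2 | X1) = H(X1, X2) - H(X1), from samples of (X1, X2)."""
    return H(pairs) - H([a for a, _ in pairs])

# Toy flat source over two 2-bit blocks: uniform on pairs with even sum.
support = [(a, b) for a in range(4) for b in range(4) if (a + b) % 2 == 0]
assert H([a for a, _ in support]) == 2.0   # H(X1) = 2
assert cond_H(support) == 1.0              # H(X2 | X1) = 1: a block-wise
                                           # source with entropy 1 per block
```

Note that X2 alone is uniform (entropy 2), but conditioning on the first block halves its entropy; the block-wise condition is about exactly this conditional quantity.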

Page 18

Roadmap

Goal: Given parameters 0 < ε < δ, construct a 2-source somewhere extractor for entropy k = n^δ with n^ε outputs.

Following previous work on seeded extractors [NZ93, SZ94, SSZ95, …], given two sources we try to convert one of them into a block-wise source.

Page 19

2-source somewhere extractors for large k >> n^{1-ε}

Split X into t = n^ε blocks X1, X2, …, Xt. Assume that k > 2C·n^{1-ε} = 2C·n/t > the length of C blocks.

Chain rule*: Σ_i H(Xi | X1,…,Xi-1) ≥ k ⇒ ∃ i1,…,iC such that Xi1,…,XiC is a block-wise source with high (n^{1-2ε}) entropy (roughly the same rate as X).

SE(x,y): go over all t^C = n^{εC} candidate block-wise sources; for each one, run basic-ext and collect all n^{O(ε)} outputs.

Also works when H(Y) = n^δ.
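The existence of the blocks above is pure pigeonhole: each block carries at most n/t bits of entropy, so if the conditional entropies sum to k ≥ 2C·n/t, at least C blocks clear the k/2t threshold. A sketch of the counting (values are illustrative):

```python
def medium_blocks(cond_ents, k, t):
    """Indices of blocks whose conditional entropy is >= k/(2t)."""
    return [i for i, h in enumerate(cond_ents) if h >= k / (2 * t)]

def min_medium_count(k, block_len):
    """Pigeonhole lower bound: with total >= k, each block <= block_len,
    and a t*(k/2t) = k/2 allowance for the blocks below threshold,
    at least (k/2)/block_len blocks sit above the threshold."""
    return (k / 2) / block_len

t, C, n = 8, 2, 32
block_len = n // t                  # 4 bits per block
k = 2 * C * block_len               # 16: k > length of C blocks
ents = [0, 4, 0, 4, 4, 0, 4, 0]     # any profile with sum >= k
assert sum(ents) >= k
assert len(medium_blocks(ents, k, t)) >= C
assert min_medium_count(k, block_len) >= C
```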

Page 20

2-source somewhere extractors for small k = n^δ

Split X into t = n^ε blocks X1, …, Xt. As k < n/t, all the entropy can sit in one block. We say that a block Xi has:
medium entropy if H(Xi | X1,…,Xi-1) ≥ k/2t (same rate);
high entropy if H(Xi | X1,…,Xi-1) ≥ k/2C (more condensed: ≈ rate · t).

Previous slide: large k ⇒ there must exist C medium blocks. Win-win analysis: one of two cases occurs*:
There exist C medium blocks (∃ block-wise source).
There exists a high block (∃ block i with rate(Xi | X1,…,Xi-1) ≥ rate(X) · Ω(t)).

Goal: in the 2nd case, identify the high block and continue recursively; eventually we will get a block-wise source! But we only get samples x, y. How can we learn something about the entropy of X, Y? Nevertheless, we will try to implement this strategy!

Page 21

Testing blocks for entropy (fantasy object)

We want a procedure TestBlock_{r,i}(x,y) that tests whether rate(Xi | X1,…,Xi-1) ≥ r. Given 2 independent sources X, Y with sufficient entropy:
If rate(Xi | X1,…,Xi-1) ≥ r, TestBlock_{r,i}(X,Y) passes w.h.p.
If rate(Xi | X1,…,Xi-1) < r, TestBlock_{r,i}(X,Y) fails w.h.p.

Disclaimer: oversimplified and too good to be true. Nevertheless, we can get something with the same flavor. We show*: "2-source somewhere extractors for some rate r (with few outputs) give TestBlock for rate r". But we want to use TestBlock inside such a construction!?

Page 22

Recursive construction of a 2-source somewhere extractor

Given entropy rate r, assume by recursion that we have a 2-source somewhere extractor SE' for the larger rate r' ≈ r·t.
⇒ We can run TestBlock with rate r'. (We can test whether a block Xi of a source X with rate r is a high block.)

Construction of SE(x,y) (for rate r):
Go over all t^C = n^{εC} candidate block-wise sources; for each one, run basic-ext and collect all outputs. (Solves the case of C medium-entropy blocks.)
For i = 1..t, run TestBlock_{r',i}(x,y) to see if Xi has high entropy; run SE'(xi, y) on the first i on which TestBlock passes. (Solves the case of a high-entropy block.)

To operate on rate r we only require testing high blocks (rate r' ≈ r·t), which allows the recursion.
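The two cases can be written as a structural sketch; basic_ext, test_block and se_prime below are stand-ins for the real components (this shows the control flow only, not the extractors themselves):

```python
from itertools import combinations

def somewhere_extract(x_blocks, y, test_block, basic_ext, se_prime, C=2):
    """Collect outputs for both cases of the win-win analysis:
    every candidate block-wise source, plus a recursive call on the
    first block that the (rate r') test certifies as high-entropy."""
    outputs = []
    # Case 1: C medium blocks -> try all t^C candidate block-wise sources.
    for idxs in combinations(range(len(x_blocks)), C):
        outputs.append(basic_ext([x_blocks[i] for i in idxs], y))
    # Case 2: a high block -> hand it to SE' (the higher-rate extractor).
    for i, xi in enumerate(x_blocks):
        if test_block(x_blocks, y, i):
            outputs.append(se_prime(xi, y))
            break
    return outputs

# Toy stand-ins: block 2 is flagged high; SE' tags its output with "hi".
outs = somewhere_extract(
    ["01", "10", "11"], "0110",
    test_block=lambda xb, y, i: i == 2,
    basic_ext=lambda cand, y: "be:" + "".join(cand),
    se_prime=lambda xi, y: "hi:" + xi)
assert len(outs) == 4          # C(3,2)=3 basic-ext outputs + 1 recursive
assert outs[-1] == "hi:11"
```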

Page 23

Summary and plan

We've seen: a construction of a 2-source somewhere-random extractor for entropy k = n^δ with n^ε outputs (for any constants 0 < ε < δ). Component: TestBlock, a procedure that tests whether rate(Xi | X1,…,Xi-1) ≥ r.

Next: precise properties of TestBlock; how to construct TestBlock from a 2-source somewhere extractor with n^ε outputs; subsources.

Page 24

Subsources

Let X be a flat distribution. A distribution X' is a subsource of X if X' is flat and X' ⊆ X. X' has deficiency d if |X'|/|X| ≥ 2^{-d}.

Fact: If H(X) ≥ k and X' is a subsource of X with deficiency d, then H(X') ≥ k - d.
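For flat sources the deficiency fact is immediate, since min-entropy is just the log of the support size; a one-line check (sizes are illustrative):

```python
from math import log2

k, d = 8, 3
X = list(range(2 ** k))              # flat source: H(X) = log2|X| = k
Xp = X[: len(X) // 2 ** d]           # subsource of deficiency d
assert log2(len(X)) == k
assert log2(len(Xp)) == k - d        # Fact: H(X') >= k - d (tight here)
```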

Page 25

Subsource 2-source extractors (succeed on some subsource)

A subsource 2-source extractor for entropy k is a function Ext(x,y) such that for any two independent distributions X, Y with entropy > k, there exist large independent subsources X', Y' such that Ext(X',Y') is close to uniform.

The extractor is not required to succeed on X, Y but rather on some large subsources X', Y'. A subsource 2-source extractor is a 2-source disperser. The definition extends to subsource somewhere extractors.

[Figure: N x N matrix with rectangle X x Y and subsource rectangle X' x Y' highlighted]

Page 26

Testing blocks for entropy (precise version, with subsources)

TestBlock_{r,i}(x,y) (tests whether rate(Xi | X1,…,Xi-1) ≥ r). Given two independent sources X, Y with sufficient entropy:
If rate(Xi | X1,…,Xi-1) ≥ r, ∃ subsources X', Y' such that TestBlock_{r,i}(X',Y') passes w.h.p., with H(X'i | X'1,…,X'i-1) ≈ H(Xi | X1,…,Xi-1) and H(Y') ≈ H(Y).
If rate(Xi | X1,…,Xi-1) < r, ∃ subsources X', Y' such that TestBlock_{r,i}(X',Y') fails w.h.p., with, for j > i, H(X'j | X'1,…,X'j-1) ≈ H(Xj | X1,…,Xj-1) and H(Y') ≈ H(Y).

Page 27

Recursive construction of a subsource 2-source somewhere extractor

Given entropy rate r, assume by recursion that we have a subsource 2-source somewhere extractor SE' for the larger rate r' ≈ r·t.
⇒* We can run TestBlock with rate r'.

Construction of SE(x,y):
Go over all t^C = n^{εC} candidate block-wise sources; for each one, run basic-ext and collect all outputs. (Solves the case of C medium-entropy blocks.)
For i = 1..t, run TestBlock_{r',i}(x,y); run SE'(xi, y) on the first i on which TestBlock passes. (Solves the case of a high-entropy block, on a subsource.)

Page 28

Testing blocks for entropy: the challenge-response method [BKSSW05]

TestBlock_{r,i}(x,y) tests whether rate(Xi | X1,…,Xi-1) ≥ r:
Challenge C = SE'(xi, y). (Component: a somewhere extractor for rate r with n^ε outputs of length n^ε, so |C| = n^ε · n^ε = n^{2ε}.)
Responses R_j = poly-SE(x,y)_j. (Component: a specially designed somewhere extractor with poly(n) outputs and additional properties, poly-SE(x,y)_j = Vaz(E(x,j), E(y,j)); different from [BKSSW05].)
TestBlock passes if ∀j: R_j ≠ C.

If rate(Xi | X1,…,Xi-1) ≥ r, ∃ subsources X', Y' such that TestBlock_{r,i}(X',Y') passes w.h.p., with H(X'i | X'1,…,X'i-1) ≈ H(Xi | X1,…,Xi-1) and H(Y') ≈ H(Y). Why: rate(Xi | X1,…,Xi-1) ≥ r ⇒ ∃k: C_k is random ⇒ H(C) is large; the special properties of poly-SE ⇒ ∀j: H(C | R_j) is large ⇒ w.h.p. ∀j: R_j ≠ C. (We didn't even use subsources here?!)

Page 29

Testing blocks for entropy: the challenge-response method [BKSSW05] (continued)

TestBlock_{r,i}(x,y) tests whether rate(Xi | X1,…,Xi-1) ≥ r:
Challenge C = SE(xi, y). (Component: a somewhere extractor for rate r with n^ε outputs of length n^ε, so |C| = n^ε · n^ε = n^{2ε}.)
Responses R_j = poly-SE(x,y)_j. (Component: a specially designed somewhere extractor with poly(n) outputs and additional properties, poly-SE(x,y)_j = Vaz(E(x,j), E(y,j)); different from [BKSSW05].)
TestBlock passes if ∀j: R_j ≠ C.

If rate(Xi | X1,…,Xi-1) < r, ∃ subsources X', Y' such that TestBlock_{r,i}(X',Y') fails w.h.p., with, for j > i, H(X'j | X'1,…,X'j-1) ≈ H(Xj | X1,…,Xj-1) and H(Y') ≈ H(Y). Why: rate(Xi) < r ⇒ we can fix Xi and still have entropy left in X'; C is then a function of Y ⇒ ∃ subsource Y' on which C is constant; X', Y' independent ⇒ ∃j: R_j is random ⇒ Pr[R_j = C] ≥ 2^{-|C|} ⇒ we can lose |C| bits and pass to subsources on which R_j = C (using the special properties of poly-SE).

Page 30

Story so far

Goal: a 2-source disperser for entropy k = n^δ.
Component: a subsource 2-source somewhere extractor for entropy k = n^δ with n^ε outputs (recursive win-win construction).
Component: TestBlock_{r,i}(x,y), which tests whether rate(Xi | X1,…,Xi-1) ≥ r; constructed using the subsource 2-source somewhere extractor (from the recursion hypothesis).

Page 31

Constructing a (1-output) disperser

Having constructed a (subsource) somewhere extractor for entropy n^δ, we can run TestBlock to test whether Xi is a medium-entropy block. We observe that for a medium block Xi:
Pr[TestBlock_i(X,Y) passes] > 1 - o(1) (on a subsource);
Pr[TestBlock_i(X,Y) fails] > exp(-n^ε) (on the same subsource).
So TestBlock outputs two different values: this is close to a disperser!

Page 32

High-level idea for the disperser

Disperser(x,y): for i = 1 to t, test the entropy of the i'th block:
If the block has low entropy, continue.
If the block has high entropy, recurse on it (output Disperser(xi, y)).
If the block has medium entropy, run TestBlock on the block and output pass/fail.

This requires designing a more complicated TestBlock function with 4 possible outputs. The construction and analysis use ideas similar to the previous construction but are more involved.
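The loop above can be sketched as a recursion; classify and test_block are stand-ins for the entropy tests, and the toy re-blocking is mine:

```python
def disperser(x_blocks, y, classify, test_block):
    """Control-flow sketch of the final disperser: skip low blocks,
    recurse into high blocks, and on a medium block output the single
    pass/fail bit of TestBlock."""
    for i, xi in enumerate(x_blocks):
        kind = classify(x_blocks, y, i)
        if kind == "low":
            continue
        if kind == "high":
            return disperser(halves(xi), y, classify, test_block)
        return int(test_block(x_blocks, y, i))     # medium block
    return 0   # oversimplified fallback

def halves(s):
    """Toy re-blocking for the recursive call."""
    return [s[: len(s) // 2], s[len(s) // 2:]]

# Toy run: block 0 is low, block 1 medium -> emit TestBlock's bit.
out = disperser(
    ["0000", "1010"], "01",
    classify=lambda xb, y, i: "low" if i == 0 else "medium",
    test_block=lambda xb, y, i: True)
assert out == 1
```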

Page 33

Conclusions and open problems

We were able to construct a 2-source disperser for entropy n^{o(1)}; equivalently, K-Ramsey graphs for K = exp(log^{o(1)} N).

Open problems:
Construct a 2-source extractor for entropy rate < 0.4999 (improve [Bou05]).
Construct a disperser for entropy polylog(n).
Main open problem: simplify the construction and proof.

Page 34

That’s it