Richard Baraniuk, Rice University: Compressive Nonsensing


Dec 17, 2015

Transcript
Page 1

Richard Baraniuk

Rice University

compressive nonsensing

Page 2

Chapter 1

The Problem

Page 3

challenge 1: data too expensive

Page 4

Case in Point: MR Imaging

• Measurements very expensive

• $1-3 million per machine

• 30 minutes per scan

Page 5

Case in Point: IR Imaging

Page 6

challenge 2: too much data

Page 7

Case in Point: DARPA ARGUS-IS

• 1.8 Gpixel image sensor
  – video rate output: 444 Gbits/s
  – comm data rate: 274 Mbits/s

factor of 1600x: way out of reach of existing compression technology

• Reconnaissance without conscience
  – too much data to transmit to a ground station
  – too much data to make effective real-time decisions

Page 8

Chapter 2

The Promise

Page 9

COMPRESSIVE

SENSING

Page 10

innovation 1: sparse signal models

Page 11

Sparsity

[Figure: image pixels and their large wavelet coefficients (blue = 0); wideband signal samples and their large Gabor (time-frequency) coefficients, plotted over time and frequency]

Page 12

Sparsity

[Figure: image pixels and their large wavelet coefficients (blue = 0); a sparse signal with few nonzero entries]

nonlinear signal model
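The following is a hedged sketch (not from the talk) of what transform sparsity/compressibility looks like in practice: a piecewise-smooth test signal is well approximated by a small fraction of its DCT coefficients. The signal and the 50-coefficient budget are illustrative choices.

```python
# Illustrative sketch of transform compressibility: a piecewise-smooth signal
# has only a few large DCT coefficients, so keeping the largest handful of
# coefficients reconstructs it almost perfectly.
import numpy as np
from scipy.fft import dct, idct

N = 1024
t = np.linspace(0, 1, N)
signal = np.sin(2 * np.pi * 3 * t) + (t > 0.6)            # smooth part plus a jump

coeffs = dct(signal, norm="ortho")
kept = np.zeros_like(coeffs)
largest = np.argsort(np.abs(coeffs))[-50:]                 # keep the 50 largest of 1024
kept[largest] = coeffs[largest]

approx = idct(kept, norm="ortho")
rel_err = np.linalg.norm(signal - approx) / np.linalg.norm(signal)
print(f"relative error keeping 50/{N} coefficients: {rel_err:.3f}")  # typically a few percent
```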

Page 13

innovation 2: dimensionality reduction for sparse signals

Page 14

Dimensionality Reduction

• When data is sparse/compressible, can directly acquire a compressed representation with no/little information loss through linear dimensionality reduction

[Figure: measurements y = Φx, where Φ is a short, wide matrix and x is a sparse signal with few nonzero entries]
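A minimal sketch of this acquisition step, assuming the usual CS notation y = Phi @ x with a random Gaussian Phi; the names and sizes below are illustrative, not from the talk.

```python
# Acquire a K-sparse signal with M << N random linear measurements, y = Phi @ x.
import numpy as np

rng = np.random.default_rng(0)
N, K, M = 1000, 10, 100            # ambient dim, sparsity, number of measurements

# K-sparse signal: K nonzero entries at random locations
x = np.zeros(N)
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

# Random Gaussian measurement matrix, scaled so columns have roughly unit norm
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

y = Phi @ x                        # M compressive measurements, a 10x reduction
print(y.shape)                     # (100,)
```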

Page 15

Stable Embedding

• An information preserving projection preserves the geometry of the set of sparse signals (a union of K-dim subspaces)

• SE ensures that distances between sparse signals are approximately preserved:
  (1 - δ) ‖x1 - x2‖² ≤ ‖Φx1 - Φx2‖² ≤ (1 + δ) ‖x1 - x2‖²

Page 16

Stable Embedding

• An information preserving projection preserves the geometry of the set of sparse signals

• SE ensures that distances between sparse signals are approximately preserved

Page 17

Random Embedding is Stable

• Measurements y = random linear combinations of the entries of the signal x

• No information loss for sparse vectors whp

[Figure: y = Φx with a random Φ; x sparse with few nonzero entries]
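A hedged numerical check of this stability claim (illustrative sizes, not from the talk): random Gaussian projections approximately preserve distances between pairs of sparse vectors.

```python
# Empirically check that a random Gaussian projection roughly preserves distances
# between K-sparse vectors (the "stable embedding" property).
import numpy as np

rng = np.random.default_rng(1)
N, K, M, trials = 1000, 10, 100, 200

Phi = rng.standard_normal((M, N)) / np.sqrt(M)

ratios = []
for _ in range(trials):
    x1, x2 = np.zeros(N), np.zeros(N)
    x1[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    x2[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ (x1 - x2)) / np.linalg.norm(x1 - x2))

print(f"distance ratio: min {min(ratios):.2f}, max {max(ratios):.2f}")  # both close to 1
```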

Page 18

innovation 3: sparsity-based signal recovery

Page 19

Signal Recovery

• Goal: Recover signal from measurements

• Problem: Random projection not full rank (ill-posed inverse problem)

• Solution: Exploit the sparse/compressible geometry of the acquired signal

• Recovery via (convex) sparsity penalty or greedy algorithms [Donoho; Candes, Romberg, Tao, 2004]

Page 20

Signal Recovery

• Goal: Recover signal from measurements

• Problem: Random projection not full rank (ill-posed inverse problem)

• Solution: Exploit the sparse/compressible geometry of the acquired signal

• Recovery via (convex) sparsity penalty or greedy algorithms [Donoho; Candes, Romberg, Tao, 2004] (a greedy-recovery sketch follows below)
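A hedged sketch of the greedy route, orthogonal matching pursuit, written from scratch in numpy; the sizes and the `omp` helper are illustrative assumptions, not the talk's implementation.

```python
# Greedy sparse recovery (orthogonal matching pursuit) from y = Phi @ x.
import numpy as np

def omp(Phi, y, K):
    """Recover a K-sparse x from y = Phi @ x by greedily selecting columns."""
    M, N = Phi.shape
    residual = y.copy()
    support = []
    x_hat = np.zeros(N)
    for _ in range(K):
        # pick the column most correlated with the current residual
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # least-squares fit on the selected support, then update the residual
        coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coeffs
    x_hat[support] = coeffs
    return x_hat

rng = np.random.default_rng(2)
N, K, M = 400, 8, 80
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_hat = omp(Phi, Phi @ x, K)
print(np.linalg.norm(x - x_hat) / np.linalg.norm(x))   # typically near machine precision
```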

Page 21

“Single-Pixel” CS Camera

[Diagram: scene -> random pattern on DMD array -> single photon detector -> image reconstruction or processing]

w/ Kevin Kelly

Page 22

“Single-Pixel” CS Camera

[Diagram: scene -> random pattern on DMD array -> single photon detector -> image reconstruction or processing]

• Flip mirror array M times to acquire M measurements
• Sparsity-based recovery
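A hedged simulation of that measurement process (the toy scene and sizes are ours, not from the talk): each detector reading is the scene correlated against one random binary mirror pattern.

```python
# Single-pixel camera sketch: each measurement is the inner product of the scene
# with one random 0/1 DMD pattern (mirror flips), read by a single detector.
import numpy as np

rng = np.random.default_rng(3)
n = 32                                           # scene is n x n pixels
scene = np.zeros((n, n))
scene[8:24, 12:20] = 1.0                         # toy scene: a bright rectangle

M = 300                                          # mirror-flip measurements (< n*n = 1024)
patterns = rng.integers(0, 2, size=(M, n * n))   # random 0/1 DMD patterns
y = patterns @ scene.ravel()                     # photodetector readings, one per pattern

# Reconstruction would then run a sparsity-based solver (e.g., OMP or an L1 solver)
# on y = patterns @ x with a wavelet/DCT sparsity basis for the image x.
print(y.shape)                                   # (300,)
```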

Page 23

Random Demodulator

• CS enables sampling near signal’s (low) “information rate” rather than its (high) Nyquist rate

[Relation: A2I sampling rate grows with the number of tones per window and only logarithmically with the Nyquist bandwidth]

• Problem: In contrast to Moore’s Law, ADC performance doubles only every 6-8 years
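A hedged sketch of the random demodulator idea (multiply by a pseudorandom chipping sequence at the Nyquist rate, then integrate and dump at a much lower rate); all parameters below are illustrative, not from the talk.

```python
# Random demodulator sketch for a sparse multitone signal.
import numpy as np

rng = np.random.default_rng(4)
N = 4096                  # Nyquist-rate samples in the window
K = 5                     # number of active tones
M = 128                   # low-rate output samples (32x below Nyquist)

# K-sparse multitone signal on the DFT grid
freqs = rng.choice(N // 2, K, replace=False)
t = np.arange(N)
x = sum(np.cos(2 * np.pi * f * t / N) for f in freqs)

chips = rng.choice([-1.0, 1.0], size=N)          # pseudorandom chipping sequence
demodulated = x * chips
y = demodulated.reshape(M, N // M).sum(axis=1)   # integrate-and-dump low-rate samples

# y plays the role of compressive measurements; recovery would use a sparse solver
# with the known chipping/integration structure as the measurement operator.
print(y.shape)                                   # (128,)
```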

Page 24

Example: Frequency Hopper

• Sparse in time-frequency

[Figure: spectrogram from Nyquist rate sampling vs. "sparsogram" recovered from 20x sub-Nyquist sampling]

Page 25

challenge 1: data too expensive

means fewer expensive measurements needed for the same resolution scan

Page 26

challenge 2: too much data

means we compress on the fly as we acquire data

Page 27

EXCITING!!!

Page 28

dsp.rice.edu/cs archive: >1500 papers

nuit-blanche.blogspot.com: >1 posting/sec

[Figure: citation counts, 2004-2014: 6640 and 9797 citations]

Page 29

Chapter 3

The Hype

Page 30

CS is Growing Up

Page 31

Gerhard Richter, 4096 Colours

Page 32

muralsoflajolla.com/roy-mcmakin-mural

Page 33
Page 34
Page 35

“L1 is the new L2”

- Stan Osher

Page 36

Exponential Growth

Page 37
Page 38

?

Page 39

Chapter 4

The Fallout

Page 40

“L1 is the new L2”

- Stan Osher

Page 41

CS for “Face Recognition”

Page 42

From: M. V.
Subject: Interesting application for compressed sensing
Date: June 10, 2011 at 11:37:31 PM EDT
To: [email protected], [email protected]

Drs. Candes and Romberg,

You may have already been approached about this, but I feel I should say something in case you haven't. I'm writing to you because I recently read an article in Wired Magazine about compressed sensing.

I'm excited about the applications CS could have in many fields, but today I was reminded of a specific application where CS could conceivably settle an area of dispute between mainstream historians and Roswell UFO theorists. As outlined in the linked video below, Dr. Rudiak has analyzed photos from 1947 in which a General Ramey appears holding a typewritten letter from which Rudiak believes he has been able to discern a number of words which he believes substantiate the extraterrestrial hypothesis for the Roswell Incident. For your perusal, I've located a "hi-res" copy of the cropped image of the letter in Ramey's hand.

I hope to hear back from you.  Is this an application where compressed sensing could be useful?  Any chance you would consider trying it?

Thank you for your time,
M. V.

P.S. - Out of personal curiosity, are there currently any commercial entities involved in developing CS-based software for use by the general public?

--

Page 43
Page 44

x

Page 45

Chapter 5

Back to Reality

Page 46

Back to Reality

• “There's no such thing as a free lunch”

• “Something for Nothing” theorems

• Dimensionality reduction is no exception

• Result: Compressive Nonsensing

Page 47

Nonsense 1

Robustness

Page 48

Measurement Noise

• Stable recovery with additive measurement noise

• Noise is added to the measurements: y = Φx + n

• Stability: noise only mildly amplified in recovered signal

Page 49

Signal Noise

• Often seek recovery with additive signal noise

• Noise is added to the signal: y = Φ(x + n)

• Noise folding: signal noise amplified in the recovered signal by 3dB for every doubling of the subsampling factor N/M

• Same effect seen in classical “bandpass subsampling”

[Davenport, Laska, Treichler, B 2011]
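A hedged numerical check of the noise-folding effect; the sizes and the oracle-recovery shortcut below are our illustrative choices, not the referenced paper's experiment.

```python
# With noise added to the *signal* (y = Phi @ (x + n)), the recovered-signal SNR
# drops about 3 dB each time the subsampling factor N/M doubles. Oracle least-squares
# recovery on the true support is used to isolate the noise-folding effect.
import numpy as np

rng = np.random.default_rng(5)
N, K = 2048, 10
support = rng.choice(N, K, replace=False)
x = np.zeros(N)
x[support] = 1.0
noise = 0.01 * rng.standard_normal(N)            # white signal noise

for M in (1024, 512, 256, 128):
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    y = Phi @ (x + noise)
    # oracle recovery: least squares restricted to the true support
    coeffs, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
    err = coeffs - x[support]
    snr_db = 10 * np.log10(np.sum(x[support] ** 2) / np.sum(err ** 2))
    print(f"N/M = {N // M:3d}  recovered SNR ~ {snr_db:5.1f} dB")
# The SNR column should decrease by roughly 3 dB per doubling of N/M.
```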

Page 50

Noise Folding in CS

[Plot: CS recovered signal SNR vs. subsampling factor N/M; slope = -3 dB per doubling]

Page 51

“Tail Folding”

• Can model compressible (approx sparse) signals as "signal" + "tail"

• Tail "folds" into the signal as the subsampling factor N/M increases

[Figure: sorted coefficient magnitudes vs. sorted index, split into "signal" and "tail"]

[Davies, Guo, 2011; Davenport, Laska, Treichler, B 2011]

Page 52

All Is Not Lost – Dynamic Range

• In wideband ADC apps

• As amount of subsampling grows, can employ an ADC with a lower sampling rate and hence higher-resolution quantizer

Page 53

Dynamic Range

• CS can significantly boost the ENOB of an ADC system for sparse signals

[Plot: stated number of bits vs. log sampling frequency, for a conventional ADC and a CS ADC w/ sparsity]

Page 54

Dynamic Range

• As amount of subsampling grows, can employ an ADC with a lower sampling rate and hence higher-resolution quantizer

• Thus dynamic range of CS ADC can significantly exceed Nyquist ADC

• With current ADC trends, dynamic range gain is theoretically 7.9dB for each doubling in the subsampling factor N/M

Page 55

Dynamic Range

[Plot: dynamic range vs. subsampling; slope = +5 (almost 7.9)]

Page 56

Tradeoff

SNR: 3dB loss for each doubling of the subsampling factor N/M

Dynamic Range: up to 7.9dB gain for each doubling of the subsampling factor N/M

Page 57

Adaptivity

• Say we know the locations of the non-zero entries in x

• Then we can boost the SNR by sensing only the corresponding columns of Φ (sketched below)

• Motivates adaptive sensing strategies that bypass the noise-folding tradeoff [Haupt, Castro, Nowak, B 2009; Candes, Davenport 2011]

[Figure: measurement matrix restricted to the columns on the signal support]
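A hedged sketch of that oracle/adaptive idea (our own toy construction): compare recovery SNR when all N coordinates of signal noise fold into the measurements versus when only the known-support coordinates are sensed.

```python
# If the support of x were known, sensing only those K columns (an M x K system)
# avoids folding the other N-K noise coordinates into the measurements.
import numpy as np

rng = np.random.default_rng(6)
N, K, M = 2048, 10, 128
support = rng.choice(N, K, replace=False)
x = np.zeros(N)
x[support] = 1.0
noise = 0.01 * rng.standard_normal(N)

def oracle_snr_db(Phi_cols, sensed):
    """Least-squares recovery on the known support, reported as SNR in dB."""
    coeffs, *_ = np.linalg.lstsq(Phi_cols, sensed, rcond=None)
    err = coeffs - x[support]
    return 10 * np.log10(np.sum(x[support] ** 2) / np.sum(err ** 2))

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
snr_blind = oracle_snr_db(Phi[:, support], Phi @ (x + noise))                       # all noise folds in
snr_adapt = oracle_snr_db(Phi[:, support], Phi[:, support] @ (x + noise)[support])  # sense support only
print(f"non-adaptive: {snr_blind:.1f} dB, adaptive: {snr_adapt:.1f} dB")
```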

Page 58

Nonsense 2

Quantization

Page 59

CS and Quantization

• Vast majority of work in CS assumes the measurements are real-valued

• In practice, measurements must be quantized (nonlinear)

• Should measure CS performance in terms of number of measurement bits rather than number of (real-valued) measurements (see the sketch below)

• Limited progress
  – large number of bits per measurement
  – 1 bit per measurement
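A hedged sketch of that bit-budget view, using the setup quoted on the next slide (N=2000, K=20, M = total bits / bits per measurement); the uniform quantizer and oracle recovery are our illustrative shortcuts.

```python
# Quantize compressive measurements to B bits and compare recovery error for a
# fixed total bit budget (total bits = M * B): performance per measurement *bit*.
import numpy as np

def quantize(y, bits):
    """Uniform quantizer over the observed range of y."""
    levels = 2 ** bits
    lo, hi = y.min(), y.max()
    step = (hi - lo) / levels
    return lo + (np.floor((y - lo) / step) + 0.5) * step

rng = np.random.default_rng(7)
N, K, total_bits = 2000, 20, 1200
x = np.zeros(N)
x[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

for bits in (2, 4, 8, 12):
    M = total_bits // bits                      # fewer measurements if each costs more bits
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    y_q = quantize(Phi @ x, bits)
    # oracle least-squares recovery on the true support, isolating quantization error
    support = np.flatnonzero(x)
    coeffs, *_ = np.linalg.lstsq(Phi[:, support], y_q, rcond=None)
    err = np.linalg.norm(coeffs - x[support]) / np.linalg.norm(x)
    print(f"{bits:2d} bits/meas, M = {M:4d}: relative error {err:.3f}")
```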

Page 60

CS and Quantization

[Plot: recovery performance vs. total bits, with N=2000, K=20, M = (total bits)/(bits per meas); curves for 1, 2, 4, 6, 8, 10, and 12 bits/meas]

Page 61

Nonsense 3

Weak Models

Page 62

Weak Models

• Sparsity models in CS emphasize discrete bases and frames
  – DFT, wavelets, …

• But in real data acquisition problems, the world is continuous, not discrete

Page 63

The Grid Problem

• Consider a "frequency sparse" signal
  – suggests the DFT sparsity basis

• Easy CS problem: K=1 frequency on the DFT grid

• Hard CS problem: K=1 frequency off the DFT grid
  – slow decay due to sinc interpolation of off-grid sinusoids (asymptotically, signal is not even in L1); see the sketch below
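A hedged illustration of the on-grid vs. off-grid contrast; the signal length and frequencies are our own illustrative choices.

```python
# A sinusoid exactly on the DFT grid is 1-sparse in the DFT basis, while one that
# falls between grid frequencies smears across many bins (slow, sinc-like decay).
import numpy as np

N = 256
t = np.arange(N)

on_grid  = np.cos(2 * np.pi * 10.0 * t / N)    # frequency on the DFT grid
off_grid = np.cos(2 * np.pi * 10.5 * t / N)    # frequency halfway between bins

for name, sig in (("on-grid", on_grid), ("off-grid", off_grid)):
    mags = np.sort(np.abs(np.fft.rfft(sig)))[::-1]
    big = np.sum(mags > 0.01 * mags[0])        # coefficients above 1% of the peak
    print(f"{name:9s}: {big:3d} DFT coefficients above 1% of the largest")
# on-grid: a single dominant coefficient; off-grid: many significant coefficients.
```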

Page 64

Going Off the Grid

• Spectral CS [Duarte, B, 2010]
  – discrete formulation

• CS Off the Grid [Tang, Bhaskar, Shah, Recht, 2012]
  – continuous formulation

[Plot: best, average, and worst case recovery performance; Spectral CS; 20dB]

Page 65

Nonsense 4

Focus on Recovery

Page 66

Misguided Focus on Recovery

• Recall the data deluge problem in sensing
  – ex: large-scale imaging, HSI, video, ultrawideband ADC, …
  – data ambient dimension N too large

• When N ~ billions, signal recovery becomes problematic, if not impossible

• Solution: Perform signal exploitation directly on the compressive measurements

Page 67

Compressive Signal Processing

• Many applications involve signal inference and not reconstruction

detection < classification < estimation < reconstruction

• Good news: CS supports efficient learning, inference, processing directly on compressive measurements

• Random projections ~ sufficient statistics for signals with concise geometrical structure
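A hedged sketch of inference without reconstruction (our own toy templates and sizes): detect which of two known signals was measured by correlating directly in the compressive domain, since random projections roughly preserve inner products.

```python
# Compressive-domain matched filter: classify a measurement against projected templates.
import numpy as np

rng = np.random.default_rng(8)
N, M = 4096, 64
t = np.arange(N)
templates = np.stack([np.sin(2 * np.pi * 37 * t / N),
                      np.sign(np.sin(2 * np.pi * 5 * t / N))])   # two candidate signals

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
proj_templates = templates @ Phi.T               # project the templates once, offline

# observe a noisy compressive measurement of template 1
y = Phi @ (templates[1] + 0.3 * rng.standard_normal(N))

# nearest template in the measurement domain (a compressive matched filter)
scores = proj_templates @ y / np.linalg.norm(proj_templates, axis=1)
print("detected template:", int(np.argmax(scores)))   # expected: 1
```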

Page 68

Classification

• Simple object classification problem
  – AWGN: nearest neighbor classifier

• Common issue:
  – L unknown articulation parameters

• Common solution: matched filter
  – find nearest neighbor under all articulations

Page 69

CS-based Classification

• Target images form a low-dimensional manifold as the target articulates
  – random projections preserve information in these manifolds if M is sufficiently large

• CS-based classifier: smashed filter (sketched below)
  – find nearest neighbor under all articulations under random projection [Davenport, B, et al 2006]
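A hedged sketch of a smashed filter on a 1-D articulation (a shift) with two made-up signal classes; all templates, sizes, and noise levels are our illustrative assumptions.

```python
# Smashed filter: nearest-neighbor classification over class and articulation
# (shift), done entirely on compressive measurements rather than full signals.
import numpy as np

rng = np.random.default_rng(9)
N, M = 512, 40
t = np.arange(N)
classes = [np.exp(-0.5 * ((t - N / 2) / 10.0) ** 2),     # class 0: narrow bump
           np.exp(-0.5 * ((t - N / 2) / 40.0) ** 2)]     # class 1: wide bump

Phi = rng.standard_normal((M, N)) / np.sqrt(M)

# observe a shifted, noisy instance of class 1 through the compressive sensor
true_shift = 37
y = Phi @ (np.roll(classes[1], true_shift) + 0.01 * rng.standard_normal(N))

# search over class and shift in the measurement domain
best = min(((np.linalg.norm(y - Phi @ np.roll(template, s)), c, s)
            for c, template in enumerate(classes)
            for s in range(N)), key=lambda r: r[0])
print(f"class {best[1]}, shift {best[2]}")   # typically class 1, shift at or near 37
```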

Page 70

Smashed Filter

• Random shift and rotation (L=3 dim. manifold)
• White Gaussian noise added to measurements
• Goals: identify most likely shift/rotation parameters; identify most likely class

[Plots: avg. shift estimate error and classification rate (%) vs. number of measurements M, for increasing noise levels]

Page 71

Frequency Tracking

• Compressive Phase Locked Loop (PLL)
  – key idea: phase detector in PLL computes inner product between signal and oscillator output
  – RIP ensures we can compute this inner product between the corresponding low-rate CS measurements

[Figure: CS-PLL w/ 20x undersampling]
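A hedged check of the key idea above (our own toy sinusoids and sizes): the phase-detector inner product computed between low-rate random projections tracks the full-rate value, which is what lets a PLL-style detector run on compressive measurements.

```python
# Inner products are approximately preserved under random projection.
import numpy as np

rng = np.random.default_rng(10)
N, M = 4096, 400
t = np.arange(N) / N

signal     = np.cos(2 * np.pi * 50 * t + 0.3)   # incoming sinusoid
oscillator = np.sin(2 * np.pi * 50 * t)         # local oscillator output

Phi = rng.standard_normal((M, N)) / np.sqrt(M)

full_rate = signal @ oscillator                 # phase-detector inner product at full rate
low_rate  = (Phi @ signal) @ (Phi @ oscillator) # same quantity from CS measurements
print(f"full rate: {full_rate:.1f}, compressive: {low_rate:.1f}")
# approximately equal; the fluctuation shrinks as M grows
```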

Page 72

Nonsense 5

Weak Guarantees

Page 73

Performance Guarantees

• CS performance guarantees
  – RIP, incoherence, phase transition

• To date, rigorous results only for random matrices
  – practically not useful
  – often pessimistic

• Need rigorous guarantees for non-random, structured sampling matrices with fast algorithms
  – analogous to the progress in coding theory from Shannon's original random codes to modern codes

Page 74

Chapter 6

All Is Not Lost !

Page 75

Sparsity

Convex optimization

Dimensionality reduction

Page 76

12-Step Program To End Compressive Nonsensing

1. Don’t give in to the hype surrounding CS

2. Resist the urge to blindly apply L1 minimization

3. Face up to robustness issues

4. Deal with measurement quantization

5. Develop more realistic signal models

6. Develop practical sensing matrices beyond random

7. Develop more efficient recovery algorithms

8. Develop rigorous performance guarantees for practical CS systems

9. Exploit signals directly in the compressive domain

10. Don’t give in to the hype surrounding CS

11. Resist the urge to blindly apply L1 minimization

12. Don’t give in to the hype surrounding CS