Richard Baraniuk
Rice University

Compressive Sensing

Acknowledgements
• For assistance preparing this presentation
• Rice DSP group – Petros Boufounos, Volkan Cevher, Mark Davenport, Marco Duarte, Chinmay Hegde, Jason Laska, Shri Sarvotham, …
• Mike Wakin, University of Michigan – geometry of CS, embeddings
• Justin Romberg, Georgia Tech – optimization algorithms
» image databases, camera arrays, distributed wireless sensor networks, …
• increasing numbers of modalities
» acoustic, RF, visual, IR, UV
• deluge of data
» how to acquire, store, fuse, process efficiently?
Digital Data Acquisition
• Foundation: Shannon sampling theorem: “if you sample densely enough (at the Nyquist rate), you can perfectly reconstruct the original analog data”
[figure: uniform sampling in time and space]
Sensing by Sampling
• Long-established paradigm for digital data acquisition
– uniformly sample data at Nyquist rate (2x Fourier bandwidth)
  sample too much data!
– compress data
[diagram: sample → compress (JPEG, JPEG2000, …) → transmit/store → receive → decompress]
Sparsity / Compressibility
[figure: pixels → large wavelet coefficients (blue = 0); wideband signal samples → large Gabor (time-frequency) coefficients]
Sample / Compress
• Long-established paradigm for digital data acquisition
– uniformly sample data at Nyquist rate
– compress data
[diagram: sample → compress (sparse/compressible wavelet transform) → transmit/store → receive → decompress]
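The sample-then-compress paradigm above can be sketched numerically. This is an illustrative example (not from the slides), using a DCT in place of the wavelet transform: sample a smooth signal densely, transform it to a sparsifying basis, and keep only the K largest coefficients, as JPEG-style transform coders do.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
N, K = 1024, 32

# "Nyquist-rate" samples of a smooth signal
t = np.linspace(0, 1, N)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

alpha = dct(x, norm="ortho")           # transform to a sparsifying basis
idx = np.argsort(np.abs(alpha))[-K:]   # indices of the K largest coefficients
alpha_K = np.zeros(N)
alpha_K[idx] = alpha[idx]              # discard the other N - K coefficients
x_hat = idct(alpha_K, norm="ortho")    # decompress

rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"kept {K}/{N} coefficients, relative error {rel_err:.2e}")
```

Almost all of the signal's energy survives in a small fraction of the transform coefficients, which is exactly the waste the next slide asks about.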
What’s Wrong with this Picture?
• Why go to all the work to acquire N samples only to discard all but K pieces of data?
[diagram: sample (linear processing, linear signal model: bandlimited subspace) → compress (nonlinear processing, nonlinear signal model: union of subspaces) → transmit/store → receive → decompress]
Compressive Sensing
• Directly acquire “compressed” data
• Replace samples by more general “measurements”
[diagram: compressive sensing → transmit/store → receive → reconstruct]
Compressive Sensing
Theory I: Geometrical Perspective
• Signal x is K-sparse in basis/dictionary Ψ
– WLOG assume x is sparse in the space domain

Sampling
[figure: sparse signal x with K nonzero entries]
Compressive Sampling
• When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through linear dimensionality reduction
[figure: measurements y = Φx; sparse signal x with K nonzero entries]
How Can It Work?
• Projection Φ is not full rank…
  … and so loses information in general
• Ex: infinitely many x’s map to the same y
How Can It Work?
• But we are only interested in sparse vectors
• Φ is effectively M x K on the K nonzero entries
How Can It Work?
• Design Φ so that each of its M x K submatrices is full rank
How Can It Work?
• Goal: design Φ so that its M x 2K submatrices are full rank
– difference between two K-sparse vectors is 2K-sparse in general
– preserves information in K-sparse signals
– Restricted Isometry Property (RIP) of order 2K
Unfortunately…
• Goal: design Φ so that its M x 2K submatrices are full rank (Restricted Isometry Property, RIP)
• Unfortunately, this is a combinatorial, NP-complete design problem
Insight from the 80’s [Kashin, Gluskin]
• Draw Φ at random
– iid Gaussian entries
– iid Bernoulli entries
…
• Then Φ has the RIP with high probability
– M x 2K submatrices are full rank
– stable embedding for sparse signals
– extends to compressible signals in l_p balls
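The random-matrix insight above is easy to check empirically. This illustrative sketch (not from the slides) draws a Gaussian Φ and verifies that it approximately preserves the norm of random K-sparse vectors, i.e. behaves like a restricted isometry:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, K = 512, 128, 8

# iid Gaussian matrix, scaled so E[||Phi x||^2] = ||x||^2
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

ratios = []
for _ in range(200):
    x = np.zeros(N)
    support = rng.choice(N, size=K, replace=False)  # random K-sparse support
    x[support] = rng.standard_normal(K)
    ratios.append(np.linalg.norm(Phi @ x) / np.linalg.norm(x))

ratios = np.array(ratios)
print(f"||Phi x|| / ||x|| in [{ratios.min():.2f}, {ratios.max():.2f}]")
```

Over many random sparse vectors the norm ratio concentrates near 1, which is the "stable embedding" behavior the RIP formalizes.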
Compressive Data Acquisition
• Measurements y = random linear combinations of the entries of x: y = Φx
• WHP Φ does not distort the structure of sparse signals – no information loss
[figure: measurements y; sparse signal x with K nonzero entries]
CS Signal Recovery
• Goal: recover signal x from measurements y
• Problem: random projection Φ is not full rank (ill-posed inverse problem)
• Solution: exploit the sparse/compressible geometry of the acquired signal
Concise Signal Structure
• Sparse signal: only K out of N coordinates nonzero
– model: union of K-dimensional subspaces aligned w/ coordinate axes
• Compressible signal: sorted coordinates decay rapidly to zero
– model: l_p ball (p ≤ 1)
[figure: magnitude vs. sorted index, power-law decay]
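The compressible-signal model above can be illustrated numerically. In this sketch (illustrative, not from the slides), a signal whose sorted coefficients follow a power law is approximated well by its K largest terms:

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 1024, 50

# coefficients with power-law magnitudes |x|_(i) ~ i^(-1.5) and random signs
i = np.arange(1, N + 1)
x = i ** (-1.5) * rng.choice([-1.0, 1.0], size=N)
rng.shuffle(x)  # decay holds only after sorting, not in coordinate order

idx = np.argsort(np.abs(x))[-K:]  # keep the K largest entries
x_K = np.zeros(N)
x_K[idx] = x[idx]

rel_err = np.linalg.norm(x - x_K) / np.linalg.norm(x)
print(f"best {K}-term approximation, relative error {rel_err:.3f}")
```

The fast decay of the sorted coefficients is what makes the best K-term approximation error small, and it is the property CS measurements preserve.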
CS Signal Recovery
• Random projection Φ is not full rank
• Recovery problem: given y = Φx, find x
• Null space of Φ is nonempty
• So search in the null space for the “best” x according to some criterion
– ex: least squares
• Minimum l1 solution = sparsest solution (with high probability) if M ≥ cK log(N/K), for signals sparse in the space/time domain
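The l1 recovery step above can be sketched as a linear program (basis pursuit): minimize ||x||_1 subject to Φx = y, recast with the standard split x = u − v, u, v ≥ 0. This is an illustrative implementation assuming SciPy is available, not the authors' own solver:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
N, M, K = 64, 32, 4

# K-sparse ground-truth signal
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = rng.standard_normal(K)

# random Gaussian measurements y = Phi x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x_true

# LP: min sum(u) + sum(v)  s.t.  Phi (u - v) = y,  u, v >= 0
c = np.ones(2 * N)
A_eq = np.hstack([Phi, -Phi])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:N] - res.x[N:]

print("max recovery error:", np.abs(x_hat - x_true).max())
```

With M well above K log(N/K), the minimum-l1 solution coincides with the sparsest one and the K-sparse signal is recovered essentially exactly.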
Universality
• Random measurements can be used for signals sparse in any basis Ψ: x = Ψα
[figure: sparse coefficient vector α with K nonzero entries]
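Universality can be demonstrated in the same basis-pursuit framework. In this illustrative sketch (assumed setup, not from the slides), the signal is sparse in a DCT basis Ψ rather than the space domain, yet the same random Gaussian Φ still works; we recover the coefficient vector α from y = ΦΨα:

```python
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

rng = np.random.default_rng(5)
N, M, K = 64, 32, 4

# orthonormal DCT synthesis basis: columns are basis vectors
Psi = idct(np.eye(N), norm="ortho", axis=0)
alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha                                  # sparse in Psi, dense in time

Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # same random measurements
y = Phi @ x

# basis pursuit on the effective sensing matrix A = Phi Psi
A = Phi @ Psi
res = linprog(np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
alpha_hat = res.x[:N] - res.x[N:]

print("max coefficient error:", np.abs(alpha_hat - alpha).max())
```

The key point is that ΦΨ is again a random Gaussian matrix when Ψ is orthonormal, so the measurement matrix need not be designed for the sparsifying basis.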
Compressive Sensing
• Directly acquire “compressed” data
• Replace N samples by M random projections
[diagram: random measurements → transmit/store → receive → reconstruct via linear programming]
Compressive Sensing
Theory II: Stable Embedding
Johnson-Lindenstrauss Lemma
• JL Lemma: a random projection stably embeds a cloud of Q points whp provided M = O(log Q)
• Proved via a concentration inequality
• Same techniques link the JLL to the RIP [Baraniuk, Davenport, DeVore, Wakin, Constructive Approximation, 2008]
[figure: cloud of Q points embedded into R^M]
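The JL lemma above is straightforward to verify empirically. This illustrative sketch (not from the slides) projects a cloud of Q points with a random Gaussian map and measures the worst pairwise distance distortion:

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, Q = 1000, 300, 20

X = rng.standard_normal((Q, N))                 # cloud of Q points in R^N
Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random JL map to R^M
Y = X @ Phi.T

# distortion of every pairwise distance under the projection
distortions = []
for i in range(Q):
    for j in range(i + 1, Q):
        d_orig = np.linalg.norm(X[i] - X[j])
        d_proj = np.linalg.norm(Y[i] - Y[j])
        distortions.append(abs(d_proj / d_orig - 1))

print(f"worst pairwise distortion: {max(distortions):.3f}")
```

All Q(Q−1)/2 pairwise distances are preserved to within a small factor, even though the dimension drops from 1000 to 300; the required M grows only logarithmically in Q.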
Connecting JL to RIP
• Consider the effect of a random JL map Φ on each K-plane
– construct a covering of points Q on the unit sphere
– JL: isometry for each point with high probability
– union bound ⇒ isometry for all points q in Q
– extend to isometry for all x in the K-plane
– union bound ⇒ isometry for all K-planes
Favorable JL Distributions
• Gaussian
• Bernoulli/Rademacher [Achlioptas]
• “Database-friendly” [Achlioptas]
• Random orthoprojection to R^M [Gupta, Dasgupta]
RIP as a “Stable” Embedding
• RIP of order 2K implies: for all K-sparse x1 and x2,
  (1−δ) ||x1 − x2||² ≤ ||Φx1 − Φx2||² ≤ (1+δ) ||x1 − x2||²
[figure: K-planes embedded into R^M]
• Theorem: Supposing Φ is drawn from a JL-favorable distribution,* then with probability at least 1 − e^(−CM), Φ meets the RIP with M = O(K log(N/K))
Distributed CS (DCS)
– models for joint sparsity
– algorithms for joint reconstruction
– theoretical results for measurement savings
• Zero collaboration, trivially scalable, robust
• Low complexity, universal encoding
…
Real Data Example
• Light sensing in the Intel Berkeley Lab
• 49 sensors, N = 1024 samples each, Ψ = wavelets
[figure: reconstructions with K = 100 and M = 400 measurements]
Multisensor Fusion
• Network of J sensors viewing an articulating object
• Collection of images lies on a K-dim manifold in R^N
• Complexity/bandwidth for fusion and inference
– naïve w/ no compression
– individual CS (sensor by sensor)
– joint CS (fuse all sensors)
Multi-Camera Vision Apps
• Background subtraction from multi-camera compressive video
– innovations (due to object motion) are sparse in the space domain
– estimate innovations from random projections of the video
– only a few measurements required per camera, thanks to the overlapping fields of view
• 3-D shape inference from compressive measurements
– estimate innovations from random projections of the video
– fuse into a 3-D shape by exploiting object sparsity in the 3-D volume
Compressive Sensing
Related Work
• Streaming algorithms
• Multiband sampling
• Finite rate of innovation (FRI) sampling
• General sparse approximation
Compressive Sensing
The End
CS – Summary
• Compressive sensing
– integrates sensing, compression, processing
– exploits signal sparsity/compressibility information
– enables new sensing modalities, architectures, systems
• Why CS works: stable embedding for signals with concise geometric structure
sparse signals | compressible signals | manifolds
• Information scalability for compressive inference
– compressive measurements ~ sufficient statistics
– many fewer measurements may be required to