Page 1: Ppt

SPARSE COMPLEX CHANNEL ESTIMATION USING LINEARIZED BELIEF PROPAGATION

PRESENTATION BY

MAHADEVI PILLAI PERUMAL

GRADUATE STUDENT

THE OHIO STATE UNIVERSITY

Page 2: Ppt


OUTLINE

• Introduction

• Linearized Belief Propagation

• Complex Case Analysis

• Block Sparsity

• Performance Analysis

• Conclusion

Page 3: Ppt

Channels with a sparse impulse response arise in a number of communication applications.

Sparse channels have an impulse response characterized by a large number of zero-valued tap coefficients.

They are widely encountered in radar and Ultra-Wideband (UWB) communication.

Conventional estimation techniques that ignore this structure perform poorly.

Algorithms that exploit the sparsity of the channel are therefore needed.

INTRODUCTION


Page 4: Ppt

Derive the necessary conditions under which the Linearized Belief Propagation (LBP) algorithm, which exists for the real-valued case, can be extended to the complex-valued case.

Perform simulations and compare the performance of LBP in the complex-valued case against its performance in the real-valued case.

PROJECT OBJECTIVES


Page 5: Ppt


System Model

Linear mixing estimation problem – the n components of the input vector x are coupled into the m components of the output vector y.
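As a sketch of the model, in assumed notation following the linear mixing setup of [1]: the transform output z is observed through a componentwise measurement channel.

```latex
% Linear mixing model (notation assumed, after [1]):
z = A x, \qquad A \in \mathbb{C}^{m \times n},\; x \in \mathbb{C}^{n},
\qquad y_i \sim p_{Y \mid Z}(y_i \mid z_i), \quad i = 1, \dots, m
```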

Page 6: Ppt


OUTLINE

• Introduction

• Linearized Belief Propagation

• Complex Case Analysis

• Block Sparsity

• Performance Analysis

• Conclusion

Page 7: Ppt

MESSAGE PASSING ALGORITHM


Page 8: Ppt

STANDARD BP


• Most common approach to the linear mixing estimation problem.

• Iteratively updates estimates of the variables based on message passing along a graph.

• When the factor graph contains no loops, BP yields exact posteriors after only two rounds of message passing (i.e., forward and backward).


• With loops, however, convergence to the exact posteriors is not guaranteed.
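For reference, a sketch of the standard BP message updates in assumed notation (after [1]). Note that the factor-to-variable integral runs over every input coupled to output i except x_j, so its dimension equals the degree of the output node minus one.

```latex
% Variable-to-factor and factor-to-variable BP messages (notation assumed):
\mu_{j \to i}(x_j) \;\propto\; p(x_j) \prod_{k \neq i} \mu_{k \to j}(x_j),
\qquad
\mu_{i \to j}(x_j) \;\propto\; \int p\!\Big(y_i \,\Big|\, \sum_r a_{ir} x_r\Big)
  \prod_{r \neq j} \mu_{r \to i}(x_r) \prod_{r \neq j} dx_r
```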

Page 9: Ppt

DRAWBACK OF STANDARD BP


Output Node Message


• The dimension of the integration equals the number of inputs coupled to each output, so its complexity grows exponentially with the density of the measurement matrix

• With loops, BP yields only an approximation to the exact posteriors

Page 10: Ppt

LINEARIZED BELIEF PROPAGATION


• Extension of Relaxed BP to the case of a general output channel

• Simplifies standard BP using two Gaussian approximations:
(1) Central Limit Theorem at the input nodes
(2) Second-order Taylor series at the output nodes

• The Gaussian messages decouple the vector estimation problem into scalar-valued estimation problems with Gaussian noise at the input and output nodes (see the sketch below)
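To make the decoupling concrete, here is a minimal, real-valued sketch in the spirit of LBP's Gaussian approximations. It is not the algorithm derived in these slides: the per-edge means and variances are collapsed into a single effective observation per component (the AMP-style simplification of relaxed BP [1]), and a soft-threshold denoiser stands in for the actual sparse prior. All names and parameter choices are illustrative assumptions.

```python
import numpy as np

def soft_threshold(r, theta):
    """Elementwise soft-threshold denoiser (illustrative sparse prior)."""
    return np.sign(r) * np.maximum(np.abs(r) - theta, 0.0)

def amp_estimate(A, y, n_iter=30):
    """AMP-style iteration for y = A x + w with sparse x.

    Collapses LBP's per-edge Gaussian messages into one effective
    scalar observation per component, plus the Onsager correction.
    """
    m, n = A.shape
    x_hat = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        theta = np.linalg.norm(z) / np.sqrt(m)   # threshold from residual energy
        r = x_hat + A.T @ z                      # decoupled pseudo-observation
        x_hat = soft_threshold(r, theta)         # scalar denoising step
        onsager = (z / m) * np.count_nonzero(x_hat)  # (1/m)*sum(eta') correction
        z = y - A @ x_hat + onsager              # corrected residual
    return x_hat

# Toy usage: recover a 10-sparse vector from 128 noisy measurements.
rng = np.random.default_rng(0)
m, n, k = 128, 256, 10
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
y = A @ x + 0.01 * rng.standard_normal(m)
x_hat = amp_estimate(A, y)
print("relative error:", np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```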

Page 11: Ppt

CLT APPROXIMATION



Gaussian Approximation

• CLT approximation – models the uncertainty in all components other than xj as Gaussian noise

• Only the mean and variance of that Gaussian noise are passed to the function node.

• The integration dimension is reduced to one.

Input Node Message
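In assumed notation (after [1]): writing z_i = a_ij x_j + sum over r != j of a_ir x_r, the CLT treats the residual sum as Gaussian noise, characterized by the means and variances of the incoming messages.

```latex
% CLT approximation at output node i, as seen from input j (notation assumed):
z_i \;\approx\; a_{ij} x_j + v,
\qquad v \sim \mathcal{N}\big(\hat{z}_{i \setminus j},\, \tau_{i \setminus j}\big),
\qquad
\hat{z}_{i \setminus j} = \sum_{r \neq j} a_{ir}\, \hat{x}_{r \to i},
\quad
\tau_{i \setminus j} = \sum_{r \neq j} |a_{ir}|^2\, \tau_{r \to i}
```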

Page 12: Ppt

TAYLOR SERIES APPROXIMATION


• Taylor series approximation – makes the message from each function node Gaussian.

• The output message at each iteration is Gaussian.

• Only the mean and variance of that Gaussian are passed to the variable node.

Output Node Message
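Concretely (a sketch in assumed notation): with H denoting the logarithm of the output node message, the second-order expansion around the CLT mean gives

```latex
% Second-order Taylor expansion of the log-message (notation assumed):
H(z) \;\approx\; H(\hat{z}) + H'(\hat{z})\,(z - \hat{z})
      + \tfrac{1}{2} H''(\hat{z})\,(z - \hat{z})^2
```

so e^{H(z)} is Gaussian in z, and only its mean and variance need to be propagated.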

Page 13: Ppt


OUTLINE

• Introduction

• Linearized Belief Propagation

• Complex Case Analysis

• Block Sparsity

• Performance Analysis

• Conclusion

Page 14: Ppt

COMPLEX CASE


• Extend LBP to estimate a complex-valued channel. What conditions are needed?

• Real case – the output node message is approximated with a one-dimensional Taylor series

Page 15: Ppt

TWO DIMENSION TAYLOR SERIES


• In the complex case, the approximation involves a two-dimensional Taylor series, which introduces a cross-term

This cross-term needs to be eliminated; see the expansion below.
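A sketch of the expansion, in the same assumed notation, with z = z_R + i z_I and increments Δz_R = z_R − ẑ_R, Δz_I = z_I − ẑ_I:

```latex
% Two-dimensional Taylor expansion of the log-message (notation assumed):
H(z_R, z_I) \;\approx\; H(\hat{z}_R, \hat{z}_I)
  + \frac{\partial H}{\partial z_R}\,\Delta z_R
  + \frac{\partial H}{\partial z_I}\,\Delta z_I
  + \frac{1}{2}\frac{\partial^2 H}{\partial z_R^2}\,\Delta z_R^2
  + \frac{1}{2}\frac{\partial^2 H}{\partial z_I^2}\,\Delta z_I^2
  + \underbrace{\frac{\partial^2 H}{\partial z_R\,\partial z_I}\,\Delta z_R\,\Delta z_I}_{\text{cross-term}}
```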

Page 16: Ppt

CONDITIONAL INDEPENDENCE


When the cross-term $\partial^2 H / \partial z_R \partial z_I = 0$: conditional independence of the real and imaginary parts of the channel.

Similar to the real-valued case

Output Node Message

Only the mean and variance of the Gaussian message need to be sent to the variable node at every iteration.

NECESSARY CONDITION TO EXTEND LBP TO THE COMPLEX CASE: CONDITIONAL INDEPENDENCE ON THE CHANNEL
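With the cross-term equal to zero, the quadratic form separates (a sketch in the same assumed notation):

```latex
% Without the cross-term, the expansion splits into two real-valued parts:
H(z_R, z_I) \;\approx\; H_R(z_R) + H_I(z_I)
```

i.e., the message factors into independent Gaussians on the real and imaginary parts, reducing the update to two parallel real-valued updates.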

Page 17: Ppt


OUTLINE

• Introduction

• Linearized Belief Propagation

• Complex Case Analysis

• Block Sparsity

• Performance Analysis

• Conclusion

Page 18: Ppt


BLOCK SPARSITY
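As a sketch of the idea (an assumption drawn from the conclusion slide): a sparse complex vector is block-sparse when viewed over the reals, since the real and imaginary parts of each tap are zero or nonzero together.

```latex
% A sparse complex tap x_j viewed as a length-2 real block (illustrative):
x_j = x_{j,R} + i\, x_{j,I},
\qquad
x_j = 0 \;\Longleftrightarrow\; (x_{j,R},\, x_{j,I}) = (0, 0)
```

The support is therefore shared across each block of size two, and a complex-valued estimator can exploit this joint structure.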

Page 19: Ppt


OUTLINE

• Introduction

• Linearized Belief Propagation

• Complex Case Analysis

• Block Sparsity

• Performance Analysis

• Conclusion

Page 20: Ppt

MEDIAN SQUARED ERROR PERFORMANCE


• In the complex-valued case, the median squared error converges to a lower value than in the real-valued case

• Performance improvement in the complex-valued case

[Figure: median squared error vs. iteration, real-valued case and complex-valued case]

Page 21: Ppt

DIFFERENT MEASUREMENT RATIOS

[Figure: median squared error at different measurement ratios, complex-valued case and real-valued case]

A performance gain is observed in the complex-valued case.

Page 22: Ppt

CONCLUSION


• Successfully derived the necessary condition for extending LBP to the complex-valued case

• Implemented the algorithm for both the real and complex cases

• Simulation results showed better performance in the complex case because the algorithm exploits the block sparsity of the input vector

• Future work: performance analysis of LBP for non-Gaussian channels

Page 23: Ppt

REFERENCES


[1] S. Rangan, "Estimation with random linear mixing, belief propagation, and compressed sensing," arXiv:1001.2228v1, May 2010.

[2] P. Schniter, "Joint estimation and decoding for sparse channels via relaxed belief propagation," in Proc. Asilomar Conf. on Signals, Systems, and Computers, Pacific Grove, CA, Nov. 2010.

[3] S. Rangan, "Estimation with random linear mixing, belief propagation, and compressed sensing," arXiv:1001.2228v2, May 2010.

[4] J. S. Yedidia, W. T. Freeman, and Y. Weiss, "Understanding belief propagation and its generalizations," in Exploring Artificial Intelligence in the New Millennium, Chap. 8, pp. 239-269, Jan. 2003.

Page 24: Ppt


THANK YOU