EE303: Communication Systems
Professor A. Manikas, Chair of Communications and Array Processing
Imperial College London

Principles of Optimum Decision and Detection Theory

Prof. A. Manikas (Imperial College) EE303: Principles of Decision Theory v.19c1
Table of Contents

1 Basic Decision Theory
  - Introduction
  - Hypothesis Testing
  - More on Hypothesis Testing
  - Terminology
  - Decision Criteria
  - Examples
  - Gaussian Likelihood Functions (LFs)
  - Addition of Signals, pdfs and Likelihood Functions
  - Rectangular and Gaussian LFs
2 Optimum Receiver Architectures based on Decision Criteria
  - Basic Concepts on Optimum Receivers
  - Optimum Corr. Receivers
  - Optimum Matched Filter Receivers
  - Signals and Matched Filters in Frequency Domain
  - Matched Filter: Output SNR
3 Equivalent Optimum Rx Architectures using Signal Constellation
4 Summary - Equivalent Optimum Architectures
5 Performance of Binary Digital Modulators/Demodulators
  - Constellation Diagram (Binary)
  - Bit-Error-Rate (BER)
  - Examples
  - Table of BERs
Basic Decision Theory: Introduction

Detection Theory is concerned with determining the existence of a signal in the presence of noise, and can be classified as follows:
N.B.:
1. Adaptive or Learning:
   - system parameters change as a function of the input;
   - difficult to analyse;
   - mathematically complex.
2. Non-Parametric:
   - fairly constant level of performance, because these are based on general assumptions on the input pdf;
   - easier to implement.
3. Consider a parametric detector which has been designed for a Gaussian pdf input:
   - if the input is actually Gaussian: the parametric detector's performance is significantly better;
   - if the input is non-Gaussian but still symmetric: the non-parametric detector's performance may be much better than that of the parametric detector.
Basic Decision Theory: Hypothesis Testing

Definition (Hypothesis)
A Hypothesis is a statement of a possible condition.

Definition (Hypothesis Testing)
To choose one from a number (two or more) of hypotheses.

Example (1: Detection of a signal s(t) in the presence of noise)
We have an observed signal r(t) and define two hypotheses, H1 and H2.

Hypothesis Testing:
H1 : r(t) = s(t) + n(t)   i.e. the signal is present (with probability Pr(H1))
H2 : r(t) = n(t)          i.e. the signal is not present (with probability Pr(H2))
Example (2: Hypothesis Testing in an M-ary Comm System)
In an M-ary Comm. System we have M hypotheses. The aim is to design a receiver which operates on an observed signal r(t) and chooses one of the following M hypotheses:
Note:
- The above can be easily proven using the Bayes rule:

  Pr(Hi | r) = pdf_{r|Hi}(r) · Pr(Hi) / pdf_r(r)    (8)

- Gj is known as the "decision variable".
- In this topic the symbol G will be used to represent a "decision variable".
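As a numerical sketch of the Bayes rule in Equ 8 - the Gaussian likelihoods, means, sigmas and priors below are assumed values for illustration, not taken from the notes - the posteriors Pr(Hi | r) can be computed directly:

```python
import numpy as np

# Numerical sketch of the Bayes rule in Equ 8 (assumed Gaussian likelihoods).
def posterior(r, means, sigmas, priors):
    """Return Pr(Hi | r) for every hypothesis Hi."""
    lf = np.array([np.exp(-(r - m)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
                   for m, s in zip(means, sigmas)])   # pdf_{r|Hi}(r)
    joint = lf * np.asarray(priors)                   # numerator of Equ 8
    return joint / joint.sum()                        # dividing by pdf_r(r)

post = posterior(r=0.8, means=[0.0, 1.0], sigmas=[0.5, 0.5], priors=[0.5, 0.5])
```

Since the observation r = 0.8 lies closer to the mean of the second hypothesis, post[1] exceeds post[0], and the posteriors sum to one by construction.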
Basic Decision Theory: Decision Criteria

Consider the sets of parameters P1 and P2 where:
- P1 denotes the set of a priori probabilities Pr(H1), Pr(H2), ..., Pr(HM);
- P2 represents the set of costs/weights Cij, ∀i,j, associated with the transition probabilities Pr(Di|Hj) (i.e. one weight for every element of the channel transition matrix F - see EE303, Topic on "Comm Channels").

Estimate/identify the likelihood functions

pdf_{r|H1}(r), pdf_{r|H2}(r), ..., pdf_{r|HM}(r)
Corollary (Main Decision Criteria)
Decision: choose the hypothesis Hi (i.e. Di) with the maximum Gi(r), where Gi(r) depends on the chosen criterion as follows:

- BAYES Criterion
- Minimum Probability of Error (min(pe)) Criterion
- MAP Criterion
- MINIMAX Criterion
- Neyman-Pearson (N-P) Criterion
- Maximum Likelihood (ML) Criterion

Criterion        | P1         | P2         | choose Hypothesis with max(Gi(r))
-----------------|------------|------------|---------------------------------------------
Bayes            | known      | known      | Gi(r) ≜ weight_i × Pr(Hi) × pdf_{r|Hi}
min(pe) or MAP   | known      | unknown    | Gi(r) ≜ Pr(Hi) × pdf_{r|Hi}
Minimax          | unknown    | known      | Gi(r) ≜ weight_i × pdf_{r|Hi}
N-P              | unknown    | unknown    | by solving a constrained optimisation problem
ML               | don't care | don't care | Gi(r) ≜ pdf_{r|Hi}
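The table above can be sketched in code. The likelihood values, priors and weights below are hypothetical; the point is that each criterion (the N-P case aside, since it requires a constrained optimisation) only changes the factors multiplying the likelihood pdf_{r|Hi} in Gi(r):

```python
import numpy as np

# Sketch of the decision-criteria table (hypothetical numbers).
def decide(likelihoods, priors=None, weights=None, criterion="ML"):
    """Return the index of the hypothesis with the maximum Gi(r)."""
    g = np.asarray(likelihoods, dtype=float)      # ML: Gi = pdf_{r|Hi}
    if criterion in ("MAP", "min_pe"):
        g = g * np.asarray(priors)                # Gi = Pr(Hi) * pdf_{r|Hi}
    elif criterion == "minimax":
        g = g * np.asarray(weights)               # Gi = weight_i * pdf_{r|Hi}
    elif criterion == "Bayes":
        g = g * np.asarray(weights) * np.asarray(priors)
    return int(np.argmax(g))

lf = [0.30, 0.25]                                 # pdf_{r|H1}(r), pdf_{r|H2}(r)
decide(lf, criterion="ML")                        # -> 0 (chooses H1)
decide(lf, priors=[0.3, 0.7], criterion="MAP")    # -> 1 (chooses H2)
```

With unequal priors, ML and MAP can disagree on the very same observation, which is exactly the role of the P1 column in the table.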
N.B.:
- Note-1: if an approximate/initial solution is required, then any information about the sets of parameters P1 and/or P2 can be ignored. In this case the Maximum Likelihood (ML) Criterion should be used.
- Note-2:

  weight_i ≜ ∑_{j=1, j≠i}^{M} (C_ji − C_ii)    (9)

  or (since the term for i = j is equal to zero) simply,

  weight_i = ∑_{j=1}^{M} (C_ji − C_ii)    (10)

- Note-3: sometimes, for convenience, Gi will be used (i.e. Gi ≜ Gi(r)) - i.e. the argument will be ignored.
Decision Criteria: Mathematical Architectures
Basic Decision Theory: Examples

Example (Gaussian LFs)

[Figure: two Gaussian likelihood functions, pdf_{r|Hj}(r) and pdf_{r|Hi}(r), plotted against r (Volts); the threshold r_th,ji separates the decision regions Dj and Di, with overlapping tail areas A and B.]
By solving Gj(r) = Gi(r), the decision threshold r_th,ji can be estimated.

area-A = Pr(Dj|Hi) and area-B = Pr(Di|Hj)

p_{e,cs} = ∑_{j=1}^{M} ∑_{i=1, i≠j}^{M} Pr(Dj, Hi) = symbol error probability/rate (SER)
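A numerical sketch of the figure and equations above, for two equiprobable Gaussian LFs with assumed means mj = 0, mi = 2 and a common sigma = 1 (none of these values come from the notes):

```python
import math

# Two equiprobable Gaussian likelihood functions (assumed parameters).
mj, mi, sigma = 0.0, 2.0, 1.0

# For equal priors and equal sigmas, solving Gj(r) = Gi(r) gives the midpoint:
r_th = (mj + mi) / 2

def Q(x):
    """Gaussian tail probability Q(x) = Pr(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

area_A = Q((mi - r_th) / sigma)    # Pr(Dj | Hi): Hi sent, but r falls below r_th
area_B = Q((r_th - mj) / sigma)    # Pr(Di | Hj): Hj sent, but r falls above r_th
pe = 0.5 * area_A + 0.5 * area_B   # symbol error probability for M = 2
```

By symmetry the two tail areas are equal here, and pe reduces to Q(1), roughly 0.159.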
Example (Signal + Noise (without using LFs))
Example (Signal + Noise (by using LFs))
Example (Rectangular and Gaussian LFs)
A discrete channel employs two equally probable symbols and the likelihood functions are given by the following expressions:

pdf_{r|H0}(r) = (1/2) rect{r/2};   pdf_{r|H1}(r) = N(0, σ = 1/2)    (11)

If the detector employed has been designed in an optimum way, find the decision rule and model the above discrete channel.

[Figure: the two likelihood functions plotted against r (Volts).]
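A minimal sketch of the optimum decision rule for Equ 11 (ML, since the two symbols are equally probable): compare the two likelihoods point by point. The crossing point worked out below is a consequence of the stated densities, not a value from the notes.

```python
import math

# ML decision rule for Equ 11: pick the larger likelihood at the observed r.
SIGMA = 0.5

def pdf_H0(r):
    """(1/2) rect{r/2}: height 1/2 on |r| < 1, zero elsewhere."""
    return 0.5 if abs(r) < 1 else 0.0

def pdf_H1(r):
    """N(0, sigma = 1/2)."""
    return math.exp(-r**2 / (2 * SIGMA**2)) / (SIGMA * math.sqrt(2 * math.pi))

def decide(r):
    """Choose H1 when its likelihood is larger, else H0."""
    return 1 if pdf_H1(r) > pdf_H0(r) else 0

# The Gaussian falls below the level 1/2 at |r| ~ 0.483 V, so the rule is:
# H1 for |r| < 0.483, H0 for 0.483 < |r| < 1, and H1 again for |r| > 1
# (where the rectangular likelihood is zero).
r_cross = math.sqrt(math.log(pdf_H1(0.0) / 0.5) / 2)
```

The resulting three-region rule is what models the discrete channel: the transition probabilities Pr(Di|Hj) follow by integrating each likelihood over each decision region.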
Optimum Receiver Architectures based on Decision Criteria: Basic Concepts on Optimum Receivers

Receiver = Detector + Decision Device
Consider an M-ary communication model in which one of M signals si(t), for i = 1, 2, ..., M, is received in the time interval (0, Tcs) in the presence of white noise, i.e.

r(t) = { s1(t), or s2(t), or ... , or sM(t) } + n(t),   0 ≤ t ≤ Tcs    (12)

where r(t) denotes the received (observable) signal.
Optimum Receiver Architectures based on Decision Criteria: Optimum Corr. Receivers

Optimum M-ary Receivers
Objective: to design a receiver which operates on r(t) and chooses one of the following M hypotheses:
Optimum Architecture based on Decision Theory
It can be easily proven that:

Di = argmax_i { Gi(r) },   for i = 1, ..., M    (16)

where, depending on the criterion,

Gi(r) = pdf_{r|Hi}(r(t))                             ML
Gi(r) = Pr(Hi) × pdf_{r|Hi}(r(t))                    MAP, or min(pe)
Gi(r) = weight_i × pdf_{r|Hi}(r(t))                  minimax
Gi(r) = weight_i × Pr(Hi) × pdf_{r|Hi}(r(t))         Bayes
Based on Equ 15, the parameter Gi(r) shown in the previous equation can be simplified as follows:

Gi(r) = ∫_0^{Tcs} r(t) si*(t) dt + DCi,   ∀i, ∀criterion    (17)
where DCi, ∀i, depends on the decision criterion. For all the criteria, DCi is given as follows:

DCi ≜  −(1/2) Ei                                      ML
       (N0/2) ln(Pr(Hi)) − (1/2) Ei                   MAP, or min(pe)
       (N0/2) ln(weight_i) − (1/2) Ei                 minimax
       (N0/2) ln(weight_i · Pr(Hi)) − (1/2) Ei        Bayes
                                                      (18)

with Ei ≜ energy of si(t)    (19)
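Equs 17-18 can be sketched in discrete time for the ML branch (DCi = −Ei/2). The two sinusoidal waveforms, the noise level and the sampling step below are assumptions for illustration only:

```python
import numpy as np

# Discrete-time sketch of Gi(r) = integral(r * si) dt + DCi (ML criterion).
dt, Tcs = 0.001, 1.0
t = np.arange(0, Tcs, dt)
s = np.array([np.sin(2 * np.pi * 5 * t),       # s1(t)
              np.sin(2 * np.pi * 10 * t)])     # s2(t)

rng = np.random.default_rng(0)
r = s[1] + 0.3 * rng.standard_normal(t.size)   # H2 is true: r(t) = s2(t) + n(t)

E = np.sum(s**2, axis=1) * dt                  # signal energies Ei
G = s @ r * dt - 0.5 * E                       # Gi(r), with DCi = -Ei/2 (ML)
decision = int(np.argmax(G)) + 1               # index of the chosen hypothesis
```

Since the two sinusoids are orthogonal over (0, Tcs), the correlation with s2(t) dominates and the receiver correctly chooses H2.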
Based on Equ 17, which is repeated below:

Gi(r) = ∫_0^{Tcs} r(t) si*(t) dt + DCi,   ∀i, ∀criterion,

Equ 16 can be implemented by the following optimum architecture:

[Figure: OPTIMUM M-ary RECEIVER (CORRELATION RECEIVER) - r(t) is multiplied by each si(t) and integrated over (0, Tcs); DCi is added to give the decision variable Gi; the decision device compares G1, ..., GM and chooses the hypothesis Di for which Gi is maximum.]
Note-1: If the a priori probabilities are equal, i.e.

Pr(H1) = Pr(H2) = · · · = Pr(HM)    (20)

and the signals have the same energy, i.e.

E1 = E2 = · · · = EM    (21)

then MAP (or min(pe)) in Equs 16 and ?? is simplified as follows:

MAP: Di = argmax_i { Pr(Hi) × pdf_{r|Hi}(r(t)) } = argmax_i { pdf_{r|Hi}(r(t)) } = ML
        = argmax_i { ∫_0^{Tcs} r(t) si*(t) dt }    (22)

i.e.

equiprobable & equipower ⟹ ML = MAP, with DCi = 0    (23)

Note-2: if Pr(H1) = Pr(H2) = · · · = Pr(HM), then

equiprobable ⟹ ML = MAP, with DCi = −(1/2) Ei    (24)
Optimum Receiver Architectures based on Decision Criteria: Optimum Matched Filter Receivers

Optimum M-ary Receivers using Matched Filters (Known Signals in White Noise)

Consider the output of one branch of the correlation receiver:

[Figure: CORRELATOR - r(t) = si(t) + n(t) is multiplied by si(t) and integrated over (0, Tcs), giving point-1.]

at point-1 = output = ∫_0^{Tcs} r(t) · si(t) · dt
In this section we will try to replace the correlator of a correlation receiver with a linear filter (known as a Matched Filter).

[Figure: MATCHED FILTER - r(t) = si(t) + n(t) is passed through a linear filter hi(t), giving point-1; sampling the output at t = Tcs gives point-2.]

at point-1 = ∫_0^{t} r(u) · hi(t − u) · du

at point-2 = output = ∫_0^{Tcs} r(u) · hi(Tcs − u) · du

N.B.: compare the correlator's o/p (point-1) with the matched filter's o/p (point-2).
If we choose the impulse response of the linear filter as

hi(t) = si(Tcs − t),   0 ≤ t ≤ Tcs    (25)

then that linear filter, defined by the above equation, is called a Matched Filter. In this case,

at point-2 = ∫_0^{Tcs} r(u) · si(u) · du = ∫_0^{Tcs} r(t) · si(t) · dt

N.B.: Output of Correlator = Output of Matched Filter, but only at time t = Tcs.
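The equality in the N.B. above can be checked numerically in discrete time. The signal and noise below are toy assumptions, with N samples standing in for the interval (0, Tcs):

```python
import numpy as np

# Check: matched filter output sampled at t = Tcs equals the correlator output.
rng = np.random.default_rng(1)
N = 200
s = rng.standard_normal(N)                 # si(t), sampled
r = s + 0.5 * rng.standard_normal(N)       # r(t) = si(t) + n(t)

point_1 = np.sum(r * s)                    # correlator: integral of r(t) si(t)
h = s[::-1]                                # hi(t) = si(Tcs - t)
mf_out = np.convolve(r, h)                 # matched filter output over time
point_2 = mf_out[N - 1]                    # sampled at t = Tcs
```

At every other sampling instant mf_out differs from the correlator output, which is exactly the "only at time t = Tcs" caveat.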
Example (Matched Filter's Impulse Response)

[Figure: si(t) defined on (0, Tcs), and its reversal in time hi(t) = si(Tcs − t), also on (0, Tcs).]
CORRELATION RECEIVER:

[Figure: the OPTIMUM M-ary CORRELATION RECEIVER, repeated from earlier for comparison.]
or, MATCHED FILTER RECEIVER:

[Figure: OPTIMUM M-ary RECEIVER (MATCHED FILTER RECEIVER) - r(t) is passed through a bank of matched filters hi(t) = si(Tcs − t), i = 1, ..., M; each filter output is sampled at t = Tcs and DCi is added to give the decision variable Gi; the decision device compares G1, ..., GM and chooses the hypothesis Di for which Gi is maximum.]
Optimum Receiver Architectures based on Decision Criteria: Signals and Matched Filters in the Freq. Domain

Let h(t) = { s(Tcs − t),  0 ≤ t ≤ Tcs
           { 0,           elsewhere

Time Domain —FT→ Freq. Domain:

s(t) —FT→ S(f) = ∫_0^{Tcs} s(t) e^{−j2πft} dt

h(t) —FT→ H(f) = ∫_{−∞}^{∞} h(t) e^{−j2πft} dt
              = ∫_0^{Tcs} s(Tcs − t) e^{−j2πft} dt
              = ∫_0^{Tcs} s(u) e^{−j2πf(Tcs−u)} du
              = e^{−j2πfTcs} · ∫_0^{Tcs} s(u) e^{+j2πfu} du
              = e^{−j2πfTcs} · S*(f)

therefore

H(f) = e^{−j2πfTcs} · S*(f)    (26)
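Equ 26 has a direct DFT analogue that can be checked numerically: for a real sampled s(t), reversing it in time conjugates its spectrum, up to the linear phase of the delay (which for an N-point DFT with h[n] = s[N−1−n] becomes e^{−j2πk(N−1)/N}). The random test signal is an assumption:

```python
import numpy as np

# DFT check of H(f) = exp(-j 2 pi f Tcs) * conj(S(f)) for a time-reversed signal.
rng = np.random.default_rng(2)
N = 64
s = rng.standard_normal(N)        # real s(t), sampled on (0, Tcs)
h = s[::-1]                       # h(t) = s(Tcs - t)

S = np.fft.fft(s)
H = np.fft.fft(h)
k = np.arange(N)
delay = np.exp(-2j * np.pi * k * (N - 1) / N)   # discrete version of e^{-j2pi f Tcs}
# H equals delay * conj(S), matching Equ 26 bin by bin.
```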
Optimum Receiver Architectures based on Decision Criteria: Matched Filter - Output SNR

Matched Filter: Output SNR
Suppose next that we have the optimum impulse response h_opt(t). This impulse response provides the maximum output SNR, which can be estimated as follows:

[Figure: MATCHED FILTER - r(t) = s(t) + n(t) is passed through the linear filter h_opt(t); the output is sampled at t = Tcs.]

Important: SNR_out = maximum = 2E/N0 for white noise. Provided that the filter is matched to the signal, it is obvious from the above equation that