Session 22
22.1
Order-Statistic CFAR (OS-CFAR)
• In OS-CFAR, the average noise power in a region is estimated using an order statistic, or ranked sample of the noise power samples in the reference window.
• For example, we might use the sample median instead of the sample mean to estimate the average noise power.
• While an order statistic estimate is not the maximum likelihood estimate if the samples are independent and statistically homogeneous (i.i.d.), order statistics (e.g., the sample median) are much more robust to deviations from this ideal.
Herman Rohling, "Radar CFAR Thresholding in Clutter and Multiple Target Situations," IEEE Transactions on Aerospace and Electronic Systems, vol. 19, pp. 608–621, 1983.
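A minimal Mathematica sketch (not part of the original slides) of the idea on slide 22.1, assuming exponentially distributed noise power samples with mean mu; the names mu and refWindow are illustrative. Since the median of an exponential with mean mu is mu·ln 2, a scaled sample median can stand in for the sample mean as a noise power estimate.

(* Estimate the average noise power with the sample mean and with a scaled sample median. *)
SeedRandom[1];
mu = 2.0;                                       (* true average noise power *)
refWindow = RandomVariate[ExponentialDistribution[1/mu], 24];
Mean[refWindow]                                 (* sample-mean estimate (the i.i.d. maximum-likelihood choice) *)
Median[refWindow]/Log[2]                        (* order-statistic estimate: median of Exp(mean mu) is mu*Log[2] *)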
22.2
Order statistics

Given i.i.d. RVs X_1, ..., X_n, we can order these as

X_(1) ≤ X_(2) ≤ · · · ≤ X_(n).

We call this ordering of the RVs X_1, ..., X_n the order statistics of X_1, ..., X_n.
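A one-line illustration (not from the notes) of the definition above: sorting i.i.d. samples yields their order statistics. The standard normal distribution and n = 7 are arbitrary choices.

(* The order statistics of i.i.d. samples are simply the sorted samples. *)
SeedRandom[2];
x = RandomVariate[NormalDistribution[0, 1], 7];   (* X_1, ..., X_7 *)
Sort[x]                                           (* X_(1) <= X_(2) <= ... <= X_(7) *)
Sort[x][[4]]                                      (* the 4th order statistic; here the sample median *)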
22.3

Suppose X_1, ..., X_n are i.i.d. RVs with pdf f_X(x) and cdf F_X(x). Assume the X_k are absolutely continuous RVs.

What is the pdf of the k-th order statistic X_(k)?

Let's call the pdf of X_(k) f_k(x). We want to find f_k(x).
22.4

f_k(x) dx = P({x < X_(k) ≤ x + dx}),   dx a small positive number,
          = P(B_k),

where B_k = {x < X_(k) ≤ x + dx}.

The event B_k occurs iff
(i) k − 1 of the RVs are less than x,
(ii) one RV is in (x, x + dx],
(iii) the remaining n − k RVs take on values greater than x + dx.
22.5

For any one of the n RVs X_1, ..., X_n (denote the one selected by X), define the events

A_1 ≡ {X ≤ x}
A_2 ≡ {x < X ≤ x + dx}
A_3 ≡ {X > x + dx}.

{A_1, A_2, A_3} is a partition of ℝ.
22.6

For any of the n i.i.d. RVs X_1, ..., X_n,

P(A_1) = P({X ≤ x}) = F_X(x)
P(A_2) = P({x < X ≤ x + dx}) = f_X(x) dx
P(A_3) = P({X > x + dx}) = 1 − F_X(x + dx) = 1 − F_X(x),   since F_X(x) is continuous.
22.7

For the n i.i.d. RVs X_1, ..., X_n, we know that B_k occurs iff
1. A_1 occurs k − 1 times,
2. A_2 occurs once,
3. A_3 occurs n − k times.

We can compute the probability of B_k using the multinomial distribution for 3 events.
22.8

P(B_k) = \frac{n!}{(k-1)!\,1!\,(n-k)!}\,[P(A_1)]^{k-1}\,P(A_2)\,[P(A_3)]^{n-k}
       = \frac{n!}{(k-1)!\,(n-k)!}\,[F_X(x)]^{k-1}\,[f_X(x)\,dx]\,[1 - F_X(x)]^{n-k} = f_k(x)\,dx.

Therefore

f_k(x) = \frac{n!}{(k-1)!\,(n-k)!}\,[F_X(x)]^{k-1}\,[1 - F_X(x)]^{n-k}\,f_X(x).
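As a quick sanity check on the density just derived (not part of the original notes), the sketch below plugs in the Uniform(0,1) case, where f(x) = 1 and F(x) = x, and confirms that f_k integrates to 1 for every rank k; n = 9 is an arbitrary choice.

(* For Uniform(0,1) samples, f_k(x) = n!/((k-1)!(n-k)!) x^(k-1) (1-x)^(n-k) on [0,1]. *)
fkUniform[x_, n_, k_] := n!/((k - 1)! (n - k)!) x^(k - 1) (1 - x)^(n - k);
Table[Integrate[fkUniform[x, 9, k], {x, 0, 1}], {k, 1, 9}]   (* -> {1, 1, 1, 1, 1, 1, 1, 1, 1} *)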
22.9

When n is odd, we can set k = (n + 1)/2, and we get the order statistic called the sample median X_((n+1)/2) of X_1, ..., X_n. Equal numbers of RVs lie above and below the sample median.

Order statistics are used in
(1) median filters,
(2) order statistic filters,
(3) OS-CFAR processors.
22.10

[Figure: densities plotted over (−3, 3).] If the X_i are i.i.d. with the Cauchy density f(x) = 1/(π(1 + x²)), then the sample mean X̄_n has density f_{X̄_n}(y) = 1/(π(1 + y²)) for all n = 1, 2, 3, ..., so computing the sample mean is useless; the sample medians, on the other hand, converge in density to the median.
Fig. 4.1. Order-Statistic CFAR Processor.
[13] made the observation that threshold determination in OS-CFAR is similar to
modified median filtering [21]. In that respect, OS-CFAR processing shares common
traits with stack filtering [22] as well.
4.1 OS-CFAR System Description
Fig. 4.1 shows a typical OS-CFAR processor. There are only two significant
differences between the OS-CFAR processor and the CA-CFAR processor. First,
the clutter power estimate Z is made from the k-th order statistic by sorting the
observations in the reference window in ascending order, and selecting the k-th one:
Z = X_(k). (4.1)
The rank of the order statistic to be used is determined in advance. It can be
any value 1 ≤ k ≤ N, and is typically chosen to maximize detection performance.
Rohling [1] suggests a value of k near 3N/4. Analysis presented in chapter 6 suggests
that a value of k near 4N/5, in general, optimizes detection performance.
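A minimal Mathematica sketch of the decision built on Eq. (4.1), not taken from the thesis: sort the reference window, take the k-th order statistic as Z, and declare a target when the cell under test exceeds T·Z. The names osDetect and ref, the cell-under-test value, and the scaling factor are illustrative; the 16.95 simply echoes the N = 16 example computed later in the lecture.

(* OS-CFAR decision sketch: Z = X_(k); target declared when y > t Z. *)
osDetect[refWindow_List, y_, t_, k_Integer] := Module[{z},
   z = Sort[refWindow][[k]];   (* k-th order statistic of the reference cells *)
   y > t z];                   (* True <=> target declared in the cell under test *)

SeedRandom[3];
ref = RandomVariate[ExponentialDistribution[1/1.0], 16];   (* N = 16 reference cells, mean power 1 *)
osDetect[ref, 30.0, 16.95, Round[4*16/5]]                  (* k = 13, i.e., near 4N/5 *)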
The second difference between OS-CFAR and CA-CFAR is the method with
which the threshold scaling factor T is selected, required by the use of a different
clutter power estimator. In CA-CFAR, T is a function of the reference window size N and the desired probability of false alarm.
In OS-CFAR, the reference noise samples X_1, ..., X_N are sorted from smallest to largest and designated
X_(1) ≤ X_(2) ≤ · · · ≤ X_(N).
The k-th order statistic X_(k), or some scaled version of it, can then be used as the mean power estimate.
* See: Michael F. Rimbert, Constant False Alarm Rate Detection Techniques Based on Empirical Distribution Function Statistics, Ph.D. thesis, School of Electrical and Computer Engineering, Purdue University, August 2005.
While the median seems like a logical choice, choosing k in the range of 3N/4 to 4N/5 has been shown to work well.
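The phrase "or some scaled version of it" can be made concrete for the exponential noise model used later in this lecture: the mean of the k-th order statistic of N exponential samples with mean mu is mu·(H_N − H_{N−k}), with H_m the m-th harmonic number, so dividing X_(k) by that factor gives an unbiased mean-power estimate. This is a standard result assumed here, not something stated on the slide; the sketch below checks it by simulation with arbitrary example values.

(* For i.i.d. Exp(mean mu) samples, E[X_(k)] = mu (HarmonicNumber[n] - HarmonicNumber[n-k]). *)
SeedRandom[4];
mu = 1.5; n = 16; k = 13;
scale = HarmonicNumber[n] - HarmonicNumber[n - k];
estimates = Table[Sort[RandomVariate[ExponentialDistribution[1/mu], n]][[k]]/scale, {5000}];
{Mean[estimates], mu}   (* the Monte Carlo average should land close to mu *)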
22.11
Behavior of OS-CFAR
• OS-CFAR is robust to outliers deviating from a set of homogeneous i.i.d. samples in the reference window because order statistics (especially central order statistics near the median) are robust to outliers.
• This is in fact why statistician John W. Tukey developed and advocated statistical estimation techniques based on them.
• More general results from the theory of order statistic filters may also yield interesting new CFAR techniques.
• How well do they behave compared to optimal CA-CFAR when the noise reference samples are i.i.d.?
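To make the first bullet concrete, here is a small simulation sketch (not from the slides): a single strong return is injected into an otherwise homogeneous exponential reference window, and the cell-averaging estimate is compared with the k-th order statistic. The contamination level and window parameters are arbitrary.

(* One strong interfering-target return inside the reference window. *)
SeedRandom[5];
mu = 1.0; n = 16; k = 13;
clean = RandomVariate[ExponentialDistribution[1/mu], n];
contaminated = ReplacePart[clean, 1 -> 100. mu];    (* an interfering return ~20 dB above the noise *)
{Mean[clean], Sort[clean][[k]]}                     (* mean-based vs. order-statistic estimate, clean window *)
{Mean[contaminated], Sort[contaminated][[k]]}       (* the mean is dragged up; X_(13) hardly moves *)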
22.12
Analysis of OS-CFAR
Assume that X_1, ..., X_N are i.i.d. samples from a common pdf f(x) having corresponding cdf F(x). If we form the order statistics

X_(1) ≤ X_(2) ≤ · · · ≤ X_(N),

it can be shown (see Papoulis, Ch. 8) that the pdf of X_(k) is*

f_k(x) = \frac{N!}{(k-1)!\,(N-k)!}\,[F(x)]^{k-1}\,f(x)\,[1 - F(x)]^{N-k} = k \binom{N}{k}\,[F(x)]^{k-1}\,f(x)\,[1 - F(x)]^{N-k}.

* Also derived on slides 22.2–22.8 of this lecture.

22.13
Now if the X_i are i.i.d. with pdf

f(x) = \frac{1}{\mu}\,e^{-x/\mu}\,1_{[0,\infty)}(x),

as we have been assuming under H_0, then this becomes

f_k(x) = \frac{N!}{(k-1)!\,(N-k)!}\,[1 - e^{-x/\mu}]^{k-1} \cdot \frac{1}{\mu}\,e^{-x/\mu} \cdot [e^{-x/\mu}]^{N-k} \cdot 1_{[0,\infty)}(x)
       = \frac{k}{\mu} \binom{N}{k}\,[e^{-x/\mu}]^{N-k+1}\,[1 - e^{-x/\mu}]^{k-1} \cdot 1_{[0,\infty)}(x).
Thus the pdf of the OS-CFAR statistic Z = X_(k) is

f_Z(z) = \frac{k}{\mu} \binom{N}{k}\,[e^{-z/\mu}]^{N-k+1}\,[1 - e^{-z/\mu}]^{k-1} \cdot 1_{[0,\infty)}(z).
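As a cross-check on this density (not part of the slides), the sketch below compares the mean implied by f_Z, obtained by numerical integration, with a Monte Carlo average of the k-th order statistic of N exponential samples; N = 16, k = 13, and mu = 2 are arbitrary example values.

(* Mean of f_Z by numerical integration vs. a Monte Carlo average of Z = X_(k). *)
mu = 2.0; n = 16; k = 13;
fZ[z_] := (k/mu) Binomial[n, k] Exp[-z/mu]^(n - k + 1) (1 - Exp[-z/mu])^(k - 1);
NIntegrate[z fZ[z], {z, 0, Infinity}]
SeedRandom[6];
Mean[Table[Sort[RandomVariate[ExponentialDistribution[1/mu], n]][[k]], {20000}]]   (* should agree with the integral *)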
22.14
Thus the probability of false alarm for OS-CFAR is given by

P_{FA} = E_Z[\,P(Y > TZ \mid H_0)\,] = E_Z[\,e^{-TZ/\mu}\,]
       = \frac{k}{\mu} \binom{N}{k} \int_0^{\infty} e^{-Tz/\mu}\,[e^{-z/\mu}]^{N-k+1}\,[1 - e^{-z/\mu}]^{k-1}\,dz
       = k \binom{N}{k}\,B(T + N - k + 1,\;k) = \prod_{i=0}^{k-1} \frac{N - i}{N - i + T}.

Similarly, the probability of detection is

P_D = E_Z[\,P(Y > TZ \mid H_1)\,] = E_Z[\,e^{-TZ/(\mu(1+S))}\,] = \prod_{i=0}^{k-1} \frac{N - i}{N - i + \frac{T}{1+S}}.
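A quick simulation sketch (not in the original notes) of the false-alarm expression: under H_0, draw the cell under test Y and the reference window from the same exponential distribution, apply the test Y > T·X_(k), and compare the empirical false-alarm rate with the product formula. A relatively large Pfa (0.05) is used so a modest number of trials gives a stable estimate; the variable names are illustrative.

(* Monte Carlo check of Pfa = Product_{i=0}^{k-1} (N-i)/(N-i+T) under H0. *)
pfaOS[n_, k_, t_] := Product[(n - i)/(n - i + t), {i, 0, k - 1}];
n = 16; k = 13; mu = 1.0;
tScale = r /. FindRoot[pfaOS[n, k, r] == 0.05, {r, 1}];    (* scaling factor giving Pfa = 0.05 *)
SeedRandom[7];
trials = 20000;
falseAlarms = Count[
   Table[RandomVariate[ExponentialDistribution[1/mu]] >
      tScale Sort[RandomVariate[ExponentialDistribution[1/mu], n]][[k]], {trials}], True];
{N[falseAlarms/trials], pfaOS[n, k, tScale]}               (* both should be near 0.05 *)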
[Figure] Fig. 4.3. PD as a function of N and SNR for desired PFA = 1 × 10⁻⁶.
Table 4.1 Threshold Scaling Factor, N = 8, Pfa = 0.5.
k T ADT
1 8.000000 1.000000
2 3.094810 0.828967
3 1.797094 0.780880
4 1.195722 0.758714
5 0.843966 0.746508
6 0.607335 0.739648
7 0.429121 0.737169
8 0.273518 0.743384
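The table can be reproduced with a short sketch (not from the thesis) under two assumptions that are mine, not statements in the text: T solves the product-form Pfa expression with Pfa = 0.5, and ADT is the average decision threshold T·E[X_(k)]/µ, which for exponential noise equals T·(H_N − H_{N−k}). With these assumptions the printed values match the tabulated ones.

(* Reproduce Table 4.1 (N = 8, Pfa = 0.5), assuming ADT = T (HarmonicNumber[8] - HarmonicNumber[8-k]). *)
pfaOS[n_, k_, t_] := Product[(n - i)/(n - i + t), {i, 0, k - 1}];
TableForm[
  Table[
    Module[{tk},
      tk = r /. FindRoot[pfaOS[8, k, r] == 0.5, {r, 1}];
      {k, tk, tk (HarmonicNumber[8] - HarmonicNumber[8 - k])}],
    {k, 1, 8}],
  TableHeadings -> {None, {"k", "T", "ADT"}}]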
In[1]:= osPfa[n_, k_, t_] := Module[{i}, Product[(n - i)/(n - i + t), {i, 0, k - 1}]]   (* OS-CFAR Pfa: product over i = 0..k-1 of (N-i)/(N-i+T) *)
In[2]:= osPd[n_, k_, t_, s_] := Module[{i}, Product[(n - i)/(n - i + t/(1 + s)), {i, 0, k - 1}]]   (* OS-CFAR Pd at absolute SNR s *)
In[3]:= findOS[n_, quantile_] := Module[{}, Round[n*quantile]]   (* rank k nearest the requested quantile of the window size *)
In[4]:= findT[n_, k_, pfa_] := Module[{sol, r}, sol = FindRoot[osPfa[n, k, r] == pfa, {r, 1}]; r /. sol]   (* threshold scaling factor T achieving the requested Pfa *)
In[5]:= caPd[n_, pfa_, s_] := Module[{}, ((1 + s)/(pfa^(-1/n) + s))^n]   (* CA-CFAR Pd for comparison *)
Mathematica Code to Generate Plots   22.17
n = 16
k = findOS[n, 4/5]
Out[7]=13
t = findT[n, k, 0.000001]
Out[8]=16.9527
LogPlot[{0.000001^(1/(1 + s)), caPd[n, 0.000001, s], osPd[n, k, t, s]}, {s, 0, 20}]
[Plot: "A Typical Run", showing PD versus S (signal-to-noise ratio, absolute) for the Ideal (N → ∞) detector, CA-CFAR, and OS-CFAR.]
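A simulation sketch (not in the original) that cross-checks the plotted osPd formula at a single SNR point. The target model is the one implied by the PD expectation above: under H_1 the cell under test is exponential with mean mu(1 + s); mu = 1 and s = 10 are arbitrary, while n, k, and t match the example run.

(* Simulation cross-check of osPd at one SNR point. *)
n = 16; k = 13; t = 16.9527; mu = 1.0; s = 10.0;
SeedRandom[8];
trials = 20000;
detections = Count[
   Table[RandomVariate[ExponentialDistribution[1/(mu (1 + s))]] >
      t Sort[RandomVariate[ExponentialDistribution[1/mu], n]][[k]], {trials}], True];
{N[detections/trials], osPd[n, k, t, s]}   (* empirical vs. closed-form PD; the two should agree closely *)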
Problem \ Processor    CA-CFAR   GO-CFAR   SO-CFAR   OS-CFAR
Clutter Edges          Poor      Good      Poor      Good
Interfering Targets    Poor      Poor      Good      Good
better able to resolve closely spaced targets, but suffers from higher false-alarm rates
at clutter edges. Order-Statistic CFAR, on the other hand, has been shown to be
robust in both the presence of statistical outliers (interfering targets) and clutter edges.