Gracheva Inessa - Fast Global Image Denoising Algorithm on the Basis of Nonstationary Gamma-Normal Statistical Model


Fast Global Image Denoising Algorithm on the Basis of Nonstationary Gamma-Normal Statistical Model

Gracheva Inessa, gia1509@mail.ru; Kopylov Andrey, and.kopylov@gmail.com; Krasotkina Olga, O.V.Krasotkina@yandex.ru
Tula State University, Tula, Russia

AIST Conference, April 2015, Ekaterinburg

Formulation of the problem

What do we want?
- the method should effectively remove Gaussian noise as well as Poissonian noise;
- the method should satisfy strict constraints on computational cost, so as to be able to process large data sets;
- the algorithm should require as little user input as possible, to facilitate its application and to enhance the reproducibility of its results.

To achieve these purposes, a nonstationary gamma-normal noise model is proposed in the framework of the Bayesian approach to image processing. This model allows us to develop a fast global algorithm on the basis of the Gauss-Seidel procedure and a Kalman filter-interpolator.

Image denoising

The observed image $Y = (y_t,\ t \in T)$ and the hidden image $X = (x_t,\ t \in T)$ are defined on the lattice
$$T = \{t = (t_1, t_2):\ t_1 = 1,\ldots,N_1,\ t_2 = 1,\ldots,N_2\}, \qquad y_t \in \mathbb{Y},\ x_t \in \mathbb{X}.$$

[Figure: the analyzed image and the processing result.]

Probabilistic data model

The joint conditional probability density:
$$\Phi(Y \mid X, \delta) = \frac{1}{(2\pi\delta)^{N/2}} \exp\Bigl(-\frac{1}{2\delta}\sum_{t \in T}(y_t - x_t)^2\Bigr),$$
where $\delta$ is the variance of the observation noise and $N = N_1 N_2$.

The a priori joint distribution:
$$\Psi(X \mid \Lambda, \delta) = \frac{1}{(2\pi\delta)^{(N-1)/2}\prod_{t}\lambda_t^{1/2}} \exp\Bigl(-\frac{1}{2\delta}\sum_{(t',t'')\in V}\frac{1}{\lambda_{t''}}(x_{t'} - x_{t''})^2\Bigr),$$
where the $\lambda_t$ are the proportionality coefficients relating the edge variances to the observation-noise variance $\delta$, and $V$ is the neighborhood graph of image elements, having the form of a lattice.

[Figure: the lattice neighborhood graph, $t_1 = 1,\ldots,N_1$, $t_2 = 1,\ldots,N_2$.]
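To make the two densities concrete, here is a small sketch (ours, not the authors' code) that evaluates the negative logs of the data term $\Phi$ and the prior $\Psi$ on a short 1-D signal, dropping additive constants; $\delta$ and the per-edge factors $\lambda_t$ are supplied by hand.

```python
import math

# Toy sketch (ours, not from the slides): negative logs of the observation
# density Phi(Y | X, delta) and the prior Psi(X | Lambda, delta) on a 1-D
# signal, up to additive constants.
def data_term(y, x, delta):
    # -log Phi up to a constant: (1 / (2 delta)) * sum_t (y_t - x_t)^2
    return sum((yt - xt) ** 2 for yt, xt in zip(y, x)) / (2.0 * delta)

def prior_term(x, lam, delta):
    # -log Psi up to a constant: (1 / (2 delta)) * sum_edges (x' - x'')^2 / lambda
    # plus (1/2) * sum log lambda coming from the normalizer
    quad = sum((x[i + 1] - x[i]) ** 2 / lam[i] for i in range(len(x) - 1))
    return quad / (2.0 * delta) + 0.5 * sum(math.log(l) for l in lam)

y = [1.0, 1.2, 3.1, 3.0]          # observed values
x = [1.1, 1.1, 3.0, 3.0]          # candidate hidden values
lam = [1.0, 10.0, 1.0]            # a large factor on the jump edge relaxes smoothing there
print(data_term(y, x, 0.1), prior_term(x, lam, 0.1))
```

Note how the large factor on the middle edge makes the jump between the two plateaus cheap under the prior, which is exactly the nonstationarity the model is after.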

Gamma-distribution

The inverse factors $1/\lambda_t$ are gamma-distributed:
$$G(1/\lambda_t \mid \alpha, \vartheta) = \frac{\vartheta^{\alpha}}{\Gamma(\alpha)}\,(1/\lambda_t)^{\alpha-1}\exp\bigl(-\vartheta\,(1/\lambda_t)\bigr)$$
with mathematical expectations and variances:
$$E(1/\lambda_t) = \alpha/\vartheta, \qquad \mathrm{Var}(1/\lambda_t) = \alpha/\vartheta^2.$$

We come to the a priori distribution density:
$$G(\Lambda \mid \alpha, \vartheta) = \prod_{t \in T} G(1/\lambda_t \mid \alpha, \vartheta) \propto \exp\Bigl(-\vartheta\sum_{t\in T}\frac{1}{\lambda_t} - (\alpha - 1)\sum_{t \in T}\ln\lambda_t\Bigr),$$
or, in terms of the structural parameters $\mu$ and $\lambda$,
$$G(\Lambda \mid \mu, \lambda) \propto \exp\Bigl(-\frac{1}{2}\sum_{t\in T}\Bigl[\frac{\lambda}{\mu}\,\frac{1}{\lambda_t} + \frac{1}{\mu}\ln\lambda_t\Bigr]\Bigr),$$
where $\alpha = 1 + \dfrac{1}{2\mu}$, $\vartheta = \dfrac{\lambda}{2\mu}$.
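A quick Monte Carlo check of the stated moments, under our reading of the slides' reparameterization ($\alpha = 1 + 1/(2\mu)$, $\vartheta = \lambda/(2\mu)$); the parameter values, sample size, and seed are arbitrary illustrative choices.

```python
import random

# Hedged sketch: verify E(1/lambda_t) = alpha/theta and
# Var(1/lambda_t) = alpha/theta**2 by sampling, with alpha = 1 + 1/(2*mu)
# and theta = lam/(2*mu) (our reconstruction of the slides' parameterization).
mu, lam = 2.0, 1.0
alpha = 1.0 + 1.0 / (2.0 * mu)        # gamma shape
theta = lam / (2.0 * mu)              # gamma rate

random.seed(0)
n = 200_000
# random.gammavariate takes (shape, scale); the scale is 1/rate.
draws = [random.gammavariate(alpha, 1.0 / theta) for _ in range(n)]
mean = sum(draws) / n
var = sum((d - mean) ** 2 for d in draws) / n
print(mean, alpha / theta)            # sample mean vs. alpha/theta = 5.0
print(var, alpha / theta ** 2)        # sample variance vs. alpha/theta**2 = 20.0
```

With these values, small $\mu$ pins the inverse factors near $1/\lambda$ (stationary smoothing), while large $\mu$ lets them vary widely across the image.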

Normal gamma-distribution

So, we have completely defined the joint prior normal gamma-distribution of both hidden fields $X = (x_t,\ t \in T)$ and $\Lambda = (\lambda_t,\ t \in T)$:
$$H(X, \Lambda \mid \delta, \alpha, \vartheta) = \Psi(X \mid \Lambda, \delta)\, G(\Lambda \mid \alpha, \vartheta).$$
Coupled with the conditional density of the observable field, it makes the basis for Bayesian estimation of the field $X = (x_t,\ t \in T)$.

The joint a posteriori distribution of the hidden elements, namely, those of the field $X = (x_t,\ t \in T)$ and its instantaneous factors $\Lambda = (\lambda_t,\ t \in T)$:
$$P(X, \Lambda \mid Y, \delta, \mu, \lambda) = \frac{\Psi(X \mid \Lambda, \delta)\, G(\Lambda \mid \mu, \lambda)\, \Phi(Y \mid X, \delta)}{\displaystyle\int\!\!\!\int \Psi(X' \mid \Lambda', \delta)\, G(\Lambda' \mid \mu, \lambda)\, \Phi(Y \mid X', \delta)\, dX'\, d\Lambda'}.$$

The Bayesian estimate of the hidden random field

The Bayesian estimate of $(X, \Lambda)$ is the maximum point of the numerator of the a posteriori density:
$$(\hat{X}, \hat{\Lambda} \mid Y, \mu, \lambda) = \arg\min_{X,\Lambda} J(X, \Lambda \mid Y, \mu, \lambda),$$
$$J(X, \Lambda \mid Y, \mu, \lambda) = \frac{1}{\delta}\sum_{t \in T}(y_t - x_t)^2 + \sum_{(t',t'')\in V}\Bigl[\frac{(x_{t'} - x_{t''})^2/\delta + \lambda/\mu}{\lambda_{t''}} + \Bigl(1 + \frac{1}{\mu}\Bigr)\ln\lambda_{t''}\Bigr].$$

The conditionally optimal factors $\hat{\Lambda}(X, \mu, \lambda) = [\hat{\lambda}_{t''}(X, \mu, \lambda),\ (t',t'')\in V]$ are defined independently of each other. The zero conditions for the derivatives, excluding the trivial solutions, lead to the equalities
$$\frac{\partial J}{\partial \lambda_{t''}} = 0:\quad -\frac{(x_{t'} - x_{t''})^2/\delta + \lambda/\mu}{\lambda_{t''}^2} + \frac{1 + 1/\mu}{\lambda_{t''}} = 0,$$
$$\hat{\lambda}_{t''}(X, \mu, \lambda) = \frac{(x_{t'} - x_{t''})^2/\delta + \lambda/\mu}{1 + 1/\mu}.$$

“The saturation effect”

[Figure: the “saturation effect” of the unsmoothness penalty for sufficiently large values of the parameter µ and the fixed value λ = 1.]
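The saturation effect can be checked numerically: substituting the conditionally optimal factor back into the per-edge term of the objective leaves a penalty that grows only logarithmically in the edge difference. This sketch (ours; $\delta = 1$ for simplicity) prints the increments of that penalty, which shrink as the difference grows.

```python
import math

# Our sketch of the saturation effect: plug the conditionally optimal
# lambda back into the per-edge term of J (with delta = 1):
#   lambda_hat(d) = (d**2 + lam/mu) / (1 + 1/mu)
#   rho(d) = (d**2 + lam/mu) / lambda_hat(d) + (1 + 1/mu) * log(lambda_hat(d))
def edge_penalty(d, mu=100.0, lam=1.0):
    lam_hat = (d * d + lam / mu) / (1.0 + 1.0 / mu)
    return (d * d + lam / mu) / lam_hat + (1.0 + 1.0 / mu) * math.log(lam_hat)

# Increments rho(d+1) - rho(d) shrink as d grows: a large jump costs barely
# more than a moderate one, so genuine edges are not over-penalized.
increments = [edge_penalty(d + 1.0) - edge_penalty(d) for d in (1.0, 2.0, 4.0, 8.0)]
print(increments)
```

Since the first term is constant at the optimum, the whole penalty reduces to $(1 + 1/\mu)(1 + \ln\hat{\lambda}(d))$, which is the flattening curve the slide's plot shows.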

The Gauss-Seidel procedure

Minimization of the objective function gives a sequence of approximations $X^k = (x_t^k,\ t \in T)$ to the estimate of the mutually agreed fields, starting with the initial values $\Lambda^0 = (\lambda_t^0,\ t \in T)$. The new value of the hidden component is the solution of the conditional optimization problem
$$X^k = (x^k_t,\ t \in T) = \arg\min_{X} J(X, \Lambda^{k-1} \mid Y, \mu, \lambda) = \arg\min_{X}\Bigl[\sum_{t \in T}(y_t - x_t)^2 + \sum_{(t',t'')\in V}\frac{1}{\lambda^{k-1}_{t''}}(x_{t'} - x_{t''})^2\Bigr],$$
after which the factors are updated edge-wise:
$$\Lambda^k = \arg\min_{\Lambda} J(X^k, \Lambda \mid Y, \mu, \lambda):\quad \lambda^k_{t''} = \frac{(x^k_{t'} - x^k_{t''})^2/\delta + \lambda/\mu}{1 + 1/\mu}.$$
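A 1-D toy of this alternating scheme (our sketch, not the authors' code: the slides use a Kalman filter-interpolator over tree-shaped graphs, while here a plain tridiagonal Thomas solve stands in for the x-step on a chain). The values of `delta`, `mu`, `lam`, and the iteration count are illustrative choices.

```python
# 1-D sketch of the Gauss-Seidel alternation: with Lambda fixed, the x-step is
# a tridiagonal linear system; the lambda-step is the closed-form edge-wise
# update.  delta, mu, lam, iters are assumptions for this toy, not slide values.

def thomas(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal, d: rhs)."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def denoise_1d(y, delta=0.04, mu=100.0, lam=1.0, iters=10):
    n = len(y)
    lmb = [lam] * (n - 1)                     # initial factors Lambda^0
    x = list(y)
    for _ in range(iters):
        # x-step: argmin_x  sum (y_t - x_t)^2 + sum (x[i+1] - x[i])^2 / lmb[i]
        w = [1.0 / l for l in lmb]
        a = [0.0] + [-w[i - 1] for i in range(1, n)]
        c = [-w[i] for i in range(n - 1)] + [0.0]
        b = [1.0 + (w[i] if i < n - 1 else 0.0) + (w[i - 1] if i > 0 else 0.0)
             for i in range(n)]
        x = thomas(a, b, c, list(y))
        # lambda-step: closed-form conditional optimum for every edge
        lmb = [((x[i + 1] - x[i]) ** 2 / delta + lam / mu) / (1.0 + 1.0 / mu)
               for i in range(n - 1)]
    return x

y = [0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05]   # noisy step signal
x = denoise_1d(y)
print([round(v, 2) for v in x])   # two smoothed plateaus with the jump kept sharp
```

Within a few iterations the small-difference edges acquire small factors (strong smoothing) while the jump edge acquires a large factor and is left almost untouched, which is the edge-preserving behavior the model is designed for.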

Approximation of the trellis neighborhood graph by a sequence of trees

For finding the values of the hidden field at each vertical row of the picture, we use a separate pixel neighborhood tree, which is defined on the whole pixel set and has the same horizontal branches as the others. The resulting image processing procedure is aimed at finding optimal values only for the hidden variables at the stem nodes of each tree. The simplicity of such a succession of elementary trees allows for a significant simplification of the optimization procedure.

Comparison of Poissonian and Gaussian noise removal algorithms

Poissonian noise (Peppers, 256x256), output PSNR (dB):

  Peak intensity     120     60     30
  Input PSNR       23.92  20.92  17.91
  Haar-Fisz        29.27  27.30  24.54
  Platelet         29.07  27.44  25.73
  PURE-LET         30.28  28.51  26.72
  Our algorithm    29.53  28.65  25.77

Gaussian noise (Einstein, 512x512), output PSNR (dB):

  Peak intensity     120     60     30
  Input PSNR        8.34  14.35  19.23
  FBF              24.86  26.21  29.63
  SURE-LET         24.56  26.30  29.43
  BM3D             25.03  26.59  29.99
  Our algorithm    24.76  26.44  29.88

Relative computation time of various denoising techniques (sec)

  Method          256x256   512x512
  Platelet          856      1112
  PURE-LET          4.6       8.2
  BM3D              4.1       7.2
  Haar-Fisz         1.3       3.2
  FBF               0.5       1.8
  SURE-LET          0.4       1.6
  Our algorithm     0.16      0.53

[Figure: run time of the algorithm (s) versus image size.]

Experimental results

[Figure: a) The original Einstein image. b) Noisy version with simulated Gaussian noise. c) Denoised with our algorithm. d) Denoised with BM3D.]

Experimental results

[Figure: a) Part of the original MRI slice from the IGBMC Imaging Center data sets. b) Denoised with our algorithm. c) Denoised with PURE-LET.]

THANKS FOR YOUR ATTENTION!

Acknowledgements
This research is funded by RFBR grant 13-07-00529.
