
MVA '90 IAPR Workshop on Machine Vision Applications, Nov. 28-30, 1990, Tokyo

A Pattern Classifier Integrating Multilayer Perceptron and Error-Correcting Code

Haibo Li, Torbjorn Kronander, and Ingemar Ingemarsson
Department of Electrical Engineering,
Linköping University, S-58183 Linköping, Sweden

e-mail: Haibo@isy.liu.se, tobbe@isy.liu.se

Abstract

In this paper we present a novel classifier which integrates a multilayer perceptron and an error-correcting decoder. The classifier has two stages: in the first stage, a multilayer perceptron maps feature vectors from feature space to code space; in the second stage, error-correcting decoding is performed in code space, by which the index of the noisy codeword is obtained. Hence we obtain classifications of the original feature vectors. The classifier has better classification performance than conventional multilayer perceptrons.

1 Introduction

Multilayer perceptrons trained with backpropagation [4] have been successfully applied in many areas [4][5][6][7], especially in pattern recognition [5]. A number of theoretical analyses have shown that any continuous nonlinear mapping can be closely approximated using sigmoid nonlinearities and a multilayer perceptron, which implies that arbitrary decision regions can be formed by multilayer perceptrons. For some real-world problems, however, it is very difficult to train a multilayer perceptron to form the required mapping within a given accuracy, especially when the required decision regions are complex and irregular, even if the neural network has more hidden layers and enough time is provided to train it.

Meanwhile, error-correcting code theory has long been well established. Error-correcting decoding can be viewed as a kind of classification problem in which noisy codewords are decoded into the correct codewords based on minimum Hamming distance in code space. With the minimum Hamming distance criterion, the decision regions in code space are very regular; for example, they are circles. The mapping from feature space to code space is easier to achieve because such a mapping is from region to region instead of from region to point. The resulting classifier has better classification performance than conventional multilayer perceptrons.

In this paper, we focus on how to form the mapping from feature space to code space with a multilayer perceptron, and how to seek good error-correcting codes whose codeword distributions are uniform and whose codeword distances are large.

Below we first describe the pattern classifier architecture. Then we discuss how to train the multilayer perceptron and outline the error-correcting code. We conclude with remarks on our pattern classifier and future work.

2 Classifier Architecture

Before we describe our new classifier, let's first examine the pattern classifier problem.


For an N-dimensional feature space B^N, assume there are M categories of patterns in this space; that is, the feature space can be partitioned into M cells C_i, i = 1, 2, ..., M, and each cell C_i is associated with an exemplar vector d_i. For a given feature vector f ∈ B^N, we say it belongs to the i-th category C_i if it meets the following condition:

| f − d_i | ≤ | f − d_j |,  for all j ≠ i,   (1)

where | · | denotes some kind of distance measure in B^N; Hamming distance is used here.
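As a concrete illustration, the minimum-distance rule (1) can be sketched as follows; the exemplars and the test vector are made-up binary examples, not data from the paper:

```python
import numpy as np

def classify(f, exemplars):
    """Assign feature vector f to the category whose exemplar is nearest
    under Hamming distance (for binary vectors, the count of differing
    components), as in condition (1)."""
    distances = [np.sum(f != d) for d in exemplars]
    return int(np.argmin(distances))

# Hypothetical example: two binary exemplars in B^4.
exemplars = [np.array([0, 0, 0, 0]), np.array([1, 1, 1, 1])]
print(classify(np.array([0, 1, 0, 0]), exemplars))  # prints 0
```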

This pattern classification problem can be solved by a multilayer perceptron with the backpropagation algorithm as its learning algorithm [5]. The input of the multilayer perceptron is x ∈ B^N; the output is y. Supposing the desired exemplar is d, the error function can be defined as follows:

E = | d − y |   (2)

Then the weights of the multilayer perceptron are changed in sequence by an amount proportional to the partial derivatives of E with respect to the weights, until an error criterion is met. The general error criterion is

E < ε   (3)

where ε is a fault-tolerance threshold.
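The training loop described above can be sketched in a few lines. This is a minimal sketch with a single sigmoid layer on made-up data; the architecture, learning rate, and data are assumptions for illustration, not the paper's setup. Training stops when the error falls below the threshold ε of (3):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem: two feature vectors (a constant 1 is appended
# as a bias input) and their desired binary outputs.
X = np.array([[0.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
D = np.array([[0.0],
              [1.0]])
W = rng.normal(scale=0.1, size=(3, 1))

eps = 0.1                                 # fault-tolerance threshold of (3)
for _ in range(20000):
    Y = 1.0 / (1.0 + np.exp(-X @ W))      # sigmoid outputs of the network
    E = np.abs(D - Y).max()               # error, in the spirit of (2)
    if E < eps:                           # stopping criterion (3)
        break
    # gradient step of backpropagation for this single sigmoid layer
    W -= X.T @ ((Y - D) * Y * (1.0 - Y))
```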

For our pattern classification problem, d and y are binary vectors, so the ε used here is less than the minimum Hamming distance [1][2]. The desired exemplar vectors d are generally given with minimum Hamming distance 1. If the minimum Hamming distance were greater than 1, the classification results would be ambiguous unless some postprocessing were applied to them. A minimum Hamming distance of 1 means ε < 1, so pattern classification with this method can be viewed as a kind of mapping from a region to a point. This may be the reason why multilayer perceptrons are especially difficult to train for some real-world problems.

Based on the above analysis, we present a novel pattern classifier which integrates a multilayer perceptron and an error-correcting decoder. The idea behind the classifier is to achieve classification through, first, region-to-region mapping, that is, mapping from a region of feature space to a region of code space by a multilayer perceptron; then error-correcting decoding is performed in the code space, from which we obtain the index of the classification. The main difference between the new classifier and a multilayer perceptron is that our new classifier maps from region to region instead of from region to point.

For a pattern classification problem with M categories, we can find a set of error-correcting codewords D_i, i = 1, 2, ..., M, whose word length is L, once the minimum Hamming distance H_min is predetermined. The word length L is determined by M and H_min.
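For instance, a simple (and by no means optimal) way to obtain M codewords of length L with a guaranteed minimum distance is a greedy search over all binary words; this sketch is illustrative only and makes no claim about the codes the paper actually used:

```python
import itertools
import numpy as np

def find_codewords(M, L, h_min):
    """Greedy search for M binary codewords of length L whose pairwise
    Hamming distance is at least h_min. Returns None if the greedy pass
    fails; a sketch, not an optimal code construction."""
    chosen = []
    for bits in itertools.product([0, 1], repeat=L):
        w = np.array(bits)
        if all(np.sum(w != c) >= h_min for c in chosen):
            chosen.append(w)
            if len(chosen) == M:
                return chosen
    return None

codes = find_codewords(4, 7, 3)  # 4 codewords, length 7, distance >= 3
```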

The desired signals used for training the classifier are the codewords D_i, i = 1, 2, ..., M. The error function is defined as follows:

E = | D_i − Y |   (4)

where Y is the output of the multilayer perceptron.

The error criterion used here is also described by (3), but now the minimum Hamming distance is free to choose; it can be greater than 1. Clearly, this multilayer perceptron is easier to train than one without the mapping into code space. Because it is easier to meet the error criterion, this not only improves the accuracy but also speeds up the training.

After the patterns are mapped onto code space, we can perform error-correcting decoding in the code space and obtain the index of the patterns. The process of a pattern classification by the classifier is shown in Fig. 1.

The overall classifier structure consists of two levels: the first level maps pattern vectors from feature space onto code space by a multilayer perceptron; the second level is an error-correcting decoder which indicates the index of the input pattern. Such a system is shown in Fig. 2.

Figure 1: The process of a pattern classification by the new classifier

Figure 2: The structure of the new classifier
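The second stage of this system, minimum-Hamming-distance decoding of the (possibly noisy) network output, can be sketched as follows, with made-up codewords and output values:

```python
import numpy as np

def decode(y, codewords):
    """Second stage: minimum-Hamming-distance decoding. The real-valued
    network output y is hard-thresholded to bits, then matched to the
    nearest codeword; the codeword index is the class label."""
    bits = (y >= 0.5).astype(int)
    return int(np.argmin([np.sum(bits != c) for c in codewords]))

# Hypothetical codewords (L = 4, two classes) and a noisy MLP output.
codewords = [np.array([0, 0, 0, 0]), np.array([1, 1, 1, 1])]
print(decode(np.array([0.9, 0.8, 0.3, 0.7]), codewords))  # prints 1
```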

3 Mapping from feature space to code space

This section will mainly cover how to map a feature vector from feature space to code space by a multilayer perceptron.

Tzi-Dar Chiueh and R. Goodman [3] gave two schemes, the outer product method and the pseudo-inverse method, for mapping from feature space to code space. These methods are simple to implement, but strictly speaking they cannot achieve a strict mapping in theory. For a given feature vector X ∈ C_i, that is,

| X − x_i | ≤ | X − x_j |,  for all j ≠ i,   (5)

X is mapped onto code space by

Y = W X   (6)

where W is a transformation matrix satisfying

W x_i = D_i,  i = 1, 2, ..., M,   (7)

but it is difficult to ensure that the following equation is valid for any X ∈ C_i:

W X = D_i.   (8)
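The pseudo-inverse construction can be sketched as below; the exemplar and codeword values are hypothetical. W reproduces the codewords exactly on the exemplars, which illustrates the point-to-point nature of the linear mapping:

```python
import numpy as np

# Hypothetical exemplars: columns of X are feature vectors (N = 3, M = 2),
# columns of D are the corresponding codewords (L = 4).
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
D = np.array([[0.0, 1.0],
              [0.0, 1.0],
              [0.0, 1.0],
              [0.0, 1.0]])

# Pseudo-inverse method: W maps each exemplar exactly onto its codeword,
# but nearby vectors in the same cell are generally not mapped onto it.
W = D @ np.linalg.pinv(X)
```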

We achieve such a mapping from feature space to code space by means of a multilayer perceptron. The inputs of the multilayer perceptron are the x_i; the desired signals are the D_i. Then the multilayer perceptron is trained by the backpropagation algorithm. When the maximum training error is less than the minimum Hamming distance, the training of the multilayer perceptron is finished.

If the exemplars x_i of the pattern vectors are known, a better training procedure can be designed as follows:

1) the multilayer perceptron is first trained with the exemplars x_i as input;

2) after the network is stable, the inputs x are chosen in order according to the shortest distance to their corresponding exemplars, step by step, for training the multilayer perceptron.

This kind of training procedure, similar to the cooling procedure used in simulated annealing, can avoid sinking into the pitfall of local minima. Hence a better mapping performance can be obtained.
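Step 2) amounts to ordering the training set by distance to the class exemplars; a small sketch, assuming binary vectors and a single hypothetical class:

```python
import numpy as np

def curriculum_order(samples, exemplars, labels):
    """Order training samples by Hamming distance to their class exemplar,
    so the network sees easy (near-exemplar) vectors first; a sketch of
    the annealing-like schedule, assuming binary feature vectors."""
    d = [np.sum(x != exemplars[c]) for x, c in zip(samples, labels)]
    return [samples[i] for i in np.argsort(d)]

# Hypothetical single-class example.
exemplars = [np.array([0, 0, 0])]
samples = [np.array([1, 1, 0]), np.array([0, 0, 0]), np.array([1, 0, 0])]
ordered = curriculum_order(samples, exemplars, [0, 0, 0])
```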

4 Error-correcting code

Error-correcting codes have found a wide variety of applications, including data transmission over communication channels, codes for high-speed and mass memories of computers, etc.

The error-correcting code we need should meet the following properties: 1) the distance between every two codewords must be the same; 2) the distance between every two codewords must be as large as possible.

Such codes, e.g., Hadamard matrix codes, can be found in [2]; due to the limitations of space, we do not discuss them here.
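For reference, the Sylvester construction of Hadamard matrix codes gives exactly the equidistant property required above; a brief sketch:

```python
import numpy as np

def hadamard_code(k):
    """Sylvester construction: the rows of the 2^k x 2^k Hadamard matrix,
    mapped {+1, -1} -> {0, 1}, give 2^k equidistant binary codewords with
    pairwise Hamming distance 2^(k-1)."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])   # double the matrix each step
    return (H < 0).astype(int)

C = hadamard_code(3)  # 8 codewords of length 8, pairwise distance 4
```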

5 Concluding remarks

Integrating error-correcting codes, the multilayer perceptron can give better performance than without such codes. Of course, the improvement in performance comes at the cost of a more complex system structure. The more errors we want to tolerate, the longer the codewords are, and the more complex the system is. How to choose the trade-off between complexity and fault tolerance differs from case to case. But we can say that error-correcting codes have wide application prospects in neural networks. We are at the beginning of applying error-correcting codes to neural networks and will work more on this subject, since integrating neural networks and error-correcting codes seems to be a promising path.

References

[1] R. Blahut, Theory and Practice of Error Control Codes, 1983.

[2] F. MacWilliams and N. Sloane, The Theory of Error-Correcting Codes, 1988.

[3] T. Chiueh and R. Goodman, A Neural Network Classifier Based on Coding Theory, AIP, 1988.

[4] D. E. Rumelhart et al., Learning Internal Representations by Error Propagation, in Parallel Distributed Processing, volume 1, MIT Press, 1986.

[5] R. P. Lippmann, Pattern Classification Using Neural Networks, IEEE Comm. Magazine, November 1989.

[6] Haibo Li, Nonlinear Predictive Image Coding Using Neural Networks, ICASSP 1990.

[7] E. Baum, On the Capabilities of Multilayer Perceptrons, J. Complexity, 1988.
