Theory and Applications of GF(2^p) Cellular Automata
P. Pal Chaudhuri, Department of CST, Bengal Engineering College (DU), Shibpur, Howrah, India
(LOGIC ON MEMORY)

Jan 04, 2016

Transcript
Page 1: Theory and Applications of GF(2^p) Cellular Automata

Theory and Applications of GF(2^p) Cellular Automata

P. Pal Chaudhuri

Department of CST

Bengal Engineering College (DU)

Shibpur, Howrah India

(LOGIC ON MEMORY)

Page 2

An Application Of

LOGIC ON MEMORY

Page 3

Logic on Memory

• Basic Concept

• Classical Example: Content Addressable Memory
  – Content Addressable Processor

[Figure: CAM cell with comparator, bit line, and word line]

Page 4

Logic on Memory

• Sub-micron era
• Search
• Storage of large tables and efficient search
• Memory + CA
• Efficient storage and search of data with a CA-based classifier

Page 5

Logic-on-memory

• Problem Definition

• CA Based Solution

[Figure: memory elements combined with XOR/XNOR logic, i.e. logic on memory implementing a specific function]

Page 6

GF(2^p) CA as a Classifier

Input       Output / Attribute
I1 (C11)    A1
I2 (C21)    A2
I3 (C31)    A1
...         ...
Ik (C22)    A2

(I1, I3); (C11, C31) -> A1
(I2, Ik); (C21, C22) -> A2

• Classification: a universal problem
• Given the input, fast search for the attribute of an input element
• Uses a special class of CA
• Non-group Multiple Attractor CA (MACA)

Page 7

Classifier

• Design of a CA Based Classifier

• Input is an element Cij; the classifier outputs Ai, that is, Cij belongs to class Ai

• Implicit memory
• Fast search

Input       Output / Attribute
I1 (C11)    A1
I2 (C21)    A2
I3 (C31)    A1
...         ...
Ik (C22)    A2

(I1, I3); (C11, C31) -> A1
(I2, Ik); (C21, C22) -> A2

LOGIC ON MEMORY: Memory (conventional & CA) + XOR logic (CA)

Page 8

Special Class Of CA Non Group Multiple Attractor CA (MACA)

[Figure: state-transition diagrams of a MACA and a depth-1 MACA (D1 MACA), showing attractors and their inverted-tree basins]

Page 9

Problem Definition

• Given sets {P1}, {P2}, ..., {Pn}, where each set {Pi} = {Xi1, Xi2, Xi3, ..., Xim}
• Given a randomly selected value Xkj
• To answer the question: which class does Xkj belong to?

[Figure: state-transition diagram of a 4-bit MACA with four attractor basins]

Page 10

Classifier

• An n-bit CA with multiple attractors is a natural classifier
• {0, 3, 5, 6} are the attractors
• The inverted trees are the attractor basins

[Figure: state-transition diagram of the 4-bit MACA with attractors 0, 3, 5, 6 and their basins]

Page 11

Classifier

• Suppose we want to identify which class X = 7 lies in
• The CA is loaded with X
• The CA is run in autonomous mode for k (= 2) cycles, where k is the depth of the CA
• The pseudo-exhaustive (PE) bits (10) of the attractor give the class of the pattern
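The "load, run autonomously, read the attractor" procedure can be sketched over GF(2) with the state as a bit vector and T as a list of row bitmasks. The matrix T below is a hypothetical depth-1 example for illustration, not the CA from these slides:

```python
# Run a MACA autonomously until the state reaches its attractor (fixed point).
# T is a hypothetical example matrix, one row bitmask per output bit.

def matvec(rows, x):
    """Multiply T (list of row bitmasks) by bit-vector x over GF(2)."""
    return sum((bin(r & x).count("1") & 1) << i for i, r in enumerate(rows))

def run_to_attractor(T, x, max_steps=16):
    """Apply T until the state stops changing, i.e. the attractor is reached."""
    for _ in range(max_steps):
        nxt = matvec(T, x)
        if nxt == x:
            break
        x = nxt
    return x

T = [0b0001, 0b0001, 0b0000, 0b0000]   # hypothetical 4-bit, depth-1 T
print(run_to_attractor(T, 7))          # odd states fall into attractor 3
print(run_to_attractor(T, 2))          # even states fall into attractor 0
```

The attractor reached (or its PE bits) then indexes the class of the loaded pattern.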

[Figure: the four attractor basins; X = 7 converges to the attractor whose PE bits are 10]

Page 12

Two-Class D1 Classifier

• We use a depth-1 CA (D1 CA)
• We construct a CA satisfying the following:
  1. R1: for every x ∈ P1 and y ∈ P2, T·(x ⊕ y) ≠ 0
  2. R2: T² = T, i.e. T·(T ⊕ I) = 0
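Both rules can be checked mechanically over GF(2). The T below is a hypothetical candidate chosen for the P1/P2 example of the next slide; it is not claimed to be the slides' own matrix:

```python
# Verify R1 (cross-class pairs map to nonzero) and R2 (T is idempotent)
# for a hypothetical candidate T over GF(2).

def matvec(rows, x):
    """Multiply T (list of row bitmasks) by bit-vector x over GF(2)."""
    return sum((bin(r & x).count("1") & 1) << i for i, r in enumerate(rows))

T = [0b0001, 0b0001, 0b0000, 0b0000]   # hypothetical 4-bit T
P1, P2 = {0, 2, 12, 14}, {1, 3, 13, 15}

# R1: T(x XOR y) != 0 for every x in P1, y in P2, so the two classes
# land in different attractor basins.
r1 = all(matvec(T, x ^ y) != 0 for x in P1 for y in P2)

# R2: T^2 = T, so every state reaches its attractor in a single step.
r2 = all(matvec(T, matvec(T, x)) == matvec(T, x) for x in range(16))

print(r1, r2)   # both hold for this T
```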

[Figure: depth-1 CA (D1 MACA) with two attractor basins, attractors 0000 and 0011, PE bits 00 and 01]

Page 13

Algorithm

• Any CA satisfying R1 and R2 is a classifier for P = {{P1}, {P2}}
• P1 = {0, 2, 12, 14} and P2 = {1, 3, 13, 15}
• Each basin of the CA contains patterns from only one of P1 and P2
• 2 attractors

[Figure: D1 MACA with two attractor basins separating P1 and P2]

Page 14

Algorithm

• In general, there will be 2^(n-r) attractors (n = size of the CA, r = rank of the T matrix)
• The attractors are distinguished by PE bits at certain (n-r) positions, giving 2^(n-r) PE values
• The two classes can be identified by a single bit per attractor, stored in a 2^(n-r) x 1 bit memory or generated by a simple logic circuit
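The rank in the formula above is a GF(2) rank, which can be computed by Gaussian elimination on row bitmasks. The matrices shown are arbitrary illustrations, not the slides' CA:

```python
# Rank of a binary matrix over GF(2), rows stored as integer bitmasks.

def gf2_rank(rows):
    rank = 0
    rows = list(rows)
    while rows:
        pivot = rows.pop()
        if pivot == 0:
            continue                       # linearly dependent row
        rank += 1
        low = pivot & -pivot               # lowest set bit of the pivot row
        rows = [r ^ pivot if r & low else r for r in rows]
    return rank

print(gf2_rank([0b0001, 0b0001, 0b0000, 0b0000]))  # two equal rows -> rank 1
print(gf2_rank([0b1000, 0b0100, 0b0010, 0b0001]))  # identity -> rank 4
```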

[Figure: D1 MACA with two attractor basins]

Page 15

Multiclass Classifier

• But what about a multi-class classifier?
• A general CA-based solution does not exist
• However, we can use hierarchical two-class classifiers to build a solution

[Figure: two depth-1 MACAs, each splitting the state space into two attractor basins]

Page 16

Multiclass Classifier

• Hierarchical two-class classifier
• Built by partitioning the pattern set P = {P1, P2, P3, ..., Pn} as {{P1, P2, P3, ..., Pk}, {Pk+1, ..., Pn}} and finding a two-class classifier for this partition
• This is repeated for each subset
• The number of CAs required is log2(n), where n is the number of classes
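The hierarchy amounts to a binary search over the class list, consulting one two-class classifier per level. Everything below is an illustrative sketch; `two_class` is a stand-in for a CA-based classifier and is assumed to return 0 when the pattern belongs to the left half:

```python
# Hierarchical multiclass classification via repeated two-class splits.
# two_class(pattern, left, right) stands in for one CA-based classifier.

def classify(pattern, classes, two_class):
    steps = 0
    while len(classes) > 1:
        half = len(classes) // 2
        left, right = classes[:half], classes[half:]
        classes = left if two_class(pattern, left, right) == 0 else right
        steps += 1                    # one classifier evaluation per level
    return classes[0], steps

# Toy two-class classifier based on set membership.
toy = lambda p, left, right: 0 if any(p in s for s in left) else 1

P1, P2 = {0, 2, 12, 14}, {1, 3, 13, 15}
P3, P4 = {5, 7, 9, 11}, {4, 6, 8, 10}

cls, steps = classify(7, [P1, P2, P3, P4], toy)
print(sorted(cls), steps)   # 7 lies in P3; log2(4) = 2 evaluations
```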

Page 17

Multiclass Classifier

Classes:
• P1 = {0, 2, 12, 14}
• P2 = {1, 3, 13, 15}
• P3 = {5, 7, 9, 11}
• P4 = {4, 6, 8, 10}

[Figure: attractor basins of the two depth-1 MACAs covering classes P1 through P4]

Page 18

Multiclass Classifier

• Initially we build a two-class classifier to identify these two classes:
• Temp0 = {P1, P2}
• Temp1 = {P3, P4}
• Then two more classifiers identify P1 vs. P2 and P3 vs. P4

[Figure: the first-level split of the basins into Temp0 and Temp1]

Page 19

General Multiclass Classifier

[Figure: hierarchical classifier tree, with Temp0 and Temp1 at the first level, Temp00, Temp01, ... below, down to classes P1 ... Pn; log2(n) CAs]

Page 20

Multiclass Classifier in GF(2^p)

• Handles class elements that are symbol strings rather than bit strings
• A T matrix satisfying R1 and R2 is obtained efficiently using BDDs in GF(2)
• In GF(2^p) we have introduced certain heuristics to obtain a solution T matrix reasonably fast

Page 21

Application Areas

• Fast encoding in vector quantization of images

• Fault diagnosis

Page 22

Image Compression

• Target pictures: portraits and similar images
• Image size: 352 x 240 (CCIR size)
• Target compression ratio: 97.5%-99%
• Target PSNR: 25-30 dB
• Target application: low-bit-rate coding for video telephony

Page 23

Algorithm

• Used a training set of 12 pictures of a similar nature

• The images were partitioned into 8 x 8 blocks
• These 8 x 8 blocks are clustered around 8192 pivot points using the standard LBG algorithm

[Figure: training images partitioned into blocks B1 ... Bn]

Page 24

Algorithm

• Elements are length-64 GF(2^p) symbol strings, i.e. 8 x 8 pixel blocks
• Therefore we have 8192 clusters
• These can be addressed using 13 bits (8192 = 2^13)
• A multiclass classifier is designed for these 8192 classes
• The depth of this classifier is 13
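With these numbers (352 x 240 image, 8 x 8 blocks, 13-bit class ids), the quoted ~97.5% compression ratio can be sanity-checked, assuming 8-bit grayscale pixels:

```python
# Back-of-the-envelope check of the compression ratio
# (assumes 8-bit grayscale pixels; one 13-bit index per 8x8 block).
width, height, bpp = 352, 240, 8
raw_bits = width * height * bpp                # 675840
n_blocks = (width // 8) * (height // 8)        # 44 * 30 = 1320
coded_bits = n_blocks * 13                     # 17160
ratio = 1 - coded_bits / raw_bits
print(f"{ratio:.1%}")                          # ~97.5%
```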

[Figure: clusters with pivot points C1 ... C8192 forming the codebook]

Page 25

Algorithm

• The target image to be coded is divided into 8 x 8 blocks
• Each of these blocks is input to the multiclass classifier
• The multiclass classifier outputs the class id of the block
• This is done in effectively 13 clock cycles plus some memory access time
• Encoding time is thus drastically reduced
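The encoding step amounts to a blockwise lookup. A minimal sketch with a stand-in `classify_block` (the real one would run the GF(2^p) MACA classifier for 13 cycles):

```python
# Encode an image as a list of class ids, one per 8x8 block.
# classify_block is a stand-in for the multiclass MACA classifier.

def encode(image, classify_block, block=8):
    h, w = len(image), len(image[0])
    ids = []
    for r in range(0, h - block + 1, block):
        for c in range(0, w - block + 1, block):
            blk = [row[c:c + block] for row in image[r:r + block]]
            ids.append(classify_block(blk))
    return ids

# Toy 16x16 image and a toy classifier returning the block's top-left pixel.
image = [[r] * 16 for r in range(16)]
print(encode(image, lambda b: b[0][0]))   # four blocks -> [0, 0, 8, 8]
```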

[Figure: Image -> Block -> Classifier -> Class id]

Page 26

Algorithm

[Figure: overall pipeline, training images split into blocks B1 ... Bn, clustered into C1 ... C8192 with pivot points to form the codebook; a target image block is fed to the classifier]

Page 27

Sample Results

Image      PSNR (dB)
Julie      33.53
Girl1256   34.58
Michelle   32.69
Girl256    27.84
Claire256  29.91
Ash        27.84
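The PSNR figures above follow the usual definition for 8-bit images, PSNR = 10 * log10(255^2 / MSE); a minimal sketch:

```python
import math

def psnr(mse, peak=255):
    """Peak signal-to-noise ratio in dB against an 8-bit peak value."""
    return 10 * math.log10(peak * peak / mse)

# Example: an MSE of 65.025 gives 255^2 / 65.025 = 1000, i.e. 30 dB.
print(round(psnr(65.025), 6))   # 30.0
```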

Page 28

Sample Images

• PSNR: 27.8 dB
• Compression ratio: 97.5%

Page 29

Sample Images

• PSNR: 25.1 dB, compression ratio: 97.5%
• PSNR: 28.5 dB, compression ratio: 97.5%

Page 30

Schematic of a CA Based Vector Quantizer

[Figure: schematic with CA memory, CA configuration, controller, PE bits, shift register, and output]

Page 31

Hardware Design for

CA Based Vector Quantizer

Page 32

Improvements Over the Basic scheme

• A hierarchical encoder has been implemented

• The image is first encoded using 16 x 16 blocks
• If a match cannot be obtained with any of the classes in the training set, then a match with 8 x 8 blocks is tried
• This pushes the compression ratio up to 99%
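The jump toward 99% is consistent with the block-size change: a 16 x 16 block covers four times as many pixels as an 8 x 8 block for the same 13-bit index. A rough check, assuming 8-bit pixels and that every block matches at 16 x 16:

```python
# Effect of 16x16 blocks on the compression ratio
# (assumes 8-bit pixels and one 13-bit index per block).
width, height = 352, 240
raw_bits = width * height * 8
bits_16 = (width // 16) * (height // 16) * 13   # 22 * 15 * 13 = 4290
ratio = 1 - bits_16 / raw_bits
print(f"{ratio:.1%}")   # just over 99%
```

In practice some blocks fall back to 8 x 8 coding, so the achieved ratio sits between this figure and the ~97.5% of pure 8 x 8 coding.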

Page 33

Dynamic Classification

• Static database
• The solution assumes the target pattern is present in the cluster set
• If a new pattern outside this range is input, the classifier indicates no entry in the database
• So a linked queue of these new blocks is maintained
• At periodic intervals, a new multiclass classifier is obtained using these updated data members after incorporating them into the appropriate classes

Page 34

Thank You