
B219 Intelligent Systems Semester 1, 2003

Week 3 Lecture Notes

Artificial Neural Networks (Ref: Negnevitsky, M., “Artificial Intelligence”, Chapter 6)

BPNN in Practice


The Hopfield Network

§ This network was designed on an analogy with the brain’s memory, which works by association.

§ For example, we can recognise a familiar face even in an unfamiliar environment within 100-200 ms.

§ We can also recall a complete sensory experience, including sounds and scenes, when we hear only a few bars of music.

§ The brain routinely associates one thing with another.

§ Multilayer neural networks trained with the backpropagation algorithm are used for pattern recognition problems.

§ To emulate the human memory’s associative characteristics, we use a recurrent neural network.


§ A recurrent neural network has feedback loops from its outputs to its inputs. The presence of such loops has a profound impact on the learning capability of the network.

§ Single-layer n-neuron Hopfield network

§ The Hopfield network uses McCulloch and Pitts neurons with the sign activation function as its computing element:
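The sign activation is the usual hard limiter (with the common convention that a net input of exactly zero maps to +1):

$$Y = \operatorname{sign}(X) = \begin{cases} +1, & X \ge 0 \\ -1, & X < 0 \end{cases}$$

where $X$ is the neuron’s net weighted input.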

§ The current state of the Hopfield network is determined by the current outputs of all neurons, $y_1, y_2, \ldots, y_n$.


Hopfield Learning Algorithm:

§ Step 1: Assign weights

o Assign random connection weights with values $w_{ij} = +1$ or $w_{ij} = -1$ for all $i \neq j$, and $w_{ij} = 0$ for $i = j$.

§ Step 2: Initialisation

o Initialise the network with an unknown pattern:

$$O_i(0) = x_i, \quad 0 \le i \le N-1$$

where $O_i(k)$ is the output of node $i$ at time $k$, and $x_i$ is an element of the input pattern, either $+1$ or $-1$.

§ Step 3: Convergence

o Iterate until convergence is reached, using the relation:

$$O_i(k+1) = f\left( \sum_{j=0}^{N-1} w_{ij}\, O_j(k) \right), \quad 0 \le i \le N-1$$

where the function $f(\cdot)$ is a hard-limiting nonlinearity.

o Repeat the process until the node outputs remain unchanged.


o The node outputs then represent the exemplar pattern that best matches the unknown input.

§ Step 4: Repeat for Next Pattern

o Go back to Step 2 and repeat for the next input pattern, and so on.

§ The Hopfield network can act as an error-correction network.
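A minimal sketch of the recall loop in Steps 2 and 3, assuming NumPy, a precomputed N x N weight matrix W with zero diagonal from Step 1, and a bipolar (+1/-1) input vector x; the function names are illustrative:

```python
import numpy as np

def sign(x):
    # Hard-limiting nonlinearity f(.) from Step 3: +1 for inputs >= 0, else -1.
    return np.where(x >= 0, 1, -1)

def hopfield_recall(W, x, max_iters=100):
    """Steps 2-3: initialise the node outputs with the unknown bipolar
    pattern x, then iterate O(k+1) = f(W O(k)) until the outputs stop
    changing (convergence) or max_iters is reached."""
    O = np.asarray(x, dtype=int)          # Step 2: O_i(0) = x_i
    for _ in range(max_iters):
        O_next = sign(W @ O)              # Step 3: synchronous update of all nodes
        if np.array_equal(O_next, O):     # node outputs remain unchanged
            return O_next
        O = O_next
    return O
```

With a fully synchronous update the state can in principle oscillate between two patterns; updating one neuron at a time (asynchronous updating), as in Hopfield’s original formulation, guarantees convergence to a stable state.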

Types of Learning

§ Supervised Learning

o the input vectors and the corresponding output vectors are given

o the ANN learns to approximate the function from the inputs to the outputs


§ Reinforcement Learning

o the input vectors and a reinforcement signal are given

o the reinforcement signal tells how good the produced output was

§ Unsupervised Learning

o only the inputs are given

o the ANN learns to form internal representations or codes for the input data, which can then be used e.g. for clustering

§ From now on we will look at unsupervised learning neural networks.


Hebbian Learning

§ In 1949, Donald Hebb proposed one of the key ideas in biological learning, commonly known as Hebb’s Law.

§ Hebb’s Law states that if neuron i is near enough to excite neuron j and repeatedly participates in its activation, the synaptic connection between these two neurons is strengthened and neuron j becomes more sensitive to stimuli from neuron i.

§ Hebb’s Law can be represented in the form of two rules:

• If two neurons on either side of a connection are activated synchronously, then the weight of that connection is increased.

• If two neurons on either side of a connection are activated asynchronously, then the weight of that connection is decreased.
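In the weight-update notation used below, a standard statement of the basic (unbounded) form of Hebb’s Law for the connection from input $i$ to neuron $j$ at iteration $p$ is

$$\Delta w_{ij}(p) = \alpha\, y_j(p)\, x_i(p)$$

where $\alpha$ is a learning rate, $x_i(p)$ is the input and $y_j(p)$ is the output of neuron $j$.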


§ Hebbian learning implies that weights can only increase. To resolve this problem, we might impose a limit on the growth of synaptic weights.

§ This can be done by introducing a non-linear forgetting factor $\varphi$ into Hebb’s Law:
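A common way of writing Hebb’s Law with forgetting (a sketch of the intended rule rather than a verbatim reproduction of the original slide) is

$$\Delta w_{ij}(p) = \alpha\, y_j(p)\, x_i(p) \;-\; \varphi\, y_j(p)\, w_{ij}(p)$$

so that whenever neuron $j$ is active, each weight also decays in proportion to its own size, which keeps the weights bounded.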

§ The forgetting factor $\varphi$ usually falls in the interval between 0 and 1, typically between 0.01 and 0.1, to allow only a little “forgetting” while limiting the weight growth.


Hebbian Learning Algorithm:

§ Step 1: Initialisation

o Set initial synaptic weights and thresholds to small random values, say in an interval [0, 1].

§ Step 2: Activation

o Compute the neuron output at iteration p, where n is the number of neuron inputs and $\theta_j$ is the threshold value of neuron j:
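One typical choice for this activation, consistent with the hard-limiting neurons used earlier in these notes, is

$$y_j(p) = \operatorname{sign}\!\left[ \sum_{i=1}^{n} x_i(p)\, w_{ij}(p) \;-\; \theta_j \right]$$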

§ Step 3: Learning

o Update the weights in the network, where $\Delta w_{ij}(p)$ is the weight correction at iteration p:
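The update has the standard incremental form

$$w_{ij}(p+1) = w_{ij}(p) + \Delta w_{ij}(p)$$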


o The weight correction is determined by the generalised activity product rule:
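A common statement of this rule, equivalent to the forgetting-factor form of Hebb’s Law given earlier with $\lambda = \alpha / \varphi$, is

$$\Delta w_{ij}(p) = \varphi\, y_j(p) \left[ \lambda\, x_i(p) - w_{ij}(p) \right]$$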

§ Step 4: Iteration

o Increase iteration p by one, go back to Step 2.
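A minimal sketch of Steps 1-4 in NumPy, using the formulas above; the function name, the sign activation, and the parameter values are illustrative assumptions rather than part of the original notes:

```python
import numpy as np

def hebbian_train(X, n_outputs, alpha=0.1, phi=0.02, epochs=100, seed=0):
    """Hebbian learning with a forgetting factor. X is a (num_patterns, n)
    array of input vectors; weights and thresholds start as small random
    values in [0, 1] (Step 1) and are updated with the generalised activity
    product rule (Step 3), with lambda = alpha / phi."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = rng.uniform(0.0, 1.0, size=(n, n_outputs))    # Step 1: weights
    theta = rng.uniform(0.0, 1.0, size=n_outputs)     # Step 1: thresholds
    lam = alpha / phi
    for _ in range(epochs):                           # Step 4: iterate
        for x in X:
            y = np.where(x @ W - theta >= 0, 1, -1)   # Step 2: activation
            W += phi * y * (lam * x[:, None] - W)     # Step 3: weight update
    return W, theta
```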

Competitive Learning

§ Neurons compete among themselves to be activated.

§ While in Hebbian learning several output neurons can be activated simultaneously, in competitive learning only a single output neuron is active at any time.

§ The output neuron that wins the “competition” is called the winner-takes-all neuron.

§ In the late 1980s, Kohonen introduced a special class of ANN called self-organising maps. These maps are based on competitive learning.


Self-organising Map

§ Our brain is dominated by the cerebral cortex, a very complex structure of billions of neurons and hundreds of billions of synapses.

§ The cortex includes areas that are responsible for different human activities (motor, visual, auditory, etc.), each associated with a different sensory input.

§ Each sensory input is mapped into a corresponding area of the cerebral cortex. The cortex is a self-organising computational map in the human brain.


§ Feature-mapping Kohonen model

The Kohonen Network

§ The Kohonen model provides a topological mapping. It places a fixed number of input patterns from the input layer into a higher-dimensional output or Kohonen layer.


§ Training in the Kohonen network begins with the winner’s neighbourhood of a fairly large size. Then, as training proceeds, the neighbourhood size gradually decreases.

§ The lateral connections are used to create a competition between neurons. The neuron with the largest activation level among all neurons in the output layer becomes the winner.

§ The winning neuron is the only neuron that produces an output signal. The activity of all other neurons is suppressed in the competition.


§ The lateral feedback connections produce excitatory or inhibitory effects, depending on the distance from the winning neuron.

§ This is achieved by the use of a Mexican hat function, which describes the synaptic weights between neurons in the Kohonen layer.
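One common way to model the Mexican hat profile is a difference of two Gaussians: net excitation near the winner, net inhibition further away. A small sketch, with purely illustrative width and depth parameters:

```python
import numpy as np

def mexican_hat(d, sigma_excite=1.0, sigma_inhibit=3.0, inhibit_strength=0.5):
    """Difference-of-Gaussians approximation of the Mexican hat lateral
    interaction. d is the lattice distance from the winning neuron; the
    sigma and strength values are illustrative, not from the original notes."""
    excite = np.exp(-(d ** 2) / (2 * sigma_excite ** 2))
    inhibit = inhibit_strength * np.exp(-(d ** 2) / (2 * sigma_inhibit ** 2))
    return excite - inhibit
```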

§ In the Kohonen network, a neuron learns by shifting its weights from inactive connections to active ones. Only the winning neuron and its neighbourhood are allowed to learn.


Competitive Learning Algorithm

§ Step 1: Initialisation

o Set initial synaptic weights to small random values, say in an interval [0, 1], and assign a small positive value to the learning rate parameter $\alpha$.

§ Step 2: Activation and Similarity Matching

o Activate the Kohonen network by applying the input vector X, and find the winner-takes-all (best matching) neuron $j_X$ at iteration p, using the minimum-distance Euclidean criterion, where n is the number of neurons in the input layer and m is the number of neurons in the Kohonen layer:
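The criterion selects the neuron whose weight vector is closest to the input; a standard way to write it is

$$j_X(p) = \operatorname*{arg\,min}_{j = 1, \dots, m} \left\| \mathbf{X} - \mathbf{W}_j(p) \right\| = \operatorname*{arg\,min}_{j = 1, \dots, m} \sqrt{ \sum_{i=1}^{n} \left[ x_i - w_{ij}(p) \right]^2 }$$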


§ Step 3: Learning

o Update the synaptic weights, where $\Delta w_{ij}(p)$ is the weight correction at iteration p.

o The weight correction is determined by the competitive learning rule, where $\alpha$ is the learning rate parameter and $\Lambda_j(p)$ is the neighbourhood function centred around the winner-takes-all neuron $j_X$ at iteration p:
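In its standard form the rule moves the weights of neurons inside the winner’s neighbourhood towards the input vector and leaves all other weights unchanged:

$$\Delta w_{ij}(p) = \begin{cases} \alpha \left[ x_i - w_{ij}(p) \right], & j \in \Lambda_j(p) \\ 0, & j \notin \Lambda_j(p) \end{cases}$$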

§ Step 4: Iteration

o Increase iteration p by one, go back to Step 2 and continue until the minimum-distance Euclidean criterion is satisfied, or until no noticeable changes occur in the feature map.
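A minimal sketch of the whole algorithm for a one-dimensional Kohonen layer, assuming NumPy; the function name, the shrinking window neighbourhood, and the parameter schedule are illustrative assumptions:

```python
import numpy as np

def train_kohonen(X, m, alpha=0.1, radius0=None, epochs=100, seed=0):
    """Competitive learning for a 1-D Kohonen layer of m neurons.
    X is a (num_patterns, n) array of input vectors. The neighbourhood is a
    window of neurons around the winner whose radius shrinks linearly."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = rng.uniform(0.0, 1.0, size=(m, n))     # Step 1: small random weights
    if radius0 is None:
        radius0 = m // 2
    for epoch in range(epochs):                # Step 4: iterate over epochs
        radius = max(0, int(round(radius0 * (1 - epoch / epochs))))
        for x in X:
            # Step 2: winner = neuron with minimum Euclidean distance to x
            j_x = int(np.argmin(np.linalg.norm(W - x, axis=1)))
            # Step 3: move the winner and its neighbours towards the input
            lo, hi = max(0, j_x - radius), min(m, j_x + radius + 1)
            W[lo:hi] += alpha * (x - W[lo:hi])
    return W
```

In practice the learning rate $\alpha$ is usually also decayed over time, and a Gaussian neighbourhood function is often used instead of a hard window.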