Transcript
Reactive Behavior Modeling: Neural Networks
(GATE-561)
Dr. Çağatay ÜNDEĞER
Instructor, Middle East Technical University, Game Technologies
• Signals can be thought of as simple electrical impulses traveling from axons to dendrites.
• Dendrites collect the signals.
• The soma performs a kind of summation.
• The axon fires and transmits the signal.
Artificial Neural Networks (NNets)
• Neural networks use a similar approach.
• They consist of:
– Input arcs, which collect the signals,
– A neuron, which sums the signals, and
– Output arcs, which transmit the summed signal based on a transfer function.
• The weights of a neural network can be learned.
Neural Network Architecture
[Figure: a feed-forward network with an input layer (inputs 1-3), hidden layers, and an output layer (outputs 1-2). Arcs (storing weights) connect the neurons.]
Usage of NNets
• Environmental Scanning and Classification:
– Interpret vision and auditory information.
• Memory:
– Can learn through experience.
• Behavior Control:
– Outputs can be used as actions to be performed.
• Response Mapping:
– Any input-output mapping and association.
Usage of NNets
• The usage of NNets is a little bit fuzzy.
• There are no exact cases and rules.
• Treat them as a tool and use your creativity to get whatever you like.
• What NNets do is:
– Separate a solution space into partitions.
A Neuron Architecture
[Figure: a single neuron with inputs X1, X2, ..., Xi, ..., Xn on weighted arcs w1, w2, ..., wi, ..., wn, a bias node B, a summing junction Y, and an output y feeding the inputs of other neurons.]

Xi = input i of the neuron
wi = weight of input i
Y = summing junction
y = output of the neuron, based on a transfer or activation function
B = 1, a constant bias node (with weight b), which may represent past history
Neuron Architecture
[Figure: the same neuron, with the summation written out.]

The sum is referred to as the input activation Ya:

Ya = B*b + Σi=1..n Xi*wi

The input activation is fed into the activation function fa(x), and the output y = fa(Ya) is acquired. Inputs and weights can be in the range (-∞, +∞).
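As a concrete illustration, here is a minimal sketch of this summation in Python; the function names and example values are assumptions for illustration, not from the slides:

```python
def neuron_output(inputs, weights, bias_weight, activation):
    """Compute y = fa(Ya), where Ya = B*b + sum of Xi*wi and B = 1."""
    ya = bias_weight + sum(x * w for x, w in zip(inputs, weights))
    return activation(ya)

# Example: two inputs with arbitrary weights and a step activation (theta = 0).
y = neuron_output([0.5, -1.0], [0.8, 0.3],
                  bias_weight=0.1,
                  activation=lambda x: 1 if x >= 0 else 0)
print(y)  # 1, since Ya = 0.1 + 0.4 - 0.3 = 0.2 >= 0
```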
Some Activation Functions
[Figure: plots of a step function jumping from 0 to 1 at x = θ, and a linear function through the origin.]

Step function:
fs(x) = 1, if x ≥ θ
fs(x) = 0, if x < θ

Linear function:
fs(x) = x
Common Activation Function
[Figure: the S-shaped sigmoid curve rising from 0 to 1, plotted for θ = 1.]

Exponential / sigmoid function:
fs(x) = 1 / (1 + e^(-θx))

You can also scale and shift the output if required.
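The three activation functions above might be written as follows; this is a sketch, and the default θ values are assumptions:

```python
import math

def step(x, theta=0.0):
    """Step function: 1 if x >= theta, else 0."""
    return 1 if x >= theta else 0

def linear(x):
    """Linear (identity) function."""
    return x

def sigmoid(x, theta=1.0):
    """Sigmoid: 1 / (1 + e^(-theta*x)); theta controls the steepness."""
    return 1.0 / (1.0 + math.exp(-theta * x))
```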
A Group Of Neurons
• A single neuron will not do a lot for us.
• We need a group of them.

[Figure: a single-layer NNet and a two-layer NNet.]
Simple Logic Functions
• Created by McCulloch and Pitts in 1943.
• AND, OR, and XOR functions could be done with simple neural network structures.
Simple Logic Functions

[Figure: small threshold networks realizing the AND, OR, and XOR functions; a sketch in code follows.]
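One way to realize these functions with threshold units is sketched below; the particular weights and thresholds are a common textbook choice, not necessarily the ones on the slide:

```python
def mcculloch_pitts(inputs, weights, theta):
    """Threshold unit: fire (1) if the weighted sum reaches theta."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= theta else 0

def AND(a, b):
    return mcculloch_pitts([a, b], [1, 1], theta=2)

def OR(a, b):
    return mcculloch_pitts([a, b], [1, 1], theta=1)

def XOR(a, b):
    # XOR is not linearly separable, so it needs two layers:
    # a XOR b = (a OR b) AND NOT (a AND b).
    return mcculloch_pitts([OR(a, b), AND(a, b)], [1, -1], theta=1)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), OR(a, b), XOR(a, b))
```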
Learning NNets
• The weights of NNets can be learned by various algorithms.
• The most common algorithm is backpropagation.
Backpropagation
• Can learn the weights of a multi-layer network
– With a fixed set of units and interconnections.
• Employs a gradient descent search
– To minimize the squared error between the target outputs and the network outputs (formalized below).
• The search is sub-optimal and may fail to reach the true optimum.
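In standard notation (the slide does not spell this out, so take the symbols as assumptions), the error being minimized and the resulting weight update are:

E = ½ Σk (tk − ok)²   (squared error summed over the output units k)

wi ← wi − η · ∂E/∂wi   (gradient descent step, with learning rate η)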
Backpropagation
• The error surface of the network may have multiple local minima.
• Therefore, backpropagation is only guaranteed to converge to a local optimum and may fail to reach the true optimum.
Backpropagation
[Figure: a three-layer network with input nodes 01-03, hidden nodes 11-14, and output nodes 21-22, mapping inputs 1-3 to outputs 1-2.]

Initialize the network weights with small random numbers (e.g. in -0.5 ... 0.5). Train the network with the m training examples repeatedly for multiple iterations. Stop training when a convergence criterion is met.
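A minimal sketch of this initialization and training outline in Python; the function name and range defaults mirror the slide, while everything else is an assumption:

```python
import random

def init_weights(n_weights, low=-0.5, high=0.5):
    """Small random initial weights, e.g. in [-0.5, 0.5] as on the slide."""
    return [random.uniform(low, high) for _ in range(n_weights)]

weights = init_weights(10)

# Training outline, per the slide: repeat over the m training examples for
# multiple iterations, updating the weights after each example, and stop
# once a convergence criterion is met (e.g. the total error stops shrinking).
```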
Backpropagation
[Figure: the same network with inputs x01, x02, x03 applied, expected outputs t1, t2, and actual outputs o1, o2.]

For one training example, assume that the inputs are x01, x02, and x03, and the expected outputs are t1 and t2. We propagate the inputs through the network and compute the actual outputs o1 and o2.
Backpropagation

[Figure: the same network, with errors flowing backward from the outputs.]

Errors will occur between the actual and expected outputs. We propagate the errors backward through the network, as sketched below.
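To show the whole procedure end to end, here is a compact, textbook-style backpropagation sketch for a small 2-2-1 sigmoid network trained on XOR. The network shape, learning rate, and iteration count are assumptions, not the slide's exact network, and a 2-2-1 XOR net can occasionally stall in a local minimum, in which case a different seed or more iterations may be needed:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
# Hidden layer: 2 neurons, each with 2 input weights plus a bias weight.
w_hidden = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(2)]
# Output neuron: 2 hidden weights plus a bias weight.
w_out = [random.uniform(-0.5, 0.5) for _ in range(3)]

examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
eta = 0.5  # learning rate (assumed value)

for _ in range(10000):
    for x, t in examples:
        # Forward pass: propagate inputs to get the actual output o.
        h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
        o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
        # Backward pass: output delta, then hidden deltas via the out weights.
        delta_o = (t - o) * o * (1 - o)
        delta_h = [delta_o * w_out[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient descent weight updates.
        for j in range(2):
            w_out[j] += eta * delta_o * h[j]
        w_out[2] += eta * delta_o
        for j in range(2):
            for k in range(2):
                w_hidden[j][k] += eta * delta_h[j] * x[k]
            w_hidden[j][2] += eta * delta_h[j]

# Check the learned mapping against the expected outputs.
for x, t in examples:
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    print(x, t, round(o, 2))
```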