Komachi Lab M2 seminar, 2016/06/14: Chainer back propagation

Training a neural net:
1. Compute predict from input by the forward pass
2. Compute the Loss with the Loss function against the correct label
3. Compute the gradients gW1, gW2 by back propagation
4. Update the weights W1, W2 with the gradients (SGD, AdaGrad, etc.)
5. Go back to step 1

[Figure: input → W1 → W2 → predict, compared with the correct label by the Loss func; SGD updates W1, W2]
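The five steps above can be sketched end to end with plain NumPy. This is an illustrative two-layer net with made-up shapes and a mean-squared-error loss, not code from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (hypothetical shapes): 8 examples, 4 features, scalar target t.
x = rng.standard_normal((8, 4))                        # input
t = (x.sum(axis=1, keepdims=True) > 0).astype(float)   # correct label

# Weights of the two-layer net.
W1 = rng.standard_normal((4, 5)) * 0.1
W2 = rng.standard_normal((5, 1)) * 0.1

lr = 0.1  # SGD learning rate
losses = []
for step in range(100):
    # 1. forward pass
    h = np.tanh(x @ W1)                # hidden
    y = h @ W2                         # predict
    # 2. Loss (mean squared error stands in for the Loss func)
    loss = ((y - t) ** 2).mean()
    losses.append(loss)
    # 3. back propagation: gradients gW1, gW2
    gy = 2 * (y - t) / len(x)
    gW2 = h.T @ gy
    gh = gy @ W2.T
    gW1 = x.T @ (gh * (1 - h ** 2))    # tanh'(a) = 1 - tanh(a)^2
    # 4. update the weights with the gradients (plain SGD)
    W1 -= lr * gW1
    W2 -= lr * gW2
    # 5. loop back to step 1
```

After training, `losses` decreases step by step, which is the behaviour the five-step loop is meant to produce.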
Notation
✤ W1, W2: the weight matrices
✤ x, h, y, t: the input, hidden, output, and target vectors (i: index into the train set)
✤ Loss: the loss computed from y and the correct label t

[Figure: input:x → hidden:h → output:y, compared against target:t (the correct label)]
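With this notation, the forward pass of the two-layer net on the slide can be written compactly (assuming a tanh hidden activation, as used later in the talk):

```latex
h = \tanh(W_1 x_i), \qquad y = W_2 h, \qquad \mathrm{Loss} = L(y, t_i)
```

Back propagation then yields the gradients gW1 and gW2 of the Loss with respect to W1 and W2.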
Chainer
✤ A deep neural network framework developed by PFN
  http://chainer.org/
✤ Adopts Define-by-Run rather than Define-and-Run: the computation graph is built on the fly as the forward code executes
✤ Related frameworks: TensorFlow, Theano, Torch, Keras, Caffe, etc...
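The Define-by-Run idea can be illustrated with a tiny pure-Python sketch (this is not Chainer's actual API): every arithmetic operation records its inputs and local derivatives as it runs, so ordinary Python control flow shapes the graph on each iteration, and `backward` walks the recorded edges.

```python
# Minimal define-by-run autograd sketch (hypothetical, for illustration only).
class Var:
    def __init__(self, value, parents=(), local_grads=()):
        self.value = value
        self.parents = parents          # edges recorded at run time
        self.local_grads = local_grads  # d(self)/d(parent) for each parent
        self.grad = 0.0

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=(self, other),
                   local_grads=(other.value, self.value))

    def __add__(self, other):
        return Var(self.value + other.value,
                   parents=(self, other),
                   local_grads=(1.0, 1.0))

    def backward(self, g=1.0):
        # Accumulate the incoming gradient, then push it to the parents.
        self.grad += g
        for p, lg in zip(self.parents, self.local_grads):
            p.backward(g * lg)

x = Var(3.0)
y = x * x + x      # the graph is defined while this line runs
y.backward()       # dy/dx = 2x + 1
```

A Define-and-Run framework would instead require declaring the whole graph before feeding any data through it.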
Back propagation on Chainer
✤ The weight parameters W are managed and updated by Chainer
✤ tanh's forward and backward passes are implemented in chainer.functions.activation.tanh
✤ You define the forward pass in a Chain; the backward pass is derived automatically by Chainer
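A function like tanh supplies a forward/backward pair. A NumPy sketch of that pair (an illustration, not Chainer's source) exploits the identity tanh'(x) = 1 - tanh(x)^2, so the backward pass can reuse the saved forward output y instead of recomputing tanh:

```python
import numpy as np

def tanh_forward(x):
    # forward: y = tanh(x); the output is kept for the backward pass
    return np.tanh(x)

def tanh_backward(y, gy):
    # backward: gx = gy * (1 - y^2), using the saved output y
    return gy * (1.0 - y * y)

x = np.array([-1.0, 0.0, 2.0])
y = tanh_forward(x)
gx = tanh_backward(y, np.ones_like(x))

# sanity check against a central-difference numerical gradient
eps = 1e-6
num = (np.tanh(x + eps) - np.tanh(x - eps)) / (2 * eps)
```

Chaining such pairs through a Chain is exactly what lets Chainer run the whole backward pass automatically once forward has been defined.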