Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks
Author: Bogdan M. Wilamowski, Fellow, IEEE, Nicholas J. Cotton, Okyay Kaynak, Fellow, IEEE, and Günhan Dündar
Source: IEEE INDUSTRIAL ELECTRONICS MAGAZINE
Date: 2012/3/28
Presenter: 林哲緯
Transcript
Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks
Author: Bogdan M. Wilamowski, Fellow, IEEE, Nicholas J. Cotton, Okyay Kaynak, Fellow, IEEE, and Günhan Dündar
Source: IEEE INDUSTRIAL ELECTRONICS MAGAZINE
Date: 2012/3/28
Presenter: 林哲緯
• Levenberg–Marquardt algorithm
  – Combines the advantages of the Gauss–Newton algorithm and the steepest descent method
  – Far from the minimum, it behaves like the steepest descent method
  – Close to the minimum, it behaves like the Gauss–Newton algorithm
  – It finds a local minimum, not the global minimum
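The blending described above can be sketched with the standard Levenberg–Marquardt update, w ← w − (JᵀJ + μI)⁻¹Jᵀe: a large damping term μ makes the step resemble steepest descent, while a small μ recovers the Gauss–Newton step. The following is a minimal illustrative sketch, not the paper's implementation for arbitrarily connected networks; the function names and the toy line-fitting problem are hypothetical.

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, w, mu=1e-2, iters=50):
    """Illustrative Levenberg-Marquardt sketch (hypothetical helper).

    Update rule: w <- w - (J^T J + mu*I)^{-1} J^T e
    Large mu -> behaves like steepest descent (robust far from the minimum).
    Small mu -> behaves like Gauss-Newton (fast close to the minimum).
    """
    for _ in range(iters):
        e = residual(w)                 # error (residual) vector
        J = jacobian(w)                 # Jacobian matrix of the residuals
        step = np.linalg.solve(J.T @ J + mu * np.eye(len(w)), J.T @ e)
        w_new = w - step
        if np.sum(residual(w_new) ** 2) < np.sum(e ** 2):
            w, mu = w_new, mu * 0.5     # step accepted: trust Gauss-Newton more
        else:
            mu *= 2.0                   # step rejected: lean toward steepest descent
    return w

# Hypothetical toy problem: fit y = a*x + b with exact data y = 2x + 1.
x = np.linspace(0.0, 1.0, 10)
y = 2.0 * x + 1.0
residual = lambda w: w[0] * x + w[1] - y
jacobian = lambda w: np.column_stack([x, np.ones_like(x)])
w = levenberg_marquardt(residual, jacobian, np.array([0.0, 0.0]))
```

Because this toy objective has a single (global) minimum, the sketch converges to a ≈ 2, b ≈ 1; on a multimodal error surface, as the slide notes, only a local minimum is guaranteed.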