Conjugate Gradient Algorithm Based on Aitken's Process for Training Neural Networks
AL-Rafidain Journal of Computer Sciences and Mathematics
Article 4, Volume 11, Issue 1, June 2014, Pages 39-51
Document Type: Research Paper
DOI: 10.33899/csmj.2014.163730
Authors
Khalil K. Abbo; Hind H. Mohammed
College of Computer Science and Mathematics, University of Mosul, Iraq
Abstract
Conjugate gradient methods are excellent neural network training methods because of their simplicity, numerical efficiency, and very low memory requirements. It is well known that training a neural network is essentially an unconstrained optimization problem, and many attempts have been made to speed up this process. In particular, various algorithms motivated by numerical optimization theory have been applied to accelerate neural network training. In this paper, we propose a conjugate gradient neural network training algorithm based on Aitken's process that guarantees sufficient descent with the Wolfe line search. Moreover, we establish that the proposed method is globally convergent for general functions under the strong Wolfe conditions. In the experimental results, we compare the behavior of the proposed method (NACG) with well-known methods in this field.
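The abstract does not give the NACG update itself, but the classical Aitken delta-squared process it builds on is standard: given a linearly convergent sequence x_n, the accelerated term is x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2 x_{n+1} + x_n). A minimal illustrative sketch (the function name and the geometric test sequence are our own, not from the paper):

```python
def aitken_delta_squared(seq):
    """Apply Aitken's delta-squared process to a scalar sequence.

    Each output term combines three consecutive inputs; for a
    linearly convergent sequence the result converges faster.
    """
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2 * x1 + x0  # second forward difference
        if denom == 0:
            out.append(x2)  # avoid division by zero; sequence already stalled
        else:
            out.append(x0 - (x1 - x0) ** 2 / denom)
    return out

# Example: the geometric sequence x_n = 0.5**n converges linearly to 0;
# Aitken's process recovers the limit exactly for a geometric sequence.
accelerated = aitken_delta_squared([0.5 ** n for n in range(8)])
```

For a purely geometric sequence the formula cancels exactly, so every accelerated term equals the limit 0; for more general linearly convergent sequences it merely improves the convergence rate.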
Keywords
Feed-forward artificial neural networks; Training algorithms; Aitken's method; Conjugate gradient algorithms