Journal of Northeastern University Natural Science ›› 2020, Vol. 41 ›› Issue (9): 1274-1279.DOI: 10.12068/j.issn.1005-3026.2020.09.010

• Mechanical Engineering •

A Compound Gradient Acceleration Optimization Algorithm with Adaptive Step Size

YIN Ming-ang1, WANG Yu-shuo2, SUN Zhi-li1, YU Yun-fei3   

  1. School of Mechanical Engineering & Automation, Northeastern University, Shenyang 110819, China; 2. CRRC Changchun Railway Vehicles Co., Ltd., Changchun 130062, China; 3. AVIC Shenyang Engine Design Institute, Shenyang 110015, China.
  • Received: 2020-01-09  Revised: 2020-01-09  Online: 2020-09-15  Published: 2020-09-15
  • Contact: YIN Ming-ang

Abstract: In related research, a class of accelerated algorithms with adaptive iteration step size, typified by Adam, has become a research hotspot because of its high computational efficiency and broad compatibility. To address Adam's low convergence rate, this paper proposed a new Adam-type algorithm, the compound gradient descent method (C-Adam), which combines the current gradient, a prediction gradient, and the historical momentum gradient, and proved its convergence. C-Adam differs from other acceleration algorithms in that it distinguishes the prediction gradient from the historical momentum and finds a more accurate search direction for the next iteration through a real gradient update. C-Adam was tested on two benchmark data sets and on data from static tensile experiments on 45 steel; the results show that it converges faster and achieves smaller training loss than other popular algorithms.
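
The compound update described in the abstract can be illustrated with a short sketch. The Python code below is a minimal illustration under stated assumptions, not the paper's exact method: the look-ahead point used for the prediction gradient, the equal-weight blending of the current and prediction gradients, and the hyperparameter defaults are assumptions layered on a standard Adam update; the function name c_adam and the quadratic test problem are hypothetical.

    import numpy as np

    def c_adam(grad_fn, x0, lr=0.001, beta1=0.9, beta2=0.999,
               eps=1e-8, n_iters=1000):
        # Sketch of a C-Adam-style iteration: blend the current gradient,
        # a prediction gradient evaluated at a look-ahead point, and the
        # historical momentum, then apply Adam's adaptive step size.
        # The blending and look-ahead rules here are assumptions.
        x = np.asarray(x0, dtype=float)
        m = np.zeros_like(x)  # first moment: historical momentum
        v = np.zeros_like(x)  # second moment: drives the adaptive step size
        for t in range(1, n_iters + 1):
            g = grad_fn(x)                # current gradient
            x_ahead = x - lr * m          # look-ahead point (assumed Nesterov-style)
            g_pred = grad_fn(x_ahead)     # prediction gradient: a real gradient evaluation
            g_comp = 0.5 * (g + g_pred)   # compound direction (equal weights assumed)
            m = beta1 * m + (1.0 - beta1) * g_comp
            v = beta2 * v + (1.0 - beta2) * g_comp**2
            m_hat = m / (1.0 - beta1**t)  # bias-corrected moments, as in Adam
            v_hat = v / (1.0 - beta2**t)
            x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
        return x

    # Example: minimize f(x) = ||x||^2, whose gradient is 2x.
    x_min = c_adam(lambda x: 2.0 * x, x0=np.ones(3))

Note that in this sketch the prediction gradient comes from an extra gradient evaluation rather than from the momentum buffer, so each iteration costs two gradient calls; this reflects the abstract's point that the prediction gradient is obtained through a real gradient update and is kept distinct from the historical momentum.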

Key words: first-order optimization algorithm, compound gradient descent method, logistic regression, pattern recognition
