Journal of Northeastern University (Natural Science) ›› 2007, Vol. 28 ›› Issue (9): 1247-1249. DOI: -

• Original Articles •

A global optimization algorithm based on filled-function for neural networks

Li, Hong-Ru; Li, Hai-Long

  1. School of Information Science and Engineering, Northeastern University, Shenyang 110004, China
  • Received: 2013-06-24  Revised: 2013-06-24  Online: 2007-09-15  Published: 2013-06-24
  • Corresponding author: Li, H.-R.
  • About author: -
  • Supported by:
    National Natural Science Foundation of China (60674063)

Abstract: To overcome the defect that the BP algorithm for feed-forward neural networks tends to get trapped in local minima when the initial weights are chosen improperly, a new global optimization training algorithm is proposed. First, a new filled function is presented and its filled property is proved; this filled function is then combined with the BP algorithm to construct a filled-function-based global optimization algorithm for neural networks. When the algorithm is used to train a neural network and the error function falls into a local minimum, the filled function is used to help the error function escape from local optima repeatedly until the global optimum is found. The greatest advantage of the new algorithm is that it does not depend on the initial weights, so the tendency of the BP algorithm to fall into local minima is avoided. Theoretical analysis and simulation results demonstrate the effectiveness and superiority of the proposed global optimization algorithm for neural networks.

Keywords: feed-forward neural network, BP algorithm, filled function, global optimization, local minimum

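The abstract describes the training loop only at a high level: run BP until the error function stalls at a local minimum, minimize a filled function built around that minimizer to escape the basin, then resume BP, repeating until no better minimum is found. The following is a minimal, illustrative sketch of such a loop, not the paper's algorithm: it uses a classical Ge-type filled function and a simple multimodal test objective in place of the paper's new filled function and the network error function, and every name and parameter in it (f, local_descent, filled, r, rho, delta) is a hypothetical stand-in.

```python
# Illustrative sketch only -- not the algorithm from the paper.  It combines
# plain gradient descent (standing in for BP training of the error function
# E(w)) with a classical Ge-type filled function used to escape local minima.
import numpy as np

def f(x):
    # Simple multimodal test objective standing in for the network error E(w).
    return 0.1 * np.sum(x ** 2) + np.sum(np.sin(3.0 * x) ** 2)

def num_grad(fun, x, eps=1e-6):
    # Central-difference gradient; BP would supply analytic gradients instead.
    g = np.zeros_like(x)
    for i in range(x.size):
        d = np.zeros_like(x)
        d[i] = eps
        g[i] = (fun(x + d) - fun(x - d)) / (2.0 * eps)
    return g

def local_descent(fun, x0, lr=0.01, iters=2000):
    # Gradient descent to the nearest local minimum (the "BP" phase).
    x = x0.copy()
    for _ in range(iters):
        x = x - lr * num_grad(fun, x)
    return x

def filled(x, x_star, r=1.0, rho=1.0):
    # Classical Ge-type filled function built around the current minimizer
    # x_star (a hypothetical choice; the paper's own filled function differs).
    return np.exp(-np.sum((x - x_star) ** 2) / rho ** 2) / (r + f(x))

def train_with_filled_function(x0, outer_iters=10, delta=0.5, seed=0):
    # Alternate between descent on f and filled-function minimization,
    # keeping the best local minimizer found so far.
    rng = np.random.default_rng(seed)
    x_star = local_descent(f, x0)
    for _ in range(outer_iters):
        # Start slightly off x_star and minimize the filled function to
        # leave the current basin of attraction.
        x_try = x_star + delta * rng.standard_normal(x_star.size)
        x_escape = local_descent(lambda x: filled(x, x_star), x_try,
                                 lr=0.05, iters=500)
        # Resume ordinary descent on the error function from the escape point.
        x_new = local_descent(f, x_escape)
        if f(x_new) < f(x_star) - 1e-8:
            x_star = x_new  # a lower minimum was found; continue from it
    return x_star

if __name__ == "__main__":
    w0 = np.array([2.0, -1.5])          # stands in for the initial weights
    w_best = train_with_filled_function(w0)
    print("best point:", w_best, "objective value:", float(f(w_best)))
```

In the paper's setting, f would be the error function over the network weights and local_descent would be replaced by BP itself; the filled-function phase is what removes the dependence on the chosen initial weights that the abstract emphasizes.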

CLC number: