Journal of Northeastern University Natural Science ›› 2017, Vol. 38 ›› Issue (2): 153-157.DOI: 10.12068/j.issn.1005-3026.2017.02.001

• Information & Control •

A Class of Neural Networks for Solving Optimization Problems with Global Attractivity

WANG Zhan-shan1, KANG Yun-yun2, NIU Hai-sha1   

  1. School of Information Science & Engineering, Northeastern University, Shenyang 110819, China; 2. Chizhou Power Supply Company of State Grid, Chizhou 247100, China.
  • Received:2015-10-18 Revised:2015-10-18 Online:2017-02-15 Published:2017-03-03
  • Contact: WANG Zhan-shan

Abstract: A recurrent neural network in the form of a differential inclusion was proposed for solving a class of nonlinear optimization problems whose constraints comprise both inequalities and equalities. A higher-order compensation term was incorporated into the neural model, which significantly increased the convergence rate of the neural computation and eliminated the instability that arises as the state moves from the infeasible region into the feasible region. It was proven that the solution of the proposed network not only exists globally and uniquely, but is also bounded and convergent to the optimal solution set of the optimization problem. The global attractivity of the neural network was also analyzed. Three numerical examples were used to demonstrate the effectiveness and good performance of the proposed neural network.
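The paper's differential-inclusion model and its higher-order compensation term are not reproduced here. Purely as an illustration of the general idea, solving a constrained optimization problem by integrating a "neural" dynamical system until its state settles at (an approximation of) the optimum, the following sketch uses a simple penalty-based gradient flow discretized by Euler steps. The objective, constraints, and penalty formulation are assumptions for the example, not the authors' model:

```python
import numpy as np

# Illustrative penalty-based neural dynamics (NOT the paper's model):
# minimize f(x) = ||x - c||^2  subject to  x >= 0  and  sum(x) = 1.
# The "network state" x evolves along  dx/dt = -(grad f + penalty gradients).
def neural_flow(c, rho=50.0, step=1e-3, iters=20000):
    x = np.zeros_like(c)                          # initial network state
    for _ in range(iters):
        grad_f = 2.0 * (x - c)                    # gradient of the objective
        g_ineq = rho * np.minimum(x, 0.0)         # penalty gradient for x >= 0
        g_eq = rho * (x.sum() - 1.0) * np.ones_like(x)  # penalty for sum(x) = 1
        x = x - step * (grad_f + g_ineq + g_eq)   # Euler step of the dynamics
    return x

x_star = neural_flow(np.array([0.8, 0.5, -0.3]))
# The state converges near the projection of c onto the simplex, about
# [0.66, 0.36, -0.02]; the small residual violation shrinks as rho grows.
```

With a finite penalty weight rho the equilibrium only approximates the constrained optimum; models of the kind studied in the paper replace the smooth penalty with a differential inclusion so that exact feasibility and convergence to the optimal solution set can be established.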

Key words: neural networks, differential inclusion, neural computation, optimization, global attractivity
