An Integer-Fractional Gradient Algorithm for Back Propagation Neural Networks
Abstract
This paper proposes a new optimization algorithm for backpropagation (BP) neural networks that fuses integer-order and fractional-order differentiation. Although fractional-order differentiation has significant advantages in describing complex phenomena with long-term memory effects and nonlocality, its application in neural networks is often limited by a lack of physical interpretability and by inconsistencies with traditional models. To address these challenges, we propose a mixed integer-fractional (MIF) gradient descent algorithm for training neural networks, and we provide a detailed convergence analysis of the proposed method. Finally, numerical experiments illustrate that the new gradient descent algorithm not only speeds up the convergence of BP neural networks but also increases their classification accuracy.
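The abstract does not state the exact update rule, but one common way to blend the two gradient orders is a convex combination of the standard (integer-order) gradient with a Caputo-style fractional-order gradient term. The sketch below illustrates that idea only; the function name mif_update, the mixing weight lam, the fractional order alpha, and the Caputo-type approximation are all assumptions made for illustration, not the paper's published method.

```python
import numpy as np
from scipy.special import gamma


def mif_update(w, grad, w_prev, lr=0.01, alpha=0.9, lam=0.5, eps=1e-8):
    """One hypothetical mixed integer-fractional (MIF) gradient step.

    Blends the ordinary gradient with a Caputo-style fractional-order
    term of order alpha in (0, 1), anchored at the previous iterate.
    All hyperparameter names and defaults here are assumptions.
    """
    # Caputo-type approximation of the order-alpha gradient:
    #   D^alpha f(w) ~ f'(w) * |w - w_prev|^(1 - alpha) / Gamma(2 - alpha)
    frac_grad = grad * (np.abs(w - w_prev) + eps) ** (1.0 - alpha) / gamma(2.0 - alpha)
    # Convex combination of the integer- and fractional-order terms.
    return w - lr * (lam * grad + (1.0 - lam) * frac_grad)


# Example: minimize f(w) = 0.5 * ||w||^2, whose gradient is simply w.
w_prev = np.zeros(3)
w = np.array([1.0, -2.0, 0.5])
for _ in range(200):
    w, w_prev = mif_update(w, grad=w, w_prev=w_prev), w
print(w)  # approaches the minimizer at the origin
```

Setting lam = 1 in this sketch recovers ordinary gradient descent, so the fractional term can be read as a memory-dependent correction to the standard update.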
Related items
Showing items related by title, author, creator and subject.
- Saptoro, Agus (2012) Recently, artificial neural networks, especially feedforward neural networks, have been widely used for the identification and control of nonlinear dynamical systems. However, the determination of a suitable set of ...
- Chan, Kit Yan; Ling, S.; Dillon, Tharam; Nguyen, H. (2011) Hypoglycemia or low blood glucose is dangerous and can result in unconsciousness, seizures and even death for Type 1 diabetes mellitus (T1DM) patients. Based on the T1DM patients' physiological parameters, corrected QT ...
- Goh, Kwang Leng (2013) Web spamming has tremendously subverted the ranking mechanism of information retrieval in Web search engines. It manipulates data source maliciously either by contents or links with the intention of contributing negative ...