
Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method

Published in Frontiers of Information Technology & Electronic Engineering

Abstract

We introduce the fractional-order global optimal backpropagation machine, trained by an improved fractional-order steepest descent method (FSDM). It is a fractional-order backpropagation neural network (FBPNN), a state-of-the-art fractional-order branch of the family of backpropagation neural networks (BPNNs), and differs from most previous classic first-order BPNNs, which are trained by the traditional first-order steepest descent method. The reverse incremental search of the proposed FBPNN proceeds in the negative directions of the approximate fractional-order partial derivatives of the square error. First, the theoretical concept of an FBPNN trained by an improved FSDM is described mathematically. Then, the mathematical proof of fractional-order global optimal convergence, an assumption on the structure, and the fractional-order multi-scale global optimization of the FBPNN are analyzed in detail. Finally, we perform three types of experiments to compare the performance of an FBPNN with that of a classic first-order BPNN: example function approximation, fractional-order multi-scale global optimization, and a comparison of global search and error-fitting abilities on real data. The stronger ability of an FBPNN to locate the global optimal solution is its major advantage over a classic first-order BPNN.
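
As a rough illustration of the reverse incremental search described above, the sketch below trains a tiny one-hidden-layer network with a fractional-order steepest descent update. It assumes the common first-order Caputo-type truncation D^nu_w E ≈ (dE/dw) · |w − c|^(1−nu) / Γ(2−nu), which reduces to ordinary gradient descent as nu → 1. The fractional order nu, lower terminal c, learning rate mu, and the helper frac_grad are illustrative choices for this sketch, not the paper's exact improved FSDM.

    import numpy as np
    from scipy.special import gamma

    def frac_grad(grad, w, nu=0.9, c=0.0, eps=1e-8):
        """Approximate fractional-order partial derivative of the square error.

        Uses the common first-order Caputo-type truncation
            D^nu_w E ~= (dE/dw) * |w - c|**(1 - nu) / Gamma(2 - nu),
        which reduces to the ordinary gradient as nu -> 1.
        """
        return grad * np.abs(w - c + eps) ** (1.0 - nu) / gamma(2.0 - nu)

    # Tiny one-hidden-layer network fitting y = sin(x) on [-pi, pi]
    rng = np.random.default_rng(0)
    x = np.linspace(-np.pi, np.pi, 64).reshape(-1, 1)
    y = np.sin(x)

    W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
    W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
    mu, nu = 0.05, 0.9                      # learning rate and fractional order

    for epoch in range(2000):
        # Forward pass
        h = np.tanh(x @ W1 + b1)
        out = h @ W2 + b2
        err = out - y                        # square error E = 0.5 * sum(err**2)

        # Ordinary backpropagated gradients
        gW2 = h.T @ err
        gb2 = err.sum(0)
        dh = (err @ W2.T) * (1 - h ** 2)
        gW1 = x.T @ dh
        gb1 = dh.sum(0)

        # Reverse incremental search along the negative approximate
        # fractional-order partial derivatives of the square error
        W2 -= mu * frac_grad(gW2, W2, nu)
        b2 -= mu * frac_grad(gb2, b2, nu)
        W1 -= mu * frac_grad(gW1, W1, nu)
        b1 -= mu * frac_grad(gb1, b1, nu)

    print("final MSE:", float(np.mean(err ** 2)))

Setting nu = 1.0 in this sketch recovers the classic first-order steepest descent update, which makes it easy to compare the two search behaviors on the same toy problem.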



Author information


Contributions

Yi-fei PU designed the research and drafted the manuscript. Jian WANG helped organize the manuscript. Yi-fei PU and Jian WANG processed the data, and revised and finalized the paper.

Corresponding authors

Correspondence to Yi-fei Pu or Jian Wang.

Additional information

Compliance with ethics guidelines

Yi-fei PU and Jian WANG declare that they have no conflict of interest.

Project supported by the National Key Research and Development Program of China (No. 2018YFC0830300) and the National Natural Science Foundation of China (No. 61571312).


About this article


Cite this article

Pu, Y.-f., Wang, J. Fractional-order global optimal backpropagation machine trained by an improved fractional-order steepest descent method. Front Inform Technol Electron Eng 21, 809–833 (2020). https://doi.org/10.1631/FITEE.1900593

