Open access paper

Non-negative Radial Basis Function Neural Network in Polynomial Feature Space

Huiyang Wang et al.

Published under licence by IOP Publishing Ltd
Citation: Huiyang Wang et al 2019 J. Phys.: Conf. Ser. 1168 062005. DOI: 10.1088/1742-6596/1168/6/062005


Abstract

The radial basis function neural network (RBFNN) is an effective nonlinear learning model with strong nonlinear fitting capability. The hidden neurons and the weights play important roles in the network. In existing research, the hidden neurons are computed as a rigid linear combination, and the weights are difficult to solve for. To address these issues, this paper proposes a novel neural network, the non-negative radial basis function neural network (NRBFNN). The core idea of non-negative matrix factorization (NMF) is used to train the parameters of the RBFNN: following the structure of the network, the label information of the samples is decomposed into a weight matrix and the features mapped by the activation functions in a polynomial feature space. The proposed method thereby obtains the weight matrix and the hidden neurons implied in the activation functions iteratively. Furthermore, the proposed NRBFNN improves the representability of the hidden neurons, and the iterative update formulas for the weights ensure their solvability and interpretability. The ORL, Yale and Caltech 101 face databases are selected for evaluation. Experimental results show that the proposed algorithm outperforms several related algorithms.
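The abstract describes decomposing the label matrix into a non-negative weight matrix and hidden-layer features via NMF-style iterative updates. The paper's exact update formulas are not given here, so the sketch below is only an illustration of the general scheme under stated assumptions: Gaussian RBF activations (which are non-negative by construction) with fixed centers, and the classical Lee–Seung multiplicative rule to learn a non-negative output weight matrix. All function names and parameters are hypothetical, not from the paper.

```python
import numpy as np

def rbf_features(X, centers, gamma=1.0):
    # Hidden-layer activations: Gaussian RBF distance of each sample to each
    # center. Shape (n_samples, n_centers); all entries are non-negative.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def nmf_weight_update(Y, H, n_iter=200, eps=1e-9):
    # Multiplicative NMF-style updates for a non-negative weight matrix W,
    # minimizing ||Y - H W||_F^2 with Y >= 0 and H >= 0 (Lee-Seung rule).
    # Non-negativity of W is preserved because every factor is non-negative.
    rng = np.random.default_rng(0)
    W = rng.random((H.shape[1], Y.shape[1]))
    for _ in range(n_iter):
        W *= (H.T @ Y) / (H.T @ H @ W + eps)
    return W
```

As a toy usage, one-hot label rows can be approximated as `H @ W`, and class predictions read off with `argmax` over the output columns; the paper additionally updates the hidden neurons themselves, which this sketch keeps fixed.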


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
