Effect of Activation Function Symmetry on Training of SFFANNs with RPROP Algorithm

ABSTRACT
On ten learning tasks (5 function approximation and 5 real-life regression problems), we compare the efficiency and efficacy of asymmetric and anti-symmetric activation functions in the training and usage of sigmoidal feedforward artificial neural networks. The results obtained in the experiments allow us to conclude that, for networks trained with the improved variant of the resilient backpropagation (RPROP) algorithm, asymmetric activation functions such as the logistic (log-sigmoid) function should be preferred over an anti-symmetric function having the same derivative.
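The abstract does not spell out the particular function pair compared; one natural example of an asymmetric/anti-symmetric pair with identical derivatives (an illustrative assumption, not necessarily the paper's exact setup) is the log-sigmoid σ(x) = 1/(1 + e^(−x)) together with its vertically shifted, odd counterpart σ(x) − 1/2. The short NumPy sketch below checks numerically that the two functions share the same derivative.

```python
import numpy as np

def log_sigmoid(x):
    """Asymmetric logistic (log-sigmoid) activation, values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def shifted_sigmoid(x):
    """Anti-symmetric (odd) counterpart used here for illustration:
    log_sigmoid(x) - 1/2, values in (-1/2, 1/2). A constant vertical
    shift leaves the derivative unchanged."""
    return log_sigmoid(x) - 0.5

x = np.linspace(-5.0, 5.0, 11)

# Analytic derivative shared by both functions: s(x) * (1 - s(x)).
d_analytic = log_sigmoid(x) * (1.0 - log_sigmoid(x))

# Central-difference derivative of the anti-symmetric variant.
h = 1e-6
d_numeric = (shifted_sigmoid(x + h) - shifted_sigmoid(x - h)) / (2.0 * h)

# Effectively zero (up to floating-point error): the derivatives coincide.
print(np.max(np.abs(d_analytic - d_numeric)))
```

Under this assumed pairing, the two activations differ only by a vertical shift, so their slopes, and hence the back-propagated gradient magnitudes, coincide; what differs is the output range and its symmetry about zero, which is the property the experiments compare.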