A new model for time-series forecasting using radial basis functions and exogenous data

  • Original Article
  • Neural Computing & Applications

Abstract

In this paper, we present a new model for time-series forecasting that uses radial basis functions (RBFs) as the units of an artificial neural network (ANN) and allows the inclusion of exogenous information (EI) without additional pre-processing. We begin by summarizing the best-known ad hoc techniques for incorporating EI, namely principal component analysis (PCA) and independent component analysis (ICA), and we analyze their advantages and disadvantages for time-series forecasting using Spanish bank and company stock series. We then describe a new hybrid model for time-series forecasting that combines ANNs with genetic algorithms (GAs), and we discuss the possibilities for implementing the model on parallel processing systems.
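To make the architecture concrete, the following is a minimal sketch of an RBF forecaster with exogenous inputs, in the spirit of the model described above but not the authors' exact design: lagged series values and one exogenous regressor form each input pattern, Gaussian units are centered on randomly chosen training patterns, and the linear output layer is solved by least squares. The lag count, unit count, width heuristic, and function names are all illustrative assumptions.

```python
import numpy as np

def rbf_design_matrix(X, centers, widths):
    """Gaussian RBF activations: one row per input pattern, one column per unit."""
    # Squared Euclidean distances between every pattern and every center
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * widths ** 2))

def fit_rbf_forecaster(series, exog, n_lags=4, n_units=10, seed=0):
    """Least-squares fit of the linear output layer of an RBF net whose
    inputs are lagged series values plus one exogenous regressor."""
    rng = np.random.default_rng(seed)
    # Input patterns: [y_{t-1}, ..., y_{t-n_lags}, e_t]  ->  target y_t
    X = np.column_stack(
        [series[n_lags - k - 1 : len(series) - k - 1] for k in range(n_lags)]
        + [exog[n_lags:]]
    )
    y = series[n_lags:]
    # Centers drawn from the training patterns; one global width heuristic
    centers = X[rng.choice(len(X), size=n_units, replace=False)]
    widths = np.full(n_units, X.std())
    Phi = rbf_design_matrix(X, centers, widths)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, widths, w
```

In the hybrid model described in the abstract, a genetic algorithm is combined with the network; one natural role for it (an assumption here, not a claim about the paper) is searching over structural choices such as the lag count, number of units, and widths rather than fixing them by hand.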


Notes

  1. Before discussing this problem, we prove the existence of an exact representation of the continuous function in terms of simpler functions, using Kolmogorov's theorem (stated in full after these notes).

  2. The subscript k refers to the structure, or subset, of loss functions used in the approximation.

  3. Roughly speaking, the VC dimension, h, measures the largest number of training points that can be separated under every possible labeling by functions of the class; for example, oriented lines in the plane can shatter at most three points, so h = 3 for that class.

  4. In the strict sense presented in [33], that is, they are bounded functions or satisfy a certain inequality.

  5. For example, Vapnik's ε-insensitive loss function [33] (see the code sketch after these notes): \( L(f(x) - y) = \begin{cases} |f(x) - y| - \varepsilon & \text{for } |f(x) - y| \geqslant \varepsilon \\ 0 & \text{otherwise} \end{cases} \)

  6. This calculation must be performed several times during the process.

  7. Generalized differentiation of a function: \( \mathrm{d}R[f] = \left[ (\mathrm{d}/\mathrm{d}\rho)\, R[f + \rho h] \right] \), where \( h \in H \).

  8. The principal feature of these algorithms is the sequential adaptation of neural resources; see the sketch after these notes.
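
Note 1 invokes Kolmogorov's superposition theorem. In its standard form, any continuous function \( f \) on \( [0,1]^n \) admits the exact representation

\[ f(x_1, \dots, x_n) = \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \psi_{pq}(x_p) \right), \]

where the \( \Phi_q \) and \( \psi_{pq} \) are continuous functions of a single variable.

The ε-insensitive loss of note 5 is easy to state in code; the following NumPy version is a minimal illustration (the function name and default ε are ours, not the paper's):

```python
import numpy as np

def eps_insensitive_loss(pred, target, eps=0.1):
    """Vapnik's epsilon-insensitive loss: zero inside the eps tube,
    linear in the residual magnitude beyond it."""
    residual = np.abs(pred - target)
    return np.where(residual >= eps, residual - eps, 0.0)
```

The sequential adaptation of note 8 is exemplified by Platt's resource-allocating network (RAN) [3], which adds an RBF unit only when the current input is both poorly predicted and far from every existing center, and otherwise merely updates the existing parameters. A minimal sketch of that novelty test, with illustrative thresholds (the values 0.05 and 0.5 are assumptions):

```python
import numpy as np

def ran_should_allocate(x, y_pred, y_true, centers,
                        err_thresh=0.05, dist_thresh=0.5):
    """Platt-style novelty test: allocate a new RBF unit only when the
    prediction error is large AND x is far from all existing centers."""
    error = abs(y_true - y_pred)
    nearest = (min(np.linalg.norm(x - c) for c in centers)
               if len(centers) else np.inf)
    return error > err_thresh and nearest > dist_thresh
```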

References

  1. Pollock DSG (1999) A handbook of time series analysis, signal processing and dynamics. Academic Press, San Diego, California

  2. Box GEP, Jenkins GM, Reinsel GC (1994) Time series analysis: forecasting and control, 3rd edn. Prentice Hall, Englewood Cliffs, New Jersey

  3. Platt J (1991) A resource-allocating network for function interpolation. Neural Comput 3(2):213-225

  4. Salmerón-Campos M (2001) Predicción de series temporales con redes neuronales de funciones radiales y técnicas de descomposición matricial. PhD thesis, Departamento de Arquitectura y Tecnología de Computadores, University of Granada

  5. Salmerón M, Ortega J, Puntonet CG, Prieto A (2001) Improved RAN sequential prediction using orthogonal techniques. Neurocomputing 41:153-172

  6. Masters T (1995) Neural, novel and hybrid algorithms for time series prediction. Wiley, New York

  7. Back AD, Weigend AS (1997) Discovering structure in finance using independent component analysis. In: Proceedings of the 5th international conference on neural networks in the capital markets (Computational finance 1997), London, December 1997

  8. Back AD, Trappenberg TP (2001) Selecting inputs for modelling using normalized higher order statistics and independent component analysis. IEEE Trans Neural Networ 12(3):612-617

  9. Hyvärinen A, Oja E (2000) Independent component analysis: algorithms and applications. Neural Networks 13:411-430

  10. Bell AJ, Sejnowski TJ (1995) An information-maximization approach to blind separation and blind deconvolution. Neural Comput 7:1129-1159

  11. Comon P (1994) Independent component analysis: a new concept. Signal Process 36:287-314

  12. Amari S, Cichocki A, Yang HH (1996) A new learning algorithm for blind source separation. In: Advances in neural information processing systems 8. MIT Press, Cambridge, Massachusetts, pp 757-763

  13. Puntonet CG (1994) Nuevos algoritmos de separación de fuentes en medios lineales. PhD thesis, Departamento de Arquitectura y Tecnología de Computadores, University of Granada

  14. Theis FJ, Jung A, Puntonet C, Lang EW (2003) Linear geometric ICA: fundamentals and algorithms. Neural Comput 15(2):419-439

  15. Puntonet CG, Mansour A, Ohnishi N (2002) Blind multiuser separation of instantaneous mixture algorithm based on geometrical concepts. Signal Process 82(8):1155-1175

  16. Puntonet CG, Mansour A (2001) Blind separation of sources using density estimation and simulated annealing. IEICE Trans Fund Electr E84-A:2539-2547

  17. Rodríguez-Álvarez M, Puntonet CG, Rojas I (2001) Separation of sources based on the partitioning of the space of observations. Lect Notes Comput Sci 2085:762-769

  18. Górriz Sáez JM (2003) Algoritmos híbridos para la modelización de series temporales con técnicas AR-ICA. PhD thesis, Departamento de Ingeniería de Sistemas y Automática, Tecnología Electrónica y Electrónica, University of Cádiz

  19. Back AD, Weigend AS (1997) Discovering structure in finance using independent component analysis. In: Proceedings of the 5th international conference on neural networks in the capital markets (Computational finance 1997), London, December 1997

  20. Moody J, Darken CJ (1989) Fast learning in networks of locally-tuned processing units. Neural Comput 1:284-294

  21. Hastie T, Tibshirani R, Friedman J (2000) The elements of statistical learning. Springer, Berlin Heidelberg New York

  22. Michalewicz Z (1992) Genetic algorithms + data structures = evolution programs. Springer, Berlin Heidelberg New York

  23. Szapiro T, Matwin S, Haigh K (1991) Genetic algorithms approach to a negotiation support system. IEEE Trans Syst Man Cybern 21:102-114

  24. Chen S, Wu Y (1997) Genetic algorithm optimization for blind channel identification with higher order cumulant fitting. IEEE Trans Evolut Comput 1:259-264

  25. Chao L, Sethares W (1994) Nonlinear parameter estimation via the genetic algorithm. IEEE Trans Signal Proces 42:927-935

  26. Häggström O (1998) Finite Markov chains and algorithmic applications. Cambridge University Press, Cambridge, UK

  27. Schmitt LM, Nehaniv CL, Fujii RH (1998) Linear analysis of genetic algorithms. Theor Comput Sci 200:101-134

  28. Suzuki J (1995) A Markov chain analysis on simple genetic algorithms. IEEE Trans Syst Man Cybern 25:655-659

  29. Eiben AE, Aarts EHL, Van Hee KM (1991) Global convergence of genetic algorithms: a Markov chain analysis. In: Parallel problem solving from nature. Lect Notes Comput Sci 496:4-12

  30. Schmitt LM (2001) Theory of genetic algorithms. Theor Comput Sci 259:1-61

  31. Lozano JA, Larrañaga P, Graña M, Albizuri FX (1999) Genetic algorithms: bridging the convergence gap. Theor Comput Sci 229:11-22

  32. Rudolph G (1994) Convergence analysis of canonical genetic algorithms. IEEE Trans Neural Networ 5:96-101

  33. Vapnik V (1998) Statistical learning theory. Wiley, New York

  34. Tikhonov AN, Arsenin VY (1977) Solutions of ill-posed problems. Winston, Washington, pp 415-438

  35. Vapnik V, Chervonenkis A (1974) Theory of pattern recognition (in Russian). Nauka, Moscow

  36. Muller KR, Smola AJ, Ratsch G, Scholkopf B, Kohlmorgen J (1999) Using support vector machines for time series prediction. In: Scholkopf B, Burges CJC, Smola AJ (eds) Advances in kernel methods-support vector learning. MIT Press, Cambridge, Massachusetts, pp 243-254

  37. Muller KR, Smola AJ, Ratsch G, Scholkopf B, Kohlmorgen J, Vapnik V (1997) Predicting time series with support vector machines. In: Proceedings of the 7th international conference on artificial neural networks (ICANN'97), Lausanne, Switzerland, May 1997, pp 999-1004

  38. Smola AJ, Scholkopf B, Muller KR (1998) The connection between regularization operators and support vector kernels. Neural Networks 11:637-649

  39. Muller KR, Mika S, Ratsch G, Tsuda K, Scholkopf B (2001) An introduction to kernel-based learning algorithms. IEEE Trans Neural Networ 12(2):181-201

  40. Vapnik V, Lerner A (1963) Pattern recognition using generalized portrait method. Automat Rem Contr 24:774-780

  41. Kuhn HW, Tucker AW (1951) Nonlinear programming. In: Proceedings of the 2nd Berkeley symposium on mathematical statistics and probability. University of California Press, pp 481-492

  42. Kohonen T (1990) The self-organizing map. Proc IEEE 78(9):1464-1480

  43. Cao L (2003) Support vector machines experts for time series forecasting. Neurocomputing 51:321-339

Author information

Correspondence to J. M. Górriz.

Cite this article

Górriz, J.M., Puntonet, C.G., Salmerón, M. et al. A new model for time-series forecasting using radial basis functions and exogenous data. Neural Comput & Applic 13, 101–111 (2004). https://doi.org/10.1007/s00521-004-0412-5
