Abstract
Metamodels can speed up the optimization process. Previously evaluated designs can be used as a training set for building surrogate models, on which an inexpensive virtual optimization can subsequently be performed. Candidate solutions found in this way need to be validated, i.e., evaluated by means of the real solver.
This process can be iterated automatically: this is the rationale behind fast optimization algorithms. At each iteration the newly evaluated designs enrich the training database, allowing more and more accurate metamodels to be built in an adaptive way.
In this paper a novel scheme for fast optimizers is introduced: the virtual optimization, representing an exploitation process, is accompanied by a virtual run of a suitable space-filling algorithm, serving exploration purposes, which increases the robustness of the fast optimizer.
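The adaptive loop described above can be sketched in a few lines. This is a minimal, single-variable illustration, not the paper's actual algorithm: the quadratic `expensive_solver`, the least-squares quadratic metamodel, the candidate grid, and the max-min-distance space-filling rule are all stand-in assumptions, and the names (`fast_optimize`, `quad_surrogate`, `solve3`) are hypothetical.

```python
def expensive_solver(x):
    # Stand-in for the costly real solver: 1-D quadratic, optimum at x = 0.3
    return (x - 0.3) ** 2

def solve3(A, b):
    # Cramer's rule for a 3x3 linear system (normal equations below)
    def det(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
              - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
              + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    sol = []
    for j in range(3):
        M = [row[:] for row in A]
        for i in range(3):
            M[i][j] = b[i]
        sol.append(det(M) / D)
    return sol

def quad_surrogate(archive):
    # Least-squares quadratic metamodel y ~ c0 + c1*x + c2*x^2,
    # trained on the previously evaluated designs (the training database)
    S = [sum(x ** k for x, _ in archive) for k in range(5)]
    T = [sum((x ** k) * y for x, y in archive) for k in range(3)]
    A = [[S[0], S[1], S[2]], [S[1], S[2], S[3]], [S[2], S[3], S[4]]]
    c0, c1, c2 = solve3(A, T)
    return lambda x: c0 + c1 * x + c2 * x * x

def fast_optimize(n_iters=5):
    # Initial training database: three designs evaluated by the real solver
    archive = [(x, expensive_solver(x)) for x in (0.0, 0.5, 1.0)]
    grid = [i / 200 for i in range(201)]   # cheap virtual candidate set
    for _ in range(n_iters):
        model = quad_surrogate(archive)
        # Exploitation: virtual optimization on the metamodel
        x_exploit = min(grid, key=model)
        # Exploration: virtual space-filling step (max-min distance criterion)
        x_explore = max(grid, key=lambda x: min(abs(x - xi) for xi, _ in archive))
        # Validation: candidates are evaluated by the real solver and
        # enrich the training database for the next, more accurate metamodel
        for x in (x_exploit, x_explore):
            if all(abs(x - xi) > 1e-9 for xi, _ in archive):
                archive.append((x, expensive_solver(x)))
    return min(archive, key=lambda p: p[1])   # best validated design
```

In this toy setting the exploitation step drives the search toward the true optimum, while the exploration step keeps spreading validated designs over the whole domain, so the metamodel stays trustworthy away from the current best; this is the exploitation/exploration trade-off the scheme is built around.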
Copyright information
© 2010 Springer-Verlag Berlin Heidelberg
Cite this paper
Rigoni, E., Turco, A. (2010). Metamodels for Fast Multi-objective Optimization: Trading Off Global Exploration and Local Exploitation. In: Deb, K., et al. Simulated Evolution and Learning. SEAL 2010. Lecture Notes in Computer Science, vol 6457. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-17298-4_56
Print ISBN: 978-3-642-17297-7
Online ISBN: 978-3-642-17298-4