Abstract
Algorithms based on Nested Generalized Exemplar (NGE) theory [10] classify new data points by computing their distance to the nearest "generalized exemplar", i.e., an axis-parallel multidimensional rectangle (hyperrectangle). An improved version of NGE, called BNGE, has previously been shown to perform comparably to the Nearest Neighbor algorithm. Advantages of the NGE approach include a compact representation of the training data and fast training and classification. This paper introduces KBNGE, a hybrid method that combines BNGE with the k-Nearest Neighbor algorithm for improved classification accuracy. Results from eleven domains show that KBNGE achieves generalization accuracies similar to those of the k-Nearest Neighbor algorithm at improved classification speed. KBNGE is a fast, easy-to-use inductive learning algorithm that gives very accurate predictions in a variety of domains and represents the learned knowledge in a form that the user can readily interpret.
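The core classification rule described above can be sketched briefly: the distance from a query point to an axis-parallel hyperrectangle is zero if the point lies inside it, and otherwise is measured to the rectangle's closest face. The following is a minimal illustration assuming an unweighted Euclidean distance and hypothetical 2-D exemplars; it is not the exact weighted distance measure of NGE/BNGE.

```python
import math

def rect_distance(x, lower, upper):
    """Distance from point x to an axis-parallel hyperrectangle
    given by per-dimension (lower, upper) bounds.
    Zero if x lies inside the rectangle."""
    total = 0.0
    for xi, lo, hi in zip(x, lower, upper):
        d = max(lo - xi, xi - hi, 0.0)  # per-dimension overshoot, 0 inside
        total += d * d
    return math.sqrt(total)

def classify(x, exemplars):
    """exemplars: list of (lower, upper, label).
    Returns the label of the nearest generalized exemplar."""
    return min(exemplars, key=lambda e: rect_distance(x, e[0], e[1]))[2]

# Two generalized exemplars in 2-D (hypothetical data):
exemplars = [
    ((0.0, 0.0), (1.0, 1.0), "A"),  # unit square, class A
    ((2.0, 2.0), (3.0, 3.0), "B"),  # square near (2.5, 2.5), class B
]
print(classify((0.5, 0.5), exemplars))  # inside A's rectangle -> "A"
print(classify((2.4, 1.9), exemplars))  # closest face is B's -> "B"
```

A point falling inside several overlapping rectangles, or far from all of them, is where the hybrid comes in: in such unreliable regions KBNGE falls back on k-Nearest Neighbor voting over the stored points.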
References
Aha, D.W.: A Study of Instance-Based Algorithms for Supervised Learning Tasks. Technical Report, University of California, Irvine (1990)
Carpenter, G.A., Grossberg, S., Markuzon, N., Reynolds, J.H., Rosen, D.B.: Fuzzy ARTMAP: A Neural Network Architecture for Incremental Supervised Learning of Analog Multidimensional Maps. IEEE Transactions on Neural Networks 3 (1992) 698–713
Dasarathy, B.V.: Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press (1991)
Detrano, R., Janosi, A., Steinbrunn, W., Pfisterer, M., Schmid, K., Sandhu, S., Guppy, K., Lee, S., Froelicher, V.: International application of a new probability algorithm for the diagnosis of coronary artery disease. American Journal of Cardiology 64 (1989) 304–310
Friedman, J.H., Bentley, J.L., Finkel, R.A.: An Algorithm for Finding Best Matches in Logarithmic Expected Time. ACM Transactions on Mathematical Software 3 (1977) 209–226
Murphy, P.M., Aha, D.W.: UCI Repository of machine learning databases [Machine-readable data repository]. Technical Report, University of California, Irvine (1991)
Omohundro, S.M.: Five Balltree Construction Algorithms. Technical Report, International Computer Science Institute, Berkeley, CA (1989)
Omohundro, S.M.: Best-First Model Merging for Dynamic Learning and Recognition. In: Advances in Neural Information Processing Systems 4. San Mateo, CA: Morgan Kaufmann (1992) 958–965
Parzen, E.: On estimation of a probability density function and mode. Annals of Mathematical Statistics 33 (1962) 1065–1076
Salzberg, S.: A Nearest Hyperrectangle Learning Method. Machine Learning 6 (1991) 277–309
Schaffer, C.: Overfitting Avoidance as Bias. Machine Learning 10 (1993) 153–178
Simpson, P.K.: Fuzzy min-max neural networks: 1. Classification. IEEE Transactions on Neural Networks 3 (1992) 776–786
Weiss, S.M., Kulikowski, C.A.: Computer Systems That Learn. San Mateo, CA: Morgan Kaufmann (1991)
© 1994 Springer-Verlag Berlin Heidelberg
Cite this paper
Wettschereck, D. (1994). A hybrid nearest-neighbor and nearest-hyperrectangle algorithm. In: Bergadano, F., De Raedt, L. (eds) Machine Learning: ECML-94. ECML 1994. Lecture Notes in Computer Science, vol 784. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-57868-4_67
Print ISBN: 978-3-540-57868-0
Online ISBN: 978-3-540-48365-6