[1]
I. T. Jolliffe, Principal Component Analysis, Springer, New York (1989).
[2]
T. Cox, M. Cox, Multidimensional Scaling, Chapman & Hall, London (1994).
[3]
X. He, P. Niyogi, Locality preserving projections, in Proceedings of the Annual Conference on Neural Information Processing Systems 16, Vancouver, Canada (2003).
[4]
J. B. Tenenbaum, V. de Silva, J. C. Langford, A global geometric framework for nonlinear dimensionality reduction, Science 290: 2319-2323 (2000).
DOI: 10.1126/science.290.5500.2319
[5]
S. T. Roweis, L. K. Saul, Nonlinear dimensionality reduction by locally linear embedding, Science 290: 2323-2326 (2000).
DOI: 10.1126/science.290.5500.2323
[6]
M. Belkin, P. Niyogi, Laplacian eigenmaps and spectral techniques for embedding and clustering, in Proceedings of the Annual Conference on Neural Information Processing Systems 15, Vancouver, British Columbia, Canada (2001).
[7]
L. Yang, Building k edge-disjoint spanning trees of minimum total length for isometric data embedding, IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(10): 1680-1683 (2005).
DOI: 10.1109/TPAMI.2005.192
[8]
Y. Pan, et al., Weighted locally linear embedding for dimension reduction, Pattern Recognition, 42(5): 798-811 (2009).
DOI: 10.1016/j.patcog.2008.08.024
[9]
O. Kouropteva, O. Okun, A. Hadid, et al., Beyond locally linear embedding algorithm, Technical Report MVG, University of Oulu (2002).
[10]
L. K. Saul, S. T. Roweis, Think globally, fit locally: unsupervised learning of nonlinear manifolds, Journal of Machine Learning Research 4: 119-155 (2003).
[11]
O. Kouropteva, O. Okun, M. Pietikäinen, Incremental locally linear embedding, Pattern Recognition, 38(10): 1764-1767 (2005).
DOI: 10.1016/j.patcog.2005.04.006
[12]
http://www.cs.toronto.edu/~roweis/data.html
[13]
http://archive.ics.uci.edu/ml/machine-learning-databases/iris