Abstract
Principal Component Analysis (PCA) (Jolliffe, Principal component analysis. Springer, 2011) is a fundamental and widely used linear method for subspace learning and dimensionality reduction (Friedman et al., The elements of statistical learning. Vol. 2. Springer Series in Statistics, 2009). The method, which is also used for feature extraction, was first proposed by Pearson in 1901 (Pearson, LIII. Lond Edinb Dublin Philos Mag J Sci 2, 559–572, 1901).
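As an illustration of the method the abstract describes, the following is a minimal sketch of PCA via eigendecomposition of the covariance matrix, using NumPy. The data, dimensionality, and number of retained components are hypothetical, chosen only for demonstration:

```python
import numpy as np

# Toy data: 100 samples in 5 dimensions (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))

# Center the data, since PCA operates on the mean-removed samples.
X_centered = X - X.mean(axis=0)

# Covariance matrix and its eigendecomposition.
cov = np.cov(X_centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)

# Sort eigenvectors by descending eigenvalue and keep the top two
# principal components as the projection matrix W.
order = np.argsort(eigvals)[::-1]
W = eigvecs[:, order[:2]]

# Project the data onto the learned 2-dimensional subspace.
X_projected = X_centered @ W
print(X_projected.shape)  # (100, 2)
```

The projection directions in `W` are orthonormal, which is the defining property of the PCA subspace basis.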
Notes
- 1.
If there are c classes, the one-hot encoding of the j-th class is a binary vector whose entries are all zero except the j-th, which is one.
- 2.
The Haar wavelet is a family of square-shaped filters that form a wavelet basis.
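The two notes above can be sketched in code. The class count, class index, and filter normalization below are hypothetical illustrations, not taken from the chapter:

```python
import numpy as np

def one_hot(j, c):
    """One-hot encoding of the j-th class (0-indexed) among c classes."""
    v = np.zeros(c, dtype=int)
    v[j] = 1
    return v

print(one_hot(2, 5))  # [0 0 1 0 0]

# The simplest Haar filters: a square-shaped low-pass (averaging)
# and high-pass (differencing) pair, normalized to unit energy.
haar_low = np.array([1.0, 1.0]) / np.sqrt(2)
haar_high = np.array([1.0, -1.0]) / np.sqrt(2)
```

The two Haar filters are orthogonal to each other, which is what lets shifted and scaled copies of them form a wavelet basis.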
References
Hervé Abdi and Lynne J Williams. “Principal component analysis”. In: Wiley interdisciplinary reviews: computational statistics 2.4 (2010), pp. 433–459.
Jonathan L Alperin. Local representation theory: Modular representations as an introduction to the local representation theory of finite groups. Vol. 11. Cambridge University Press, 1993.
Eric Bair et al. “Prediction by supervised principal components”. In: Journal of the American Statistical Association 101.473 (2006), pp. 119–137.
Elnaz Barshan et al. “Supervised principal component analysis: Visualization, classification and regression on subspaces and submanifolds”. In: Pattern Recognition 44.7 (2011), pp. 1357–1371.
Mikhail Belkin and Partha Niyogi. “Laplacian eigenmaps for dimensionality reduction and data representation”. In: Neural computation 15.6 (2003), pp. 1373–1396.
Raymond B Cattell. “The scree test for the number of factors”. In: Multivariate behavioral research 1.2 (1966), pp. 245–276.
Michael AA Cox and Trevor F Cox. “Multidimensional scaling”. In: Handbook of data visualization. Springer, 2008, pp. 315–347.
David L Donoho. “High-dimensional data analysis: The curses and blessings of dimensionality”. In: AMS math challenges lecture (2000).
Susan T Dumais. “Latent semantic analysis”. In: Annual review of information science and technology 38.1 (2004), pp. 188–230.
Brendan Frey. Frey face dataset. https://cs.nyu.edu/~roweis/data.html. Accessed: June 2017.
Jerome Friedman, Trevor Hastie, and Robert Tibshirani. The elements of statistical learning. Vol. 2. Springer Series in Statistics. New York, NY: Springer, 2009.
Ali Ghodsi. “Dimensionality reduction: a short tutorial”. In: Department of Statistics and Actuarial Science, Univ. of Waterloo, Ontario, Canada 37 (2006).
Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep learning. MIT Press, 2016.
Arthur Gretton et al. “Measuring statistical dependence with Hilbert-Schmidt norms”. In: International conference on algorithmic learning theory. Springer. 2005, pp. 63–77.
Ji Hun Ham et al. “A kernel view of the dimensionality reduction of manifolds”. In: International Conference on Machine Learning. 2004.
Thomas Hofmann, Bernhard Schölkopf, and Alexander J Smola. “Kernel methods in machine learning”. In: The annals of statistics (2008), pp. 1171–1220.
Ian Jolliffe. Principal component analysis. Springer, 2011.
Shuangge Ma and Ying Dai. “Principal component analysis based methods in bioinformatics studies”. In: Briefings in bioinformatics 12.6 (2011), pp. 714–722.
Thomas P Minka. “Automatic choice of dimensionality for PCA”. In: Advances in neural information processing systems. 2001, pp. 598–604.
Hoda Mohammadzade et al. “Critical object recognition in millimeter-wave images with robustness to rotation and scale”. In: JOSA A 34.6 (2017), pp. 846–855.
Karl Pearson. “LIII. On lines and planes of closest fit to systems of points in space”. In: The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science 2.11 (1901), pp. 559–572.
Sam T Roweis and Lawrence K Saul. “Nonlinear dimensionality reduction by locally linear embedding”. In: Science 290.5500 (2000), pp. 2323–2326.
David E Rumelhart, Geoffrey E Hinton, and Ronald J Williams. “Learning representations by back-propagating errors”. In: Nature 323.6088 (1986), pp. 533–536.
Bernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. “Kernel principal component analysis”. In: International conference on artificial neural networks. Springer. 1997, pp. 583–588.
Bernhard Schölkopf, Alexander Smola, and Klaus-Robert Müller. “Nonlinear component analysis as a kernel eigenvalue problem”. In: Neural computation 10.5 (1998), pp. 1299–1319.
Radomir S Stanković and Bogdan J Falkowski. “The Haar wavelet transform: its status and achievements”. In: Computers & Electrical Engineering 29.1 (2003), pp. 25–44.
Harry Strange and Reyer Zwiggelaar. Open Problems in Spectral Dimensionality Reduction. Springer, 2014.
Joshua B Tenenbaum, Vin De Silva, and John C Langford. “A global geometric framework for nonlinear dimensionality reduction”. In: Science 290.5500 (2000), pp. 2319–2323.
The Digital Technology Group. AT&T Laboratories Cambridge. https://www.cl.cam.ac.uk/research/dtg/attarchive/facedatabase.html. Accessed: June 2017. 1994.
Matthew Turk and Alex Pentland. “Eigenfaces for recognition”. In: Journal of cognitive neuroscience 3.1 (1991), pp. 71–86.
Matthew A Turk and Alex P Pentland. “Face recognition using eigenfaces”. In: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’91). IEEE, 1991, pp. 586–591.
Yi-Qing Wang. “An analysis of the Viola-Jones face detection algorithm”. In: Image Processing On Line 4 (2014), pp. 128–148.
M-H Yang, Narendra Ahuja, and David Kriegman. “Face recognition using kernel eigenfaces”. In: Proceedings of the 2000 International Conference on Image Processing. Vol. 1. IEEE, 2000, pp. 37–40.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this chapter
Cite this chapter
Ghojogh, B., Crowley, M., Karray, F., Ghodsi, A. (2023). Principal Component Analysis. In: Elements of Dimensionality Reduction and Manifold Learning. Springer, Cham. https://doi.org/10.1007/978-3-031-10602-6_5
DOI: https://doi.org/10.1007/978-3-031-10602-6_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-10601-9
Online ISBN: 978-3-031-10602-6
eBook Packages: Computer Science (R0)