Principal Components Analysis (PCA)
Principal Components Analysis (PCA) is an unsupervised method used primarily for dimensionality reduction in machine learning. PCA is computed either via a singular value decomposition (SVD) of the (centered) design matrix or, equivalently, by forming the covariance matrix of the data and performing an eigenvalue decomposition on it. The results of PCA provide a low-dimensional view of the structure of the data and identify the leading (uncorrelated) latent factors that account for the variation in the data.
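The equivalence of the two computations can be illustrated with a minimal NumPy sketch; the random example data and variable names below are illustrative only, not tied to any particular library API.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))          # design matrix: 200 samples, 5 features
Xc = X - X.mean(axis=0)                # center each feature

# Route 1: SVD of the centered design matrix.
# Columns of Vt.T are the principal directions; singular values give the variances.
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained_variance_svd = S**2 / (Xc.shape[0] - 1)
scores_svd = Xc @ Vt.T                 # projection of the data onto the components

# Route 2: eigendecomposition of the covariance matrix.
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov) # eigh: the covariance matrix is symmetric
order = np.argsort(eigvals)[::-1]      # sort components by decreasing variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The two routes agree up to the sign of each principal direction.
assert np.allclose(explained_variance_svd, eigvals)
assert np.allclose(np.abs(Vt.T), np.abs(eigvecs))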
Related topics: Face Recognition, Dimensionality Reduction, LDA, ICA, KPCA, SVM, Fault Detection, ML, 2DPCA, CV
Notable Researchers
Yoshua Bengio: 429868 citations, 1063 papers
Robert Tibshirani: 278725 citations, 644 papers
Jian Sun: 179895 citations, 332 papers
Yann LeCun: 175383 citations, 366 papers
Trevor Hastie: 173966 citations, 454 papers
Michael I. Jordan: 150356 citations, 1056 papers
Anil K. Jain: 148144 citations, 1055 papers
John C. Morris: 146512 citations, 1902 papers
Terrence J. Sejnowski: 134448 citations, 931 papers
Stephen M. Smith: 128866 citations, 574 papers