performance - PCA calculation using SVD vs EIG
PCA can be calculated using either SVD or EIG (eigendecomposition of the covariance matrix). SVD is considered more numerically stable, and it seems to be the method used more often in mature machine learning projects.
So I am looking for a comparison of the two methods in terms of memory usage and performance, and an explanation of why one of them is more numerically stable.
I have heard of the QR method and the Jacobi rotations method for computing the SVD, but I don't know their properties.
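For concreteness, here is a minimal sketch (using NumPy, which is an assumption on my part, not something specified in the question) of the two routes being compared: eigendecomposition of the explicitly formed covariance matrix versus SVD of the centered data matrix itself. Both should recover the same principal directions up to sign.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))        # rows = samples, columns = features
Xc = X - X.mean(axis=0)              # center the data
n = Xc.shape[0]

# Route 1: eigendecomposition of the covariance matrix
cov = (Xc.T @ Xc) / (n - 1)
eigvals, eigvecs = np.linalg.eigh(cov)    # eigh: covariance is symmetric
order = np.argsort(eigvals)[::-1]         # sort by descending variance
eig_components = eigvecs[:, order]

# Route 2: SVD of the centered data matrix, no covariance matrix formed
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
svd_components = Vt.T                     # right singular vectors
svd_variances = s**2 / (n - 1)            # equal to the eigenvalues above

# The two bases agree up to the sign of each component
print(np.allclose(np.abs(eig_components), np.abs(svd_components)))
print(np.allclose(eigvals[order], svd_variances))
```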
I don't have all of the answers for you, but if you want a comparison, I would suggest comparing the Schur decomposition with the SVD (see http://www.ijcaonline.org/icvci/number14/icvci1529.pdf for an illustration), since I believe the Schur decomposition is the way people actually compute eigenvalues.
In my work, I prefer to use the singular value decomposition, if for no other reason than the fact that eigenvalues can be complex numbers, whereas singular values are always real. Of course, the SVD does have a higher computational cost.
I have also heard that the SVD is more accurate, though admittedly I do not know why. I think that when people say the SVD is more accurate, they mean that svd(A) yields more accurate singular values than eig(A^T * A), which is true. Note that A^T denotes the transpose of the matrix A.
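A small illustration of that last point, under my own choice of example (a Läuchli-style matrix, not something from the question): forming A^T * A squares the singular values, so a value near the square root of machine epsilon is lost in the cross-product but preserved by the SVD of A directly.

```python
import numpy as np

# The smaller singular value is eps, but eps**2 vanishes relative to 1
# when the cross-product A^T A is formed explicitly in double precision.
eps = 1e-8
A = np.array([[1.0, 1.0],
              [eps, 0.0],
              [0.0, eps]])

# Exact singular values are sqrt(2 + eps**2) and eps.
svd_vals = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of A^T A are the squared singular values; tiny negative
# rounding errors are clipped before taking the square root.
eig_vals = np.linalg.eigvalsh(A.T @ A)
eig_vals = np.sqrt(np.clip(eig_vals, 0.0, None))[::-1]

print(svd_vals)   # ~[1.414e+00, 1.000e-08]  -> small singular value preserved
print(eig_vals)   # ~[1.414e+00, 0.000e+00]  -> small singular value lost
```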
Tags: performance, memory, machine-learning, linear-algebra, pca