Zscore and solving by definition differ

15 views (last 30 days)
Martin Vasilkovski on 18 Oct 2018
I am trying to recreate some steps of finding eigenvalues and eigenvectors from a data set. The one I used is MATLAB's cities data set. The code is:
clear all
close all
load cities
A = zeros(329,9);
L = zeros(9,9);
V = zeros(9,9);
% normalization (zero mean, unit standard deviation per column)
for i = 1:1:9
    means(i) = sum(ratings(:,i))/size(ratings,1);
    stdev(i) = std(ratings(:,i));
end
for i = 1:1:9
    for j = 1:1:329
        A(j,i) = (ratings(j,i)-means(i))/stdev(i);
    end
end
% covariance matrix
C = cov(A);
% eigenvalues and eigenvectors (eig returns them in ascending order)
[vectors,lambda] = eig(C);
% sorting into descending order
variances_byhand = sort(diag(lambda),'descend'); % variances, largest first
for k = 1:1:9
    V(:,k) = vectors(:,10-k);
end
L = flip(flip(lambda,2),1);
% the same pipeline via zscore and pca
Z = zscore(ratings);
[coefs,scores,variances,t2] = pca(Z);
Outputs:
V =
0.2064 0.2178 -0.6900 0.1373 -0.3691 0.3746 -0.0847 0.3623 -0.0014
0.3565 0.2506 -0.2082 0.5118 0.2335 -0.1416 -0.2306 -0.6139 -0.0136
0.4602 -0.2995 -0.0073 0.0147 -0.1032 -0.3738 0.0139 0.1857 0.7164
0.2813 0.3553 0.1851 -0.5391 -0.5239 0.0809 0.0186 -0.4300 0.0586
0.3512 -0.1796 0.1464 -0.3029 0.4043 0.4676 -0.5834 0.0936 -0.0036
0.2753 -0.4834 0.2297 0.3354 -0.2088 0.5022 0.4262 -0.1887 -0.1108
0.4631 -0.1948 -0.0265 -0.1011 -0.1051 -0.4619 -0.0215 0.2040 -0.6858
0.3279 0.3845 -0.0509 -0.1898 0.5295 0.0899 0.6279 0.1506 0.0255
0.1354 0.4713 0.6073 0.4218 -0.1596 0.0326 -0.1497 0.4048 -0.0004
coefs =
0.2064 -0.2178 0.6900 -0.1373 -0.3691 0.3746 -0.0847 -0.3623 -0.0014
0.3565 -0.2506 0.2082 -0.5118 0.2335 -0.1416 -0.2306 0.6139 -0.0136
0.4602 0.2995 0.0073 -0.0147 -0.1032 -0.3738 0.0139 -0.1857 0.7164
0.2813 -0.3553 -0.1851 0.5391 -0.5239 0.0809 0.0186 0.4300 0.0586
0.3512 0.1796 -0.1464 0.3029 0.4043 0.4676 -0.5834 -0.0936 -0.0036
0.2753 0.4834 -0.2297 -0.3354 -0.2088 0.5022 0.4262 0.1887 -0.1108
0.4631 0.1948 0.0265 0.1011 -0.1051 -0.4619 -0.0215 -0.2040 -0.6858
0.3279 -0.3845 0.0509 0.1898 0.5295 0.0899 0.6279 -0.1506 0.0255
0.1354 -0.4713 -0.6073 -0.4218 -0.1596 0.0326 -0.1497 -0.4048 -0.0004
Mathematically, they should be the same. Some of the eigenvectors are exactly equal, but some of them (columns 2, 3, 4, and 8) are negated (multiplied by -1). What do you think is happening? I need to find the reason and source for this.

Accepted Answer

John D'Errico on 18 Oct 2018
Edited: John D'Errico on 18 Oct 2018
Source? Basic mathematics. An eigenvector is NOT unique.
If A is a square matrix with eigenvalue lambda and eigenvector X, then we know that
A*X = lambda*X
But then we can multiply both sides by ANY scalar k:
k*A*X = k*lambda*X
Since scalar multiplication will commute here, we see that
A*(k*X) = lambda*(k*X)
Therefore k*X is ALSO an eigenvector. What did you find? k=-1. SURPRISE!
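A minimal numeric sketch of the same point, using an arbitrary 2-by-2 symmetric matrix rather than the cities data:
% If A*x = lambda*x, then the same relation holds for -x.
A = [2 1; 1 3];                % any small symmetric matrix will do
[X, D] = eig(A);
x = X(:,1);                    % one eigenvector
lambda = D(1,1);               % its eigenvalue
norm(A*x - lambda*x)           % ~0
norm(A*(-x) - lambda*(-x))     % also ~0, so -x is an eigenvector too
Both residuals are zero up to round-off, so -x satisfies the eigenvalue equation just as well as x.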
Some of your eigenvectors are the same; others are essentially the same, but with a sign flip. Still the same. Since the eigenvectors are only normalized to have unit norm, an arbitrary factor of -1 can still be applied to any eigenvector.
Finally, in cases where an eigenvalue has multiplicity higher than 1, the corresponding eigenvectors are even less unique: any basis of that eigenspace will do. But that is not your problem here.
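For the cities example itself, a short check along these lines (assuming the V and coefs matrices from the question's code are still in the workspace) shows that every column agrees up to a sign flip:
% Compare the hand-computed eigenvectors V with pca's coefs, allowing
% each column an arbitrary factor of -1.
signs = sign(sum(V .* coefs));                  % +1 where a column matches, -1 where it is flipped
maxdiff = max(max(abs(V - coefs*diag(signs))))  % should be ~0, up to floating-point round-off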
1 Comment
Martin Vasilkovski on 18 Oct 2018
Of course it is not unique, but they were too similar (and with exactly the same eigenvalues) not to be a little suspicious about it. And as you said, I completely missed the possibility of scalar multiplication; a factor of -1 explains it. Thank you.
