I need to analyze data from a spreadsheet file in MATLAB. Since the file is large, MATLAB takes a very long time to process it. What can I do to make it more efficient?

My code reads particular rows from an Excel file inside a for loop and puts them into matrices. I need to operate on these matrices, but MATLAB takes forever to build them. Here is what I'm currently using:
n = 0;
A = zeros(N1, 40);
B = zeros(N1, 40);
C = zeros(N1, 40);
for m = start : N : fin
    n = n + 1;
    % read row m (columns B:AO) into A, row m+1 into B
    section1 = ['B', num2str(m), ':AO', num2str(m)];
    A(n,:) = xlsread(fileName, section1);
    section2 = ['B', num2str(m+1), ':AO', num2str(m+1)];
    B(n,:) = xlsread(fileName, section2);
    % running sum of the element-wise differences
    C(n,1) = A(n,1) - B(n,1);
    for k = 2:40
        C(n,k) = C(n,k-1) + (A(n,k) - B(n,k));
    end
end
N1 is a large number, so the matrices are big. Is there a more efficient way to do this?

Accepted Answer

José-Luis on 16 Dec 2016
Ouch!
Don't call xlsread inside a loop. As far as I can see, you are opening the same file on every iteration, which is what makes it so slow. Load the file once and index into the result instead.
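A minimal sketch of that idea, assuming the whole sheet fits in memory and that spreadsheet row m corresponds to row m of the returned matrix (if the sheet has header rows or leading non-numeric rows, the indices need an offset). The inner loop over k is a running sum of differences, so it also collapses to a single cumsum:

raw = readmatrix(fileName);   % one read for the whole sheet
                              % (readmatrix is the modern replacement
                              % for xlsread; xlsread(fileName) works too)

rowsA = start : N : fin;      % spreadsheet rows that feed A
rowsB = rowsA + 1;            % the row below each, feeding B
cols  = 2:41;                 % columns B through AO (40 columns)

A = raw(rowsA, cols);
B = raw(rowsB, cols);

% cumulative sum of the differences along each row replaces
% the inner for-k loop entirely
C = cumsum(A - B, 2);

This turns thousands of file opens into one, and the per-row arithmetic into three vectorized operations.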

More Answers (0)
