
My nested for loops take 40 seconds for 20,000 records. Any way to vectorize or improve?

% Consolidate Results by Security and Date
uniqueDate = unique(d.RiskData(:,2));
uniqueSec = unique(d.RiskData(:,1));
numSecs = length(uniqueSec);
numDates = length(uniqueDate);
numFields = size(d.RiskData,2);
%d.RiskDataNew = {};
d.RiskDataNew{(numDates * numSecs) + 1, numFields} = []; % pre-allocate
d.RiskDataNew(1,1) = {'SECURITY'};
d.RiskDataNew(1,2) = {'DATE'};
for i = 1:numFields - 2 % fldList holds the field names for columns 3..numFields
    d.RiskDataNew(1,i + 2) = fldList(i);
end
n = 1;
for x = 1:numSecs
    for y = 1:numDates
        idx = find(contains(d.RiskData(:,1),uniqueSec(x)) & ...
                   contains(d.RiskData(:,2),uniqueDate(y)));
        if ~isempty(idx)
            n = n + 1;
            d.RiskDataNew(n,:) = d.RiskData(idx(1),:); % start from the first matching row
            if length(idx) > 1
                for j = 2:length(idx)
                    for k = 3:numFields % first two fields are defaults: Security and Date
                        if isempty(d.RiskDataNew{n,k})
                            d.RiskDataNew{n,k} = d.RiskData{idx(j),k};
                        end
                    end
                end
            end
        end
    end
end
Walter Roberson on 16 Jun 2020
I have not tested, but I have a suspicion that ismember is faster than contains().
That sub-expression can be calculated at the for x level. You can possibly even do
for x = 1:numSecs
xidx = find(contains(d.RiskData(1:end,1),uniqueSec(x)));
for y = xidx
idx = find(contains(d.RiskData(1:end,2),uniqueDate(y)));
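A sketch of that suggestion (hedged; it assumes, as in the question, that d.RiskData columns 1 and 2 are cellstr security names and dates): computing the security mask once per outer iteration, and using ismember for exact matching instead of contains:

secCol  = d.RiskData(:,1);
dateCol = d.RiskData(:,2);
for x = 1:numSecs
    % logical mask over all rows, computed once per security
    xmask = ismember(secCol, uniqueSec(x));
    for y = 1:numDates
        idx = find(xmask & ismember(dateCol, uniqueDate(y)));
        % ... process idx as before ...
    end
end

ismember does an exact comparison, whereas contains also matches substrings (e.g. security 'IBM' would match 'IBMX'), so this is both faster and usually what is intended.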

Answers (2)

Sayyed Ahmad on 16 Jun 2020
You have to avoid the nested loops.
Maybe you can use bsxfun to avoid some of the loops.
An example:
A = rand(50); % 50-by-50 matrix of random values between 0 and 1
% method 1: slow and lots of lines of code
tic
meanA = mean(A); % mean of every matrix column: a row vector
% pre-allocate result for speed; remove this for even worse performance
result = zeros(size(A));
for j = 1:size(A,1)
    result(j,:) = A(j,:) - meanA;
end
toc
clear result % make sure method 2 creates its own result
% method 2: fast and only one line of code
tic
result = bsxfun(@minus,A,mean(A));
toc
The output would be
Elapsed time is 0.015153 seconds.
Elapsed time is 0.007884 seconds.
See the following links for more details.
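Worth noting: since R2016b, implicit expansion makes bsxfun unnecessary for this pattern; a minimal sketch of the same computation:

A = rand(50);
% the row vector mean(A) is implicitly expanded and subtracted from every row
result = A - mean(A);

This is equivalent to the bsxfun call above and is generally the idiomatic form in current MATLAB.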

Mark McGrath on 16 Jun 2020
Edited: Mark McGrath on 16 Jun 2020
Hello, thanks for some of the responses.
I am trying to consolidate data, and I could use a table with summary grouping, but I need it to be dynamic: I do not know all the fields ahead of time other than security and date (so I can't use that). So I am iterating through to find all fields, distinct securities, and distinct dates. For each security and date I want all fields on one row.
The example below happens to show security and date with 3 extra fields. I want the three extra fields rolled up into one line. BUT, those 3 extra fields still need to be dynamic, as I may have 20 extra fields, depending on the scenario.
testData = {'IBM','01/04/2020',[],'23',1;
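Given data shaped like testData above, one loop-light way to do the consolidation is findgroups plus a per-group merge. This is a sketch, not the asker's code: the extra rows are hypothetical, and it assumes columns 1-2 are security and date and that the first non-empty value per field wins:

% hypothetical sample, same layout as testData above
testData = {'IBM','01/04/2020',[],'23',1;
            'IBM','01/04/2020',0.5,[],[];
            'AAPL','01/04/2020',[],[],7};
% one group number per (security, date) pair
[g, sec, dt] = findgroups(testData(:,1), testData(:,2));
numFields = size(testData,2) - 2;
out = [sec, dt, cell(numel(sec), numFields)]; % one output row per pair
for k = 1:numFields
    col = testData(:,k+2);
    filled = ~cellfun(@isempty, col);
    for gi = 1:numel(sec)
        vals = col(g == gi & filled); % non-empty values in this group
        if ~isempty(vals)
            out{gi,k+2} = vals{1};    % first non-empty value wins
        end
    end
end

The remaining loops run over distinct (security, date) pairs and fields only, not over all raw rows, which is where the original quadratic cost came from.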
dpb on 17 Jun 2020
Are all the "fields" numeric? Then just a cell array would hold them -- but the problem remains: processing a different number of elements per cell will kill about any vector operations, as will a variable number of struct fields or the like.
The most straightforward way, although a little more memory-costly, would be to either define a maximum N and preallocate, or determine the max number in the dataset and allocate an array of that size, using NaN or some other marker for the missing values.
Or, you could use a table or timetable with the necessary number of variable columns -- code could then be written for that scenario as well that would be generic, based on the table size and number of variables/columns.
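A sketch of that table idea with dynamic variable names (the field names and sample row here are hypothetical stand-ins for whatever is discovered at run time):

% hypothetical: field names discovered at run time
fldList = {'Fld1','Fld2','Fld3'};
sample = {'IBM','01/04/2020',[],'23',1};
T = cell2table(sample, 'VariableNames', [{'Security','Date'}, fldList]);
% generic code can then key off width(T) and T.Properties.VariableNames

Because the variable names travel with the table, downstream code never needs to hard-code how many extra fields exist.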
