How to optimize a block Toeplitz matrix?
8 views (last 30 days)
Yamina chbak
on 27 Jan 2025
Commented: John D'Errico on 28 Jan 2025
How can I optimize a block Toeplitz matrix in MATLAB?
I want to create a block lower triangular matrix with block Toeplitz structure, starting from the second column.

In code:
nts = 6; npt = 4;          % example sizes; not specified in the post
block_size1 = nts * npt;

% Construct the block matrices
tp = cell(nts, 1);
tp{1} = zeros(npt);
tp{2} = zeros(npt);
for n = 3:nts
    tp{n} = rand(npt);
end

% Place block tp{row-col+2} at block position (row, col)
A_T = sparse(block_size1, block_size1);
for col = 2:nts
    for row = col:nts
        rowStart = (row - 1) * npt + 1;
        rowEnd   = rowStart + npt - 1;
        colStart = (col - 1) * npt + 1;
        colEnd   = colStart + npt - 1;
        A_T(rowStart:rowEnd, colStart:colEnd) = tp{row - col + 2};
    end
end
We know that MATLAB has a built-in toeplitz function, but nothing comparable for cell arrays of matrices. I want to optimize this code to:
- Minimize computation time.
- Reduce memory usage (if possible).
0 comments
Accepted Answer
John D'Errico
on 27 Jan 2025
Edited: John D'Errico on 27 Jan 2025
The one thing you NEVER, EVER, EVER want to do is stuff the elements into a sparse array iteratively. NEVER DO THAT!!!!!!!!!!!!!!! Regardless, you are doing this the wrong way anyway.
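To make that concrete, here is a minimal sketch (with an arbitrary size and a strictly lower triangular fill pattern) contrasting loop-based assignment with collecting the row indices, column indices, and values first and making a single call to sparse:
n = 500;                              % arbitrary example size
[I, J] = find(tril(true(n), -1));     % target locations: strictly lower triangle
V = rand(size(I));                    % values to place there

% Anti-pattern: indexed assignment into a sparse matrix inside a loop.
% Each assignment can force the internal storage to be reorganized.
Sslow = sparse(n, n);
for t = 1:numel(I)
    Sslow(I(t), J(t)) = V(t);
end

% Preferred: hand all the triplets to sparse in one call.
Sfast = sparse(I, J, V, n, n);
isequal(Sslow, Sfast)                 % same matrix, built far faster
Timing the two versions (for example with tic/toc) makes the difference obvious even at modest sizes.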
If it is only roughly 50% non-zero, then you will not be saving memory. Sparse storage has overhead, and that overhead cancels out what you think you are gaining. This is perhaps the most common error people make about sparse matrices: they only think their matrices are sparse, and a new user of sparse is almost always wrong in this respect. For example...
A = tril(rand(2e3),-1);
As = sparse(A);
whos A As
Note that both matrices are lower triangular, and lie fully below the main diagonal. But the sparse one takes essentially the same amount of memory as the full one does!
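A rough back-of-envelope estimate explains why. For double-precision data on a 64-bit system, MATLAB's compressed sparse column storage needs roughly 8 bytes per nonzero value, 8 bytes per row index, and 8 bytes per column pointer, while full storage is simply 8 bytes per element (the numbers below are approximations; whos is the authority):
n  = 2e3;
nz = nnz(tril(rand(n), -1));          % about half of the n^2 elements
fullBytes   = 8 * n^2                 % 32 MB for the full matrix
sparseBytes = 16 * nz + 8 * (n + 1)   % roughly the same 32 MB for the sparse one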
The problem is that sparse storage also requires you to store the locations of the non-zero elements. As such, you are wasting your time with sparse here. It will also take more time to do many computations because, again, there is overhead with sparse.
B = rand(2e3);
timeit(@() A*B)
timeit(@() As*B)
So a simple multiply takes roughly 7x as long! Sparse storage is designed for matrices that are FAR more sparse than you have. Don't waste your time with sparse for this problem.
In terms of writing code to create the matrix itself as a FULL matrix, you will want to create a matrix of locations of the non-zeros, in terms of row indices, column indices, and values. You should be able to do this efficiently in a vectorized way. NO LOOPS NEEDED.
Then build it using one call to accumarray. SKIP SPARSE STORAGE.
For example, how would I build such a block lower triangular matrix? I'll make it a small one, so you can see the result. And, of course, there are surely many ways we could do this. But here is my first crack at it.
N = 6; % A 6x6 block array, with 2x2 elements
ind = tril(toeplitz((0:N-1)',0:N-1),-1);
ind(:,1) = 0; % The pattern is now built
[R,C,V] = find(ind); % identify the elements we want to fill
% Create the 2x2 blocks. There are only N-2 blocks needed
tp = zeros(2,2) + reshape(1:N-2,[1,1,N-2])
Those blocks could be anything you wanted, but I decided to make it like this, so you could understand the result.
[R2,C2] = find(ones(2,2)); % element positions within one 2x2 block
columnoffset = 2;
rowoffset = 2;
rowindices = R2 - 2 + rowoffset*R'; % matrix rows where each block element lands
columnindices = C2 - 2 + columnoffset*C'; % matrix columns for each block element
values = tp(:,:,V); % pick the block assigned to each filled position
A = accumarray([rowindices(:),columnindices(:)],values(:),[2*N,2*N])
As you can see, the result has the desired pattern, with no loops required, and it is done efficiently. And forget about sparse for this. Really.
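For completeness, here is a sketch of the same accumarray idea scaled up to the npt-by-npt blocks and nts-by-nts block grid from the question. The sizes below are arbitrary, and the 3-D array tp here stands in for tp{3} through tp{nts} in the original cell array, since tp{1} and tp{2} were all zeros anyway:
nts = 6; npt = 4;                      % example sizes

% Block pattern: entry (i,j) holds the index of the block placed there,
% zero where nothing is placed (first block column and upper triangle).
ind = tril(toeplitz((0:nts-1)', 0:nts-1), -1);
ind(:, 1) = 0;
[R, C, V] = find(ind);                 % block-row, block-column, block index

% The nts-2 distinct blocks, stacked along the third dimension.
% Page v here plays the role of tp{v+2} in the question's code.
tp = rand(npt, npt, nts - 2);

% Element positions within one block, scattered into the big matrix.
[r2, c2] = find(ones(npt));
rowindices    = r2 - npt + npt * R';   % npt*(R-1) + r2, an npt^2-by-k array
columnindices = c2 - npt + npt * C';
values = tp(:, :, V);

A_T = accumarray([rowindices(:), columnindices(:)], values(:), ...
                 [nts * npt, nts * npt]);
Apart from the random data, the result has the same block structure the nested loops in the question produce, but built as a full matrix with no loops.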
2 comments
John D'Errico
on 28 Jan 2025
You are welcome.
Think about sparse like this. It gains, tremendously so, when your matrices are really sparse. By that I mean that unless your matrix is at least 99% zeros, and more likely 99.9% zeros or more, the gain in computation time is not that significant.
If you are only using sparse to decrease the storage required, then you will only begin to gain when the matrix is considerably more than 50% zeros. As I showed in my answer, the amount of memory used by a lower triangular sparse matrix was the same as that used by a lower triangular full matrix.
A = rand(1000); A = A.*(A<0.25);
As = sparse(A);
whos A As
The matrix there is only 25% non-zero, so 75% of the entries are zeros. And yes, sparse reduced the amount of memory required by about 50%. But if I could tolerate a conversion to single precision, I would have seen a slightly better reduction in memory than sparse gave me.
Asingle = single(A);
whos Asingle
So again, sparse does not help until the matrix becomes seriously sparse. In my work, my sparse matrices typically have only 3 to 5 non-zero elements per row, and they are often banded in some form. It would not be at all uncommon for those matrices to be 40000x40000. In that case, 5 elements per row comes out to a non-zero fraction of 0.000125, so 99.9875% zero. In that context, sparsity is an immense gain.
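As an illustration of that regime (the band offsets and size below are made up, not taken from any specific problem), a banded matrix like that can be built directly with spdiags, and whos shows it needs only a few megabytes instead of the roughly 12.8 GB a full 40000x40000 double matrix would require:
n = 4e4;
d = [-2 -1 0 1 2];                            % five diagonals, arbitrary offsets
B = spdiags(rand(n, numel(d)), d, n, n);
nnz(B) / numel(B)                             % non-zero fraction, about 1.25e-4
whos B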
More Answers (0)