Backslash (mldivide) slower than inverse and multiplication
The common wisdom is that A\y is more accurate than inv(A)*y, and I believe that to be true. But when is it also faster? Say the matrix A is well-conditioned, so I don't really care about the ability of \ to find a least-squares solution.
Am I doing something misleading here? The \ takes much longer.
A = randn(20);
A = A*A';              % make A symmetric positive (semi)definite
s = randn(20,1);
timeit(@() inv(A)*s)
timeit(@() A \ s)
From the documentation for mldivide, it sounds like it should be using the Cholesky solver since A is Hermitian. Why is that not faster than inv followed by a matrix multiplication?
ishermitian(A)
Accepted Answer
More Answers (1)
John D'Errico
on 30 Jul 2025
Edited: John D'Errico on 30 Jul 2025
First of all, you should NEVER be computing A*A' or A'*A. That is where most of the loss in accuracy arises. Whatever the condition number of A is, that operation squares it. Effectively, you might as well be doing your work in single precision, because the lower 8 bits of your result are now meaningless garbage.
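The condition-number squaring is easy to see numerically. A small sketch (the exact values depend on the random matrix, so treat the numbers as illustrative):

```matlab
A = randn(500);
c1 = cond(A)       % condition number of A
c2 = cond(A*A')    % roughly c1^2 -- accuracy is lost before you even solve
```

Every power of 10 in the condition number costs roughly one decimal digit of accuracy in the solution, so squaring it doubles the digits lost.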
Next, a 20x20 matrix is far too small to draw any intelligent conclusion from a timing estimate like that. The measurement is far more likely to get bogged down in simple things like deciding whether the matrix is indeed symmetric, etc.
Next, learn to use tools like timeit.
A = randn(2000); A = A*A';   % same construction, but 2000x2000
s = randn(2000,1);
timeit(@() A\s)
timeit(@() inv(A)*s)
Do you see that the 2kx2k example is now considerably faster for backslash?
As well, it is almost certain that you could have done what you needed to do without ever squaring that condition number in the first place.
2 Comments
AB
on 30 Jul 2025
John D'Errico
on 30 Jul 2025
For a 20x20 matrix, the matrix is so small that it is very difficult to know where the time penalty comes from. No, my answer is NOT that overhead is expected to be the cause of the problem, because I have no expectation at all. But with a matrix that small, the actual computation time can easily be swamped by other things. If I had to make a guess, though, that would be it.
And again, you should probably not be computing things that way in the first place.
Yes, you CAN solve a linear least squares problem using the normal equations, but that squares the condition number. And yes, this is what so many people seem to learn. They learn it from someone else, who learned it long ago from another, etc. It is likely you are doing something like what I've described, because you talked about least squares, and because you are forming something like A*A' in the first place and then inverting it.
So I would suggest that you consider other ways to solve the problem you want to solve. It might involve a QR factorization. For example, you can get a Cholesky factorization directly from a QR, without needing to go through the nasty issues of forming A'*A or A*A'. What you really want to do might involve backslash. But I don't know, because all I see is the test case you ran.
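As a sketch of what that might look like for an overdetermined least-squares problem (sizes here are illustrative, assuming A is m-by-n with m > n and full column rank):

```matlab
A = randn(1000,50);
b = randn(1000,1);

x1 = (A'*A) \ (A'*b);   % normal equations: squares cond(A)
x2 = A \ b;             % backslash solves the rectangular system via QR

[Q,R] = qr(A,0);        % economy-size QR: A = Q*R, R upper triangular
x3 = R \ (Q'*b);        % same least-squares solution, without ever forming A'*A

norm(x1-x2), norm(x2-x3)  % all three agree when A is well-conditioned
```

The R factor from the QR is (up to signs) the Cholesky factor of A'*A, which is why you can get the Cholesky factorization directly from a QR without the squared condition number.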
