How to remove dependent rows in a matrix?
Let A be an m by n matrix whose rows are linearly dependent. I want to remove rows from A such that the rank does not decrease. How can I find such rows of A?
Accepted Answer
More Answers (2)
Jos (10584)
on 24 Jan 2014
Another, very straightforward, approach is to include them one by one and observe the changes in rank … (I agree that this is not so elegant!).
N = size(A,1) ;            % number of rows
IncludeTF = false(N,1) ;   % by default, exclude all rows, except ...
IncludeTF(1) = true ;      % ... the first row, which can always be included
R0 = rank(A) ;             % the original rank
r = rank(A(IncludeTF,:)) ; % rank of the rows included so far
for k = 2:N                % loop over the remaining rows
    IncludeTF(k) = true ;  % tentatively include row k
    if rank(A(IncludeTF,:)) > r
        r = r + 1 ;        % row k increased the rank, so keep it
    else
        IncludeTF(k) = false ; % row k is dependent on the kept rows
    end
end
B = A(IncludeTF,:) ;       % the selected independent rows of A
isequal(rank(B), R0)       % check!
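For readers outside MATLAB, the same greedy idea can be sketched in Python with NumPy (my own translation, not part of the original answer): keep a row only when it raises the rank of the rows kept so far.

```python
import numpy as np

def independent_rows(A):
    """Greedily keep rows of A that increase the rank of the kept set (a sketch)."""
    A = np.asarray(A, dtype=float)
    keep = []   # indices of kept rows
    r = 0       # rank of the kept rows so far
    for k in range(A.shape[0]):
        candidate = A[keep + [k], :]
        if np.linalg.matrix_rank(candidate) > r:
            keep.append(k)
            r += 1
    return keep

A = np.array([[1., 0., 1.],
              [2., 0., 2.],   # dependent: 2 * row 0
              [0., 1., 0.]])
print(independent_rows(A))    # [0, 2]
```

Note that `matrix_rank` uses a floating-point SVD with a tolerance, so for badly scaled matrices the kept set can depend on the tolerance, just as MATLAB's `rank` does.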
1 comment
Jeel Bhavsar
on 24 Nov 2018
I have the same question for a gf matrix. Does this code work for gf (Galois field) matrices?
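MATLAB's base `rank` is a floating-point SVD computation, so it does not apply to Galois-field arithmetic as-is; whether the Communications Toolbox `gf` type overloads `rank` is worth checking in its documentation. The row-by-row idea itself carries over if the rank is computed by Gaussian elimination in the field. As a language-neutral illustration (a Python sketch of my own for GF(2), not from the thread):

```python
def gf2_rank(rows):
    """Rank over GF(2) by Gaussian elimination; rows are lists of 0/1 entries."""
    rows = [r[:] for r in rows]          # work on a copy
    m = len(rows)
    n = len(rows[0]) if m else 0
    rank = 0
    for col in range(n):
        # find a pivot row at or below position 'rank' with a 1 in this column
        pivot = next((i for i in range(rank, m) if rows[i][col] == 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(m):               # clear this column everywhere else (XOR)
            if i != rank and rows[i][col] == 1:
                rows[i] = [(a + b) % 2 for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

# third row is the GF(2) sum of the first two, so the rank is 2
print(gf2_rank([[1, 0, 1], [0, 1, 1], [1, 1, 0]]))   # 2
```

With such a field-aware rank in place of `rank`, the one-by-one inclusion loop works unchanged.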
Arash Rabbani
on 24 Aug 2019
This is a shorter version of Jos' solution, if needed:
R1 = 1 ;                  % expected rank if the next row is independent
for I = 1:size(A,1)
    R2 = rank(A(1:I,:)) ; % rank of the first I rows
    if R2 ~= R1           % rank did not grow: row I is dependent
        disp(I)
    end
    R1 = R2 + 1 ;
end
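The same incremental-rank trick reads naturally in NumPy as well (my own translation, assuming a plain floating-point matrix): the rank of the first I rows grows by one exactly when row I is independent of the rows above it.

```python
import numpy as np

def dependent_rows(A):
    """0-based indices of rows that do not increase the rank of the rows above them."""
    A = np.asarray(A, dtype=float)
    dependent = []
    prev_rank = 0
    for i in range(A.shape[0]):
        r = np.linalg.matrix_rank(A[:i + 1, :])
        if r == prev_rank:       # rank did not grow: row i is dependent
            dependent.append(i)
        prev_rank = r
    return dependent

A = np.array([[1., 2.],
              [2., 4.],   # 2 * row 0, so dependent
              [0., 1.]])
print(dependent_rows(A))  # [1]
```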
1 comment
Arash Rabbani
on 24 Aug 2019
It displays the rows that are linearly dependent on the other rows.

