Suppose we have an m by n matrix A with rank m, and a set K ⊆ {1,…,n} such that the columns of A indexed by K are linearly independent. Now we want to extend K, i.e. find a set L with K ⊆ L such that the columns indexed by L are also linearly independent.
One way of doing this would be to add column indexes to K one at a time and test whether the enlarged set is still linearly independent, for example using Gaussian elimination. But is there a better way, so that I would not need to run a full independence test for every index added? A sketch of the naive approach I have in mind is below.
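For reference, this is roughly what I mean by the naive approach, written as a Python/NumPy sketch (it uses `numpy.linalg.matrix_rank` as the independence test instead of a hand-rolled Gaussian elimination; the function and variable names are just illustrative):

```python
import numpy as np

def extend_independent_set(A, K):
    """Naively extend K: try each remaining column index and keep it
    only if the enlarged set stays linearly independent."""
    L = list(K)
    rank = np.linalg.matrix_rank(A[:, L]) if L else 0
    for j in range(A.shape[1]):
        if j in L:
            continue
        candidate = L + [j]
        # Re-test independence from scratch -- this per-index test is
        # exactly the work I would like to avoid repeating.
        if np.linalg.matrix_rank(A[:, candidate]) == rank + 1:
            L = candidate
            rank += 1
        if rank == A.shape[0]:  # rank m reached, nothing more to add
            break
    return L
```

Each iteration redoes an O(m^2 · |L|) rank computation, which is what makes me suspect there should be a smarter way.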
Thank you.