Suppose we have an m by n matrix A with rank m and a set K⊆{1..n} such that the columns of A indexed by K are linearly independent. Now we want to extend K, i.e. find a set L with K⊆L such that the columns indexed by L are linearly independent too.

One way of doing this would be to add column indexes to K one at a time and test whether the enlarged set is still linearly independent, for example by Gaussian elimination. But is there a better way, so that I would not need to run a test for every index added?
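For concreteness, a minimal NumPy sketch of the incremental approach I mean (the function name extend_independent_set and the use of matrix_rank as the independence test are just illustrative choices, not part of the question):

import numpy as np

def extend_independent_set(A, K):
    """Greedily extend K so that the columns A[:, L] stay linearly independent."""
    L = list(K)
    r = np.linalg.matrix_rank(A[:, L]) if L else 0
    for j in range(A.shape[1]):
        if j in L:
            continue
        # Keep j only if it raises the rank, i.e. A[:, j] is independent of A[:, L].
        if np.linalg.matrix_rank(A[:, L + [j]]) > r:
            L.append(j)
            r += 1
        if r == A.shape[0]:   # rank m reached: A[:, L] is a basis of the column space
            break
    return L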

Thank You

A: 

Only because no one officially answered yet... can't you just use QR?

from numpy.linalg import qr   # assuming NumPy's qr; any QR routine will do
Q, R = qr(A)

QR factorization finds a matrix Q with orthonormal columns and an upper triangular matrix R such that A = QR. It is exactly what Gram-Schmidt orthogonalization of the columns of A produces (library routines typically compute it with Householder reflections): an orthonormal basis for the column space of A.

The columns of Q are orthonormal, hence linearly independent, and they span the column space of A. So Q should give you what you want.
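For example, a small runnable version with NumPy (the random test matrix and the checks are only for illustration):

import numpy as np

m, n = 4, 6
A = np.random.rand(m, n)    # an m x n matrix, almost surely of rank m

Q, R = np.linalg.qr(A)      # reduced QR: Q is m x m here, R is m x n upper triangular

print(np.allclose(Q.T @ Q, np.eye(Q.shape[1])))   # True: columns of Q are orthonormal
print(np.allclose(Q @ R, A))                      # True: A = QR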

Steve