Hi! I'm using the GNU GSL to do some matrix calculations. I'm trying to multiply a matrix B with the inverse of a matrix A.

Now I've noticed that the BLAS part of GSL has a function to do this, but only if A is triangular. Is there a specific reason for this? Also, what would be the fastest way to do this computation? Should I invert A using an LU decomposition, or is there a better way?

FWIW, A has the form P'*G*P, where P is a normal matrix, P' is its inverse, and G is a diagonal matrix.

Thanks a bunch :)

A: 

In a nutshell:

The fact that only triangular matrices are supported is simply because this operation is really easy to perform for a triangular matrix (it's called back-substitution), and BLAS only provides low-level routines. Any higher-level functionality is usually found in LAPACK, which uses BLAS internally for block-wise operations.
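For reference, the triangular case is covered directly by the Level 3 BLAS routine dtrsm, exposed in GSL as gsl_blas_dtrsm; here is a minimal sketch (the wrapper name is just illustrative, and I'm assuming a lower-triangular A):

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_blas.h>

/* Overwrite B with inv(A) * B, assuming A is lower triangular.
   dtrsm performs the back-substitution in place, no explicit inverse needed. */
void solve_lower_triangular(const gsl_matrix *A, gsl_matrix *B)
{
    gsl_blas_dtrsm(CblasLeft, CblasLower, CblasNoTrans, CblasNonUnit,
                   1.0, A, B);
}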

In the specific case you're dealing with, A := P'*G*P: since P' is the inverse of P, the inverse of A can be written inv(A) := P'*inv(G)*P. You then have two options:

  1. Build inv(A) and multiply it with B.
  2. Sequentially multiply B with the different factors of inv(A): multiply B by P (on the left), then rescale each row of the result by the inverse of the corresponding diagonal element of G, and then multiply by P' (on the left again).

Depending on your specific case you can pick either method. Either way, you'll need dgemm (matrix-matrix multiplication, Level 3 BLAS) and dscal (row scaling, Level 1 BLAS), assuming double precision.
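A rough sketch of option 2 with GSL's BLAS interface might look like this (the function name apply_inverse and the arguments Pinv and gdiag are placeholders for your own data, not GSL names; gdiag is assumed to hold the diagonal of G):

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_vector.h>
#include <gsl/gsl_blas.h>

/* Compute X = inv(A) * B = Pinv * inv(G) * P * B without forming inv(A).
   tmp and X are workspaces of the same size as B, allocated by the caller. */
void apply_inverse(const gsl_matrix *P, const gsl_matrix *Pinv,
                   const gsl_vector *gdiag, const gsl_matrix *B,
                   gsl_matrix *tmp, gsl_matrix *X)
{
    /* tmp = P * B */
    gsl_blas_dgemm(CblasNoTrans, CblasNoTrans, 1.0, P, B, 0.0, tmp);

    /* rescale row i of tmp by 1 / G[i][i] */
    for (size_t i = 0; i < tmp->size1; i++) {
        gsl_vector_view row = gsl_matrix_row(tmp, i);
        gsl_blas_dscal(1.0 / gsl_vector_get(gdiag, i), &row.vector);
    }

    /* X = Pinv * tmp */
    gsl_blas_dgemm(CblasNoTrans, CblasNoTrans, 1.0, Pinv, tmp, 0.0, X);
}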

Hope this helps.

A.

Adrien
Thanks for your reply. I'm not sure where the permutation comes from. Did you maybe confuse something because one of my matrices is called "P"? You are right in saying that essentially I could reformulate my problem as (P')*G*P*X = B, but I fail to see how you can derive from that what you propose. Would you mind elaborating a bit? Also, please note that while my matrix A is composed of P'*G*P, this is not some kind of eigenvalue decomposition, in case you were thinking along those lines.
Tom
My mistake... I was working with permutation matrices earlier. I'll edit the post to correct it.
Adrien
A: 

I believe Adrien is right about the reason why BLAS has no inverse function for general square matrices. The best way to compute the inverse depends on the structure of the matrix you are working with.

In general you can use the LU decomposition, which works for any invertible square matrix, i.e., something like:

gsl_linalg_LU_decomp(A, p, &signum);  /* A is overwritten by its LU factors */
gsl_linalg_LU_invert(A, p, invA);     /* invA receives the inverse of A */

where A is the square matrix whose inverse you want, p is a gsl_permutation object (where the permutation matrix is encoded), signum is the sign of the permutation, and invA receives the inverse of A.
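Put together, a minimal sketch of that route (the wrapper name multiply_by_inverse is just illustrative, and I'm assuming the product you want is inv(A)*B rather than B*inv(A)):

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_permutation.h>
#include <gsl/gsl_linalg.h>
#include <gsl/gsl_blas.h>

/* Compute X = inv(A) * B via an LU decomposition.
   A is overwritten by its LU factors; X must be allocated by the caller. */
void multiply_by_inverse(gsl_matrix *A, const gsl_matrix *B, gsl_matrix *X)
{
    size_t n = A->size1;
    int signum;
    gsl_permutation *p = gsl_permutation_alloc(n);
    gsl_matrix *invA = gsl_matrix_alloc(n, n);

    gsl_linalg_LU_decomp(A, p, &signum);   /* factor A in place */
    gsl_linalg_LU_invert(A, p, invA);      /* invA = inv(A) */

    /* X = inv(A) * B */
    gsl_blas_dgemm(CblasNoTrans, CblasNoTrans, 1.0, invA, B, 0.0, X);

    gsl_matrix_free(invA);
    gsl_permutation_free(p);
}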

Since you state that A = P'*G*P with P normal and G diagonal, A is probably a normal matrix. I haven't used them in a while, but there should be a factorization suited to them, which you can look for in Chapter 14 of the GSL reference manual or, even better, in a linear algebra book.

Just as an example, if you had a symmetric positive definite matrix and wanted to find its inverse, you could use the Cholesky factorization, which is optimized for that kind of matrix. In that case you could use the gsl_linalg_cholesky_decomp() and gsl_linalg_cholesky_invert() functions in the GSL to make it efficient.
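A minimal sketch of that case, assuming A is a gsl_matrix* holding a symmetric positive definite matrix (the wrapper name is just for illustration):

#include <gsl/gsl_matrix.h>
#include <gsl/gsl_linalg.h>

/* Replace a symmetric positive definite matrix A by its inverse, in place. */
void invert_spd(gsl_matrix *A)
{
    gsl_linalg_cholesky_decomp(A);   /* A -> its Cholesky factor */
    gsl_linalg_cholesky_invert(A);   /* A -> inv(A) */
}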

I hope it helps!

Alejandro Cámara