Hi,

I have two equations I'm solving on each recursive round:

X = A - inv(B) * Y * inv(B)
X = X + A' * inv(B) * A

I solve the problem this way:

C = inv(B) * Y <=> B * C = Y, solve for C. D = C * inv(B) <=> D * B = C <=> B' * D' = C', solve for D'.

E = inv(B) * A <=> B * E = A, solve for E.

All matrices change over time, so I have to do this (or the inversion) again on every recursion. n is usually about 1-10, possibly more, but usually something like that. B is positive definite, so I can use a Cholesky factorisation and then solve the equations for multiple right-hand sides.
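Concretely, the round would look something like this in Python/SciPy (just a sketch with made-up example matrices, not my actual code; cho_factor/cho_solve are from scipy.linalg):

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    # Made-up example data; B must be symmetric positive definite.
    rng = np.random.default_rng(0)
    n = 6
    A = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    M = rng.standard_normal((n, n))
    B = M @ M.T + n * np.eye(n)   # symmetric positive definite

    LB = cho_factor(B)            # factorise B once per round

    C = cho_solve(LB, Y)          # B*C = Y    ->  C = inv(B)*Y
    D = cho_solve(LB, C.T).T      # B*D' = C'  ->  D = C*inv(B)  (B symmetric)
    E = cho_solve(LB, A)          # B*E = A    ->  E = inv(B)*A

    X = A - D + A.T @ E           # X = A - inv(B)*Y*inv(B) + A'*inv(B)*A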

Is this much slower or faster than just inverting B and then doing matrix multiplications with it? One inversion versus solving three systems of linear equations (there is that other equation too) plus some transposing. I would think it's at least numerically more stable than inverting?

Thanks.

A: 

Ok, I feel better now. And why sleep when you can analyze linear equations in the middle of the night? :)

First of all, let's assume that all your matrices are of order n x n. The Cholesky factorization can then be done in O(n^3/6) operations (for large values of n).

Solving B*c(i) = y(i), i.e. L*L'*c(i) = y(i) with the Cholesky factors, is two triangular solves of O(n^2/2) each, or O(n^2). But solving B*C = Y means solving n of these systems (because Y is n x n), so in total we have O(n^3).
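Spelled out with the two triangular solves (a sketch using scipy.linalg's cholesky and solve_triangular, with made-up matrices):

    import numpy as np
    from scipy.linalg import cholesky, solve_triangular

    rng = np.random.default_rng(1)
    n = 6
    M = rng.standard_normal((n, n))
    B = M @ M.T + n * np.eye(n)
    Y = rng.standard_normal((n, n))

    L = cholesky(B, lower=True)                # B = L*L'
    W = solve_triangular(L, Y, lower=True)     # forward:  L*W = Y,  ~n^2/2 per column
    C = solve_triangular(L.T, W, lower=False)  # backward: L'*C = W, ~n^2/2 per column
    # n columns in Y  ->  n * O(n^2) = O(n^3) for the full solve B*C = Y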

Solving D' is obviously analogous to this, so another O(n^3)

Transposing D' to D is roughly O(n^2): no calculations, just swapping of data (apart from the diagonal elements, which of course stay in place).

Solving for E in B*E = A in the second formula is forward and backward substitution with the Cholesky factors once more, so O(n^3).

A' * E has n^2 entries, each taking n multiplications and n-1 additions, which is O(2*n^3 - n^2).

This sums up to: O(n^3/6) + 3*O(n^3) + O(n^2) + O(2*n^3 - n^2) = O((1 + 18 + 12)*n^3/6) ~ O(31*n^3/6) ~ O(5*n^3) (for large values of n).

Note that I haven't counted the matrix additions/subtractions, but that isn't relevant because they will be the same if we decide to invert B. I have also skipped the transpose from A to A' for the same reason.

Ok, so how expensive is inverting a matrix? Well, we want to solve the matrix equation:

B * inv(B) = I, which is the same as solving B * x(i) = e(i) for i=1..n, where the e(i) are the unit basis vectors, i.e. the columns of I. This is usually done with Gaussian elimination to transform the system to triangular form, which takes about O(n^3/3) operations. Once the system is triangular, solving it by backward substitution takes O(n^2/2) operations. Done naively, redoing the elimination for each right-hand side, this happens n times, which gives a total of O(n^4/3) + O(n^3/2) operations, so as you can see we are already over the edge.
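For what it's worth, a library routine like np.linalg.solve factors once and reuses the factorization for all n right-hand sides, which avoids the naive O(n^4/3). A sketch with made-up data:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 6
    M = rng.standard_normal((n, n))
    B = M @ M.T + n * np.eye(n)

    # Solve B * x(i) = e(i) for all columns of I at once:
    # one LU factorization, O(n^3/3), then n back-substitutions, n * O(n^2).
    Binv = np.linalg.solve(B, np.eye(n))
    assert np.allclose(B @ Binv, np.eye(n))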

However, calculating inv(B) when the Cholesky factorization of B is already known is O(n^3) (because solving L*L'*inv(B) = I is analogous to solving B*E = A above).

So we then have: O(n^3/6) (Cholesky of B) + O(n^3) (calculating inv(B) from the Cholesky factors) + 4*O(2*n^3 - n^2) (four matrix multiplications) ~ O(9*n^3), which is much better than the naive inversion, but still worse than your O(5*n^3) approach.
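Put together, the inversion route would look something like this (again just a sketch with made-up matrices):

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    rng = np.random.default_rng(3)
    n = 6
    A = rng.standard_normal((n, n))
    Y = rng.standard_normal((n, n))
    M = rng.standard_normal((n, n))
    B = M @ M.T + n * np.eye(n)

    LB = cho_factor(B)                         # O(n^3/6)
    Binv = cho_solve(LB, np.eye(n))            # inv(B) from the factors, O(n^3)
    X = A - Binv @ Y @ Binv + A.T @ Binv @ A   # four n x n multiplications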

So I suggest that you stay with your current approach. But keep in mind that this analysis is for large values of n. Unless n is 100+, I don't think it matters that much anyway.

Hope this helped some.

Cheers

Magnus Skog

Thank you very much!