views: 670
answers: 9

I want to multiply two matrices, but the triple loop has O(n^3) complexity. Is there any dynamic programming algorithm to multiply two matrices with O(n) complexity?
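
For reference, a minimal sketch of the triple loop I mean (plain Python, square n x n matrices as nested lists):

    def matmul_naive(A, B):
        # Classic triple loop: O(n^3) scalar multiplications.
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                s = 0
                for k in range(n):
                    s += A[i][k] * B[k][j]
                C[i][j] = s
        return C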

+8  A: 

Short answer: No

Long answer: There are ways if you have special kinds of matrices (for instance a diagonal matrix). The better matrix multiplication algorithms out there can pare you down to something like O(n^2.4) (http://en.wikipedia.org/wiki/Coppersmith-Winograd_algorithm). The major one I am somewhat familiar with uses a divide and conquer approach to split up the workload (not the one I linked to).

I hope this helps!

SapphireSun
+1 but would you correct the link .. :)
TheMachineCharmer
Pah, I can't get the link to behave :(
SapphireSun
A: 

No! I don't think so.

There is no way unless you are using a parallel processing machine, and even that has its own dependencies and limitations.

So far, it has not been achieved.

Ritz
Parallel processing does not, in general, reduce the big-O complexity of an algorithm. There's always some fixed upper bound, say M, on your concurrency, either from your parallel algorithm or from the hardware, so you get at best a constant speedup. (Not that constant speedups aren't good, mind you.)
Dale Hagglund
You can only ever have up to a certain max number of processors, so parallel processing can only speed you up by some (possibly large) constant factor.
mbeckish
I know, that's why I'd written 'it has its own dependencies'. I know a bit about Amdahl's law. Thanks for the comment :-)
Ritz
+34  A: 

The best matrix multiplication algorithm known so far is the "Coppersmith-Winograd algorithm" with O(n^2.38) complexity, but it is not used for practical purposes.

However, you can always use "Strassen's algorithm", which has O(n^2.81) complexity, but there is no known algorithm for matrix multiplication with O(n) complexity.
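
A minimal sketch of Strassen's recursion (assuming square matrices whose size is a power of two; a practical version would fall back to the naive triple loop below some cutoff size):

    def strassen(A, B):
        # Strassen's algorithm for n x n matrices, n a power of two: ~O(n^2.81).
        n = len(A)
        if n == 1:
            return [[A[0][0] * B[0][0]]]
        h = n // 2
        add = lambda X, Y: [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
        sub = lambda X, Y: [[x - y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]
        # Split each matrix into four h x h quadrants.
        A11 = [r[:h] for r in A[:h]]; A12 = [r[h:] for r in A[:h]]
        A21 = [r[:h] for r in A[h:]]; A22 = [r[h:] for r in A[h:]]
        B11 = [r[:h] for r in B[:h]]; B12 = [r[h:] for r in B[:h]]
        B21 = [r[:h] for r in B[h:]]; B22 = [r[h:] for r in B[h:]]
        # Seven recursive products instead of eight.
        M1 = strassen(add(A11, A22), add(B11, B22))
        M2 = strassen(add(A21, A22), B11)
        M3 = strassen(A11, sub(B12, B22))
        M4 = strassen(A22, sub(B21, B11))
        M5 = strassen(add(A11, A12), B22)
        M6 = strassen(sub(A21, A11), add(B11, B12))
        M7 = strassen(sub(A12, A22), add(B21, B22))
        C11 = add(sub(add(M1, M4), M5), M7)
        C12 = add(M3, M5)
        C21 = add(M2, M4)
        C22 = add(sub(add(M1, M3), M2), M6)
        # Reassemble the quadrants into the result.
        return [r1 + r2 for r1, r2 in zip(C11, C12)] + \
               [r1 + r2 for r1, r2 in zip(C21, C22)]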

Prasoon Saurav
Please note that Coppersmith-Winograd has very high constant costs and is therefore only recommended for big matrices (forget about multiplying two 4x4 matrices in a video game, for example).
Dave
For each epsilon > 0, it is conjectured that there is an algorithm in O(n^(2+epsilon)). However, this is a theoretical result; the constant explodes as epsilon goes to zero.
Alexandre C.
Alexandre C. - +1 for optimism :)
amartynov
+14  A: 

There is a theoretical lower bound of Ω(n^2) for matrix multiplication, as you have to touch that many memory locations just to read the inputs and write the result. As others have said, there are algorithms that drop below O(n^3), but they are usually impractical in real use.

If you need to speed it up, you might want to look at cache-oblivious algorithms, such as this one (http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.44.5650), which improve performance by performing operations in a cache-friendly way, ensuring that data is in the cache when it is needed.
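
Not the cache-oblivious recursion from the linked paper, but a simple cache-aware blocked variant sketches the same locality idea (the block size here is an assumed tuning parameter):

    def matmul_blocked(A, B, block=64):
        # Same O(n^3) arithmetic as the triple loop, but each block of A, B
        # and C is reused while it is still hot in the cache.
        n = len(A)
        C = [[0] * n for _ in range(n)]
        for ii in range(0, n, block):
            for kk in range(0, n, block):
                for jj in range(0, n, block):
                    for i in range(ii, min(ii + block, n)):
                        for k in range(kk, min(kk + block, n)):
                            a = A[i][k]
                            for j in range(jj, min(jj + block, n)):
                                C[i][j] += a * B[k][j]
        return C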

Peter Alexander
+2  A: 

If the matrices are known to be diagonal, you can multiply them in O(N) operations. But in general, you cannot.
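
A minimal sketch of the diagonal case, assuming each matrix is stored just as the vector of its diagonal entries:

    def matmul_diagonal(diag_a, diag_b):
        # The product of two diagonal matrices is diagonal: O(n) multiplications.
        return [a * b for a, b in zip(diag_a, diag_b)]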

Stephen C
or if they are Töplitz matrices.
Alexandre C.
+2  A: 

Matrices have O(n^2) elements, and every element must be considered at least once for the result, so there is no possible way for a matrix multiplication algorithm to run in less than O(n^2) operations.

Roberto Bonvallet
A: 

If you have O(n^2) processors and a shared-read memory architecture, you could multiply two matrices in O(n) time (each processor computes one element of the result)... but this is only theory for now.

liori
A: 

Please elaborate your question.

QAZI
A: 

If

  • your matrices are large
  • they have a lot of zeroes
  • you are willing to store them in strange formats

you can devise algorithms whose complexities depend only on the number of non-zero elements. This can be mandatory for some problems (e.g. finite element methods).
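
A minimal sketch using one such format (a dict-of-dicts keyed by row and column, chosen here purely for illustration; real codes would use CSR or similar):

    def matmul_sparse(A, B):
        # A and B are {row: {col: value}} dicts holding only non-zero entries.
        # The work is proportional to the number of non-zero partial products,
        # not to n^2 or n^3.
        C = {}
        for i, row in A.items():
            for k, a_ik in row.items():
                for j, b_kj in B.get(k, {}).items():
                    c_row = C.setdefault(i, {})
                    c_row[j] = c_row.get(j, 0) + a_ik * b_kj
        return C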

Alexandre C.