R question: Looking for the fastest way to NUMERICALLY solve a bunch of arbitrary cubics known to have real coefficients and three real roots. The polyroot function in R is reported to use the Jenkins-Traub algorithm (TOMS Algorithm 419) for complex polynomials, but for real polynomials the authors refer to their earlier work. What are the faster options for a real cubic, or more generally for a real polynomial?

+1  A: 

Do you need all 3 roots or just one? If just one, I would think Newton's method would work fine. If you need all 3, it might be problematic when two roots are close together.

Jason S
Just one. The one known to lie in (0,1).
knot
I vote for Newton's method, then. Start with 0.5 as a guess, clamp the iterates to the interval (0, 1), and it should work fine. It has quadratic convergence; the only way you would run into trouble is if the function has a minimum or maximum (derivative = 0) within that range.
Jason S
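
For concreteness, here is a minimal R sketch of the Newton iteration suggested above (the function name, coefficient vector, tolerance, and iteration cap are all illustrative assumptions, for a cubic written as p(t) = c0 + c1*t + c2*t^2 + c3*t^3):

# Newton's method for the root of a cubic known to lie in (0, 1).
newton_cubic <- function(coefs, x = 0.5, tol = 1e-12, maxit = 50) {
  p  <- function(t) coefs[1] + coefs[2]*t + coefs[3]*t^2 + coefs[4]*t^3
  dp <- function(t) coefs[2] + 2*coefs[3]*t + 3*coefs[4]*t^2
  for (i in seq_len(maxit)) {
    step <- p(x) / dp(x)
    x <- min(max(x - step, 0), 1)   # clamp to the known bracket (0, 1)
    if (abs(step) < tol) break
  }
  x
}

newton_cubic(c(-1.8, 7.5, -5.3, 1))   # cubic with roots 0.3, 2, 3; returns ~0.3
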
A: 

The common methods are all available: Newton's method, the bisection method, the secant method, fixed-point iteration, etc. Google any one of them.

If, on the other hand, you have a non-linear system (e.g. a system of N polynomial equations in N unknowns), a method such as a higher-order Newton iteration may be used.

Alex
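
For reference, the bracketing methods mentioned here map directly onto R's built-in uniroot (Brent's method); a minimal sketch, with an illustrative cubic whose root lies in (0, 1):

p <- function(t) -1.8 + 7.5*t - 5.3*t^2 + t^3   # illustrative cubic, root at 0.3
uniroot(p, interval = c(0, 1), tol = 1e-12)$root
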
+4  A: 

The fastest known way (that I'm aware of) to find the real solutions of a system of arbitrary polynomials in n variables is polyhedral homotopy. A detailed explanation is probably beyond a StackOverflow answer, but essentially it's a path-following algorithm that exploits the structure of each equation using toric geometry. Google will give you a number of papers.

Perhaps this question is better suited for mathoverflow?

John Feminella
+5  A: 

The standard numerical approach for doing this many times in a reliable, stable manner involves two steps: (1) form the companion matrix, (2) find the eigenvalues of the companion matrix.

You may think this is a harder problem to solve than the original one, but this is how the solution is implemented in most production code (say, Matlab).

For the polynomial:

p(t) = c0 + c1 * t + c2 * t^2 + t^3

the companion matrix is:

[ 0  0  -c0 ]
[ 1  0  -c1 ]
[ 0  1  -c2 ]

Find the eigenvalues of this matrix; they correspond to the roots of the original polynomial.

For doing this very fast, download the eigenvalue routines from LAPACK (for example, the nonsymmetric eigensolver dgeev), compile them, and link them to your code. Do this in parallel if you have very many (say, about a million) sets of coefficients.

Notice that the coefficient of t^3 is one; if this is not the case for your polynomials, you will have to divide the whole thing by the leading coefficient and then proceed.
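
A quick R illustration of that normalization step (the coefficients below are hypothetical):

a <- c(2, 4, -6, 2)   # hypothetical non-monic cubic: 2 + 4*t - 6*t^2 + 2*t^3
a <- a / a[4]         # divide by the leading coefficient; now p(t) = c0 + c1*t + c2*t^2 + t^3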

Good luck.

Edit: NumPy and Octave also rely on this methodology for computing the roots of polynomials. See, for instance, this link.

Arrieta
I had forgotten to indicate that it was an R question. Sorry.
knot
I think this is still a relevant answer; it's easy enough to find the eigenvalues of the companion matrix in R using the eigen() function.
Ken Williams
A: 

There are many ideas on Wikipedia. Which have you tried?

hadley
The answer is in my question.
knot
Is it? I must be missing something.
hadley
A: 

Have you tried looking into the GSL package http://cran.r-project.org/web/packages/gsl/index.html?

knguyen
+1  A: 

Fleshing out Arrieta's answer above:

> a <- c(1, 3, -4)  # c0, c1, c2 of p(t) = c0 + c1*t + c2*t^2 + t^3
> m <- matrix(c(0, 0, -a[1], 1, 0, -a[2], 0, 1, -a[3]), byrow = TRUE, nrow = 3)  # companion matrix
> roots <- eigen(m, symmetric = FALSE, only.values = TRUE)$values  # eigenvalues = roots

Whether this is faster or slower than using the cubic solver in the GSL package (as suggested by knguyen above) is a matter of benchmarking it on your system.

Ken Williams