I am trying to compute the linear (least-squares) trend line for a set of data. The set contains pairs of dates (x values) and scores (y values). I am using a version of this code as the basis of my algorithm.
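For reference, here is essentially what my code computes, a minimal sketch of the linked approach using the textbook raw-sums formula (the class and method names here are mine, not from the linked code):

```java
import java.util.Date;

public class TrendLine {
    // Ordinary least-squares fit of y = slope * x + intercept,
    // using the raw-sums formula, as in the code I linked.
    public static double[] fit(Date[] dates, double[] scores) {
        int n = dates.length;
        double sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
        for (int i = 0; i < n; i++) {
            double x = dates[i].getTime(); // ms since the epoch, ~1.3e12
            double y = scores[i];
            sumX += x;
            sumY += y;
            sumXY += x * y;
            sumXX += x * x;
        }
        double slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
        double intercept = (sumY - slope * sumX) / n;
        return new double[] { slope, intercept };
    }
}
```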
The results I am getting are off by a few orders of magnitude. I assume the problem is round-off error or overflow, because I am using Date's getTime method, which returns the time as a huge number of milliseconds since the epoch. Does anyone have a suggestion on how to minimize the errors and compute the correct results?
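For concreteness, one idea I had is to shift every x to days since the first sample before fitting, so the intermediate sums stay small (an untested sketch; fitShifted is just a placeholder name):

```java
import java.util.Date;

public class TrendLineShifted {
    // Same least-squares fit, but with x rescaled to days since the
    // first sample. With x around 1.3e12 ms, x * x is around 1.7e24,
    // and a double keeps only ~15-16 significant digits, so the
    // subtraction n * sumXX - sumX * sumX cancels nearly all of them.
    public static double[] fitShifted(Date[] dates, double[] scores) {
        int n = dates.length;
        long x0 = dates[0].getTime(); // local origin: first timestamp
        double sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
        for (int i = 0; i < n; i++) {
            double x = (dates[i].getTime() - x0) / 86_400_000.0; // days since x0
            double y = scores[i];
            sumX += x;
            sumY += y;
            sumXY += x * y;
            sumXX += x * x;
        }
        double slope = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
        double intercept = (sumY - slope * sumX) / n;
        // slope is in score units per day; intercept is the fitted score
        // at x0. Divide slope by 86_400_000.0 to convert back to per-ms.
        return new double[] { slope, intercept };
    }
}
```

Is that the right way to go about it, or is there a better-conditioned formulation?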