I am performing a least squares regression as below (univariate). I would like to express the significance of the result in terms of R^2. NumPy returns the unscaled residual (the sum of squared residuals); what would be a sensible way of normalizing it?

from numpy import hstack, log10, ones
from numpy.linalg import lstsq

# rid_zeros, backscatter, field_data and row are defined elsewhere
field_clean, back_clean = rid_zeros(backscatter, field_data)
num_vals = len(field_clean)
x = field_clean[:, row:row + 1]
y = 10 * log10(back_clean)

# Fit y = m*x + c by appending a column of ones to x
A = hstack([x, ones((num_vals, 1))])
soln = lstsq(A, y, rcond=None)
m, c = soln[0]
residues = soln[1]

print(residues)
A: 

See http://en.wikipedia.org/wiki/Coefficient_of_determination

Your R^2 value is

1 - residual / sum((y - y.mean())**2) 

which is equivalent to

1 - residual / (n * y.var())
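The two expressions are equivalent because np.var defaults to the population variance (ddof=0), so n * y.var() is exactly the total sum of squares about the mean. A quick check on some made-up numbers:

```python
import numpy as np

y = np.array([2.0, 4.0, 5.0, 4.0, 5.0])
n = y.size

# Total sum of squares about the mean
ss_tot = np.sum((y - y.mean()) ** 2)

# np.var uses ddof=0 by default, so n * var is the same quantity
print(ss_tot, n * y.var())  # both 6.0
```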

As an example:

import numpy as np

# Make some data...
n = 10
x = np.arange(n)
y = 3 * x + 5 + np.random.random(n)

# Note that polyfit is an easier way to do this...
# It would just be "model, resid = np.polyfit(x, y, 1, full=True)[:2]"
A = np.vstack((x, np.ones(n))).T
model, resid = np.linalg.lstsq(A, y, rcond=None)[:2]

# resid is the sum of squared residuals (a length-1 array here)
r2 = 1 - resid / (y.size * y.var())
print(r2)
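To back up the comment that polyfit is the easier route: np.polyfit with full=True returns the coefficients and the same sum-of-squares residual, so both paths give the same R^2. A small sketch, assuming the same synthetic data as above:

```python
import numpy as np

np.random.seed(0)
n = 10
x = np.arange(n)
y = 3 * x + 5 + np.random.random(n)

# lstsq route: design matrix with a column of ones for the intercept
A = np.vstack((x, np.ones(n))).T
model, resid = np.linalg.lstsq(A, y, rcond=None)[:2]

# polyfit route: full=True also returns the sum of squared residuals
coeffs, resid_pf = np.polyfit(x, y, 1, full=True)[:2]

# Coefficients come out in the same order (slope, intercept)
print(np.allclose(model, coeffs), np.allclose(resid, resid_pf))
```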
Joe Kington