I've spent a decent amount of time trying to hunt down a simple way of doing this - ideally, a magical library exists out there somewhere that will take my set of 3D data points and return 2 points on the best fit line using either orthogonal regression or least squares and also return the error of the fitted line. Does such a thing exist, and if so, where?
It's easy enough to do it if you know the trick: http://www.scribd.com/doc/21983425/Least-Squares-Fit
More dimensions means more coefficients, but they're easy enough to add in. The ideas are all the same.
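I don't know exactly which form the linked write-up uses, but a minimal sketch of the usual version of the trick, treating x as the independent variable and fitting y and z against it with NumPy (the data in `pts` is made up for illustration):

    import numpy as np

    # Hypothetical sample data: an (n, 3) array of x, y, z points.
    pts = np.array([[0.0, 1.0, 2.1],
                    [1.0, 3.1, 3.9],
                    [2.0, 4.9, 6.0],
                    [3.0, 7.2, 8.1]])
    x = pts[:, 0]

    # Design matrix [1, x]; solve y = a0 + a1*x and z = b0 + b1*x in one call.
    M = np.column_stack([np.ones_like(x), x])
    coeffs, residuals, rank, _ = np.linalg.lstsq(M, pts[:, 1:], rcond=None)

    # Two points on the fitted line, evaluated at the extremes of x.
    def point_on_line(xv):
        yz = coeffs[0] + coeffs[1] * xv
        return np.array([xv, yz[0], yz[1]])

    p0, p1 = point_on_line(x.min()), point_on_line(x.max())
    # `residuals` holds the summed squared errors for y and z respectively.

Note this is ordinary least squares, which minimizes vertical distances in y and z; for orthogonal (total) distances, see the eigenvector/SVD approach below.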
This is easy enough to do, but to write it yourself you will need an eigenvalue solver or a singular value decomposition. Create the nx3 matrix A whose columns are your centered data, (x-xbar, y-ybar, z-zbar). Save those column means for later; I'll call them V0 = [xbar, ybar, zbar].
Now, compute the eigenvalues and eigenvectors of A'*A, i.e., the 3x3 matrix formed from A transpose multiplied by A.
If the data lies near a line in R^3, then one of those eigenvalues will be significantly larger than the other two. If that is not the case, the orthogonal regression line is not well determined.
Let V be the eigenvector associated with the largest eigenvalue of A'*A. Then the orthogonal regression line is defined as

L(t) = V0 + t*V
Any point on that line can be given by some value of the parameter t.
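Not a library as such, but here is a minimal NumPy sketch of the steps above (the array `pts` is made-up example data):

    import numpy as np

    # Hypothetical example data: an (n, 3) array of x, y, z points.
    pts = np.array([[0.0, 1.0, 2.1],
                    [1.0, 3.1, 3.9],
                    [2.0, 4.9, 6.0],
                    [3.0, 7.2, 8.1]])

    V0 = pts.mean(axis=0)        # column means [xbar, ybar, zbar]
    A = pts - V0                 # centered nx3 matrix

    # Eigendecomposition of the symmetric 3x3 matrix A'*A;
    # eigh returns eigenvalues in ascending order.
    evals, evecs = np.linalg.eigh(A.T @ A)

    # The fit is well determined when evals[-1] dominates evals[-2].
    V = evecs[:, -1]             # eigenvector for the largest eigenvalue

    def line(t):
        # Orthogonal regression line L(t) = V0 + t*V.
        return V0 + t * V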
Alternatively, compute the singular value decomposition of A and take the right singular vector corresponding to the largest singular value. This avoids forming A'*A explicitly and is numerically better behaved.
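The same direction via the SVD, continuing with the centered matrix A from the sketch above:

    # Rows of Vt are the right singular vectors, ordered by descending
    # singular value, so the first row is the line direction.
    _, s, Vt = np.linalg.svd(A, full_matrices=False)
    V = Vt[0]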
In either event, if you wish to compute the errors for the data points, the error for each point is simply its orthogonal distance to the fitted line.
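Continuing with A and V from the sketches above (V has unit length, as returned by eigh or svd), the per-point errors could be computed like this:

    # Project each centered point onto the line direction, subtract that
    # component, and the norm of what remains is the orthogonal distance.
    t = A @ V                    # parameter of each point's projection onto the line
    perp = A - np.outer(t, V)    # perpendicular component of each point
    errors = np.linalg.norm(perp, axis=1)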