Tuesday, February 9, 2016

What Does Linear Least Squares Do?


  The equations for the linear least squares coefficients are derived from the assumption that the sum of the squares of the errors is a minimum. This need not hold for the true errors, as the following example shows. The result is slightly biased, with the fitted line (dotted) lying slightly oblique to the true line (solid).
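A small numerical sketch of this effect (the data and error values here are illustrative assumptions, not the ones from the post's figure): points are taken on a known true line with fixed errors added, and the least squares fit comes out with a slightly different slope and intercept.

```python
import numpy as np

# Hypothetical data: points on a "true" line y = 1 + 2x with a fixed
# set of assumed errors delta added (illustrative values only).
x = np.arange(10, dtype=float)
delta = np.array([0.4, -0.2, 0.1, 0.3, -0.5, 0.2, -0.1, 0.4, -0.3, 0.1])
y = 1.0 + 2.0 * x + delta          # observed points

# Least squares fit: choose the line minimizing the sum of squared errors.
b_fit, a_fit = np.polyfit(x, y, 1)  # returns slope, then intercept

# The fitted line is slightly oblique to the true line: the slope and
# intercept differ a little from the true values 2 and 1.
print(a_fit, b_fit)
```

Because the assumed errors are not exactly balanced over the data, the fit trades a tilt of the line for a smaller sum of squares, which is the bias described above.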


A calculation shows that for the fit the 1st moment of the errors, δ'', is zero, while this does not hold for the assumed errors, δ. This result is quite general and can be proven from the equations for the fit.
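This identity can be checked numerically. Continuing with the same illustrative data as above, the first moment Σ xᵢδᵢ of the fit errors δ'' vanishes (it is one of the normal equations), while the first moment of the assumed errors δ generally does not:

```python
import numpy as np

# Same illustrative setup as before: true line plus assumed errors delta.
x = np.arange(10, dtype=float)
delta = np.array([0.4, -0.2, 0.1, 0.3, -0.5, 0.2, -0.1, 0.4, -0.3, 0.1])
y = 1.0 + 2.0 * x + delta

b_fit, a_fit = np.polyfit(x, y, 1)
resid = y - (a_fit + b_fit * x)     # the fit errors, delta''

# First moment of the fit errors is zero (normal-equation identity);
# the first moment of the assumed errors is generally nonzero.
print(np.dot(x, resid))
print(np.dot(x, delta))
```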


One can also show, starting from the first of the fit equations, the unexpected result that the average of the fit errors is also zero.
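Checking this with the same illustrative data: the mean of the fit errors is exactly zero (to rounding) whenever the model includes a constant term, since the first normal equation forces the residuals to sum to zero, while the assumed errors need not average to zero.

```python
import numpy as np

# Same illustrative setup: true line plus assumed errors delta.
x = np.arange(10, dtype=float)
delta = np.array([0.4, -0.2, 0.1, 0.3, -0.5, 0.2, -0.1, 0.4, -0.3, 0.1])
y = 1.0 + 2.0 * x + delta

b_fit, a_fit = np.polyfit(x, y, 1)
resid = y - (a_fit + b_fit * x)     # the fit errors, delta''

# The fit errors average to zero; the assumed errors need not.
print(resid.mean())
print(delta.mean())
```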


Both conclusions follow from the original assumption that the sum of the squares of the errors is a minimum, and so may be in conflict with the true values. For random errors the conclusions are approximately true, and they do not conflict with the use of linear least squares to estimate the coefficients.
