An OLS fit minimizes the sum of the squares of the vertical deviations from the line. A plot of these deviations shows that they are larger near the center of the distribution and smaller at greater distances from the center along the x-axis. Evidently OLS favors reducing the deviations of points at extreme x distances from the center of the distribution. If one computes the moment corresponding to the sum of the "weights" δy at distance Δx from the center of the distribution, one finds that it is zero. If one exchanges the roles of x and y, however, this moment is quite large, since the δxs were ignored by the fit. One also finds that the moment about the origin is zero, while the moment of the deviations normal to the fitted line, taken at distances along the line, is quite large.
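As a rough sketch of these moment checks, assuming a small synthetic data set (the data, the seed, and the variable names below are illustrative, not from the original analysis), one can verify numerically that the Δx-weighted moment of the vertical deviations vanishes for an OLS fit, while the corresponding moment of the horizontal deviations does not:

```python
import numpy as np

# Hypothetical synthetic data with scatter in both x and y.
rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 50) + rng.normal(0, 0.5, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 50)

# Ordinary least squares fit y = a*x + b (minimizes vertical deviations dy).
a, b = np.polyfit(x, y, 1)
dy = y - (a * x + b)          # vertical deviations from the fitted line

# Moment of the vertical deviations: sum of the "weights" dy at
# horizontal distance dx from the center of the distribution.
dx = x - x.mean()
print(np.sum(dx * dy))        # ~0: OLS makes this moment vanish

# Exchange the roles of x and y: the horizontal deviations dxdev were
# never minimized, so the corresponding moment is far from zero.
dxdev = x - (y - b) / a       # horizontal deviation to the same line
print(np.sum((y - y.mean()) * dxdev))
```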
If one instead does a least squares fit using both the x and y deviations from the line simultaneously, one gets a more balanced fit. This is probably closer to what one would get by drawing a straight line through the data by hand with a straight edge.
This "Balanced Least Squares" method has a moment which is zero for normal deviations along the line using distances from the center of the data. The moments for deviations normal to the x and y axis tend to cancel themselves out resulting in small values which appear to be the same.
It can be shown that the direction of the fitted line is an eigenvector of the matrix formed by multiplying the matrix of deviations Δr of the data points from the center of the distribution by its transpose, and that the fitted line passes through the center of the data points. An eigenvector, e, of a matrix, M, has the property that M e = m e; that is, multiplying the matrix by one of its eigenvectors produces a vector with the same direction as the original eigenvector, although its magnitude may change.
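A minimal sketch of this eigenvector construction, again assuming the same kind of illustrative synthetic data as above, forms the 2×2 matrix Δr Δrᵀ, takes its dominant eigenvector as the line direction, and checks that the moment of the normal deviations along the line is zero:

```python
import numpy as np

# Hypothetical data with scatter in both x and y.
rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 50) + rng.normal(0, 0.5, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 50)

# The balanced fit line passes through the center of the data points.
center = np.array([x.mean(), y.mean()])

# Matrix of deviations dr from the center, one column per data point.
dr = np.vstack([x - center[0], y - center[1]])   # shape (2, N)

# M = dr @ dr.T is 2x2; the fitted direction is its eigenvector with the
# largest eigenvalue, and the normal to the line is the other eigenvector.
M = dr @ dr.T
eigvals, eigvecs = np.linalg.eigh(M)
direction = eigvecs[:, np.argmax(eigvals)]       # unit vector along the line
normal = eigvecs[:, np.argmin(eigvals)]          # unit vector normal to the line

# Balancing property: the moment of the normal deviations, weighted by the
# signed distance along the line from the center, vanishes.
along = direction @ dr        # distances along the line from the center
perp = normal @ dr            # normal deviations from the line
print(np.sum(along * perp))   # ~0

print("balanced slope:", direction[1] / direction[0])
```

The zero moment here is exact up to rounding, since directionᵀ (Δr Δrᵀ) normal is a product of orthogonal eigenvectors of the same matrix.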