Saturday, April 20, 2013

The Least Squares Polynomial Regression Formula

   My first encounter with linear fits was the simple formula for a line fit in the Probability and Statistics section of the CRC Standard Mathematical Tables, 16th Edition, 1968, p. 532, where two equations for the coefficients, involving sums and products of the data, are given without proof. A more general formula there fits a polynomial function, and it is the same formula used in polynomial regression. One often wonders where these formulas come from, and it can be shown that they are least squares fits for the coefficients of the polynomial. The equations simplify if one collects the data into vectors, reducing everything to a single matrix equation, and knowing the derivation helps in understanding what one is doing when following the procedure. Here x and y are the data points, δ is the difference between an observed value of y and its calculated value, and V is the variance, which is minimized. The procedure is known as the Method of Least Squares.
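  The minimization described above can be sketched as follows (a standard least squares derivation, with the coefficient symbols a_j and the polynomial degree n introduced here for illustration):

\[
\delta_k = y_k - \sum_{j=0}^{n} a_j x_k^{\,j}, \qquad
V = \sum_{k=1}^{N} \delta_k^{\,2}.
\]

Setting \(\partial V / \partial a_i = 0\) for each \(i\) gives the normal equations,

\[
\sum_{j=0}^{n} \left( \sum_{k=1}^{N} x_k^{\,i+j} \right) a_j \;=\; \sum_{k=1}^{N} x_k^{\,i}\, y_k ,
\]

which is the matrix equation X a = Y discussed below.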

  The matrix X contains information on how the set of vectors containing the powers of the x_k correlate with one another, and Y contains information on how they correlate with the vector of the y_k values.
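  As a concrete illustration, here is a short sketch of the procedure in Python with NumPy (the function name and test data are my own, not from the post): it builds X from sums of powers of the x_k, builds Y from sums of powers of x_k times y_k, and solves the matrix equation for the coefficients.

```python
import numpy as np

def polyfit_normal_equations(x, y, degree):
    """Least squares polynomial fit via the normal equations.

    X[i][j] = sum over k of x_k**(i + j)   (correlations of the power vectors)
    Y[i]    = sum over k of x_k**i * y_k   (correlations with the y values)
    Solves X a = Y; returns coefficients a, constant term first.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = degree + 1
    X = np.array([[np.sum(x ** (i + j)) for j in range(n)] for i in range(n)])
    Y = np.array([np.sum(x ** i * y) for i in range(n)])
    return np.linalg.solve(X, Y)

# Recover y = 1 + 2x + 3x^2 from exact samples.
x = np.linspace(-1.0, 1.0, 7)
y = 1 + 2 * x + 3 * x ** 2
a = polyfit_normal_equations(x, y, 2)
print(a)  # approximately [1, 2, 3]
```

In practice one would usually call a library routine such as numpy.polyfit, which uses a numerically more stable factorization, but the normal equations above are exactly the formulas the CRC tables give.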
