## Thursday, April 25, 2013

### Orthogonal Polynomials for Polynomial Regression

If one can find a set of orthogonal polynomials for which the function correlation matrix is diagonal, solving for the coefficients of a fit becomes much easier. We represent the coefficients of the unknown polynomials by a set of column vectors A<k> and generate the polynomials one at a time: the correlation of each new polynomial with every previously found polynomial is required to be zero. That gives one fewer equation than unknowns, so we close the system by arbitrarily setting the constant term equal to one and solving for the remaining coefficients. The procedure is essentially the Gram-Schmidt orthogonalization process. The function X2A below carries this out; the V<p> are the resulting set of orthogonal polynomials.
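The original X2A worksheet is not reproduced here, so the following is a hedged NumPy sketch of the construction just described; the function name `x2a`, the uniform sample points on [0, 1], and the Gram-Schmidt-style implementation are assumptions standing in for the original. Each new monomial has its correlation with the earlier polynomials removed, and the constant term is then fixed at one.

```python
import numpy as np

def x2a(x, degree):
    """Build polynomials orthogonal over the sample points x.

    Returns A (columns A<k> = polynomial coefficients, constant term
    fixed at one) and V (columns V<p> = the polynomials evaluated at x).
    """
    n = len(x)
    X = np.vander(x, degree + 1, increasing=True)  # monomials 1, x, x^2, ...
    A = np.zeros((degree + 1, degree + 1))
    V = np.zeros((n, degree + 1))
    for k in range(degree + 1):
        a = np.zeros(degree + 1)
        a[k] = 1.0                                 # start from x^k
        for j in range(k):                         # remove the correlation
            v = X @ a                              # with each earlier polynomial
            a -= ((V[:, j] @ v) / (V[:, j] @ V[:, j])) * A[:, j]
        a /= a[0]                                  # set the constant term to one
        A[:, k] = a
        V[:, k] = X @ a
    return A, V

A, V = x2a(np.linspace(0.0, 1.0, 1001), 4)
```

With 1001 uniform points the degree-1 column of A comes out as (1, −2), i.e. the polynomial 1 − 2x, and the higher columns are close to the continuous result.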

If one examines the resulting coefficients, one can see that they approach something closely related to the shifted Legendre polynomials. Note how the new polynomials correlate with one another: the off-diagonal terms vanish, and the diagonal terms are approximately 1/(2p+1).

The coefficient needed to fit a function Y by each polynomial of the orthogonal set is just c = V^T Y / (V^T V). One can then use the matrix A of polynomial coefficients to transform the fit coefficients into power series coefficients, as can be seen using the coefficients found for the fit of the exponential function.
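A hedged NumPy sketch of the whole procedure for the exponential example (the sample points, fit degree, and Gram-Schmidt-style construction are assumptions standing in for the original worksheet): project Y = e^x onto each orthogonal polynomial, then map back to power series coefficients with b = A c.

```python
import numpy as np

n, degree = 1001, 4
x = np.linspace(0.0, 1.0, n)
X = np.vander(x, degree + 1, increasing=True)    # monomials 1, x, x^2, ...

# Build the orthogonal set, constant terms fixed at one, as described above.
A = np.zeros((degree + 1, degree + 1))
V = np.zeros((n, degree + 1))
for k in range(degree + 1):
    a = np.zeros(degree + 1)
    a[k] = 1.0
    for j in range(k):
        v = X @ a
        a -= ((V[:, j] @ v) / (V[:, j] @ V[:, j])) * A[:, j]
    a /= a[0]
    A[:, k] = a
    V[:, k] = X @ a

Y = np.exp(x)
c = (V.T @ Y) / np.einsum('ip,ip->p', V, V)      # c_p = V_p.Y / V_p.V_p
b = A @ c                                        # power series coefficients
print(b)   # low-order terms land near the Taylor series 1, 1, 1/2, ...
```

Because the V<p> are orthogonal, each c_p is a simple ratio of dot products rather than the solution of a coupled normal-equation system, and the fit X b reproduces e^x to high accuracy on [0, 1].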