One can use the method of least squares to derive vector formulas for best fits. For a linear fit, define the deviation δ_k as the vertical distance of data point k from the line, and the variance V as the sum of the squares of the deviations. The formula for the best slope through a set of data going through a particular point is derived as shown below. The best slope is taken to be the one for which the sum of the squares of the deviations, the variance, is a minimum. From the theory of maxima and minima in calculus, the derivative of the variance with respect to the slope is zero at the minimum.
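A sketch of the derivation, reconstructed from the definitions in this post, where (t_p, ΔT_p) is the chosen point the line passes through, s is the slope, and Σ is the vector whose components are all 1:

```latex
\begin{align}
\delta_k &= \Delta T_k - \Delta T_p - s\,\Delta t_k,
  \qquad \Delta t_k = t_k - t_p \\
V &= \sum_k \delta_k^2 \\
\frac{dV}{ds} &= -2 \sum_k \Delta t_k
  \left( \Delta T_k - \Delta T_p - s\,\Delta t_k \right) = 0 \\
s &= \frac{\sum_k \Delta t_k \,(\Delta T_k - \Delta T_p)}
          {\sum_k \Delta t_k^{\,2}}
  = \frac{\Delta t \cdot (\Delta T - \Delta T_p\,\Sigma)}
         {\Delta t \cdot \Delta t}
\end{align}
```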

In the above, the chosen point that the line goes through is (t_p, ΔT_p). The equation for the line is ΔT(Δt) = ΔT_p + s Δt, where Δt = t - t_p. A vector Σ whose components are all 1 is needed to include the scalars t_p and ΔT_p in the vectors defining δ and Δt.
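The vector formula for the best slope through a chosen point can be sketched in NumPy as follows (the data values and the chosen point here are hypothetical, purely for illustration):

```python
import numpy as np

# Hypothetical data (not from the post)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
dT = np.array([0.1, 1.9, 4.2, 5.8, 8.1])

# Chosen point (t_p, dT_p) that the line must pass through
t_p, dT_p = 2.0, 4.0

ones = np.ones_like(t)       # the all-ones vector (called Sigma in the text)
dt = t - t_p * ones          # delta-t vector: t - t_p
dev = dT - dT_p * ones       # data offsets from the chosen point

# Best slope: s = dt . (dT - dT_p*Sigma) / (dt . dt)
s = np.dot(dt, dev) / np.dot(dt, dt)
print(s)
```

The all-ones vector does the bookkeeping of subtracting the scalars t_p and ΔT_p from every component at once, which is what lets the whole formula be written as two dot products.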

One can do something similar to find vector formulas for the best straight line through a set of data.
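For the unconstrained line ΔT = a + s t, setting the derivatives of the variance with respect to both s and a to zero gives formulas that can again be written entirely as dot products with the all-ones vector. A minimal sketch, with hypothetical data, cross-checked against NumPy's own least-squares polynomial fit:

```python
import numpy as np

# Hypothetical data (not from the post)
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
dT = np.array([0.5, 2.4, 4.6, 6.4, 8.6])

ones = np.ones_like(t)   # the all-ones vector (Sigma in the text)
n = np.dot(ones, ones)   # Sigma . Sigma = number of points
st = np.dot(ones, t)     # Sigma . t   = sum of the t values
sT = np.dot(ones, dT)    # Sigma . dT  = sum of the dT values

# Normal-equation solutions written as dot products
s = (n * np.dot(t, dT) - st * sT) / (n * np.dot(t, t) - st**2)
a = (sT - s * st) / n

# Cross-check against NumPy's built-in least-squares fit
s_ref, a_ref = np.polyfit(t, dT, 1)
```

Every sum in the classical least-squares formulas becomes a dot product involving Σ, t, or ΔT, which is the vector form the post is after.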

Merry Christmas.