The Math Forum

Ask Dr. Math - Questions and Answers from our Archives

Interpolation Polynomials and Linear Regression

Date: 10/09/98 at 08:37:26
From: Stephane
Subject: Interpolation polynomial

Dear Dr. Math,

I am working on a temperature-compensated crystal oscillator, and I 
need to find a polynomial that fits its temperature curve.

Samples shall be taken from the curve within a certain temperature 
range, and there shall be fewer than 40 samples.

I read some articles about different techniques that can be used, and 
the Lagrange interpolation technique seemed to be the simplest. 
However, it is too costly to compute with that many samples (40 
samples would give a polynomial of degree 39). On the other hand, 
cubic splines would result in a series of 39 polynomials, one for each 
span (between two consecutive samples).

Can you tell me if there is another technique, easy to compute, that
could give me a good approximation with only a 6th or 7th order 
polynomial?
Thank you in advance,
Stephane

Date: 10/09/98 at 13:05:59
From: Doctor Rob
Subject: Re: Interpolation polynomial

Hi Stephane,

Yes, there is such a technique. It is called Least Squares Fitting, or
Linear Regression.

Pick the degree of the polynomial you want. Say it is degree d. Then 
the function you are using to approximate has the form:

   f(x) = Sum a(k)*x^k,  0 <= k <= d

For the N sample values (x[i],y[i]), compute f(x[i]). Then you want to
minimize the sum of the squares of the differences f(x[i]) - y[i] with
respect to the d+1 variables a(k), 0 <= k <= d. Form the sum:

   F[a(0),...,a(d)] = Sum [f(x[i])-y[i]]^2

Now to find the minimum value of F, take the partial derivatives of F 
with respect to a(0), ..., a(d), and set them equal to zero:

   F_a(j) = Sum 2*[f(x[i])-y[i]]*x[i]^j = 0,  0 <= j <= d

Dividing by 2 and rearranging gives the normal equations:

    d        N                N
   Sum a(k)*Sum x[i]^(k+j) = Sum x[i]^j*y[i],  0 <= j <= d
   k=0      i=1              i=1

This is a system of d+1 linear equations in d+1 unknowns which can 
easily be solved for the d+1 variables a(k), 0 <= k <= d. Those will 
be the coefficients of your function f(x). That f(x) approximates your 
points best among all polynomials of degree d, in the sense that the 
sum of the squares of the errors over all the points is least. You can 
judge how good the fit is by evaluating F at that minimum. If the fit 
isn't good enough, raise the degree and do it again. The root-mean-
square deviation is the square root of F_min/N, and is a fair measure 
of how much each y[i] deviates from its approximated value f(x[i]) on 
average.
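
As a sketch, the procedure above might be coded as follows in plain 
Python (the function name and the Gaussian-elimination solver are my 
own choices for illustration, not part of the original answer):

```python
def polyfit_normal_equations(xs, ys, d):
    """Fit a degree-d polynomial to the points (xs[i], ys[i]) by least squares.

    Builds the (d+1) x (d+1) normal equations
        Sum_k a(k) * Sum_i x[i]^(k+j) = Sum_i x[i]^j * y[i],  0 <= j <= d
    and solves them by Gaussian elimination with partial pivoting.
    Returns the coefficients [a(0), ..., a(d)].
    """
    n = d + 1
    # Power sums Sum_i x[i]^p for p = 0 .. 2d (each appears in several cells)
    s = [sum(x**p for x in xs) for p in range(2 * d + 1)]
    # Right-hand side: Sum_i x[i]^j * y[i]
    b = [sum((x**j) * y for x, y in zip(xs, ys)) for j in range(n)]
    # Coefficient matrix: entry (j, k) is Sum_i x[i]^(k+j)
    A = [[s[k + j] for k in range(n)] for j in range(n)]

    # Forward elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            m = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= m * A[col][c]
            b[r] -= m * b[col]

    # Back-substitution
    a = [0.0] * n
    for r in range(n - 1, -1, -1):
        a[r] = (b[r] - sum(A[r][c] * a[c] for c in range(r + 1, n))) / A[r][r]
    return a
```

For d around 6 or 7 this is tiny compared with a degree-39 Lagrange 
interpolant, though for high degrees the normal equations become 
ill-conditioned and a more careful solver would be preferable.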

Example: Suppose you had the N = 5 points:

   (x[1],y[1]) = (0,0)
   (x[2],y[2]) = (1,1)
   (x[3],y[3]) = (3,2)
   (x[4],y[4]) = (5,0)
   (x[5],y[5]) = (6,-2)

and I wanted to approximate y as a quadratic function of x (d = 2).  

   f(x) = a(2)*x^2 + a(1)*x + a(0)

The system of equations I have to solve is:

 a(0)*Sum 1      + a(1)*Sum x[i]   + a(2)*Sum x[i]^2 = Sum y[i]
 a(0)*Sum x[i]   + a(1)*Sum x[i]^2 + a(2)*Sum x[i]^3 = Sum y[i]*x[i]
 a(0)*Sum x[i]^2 + a(1)*Sum x[i]^3 + a(2)*Sum x[i]^4 = Sum y[i]*x[i]^2

where the sums run over 1 <= i <= 5. Substituting the numerical 
values, the system becomes:

    5*a(0) +  15*a(1) +   71*a(2) =  1
   15*a(0) +  71*a(1) +  369*a(2) = -5
   71*a(0) + 369*a(1) + 2003*a(2) = -53

Solving this system gives:

   a(0) = -75/637,  a(1) = 1052/637,  a(2) = -16/49 = -208/637

Then the function (-208*x^2 + 1052*x - 75)/637 approximates these five
points best, and the sum of the squared differences is:

   44/637 = 0.06907

so the fit is quite good, with root-mean-square deviation 0.1175.
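
To double-check the arithmetic, the fitted quadratic and its residuals 
can be evaluated directly (a small sketch in plain Python):

```python
# Fitted quadratic from the example above: f(x) = (-208*x^2 + 1052*x - 75) / 637
def f(x):
    return (-208 * x**2 + 1052 * x - 75) / 637

pts = [(0, 0), (1, 1), (3, 2), (5, 0), (6, -2)]
F_min = sum((f(x) - y) ** 2 for x, y in pts)  # sum of squared residuals, F at the minimum
rms = (F_min / len(pts)) ** 0.5               # root-mean-square deviation
print(round(F_min, 5), round(rms, 4))         # -> 0.06907 0.1175
```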

- Doctor Rob, The Math Forum   
Associated Topics:
High School Physics/Chemistry
High School Statistics

© 1994- The Math Forum at NCTM. All rights reserved.