Interpolation Polynomials and Linear Regression

Date: 10/09/98 at 08:37:26
From: Stephane
Subject: Interpolation polynomial

Dear Dr. Math,

I am working on a temperature-compensated crystal oscillator, and I
need to find the polynomial of a temperature curve. Samples from the
curve will be taken within a certain temperature range, and there
will be fewer than 40 of them.

I have read some articles about different techniques, and the
Lagrange interpolation technique seemed to be the simplest. However,
it is too heavy to compute with that many samples (40 samples would
give a 39th-degree polynomial). On the other hand, cubic splines
would result in a series of 39 polynomials, one for each span
(between two samples).

Can you tell me if there is another technique, easy to compute, that
could give me a good approximation with only a 6th- or 7th-degree
polynomial?

Thank you in advance,
Stephane


Date: 10/09/98 at 13:05:59
From: Doctor Rob
Subject: Re: Interpolation polynomial

Hi Stephane,

Yes, there is such a technique. It is called Least Squares Fitting,
or Linear Regression.

Pick the degree of the polynomial you want. Say it is degree d. Then
the function you are using to approximate has the form:

            d
   f(x) =  Sum  a(k)*x^k
           k=0

For the N sample values (x[i],y[i]), compute f(x[i]). Then you want
to minimize the sum of the squares of the differences f(x[i]) - y[i]
with respect to the d+1 variables a(k), 0 <= k <= d. Form the sum:

                         N
   F[a(0),...,a(d)] =   Sum  [f(x[i]) - y[i]]^2
                        i=1

Now to find the minimum value of F, take the partial derivatives of F
with respect to a(0), ..., a(d), and set them equal to zero:

              N
   F_a(j) =  Sum  2*[f(x[i]) - y[i]]*x[i]^j = 0,   0 <= j <= d
             i=1

Substituting f(x[i]) = Sum a(k)*x[i]^k and rearranging gives:

    d          N                   N
   Sum  a(k)* Sum  x[i]^(k+j)  =  Sum  x[i]^j*y[i],   0 <= j <= d
   k=0        i=1                 i=1

This is a system of d+1 linear equations in d+1 unknowns, which can
easily be solved for the d+1 variables a(k), 0 <= k <= d. Those will
be the coefficients of your function f(x).

That f(x) approximates your points best among all polynomials of
degree d, in the sense that the sum of the squares of the errors over
all the points is least. You can judge how good the fit is by
evaluating F at that minimum. If the fit isn't good enough, raise the
degree and do it again. The root-mean-square deviation is the square
root of F_min/N, and is a fair measure of how much each y[i] deviates
from its approximated value f(x[i]) on average.

Example: Suppose you had the N = 5 points:

   (x[1],y[1]) = (0,0)
   (x[2],y[2]) = (1,1)
   (x[3],y[3]) = (3,2)
   (x[4],y[4]) = (5,0)
   (x[5],y[5]) = (6,-2)

and you wanted to approximate y as a quadratic function of x (d = 2).
Then:

   f(x) = a(2)*x^2 + a(1)*x + a(0)

The system of equations to solve is:

   a(0)*Sum 1      + a(1)*Sum x[i]   + a(2)*Sum x[i]^2 = Sum y[i]
   a(0)*Sum x[i]   + a(1)*Sum x[i]^2 + a(2)*Sum x[i]^3 = Sum y[i]*x[i]
   a(0)*Sum x[i]^2 + a(1)*Sum x[i]^3 + a(2)*Sum x[i]^4 = Sum y[i]*x[i]^2

where the sums run over 1 <= i <= 5. Computing the sums gives:

    5*a(0) +  15*a(1) +   71*a(2) =   1
   15*a(0) +  71*a(1) +  369*a(2) =  -5
   71*a(0) + 369*a(1) + 2003*a(2) = -53

whose solution is:

   a(0) = -75/637,   a(1) = 1052/637,   a(2) = -16/49

Then the function

   f(x) = (-208*x^2 + 1052*x - 75)/637

approximates these five points best, and the sum of the squared
differences is:

   F_min = 44/637 = 0.06907

so the fit is quite good, with root-mean-square deviation
sqrt(0.06907/5) = 0.1175.

- Doctor Rob, The Math Forum
  http://mathforum.org/dr.math/
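
If you want to try this numerically, here is a minimal sketch in
Python of the normal-equations approach described above. It assumes
NumPy is available, and the function name fit_poly is illustrative
rather than anything standard:

    import numpy as np

    def fit_poly(x, y, d):
        # Least-squares fit of a degree-d polynomial to the points
        # (x[i], y[i]), via the normal equations derived above:
        #   Sum_k a(k)*Sum_i x[i]^(k+j) = Sum_i x[i]^j*y[i], 0 <= j <= d
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        # V[i][k] = x[i]^k, so (V^T V)[j][k] = Sum_i x[i]^(j+k)
        V = x[:, None] ** np.arange(d + 1)
        M = V.T @ V                  # (d+1) x (d+1) matrix of power sums
        b = V.T @ y                  # right-hand sides Sum_i x[i]^j*y[i]
        return np.linalg.solve(M, b) # coefficients a(0), ..., a(d)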
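Applied to the five points of the worked example, the sketch
reproduces the numbers above:

    x = [0, 1, 3, 5, 6]
    y = [0, 1, 2, 0, -2]
    a = fit_poly(x, y, d=2)
    print(a)   # approx [-0.1177, 1.6515, -0.3265]
               # i.e. [-75/637, 1052/637, -16/49]

    # Goodness of fit: F at the minimum, and the RMS deviation
    V = np.asarray(x, dtype=float)[:, None] ** np.arange(3)
    F_min = np.sum((V @ a - np.asarray(y, dtype=float)) ** 2)
    print(F_min, np.sqrt(F_min / len(x)))   # approx 0.06907 and 0.1175

One caveat: for high degrees the matrix of power sums becomes badly
conditioned, so a library routine such as numpy.polyfit, which solves
the same least-squares problem by a more numerically stable method,
is generally preferable in production code.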