Polynomial curve fitting
p = polyfit(x,y,n)
[p,S] = polyfit(x,y,n)
[p,S,mu] = polyfit(x,y,n)
x : a nx-by-mx matrix of doubles
y : a nx-by-mx matrix of doubles
n : a 1-by-1 matrix of doubles, integer value, n>=0
p : a 1-by-(n+1) matrix of doubles, the coefficients of the polynomial, with powers in decreasing order
S : a structure used to estimate the errors of the evaluations (see below)
mu : a 1-by-2 matrix of doubles, the standardizing parameters: mu(1) is mean(x) and mu(2) is stdev(x). Use mu to standardize the data, which gives a more accurate evaluation of p.
polyfit(x,y,n) finds the coefficients of a polynomial p(x) of degree n that fits the data p(x(i)) to y(i) in a least-squares sense. These coefficients are ordered with powers in decreasing order:

p(x) = p(1)*x^n + p(2)*x^(n-1) + ... + p(n)*x + p(n+1)

The implementation is based on the QR decomposition of the Vandermonde matrix of x.
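As a minimal sketch of this computation (an illustration only, not the actual implementation: the names V and pref below are introduced just for this example), the same coefficients can be obtained by a least-squares solve on the Vandermonde matrix of x, with powers in decreasing order:

// Fit sin(x) with a degree 3 polynomial
x = linspace(0, %pi, 10)';
y = sin(x);
n = 3;
p = polyfit(x, y, n)
// Equivalent least-squares solve on the Vandermonde matrix
// (powers in decreasing order); backslash returns a
// least-squares solution of the overdetermined system.
V = [x.^3, x.^2, x, ones(x)];
pref = (V \ y)'
// pref should match p up to rounding errors.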
[p,S] = polyfit(x,y,n) returns the polynomial coefficients p and a data structure S for use with polyval to produce error estimates on predictions; a sketch of this use is given after the list of fields below.
The following are the fields of S.
S.R : Cholesky factors from the QR decomposition
S.df : the number of degrees of freedom : nx-(n+1)
S.norm : the 2-norm of the residuals
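The fields of S can be used to reconstruct standard error estimates by hand. The following is a minimal sketch based on the classical least-squares error formulas; it assumes that S.R is the triangular factor of the QR decomposition of the Vandermonde matrix of x, and the names V, sigma and delta are introduced only for this illustration.

// Fit noisy data with a degree 2 polynomial and keep S
x = linspace(0, 1, 20)';
y = 1 + 2*x + 3*x.^2 + 0.05*grand(20, 1, "nor", 0, 1);
[p, S] = polyfit(x, y, 2);
// Estimated standard deviation of the residuals
sigma = S.norm / sqrt(S.df);
// Vandermonde matrix of x, powers in decreasing order
V = [x.^2, x, ones(x)];
// Pointwise prediction error (assumes S.R is the triangular
// factor of the QR decomposition of V)
E = V / S.R;
delta = sqrt(1 + sum(E.^2, "c")) * sigma;
// Approximate one-sigma band around the fitted values
yfit = polyval(p, x);
disp([yfit - delta, yfit + delta]);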
The computation of the coefficients p can be difficult when the entries of x have very different orders of magnitude. In order to get more accurate results, we can use the mu output argument. This has the effect of computing the standardized variable z defined by

z = (x - mu(1)) / mu(2)

where mu(1) = mean(x) is the mean and mu(2) = stdev(x) is the standard deviation.
This reduces the condition number of the associated Vandermonde matrix. In this case, the mu argument must be passed to the polyval function in order to evaluate the polynomial: this allows polyval to evaluate the polynomial as a function of z.
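A minimal sketch of why this helps (the names V, Vz and z below are introduced only for this illustration): compare the condition number of the Vandermonde matrix built from badly scaled data with the one built from the standardized data.

// Badly scaled abscissae, as in the example below
x = [-1e15; 1e6; 1e15];
// Vandermonde matrix of the raw data, degree 2
V = [x.^2, x, ones(x)];
// Standardized variable: zero mean, unit standard deviation
z = (x - mean(x)) / stdev(x);
Vz = [z.^2, z, ones(z)];
// The standardized matrix has a much smaller condition number
disp([cond(V), cond(Vz)]);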
x = linspace(0,%pi,10)';
y = sin(x)
p = polyfit(x,y,3)
// Evaluate :
f = polyval(p,x)
// Compare with a table
disp([x y f abs(f-y)])
// Compare with a polynomial
ff = p(1)*x.^3+p(2)*x.^2+p(3)*x+p(4)
// Compare with a graphics
// Notice that the fit is good up to x=%pi,
// then the fit is not good anymore.
x = linspace(0,4,100)';
y = sin(x);
f = polyval(p,x);
scf();
plot(x,y,"r-")
plot(x,f,"b-")
plot([%pi %pi],[-1 1],"g-")
legend(["Sin","Polynomial","Limit"],"in_lower_left");
xtitle("Degree 3 least squares polynomial","X","F(x)")

// A difficult example : scaling is necessary.
x = [-1e15 1.e6 1e15];
y = [-12. 0 12.];
p = polyfit(x,y,2) // A warning is generated
[p,S,mu] = polyfit(x,y,2) // No problem
// Evaluate :
y2 = polyval(p,x,S,mu)

// Source [2]
// Caution : the urban definition changes in 1950
cdate = (1790:10:1990)';
pop = [
3.929214
5.308483
7.239881
9.638453
12.860702
17.063353
23.191876
31.443321
38.558371
50.189209
62.979766
76.212168
92.228496
106.02154
123.20262
132.16457
151.3258
179.32317
203.30203
226.5422
248.70987
];
scf();
// Plot the data
plot(cdate,pop,"+")
// Calculate fit parameters
[p,S] = polyfit(cdate,pop,2);
// Evaluate the fit
pop_fit = polyval(p,cdate,S);
// Plot the fit
plot(cdate,pop_fit,"g-")
// Annotate the plot
legend(["Data","Polynomial Model"],"in_upper_left");
xtitle("","Census Year","Population (millions)");
[1] http://en.wikipedia.org/wiki/Polynomial_interpolation
[2] USA population : Population: 1790 to 1990, http://www.census.gov/population/censusdata/table-4.pdf
[3] http://www.mathworks.fr/fr/help/matlab/data_analysis/programmatic-fitting.html
[4] Numerical Computing with MATLAB, Cleve Moler, 2004