Polynomial regression: Difference between revisions
No edit summary
Line 456:
=={{header|R}}==
R has several tools for fitting. The easiest (and most robust) is the base package's ''lm'' function, which finds the least squares solution via a QR decomposition; a generalized nonlinear least squares fit, with the polynomial as the model, is shown afterwards:
<lang R>
x <- c(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
y <- c(1, 6, 17, 34, 57, 86, 121, 162, 209, 262, 321)
coef(lm(y ~ x + I(x^2)))</lang>

'''Output'''

<lang R>
(Intercept)           x      I(x^2)
          1           2           3
</lang>

Alternatively, a generalized nonlinear least squares fit (''gnls'' from the '''nlme''' package) with the polynomial as the model:

<lang R>
library(nlme)  # provides gnls
fitted <- gnls(y ~ c0*x^2 + c1*x + c2, start=list(c0=1, c1=1, c2=0))
print(paste(fitted$coeff[[1]], "*x^2 + ",
            fitted$coeff[[2]], "*x + ",
            fitted$coeff[[3]]))
# get several info about the fitting process
print(summary(fitted))</lang>
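The quadratic model above is hard-coded; for an arbitrary degree, ''poly'' with <tt>raw = TRUE</tt> can build the terms automatically. A small sketch (the helper name <tt>fit_poly</tt> is ours, not part of the task):

<lang R>
# Fit a polynomial of the given degree by least squares.
# poly(x, degree, raw = TRUE) builds the columns x, x^2, ..., x^degree,
# and lm solves the resulting linear least squares problem via QR.
fit_poly <- function(x, y, degree) {
  unname(coef(lm(y ~ poly(x, degree, raw = TRUE))))
}

x <- c(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
y <- c(1, 6, 17, 34, 57, 86, 121, 162, 209, 262, 321)
print(fit_poly(x, y, 2))  # coefficients of 1 + 2*x + 3*x^2
</lang>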

The base <tt>nls</tt> can be used with algorithm "port", since the default Gauss-Newton algorithm has trouble recognizing convergence:

<lang R>nls(y ~ c0*x^2 + c1*x + c2, start=list(c0=1, c1=1, c2=0), trace=TRUE)</lang>

gives

 5.364254e-29 :  3  2  1
 Error in nls(y ~ c0 * x^2 + c1 * x + c2, start = list(c0 = 1, c1 = 1, :
   number of iterations exceeded maximum of 50

Even with the maximum number of iterations increased, it does not finish properly (even though the result, as the trace shows, is actually reached!). Instead,

<lang R>nls(y ~ c0*x^2 + c1*x + c2, start=list(c0=1, c1=1, c2=0), algorithm="port")</lang>

works fine.
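The convergence trouble comes from the data lying exactly on the polynomial: the residual sum of squares at the optimum is essentially zero (the trace value 5.364254e-29 above), and the <tt>nls</tt> documentation itself warns against such "zero-residual" data with the default algorithm. A quick check with ''lm'':

<lang R>
x <- c(0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
y <- c(1, 6, 17, 34, 57, 86, 121, 162, 209, 262, 321)
# The data are an exact quadratic, so the least squares residuals
# vanish up to floating point noise -- which is what trips up the
# default Gauss-Newton convergence test.
rss <- sum(residuals(lm(y ~ x + I(x^2)))^2)
print(rss)  # effectively zero
</lang>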
=={{header|Ruby}}==