[R] need 95% confidence interval bands on cubic extrapolation

2005-12-20 Thread James Salsman
Dear R experts:

I need to get this plot, but also with 95% confidence interval bands:

   hour <- c(1, 2, 3, 4, 5, 6)
   millivolts <- c(3.5, 5, 7.5, 13, 40, 58)

   plot(hour, millivolts, xlim=c(1,10), ylim=c(0,1000))

   pm <- lm(millivolts ~ poly(hour, 3))

   curve(predict(pm, data.frame(hour=x)), add=TRUE)

How can the 95% confidence interval band curves be plotted too?

Sincerely,
James Salsman

P.S.  I know I should be using a data frame instead of parallel vectors.
This is just a simple example.
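One way to get the bands is to ask predict() for 95% confidence limits over a fine grid and draw all three columns with matlines(); this is a sketch, and the grid size and line types are arbitrary choices:

```r
hour <- c(1, 2, 3, 4, 5, 6)
millivolts <- c(3.5, 5, 7.5, 13, 40, 58)
pm <- lm(millivolts ~ poly(hour, 3))

# predict over a fine grid, requesting 95% confidence limits
grid <- data.frame(hour = seq(1, 10, length.out = 200))
ci <- predict(pm, grid, interval = "confidence", level = 0.95)

plot(hour, millivolts, xlim = c(1, 10), ylim = c(0, 1000))
matlines(grid$hour, ci, lty = c(1, 2, 2), col = 1)  # fit, lower, upper
```

Note that with only two residual degrees of freedom here, the bands widen very quickly outside the observed range.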

__________________________________________________
R-help@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html


[R] adjusted R^2 vs. ordinary R^2

2005-06-17 Thread James Salsman
I thought the point of adjusting the R^2 for degrees of
freedom is to allow comparisons about goodness of fit between
similar models with different numbers of data points.  Someone
has suggested to me off-list that this might not be the case.

Is an ADJUSTED R^2 for a four-parameter, five-point model
reliably comparable to the adjusted R^2 of a four-parameter,
100-point model?  If such values can't be reliably compared
with one another, then what is the reasoning behind adjusting
R^2 for degrees of freedom?

What are the good published authorities on this topic?

Sincerely,
James Salsman
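For concreteness, the usual adjustment with p regressors is adj R^2 = 1 - (1 - R^2)(n - 1)/(n - p - 1); a small sketch comparing the five-point and 100-point cases at the same raw R^2 shows how strongly the small-sample value leans on its single residual degree of freedom:

```r
# adjusted R^2 from raw R^2, sample size n, and p regressors
adj.r2 <- function(r2, n, p) 1 - (1 - r2) * (n - 1) / (n - p - 1)

# same raw R^2, same 3 regressors, different sample sizes
adj.r2(0.95, n = 5,   p = 3)   # 0.8       (only 1 residual df)
adj.r2(0.95, n = 100, p = 3)   # 0.9484375
```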



[R] plotting confidence intervals for polynomial models? (was Re: ordinary polynomial coefficients from orthogonal polynomials?)

2005-06-14 Thread James Salsman
What I really want now are the 95% confidence intervals that I
mistakenly thought linesHyperb.lm() would provide:

 > pm <- lm(billions ~ I(decade) + I(decade^2) + I(decade^3))
 > library(sfsmisc)
 > linesHyperb.lm(pm)
Error in "names<-.default"(`*tmp*`, value = c("I(decade)", 
"I(decade^2)",  :
 names attribute [3] must be the same length as the vector [1]
 > pm <- lm(billions ~ poly(decade, 3))
 > linesHyperb.lm(pm)
Warning message:
'newdata' had 100 rows but variable(s) found have 5 rows

Shouldn't curve(predict(...), add=TRUE) be able to plot confidence
interval bands?
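The bands can be computed by hand from predict()'s standard errors; this is a sketch using the usual two-sided 0.975 t-quantile, not anything taken from linesHyperb.lm():

```r
decade <- c(1950, 1960, 1970, 1980, 1990)
billions <- c(3.5, 5, 7.5, 13, 40)
pm <- lm(billions ~ poly(decade, 3))

grid <- data.frame(decade = seq(1950, 2050, length.out = 200))
pr <- predict(pm, grid, se.fit = TRUE)
tval <- qt(0.975, df = pm$df.residual)  # 95% two-sided; only 1 residual df here

plot(decade, billions, xlim = c(1950, 2050), ylim = c(0, 1000))
lines(grid$decade, pr$fit)
lines(grid$decade, pr$fit + tval * pr$se.fit, lty = 2)
lines(grid$decade, pr$fit - tval * pr$se.fit, lty = 2)
```

With one residual degree of freedom, tval is about 12.7, so expect very wide bands on the extrapolated portion.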

Prof Brian Ripley wrote:

> Why do `people' need `to deal with' these, anyway.  We have machines to 
> do that.

Getting a 0.98 adjusted R^2 on the first try compels me to
try to publish the fitted formula.

> decade <- c(1950, 1960, 1970, 1980, 1990)
> billions <- c(3.5, 5, 7.5, 13, 40)
> # source: http://www.ipcc.ch/present/graphics/2001syr/large/08.17.jpg
>
> pm <- lm(billions ~ poly(decade, 3))
>
> plot(decade, billions, xlim=c(1950,2050), ylim=c(0,1000),
 main="average yearly inflation-adjusted dollar cost of extreme weather
 events worldwide")
> curve(predict(pm, data.frame(decade=x)), add=TRUE)
> # output: http://www.bovik.org/storms.gif
>
> summary(pm)

 Call:
 lm(formula = billions ~ poly(decade, 3))

 Residuals:
       1       2       3       4       5
  0.2357 -0.9429  1.4143 -0.9429  0.2357

 Coefficients:
                  Estimate Std. Error t value Pr(>|t|)
 (Intercept)        13.800      0.882  15.647   0.0406 *
 poly(decade, 3)1   25.614      1.972  12.988   0.0489 *
 poly(decade, 3)2   14.432      1.972   7.318   0.0865 .
 poly(decade, 3)3    6.483      1.972   3.287   0.1880
 ---
 Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

 Residual standard error: 1.972 on 1 degrees of freedom
 Multiple R-Squared: 0.9957, Adjusted R-squared: 0.9829
 F-statistic: 77.68 on 3 and 1 DF,  p-value: 0.08317



[R] ordinary polynomial coefficients from orthogonal polynomials?

2005-06-14 Thread James Salsman
How can ordinary polynomial coefficients be calculated
from an orthogonal polynomial fit?

I'm trying to do something like find a,b,c,d from
  lm(billions ~ a+b*decade+c*decade^2+d*decade^3)
but that gives:  "Error in eval(expr, envir, enclos) :
Object "a" not found"
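The error arises because lm() formulas name variables, not free parameters. One route to the raw coefficients is simply to refit with explicit powers, using I() to protect them from formula interpretation; a sketch:

```r
decade <- c(1950, 1960, 1970, 1980, 1990)
billions <- c(3.5, 5, 7.5, 13, 40)

# same cubic in the raw parameterization:
#   billions = a + b*decade + c*decade^2 + d*decade^3
pm.raw <- lm(billions ~ decade + I(decade^2) + I(decade^3))
coef(pm.raw)  # a, b, c, d in order
```

Recent versions of R also accept poly(decade, 3, raw = TRUE) for the same fit; either way the predictions are identical to the orthogonal fit, only the coefficients change.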

 > decade <- c(1950, 1960, 1970, 1980, 1990)
 > billions <- c(3.5, 5, 7.5, 13, 40)
 > # source: http://www.ipcc.ch/present/graphics/2001syr/large/08.17.jpg
 >
 > pm <- lm(billions ~ poly(decade, 3))
 >
 > plot(decade, billions, xlim=c(1950,2050), ylim=c(0,1000), 
main="average yearly inflation-adjusted dollar cost of extreme weather 
events worldwide")
 > curve(predict(pm, data.frame(decade=x)), add=TRUE)
 > # output: http://www.bovik.org/storms.gif
 >
 > summary(pm)

Call:
lm(formula = billions ~ poly(decade, 3))

Residuals:
      1       2       3       4       5
 0.2357 -0.9429  1.4143 -0.9429  0.2357

Coefficients:
                 Estimate Std. Error t value Pr(>|t|)
(Intercept)        13.800      0.882  15.647   0.0406 *
poly(decade, 3)1   25.614      1.972  12.988   0.0489 *
poly(decade, 3)2   14.432      1.972   7.318   0.0865 .
poly(decade, 3)3    6.483      1.972   3.287   0.1880
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Residual standard error: 1.972 on 1 degrees of freedom
Multiple R-Squared: 0.9957, Adjusted R-squared: 0.9829
F-statistic: 77.68 on 3 and 1 DF,  p-value: 0.08317

 > pm

Call:
lm(formula = billions ~ poly(decade, 3))

Coefficients:
     (Intercept)  poly(decade, 3)1  poly(decade, 3)2  poly(decade, 3)3
          13.800            25.614            14.432             6.483



[R] need an R-squared from a nls logistic sigmoid fit

2005-06-05 Thread James Salsman

Why doesn't nls() produce any kind of R-squared value?  In the absence
of such information, how are we supposed to compare one fit to another
when the y-axis scale changes?


sm <- nls(y ~ SSfpl(x, miny, maxy, midx, grad))
summary(sm)


Formula: y ~ SSfpl(x, miny, maxy, midx, grad)

Parameters:
     Estimate Std. Error t value Pr(>|t|)
miny  -0.5845     4.6104  -0.127  0.90524
maxy   7.2680     1.5512   4.686  0.00941 **
midx  16.9187     2.2340   7.573  0.00163 **
grad   1.7283     1.9150   0.903  0.41782
---
Signif. codes:  0 `***' 0.001 `**' 0.01 `*' 0.05 `.' 0.1 ` ' 1

Residual standard error: 1.13 on 4 degrees of freedom

Correlation of Parameter Estimates:
        miny    maxy    midx
maxy -0.6654
midx  0.8936 -0.3221
grad -0.9068  0.8477 -0.6865
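nls() omits R^2 because the ANOVA decomposition behind it does not hold for nonlinear models, but a scale-free pseudo-R^2 can be computed from the residuals; this is a sketch to be interpreted with care (it can even go negative for a poor fit):

```r
# 1 - RSS/TSS for any fitted nls model and its observed response vector
pseudo.r2 <- function(model, y) {
  rss <- sum(residuals(model)^2)    # residual sum of squares
  tss <- sum((y - mean(y))^2)       # total sum of squares about the mean
  1 - rss / tss
}

# usage, given the fit above:  pseudo.r2(sm, y)
```

Because it is dimensionless, this quantity is comparable across responses on different y-axis scales, which is the comparison asked about above.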







[R] start values for nls() that don't yield singular gradients?

2005-04-23 Thread James Salsman
I'm trying to fit a Gompertz sigmoid as follows:

x <- c(15, 16, 17, 18, 19)  # arbitrary example data here;
y <- c(0.1, 1.8, 2.2, 2.6, 2.9) # actual data is similar
gm <- nls(y ~ a+b*exp(-exp(-c*(x-d))), start=c(a=?, b=?, c=?, d=?))

I have been unable to properly set the starting value '?'s.  All of
my guesses yield either a "singular gradient" error if they are
decent guesses, or a "singular gradient matrix at initial parameter
estimates" error if they are bad guesses like all zeros.

How can I pick starting values that don't result in singular gradients?
I have had no luck with the "selfStart" models, e.g., "SSgompertz"
-- the formula in "SSgompertz" is not the same as the one I need
above, since it has three parameters instead of four.  I've tried
it and SSfpl as follows:
> getInitial(y ~ SSfpl(x,a,b,c,d),data=data.frame(x=x,y=y))
Error in nls(y ~ cbind(1, 1/(1 + exp((xmid - x)/exp(lscal, data = 
xy,  :
step factor 0.000488281 reduced below `minFactor' of 0.000976563

And this:
> getInitial(y ~ SSgompertz(x,a,b,c),data=data.frame(x=x,y=y))
Error in nls(y ~ cbind(1, 1 - exp(-exp(lrc) * x)), data = xy, start = 
list(lrc = as.vector(log(-coef(lm(log(abs(y -  :
singular gradient

Thanks for any help.
Sincerely,
James Salsman
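A common workaround is to read crude starting values off the data's geometry: the lower asymptote from min(y), the total rise from the range, the inflection near the middle of x, and an order-of-magnitude guess for the rate. Whether nls() then converges on five points and four parameters is not guaranteed, so this is only a sketch:

```r
x <- c(15, 16, 17, 18, 19)
y <- c(0.1, 1.8, 2.2, 2.6, 2.9)

start <- list(a = min(y),           # lower asymptote
              b = diff(range(y)),   # total rise above the asymptote
              c = 1,                # growth rate, order-of-magnitude guess
              d = mean(x))          # inflection near the middle of x

# try() so a singular gradient does not abort the session
gm <- try(nls(y ~ a + b * exp(-exp(-c * (x - d))), start = start))
```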


[R] glm family=binomial logistic sigmoid curve problem

2005-04-10 Thread James Salsman
I'm trying to plot an extrapolated logistic sigmoid curve using
glm(..., family=binomial) as follows, but neither the fitted()
points nor the predict()ed curve plots correctly:
> year <- c(2003+(6/12), 2004+(2/12), 2004+(10/12), 2005+(4/12))
> percent <- c(0.31, 0.43, 0.47, 0.50)
> plot(year, percent, xlim=c(2003, 2007), ylim=c(0, 1))
> lm <- lm(percent ~ year)
> abline(lm)
> bm <- glm(percent ~ year, family=binomial)
Warning message:
non-integer #successes in a binomial glm! in: eval(expr, envir, enclos)
> points(year, fitted(bm), pch="+")
NULL
> curve(predict(bm, data.frame(year=x)), add=TRUE)
All four of the binomial-fitted points fall exactly on the simple
linear regression line, and the predict() curve is nowhere near any
of the data points.  What am I doing wrong?
What does the warning mean?  Do I need more points?
I am using R on Windows, Version 2.0.1  (2004-11-15)
Thank you for your kind help.
Sincerely,
James Salsman
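Two issues seem to be at work here: predict.glm defaults to the link (log-odds) scale, so the curve needs type = "response", and a binomial glm expects counts or proportions weighted by the number of trials, which explains the warning. A sketch; the sample sizes n are hypothetical, not from the original data:

```r
year <- c(2003 + 6/12, 2004 + 2/12, 2004 + 10/12, 2005 + 4/12)
percent <- c(0.31, 0.43, 0.47, 0.50)
n <- c(100, 100, 100, 100)  # hypothetical numbers of trials per observation

# proportions plus weights avoid the "non-integer #successes" warning
bm <- glm(percent ~ year, family = binomial, weights = n)

plot(year, percent, xlim = c(2003, 2007), ylim = c(0, 1))
curve(predict(bm, data.frame(year = x), type = "response"), add = TRUE)
```

With type = "response" the curve passes through the data on the probability scale instead of plotting log-odds far from the points.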