[R] Regression tree: labels in the terminal nodes

2007-08-16 Thread Juergen Kuehn
Dear everybody, I'm a new user of R 2.4.1 and I'm searching for information on improving the output of regression tree graphs. So far I have been able to display the number of values (n) and the mean of all values in each terminal node with the command text(tree, use.n=T,
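
A minimal sketch of the kind of call being discussed, using the rpart package and its bundled car.test.frame data (the questioner's own tree object is not shown):

    library(rpart)
    fit <- rpart(Mileage ~ Weight, data = car.test.frame)   # small regression tree
    plot(fit)
    text(fit, use.n = TRUE)   # terminal nodes show the node mean and n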

Re: [R] Regression tree: labels in the terminal nodes

2007-08-16 Thread Achim Zeileis
On Thu, 16 Aug 2007, Juergen Kuehn wrote: Dear everybody, I'm a new user of R 2.4.1 and I'm searching for information on improving the output of regression tree graphs. So far I have been able to display the number of values (n) and the mean of all values in each terminal

[R] Regression with Missing values. na.action?

2007-07-26 Thread Vaibhav Gathibandhe
Hi all, can you please tell me what the problem is here? My regression equation is y = B0 + B1*X1 + B2*X2 + e, and I am interested in coefficient B1. I am running the regression in two ways: 1) reg <- lm(y ~ X1 + X2, sam), where sam is the data; 2) reg <- lm(y ~ X1 + X2, sam, na.action = na.exclude). I have

Re: [R] Regression with Missing values. na.action?

2007-07-26 Thread David Barron
na.exclude should give the same results as na.omit, which is the default na.action. Is the number of complete cases the same in these two regressions? On 26/07/07, Vaibhav Gathibandhe [EMAIL PROTECTED] wrote: Hi all, Can you please tell me what is the problem here. My regression eq is y =
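
A small sketch of the two calls being compared (the data frame sam is from the question; it assumes the same incomplete rows are dropped in both fits):

    fit1 <- lm(y ~ X1 + X2, data = sam)                           # default na.action (na.omit)
    fit2 <- lm(y ~ X1 + X2, data = sam, na.action = na.exclude)
    all.equal(coef(fit1), coef(fit2))                  # coefficients agree
    length(residuals(fit1)); length(residuals(fit2))   # na.exclude pads residuals/fitted with NA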

[R] Regression trees using Goodness-of-Split

2007-07-24 Thread Fiona Callaghan
Hi I have two questions: 1) I would like to know if there is a package in R that constructs a regression tree using the 'goodness-of-split' algorithm for survival analysis proposed by Le Blanc and Crowley (1993) (rather than the usual CART algorithm that uses within-node difference and impurity

Re: [R] Regression on two time series

2007-04-12 Thread rdporto1
Ron, you're right. It's not legitimate at all. I suggest you take a look at the HUGE bibliography on cointegration as a starting point. Rogerio Dear all R user, Please forgive me if my question is too simple. My question is related to statistics rather than directly to R. Suppose I have
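
One common starting point from that literature (not necessarily the poster's recommendation): regress the levels and test the residuals for a unit root, e.g. with the tseries package (assumed installed; X and Y stand for the two price series):

    library(tseries)
    fit <- lm(X ~ Y)            # naive regression in levels
    adf.test(residuals(fit))    # stationary residuals are consistent with cointegration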

[R] Regression on two time series

2007-04-11 Thread Ron Michael
Dear all R user, Please forgive me if my question is too simple. My question is related to statistics rather than directly to R. Suppose I have two time series of spot prices of two commodities X and Y for two years. Now I want to see what percentage of the spot price of X is explained by Y. Yes

Re: [R] Regression trees with an ordinal response variable

2007-02-03 Thread Torsten Hothorn
On Fri, 2 Feb 2007, Henric Nilsson (Public) wrote: Torsten, consider the following: ### ordinal regression mammoct <- ctree(ME ~ ., data = mammoexp) Warning message: no admissible split found ### estimated class probabilities treeresponse(mammoct, newdata = mammoexp[1:5, ]) [[1]] [1]

Re: [R] Regression trees with an ordinal response variable

2007-02-02 Thread Henric Nilsson (Public)
On Fri, 2007-02-02 at 06:03, Stacey Buckelew wrote: Hi, I am working on a regression tree in rpart that uses a continuous response variable that is ordered. I read a previous response by Prof. Ripley to an inquiry regarding the ability of rpart to handle ordinal responses in 2003. At that time

Re: [R] Regression trees with an ordinal response variable

2007-02-02 Thread Torsten Hothorn
On Fri, 2 Feb 2007, Henric Nilsson (Public) wrote: On Fri, 2007-02-02 at 06:03, Stacey Buckelew wrote: Hi, I am working on a regression tree in rpart that uses a continuous response variable that is ordered. I read a previous response by Prof. Ripley to an inquiry regarding the ability of

[R] Regression trees with an ordinal response variable

2007-02-01 Thread Stacey Buckelew
Hi, I am working on a regression tree in rpart that uses a continuous response variable that is ordered. I read a previous response by Prof. Ripley to an inquiry regarding the ability of rpart to handle ordinal responses in 2003. At that time rpart was unable to implement an algorithm to handle
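
As the replies above note, party::ctree() handles ordered responses directly; a hedged sketch (the data frame and variable names below are placeholders, not from the original post):

    library(party)
    dat$severity <- ordered(dat$severity)     # hypothetical ordinal response
    fit <- ctree(severity ~ ., data = dat)
    plot(fit)
    treeresponse(fit, newdata = dat[1:5, ])   # estimated class probabilities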

[R] Regression Line Plot

2007-01-31 Thread amna khan
Sir, I am not finding the function to plot a least-squares regression line on a type="o" plot of two variables. Guide me in this regard. -- AMINA SHAHZADI Department of Statistics GC University Lahore, Pakistan. Email: [EMAIL PROTECTED] [EMAIL PROTECTED] [EMAIL PROTECTED] [[alternative HTML

Re: [R] Regression Line Plot

2007-01-31 Thread Marc Schwartz
On Wed, 2007-01-31 at 09:25 -0800, amna khan wrote: Sir, I am not finding the function to plot a least-squares regression line on a type="o" plot of two variables. Guide me in this regard. Did you want something like this: x <- 1:50 y <- rnorm(50) plot(x, y, type = "o") abline(lm(y ~ x)) See ?abline

[R] Regression lines

2007-01-12 Thread Tom Backer Johnsen
My simpleminded understanding of simple regression is that when plotting regression lines for x on y and y on x in the same plot, the lines should cross each other at the respective means. But, given the R function below, abline (lm(y~x)) works fine, but abline (lm(x~y)) does not. Why?

[R] Regression lines

2007-01-12 Thread ken knoblauch
Try this version of your function and then think about it: tst <- function () { attach (attitude) x <- rating y <- learning detach (attitude) plot (x, y) abline(v=mean(x)) abline(h=mean(y)) abline (lm(y~x)) cc <- coef(lm(x ~ y)) abline (-cc[1]/cc[2], 1/cc[2]) } My simpleminded understanding of

Re: [R] Regression lines

2007-01-12 Thread Roger Bivand
On Fri, 12 Jan 2007, Tom Backer Johnsen wrote: My simpleminded understanding of simple regression is that when plotting regression lines for x on y and y on x in the same plot, the lines should cross each other at the respective means. But, given the R function below, abline (lm(y~x))

Re: [R] Regression lines

2007-01-12 Thread Prof Brian Ripley
On Fri, 12 Jan 2007, Tom Backer Johnsen wrote: My simpleminded understanding of simple regression is that when plotting regression lines for x on y and y on x in the same plot, the lines should cross each other at the respective means. But, given the R function below, abline (lm(y~x)) works

Re: [R] Regression lines

2007-01-12 Thread Uwe Ligges
Tom Backer Johnsen wrote: My simpleminded understanding of simple regression is that when plotting regression lines for x on y and y on x in the same plot, the lines should cross each other at the respective means. But, given the R function below, abline (lm(y~x)) works fine, but abline

Re: [R] Regression lines

2007-01-12 Thread Peter Dalgaard
Prof Brian Ripley wrote: Where did you tell it 'x' was the abscissa and 'y' the ordinate? (Nowhere: R is lacking a mind_read() function!) Please stop complaining about missing features. Patches will be considered. Oh, it's you, Brian. Never mind then. You'll get to it, I'm sure. ;-) --

[R] Regression lines

2007-01-12 Thread ken knoblauch
This should do the trick: mind_reader <- function() { ll <- letters[round(runif(6, 1, 26))] ff <- ll[1] for (ix in 2:length(ll)) { ff <- paste(ff, ll[ix], sep = "") } if (exists(ff)) { cat("The function that you were

Re: [R] Regression lines

2007-01-12 Thread Barry Rowlingson
ken knoblauch wrote: This should do the trick: mind_reader <- function() { ll <- letters[round(runif(6, 1, 26))] I see my paraNormal distribution package hasn't found its way to CRAN yet: http://tolstoy.newcastle.edu.au/R/help/05/04/1701.html Barry

Re: [R] Regression lines

2007-01-12 Thread Tom Backer Johnsen
Barry Rowlingson wrote: ken knoblauch wrote: This should do the trick: mind_reader <- function() { ll <- letters[round(runif(6, 1, 26))] I see my paraNormal distribution package hasn't found its way to CRAN yet: http://tolstoy.newcastle.edu.au/R/help/05/04/1701.html LOL! Nice!

Re: [R] Regression lines

2007-01-12 Thread Duncan Murdoch
On 1/12/2007 5:56 AM, Barry Rowlingson wrote: ken knoblauch wrote: This should do the trick: mind_reader <- function() { ll <- letters[round(runif(6, 1, 26))] I see my paraNormal distribution package hasn't found its way to CRAN yet:

Re: [R] Regression lines

2007-01-12 Thread Sarah Goslee
Fortune? On 1/12/07, Peter Dalgaard [EMAIL PROTECTED] wrote: Prof Brian Ripley wrote: Where did you tell it 'x' was the abscissa and 'y' the ordinate? (Nowhere: R is lacking a mind_read() function!) Please stop complaining about missing features. Patches will be considered. Oh, it's

[R] Regression question ...

2006-12-04 Thread justin bem
Dear Helpers, I have a simple question. In my statistics studies I have learned to make inferences from samples. I want to know what the strategy should be when I have the whole population. If I suppose that the data are collected without error, is any inference made still useful? Sincerely! Justin BEM

[R] Regression

2006-11-15 Thread Alvaro
I need to run a regression analysis with a large number of samples. Each sample (identified in the first file column) has its own x and y values. I will use the same model in all samples. How can I run the model for each sample? In SAS code I would use the BY SAMPLE statement. Alvaro
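
A minimal base-R sketch of the SAS BY-style fit (the file name and the column names sample, x and y are placeholders for the poster's data):

    dat <- read.table("samples.txt", header = TRUE)    # hypothetical input file
    fits <- lapply(split(dat, dat$sample), function(d) lm(y ~ x, data = d))
    t(sapply(fits, coef))                              # one row of coefficients per sample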

[R] regression analyses using a vector of means and a variance-covariance matrix

2006-10-14 Thread John Sorkin
R 2.2.0, Windows XP. How can I perform a regression analysis using a vector of means and a variance-covariance matrix? I looked at the help screen for lm and did not see any option for using the aforementioned structures as input to lm. Thanks, John John Sorkin M.D., Ph.D. Chief, Biostatistics and

Re: [R] regression analyses using a vector of means and a variance-covariance matrix

2006-10-14 Thread John Fox
:[EMAIL PROTECTED] On Behalf Of John Sorkin Sent: Saturday, October 14, 2006 3:27 PM To: r-help@stat.math.ethz.ch Subject: [R] regression analyses using a vector of means and a variance-covariance matrix R 2.2.0, Windows XP. How can I perform a regression analysis using a vector of means

Re: [R] regression analyses using a vector of means and a variance-covariance matrix

2006-10-14 Thread Gabor Grothendieck
Here is another approach using the same data as in John Fox's reply. His is probably superior, but this one has the advantage that it's very simple. Note that it gives the same coefficients and R squared to several decimal places. We just simulate a data set with the given means and variance
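
One way to carry out the "simulate a data set with the given moments" idea (this may differ from the code in the original reply): MASS::mvrnorm() with empirical = TRUE reproduces the supplied mean vector and covariance matrix exactly, so lm() on the simulated data returns the coefficients implied by the moments. Here n, mu and Sigma are assumed given, and the variable names must match those used in mu/Sigma.

    library(MASS)
    dat <- as.data.frame(mvrnorm(n, mu = mu, Sigma = Sigma, empirical = TRUE))
    fit <- lm(y ~ x1 + x2, data = dat)    # hypothetical column names
    coef(fit); summary(fit)$r.squared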

Re: [R] regression analyses using a vector of means and a variance-covariance matrix

2006-10-14 Thread Gabor Grothendieck
There was a missing line: On 10/14/06, Gabor Grothendieck [EMAIL PROTECTED] wrote: Here is another approach using the same data as in John Fox's reply. His is probably superior, but this one has the advantage that it's very simple. Note that it gives the same coefficients and R squared to

[R] New contribution about R regression

2006-09-19 Thread Vito Ricci
Dear UseRs, you can find on the CRAN web site my latest contribution about R regression techniques: http://cran.r-project.org/doc/contrib/Ricci-regression-it.pdf It's in Italian. Regards. Vito Ricci If not now, when? If not here, where? If not you, who

Re: [R] Regression line limited by the range of values

2006-05-25 Thread Karl Ove Hufthammer
Andreas Svensson wrote: So, how can I constrain the abline to the relevant region, i.e. stop abline from extrapolating beyond the actual range of the data? Or should I use a function like 'lines' to do this? One elegant way of doing this is using 'xyplot' from 'lattice' and adding a loess line

[R] Regression line limited by the range of values

2006-05-24 Thread Andreas Svensson
Hi In R, using plot(x,y) followed by abline(lm(y~x)) produces a graph with a regression line spanning the whole plot. This means that the line extends beyond the swarm of data points to the defined or default plot region. With par(xpd=T) it will span the entire figure region. But how can
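
A base-graphics sketch in the spirit of the replies (assuming x and y are the plotted vectors): evaluate the fit only at the ends of the observed x range and draw that single segment with lines(), so nothing is extrapolated.

    fit <- lm(y ~ x)
    xr <- range(x)
    plot(x, y)
    lines(xr, predict(fit, newdata = data.frame(x = xr)))   # line stops at the data range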

Re: [R] Regression line limited by the range of values

2006-05-24 Thread Marc Schwartz (via MN)
On Wed, 2006-05-24 at 18:51 +0200, Andreas Svensson wrote: Hi In R, using plot(x,y) followed by abline(lm(y~x)) produces a graph with a regression line spanning the whole plot. This means that the line extends beyond the swarm of data points to the defined or default plot region.

Re: [R] Regression line limited by the range of values

2006-05-24 Thread Andreas Svensson
Thank you very much Marc for that nifty little script. When I use it on my real dataset, though, the lines are fat in the middle and thinner towards the ends. I guess it's because lines draws one fitted segment for each x, and if you have hundreds of x values, this turns into a line that is thicker than it

Re: [R] Regression line limited by the range of values

2006-05-24 Thread Marc Schwartz (via MN)
On Wed, 2006-05-24 at 21:53 +0200, Andreas Svensson wrote: Thank you very much Marc for that nifty little script. When I use it on my real dataset, though, the lines are fat in the middle and thinner towards the ends. I guess it's because lines draws one fitted segment for each x, and if you

Re: [R] Regression line limited by the range of values

2006-05-24 Thread Greg Snow
Svensson Sent: Wednesday, May 24, 2006 10:52 AM To: r-help@stat.math.ethz.ch Subject: [R] Regression line limited by the range of values Hi In R, using plot(x,y) followed by abline(lm(y~x)) produces a graph with a regression line spanning the whole plot. This means that the line extends beyond

[R] Regression through the origin

2006-05-23 Thread Leonardo Trujillo

[R] Regression through the origin

2006-05-23 Thread Trujillo L.
Dear R-users: Sorry for the naiveté of my question, but I have been searching R-help and the CRAN website without any success. I am trying to perform a regression through the origin (without intercept) and my main concern is about its evaluative statistics. It is clear to me that R squared
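
For reference, the fit itself is straightforward; the subtlety the poster raises is that summary()'s R squared for a no-intercept model is computed about zero rather than about mean(y), so it is not comparable with the usual R squared (x and y are placeholders):

    fit0 <- lm(y ~ x - 1)       # equivalently y ~ 0 + x: regression through the origin
    summary(fit0)$r.squared     # uses sum(y^2) as the total sum of squares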

Re: [R] Regression through the origin

2006-05-23 Thread Rau, Roland
this helps a bit. Best, Roland -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Trujillo L. Sent: Tuesday, May 23, 2006 12:54 PM To: r-help@stat.math.ethz.ch Subject: [R] Regression through the origin Dear R-users: Sorry for the naiveness of my

Re: [R] Regression through the origin

2006-05-23 Thread Karl Ove Hufthammer
Trujillo L. wrote: Sorry for the naiveté of my question, but I have been searching R-help and the CRAN website without any success. I am trying to perform a regression through the origin (without intercept) and my main concern is about its evaluative statistics. It is clear to me that

Re: [R] regression modeling

2006-04-26 Thread John Maindonald
@stat.math.ethz.ch Subject: Re: [R] regression modeling May I offer a perhaps contrary perspective on this. Statistical **theory** tells us that the precision of estimates improves as sample size increases. However, in practice, this is not always the case. The reason is that it can take time

Re: [R] regression modeling

2006-04-25 Thread bogdan romocea
] On Behalf Of Weiwei Shi Sent: Monday, April 24, 2006 12:45 PM To: r-help Subject: [R] regression modeling Hi, there: I am looking for a regression modeling (like regression trees) approach for a large-scale industry dataset. Any suggestion on a package from R or from other sources which

Re: [R] regression modeling

2006-04-25 Thread Weiwei Shi
vary as the dataset gets larger and larger? -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Weiwei Shi Sent: Monday, April 24, 2006 12:45 PM To: r-help Subject: [R] regression modeling Hi, there: I am looking for a regression modeling

Re: [R] regression modeling

2006-04-25 Thread Berton Gunter
] [mailto:[EMAIL PROTECTED] On Behalf Of Weiwei Shi Sent: Tuesday, April 25, 2006 12:10 PM To: bogdan romocea Cc: r-help Subject: Re: [R] regression modeling I believe it is not a question only related to regression modeling. The correlation between the sample size and confidence

Re: [R] regression modeling

2006-04-25 Thread Frank E Harrell Jr
Berton Gunter wrote: May I offer a perhaps contrary perspective on this. Statistical **theory** tells us that the precision of estimates improves as sample size increases. However, in practice, this is not always the case. The reason is that it can take time to collect that extra data, and

[R] regression modeling

2006-04-24 Thread Weiwei Shi
Hi, there: I am looking for a regression modeling (like regression trees) approach for a large-scale industry dataset. Any suggestion on a package from R or from other sources which has a decent accuracy and scalability? Any recommendation from experience is highly appreciated. Thanks, Weiwei

[R] regression/step coefficients extraction

2006-04-10 Thread Hufkens Koen
Hi list, I'm looking for a way to easily extract regression p-values and export them to one file for further evaluation. Here is the problem. I use lm() and step() to get my regression parameters/coefficients. after that I can extract them with summary(lm-results)$coefficients[,4] so far so
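
A hedged sketch of the export step described above (the starting model, data frame and file name are placeholders):

    fit <- step(lm(y ~ ., data = dat), trace = 0)    # hypothetical stepwise fit
    pvals <- summary(fit)$coefficients[, 4]          # column 4 is Pr(>|t|)
    write.csv(data.frame(term = names(pvals), p.value = pvals),
              file = "pvalues.csv", row.names = FALSE)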

Re: [R] regression with nestedness

2006-01-25 Thread Spencer Graves
Have you considered lme in library(nlme)? The companion book Pinheiro and Bates (2000) Mixed-Effects Models in S and S-Plus (Springer) is my favorite reference for this kind of thing. From what I understand of your question, you should be able to find excellent answers in this
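
A heavily hedged sketch of the kind of lme() call meant (nlme is assumed; the response, fixed terms and grouping factor below are placeholders, since the original design is not fully shown):

    library(nlme)
    fit <- lme(rtot ~ sex + urban, random = ~ 1 | site, data = chicks)   # hypothetical names
    summary(fit)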

[R] regression with nestedness

2006-01-22 Thread Jeffrey Stratford
Dear R-users, I set up an experiment where I put up bluebird boxes across an urbanization gradient. I monitored these boxes and at some point I pulled a feather from a chick and a friend used spectral properties (rtot, a continuous var) to index chick health. There is an effect of sex that I

[R] Regression with no-intercept

2006-01-17 Thread Alexandra R. M. de Almeida
Dear R users, There is a method called style analysis where you run a regression with Y = fund yield and X = benchmark yields, under the following restrictions on the linear regression: 1. The regression must not have an intercept term. 2. The coefficients must sum to one. 3. All
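
These constraints (no intercept, coefficients summing to one, and, as in style analysis, non-negative coefficients) make this a quadratic programming problem rather than a plain lm() fit; a sketch with the quadprog package (assumed available; X is the matrix of benchmark yields, y the fund yield):

    library(quadprog)
    D <- crossprod(X)                 # t(X) %*% X
    d <- crossprod(X, y)
    k <- ncol(X)
    A <- cbind(rep(1, k), diag(k))    # first column: sum(b) = 1; remaining columns: b >= 0
    sol <- solve.QP(D, d, A, bvec = c(1, rep(0, k)), meq = 1)
    sol$solution                      # constrained least-squares coefficients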

[R] Regression with partial info about the dependent variable

2005-12-27 Thread maneesh deshpande
Hi, I have the following problem which I would appreciate some help on. A variable y is to be modelled as a function of a set of variables Vector(x). The twist is that there is another variable z in the problem with the property that y(i) = z(i). So the data set is divided into three

[R] regression using a lagged dependent variable as explanatory variable

2005-10-15 Thread giacomo moro
Hi, I would like to regress y (dependent variable) on x (independent variable) and y(-1). I have created the y(-1) variable in this way: ly <- lag(y, -1) Now if I do the following regression lm(y ~ x + ly) the results I obtain are not correct. Can someone tell me the code to use in R in order to

Re: [R] regression using a lagged dependent variable as explanatory variable

2005-10-15 Thread Gabor Grothendieck
Create time series from your data and then use lm with the dyn or dynlm package (as lm does not support time series directly). With the dyn package you just preface lm with dyn$ and then use lm as usual: library(dyn) yt <- ts(y) xt <- ts(x) dyn$lm(yt ~ xt + lag(yt, -1)) After loading dyn try this

Re: [R] regression using a lagged dependent variable as explanatory variable

2005-10-15 Thread Achim Zeileis
On Sat, 15 Oct 2005, giacomo moro wrote: Hi, I would like to regress y (dependent variable) on x (independent variable) and y(-1). I have created the y(-1) variable in this way: ly <- lag(y, -1) Now if I do the following regression lm(y ~ x + ly) the results I obtain are not correct. The

Re: [R] Regression slope confidence interval

2005-10-01 Thread Prof Brian Ripley
On Thu, 29 Sep 2005, Christian Hennig wrote: ?confint Thank you to all of you. As far as I see this is not mentioned on the lm help page (though I presumably don't have the recent version), which I would suggest... and I would suggest that you study a good book on the subject. (confint

[R] Regression slope confidence interval

2005-09-29 Thread Christian Hennig
Hi list, is there any direct way to obtain confidence intervals for the regression slope from lm, predict.lm or the like? (If not, is there any reason? This is also missing in some other statistics software, and I thought this would be quite a standard application.) I know that it's easy to

Re: [R] Regression slope confidence interval

2005-09-29 Thread Chuck Cleland
?confint For example: ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14) trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69) group <- gl(2,10,20, labels=c("Ctl","Trt")) weight <- c(ctl, trt) lm(weight ~ group) Call: lm(formula = weight ~ group) Coefficients:

Re: [R] Regression slope confidence interval

2005-09-29 Thread Søren Højsgaard
?confint -Original Message- From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of Christian Hennig Sent: 29 September 2005 13:19 To: r-help-request Mailing List Subject: [R] Regression slope confidence interval Hi list, is there any direct way to obtain

Re: [R] Regression slope confidence interval

2005-09-29 Thread Achim Zeileis
On Thu, 29 Sep 2005, Christian Hennig wrote: Hi list, is there any direct way to obtain confidence intervals for the regression slope from lm, predict.lm or the like? There is a confint method: e.g., R> fm <- lm(dist ~ speed, data = cars) R> confint(fm, parm = "speed") 2.5 % 97.5 %

Re: [R] Regression slope confidence interval

2005-09-29 Thread Christian Hennig
?confint Thank you to all of you. As far as I see this is not mentioned on the lm help page (though I presumably don't have the recent version), which I would suggest... Best, Christian On Thu, 29 Sep 2005, Chuck Cleland wrote: ?confint For example: ctl <-

Re: [R] Regression slope confidence interval

2005-09-29 Thread Renaud Lancelot
Why not use vcov() and the normal approximation? ctl <- c(4.17,5.58,5.18,6.11,4.50,4.61,5.17,4.53,5.33,5.14) trt <- c(4.81,4.17,4.41,3.59,5.87,3.83,6.03,4.89,4.32,4.69) group <- gl(2,10,20, labels=c("Ctl","Trt")) weight <- c(ctl, trt) lm.D9 <- lm(weight ~ group) cbind(estimate =

Re: [R] Regression slope confidence interval

2005-09-29 Thread Renaud Lancelot
Sorry, I forgot confint and I made a mistake in my suggestion, which should be: cbind(estimate = coef(lm.D9), lower = coef(lm.D9) - 1.96 * sqrt(diag(vcov(lm.D9))), upper = coef(lm.D9) + 1.96 * sqrt(diag(vcov(lm.D9)))) Best, Renaud Christian Hennig wrote: Hi list, is there

Re: [R] regression methods for circular(?) data.

2005-09-27 Thread Ted Harding
I retract the suggestion I proposed last night -- it was based on a bad hunch! Sorry for wasting time. Best wishes, Ted. E-Mail: (Ted Harding) [EMAIL PROTECTED] Fax-to-email: +44 (0)870 094 0861 Date: 27-Sep-05

[R] regression methods for circular(?) data.

2005-09-26 Thread nwew
Dear R-users, I have the following data x <- runif(300,min=1,max=230) y <- x*0.005 + 0.2 y <- y+rnorm(100,mean=0,sd=0.1) y <- y%%1 # --- modulo operation plot(x,y) and would like to recapture the slope (0.005) and intercept (0.2). I wonder if there are any clever algorithms to do this. I was

Re: [R] regression methods for circular(?) data.

2005-09-26 Thread Ted Harding
On 26-Sep-05 nwew wrote: Dear R-users, I have the following data x <- runif(300,min=1,max=230) y <- x*0.005 + 0.2 y <- y+rnorm(100,mean=0,sd=0.1) y <- y%%1 # --- modulo operation plot(x,y) and would like to recapture the slope (0.005) and intercept (0.2). I wonder if there are any

Re: [R] regression methods for circular(?) data.

2005-09-26 Thread Witold Eryk Wolski
Hi, I do not know the intercept and slope. And you have to know them in order to do something like: ix <- (y 0.9*(x-50)/200 Am I right? cheers (Ted Harding) wrote: On 26-Sep-05 nwew wrote: Dear R-users, I have the following data x <- runif(300,min=1,max=230) y <- x*0.005 + 0.2 y <-

Re: [R] regression methods for circular(?) data.

2005-09-26 Thread Ted Harding
On 26-Sep-05 Witold Eryk Wolski wrote: Hi, I do not know the intercept and slope. And you have to know them in order to do something like: ix <- (y 0.9*(x-50)/200 Am I right? cheers Although I really knew them from the way you generated the data, I pretended I did not know them. Read

Re: [R] regression methods for circular(?) data.

2005-09-26 Thread Witold Eryk Wolski
Ted, I agree with you that if you unwrap the data you can use lm. And you can separate the data in the way you describe. However, if you have thousands of such datasets I do not want to do it by looking at the graph. Yes, the scatter may be larger than in the example and range(y) may be larger

Re: [R] regression methods for circular(?) data.

2005-09-26 Thread Ted Harding
On 26-Sep-05 Witold Eryk Wolski wrote: Ted, I agree with you that if you unwrap the data you can use lm. And you can separate the data in the way you describe. However, if you have thousands of such datasets I do not want to do it by looking at the graph. Yes the scatter may be larger

Re: [R] regression with restrictions - optimization problem

2005-09-16 Thread Spencer Graves
I have not seen any replies, so I will offer a comment: 1. You speak of x1, x2, ..., x10, but your example includes only x1+x2+x3+x4. I'm confused. If you could still use help with this, could you please simplify your example further so there was only x1+x2, say? Can

[R] regression with restrictions - optimization problem

2005-09-09 Thread Mark Hempelmann
Dear WizaRds! I am sorry to ask for some help, but I have come to a complete stop in my efforts. I hope, though, that some of you might find the problem quite interesting to look at. I have been trying to estimate parameters for lotteries, the so called utility of chance, i.e. the felt

[R] regression with more than one observation per x value

2005-08-16 Thread Christoph Scherber
Dear R users, How can I do a regression analysis in R where there is more than one observation per x value? I tried the example in Sokal & Rohlf (3rd edn., 1995), page 476 ff., but I somehow couldn't find a way to partition the sums of squares into linear regression, deviations from regression,
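
One standard base-R way to get the "deviations from regression" (lack-of-fit) partition, as an alternative to the package function suggested in the reply (x and y are placeholders): compare the straight-line fit against a model with one mean per distinct x value.

    fit.line <- lm(y ~ x)
    fit.means <- lm(y ~ factor(x))    # one mean per distinct x value (pure-error model)
    anova(fit.line, fit.means)        # lack of fit tested against pure error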

Re: [R] regression with more than one observation per x value

2005-08-16 Thread Liaw, Andy
Sounds like you're looking for something like pure.error.anova in the `alr3' package on CRAN... Andy From: Christoph Scherber Dear R users, How can I do a regression analysis in R where there is more than one observation per x value? I tried the example in Sokal & Rohlf (3rd edn.,

Re: [R] regression data set

2005-08-03 Thread Vito Ricci
Hi, I suggest having a look at “Practical Regression and Anova using R” by Julian Faraway: http://cran.r-project.org/doc/contrib/Faraway-PRA.pdf http://www.stat.lsa.umich.edu/~faraway/book/ See also the faraway package for datasets:

[R] R: regression data set

2005-08-02 Thread Clark Allan
Hi all, I am busy teaching a regression analysis course to second-year science students. The course is fairly theoretical, with all of the standard theorems and proofs... I would like to give the class a practical assignment as well. Could you suggest a good problem and the location of the data

Re: [R] R: regression data set

2005-08-02 Thread Kevin Wang
Clark Allan wrote: i would like to give the class a practical assignment as well. could you suggest a good problem and the location of the data set/s? it would be good if the data set has been analysed by a number of other people so that students can see the different ways of tackling a

Re: [R] Regression lines for differently-sized groups on the same plot

2005-07-20 Thread Laura M Marx
Sundar Dorai-Raj writes: Hi, Laura, Would ?predict.glm be better? plot(logarea, hempresence, xlab = "Surface area of log (m2)", ylab = "Probability of hemlock seedling presence", type = "n", font.lab=2, cex.lab=1.5, axes=TRUE) lines(logarea, predict(hemhem, logreg, "response"),

[R] Regression lines for differently-sized groups on the same plot

2005-07-19 Thread Laura M Marx
Hi there, I've looked through the very helpful advice about adding fitted lines to plots in the r-help archive, and can't find a post where someone has offered a solution for my specific problem. I need to plot logistic regression fits from three differently-sized data subsets on a plot of
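
A hedged sketch of one way to do this (the data frame and column names are placeholders, not the poster's objects): fit each subset's logistic regression separately and draw its curve only over that subset's x range.

    plot(dat$logarea, dat$presence, xlab = "log surface area", ylab = "P(presence)")
    for (d in split(dat, dat$subset)) {
      fit <- glm(presence ~ logarea, family = binomial, data = d)
      xs <- seq(min(d$logarea), max(d$logarea), length.out = 100)
      lines(xs, predict(fit, newdata = data.frame(logarea = xs), type = "response"))
    }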

Re: [R] Regression lines for differently-sized groups on the same plot

2005-07-19 Thread Sundar Dorai-Raj
Laura M Marx wrote: Hi there, I've looked through the very helpful advice about adding fitted lines to plots in the r-help archive, and can't find a post where someone has offered a solution for my specific problem. I need to plot logistic regression fits from three differently-sized

[R] Regression and time series

2005-04-11 Thread Fernando Saldanha
Can someone shed some light on this obscure portion of the help for lm? Considerable care is needed when using 'lm' with time series. Unless 'na.action = NULL', the time series attributes are stripped from the variables before the regression is done. (This is necessary as

[R] Regression Modeling Strategies Workshop by Frank Harrell in Southern California

2005-04-05 Thread Madeline Bauer
Dr. Frank E. Harrell, Jr., Professor and Chair of the Department of Biostatistics at Vanderbilt University is giving a one-day workshop on Regression Modeling Strategies on Friday, April 29, 2005. Analyses of the example datasets use R/S-Plus and make extensive use of the Hmisc library

[R] regression tree xerror

2005-03-29 Thread Sherri Miller
I am running some models (for the first time) using rpart and am getting results I don't know how to interpret. I'm using cross-validation to prune the tree and the results look like: Root node error: 172.71/292 = 0.59148 n= 292 CP nsplit rel error xerror xstd 1 0.124662 0
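
For reference, a common way to read that table and prune on it (assuming the fitted rpart object is called fit; the 1-SE rule is a frequent alternative to taking the minimum):

    printcp(fit)    # xerror is the cross-validated relative error, xstd its standard error
    plotcp(fit)
    best <- fit$cptable[which.min(fit$cptable[, "xerror"]), "CP"]
    pruned <- prune(fit, cp = best)    # prune at the CP with the smallest xerror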

Re: [R] regression tree xerror

2005-03-29 Thread Luis Torgo
Sherri Miller wrote: I am running some models (for the first time) using rpart and am getting results I don't know how to interpret. I'm using cross-validation to prune the tree and the results look like: Root node error: 172.71/292 = 0.59148 n= 292 CP nsplit rel error xerror xstd 1

RE: [R] regression on a matrix

2005-03-04 Thread Martin Maechler
ReidH == Huntsinger, Reid [EMAIL PROTECTED] on Thu, 3 Mar 2005 17:24:22 -0500 writes: ReidH You might use lsfit instead and just do the whole Y ReidH matrix at once. That saves all the recalculation of ReidH things involving only X. yes, but in these cases, we have been

RE: [R] regression on a matrix

2005-03-04 Thread Liaw, Andy
From: Martin Maechler ReidH == Huntsinger, Reid [EMAIL PROTECTED] on Thu, 3 Mar 2005 17:24:22 -0500 writes: ReidH You might use lsfit instead and just do the whole Y ReidH matrix at once. That saves all the recalculation of ReidH things involving only X. yes, but in

[R] regression on a matrix

2005-03-03 Thread Eduardo Leoni
Hi - I am doing a Monte Carlo experiment that requires doing a linear regression of a matrix of vectors of dependent variables on a fixed set of covariates (one regression per vector). I am wondering if anyone has any idea of how to speed up the computations in R. The code follows: #regression
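
Two standard speed-ups mentioned in the replies: give lm() the whole response matrix at once, or use lsfit(), so the decomposition of the fixed covariates is computed only once (Y is the n x m matrix of responses, X the covariate matrix).

    fit <- lm(Y ~ X)       # matrix response: one column of coefficients per response
    coef(fit)
    ## or, avoiding the formula machinery entirely:
    fit2 <- lsfit(X, Y)
    fit2$coefficients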

RE: [R] regression on a matrix

2005-03-03 Thread Huntsinger, Reid
-help@stat.math.ethz.ch Subject: [R] regression on a matrix Hi - I am doing a Monte Carlo experiment that requires doing a linear regression of a matrix of vectors of dependent variables on a fixed set of covariates (one regression per vector). I am wondering if anyone has any idea of how

RE: [R] regression slope

2004-07-21 Thread Pfaff, Bernhard
see also the contributed document by John Verzani, Simple R, page 87f. Adaikalavan Ramasamy wrote: I would try to construct the confidence intervals and compare them to the value that you want x <- rnorm(20) y <- 2*x + rnorm(20) summary( m1 <- lm(y~x) ) snip Coefficients:

Re: [R] regression slope

2004-07-21 Thread Douglas Bates
On Tue, 2004-07-20 at 17:02, Avril Coghlan wrote: Hello, I'm a newcomer to R so please forgive me if this is a silly question. It's that I have a linear regression: fm <- lm(x ~ y) and I want to test whether the slope of the regression is significantly less than 1. How can I do this in R? Another

[R] Regression Modeling Strategies Short Course

2004-07-21 Thread Harrell, Frank E
I will be giving a one-day short course related to my book Regression Modeling Strategies in Toronto as part of the Joint Statistical Meetings on August 8. For more information visit the American Statistical Association web site amstat.org and biostat.mc.vanderbilt.edu/rms. The course applies

[R] regression slope

2004-07-20 Thread Avril Coghlan
Hello, I'm a newcomer to R so please forgive me if this is a silly question. It's that I have a linear regression: fm <- lm(x ~ y) and I want to test whether the slope of the regression is significantly less than 1. How can I do this in R? I'm also interested in comparing the slopes of two
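
One common way to test H0: slope = 1 (this may differ from the approaches in the replies), writing the model generically as y regressed on x: move x into an offset so the reported coefficient and t statistic refer to slope - 1, then take the one-sided p-value directly from the t distribution.

    fit <- lm(y ~ x + offset(x))             # coefficient on x now estimates (slope - 1)
    ct <- summary(fit)$coefficients["x", ]
    pt(ct["t value"], df = fit$df.residual)  # one-sided p-value for slope < 1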

Re: [R] regression slope

2004-07-20 Thread Adaikalavan Ramasamy
I would try to construct the confidence intervals and compare them to the value that you want x <- rnorm(20) y <- 2*x + rnorm(20) summary( m1 <- lm(y~x) ) snip Coefficients: Estimate Std. Error t value Pr(>|t|) (Intercept) 0.1418 0.1294 1.095 0.288 x 2.2058

Re: [R] regression slope

2004-07-20 Thread Corey Moffet
At 06:44 PM 7/20/2004 +0100, Adaikalavan Ramasamy wrote: I would try to construct the confidence intervals and compare them to the value that you want x <- rnorm(20) y <- 2*x + rnorm(20) summary( m1 <- lm(y~x) ) snip Coefficients: Estimate Std. Error t value Pr(>|t|) (Intercept)

Re: [R] regression slope

2004-07-20 Thread Sundar Dorai-Raj
Adaikalavan Ramasamy wrote: I would try to construct the confidence intervals and compare them to the value that you want x <- rnorm(20) y <- 2*x + rnorm(20) summary( m1 <- lm(y~x) ) snip Coefficients: Estimate Std. Error t value Pr(>|t|) (Intercept) 0.1418 0.1294 1.095 0.288

[R] Regression Modeling query

2004-06-22 Thread devshruti pahuja
Hi All, I received a raw data set with one record per tennis player (both male and female) and then I cured it by aggregation, i.e. by 4 age groups, 2 gender levels and 6 income levels. Gender and Income are categorical variables. Please advise me how to use 'R' to model this data set (Actually, I

Re: [R] Regression Modeling query

2004-06-22 Thread devshruti pahuja
whereas in men's tennis it's fluctuating. Also, I would like to know which age group reflects the prime of a tennis player, and hence I've changed continuous variables to categorical. Please advise. Thanks -Dev From: Peter Flom [EMAIL PROTECTED] To: [EMAIL PROTECTED] Subject: Re: [R] Regression

Re: [R] Regression Modeling query

2004-06-22 Thread devshruti pahuja
From: Berton Gunter [EMAIL PROTECTED] To: devshruti pahuja [EMAIL PROTECTED] Subject: Re: [R] Regression Modeling query Date: Tue, 22 Jun 2004 13:44:13 -0700 MIME-Version: 1.0 X-Sender: Berton Gunter [EMAIL PROTECTED] Received: from compton.gene.com ([192.12.78.250]) by mc8-f14.hotmail.com

RE: [R] Regression Modeling query

2004-06-22 Thread Liaw, Andy
into books to find little useful. Any references that you provide will be pretty helpful Please advise -Dev From: Berton Gunter [EMAIL PROTECTED] To: devshruti pahuja [EMAIL PROTECTED] Subject: Re: [R] Regression Modeling query Date: Tue, 22 Jun 2004 13:44:13 -0700 MIME-Version

Re: [R] Regression query

2004-06-13 Thread Peter Flom
If variables are colinear, then looking at interactions among them doesn't make much sense. High collinearity means that one variable is nearly a linear combination of others. IOW, that variable is not adding much information. So, if you look at the interaction, you are ALMOST looking at a
