1. You should regress Elevation on Volume, no?
2. You are calling lm incorrectly for prediction. Please read ?lm and
related links carefully and/or consult a tutorial. R-Help is really not the
first place you should look for this sort of detailed info.
3. I think this is what you want:
lm1 <-
Dear all,
I have a data frame with several columns: the elevation, volume, and area of the
cells (which were placed inside a polygon). I extracted them from a DEM raster
to calculate the volume under the polygon and the elevation for a specific
volume of the reservoir.
I think you might be looking for
?contrasts
to form the contrast matrix.
Rich
On Mon, May 13, 2019 at 7:31 AM Witold E Wolski wrote:
I am looking for a function to compute contrasts with an interface
similar to that of
lmerTest::contest
multcomp::glht
i.e. taking the model and a contrast vector or matrix as an argument,
but for linear models, and without the multiple-testing adjustment made
by multcomp::glht.
Thank you
--
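For plain lm fits, such a contrast test can be sketched by hand; this is a minimal illustration on mtcars, and both the fit and the contrast vector are my assumptions, not the poster's data:

```r
# Minimal sketch: test a single contrast L'beta for an lm fit,
# with no multiplicity adjustment (one plain t-test)
fit <- lm(mpg ~ factor(cyl), data = mtcars)
L <- c(0, 1, -1)                    # hypothetical contrast: cyl6 effect minus cyl8 effect
est  <- drop(L %*% coef(fit))
se   <- sqrt(drop(t(L) %*% vcov(fit) %*% L))
tval <- est / se
pval <- 2 * pt(-abs(tval), df = df.residual(fit))
c(estimate = est, se = se, t = tval, p = pval)
```

This reproduces what glht computes for a single row of a contrast matrix, minus the adjustment step.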
Thanks Brian for all your kind help.
"didn't mean to imply that the different parameterization of the contrasts
would make the lm estimates agree more with the lmer estimates, only that
it might be easier to compare the regression summary output to see how
similar/dissimilar they were ".
Got it
Utkarsh: I think the differences between the lm and lmer estimates of the
intercept are consistent with the regularization effect expected with
mixed-effects models where the estimates shrink towards the mean slightly.
I don't think there is any reason to expect exact agreement between the lm
and
Hi Brian,
This makes some sense to me theoretically, but doesn't pan out with my
experiment.
The contrasts default was the following as you said:
> options("contrasts")
$contrasts
unordered ordered
"contr.treatment" "contr.poly"
I changed it as follows:
Your lm() estimates are using the default contrasts of contr.treatment,
providing an intercept corresponding to your subject 308 and the other
subject* estimates are differences from subject 308 intercept. You could
have specified this with contrasts as contr.sum and the estimates would be
more
Hello Thierry,
Thank you for your quick response. Sorry, but I am not sure if I follow
what you said. I get the following outputs from the two models:
> coef(lmer(Reaction ~ Days + (1 | Subject), sleepstudy))
$Subject
    (Intercept)     Days
308    292.1888 10.46729
309    173.5556 10.46729
310
The parametrisation is different.
The intercept in model 1 is the effect of the "average" subject at days ==
0.
The intercept in model 2 is the effect of the first subject at days == 0.
ir. Thierry Onkelinx
Instituut voor natuur- en bosonderzoek / Research Institute for Nature and
Forest
team
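The two parametrisations can be put side by side; a sketch assuming the lme4 package and its sleepstudy data are available:

```r
library(lme4)
# Mixed model: per-subject intercepts shrink toward the overall mean
m1 <- lmer(Reaction ~ Days + (1 | Subject), sleepstudy)
# Fixed-effects model with default treatment contrasts: the intercept
# belongs to the first subject (308); other terms are differences from it
m2 <- lm(Reaction ~ Days + Subject, sleepstudy)
fixef(m1)[["(Intercept)"]]   # intercept for the "average" subject
coef(m2)[["(Intercept)"]]    # intercept for subject 308
```

In this balanced design the Days slope agrees between the two fits; only the intercept parametrisation differs.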
Hi experts,
While the slope is coming out to be identical in the two methods below, the
intercepts are not. As far as I understand, both formulations are identical
in the sense that they ask for a slope corresponding to 'Days' and a
separate intercept term for each Subject.
# Model-1
One technique for dealing with this is called 'multiple imputation'.
Google for 'multiple imputation in R' to find R packages that implement
it (e.g., the 'mi' package).
Bill Dunlap
TIBCO Software
wdunlap tibco.com
On Tue, Mar 15, 2016 at 8:14 AM, Lorenzo Isella
wrote:
IMHO this is not a question about R... it is a question about statistics
whether R is involved or not. As such, a forum like stats.stackexchange.com
would be better suited to address this.
FWIW I happen to think that expecting R to solve this for you is unreasonable.
--
Sent from my phone.
Dear All,
A situation that surely happens very often. Suppose you are in the
following situation:
set.seed(1235)
x1 <- seq(30)
x2 <- c(rep(NA, 9), rnorm(19)+9, c(NA, NA))
x3 <- c(rnorm(17)-2, rep(NA, 13))
y <- exp(seq(1,5, length=30))
mm<-lm(y~x1+x2+x3)
i.e. you try a simple linear
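By default lm() silently drops every row with an NA in any variable (na.action = na.omit), which is what makes imputation attractive here; with the data above only a handful of complete rows survive:

```r
set.seed(1235)
x1 <- seq(30)
x2 <- c(rep(NA, 9), rnorm(19) + 9, c(NA, NA))
x3 <- c(rnorm(17) - 2, rep(NA, 13))
y  <- exp(seq(1, 5, length = 30))
mm <- lm(y ~ x1 + x2 + x3)   # rows with any NA are dropped before fitting
nobs(mm)                     # only rows 10-17 are complete: 8 observations
```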
On 16/11/15 20:49, Ragia Ibrahim wrote:
Dear group, if I had an objective function and some constraints formed
in linear model form, is there a way, or a library in R, that helps me to
solve such a model and find the unknown variables in it?
This is a very ill-posed question and is unlikely to provoke
Dear All,
I am struggling with a linear model and an allegedly trivial data set.
The data set does not consist of categorical variables, but rather of
numerical discrete variables (essentially, they count the number of times
that something happened).
Can I still use a standard linear
Lorenzo:
1. This is a statistics question, not an R question.
2. Your statistical background appears inadequate -- it looks like
Poisson regression, which would fall under generalized linear
models. But it depends on how discrete discrete is (on some level,
all measurements are discrete,
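For genuinely count-valued outcomes, the generalized linear model alluded to here is a one-liner; the data below are synthetic, for illustration only:

```r
# Sketch: Poisson regression for count outcomes via glm()
set.seed(1)
d <- data.frame(x = rnorm(50))
d$count <- rpois(50, lambda = exp(1 + 0.5 * d$x))
fit <- glm(count ~ x, family = poisson, data = d)   # log link by default
coef(fit)   # estimates on the log scale
```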
On Jun 13, 2013, at 2:21 PM, Bert Gunter wrote:
From: Peter Ehlers [ehl...@ucalgary.ca]
Sent: Wednesday, 3 April 2013 19:01
To: Adams, Jean
Cc: Cecilia Carmo; r-help@r-project.org
Subject: Re: [R] linear model coefficients by year and industry, fitted values,
residuals, panel data
A few minor improvements to Jean's post suggested
On 2013-04-04 02:11, Cecilia Carmo wrote:
Thank you all. I'm very happy with this solution. Just two questions:
I use mutate() with package plyr
Hi R-helpers,
My real data is a panel (unbalanced and with gaps in years) of thousands of
firms, by year and industry, and with financial information (variables X, Y, Z,
for example), the number of firms by year and industry is not always equal, the
number of years by industry is not always
Cecilia,
Thanks for providing a reproducible example. Excellent.
You could use the ddply() function in the plyr package to fit the model for
each industry and year, keep the coefficients, and then estimate the fitted
and residual values.
Jean
library(plyr)
coef <- ddply(final3, .(industry,
A few minor improvements to Jean's post suggested inline below.
On 2013-04-03 05:41, Adams, Jean wrote:
Peter.
For suggestion 1, what advantages are there to using coef() rather than
$coef?
For suggestion 2, thanks! I'm new to the plyr package and wasn't aware of
the mutate() function.
Jean
On Wed, Apr 3, 2013 at 1:01 PM, Peter Ehlers ehl...@ucalgary.ca wrote:
A few minor improvements to
On 04/04/2013 07:54 AM, Adams, Jean wrote:
Peter.
For suggestion 1, what advantages are there to using coef() rather than
$coef?
Just thought I'd chip in: It is considered, uh, politically correct to use
extractor functions rather than digging out components of objects
in a direct manner.
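Putting the ddply() suggestion and the coef() extractor together, a self-contained sketch; plyr is assumed installed, and the data frame here is synthetic, not Cecilia's final3:

```r
library(plyr)
set.seed(4)
panel <- data.frame(industry = gl(2, 20), year = rep(gl(2, 10), 2),
                    X = rnorm(40))
panel$Y <- 1 + 2 * panel$X + rnorm(40)
# One row of coefficients per industry-year cell, via the coef() extractor
coefs <- ddply(panel, .(industry, year),
               function(d) coef(lm(Y ~ X, data = d)))
coefs
```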
Dear R-users,
in the last days I have been trying to estimate a normal linear model with
equality and inequality constraints.
Please find below a simple example of my problem.
Of course, one could easily see that, though the constraints are consistent,
there is some redundancy in the
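One way to attack equality and inequality constraints, sketched with the quadprog package: solve.QP minimizes (1/2) b'Db - d'b subject to A'b >= b0, with the first meq constraints treated as equalities. The data and the constraints below are made up for illustration:

```r
library(quadprog)
set.seed(2)
X <- cbind(1, matrix(rnorm(60), 30, 2))
y <- drop(X %*% c(1, 2, 3) + rnorm(30))
# Constraints (illustrative): b2 + b3 = 5 (equality), b2 >= 0 (inequality)
Amat <- cbind(c(0, 1, 1), c(0, 1, 0))
bvec <- c(5, 0)
fit <- solve.QP(Dmat = crossprod(X), dvec = drop(crossprod(X, y)),
                Amat = Amat, bvec = bvec, meq = 1)
fit$solution   # constrained least-squares coefficients
```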
I have data X and Y, and I want to predict what the very next point would be
based off the model. This is what I have:
model=lm(x~y)
I think I want to use the predict function, but I'm not exactly sure what to
do.
Thank you!
On 24.07.2012 20:20, cm wrote:
I have data X and Y, and I want to predict what the very next point would be
based off the model. This is what I have:
model=lm(x~y)
Hmmm, are you sure about the above code?
I think I want to use the predict function, but I'm not exactly sure what to
do.
How do I set it up? Because when I do predict(model) I get a ton of points, not
just one.
- Original Message -
From: Uwe Ligges-3 [via R]
Date: Tuesday, July 24, 2012 2:28 pm
Subject: Re: Linear Model Prediction
To: cm
On Jul 24, 2012, at 1:38 PM, cm bunnylove...@optonline.net wrote:
How do I set it up? Because when I do predict(model) I get a ton of points,
not just one.
You need to supply newdata= . predict() without newdata gives predicted
values at the predictor values you fit the model to.
Yes, why wouldn't I? It's a linear model between two sets of data: x and y.
Also, what would the new data be if i want to predict into the future? So,
for example, the data goes from a month ago to today. I want to predict what
tomorrow's data would be. So what is newdata?
--
View this
On Tue, Jul 24, 2012 at 2:06 PM, bunnylove...@optonline.net wrote:
Yes, why wouldn't I? It's a linear model between two sets of data: x and y.
Conventionally, one predicts y based on x -- which is specified y ~ x,
not x ~ y. (Predictors on the RHS, predicted on the LHS)
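With the roles the right way round, the one-step-ahead prediction looks like this; the data below are synthetic, standing in for the poster's:

```r
set.seed(6)
d <- data.frame(x = 1:30)
d$y <- 2 + 0.5 * d$x + rnorm(30)
model <- lm(y ~ x, data = d)
# newdata holds the predictor value(s) you want predictions for
p <- predict(model, newdata = data.frame(x = 31))
p
```

If x is time, "tomorrow" is just the next x value supplied in newdata.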
Also, what would the
I cleaned up my old benchmarking code and added checks for missing
data to compare various ways of finding OLS regression coefficients.
I thought I would share this for others. the long and short of it is
that I would recommend
ols.crossprod = function (y, x) {
x -
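The function is cut off above; a minimal completion of the crossprod idea (my reconstruction, not the poster's code) is:

```r
# Sketch: OLS coefficients via the normal equations, (X'X)^{-1} X'y
ols.crossprod <- function(y, x) {
  x <- cbind(Intercept = 1, as.matrix(x))   # assumed: prepend an intercept
  solve(crossprod(x), crossprod(x, y))
}
set.seed(8)
x <- rnorm(100)
y <- 1 + 2 * x + rnorm(100)
drop(ols.crossprod(y, x))   # matches coef(lm(y ~ x))
```

crossprod(x) avoids forming t(x) %*% x explicitly, which is where the speed-up in the benchmark comes from.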
Dear R Users,
When using the lm() function with a categorical variable, R uses contrasts.
Assume that I have one independent variable X with 3 levels. Because R
estimates only 2 parameters (e.g. a1, a2), the coef function returns
only 2 estimates. Is there any function or trick to get the third
value, a3? I
?dummy.coef
(NB: 'R' does as you tell it, and if you ask for the default contrasts
you get coefficients a2 and a3, not a1 and a2. So perhaps you did
something else and failed to tell us? And see the comment in
?dummy.coef about treatment contrasts.)
On Sun, 12 Jun 2011, Robert Ruser
Prof. Ripley, thank you very much for the answer, but I wanted to get
something else. Here is an example and an explanation:
options(contrasts = c("contr.sum", "contr.poly")) # contr.sum uses 'sum
to zero' contrasts
Y <- c(6,3,5,2,3,1,1,6,6,6,7,4,1,6,6,6,6,1)
X <- structure(list(x1 = c(2L, 3L, 1L, 3L, 3L,
Hi Robert,
Try this:
reg2 <- lm( Y ~ factor(x1) + factor(x2) + factor(x3) + factor(x4) +
factor(x5) - 1, data = X )
coef(reg2)
HTH,
Jorge
On Sun, Jun 12, 2011 at 4:40 PM, Robert Ruser wrote:
Prof. Ripley, thank you very much for the answer, but I wanted to get
something else. There is an
Hi,
but I want to get the coefficients for every variables from x1 to x5.
(x1 was an example)
Robert
2011/6/12 Jorge Ivan Velez jorgeivanve...@gmail.com:
this may work.
X <- data.frame(lapply(X, as.factor))  # lapply keeps each column a factor
reg3 <- lm(Y ~ ., data = X)
dummy.coef(reg3)
Weidong Gu
On Sun, Jun 12, 2011 at 4:55 PM, Robert Ruser robert.ru...@gmail.com wrote:
Hi Weidong,
thank you very much. It really works fine.
Robert
2011/6/12 Weidong Gu anopheles...@gmail.com:
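A compact illustration of the dummy.coef() trick on synthetic data:

```r
# coef() shows only estimable parameters under treatment contrasts;
# dummy.coef() reports an effect for every factor level
set.seed(10)
d <- data.frame(y = rnorm(9), g = gl(3, 3))
fit <- lm(y ~ g, data = d)
coef(fit)         # intercept plus 2 level contrasts
dummy.coef(fit)   # all 3 level effects (first constrained to 0)
```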
-Original Message-
From: stephen sefick [mailto:ssef...@gmail.com]
Sent: April-03-11 5:35 PM
To: Steven McKinney
Cc: R help
Subject: Re: [R] Linear Model with curve fitting parameter?
Steven:
You are exactly right; sorry, I was confused.
-Original Message-
From: stephen sefick [mailto:ssef...@gmail.com]
Sent: April-04-11 2:49 PM
To: Steven McKinney
Subject: Re: [R] Linear Model with curve fitting parameter?
Steven:
I am really sorry for my confusion. I hope this now makes sense.
b0 == y intercept == y
Thank you very much for all of your help.
On Mon, Apr 4, 2011 at 6:10 PM, Steven McKinney smckin...@bccrc.ca wrote:
-Original Message-
From: stephen sefick [mailto:ssef...@gmail.com]
Sent: April-01-11 5:44 AM
To: Steven McKinney
Cc: R help
Subject: Re: [R] Linear Model with curve fitting parameter?
Setting Z=Q-A would be the incorrect dimensions. I could Z=Q/A.
I suspect this is confusion
I have a model Q=K*A*(R^r)*(S^s)
A, R, and S are data I have and K is a curve fitting parameter. I
have linearized as
log(Q)=log(K)+log(A)+r*log(R)+s*log(S)
I have taken the log of the data that I have and this is the model
formula without the K part
lm(Q~offset(A)+R+S, data=x)
What is the
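The offset idea works once everything is on the log scale; here is a sketch with synthetic data, where K = 2.5, r = 0.6, and s = 0.3 are chosen arbitrarily:

```r
set.seed(42)
A <- runif(50, 1, 10); R <- runif(50, 1, 5); S <- runif(50, 1, 5)
Q <- 2.5 * A * R^0.6 * S^0.3 * exp(rnorm(50, sd = 0.1))
# offset(log(A)) pins the coefficient of log(A) at 1, so the
# intercept estimates log(K)
fit <- lm(log(Q) ~ offset(log(A)) + log(R) + log(S))
exp(coef(fit)[["(Intercept)"]])   # estimate of K
```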
Hi,
Sorry for the naive question, but what exactly does the 'Adjusted R-squared'
coefficient in the summary of linear model adjust for?
Sample code:
x <- rnorm(15)
y <- rnorm(15)
lmr <- lm(y ~ x)
summary(lmr)
Call:
lm(formula = y ~ x)
Residuals:
Min 1Q Median 3Q Max
-1.7828
See:
http://en.wikipedia.org/wiki/Coefficient_of_determination#Adjusted_R2
and the implementation in summary.lm :
ans$adj.r.squared <- 1 - (1 - ans$r.squared) * ((n - df.int)/rdf)
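The adjustment penalizes R-squared for the number of fitted parameters; for a model with an intercept it reduces to 1 - (1 - R^2)(n - 1)/(n - p - 1), which can be checked directly:

```r
set.seed(7)
x <- rnorm(15); y <- rnorm(15)
s <- summary(lm(y ~ x))
n <- 15; p <- 1   # observations and non-intercept predictors
manual <- 1 - (1 - s$r.squared) * (n - 1) / (n - p - 1)
all.equal(manual, s$adj.r.squared)   # TRUE
```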
Hi,
I wanted to check the difference in results (using lme4) if I treated a
particular variable (beadchip) as a random effect vs. if I treated it as a
fixed effect.
For the first case, my formula is:
lmer.result - lmer(expression ~ cancerClass + (1|beadchip))
For the second case, I want
Hi,
can somebody tell me why R is not able to calculate a linear model
written in this way?
lm (seq(1:100)~seq(1:100))
Call:
lm(formula = seq(1:100) ~ seq(1:100))
Coefficients:
(Intercept)
50.5
Warning messages:
1: In model.matrix.default(mt, mf, contrasts) :
the response appeared on
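The warning is the clue: the identical expression appears as both response and predictor, so the duplicated term is dropped from the model matrix and only an intercept (the mean, 50.5) is fitted. With two distinct objects the regression behaves normally:

```r
x <- seq(1, 100)   # note: seq(1:100) is an unidiomatic spelling of this
y <- seq(1, 100)
coef(lm(y ~ x))    # intercept ~ 0, slope ~ 1
```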
On 08/05/2010 05:50 AM, Giuseppe Amatulli wrote:
On Aug 5, 2010, at 6:50 AM, Giuseppe Amatulli wrote:
On Nov 13, 2009, at 11:49 AM, Sam Albers wrote:
Hello R list,
snipped answered question
Sorry to not use your data but it's not in a form that lends itself
very well to quick testing. If you had included the input commands I
might have tried it.
No problem not using my data. For
On Fri, Nov 13, 2009 at 11:49 AM, Sam Albers tonightstheni...@gmail.com wrote:
snip
No problem not using my data. For future reference, would it have been easier
to attach a .csv file and then include the appropriate read.csv command? I
realize that the easier one makes it to help, the easier it
Hello R list,
This is a question for anyone who has used the by() command. I would like to
perform a regression on a data frame by several factors. Using by() I think
that I have able to perform this using the following:
lm.r <- by(master, list(Sectionf=Sectionf, startd=startd), function(x) lm
Hi,
You have not given us all the data needed to reproduce your analysis
(what is SectionF?), but the issue is probably that lm.r is a list and
you're not treating it that way. Try
str(lm.r)
and
summary(lm.r[[1]])
You may also want to look at the lmList() function in the lme4 package.
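A self-contained version of the by() pattern, using a built-in data set rather than the poster's:

```r
# One lm() per group via by(); the result is a list-like object
fits <- by(warpbreaks, warpbreaks$wool,
           function(d) lm(breaks ~ tension, data = d))
length(fits)          # one fit per wool type
summary(fits[[1]])    # inspect a single group's fit
```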
Rnewb wrote:
Ravi Varadhan has an example how this could be
I would like to perform a regression like the one below:
lm(x ~ 0 + a1 + a2 + a3 + b1 + b2 + b3 + c1 + c2 + c3, data=data)
However, the data has the property that a1+a2+a3 = A, b1+b2+b3 = B, and
c1+c2+c3 = C, where A, B, and C are positive constants. So there are two
extra degrees of freedom,
Hello,
I am deriving near real-time linear relationships based on 5-min
precipitation data; sometimes the non-QCed data result in a slope of NA. I
am trying to read the coefficient (in this example x) to see if it is equal
to NA, and if it is, assign it a value of 1. I am having trouble
if(fit$coef[[2]] == NA) {.cw = 1}
See ?is.na
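The reason the original test fails is that `x == NA` is itself NA, so the if() condition is never TRUE or FALSE; is.na() is the reliable check. A tiny demonstration with a rank-deficient fit whose slope is NA:

```r
fit <- lm(c(1, 2, 3) ~ c(1, 1, 1))   # constant predictor: slope is NA
cw <- if (is.na(coef(fit)[2])) 1 else coef(fit)[2]
cw                                    # 1
```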
On Sep 21, 2009, at 4:38 PM, Douglas M. Hultstrand wrote:
On Sep 21, 2009, at 4:50 PM, David Winsemius wrote:
On 21-Sep-09 20:38:25, Douglas M. Hultstrand wrote:
Dear R-help,
Suppose I have the following data:
df <- data.frame(x = 1:10, y = c(1, 2, 3, 4, 5, 12, 14, 16, 18, 20))
plot(y ~ x, df, t = "b")
How can I fit a model which estimates the slopes between x = 1-5, 5-6,
and 6-10?
Adding the factor f:
df$f <- gl(2,5)
Allows me to fit a linear model with interaction
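Continuing the setup above, the interaction gives a separate slope per segment:

```r
df <- data.frame(x = 1:10, y = c(1, 2, 3, 4, 5, 12, 14, 16, 18, 20))
df$f <- gl(2, 5)                 # segment indicator: x in 1-5 vs 6-10
fit <- lm(y ~ x * f, data = df)
coef(fit)   # x is the slope in segment 1; x + x:f2 is the slope in segment 2
```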
On Aug 29, 2009, at 7:56 AM, Markus Gesmann wrote:
On Sat, 2009-08-29 at 12:56 +0100, Markus Gesmann wrote:
Does the segmented
Thanks to somebody, I got the hint to use offset() to validate whether
there is a difference between the intercept and slope of a model and some
provided values for those coefficients.
I read ?model.offset and I'm still struggling to use it for my
purpose. If I
On Sat, 8 Aug 2009, Katharina May wrote:
You could also use a Wald test for a
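One concrete way to use offset() for this: fix the coefficient at the provided value via the offset, then compare against the unconstrained fit. The data are synthetic and b0 is the hypothesized slope:

```r
set.seed(9)
x <- rnorm(40)
y <- 2 + 1.5 * x + rnorm(40)
b0 <- 1.5                            # hypothesized slope
fit0 <- lm(y ~ 1 + offset(b0 * x))   # slope held at b0
fit1 <- lm(y ~ x)                    # slope free
anova(fit0, fit1)                    # F-test of H0: slope == b0
```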
Hi there,
I've got a question which is surely trivial, but I still have
to ask, as I'm not making any progress solving it by myself (please be
patient with an undergraduate student):
I've got a linear model (lm and lmer fitted with method=ML).
Now I want to compare the coefficients
Hi every one
I perform a simple linear regression:
lm(a ~ b + c + d, data = data1)
How do I tell R to perform and print the regression while restricting the
coefficient of the variable c to be equal to 0.1? In the printed model, I
want to show the p-values of all my coefficients.
Thank you in
On Jun 5, 2009, at 11:10 AM, Axel Leroix wrote:
?lm
Examine material on
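Spelling out the offset approach from ?lm for this question; the data are synthetic, with the true coefficient of c set to 0.1 so the restriction is sensible:

```r
set.seed(11)
data1 <- data.frame(b = rnorm(30), c = rnorm(30), d = rnorm(30))
data1$a <- 1 + 0.5 * data1$b + 0.1 * data1$c - 0.2 * data1$d + rnorm(30)
# offset() fixes c's coefficient at 0.1; b and d are still estimated,
# so summary() shows their p-values as usual
fit <- lm(a ~ b + d + offset(0.1 * c), data = data1)
summary(fit)$coefficients
```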
Hi all,
I have a question about linear model with interaction:
I created a data frame df like this:
df
V1 V2 V3 V4 V5
1 6.414094 c t a g
2 6.117286 t a g t
3 5.756922 a g t g
4 6.090402 g t g t
...
which holds the response in the first column and letters (a,c,g,t) in
Hi
for some data I am working on, I am merely plotting time against temperature
for a variable named filmclip. For example, I have volunteers who watched
various film clips, and I used an infrared camera to monitor the temperature
on their face at every second of the clip.
The variable names I
Melissa2k9 wrote:
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org] On
Behalf Of Melissa2k9
Sent: Friday, April 03, 2009 5:51 AM
To: r-help@r-project.org
Subject: [R] Linear model, finding the slope
I want to know how accurate the p-values are when you do linear regression
in R.
I was looking at the variable x3: t = 10.843 with p-value = 2e-16, which is
the same p-value as for the intercept, where the t-value for the intercept
is 48.402.
I tried to calculate the p-value in R
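The 2e-16 figure is not the actual p-value but R's printing floor: summary() reports anything below .Machine$double.eps as "< 2.2e-16". The underlying value can be computed directly from the t statistic (df = 100 is assumed here for illustration, since the poster did not give it):

```r
.Machine$double.eps                            # about 2.2e-16, the printing floor
2 * pt(10.843, df = 100, lower.tail = FALSE)   # far smaller than the floor
```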
On Wed, Feb 11, 2009 at 1:36 PM, kayj kjaj...@yahoo.com wrote:
Hey,
I am modelling a linear regression Y = X*B + E. To compute the effect of
"group", the B-values of the regressors/columns that code the interaction
effects (col. 5-8 and col. 11-14, see below) have to be weighted with the
non-zero elements within the contrast Group 1 minus Group 2 (see below). My
The problem comes from mixing up general linear model (LM) theory to compute
B with the classical anova estimators. The two methods use different
approaches to solving the normal equations. LM theory uses any generalized
inverse of X'X to solve the normal equations. Yours comes from ginv()
Dear All,
I have a question which seems trivial, but I have reached a dead end.
I have a set of points (measurements) and I used lm() to obtain their linear
regression model. From the biological background, this line must pass through
the point (100, 0). Our dataset is not optimal and shows a slight deviation
from that coordinate. How can I add this restraint to the model,
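A standard trick: shift x so the required point becomes the origin, and drop the intercept. The data below are synthetic, scattered around the biological point (100, 0):

```r
set.seed(5)
x <- runif(30, 80, 120)
y <- -0.8 * (x - 100) + rnorm(30, sd = 0.5)
# I(x - 100) re-centres the line on x = 100; "- 1" removes the intercept,
# so the fitted line passes exactly through (100, 0)
fit <- lm(y ~ I(x - 100) - 1)
predict(fit, newdata = data.frame(x = 100))   # exactly 0
```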
: Wednesday, June 04, 2008 9:12 PM
To: Austin, Matt
Cc: r-help@r-project.org
Subject: Re: [R] linear model in the repeated data type~
Hi, lots of thanks. How do I use the list to extract? I typed
allFits$coefficients and it came to nothing. For instance, I need to
extract the estimates; how do I do that using the list?
2008/6/3
:03 PM
To: Manli Yan
Cc: r-help@r-project.org
Subject: Re: [R] linear model in the repeated data type~
allFits <- lmList(y ~ t|id, data=table1, pool=FALSE)
allCoefs <- sapply(allFits, coef) ## preferred by me
or
allCoefs <- vector("list", length(allFits))
for(i in seq_along(allFits)) allCoefs[[i]] <- coef
here is the data:
y <- c(5,2,3,7,9,0,1,4,5)
id <- c(1,1,6,6,7,8,15,15,19)
t <- c(50,56,50,56,50,50,50,60,50)
table1 <- data.frame(y,id,t)  # longitudinal data
the above is only part of data.
what I want to do is to use the linear model for each id ,then get the
estimate value,like:
Try this:
f <- function(x) any(is.na(coefficients(x)))
models <- by(table1[c("y", "t")], table1$id, FUN=lm)
models[!unlist(lapply(models, f))]
On Wed, Jun 4, 2008 at 6:20 PM, Manli Yan [EMAIL PROTECTED] wrote:
here is the data:
y <- c(5,2,3,7,9,0,1,4,5)
id <- c(1,1,6,6,7,8,15,15,19)
t <- c(50,56,50,56,50,50,50,60,50)
table1 <- data.frame(y,id,t)  # longitudinal data
what I want to do is to use the linear model for each id, then get the
estimate value, like:
fit1 <- lm(y ~ t, data = table1, subset = (id == 1))
but, you can
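The lmList() route from earlier in the thread, made self-contained (nlme is assumed; lme4 also provides an lmList):

```r
library(nlme)
y  <- c(5, 2, 3, 7, 9, 0, 1, 4, 5)
id <- c(1, 1, 6, 6, 7, 8, 15, 15, 19)
t  <- c(50, 56, 50, 56, 50, 50, 50, 60, 50)
table1 <- data.frame(y, id, t)
# pool = FALSE fits each id separately; ids with too little data give NAs
allFits <- lmList(y ~ t | id, data = table1, pool = FALSE)
sapply(allFits, coef)   # one column of estimates per id
```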
Try something like this:
fits <- vector("list", 500)
for (i in 1:500)
{
  if (sum(table1$id == i) == 0) fits[[i]] <- NA
  else fits[[i]] <- lm(y ~ t, data = table1, subset = (id == i))
}
--- On Wed, 4/6/08, Manli Yan [EMAIL PROTECTED] wrote:
Hi,
I am a bit unclear if svyglm with family=gaussian is
actually a normal linear model including weighting.
The goal is to estimate a normal linear model using
sample inflation weights.
Can anybody illuminate me a bit on this?
Thanks a lot!
Werner