On 2024/1/30 20:00, Martin Becker wrote:
Apart from the fact that the statement "such that t1+t2+t3+t4=2970 (as
it must)" is not correct, the LP can be implemented as follows:
I was confused by "such that t1+t2+t3+t4=2970 (as it must)", otherwise,
I also get the same solution.
Apart from the fact that the statement "such that t1+t2+t3+t4=2970 (as
it must)" is not correct, the LP can be implemented as follows:
library(lpSolve)
LHS <- rbind(
c(0,0,0,0, 1, 0, 0,0),
c(1,0,0,0,-1, 1, 0,0),
c(0,1,0,0, 0,-1, 1,0),
c(0,0,1,0, 0, 0,-1,1),
cbind(-diag(4),diag(4)),
Question for 'experts' in LP using R (using the lpSolve package, say) --
which does not apply to me for the sort of problem I describe below.
I've run any number of LP's using lpSolve in R, but all of them to date
have objective and constraint functions that both contain the same
variables.
1. You should regress Elevation on Volume, no?
2. You are calling lm incorrectly for prediction. Please read ?lm and
related links carefully and/or consult a tutorial. R-Help is really not the
first place you should look for this sort of detailed info.
3. I think this is what you want:
lm1 <-
Dear all;
I have a dataframe with several columns. The columns are the elevation,
volume and the area of the cells (which were placed inside a polygon). I
have extracted them from DEM raster to calculate the volume under polygon
and the elevation for a specific volume of the reservoir.
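A minimal sketch of the fit-then-predict pattern Bert describes, with synthetic data standing in for the poster's DEM extract (the column names Elevation and Volume come from the post; the numbers are made up):

```r
set.seed(1)
## Synthetic stand-in for the poster's dataframe
dat <- data.frame(Volume = seq(0, 100, by = 5))
dat$Elevation <- 10 + 0.2 * dat$Volume + rnorm(nrow(dat), sd = 0.5)

## Regress Elevation on Volume ...
lm1 <- lm(Elevation ~ Volume, data = dat)

## ... then get the elevation for a specific reservoir volume by passing
## *newdata* to predict(), not by calling lm() again
predict(lm1, newdata = data.frame(Volume = 42))
```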
>
On 12.10.2023 16:25, Fernando Archuby wrote:
Hi.
I have successfully performed the discriminant analysis with the lda
function, I can classify new individuals with the predict function, but I
cannot figure out how the lda results translate into the classification
decision. That is, I don't
It's possible that neither of these will help, but
(1) you can look at the source code of the predict method
(MASS:::predict.lda)
(2) you can look at the source reference ("Modern Applied Statistics with
S", Venables and Ripley) to see if it gives more information (although
it might not);
Hi.
I have successfully performed the discriminant analysis with the lda
function, I can classify new individuals with the predict function, but I
cannot figure out how the lda results translate into the classification
decision. That is, I don't realize how the classification equation for new
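It may help to see that the classification decision is simply the argmax over the posterior matrix that predict() returns (the scoring itself can be read in MASS:::predict.lda, as the reply suggests). A sketch on the stock iris data:

```r
library(MASS)

fit <- lda(Species ~ ., data = iris)
pr  <- predict(fit, iris)

## The predicted class is the column of the posterior matrix with the
## largest probability for each observation
manual <- colnames(pr$posterior)[max.col(pr$posterior, ties.method = "first")]
all(manual == as.character(pr$class))   # TRUE
```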
There is no difference when running anova or t-test. So you shouldn't
expect positive variance between batches.
On Fri, Mar 4, 2022 at 7:06 PM array chip via R-help
wrote:
> Thanks Jeff for reminding me that the attachment is removed. I put it in
> my google drive if anyone wants to test the
Do you really think a variance from a sample size of 2 makes any sense?
Bert Gunter
"The trouble with having an open mind is that people keep coming along
and sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
On Fri, Mar 4, 2022 at 5:06 PM array chip
Thanks Jeff for reminding me that the attachment is removed. I put it in my
google drive if anyone wants to test the data
(https://drive.google.com/file/d/1lgVZVLHeecp9a_sFxEPeg6353O-qXZhM/view?usp=sharing)
I'll try the mixed model mailing list as well.
John
On Friday, March 4, 2022,
str(dat2)
'data.frame': 37654 obs. ...:
 $ Yld: int
 $ A  : int
 $ B  : chr
 $ C  : chr
On Wed, Jan 26, 2022 at 10:49 AM Bert Gunter wrote:
> What does str(dat2) give?
>
> Bert Gunter
>
> "The trouble with having an open mind is that people keep coming along
> and sticking
What does str(dat2) give?
Bert Gunter
"The trouble with having an open mind is that people keep coming along
and sticking things into it."
-- Opus (aka Berkeley Breathed in his "Bloom County" comic strip )
On Wed, Jan 26, 2022 at 7:37 AM Val wrote:
>
> Hi all,
>
> I am trying to get the
Hi all,
I am trying to get the lsmeans for one of the factors fitted in the
following model
Model1 = lm(Yld ~ A + B + C, data = dat2)
M_lsm = as.data.frame(lsmeans(Model1, "C"))
My problem is, I am getting this error message.
"Error: The rows of your requested reference grid would be 81412,
Dear Helmut,
The mixed models list is more suitable for this kind of question. I'm
forwarding it to that list. Please send any follow-up to that list instead
of the general R help list.
If I understand correctly, you'll need a different variance term for both
treatments (the within subject for T
Dear all,
I’m struggling to set up a model required by the FDA (haha, and the
Chinese agency). The closest I could get is given at the end (it matches
the one preferred by other regulatory agencies worldwide). The FDA is
happy with R, but "close" is not close /enough/.
Don't hit me. I'm
I think you might be looking for
?contrasts
to form the contrast matrix.
Rich
On Mon, May 13, 2019 at 7:31 AM Witold E Wolski wrote:
>
> I am looking for a function to compute contrasts with an interface
> similar to that of
>
> lmerTest::contest
> multcomp::glht
>
> i.e. taking the model and a
I am looking for a function to compute contrasts with an interface
similar to that of
lmerTest::contest
multcomp::glht
i.e. taking the model and a contrast vector or matrix as an argument,
but for linear models, and without the multiple-testing adjustment made
by multcomp::glht.
Thank you
--
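For a plain lm fit this can be done by hand in a few lines, with no multiplicity adjustment; a sketch using a hypothetical contrast vector K on an mtcars fit:

```r
fit <- lm(mpg ~ wt + hp, data = mtcars)

K <- c(0, 1, -1)                       # hypothetical contrast: beta_wt - beta_hp
est  <- drop(K %*% coef(fit))          # point estimate of the contrast
se   <- drop(sqrt(K %*% vcov(fit) %*% K))
tval <- est / se
pval <- 2 * pt(-abs(tval), df = df.residual(fit))
c(estimate = est, se = se, t = tval, p = pval)
```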
Hi Kenneth,
My guess is that you have tried to send screenshots of your output and
these were blocked. Try to cut and paste the output into your message.
Jim
On Tue, Aug 7, 2018 at 6:38 PM, John wrote:
> On Mon, 6 Aug 2018 20:18:38 +0200
> kenneth Barnhoorn wrote:
>
> Your examples did not
On Mon, 6 Aug 2018 20:18:38 +0200
kenneth Barnhoorn wrote:
Your examples did not appear. Remember to use plain text rather
than html.
JWDougherty
> I have a problem with a linear regression output.
>
> In January I made an analysis of some data and received a certain
> output, if I run the
I have a problem with a linear regression output.
In January I made an analysis of some data and received a certain output; if I
run the same code now I don’t receive the same output and I don’t see why. It
is important to know the country, so I would like to see the country names
behind the
Generally, statistics questions are off topic here, although they do
sometimes intersect R programming issues, as perhaps here.
Nevertheless, I believe your post would fit better on the
r-sig-mixed-models list, where repeated measures and other mixed
effects (/variance components) models are
Dear list,
this seemed to me like a very trivial question, but finally I haven't found
any similar postings with suitable solutions on the net ...
Basically, instead of regressing two simple series of measures 'a' and 'b'
(like b ~ a), I would like to use independent replicate measurements for
Step back a minute: normality is NOT required for predictors in a
multiple regression model, though the sqrt(x) transformation may
also make the relationship more nearly linear, and linearity IS
assumed when you fit a simple model such as y ~ x + w + z.
(Normality is only required for the
Before going to stackexchange you should consider if a square root
transformation is appropriate for the model that you are trying to
estimate. If you do so, you may be able to interpret the coefficients
yourself. If no explanation is obvious you probably should not be using a
square root
Hello,
R-Help answers questions on R code, your question is about statistics.
You should try posting the question to
https://stats.stackexchange.com/
Hope this helps,
Rui Barradas
Em 23-10-2017 18:54, kende jan via R-help escreveu:
Dear all, I am trying to fit a multiple linear regression
Dear all, I am trying to fit a multiple linear regression model with a
transformed dependent variable (the normality assumption was not verified...).
I have applied a sqrt(variable) transformation... The results are great, but I
don't know how to interpret the beta coefficients... Is it
Hello,
I have a netcdf file for summer monsoon rainfall gridded data over Indian
region. How can I find the linear trend in R?
regards
Sourabh Bal
Dr. Sourabh Bal
Assistant Professor
Department of Physics
Swami Vivekananda Institute of Science and Technology
Kolkata 700145
Thank You
-- Original Message --
From: "Bert Gunter" <bgunter.4...@gmail.com>;
Date: 2017-03-11 (Sat) 11:44
To: <liushu...@qq.com>;
Cc: "r-help" <r-help@r-project.org>;
Subject: Re: [R] Linear Mixed-Effec
Have you read the docs? Is this some kind of homework? -- this list
does not do homework. We expect minimal efforts at least on the part
of posters. We do not do tutorials here. I think you need to do some
reading on your own before posting further. Try posting on the
r-sig-mixed-models list to
Dear R Help:
What does "lmer(responce ~ factor1*factor2 + (factor1*factor2 | group1) +
(factor1*factor2| group2), data)" mean?
And
"lmer(responce ~ factor1*factor2 + (factor1*factor2 | group1) +
(factor1*factor2| group2), data)" vs. "lmer(responce ~ factor1*factor2 +
(factor1+factor2 | group1)
Small example code to set up the problem?
JN
On 2017-01-07 06:26 AM, Preetam Pal wrote:
> Hi Guys,
> Any help with this,please?
> Regards,
> Preetam
>
> On Thu, Jan 5, 2017 at 4:09 AM, Preetam Pal wrote:
>
>> Hello guys,
>>
>> The context is ordinary multivariate
, you
should cite it and describe why not specifically so we don't find it and
think "you should have found this yourself". The second hit that came up
when I typed 'R linear optimization "quadratic constraint"' into Google
was https://cran.r-project.org/web/packages/ROI/
Hi Guys,
Any help with this,please?
Regards,
Preetam
On Thu, Jan 5, 2017 at 4:09 AM, Preetam Pal wrote:
> Hello guys,
>
> The context is ordinary multivariate regression with k (>1) regressors,
> i.e. *Y = XB + Error*, where
> Y = n X 1 vector of predicted variable,
> X =
Hello guys,
The context is ordinary multivariate regression with k (>1) regressors,
i.e. *Y = XB + Error*, where
Y = n X 1 vector of predicted variable,
X = n X (k + 1) matrix of regressor variables(including ones in the first
column)
B = (k+1) vector of coefficients, including intercept.
Say, I
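For reference, the OLS solution in this notation is Bhat = (X'X)^(-1) X'Y; a quick simulated sketch (made-up coefficients):

```r
set.seed(1)
n <- 200; k <- 2
X <- cbind(1, matrix(rnorm(n * k), n, k))     # includes the column of ones
B <- c(1, 2, -0.5)                            # (k+1) coefficients incl. intercept
Y <- X %*% B + rnorm(n, sd = 0.1)

Bhat <- solve(crossprod(X), crossprod(X, Y))  # (X'X)^{-1} X'Y
```

The same numbers fall out of `lm(Y ~ X - 1)`, which is usually the more convenient route.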
Grothendieck
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
Thx a lot Gabor!
Aljosa Aleksandrovic, FRM, CAIA
Quantitative Analyst - Convertibles
aljosa.aleksandro...@man.com
Tel +41 55 417 76 03
Man Investments (CH) AG
Huobstrasse 3 | 8808 Pfäffikon SZ
Below are the covariates for a model ~x1+x2+x3+x4+x5+x6. I noticed when
fitting this model that the coefficient for x6 is unestimable. *Is this merely
a case that adding more columns to my model matrix will eventually lead to
linear dependence, so the more terms I have in the model formula the more
Thanks Brian for all your kind help.
"didn't mean to imply that the different parameterization of the contrasts
would make the lm estimates agree more with the lmer estimates, only that
it might be easier to compare the regression summary output to see how
similar/dissimilar they were ".
Got it
Utkarsh: I think the differences between the lm and lmer estimates of the
intercept are consistent with the regularization effect expected with
mixed-effects models where the estimates shrink towards the mean slightly.
I don't think there is any reason to expect exact agreement between the lm
and
Hi Brian,
This makes some sense to me theoretically, but doesn't pan out with my
experiment.
The contrasts default was the following as you said:
> options("contrasts")
$contrasts
unordered ordered
"contr.treatment" "contr.poly"
I changed it as follows:
>
Your lm() estimates are using the default contrasts of contr.treatment,
providing an intercept corresponding to your subject 308 and the other
subject* estimates are differences from subject 308 intercept. You could
have specified this with contrasts as contr.sum and the estimates would be
more
Hello Thierry,
Thank you for your quick response. Sorry, but I am not sure if I follow
what you said. I get the following outputs from the two models:
> coef(lmer(Reaction ~ Days + (1 | Subject), sleepstudy))
$Subject
    (Intercept)     Days
308    292.1888 10.46729
309    173.5556 10.46729
310
The parametrisation is different.
The intercept in model 1 is the effect of the "average" subject at days == 0.
The intercept in model 2 is the effect of the first subject at days == 0.
ir. Thierry Onkelinx
Instituut voor natuur- en bosonderzoek / Research Institute for Nature and
Forest
team
Hi experts,
While the slope comes out identical in the two methods below, the
intercepts do not. As far as I understand, both formulations are
identical in the sense that each asks for a slope corresponding to
'Days' and a separate intercept term for each Subject.
# Model-1
Hi,
Take a look at the package "ic.infer" by Ulrike Gromping.
https://www.jstatsoft.org/article/view/v033i10
Best,
Ravi
Ravi Varadhan, Ph.D. (Biostatistics), Ph.D. (Environmental Engg)
Associate Professor, Department of Oncology
Division of Biostatistics & Bioinformatics
Sidney Kimmel
Hi all,
I hope you are doing well?
I'm currently using lm() to estimate a linear multi-factor (5 factors without
intercept) model as follows ...
factor.lm <- lm(y~x1+x2+x3+x4+x5-1, data = data.frame.rbind)
Using nnls(A,b) I estimated the same model, extended by a non-negativity
constraint on
:06
To: Gabor Grothendieck
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
Thx a lot Gabor!
Aljosa Aleksandrovic, FRM, CAIA
Quantitative Analyst - Convertibles
aljosa.aleksandro...@man.com
Tel +41 55 417 76 03
Man Investments (CH) AG
Huobstrasse 3 | 8
Sent: Donnerstag, 28. April 2016 14:48
To: Aleksandrovic, Aljosa (Pfaeffikon)
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
The nls2 package can be used to get starting values.
On Thu, Apr 28, 2016 at 8:42 AM, Aleksandrovic, Aljosa (Pfaeffikon
...@gmail.com]
Sent: Dienstag, 26. April 2016 17:59
To: Aleksandrovic, Aljosa (Pfaeffikon)
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
This is a quadratic programming problem that you can solve using either a
quadratic programming solver with constraints
ail.com]
> Sent: Dienstag, 26. April 2016 17:59
> To: Aleksandrovic, Aljosa (Pfaeffikon)
> Cc: r-help@r-project.org
> Subject: Re: [R] Linear Regressions with constraint coefficients
>
> This is a quadratic programming problem that you can solve using either a
> quadratic prog
Any help with exporting anova output in R to csv or xlsx?
From: "Aleksandrovic, Aljosa (Pfaeffikon)" <aljosa.aleksandro...@man.com>
To: Bert Gunter <bgunter.4...@gmail.com>
Cc: "r-help@r-project.org" <r-help@r-project.org>
Sent: Tuesday, April 26,
Grothendieck [mailto:ggrothendi...@gmail.com]
Sent: Dienstag, 26. April 2016 17:59
To: Aleksandrovic, Aljosa (Pfaeffikon)
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
This is a quadratic programming problem that you can solve using either a
quadratic
This is a quadratic programming problem that you can solve using
either a quadratic programming solver with constraints or a general
nonlinear solver with constraints. See
https://cran.r-project.org/web/views/Optimization.html
for more info on what is available.
Here is an example using a
[mailto:bgunter.4...@gmail.com]
Sent: Dienstag, 26. April 2016 17:49
To: Aleksandrovic, Aljosa (Pfaeffikon)
Cc: r-help@r-project.org
Subject: Re: [R] Linear Regressions with constraint coefficients
Have you tried web searching on " R constrained linear regression" or similar.
There seemed to be
unter [mailto:bgunter.4...@gmail.com]
> Sent: Dienstag, 26. April 2016 16:51
> To: Aleksandrovic, Aljosa (Pfaeffikon)
> Cc: r-help@r-project.org
> Subject: Re: [R] Linear Regressions with constraint coefficients
>
> If the slope coefficients sum to a constant, the regressors are d
Subject: Re: [R] Linear Regressions with constraint coefficients
If the slope coefficients sum to a constant, the regressors are dependent and
so a unique solution is impossible (an infinity of solutions would result). So
I think you have something going on that you don't understand and should
If the slope coefficients sum to a constant, the regressors are
dependent and so a unique solution is impossible (an infinity of
solutions would result). So I think you have something going on that
you don't understand and should consult a local statistician to help
you formulate your problem
Hi all,
I hope you are doing well?
I’m currently using the lm() function from the package stats to fit linear
multifactor regressions.
Unfortunately, I didn’t yet find a way to fit linear multifactor regressions
with constraint coefficients? I would like the slope coefficients to be all
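The question is cut off here, but the follow-ups suggest the constraint is that the slopes sum to a constant. A single equality constraint can be absorbed by reparameterization, with no QP solver needed; a sketch assuming b1 + b2 = 1 on made-up data (inequality constraints such as nonnegativity do need a quadratic-programming solver, as Gabor notes):

```r
set.seed(1)
n  <- 100
x1 <- rnorm(n); x2 <- rnorm(n)
y  <- 0.3 * x1 + 0.7 * x2 + rnorm(n, sd = 0.1)

## Impose b1 + b2 = 1 by substituting b2 = 1 - b1:
##   y = b1*x1 + (1 - b1)*x2 + e   =>   y - x2 = b1*(x1 - x2) + e
fit <- lm(I(y - x2) ~ I(x1 - x2) - 1)
b1 <- unname(coef(fit)); b2 <- 1 - b1
c(b1 = b1, b2 = b2)   # satisfies b1 + b2 = 1 exactly
```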
One technique for dealing with this is called 'multiple imputation'.
Google for 'multiple imputation in R' to find R packages that implement
it (e.g., the 'mi' package).
Bill Dunlap
TIBCO Software
wdunlap tibco.com
On Tue, Mar 15, 2016 at 8:14 AM, Lorenzo Isella
wrote:
IMHO this is not a question about R... it is a question about statistics
whether R is involved or not. As such, a forum like stats.stackexchange.com
would be better suited to address this.
FWIW I happen to think that expecting R to solve this for you is unreasonable.
--
Sent from my phone.
Dear All,
A situation that for sure happens very often: suppose you are in the
following situation
set.seed(1235)
x1 <- seq(30)
x2 <- c(rep(NA, 9), rnorm(19)+9, c(NA, NA))
x3 <- c(rnorm(17)-2, rep(NA, 13))
y <- exp(seq(1,5, length=30))
mm<-lm(y~x1+x2+x3)
i.e. you try a simple linear
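For context on why imputation comes up: lm() silently drops every row containing an NA (the default na.action = na.omit), so the fit may use far fewer observations than the data frame holds. A tiny sketch:

```r
## lm() drops incomplete rows by default (na.action = na.omit)
d <- data.frame(y = 1:5, x = c(1, NA, 3, 4, NA))
fit <- lm(y ~ x, data = d)
nobs(fit)                 # only the complete rows are used
sum(complete.cases(d))    # same count, checked directly on the data
```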
On 16/11/15 20:49, Ragia Ibrahim wrote:
Dear group, if I had an objective function and some constraints formed
in linear-model form, is there a way (a library in R) to solve such a
model and find the unknown variables in it?
This is a very ill-posed question and is unlikely to provoke
Dear group
If I had an objective function and some constraints formed in linear-model form,
is there a way (a library in R) to solve such a model and find the
unknown variables in it?
thanks in advance
Ragia
Hi Ravi,
And remember that the vanilla rounding procedure is biased upward. That is,
an observation of 5 actually may have ranged from 4.5 to 5.4.
Jim
On Thu, Oct 22, 2015 at 7:15 AM, peter salzman
wrote:
> here is one thought:
>
> if you plug in your numbers into
:11 PM
To: Charles C. Berry
Cc: Ravi Varadhan; r-help@r-project.org
Subject: Re: [R] Linear regression with a rounded response variable
> On 21 Oct 2015, at 19:57 , Charles C. Berry <ccbe...@ucsd.edu> wrote:
>
> On Wed, 21 Oct 2015, Ravi Varadhan wrote:
>
>> [snippage]
>
> Yes, and I think that the suggestion in another post to look at censored
> regression is more in the right direction.
I think this is right and perhaps the best (or at least better) pathway to
pursue than considering this within the framework of measurement error (ME). Of
course there *is*
Hi,
I am dealing with a regression problem where the response variable, time
(second) to walk 15 ft, is rounded to the nearest integer. I do not care for
the regression coefficients per se, but my main interest is in getting the
prediction equation for walking speed, given the predictors (age,
On Wed, 21 Oct 2015, Ravi Varadhan wrote:
Hi, I am dealing with a regression problem where the response variable,
time (second) to walk 15 ft, is rounded to the nearest integer. I do
not care for the regression coefficients per se, but my main interest is
in getting the prediction equation
Hi Ravi,
Thanks for this interesting question. My thoughts are given below.
If you believe the rounding is indeed uniformly distributed, then the
problem is equivalent with adding a uniform random error between (-0.5,
0.5) for every observation in addition to the standard normal error, which
here is one thought:
if you plug in your numbers into any kind of regression you will get
prediction that are real numbers and not necessarily integers, it may be
that you predictions are good enough with this approximate value of Y. you
could test this by randomly shuffling your data by +- 0.5
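Peter's suggestion can be checked directly by simulation: rounding the response acts like adding roughly Uniform(-0.5, 0.5) noise, which leaves the coefficients nearly unbiased and only inflates the residual variance. A sketch with made-up coefficients:

```r
set.seed(42)
n  <- 500
x  <- runif(n, 1, 10)
y0 <- 2 + 0.5 * x + rnorm(n, sd = 0.3)  # exact walking time
y  <- round(y0)                          # recorded to the nearest second

## Compare fits to the exact and the rounded response: the slope and
## intercept estimates stay close; only sigma-hat grows
rbind(exact = coef(lm(y0 ~ x)), rounded = coef(lm(y ~ x)))
```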
This could be modeled directly using Bayesian techniques. Consider the
Bayesian version of the following model where we only observe y and X. y0
is not observed.
y0 <- X b + error
y <- round(y0)
The following code is based on modifying the code in the README of the CRAN
rcppbugs R
> On 21 Oct 2015, at 19:57 , Charles C. Berry wrote:
>
> On Wed, 21 Oct 2015, Ravi Varadhan wrote:
>
>> [snippage]
>
> If half the subjects have a value of 5 seconds and the rest are split between
> 4 and 6, your assertion that rounding induces an error of
>
-project.org
Subject: [R] Linear regression of 0/1 response ElemStatLearn (Fig. 2.1 the
elements of statistical learning)
Hello
In chapter 2 ESL book authors write: Let's look at example of linear
model in a classification context
They fit a simple linear model g = 0.3290614 -0.0226360x1 + 0.2495983x2 + e
On 07/30/2014 05:00 AM, r-help-requ...@r-project.org wrote:
A while ago, I inquired about fitting excess relative risk models in R. This is
a follow-up about what I ended up doing in case the question pops up again.
While I was not successful in using standard tools, switching to Bayesian
(stanFit)
stanFit
-Original Message-
From: Wollschlaeger, Daniel
Sent: Thursday, January 9, 2014 10:44 AM
To: David Winsemius
Cc: r-help@r-project.org
Subject: RE: AW: [R] Linear relative rate / excess relative risk models
Thanks for your suggestions! Here are links to simulated data
Hello,
I'm an R beginner and I want to run a multiple regression about birds. My data
is stored in a .csv file.
I tried to do this with the following code:
reg.data <- read.table(file.choose(), header = TRUE, sep = ";", dec = ",")
attach(reg.data)
names(reg.data)
model <- lm(Flights ~ Age + Gender + weight +
One way to see where the first warning comes from is to turn warnings
into errors with options(warn=2) and when the error happens call
traceback().
Bill Dunlap
TIBCO Software
wdunlap tibco.com
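A small sketch of Bill's recipe (wrapped in tryCatch so it runs non-interactively; at the console you would instead let the error happen and then call traceback()):

```r
## A function that emits a warning partway through
f <- function(x) { z <- log(x); "done" }      # log of a negative warns: NaNs produced

old <- options(warn = 2)                      # promote warnings to errors
res <- tryCatch(f(-1), error = conditionMessage)
options(old)                                  # restore the previous setting
res                                           # the warning text, now an error message
```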
On Sun, Jun 29, 2014 at 4:12 AM, wat tele watt...@hotmail.de wrote:
Hello,
I'm a R beginner and I
Dear all
I am trying to fit a linear mixed model to my data. In short, my dependent
variable reflects changes of the bone level (Knmn, in mm); this variable is
continuous and takes negative values. I have two different groups (factor
Group) that were measured 3 times each (thus repeated
-project.org
Onderwerp: [R] linear mixed model for non-normal negative and continous data
Dear all
I try to fit a linear mixed model to my data. In short, my dependent variable
reflects changes of the bone level (Knmn, in mm), thus this variable is
continous and provides negative values. I have two
Hi,
I'm trying to plot a linear line on the scatter plot using the pairs()
function. At the moment the line is non-linear. However, I want a linear
line and the associated R value.
Here is my current code:
panel.cor.scale <- function(x, y, digits = 2, prefix = "", cex.cor)
{
  usr <- par("usr")
-help@r-project.org
Subject: [R] Linear line on pairs plot
Hi,
Im trying to plot a linear line on the scatter plot using the pairs()
function. At the moment the line is non linear. However, I want a linear
line and the associated R value.
Here is my current code:
panel.cor.scale
Sent: 25. april 2014 12:26
To: r-help@r-project.org
Subject: [R] Linear line on pairs plot
Hi,
Im trying to plot a linear line on the scatter plot using the pairs()
function. At the moment the line is non linear. However, I want a linear
line and the associated R value.
Here
received this e-mail in error please contact the sender.
From: Shane Carey [mailto:careys...@gmail.com]
Sent: 25. april 2014 13:14
To: Frede Aakmann Tøgersen
Cc: r-help@r-project.org
Subject: Re: [R] Linear line on pairs plot
Great, thanks Frede,
This works perfectly. Ive tested these correlations
:* 25. april 2014 13:14
*To:* Frede Aakmann Tøgersen
*Cc:* r-help@r-project.org
*Subject:* Re: [R] Linear line on pairs plot
Great, thanks Frede,
This works perfectly. I've tested these correlations with ones in sigma
plot and excel and for some reason the r squared value is different
to www.vestas.com/legal/notice
If you have received this e-mail in error please contact the sender.
*From:* Shane Carey [mailto:careys...@gmail.com]
*Sent:* 25. april 2014 13:14
*To:* Frede Aakmann Tøgersen
*Cc:* r-help@r-project.org
*Subject:* Re: [R] Linear line on pairs plot
Great, thanks Frede
[mailto:careys...@gmail.com]
*Sent:* 25. april 2014 13:14
*To:* Frede Aakmann Tøgersen
*Cc:* r-help@r-project.org
*Subject:* Re: [R] Linear line on pairs plot
Great, thanks Frede,
This works perfectly. Ive tested these correlations with ones in sigma
plot and excel and for some reason the r
I have a problem of this form:
min (A*X) under some constraints.
The unknown is X, which is a matrix. I can't use the function linp() because
there X has to be a vector.
How can I do this? Can you help me?
On 03/17/2014 07:57 AM, Barbara Rogo wrote:
I have this problem with this form:
min (A*X) under some constraints.
the unknown is X that is a Matrix. I can't use the function linp because
in it X is a vector..
How can I do??? Can you help me
If X is a matrix, then A*X could be a matrix or
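If the solver (linp() from limSolve, in the poster's case) only accepts a vector unknown, the usual workaround is to optimize over vec(X) and reshape afterwards: a linear objective in the matrix X is still linear in the stacked vector. The round trip in base R:

```r
## Pass x = as.vector(X) (column-major vec) to the solver, then rebuild
## the matrix from the returned solution vector
X  <- matrix(1:6, nrow = 2, ncol = 3)
x  <- as.vector(X)               # vec(X), length 6
X2 <- matrix(x, nrow = 2, ncol = 3)
identical(X, X2)                 # TRUE: no information is lost either way
```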
event and offset pyears.
Many thanks, D
-Original Message-
From: David Winsemius [mailto:dwinsem...@comcast.net]
Sent: Thursday, January 09, 2014 4:33 AM
To: Wollschlaeger, Daniel
Cc: r-help@r-project.org
Subject: Re: AW: [R] Linear relative rate / excess relative risk models
My question is how I can fit linear relative rate models (= excess relative
risk models, ERR) using R. In radiation epidemiology, ERR models are used to
analyze dose-response relationships for event rate data and have the following
form [1]:
lambda = lambda0(z, alpha) * (1 + ERR(x, beta))
*
I would fit a Poisson model to the dose-response data with offsets for the
baseline expecteds.
Sent from my iPhone
On Jan 8, 2014, at 10:49 AM, Wollschlaeger, Daniel
wollschlae...@uni-mainz.de wrote:
My question is how I can fit linear relative rate models (= excess relative
risk
Von: David Winsemius [dwinsem...@comcast.net]
Gesendet: Mittwoch, 8. Januar 2014 19:06
An: Wollschlaeger, Daniel
Cc: r-help@r-project.org
Betreff: Re: [R] Linear relative rate / excess relative risk models
I would fit a Poisson model to the dose-response data with offsets
Von: David Winsemius [dwinsem...@comcast.net]
Gesendet: Mittwoch, 8. Januar 2014 19:06
An: Wollschlaeger, Daniel
Cc: r-help@r-project.org
Betreff: Re: [R] Linear relative rate / excess relative risk models
I would fit a Poisson model to the dose-response
Hello, colleagues!
I'm having a problem illustrating a linear transformation of a
multivariate Gaussian RV using R.
According to the theory, if X ~ N(0, I), then CX ~ N(0, CC').
But the code below doesn't illustrate this. Please could you help me
find the mistake?
require(tmvtnorm)
C =
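The theory can be checked without tmvtnorm, and a frequent mistake in exactly this illustration is the orientation of the product when draws are stored as rows: with one draw per row of X, the transformed draws are X %*% t(C), not X %*% C. A sketch with a hypothetical C:

```r
set.seed(1)
n <- 1e5
C <- matrix(c(1, 0.5, 0,
              0, 1,   0.3,
              0, 0,   1), nrow = 3, byrow = TRUE)  # a hypothetical C

X <- matrix(rnorm(n * 3), ncol = 3)  # rows are draws from N(0, I)
Y <- X %*% t(C)                      # each row is C x, so rows of Y ~ N(0, CC')

max(abs(cov(Y) - C %*% t(C)))        # small, and shrinks as n grows
```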
Hi
I have a question about drawing a straight line in an x, y plot. I usually
use the following code, but this time the x values are too small
(-0.08 to -0.02).
I wrote the following code, but R does not draw the line. However, it
does not give an error when it runs the code.
reg1 <- lm(CWSI ~ NWI,
A few problems ...
This statement doesn't make sense.
seq(-0.08, -0.02, len = -0.02)
Perhaps you meant
seq(-0.08, -0.02, by = 0.02)
The xlim= and ylim= are arguments to higher level plot functions, like
plot(), and won't work for functions lines() or abline().
Are you trying to limit the range
On Oct 28, 2013, at 8:30 AM, Ahmed Attia wrote:
Hi
I have a question about drawing a linear line in x, y plot. I usually
use the following code, but for this time the x values are to small
(-0.08 to -0.02)
That is not the problem.
I wrote the following code, but r does not draw the line.
On 10/29/2013 02:30 AM, Ahmed Attia wrote:
Hi
I have a question about drawing a linear line in x, y plot. I usually
use the following code, but for this time the x values are to small
(-0.08 to -0.02)
I wrote the following code, but r does not draw the line. However, it
does not give an error
First off, I am new to using R.
I have a dataset that I plotted using R. I created a scatter plot and used
abline() to draw the line; what I need is to find the equation of that line.
Below is the script I have used up until this point.
young400_1 <- read.csv("Z:\\SOFTEL\\North Key Largo
summary(lm(Canopy_Height~Ground_Elevation, data=young400_1)) #use
data= instead of attach!
Or even:
mylm <- lm(Canopy_Height ~ Ground_Elevation, data = young400_1)
mylm
summary(mylm)
coefficients(mylm)
Most intro to R guides cover the basics of modeling; you might benefit
from reading one of them.
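The "equation of the line" drawn by abline(fit) is just the fitted intercept and slope; a sketch with the built-in cars data standing in for the poster's CSV:

```r
fit <- lm(dist ~ speed, data = cars)
b <- unname(coef(fit))

plot(dist ~ speed, data = cars)   # the scatter plot
abline(fit)                       # the fitted least-squares line

## the equation of that line, read off the coefficients:
sprintf("dist = %.2f + %.2f * speed", b[1], b[2])
```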