If it converts into a factor although it appears to be numeric, then there
is probably a string entry somewhere in this variable, which causes R to
convert the whole variable into a factor (which results in
dummies in the regression). You will want to check whether there are any
excess
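A quick way to find the offending entries is to coerce and look for new NAs (toy data here, not the poster's):

```r
x <- c("1.2", "3.4", "oops", "5.6")   # one stray string entry
num <- as.numeric(as.character(x))    # non-numeric entries become NA, with a warning
x[is.na(num)]                         # reveals the culprit: "oops"
```

If the column is already a factor, as.character() first recovers the labels; as.numeric() on the factor itself would return the level codes instead.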
It seems the crux of the problem is:
# Here's my problem... my attempt below is to build a vector, of length cands,
# of all of the TAZDetermine_FEET values that are less than or equal to
# Dev_size.
If you do:
TAZDetermine_FEET[ TAZDetermine_FEET <= Dev_size ]
then you will get the vector of
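With toy values, that subset looks like:

```r
TAZDetermine_FEET <- c(100, 250, 400, 75)   # made-up values
Dev_size <- 300
TAZDetermine_FEET[TAZDetermine_FEET <= Dev_size]   # 100 250 75
```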
Hi Mark, I cannot explain why it does it. But if you use data.frame instead,
it works.
Cheers,
Daniel
f.lmmultenhanced <-
function(response, pred1, pred2)
{
regmod <- lm(response ~ pred1 + pred2)
lmsum <- summary(regmod)
imbcoef <- lmsum$coefficients[2,1]
-----Original Message-----
From: r-help-boun...@r-project.org
[mailto:r-help-boun...@r-project.org] On Behalf Of Larry Ma
Sent: Friday, December 12, 2008 8:34 PM
To: r-help@r-project.org
Subject: [R] sas.get function in Hmisc 3.4-4 vs. 3.0-12
Dear R Users,
I have just installed R 2.8
K F Pearce wrote:
Hello everyone.
This is a question regarding generation of the concordance index (c
index) in R using the function rcorr.cens. In particular about
interpretation of its direction and form of the 'predictor'.
Since Frank Harrell hasn't replied I'll contribute my 2 cents.
Evan DeCorte wrote:
I am trying to make estimates of the predictive power of ARIMA models estimated by the auto.arima() function.
I am looping through a large number of time series and fitting ARIMA models with the following code.
data1 <- read.csv(file = "case.csv", header = TRUE)
data <- data1
Daren Tan wrote:
Besides the impute package, are there others that have alternative imputation approaches? I hope to compare their performance.
Have a look at transcan() and impute() in package Hmisc.
--
Gad Abraham
Dept. CSSE and NICTA
The University of Melbourne
Parkville 3010, Victoria,
Gad Abraham wrote:
Daren Tan wrote:
Besides the impute package, are there others that have alternative
imputation approaches? I hope to compare their performance.
Have a look at transcan() and impute() in package Hmisc.
aregImpute in Hmisc is better as it does real multiple imputation
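A minimal aregImpute() sketch (the data frame and variable names here are made up):

```r
library(Hmisc)
set.seed(1)
d <- data.frame(x1 = rnorm(50), x2 = rnorm(50), x3 = rnorm(50))
d$x1[sample(50, 5)] <- NA                        # punch some holes to impute
imp <- aregImpute(~ x1 + x2 + x3, data = d, n.impute = 5)
imp$imputed$x1                                   # candidate draws per missing value
```

fit.mult.impute() can then combine the imputations with a model fit.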
Gad Abraham wrote:
K F Pearce wrote:
Hello everyone.
This is a question regarding generation of the concordance index (c
index) in R using the function rcorr.cens. In particular about
interpretation of its direction and form of the 'predictor'.
Since Frank Harrell hasn't replied I'll
for(i in seq_along(data))
{
point_data = unlist(data[i], use.names = FALSE)
x = auto.arima(point_data, max.p = 10, max.q = 10, max.P = 0, max.Q = 0,
approximation = TRUE)
}
However, I would like to find a way to test the out of sample predictive
power of these models. I
On Dec 12, 2008, at 10:59 PM, js.augus...@gmail.com wrote:
Hi all,
I'm quite new to R and have a very basic question regarding how one
gets
the standard error of the mean for factor levels under aov. I was
able to
get the factor level means using:
Matching row names was the request.
# continuing with your example:
rownames(d1) <- letters[1:5]
rownames(d2) <- letters[3:7]
# and then for rownames of d1 that are also in rownames of d2:
# for the full rows ...
d1[row.names(d1) %in% row.names(d2),]
# or for just the names:
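The truncated "just the names" step presumably continues with something like:

```r
d1 <- data.frame(v = 1:5); rownames(d1) <- letters[1:5]
d2 <- data.frame(v = 1:5); rownames(d2) <- letters[3:7]
intersect(row.names(d1), row.names(d2))   # "c" "d" "e"
```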
I'm not sure this is the appropriate forum--please let me know if I
should post somewhere else.
I have Rpad version 1.3.0 set up on my webserver. It works, except
that graphics are not displayed. They are created (i.e., when I run
the example I see that graphics files are created in
Hi,
I have a data set x and I want to check with a Kolmogorov-Smirnov test whether x
comes from a Subbotin distribution, whose density function is:
function(x,location,scale,tail) # Exponential power (=Subbotin)
{
const <- 2*scale*tail^(1/tail) * gamma(1+1/tail)
z <- (x-location)/scale
Perhaps because you have not loaded the package that contains it?
If you have installed MASS (via the super package VR) then try:
require(MASS)
?negative.binomial
--
David Winsemius
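For reference, a negative.binomial() family sketch on a MASS data set (the theta value is purely illustrative):

```r
library(MASS)
# quine: days absent from school; negative.binomial(theta) is a glm() family
fit <- glm(Days ~ Age + Sex, family = negative.binomial(theta = 1.5), data = quine)
summary(fit)
```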
On Dec 11, 2008, at 4:29 PM, Marty Kardos wrote:
Hi;
I am running generalized linear mixed models (GLMMs)
Dan,
Thanks for the link. Here is what I learned:
#1 new Hmisc sas.get which didn't work
status <- sys(paste(shQuote(sasprog), shQuote(sasin), "-log",
shQuote(log.file)), output = FALSE)
#2 old Hmisc which worked, and you can replace the following line in the new
sas.get
status <-
Hi David, thanks for the quick response. I did look at the help files for
model.tables and se.contrast and neither seemed appropriate. I probably
wasn't clear enough in my original email, so here's more information:
I'm analyzing data from a psychology experiment on how people allocate
visual
On Dec 12, 2008, at 8:57 AM, Peter Dalgaard wrote:
Chuck Cleland wrote:
On 12/12/2008 3:29 AM, robert-mcfad...@o2.pl wrote:
Hello,
Which package allows one to use the Cochran-Armitage trend test? I tried
to search for it but I found only package coin, in which there is no
explicit function.
But
Wacek,
I am curious as to why Brian and I (and possibly other responders) are held to
a higher standard than the original poster.
My first question was a real question. There are two main ways to do regular
expression matching (possibly others as well); you describe one method with the
Marty Kardos wrote:
Hi;
I am running generalized linear mixed models (GLMMs) with the lmer
function
from the lme4 package in R 2.6.2. My response variable is overdispersed,
and
I would like, if possible, to run a negative binomial GLMM with lmer.
I saw a posting from
Dear all,
I'm using the lrm function from the package Design, and I want to extract
the p-values from the results of that function. Given an lrm object
constructed as follows :
fit <- lrm(Y~(X1+X2+X3+X4+X5+X6+X7)^2, data=dataset)
I need the p-values for the coefficients printed by calling fit.
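Design does not expose the p-values directly, but a sketch of the usual Wald computation from the fit's slots (assuming fit is the lrm object above):

```r
se <- sqrt(diag(fit$var))          # standard errors from the covariance matrix
z  <- fit$coefficients / se        # Wald z statistics
p  <- 2 * pnorm(-abs(z))           # two-sided p-values
```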
Sent this mail in rich text format before. Excuse me for this.
Dear all,
I'm using the lrm function from the package Design, and I want to
extract the p-values from the results of that function. Given an lrm
object constructed as follows :
fit <-
On Sat, 13 Dec 2008, markle...@verizon.net wrote:
could someone explain why the name of FPVAL gets .value concatenated onto
it when the code below is run and temp is returned.
I've been trying to figure this out for too long. It doesn't matter when I
put the FPVAL in the return statement. It
You have two problems:
a) You need the cumulative distribution function, hence I suggest to
look at package normalp which offers function pnormp() (as well as
dnormp and other related functions that might be of interest)
b) The KS test won't have much power given you are estimating the
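A sketch of the (naive) test, with the caveat above that estimating the parameters from x invalidates the KS p-value; the argument names should be checked against ?pnormp:

```r
library(normalp)   # dnormp()/pnormp() for the exponential power family
x <- rnorm(100)
ks.test(x, "pnormp", mu = mean(x), sigmap = sd(x), p = 2)   # p = 2 is the Gaussian case
```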
On Dec 13, 2008, at 11:37 AM, Jason Augustyn wrote:
Hi David, thanks for the quick response. I did look at the help
files for model.tables and se.contrast and neither seemed
appropriate. I probably wasn't clear enough in my original email, so
here's more information:
I'm analyzing data
joris meys wrote:
Sent this mail in rich text format before. Excuse me for this.
Dear all,
I'm using the lrm function from the package Design, and I want to
extract the p-values from the results of that function. Given an lrm
object constructed as follows :
fit <-
Doesn't this give you what you need?
model.tables(rawfixtimedata.aov, "means", se=TRUE)
I tried that, but get an error:
SEs for type 'means' are not yet implemented
Maybe I'm not using the correct terminology to describe what I need to do.
Using the main effect of Marking as an example, I have
On Dec 13, 2008, at 1:12 PM, joris meys wrote:
Sent this mail in rich text format before. Excuse me for this.
Dear all,
I'm using the lrm function from the package Design, and I want to
extract the p-values from the results of that function. Given an lrm
object
Thanks for the great feedback. Conceptually I understand how you would go about
testing out of sample performance. It seems like accuracy() would be the best
way to test out of forecast performance and will help to automate the
construction of statistics I would have calculated on my own.
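A hold-out sketch along those lines (AirPassengers stands in for one of the poster's series):

```r
library(forecast)
y <- AirPassengers
train <- window(y, end = c(1958, 12))    # fit on everything up to Dec 1958
test  <- window(y, start = c(1959, 1))   # hold the last two years back
fit <- auto.arima(train)
accuracy(forecast(fit, h = length(test)), test)   # in- and out-of-sample error measures
```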
On Dec 13, 2008, at 2:15 PM, js.augus...@gmail.com wrote:
Doesn't this give you what you need?
model.tables(rawfixtimedata.aov, "means", se=TRUE)
I tried that, but get an error:
SEs for type 'means' are not yet implemented
I don't get that error. Using the example and this call
On Dec 12, 2008, at 8:57 AM, Peter Dalgaard wrote:
Chuck Cleland wrote:
On 12/12/2008 3:29 AM, robert-mcfad...@o2.pl wrote:
Hello,
Which package allows one to use the Cochran-Armitage trend test? I tried
to search for it but I found only package coin, in which there is no
explicit function.
But
Taoufik NADIFI taoufik.nadifi at sgcib.com writes:
Please, can you tell me how I can use the Igraph library from C#?
Not easy, but possible. Try to google for rcom c# and watch the list
http://www.mail-archive.com/rco...@mailman.csd.univie.ac.at/
where some problems with more
To clarify : I am aware of the interpretation problems. Thank you for
the tip! (it's getting late here...)
On Sat, Dec 13, 2008 at 9:56 PM, joris meys jorism...@gmail.com wrote:
Thanks for the answers.
@David : I am aware of that, but this is far from the last model actually.
@ Frank : I
joris meys wrote:
Thanks for the answers.
@David : I am aware of that, but this is far from the last model actually.
@ Frank : I know the Anova procedure gives more relevant p-values, but
the attempt is to order the terms by interaction type from low
significance to high significance, based on
On Dec 12, 2008, at 8:41 AM, Gabor Grothendieck wrote:
It's a FAQ:
http://cran.r-project.org/doc/FAQ/R-FAQ.html#How-can-I-turn-a-string-into-a-variable_003f
On Fri, Dec 12, 2008 at 8:30 AM, Philip Whittall
philip.whitt...@detica.com wrote:
I am still struggling to map a character string to
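The FAQ answer boils down to get()/assign():

```r
assign("myvar", 1:3)   # create a variable whose name is held in a string
get("myvar")           # retrieve it: 1 2 3
```

Though in most cases a named list or environment is cleaner than assign().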
Dear R users,
Is there anyone who could send me articles, e-books or scripts (R/Matlab)
about Hidden Markov Models?
I would like to study this methodology.
Thanks a lot.
Best regards.
--
Prof. Marcus Vinicius P. de Souza
Juiz de Fora / MG
Brasil
On Dec 12, 2008, at 11:14 AM, Marc Marí Dell'Olmo wrote:
Hello all,
Does anyone know if there exists any function in R that resembles the
lincom and nlcom commands of Stata? These functions compute point
estimates, standard errors, significance levels, confidence intervals,
etc. for linear and non
Greg Snow wrote:
Wacek,
I am curious as to why Brian and I (and possibly other responders) are held
to a higher standard than the original poster.
(we have just had an offline communication, it should be fine to keep it
that way)
My first question was a real question. There are 2 main
Thank you.
I got tryCatch working ...
Maura
-----Original Message-----
From: Sarah Goslee [mailto:sarah.gos...@gmail.com]
Sent: Fri 12/12/2008 16.24
To: mau...@alice.it
Cc: r-h...@stat.math.ethz.ch
Subject: Re: [R] Error handling with R
Is try() what you're looking for?
Sarah
On Fri, Dec
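For the archives, a minimal tryCatch() sketch:

```r
result <- tryCatch(
  log(-1),                      # warns and yields NaN; an error would be caught too
  error   = function(e) NA,
  warning = function(w) NA
)
result                          # NA
```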
Evan DeCorte wrote:
Thanks for the great feedback. Conceptually I understand how you would go about testing out of sample performance. It seems like accuracy() would be the best way to test out of forecast performance and will help to automate the construction of statistics I would have
Those commands provide point estimates, standard errors and confidence
intervals based on linear combination of parameters or
linearization/delta-method, respectively. R's contrasts appear to be
limited to a single factor and combinations that sum up to zero.
I am just so used to Stata's
estimable in the gmodels package provides point estimates, standard
errors and confidence intervals for arbitrary linear combinations of
model parameters. I don't know for non-linear combinations, though.
Cheers
Andrew
On Sat, Dec 13, 2008 at 11:33:12PM -0600, Stas Kolenikov wrote:
Those
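An estimable() sketch on a built-in data set (the combination tested here is arbitrary):

```r
library(gmodels)
fit <- lm(mpg ~ wt + hp, data = mtcars)
# estimate and test wt + hp; one weight per coefficient: (Intercept), wt, hp
estimable(fit, c(0, 1, 1))
```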
Controlling the pointer is going to be very different from perl since the R
functions are vectorized rather than focusing on a single string.
Here is one approach that will give all the matches and lengths (for the
original problem at least):
mystr <- paste(rep("1122", 10), collapse = "")
n <-
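The gregexpr() route, sketched for a toy pattern (the original code above was truncated):

```r
mystr <- paste(rep("1122", 10), collapse = "")
m <- gregexpr("12", mystr)[[1]]
m                            # starting position of every match
attr(m, "match.length")      # length of each match (all 2 here)
```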
There is a function for linear combinations in the car package as well.
Best
Ronggui
On Sun, Dec 14, 2008 at 1:42 PM, Andrew Robinson
a.robin...@ms.unimelb.edu.au wrote:
estimable in the gmodels package provides point estimates, standard
errors and confidence intervals for arbitrary linear
David Kaplan wrote:
Hi,
We're using stats4 for a logistic regression. The code is
chdreg.logit2 <- glm(chd ~ age + sex, family = binomial)
summary(chdreg.logit2)
oddsratios <- coef(chdreg.logit2)
exp(oddsratios)
# Calculate model predicted values
pred <- predict(chdreg.logit2, type = "response")
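A natural follow-up, assuming chdreg.logit2 as fitted above, is confidence intervals on the odds-ratio scale:

```r
exp(confint(chdreg.logit2))   # confidence intervals for the odds ratios
```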