, collapse = cluster, weighted =
TRUE) :
Wrong length for 'collapse'
I tried both 64-bit (R 3.1.0) and 32-bit (R 3.1.2) on Windows 7 64-bit and got
the same errors.
Inclusion of tt and cluster terms worked fine in R 2.9.2-2.15.1 under Windows
Vista 32-bit and Ubuntu 64-bit.
Any ideas?
Christos
Christos Argyropoulos
From: vicvoncas...@gmail.com
Date: Sun, 1 Jan 2012 14:10:36 -0500
To: dwinsem...@comcast.net
CC: r-help@r-project.org
Subject: Re: [R] R on Android
If the phone is rooted, one could hypothetically install from the Debian
repositories, where R is represented very
I believe there was a fairly recent exchange (within the last 6 months) about
linear measurement error models/error-in-variable models/Deming
regression/total least squares/orthogonal regression. Terry Therneau provided
code for an R function that can estimate such models:
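Terry Therneau's function is not reproduced in this snippet. As a rough illustration of what such a fit computes, here is a minimal Deming (orthogonal) regression sketch in base R; the `deming_fit` name and the `delta` parameterization are mine, not his:

```r
# Minimal Deming regression sketch (illustrative; NOT Terry Therneau's code).
# delta = assumed ratio of the y-error variance to the x-error variance;
# delta = 1 gives orthogonal (total least squares) regression.
deming_fit <- function(x, y, delta = 1) {
  mx <- mean(x); my <- mean(y)
  sxx <- mean((x - mx)^2)
  syy <- mean((y - my)^2)
  sxy <- mean((x - mx) * (y - my))
  slope <- (syy - delta * sxx +
            sqrt((syy - delta * sxx)^2 + 4 * delta * sxy^2)) / (2 * sxy)
  c(intercept = my - slope * mx, slope = slope)
}

# Points lying exactly on y = 2x + 1 are recovered exactly:
deming_fit(1:10, 2 * (1:10) + 1)
```

Unlike ordinary least squares, the slope here treats measurement error in both x and y, which is the whole point of the error-in-variables methods listed above.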
I'm not really sure I understand the question. Do you want to create a
function, which is defined as the integral of another function?
Something like:
f1 <- function(x) sin(x)
f2 <- function(x) cos(x)
integral <- function(u, integrand) integrate(integrand, 0, u)
integral(pi, f1)
2 with absolute error < 2.2e-14
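Note that integrate() returns a list, so the integral() above is not itself a numeric, vectorized function. If that is what is wanted, a small extension (assuming the same f1 as above) does it:

```r
f1 <- function(x) sin(x)

# Wrap integrate() so the result is a plain numeric and the upper
# limit may be a vector: Vectorize() maps over each element of u.
F1 <- Vectorize(function(u) integrate(f1, 0, u)$value)

F1(c(pi / 2, pi))   # integrals of sin from 0 to pi/2 and from 0 to pi
```

The result is approximately c(1, 2), matching the closed-form antiderivative 1 - cos(u).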
If the system is sparse and you have a really large cluster to play
with, then maybe (emphasis) PETSc/TAO is the right combination of tools
for your problem.
http://www.mcs.anl.gov/petsc/petsc-as/
Christos
One possible way is the following:
x <- c(0.49534, 0.80796, 0.93970, 0.8)
count <- c(0, 33, 0, 4)
x[count == 0]
[1] 0.49534 0.93970
x[count > 0]
[1] 0.80796 0.80000
Christos
Date: Tue, 6 Jul 2010 15:39:08 +0900
From: gunda...@gmail.com
To: r-h...@stat.math.ethz.ch
Subject: [R] Conditional
e.g. compare X=0.1 vs. X=5.1).
From my understanding of how HPDinterval works, the intervals returned by the
two different invocations should be very similar. So what causes this
discrepancy? Which of the two intervals should be used?
Regards,
Christos Argyropoulos
R CODE FOLLOWS
library(MASS)
Ravi.
-Original Message-
From: r-help-boun...@r-project.org [mailto:r-help-boun...@r-project.org
] On
Behalf Of Christos Argyropoulos
Sent: Friday, July 02, 2010 8:41 AM
To: sarah_sanche...@yahoo.com; r-help@r-project.org
Subject: Re: [R] Double Integration Function
Function adapt in package integrate maybe?
Date: Thu, 1 Jul 2010 05:30:25 -0700
From: sarah_sanche...@yahoo.com
To: r-help@r-project.org
Subject: [R] Double Integration
Dear R helpers
I am working on the Bi-variate Normal distribution probabilities. I need to
double integrate the
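The original code is truncated here. For the zero-correlation case, a double integral of the bivariate normal density can be checked against the closed-form pnorm() product with nothing but iterated one-dimensional integrate() calls; packages such as cubature (or the old adapt package mentioned above) do this adaptively for correlated cases. A base-R sketch:

```r
# Joint density of two independent standard normals (a stand-in for
# the bivariate normal density in the question):
f <- function(x, y) dnorm(x) * dnorm(y)

# Iterated integration: integrate over y for each fixed x (inner),
# then integrate that function of x (outer).
pbvn <- function(a, b) {
  inner <- Vectorize(function(x) integrate(function(y) f(x, y), -Inf, b)$value)
  integrate(inner, -Inf, a)$value
}

pbvn(0, 0)   # independent case: matches pnorm(0) * pnorm(0) = 0.25
```

The inner function must be wrapped in Vectorize() because integrate() expects a vectorized integrand.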
Hi Raoul,
I presume you need these summaries for a table of descriptive statistics for a
thesis/report/paper (known informally as "Table 1" by medical researchers).
If this is the case, then specify method = "reverse" to summary.formula.
In the following small example, I create 4 groups of
Look at the summary.formula function inside package Hmisc
Christos
Date: Sat, 26 Jun 2010 05:17:34 -0700
From: raoul.t.dso...@gmail.com
To: r-help@r-project.org
Subject: [R] Calculating Summaries for each level of a Categorical variable
Hi,
I have a dataset which has a categorical
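Per-level summaries do not strictly require an extra package; a base-R sketch of the same idea, with the built-in iris data standing in for the poster's dataset:

```r
# Mean of a numeric variable for each level of a categorical one,
# using only base R (iris stands in for the actual dataset):
aggregate(Sepal.Length ~ Species, data = iris, FUN = mean)

# Full five-number-plus-mean summaries per level:
tapply(iris$Sepal.Length, iris$Species, summary)
```

Hmisc's summary.formula adds the formatting and grouping conveniences on top of this.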
,
but it seems that Vectorize is easier.
Thanks for the help, I appreciate it.
Carrie
2010/6/23 Christos Argyropoulos argch...@hotmail.com
No, something else is going on here:
f <- function(x) { dmvnorm(c(0.6, 0.8), mean = c(0.75, 0.75/x)) * dnorm(x,
mean = 0.6,
sd = 0.15) }
f(1)
[1] 0.01194131
x <- seq(-2, 2, .15)
f(x)
Error in dmvnorm(c(0.6, 0.8), mean = c(0.75, 0.75/x)) :
mean and sigma have non-conforming size
But ...
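The error occurs because f() is only defined for scalar x (mean = c(0.75, 0.75/x) must have length 2), yet f(x) passes the whole vector at once. Wrapping with Vectorize() (or sapply()) applies f element-wise. The same pattern is shown here with a scalar-only base-R function, so the sketch runs without the mvtnorm package:

```r
# A function that, like f above, only accepts scalar input
# (integrate()'s upper limit must have length 1):
g <- function(x) integrate(dnorm, -Inf, x)$value

# g(c(0, 1.96)) errors; Vectorize() maps g over each element instead:
gv <- Vectorize(g)
gv(c(0, 1.96))   # close to pnorm(c(0, 1.96))
```

The same one-liner, Vectorize(f), fixes the dmvnorm example above.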
Hi,
You should use the sapply/lapply for such operations.
r <- round(runif(10, 1, 10))
head(r)
[1] 3 7 6 3 2 8
filt <- function(x, thres) ifelse(x < thres, x, thres)
system.time(r2 <- sapply(r, filt, thres = 5))
user  system elapsed
3.36    0.00    3.66
head(r2)
[1] 3 5 5 3 2 5
To return a list, use lapply instead.
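For this particular filter no apply-loop is needed at all: ifelse(x < thres, x, thres) is just a capped minimum, and the fully vectorized pmin() gives the same answer much faster:

```r
r <- c(3, 7, 6, 3, 2, 8)

# pmin() caps each element at the threshold in one vectorized call,
# matching the sapply(r, filt, thres = 5) result above:
pmin(r, 5)
# [1] 3 5 5 3 2 5
```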
The last thing you should be aware of concerns the numerical performance of
gamm (versus its cousin gamm4); the lme4 package is much, much faster and
numerically more stable for large problems, so you should prefer the
second interface for large problems.
Christos Argyropoulos
Date: Mon, 21 Jun 2010 23:05:55 +0100
Hi,
The error message you are getting (probably) means that the algorithm did not
converge. Did you check for convergence? (Look at the fail element of the
returned lrm object)
Christos
How about getting statistics of downloads of the R-base from the different CRAN
mirrors ?
This should (in principle) allow one to estimate the total # of people who
intended to use R at some point in their life.
It may even be possible to analyze those numbers for temporal trends since the
Hi,
Are you sure these are date objects and not strings? For example
d1 <- runif(10, 0, 100)
d2 <- runif(10, 0, 200)
df <- data.frame(d1 = as.Date("2010-6-20") + d1, d2 = as.Date("2009-6-20") + d2)
df
d1 d2
1 2010-09-23 2009-06-30
2 2010-06-25 2009-08-21
3 2010-10-10 2009-08-04
4 2010-07-06
Christos
Subject: RE: [R] Popularity of R, SAS, SPSS, Stata...
Date: Sun, 20 Jun 2010 21:11:14 -0400
From: muenc...@utk.edu
To: argch...@hotmail.com
-Original Message-
From: r-help-boun...@r-project.org
[mailto:r-help-boun...@r-project.org]
On Behalf Of Christos Argyropoulos
Hi,
mod.poly3$coef/sqrt(diag(mod.poly3$var))
will give you the Wald stat values, so
pnorm(abs(mod.poly3$coef/sqrt(diag(mod.poly3$var))),lower.tail=F)*2
will yield the corresponding p-vals
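The same computation can be checked against summary() on any fitted model; a sketch with a binomial glm (mtcars stands in for the original model object):

```r
# Wald z statistics and two-sided p-values computed by hand from the
# coefficients and their covariance matrix, checked against summary():
fit <- glm(am ~ wt, data = mtcars, family = binomial)

z <- coef(fit) / sqrt(diag(vcov(fit)))
p <- 2 * pnorm(abs(z), lower.tail = FALSE)

all.equal(p, summary(fit)$coefficients[, "Pr(>|z|)"])  # TRUE
```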
Christos Argyropoulos
   0%    25%    50%    75%   100%
54.00  67.50  77.50  90.25 112.00
Christos Argyropoulos
Date: Fri, 18 Jun 2010 21:02:41 -0700
From: jwiley.ps...@gmail.com
To: r-help@r-project.org
Subject: [R] quantile() depends on order of probs?
Hello All,
I am trying to figure out
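In current R versions, at least, quantile() computes each requested probability independently, so the order of probs only reorders the output; it does not change the values. A quick check:

```r
# The same quantiles requested in opposite orders give identical
# values, just in the order requested:
x <- c(54, 60, 67.5, 77.5, 90.25, 100, 112)

q_up   <- quantile(x, probs = c(0.25, 0.75))
q_down <- quantile(x, probs = c(0.75, 0.25))

identical(unname(q_up), unname(rev(q_down)))   # TRUE
```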
(look at the REML chapter in the manual).
Christos Argyropoulos
Date: Mon, 3 May 2010 23:18:28 +0200
From: duta...@gmail.com
To: r-help@r-project.org
Subject: [R] extended Kalman filter for survival data
Dear all,
I'm looking for an implementation of the generalized extended Kalman
bs() then
chooses df - degree - 1 knots at suitable quantiles of x (which will ignore missing
values) if the intercept argument is TRUE, and df - degree knots if intercept = FALSE.
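This knot-counting rule is easy to verify with the splines package (shipped with base R):

```r
library(splines)

# With df = 5, degree = 3 and no intercept (the default), bs() picks
# df - degree = 2 interior knots at quantiles of x:
b <- bs(1:100, df = 5)
length(attr(b, "knots"))        # 2

# With an intercept it picks df - degree - 1 = 1 knot:
b2 <- bs(1:100, df = 5, intercept = TRUE)
length(attr(b2, "knots"))       # 1
```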
Christos Argyropoulos
Can you give an example of what the python code is supposed to do?
Some of us are not familiar with python, and the R code is not particularly
informative. You seem to encode information on both the values and the names of
the elements of the vector d. If this is the case, why don't you create
variable
specified. It can also create TeX versions of these tables which
can be imported (e.g. through htlatex) to MSWord and OpenOffice.
Cheers,
Christos Argyropoulos
From: rui...@gmail.com
Date: Sat, 1 May 2010 01:04:19 +0800
To: r-help@r-project.org
Subject: [R] deriving mean
I presume you want to use such tables to summarize baseline information (a.k.a.
Table 1 in medical papers).
Try the Hmisc package ... it will do the tables and statistics for you and save
them as TeX (which you can import directly into your favorite Office-like
program after running htlatex).
...
Christos Argyropoulos
given previously ... but it also gives
you a 2D array of where the matches are.
These are stored in the res variable; use with care with very big datasets.
I wonder whether it is possible to reduce the memory footprint with bit level
operations ...
Christos Argyropoulos
ggplot2 should work (resize to get the plot to the dimensions you need for the
paper)
library(Hmisc)
library(pscl)
library(ggplot2)
## data
data(bioChemists, package = pscl)
fm_pois <- glm(art ~ ., data = bioChemists, family = poisson)
summary(fm_pois)
### pull out rate-ratios and 95% CI
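The bioChemists example requires the pscl package; the same rate-ratio extraction is shown below on the built-in warpbreaks data so it runs anywhere:

```r
# Rate ratios and Wald 95% CIs from a Poisson glm, obtained by
# exponentiating the coefficients and their normal-theory confidence
# limits (warpbreaks stands in for the bioChemists data):
fm <- glm(breaks ~ wool + tension, data = warpbreaks, family = poisson)

rr <- exp(cbind(RR = coef(fm), confint.default(fm)))
round(rr, 3)
```

confint.default() uses the Wald approximation; confint() would profile the likelihood instead, at some computational cost.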
So ...
are you trying to figure out whether your data has a substantial number of
outliers that call into question the adequacy of the normal distribution for
your data?
If this is the case, note that you cannot individually check the values (as you
are doing) without taking into account the
) vs. (m=3) models with ANOVA, as
one model is properly nested within the other.
Any other ideas?
Sincerely,
Christos Argyropoulos
University of Pittsburgh
Hi,
The package evd has most functions that one would need for analysis of Extreme
Values, so you should consider giving it a try.
By the way, your vector of numerical values is not valid; there are a couple of
values with repeated decimal-point separators.
Regards,
Christos Argyropoulos
,...) {
dum <- format(cor(x, y, use = "complete", method = "kendall"), dig = 2)
panel.text(30, 40, bquote(tau == .(dum)), font = 2)
}, pscales = 0, col = "gray"
)
Any suggestions?
Christos Argyropoulos
The Diagnostic page (which gives diagnostics about any page that may contain
malware) does not seem to be working at the moment, so the problems that it
found with CRAN are not available for review.
Does anyone else have the same problems?
Christos Argyropoulos
University of Pittsburgh
Date: Sat, 31 Jan 2009 09:25:26 -0600
From: iver...@biostat.wisc.edu
To: argch...@hotmail.com
CC: r-help@r-project.org
Subject: Re: [R] This site may harm your computer - Google warning about cran website
What OS/browser versions are you guys using?
Christos Argyropoulos wrote: This is extremely
Each of the two integrals (g1, g2) seems to be divergent (or at least is
considered to be so by R :) )
Try this:
z <- c(80, 20, 40, 30)
f1 <- function(x, y, z) { dgamma(cumsum(z)[-length(z)], shape = x, rate = y) }
g1 <- function(y, z) { integrate(function(x) { f1(x = x, y = y, z = z) }, 0.1,
0.5,
with
this (and possibly other) numerical integration methods.
Christos Argyropoulos
University of Pittsburgh Medical Center
my.t <- function(x, y, ...) { c(t.test(x, y, ...))[1:3] }
q2 <- mapply(my.t, as.data.frame(x), as.data.frame(y))
q2
Good luck!
Christos Argyropoulos
University of Pittsburgh Medical Center
Hello,
I was hoping that someone could answer a few questions for me (the background
is given below):
1) Can coxph accept an interaction between a covariate and a frailty term?
2) If so, is it possible to
a) test the model in which the covariate and the frailty appear as main terms
using
in Windows XP Pro 32-bit (running R v 2.7) and Ubuntu
Hardy (running the same version of R).
Thanks
Christos Argyropoulos
University of Pittsburgh Medical Center
42 matches