> -Original Message-
> From: [EMAIL PROTECTED]
> [mailto:[EMAIL PROTECTED] On Behalf Of Luke Tierney
> Sent: den 1 mars 2003 01:31
> To: Laurent Gautier
> Cc: [EMAIL PROTECTED]
> Subject: Re: [R] R (external ?) reference
>
>
> On Fri, 28 Feb 2003, Laurent Gautier wrote:
>
> > Dear List,
I'd like to construct a function is.empty(x) which would work on arbitrary
objects, and would tell me whether the object contains any data. I think
that I can write a function which will recurse down through all the slots
of x (and slots of slots) until it reaches objects of elementary type
(NU
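A recursive sketch along the lines the poster describes (hedged: `is.empty` is the poster's proposed name, and the base case here treats a zero-length or all-NA atomic object as empty; adjust that convention to taste):

```r
# Sketch of a recursive emptiness test for S4 objects.
# Assumption: an object of elementary type is "empty" if it has
# length 0 or is entirely NA.
is.empty <- function(x) {
  sn <- slotNames(class(x))
  if (length(sn) == 0)                       # elementary type: base case
    return(length(x) == 0 || all(is.na(x)))
  # otherwise recurse through every slot (and slots of slots)
  all(vapply(sn, function(s) is.empty(slot(x, s)), logical(1)))
}
```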
Hi
I think Andy Liaw's suggestion can be modified to handle my
interpretation of Patrik's question; try:
R> data
[1] 10 10 11 10 12 11 10 12 11 11 10 11
R> table(c(data,NA),c(NA,data)) -> x.tab
R> c(diag(x.tab), x.tab[upper.tri(x.tab)] + x.tab[lower.tri(x.tab)])
10 11 12         
 1  1  0  5  2  2
On Fri, 28 Feb 2003, Daniel A. Powers wrote:
> To R-list --
>
> Does anyone know of an R function that will create split-episode data from
> single spell event/duration data according to user-defined time intervals?
>
> Example: original data
> t d x
>
> 6 0 x1
> 5 1 x2
On Fri, 28 Feb 2003, Thomas Lumley wrote:
> On Fri, 28 Feb 2003, Jim Rogers wrote:
>
> > Hello,
> >
> > Could someone please tell me what I am thinking about incorrectly:
> >
> > f <- function(y) {
> > g <- function(x) x + y
> > g
> > }
> >
> > In the following, I get what I expect based on m
You can use R objects, such as the return from gam, and the
predict.gam function, from C. See the R extensions manual.
Reid Huntsinger
-Original Message-
From: RenE J.V. Bertin [mailto:[EMAIL PROTECTED]
Sent: Thursday, February 27, 2003 3:42 PM
To: Wiener, Matthew
Cc: [EMAIL PROTECTED]
S
Hi, everybody,
Does anybody know how to get the first derivative from the natural cubic
spline matrix?
Thanks !!!
Jassy
~
Nuoo-Ting (Jassy) Molitor, MS, Junior Statistician
Division of Biostatistics
Department of Preventive Medicine
University
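One way to get the derivative directly, rather than from the spline matrix, is base R's `splinefun` (a sketch; `method = "natural"` fits a natural cubic spline, and in recent versions of R the returned function accepts a `deriv` argument):

```r
# Natural cubic spline through sample points, then its first derivative
x <- seq(0, 1, by = 0.1)
y <- sin(2 * pi * x)

f <- splinefun(x, y, method = "natural")
f(0.25, deriv = 1)   # first derivative of the fitted spline at x = 0.25
```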
You need to use llines instead of lines (lines doesn't produce any output with
grid graphics).
Quoting "Bliese, Paul D MAJ WRAIR-Wash DC" <[EMAIL PROTECTED]>:
> Platform: WIN2000
> Version of R: 1.6.2
>
> I'm interested in plotting fitted values in a trellis xyplot. I believe
> the
> follow
Thanks for the help. I had in fact used gcc 3.2.1 to
build. Rebuilding using the SunPro compiler seems to
have fixed the problem.
B Jones
--- [EMAIL PROTECTED] wrote:
> This is a known bug in gcc 3.2.1 and 3.2.2: are you
> using either of those?
> It is described in the R-admin manual: please
Paul,
I don't think you can use lines with trellis graphics. Check out ?panel.lmline
(and other associated panel. functions).
HTH
steve
"Bliese, Paul D MAJ WRAIR-Wash DC" wrote:
> Platform: WIN2000
> Version of R: 1.6.2
>
> I'm interested in plotting fitted values in a trellis xyplot. I beli
To R-list --
Does anyone know of an R function that will create split-episode data from
single spell event/duration data according to user-defined time intervals?
Example: original data
t d x
6 0 x1
5 1 x2
Split using intervals [0,3) [3,infty) (or cutpoint at 3)
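One option is `survSplit` in the `survival` package (a sketch; the formula interface shown here postdates this thread, and the column names `tstart` and `interval` are choices made for the example):

```r
library(survival)

# Original single-spell data: duration t, event indicator d, covariate x
dat <- data.frame(t = c(6, 5), d = c(0, 1), x = c("x1", "x2"))

# Split each spell at the cutpoint t = 3 into episodes [0,3) and [3,Inf)
episodes <- survSplit(Surv(t, d) ~ ., data = dat, cut = 3,
                      start = "tstart", episode = "interval")
episodes
```

Each subject with a duration past the cutpoint gets one row per episode, with the event indicator set to 0 on all but the final episode.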
Platform: WIN2000
Version of R: 1.6.2
I'm interested in plotting fitted values in a trellis xyplot. I believe the
following should work; however, I only get the points (not the fitted
lines).
library(lattice)
trellis.device(bg="white")
xyplot(MULTDV~TIME|SUBNUM,data=TEMP,
panel=function(x,y
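A panel function that draws both the points and a per-panel fitted line might look like this (a sketch with simulated data standing in for `TEMP`, which is not shown in the original; `panel.lmline` adds a least-squares line within each panel):

```r
library(lattice)

# Simulated stand-in for the poster's TEMP data frame
set.seed(1)
TEMP <- data.frame(SUBNUM = factor(rep(1:4, each = 10)),
                   TIME   = rep(1:10, 4))
TEMP$MULTDV <- 0.5 * TEMP$TIME + rnorm(40)

p <- xyplot(MULTDV ~ TIME | SUBNUM, data = TEMP,
            panel = function(x, y) {
              panel.xyplot(x, y)   # the raw points
              panel.lmline(x, y)   # least-squares fit within each panel
            })
p   # printing the trellis object draws the plot
```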
Did you build R with profiling compiler/linker options (not R-level
profiling, which must be disabled), as mentioned in the R-admin manual?
I've certainly profiled dyn.load-ed code on Solaris in R, but not
recently.
On Fri, 28 Feb 2003 [EMAIL PROTECTED] wrote:
> I have inherited a legacy S-plus sys
Hello,
I hope this is an acceptable spam. I have just set up a Yahoo group
called ecomon to discuss ecological monitoring. Currently the people
who have shown an interest in the group are people interested in
ecological monitoring associated with ecological impact assessment,
but there is no re
I have inherited a legacy S-plus system with about 10,000 lines of S and
10,000 lines of Fortran. It's now running under R. However, I would like to
profile the Fortran code with gprof or prof for performance tuning. I've
successfully linked the .so file into a simple C driver program and profil
Put your own solution together with arrows(...,angle=90), or use plotCI
within gregmisc, or get plotCI from my website
(http://www.zoo.ufl.edu/bolker/R/windows -- you want the bbmisc bundle)
Ben
On Fri, 28 Feb 2003, Katrina Grech wrote:
>
> How do I add error bars to an interaction plot of
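The `arrows(..., angle = 90)` approach can be spelled out like this (hedged: the means and standard errors below are made up for illustration):

```r
# Error bars via arrows(): angle = 90 turns the arrowheads into flat caps
means <- c(2.1, 3.4, 2.8)
se    <- c(0.2, 0.3, 0.25)
x     <- seq_along(means)

plot(x, means, ylim = range(means - se, means + se), pch = 16,
     xlab = "group", ylab = "mean")
arrows(x, means - se, x, means + se,
       angle = 90, code = 3, length = 0.05)  # code = 3: caps at both ends
```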
How do I add error bars to an interaction plot of means?
Thanks
__
[EMAIL PROTECTED] mailing list
http://www.stat.math.ethz.ch/mailman/listinfo/r-help
"Pedro J. Aphalo" <[EMAIL PROTECTED]> writes:
> Douglas Bates wrote:
> >
> > [EMAIL PROTECTED] (Bjørn-Helge Mevik) writes:
> >
> > > Mona Riihimaki <[EMAIL PROTECTED]> writes:
> > >
> > > > I've done lme-analysis with R; [...] I'd need also the mean squares.
> > >
> > > AFAIK, lme doesn't calcul
This is a known bug in gcc 3.2.1 and 3.2.2: are you using either of those?
It is described in the R-admin manual: please consult it for details.
The Sun Forte compilers work, and the actual 3.2 does work for me.
On Fri, 28 Feb 2003, a a wrote:
> I recently built R 1.6.2 on solaris 2.8 with gcc 3
Hey, R-listers,
I am going to approximate an arbitrary 1-D data density by a mixture of
Gaussians. The problem is that, given a set of data generated from an
unknown density function, I want to use a Gaussian mixture density model
to approximate it.
Now, how to determine the number of components,
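One common answer is to let BIC choose the number of components, e.g. with the contributed `mclust` package (a sketch; `Mclust` fits models over a range of component counts and keeps the best BIC):

```r
library(mclust)

set.seed(1)
x <- c(rnorm(150, mean = 0), rnorm(150, mean = 4))  # two-component truth

fit <- Mclust(x)       # fits a range of component counts, selects via BIC
fit$G                  # chosen number of components
fit$parameters$mean    # estimated component means
```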
I recently built R 1.6.2 on solaris 2.8 with gcc 3.2.
Things seem to run OK, but using graphics causes R to
core dump. (For instance, by using the plot() or
hist() functions.) Sometimes I can see the graphics
drawn before it actually core dumps. The core file
shows a crash in Rf_gpptr.
I'm q
Dear Martin,
Thanks for explaining this.
One thing that might be considered IMHO could be to replace the named column
heads (or both column and row head if so desired) with a number
corresponding to the position of the term in the printed table.
1 2 3 4
pH * 1
I(pH^2) * B 1
On Fri, 28 Feb 2003, Jim Rogers wrote:
> Hello,
>
> Could someone please tell me what I am thinking about incorrectly:
>
> f <- function(y) {
> g <- function(x) x + y
> g
> }
>
> In the following, I get what I expect based on my understanding of
> lexical scoping:
>
> (f(1))(3) # 4
> (f(2))(3)
Hi,
it is sort of a bug (and sort of not a bug). You are getting bitten
by lazy evaluation. The value of y is not getting evaluated until
the second function is created and returned.
f <- function(y) {
y
g <- function(x) x + y
g
}
will give the behavior you want and I think there is
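Equivalently, `force()` makes the intent explicit (a minimal sketch showing the promise being evaluated before the enclosing call can change it):

```r
f <- function(y) {
  force(y)              # evaluate the promise for y right now
  function(x) x + y
}

fs <- lapply(c(1, 2), f)
fs[[1]](3)  # 4
fs[[2]](3)  # 5
```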
Rpart does not fit a linear model at any node. Please read up on
tree-based models.
On Fri, 28 Feb 2003 [EMAIL PROTECTED] wrote:
> To whom it may concern,
> I am using the rpart() function to perform a recursive tree analysis on a
> set
> of data that includes both numerical and categorical att
rpart (and tree) do not fit a linear model at each node, merely a
threshold split. See Alexander, W. P., and Grimshaw, S. D. (1996),
"Treed Regression," Journal of Computational and Graphical Statistics,
5, 156-175, for an alternative that may meet your needs.
> I am using the rpart() function t
To whom it may concern,
I am using the rpart() function to perform a recursive tree analysis on a
set
of data that includes both numerical and categorical attributes.
At the end of the analysis, I would like to see the linear model at each
node in the tree. In particular, I would like to see each
Hello,
Could someone please tell me what I am thinking about incorrectly:
f <- function(y) {
g <- function(x) x + y
g
}
In the following, I get what I expect based on my understanding of
lexical scoping:
(f(1))(3) # 4
(f(2))(3) # 5
But now,
fs <- lapply(c(1, 2), f)
fs[[1]](3) # 5 (Why
The term "anova" has evolved to include roughly any table with something
like a partition of sums of squares even in non-normal situations, e.g.,
when using "glm" for logistic regression, where the "deviance" =
(-2)*log(likelihood) is partitioned.
Hope this helps.
Spencer Graves
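That deviance partition can be seen directly with `anova.glm` (a sketch on simulated logistic-regression data; `test = "Chisq"` adds the likelihood-ratio tests):

```r
set.seed(42)
d <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
d$y <- rbinom(100, 1, plogis(d$x1))

fit <- glm(y ~ x1 + x2, family = binomial, data = d)
anova(fit, test = "Chisq")  # sequential partition of the deviance
```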
Pedro J. Aphalo
Douglas Bates wrote:
>
> [EMAIL PROTECTED] (Bjørn-Helge Mevik) writes:
>
> > Mona Riihimaki <[EMAIL PROTECTED]> writes:
> >
> > > I've done lme-analysis with R; [...] I'd need also the mean squares.
> >
> > AFAIK, lme doesn't calculate sum of squares (or mean squares). It
> > maximises the lik
Hi All,
I would like to ask you a couple of questions on chisq.test.
First, I have 40 flies, 14 males and 26 females and I want to test for an a
priori hypothesis that the sex ratio is 1:1
sex<-c(14,26)
pr<-c(1,1)/2
chisq.test(sex, p=pr, correct=TRUE)
Chi-squared test for given probabilities
dat
> "GS" == Gavin Simpson <[EMAIL PROTECTED]>
> on Fri, 28 Feb 2003 13:42:21 - writes:
GS> format.pval() as in:
GS> format.pval(1-pf(((RSSred-RSSful)/2)/(RSSful/(34-3)),2,34-3),digits = 5)
GS> see ?format for more information
and if you want to make sure not to suffer f
> "GS" == Gavin Simpson <[EMAIL PROTECTED]>
> on Fri, 28 Feb 2003 13:07:55 - writes:
GS> Hi,
GS> I've had a look the bug list and searched though the R documentation, email
GS> lists etc. but didn't see anything on this:
GS> when I do:
GS> summary(species.glm1
On Fri, 28 Feb 2003, Laurent Gautier wrote:
> Dear List,
>
> I found a documentation on the web that mentions things like 'R references'
> (http://www.stat.uiowa.edu/~luke/R/simpleref.html).
>
> However, I could not find the R_MakeReference and friends in R...
> Does anyone knows more about that
If I understand you correctly, here's a quick and dirty way:
## Simulate some data:
x1 <- sample(3, 20, replace=TRUE)
x2 <- sample(3, 20, replace=TRUE)
x.tab <- table(x1, x2)
x.count <- c(diag(x.tab), x.tab[upper.tri(x.tab)] + x.tab[lower.tri(x.tab)])
The first 3 elements of x.count will be (1,1
ndata <- na.omit(led1t7sts) and work with ndata.
Why is that difficult?
On Fri, 28 Feb 2003, CG Pettersson wrote:
> In 27/2, I got the following answer from Prof. Ripley: (The question is at the
> bottom)
>
> >This ia already fixed in R-devel. The answer is the same: don't use
> >na.omit impli
format.pval() as in:
format.pval(1-pf(((RSSred-RSSful)/2)/(RSSful/(34-3)),2,34-3),digits = 5)
see ?format for more information
Gav
%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%~%
Dr. Gavin Simpson [T] +44 (0)20 7679 5522
ENSIS Research Fellow
How can I show a number not rounded to zero, but in a format such as x.xe-xx?
e.g.
> signif(1-pf(((RSSred-RSSful)/2)/(RSSful/(34-3)),2,34-3),digits = 5)
shows
[1] 0
but I need something like
[1] 2.2e-16
thanks
christoph
--
Christoph Lehmann                 Phone: ++41 31 930 93 83
Depar
[EMAIL PROTECTED] (Bjørn-Helge Mevik) writes:
> Mona Riihimaki <[EMAIL PROTECTED]> writes:
>
> > I've done lme-analysis with R; [...] I'd need also the mean squares.
>
> AFAIK, lme doesn't calculate sum of squares (or mean squares). It
> maximises the likelihood (or restricted likelihood) and u
Hi,
I've had a look at the bug list and searched through the R documentation,
email lists, etc. but didn't see anything on this:
when I do:
summary(species.glm1, correlation = TRUE)
I get a correlation matrix like this:
Correlation of Coefficients:
( p I(H C
pH * 1
I(pH^2) * B 1
> "Mark" == Mark Marques <[EMAIL PROTECTED]>
> on Fri, 28 Feb 2003 09:51:02 + writes:
Mark> I have "small" problem ...
Mark> with the cluster library each time I try to use
Mark> the "agnes","pam","fanny" functions with more than 2 elements
Mark> I get the foll
Hello,
I wonder if someone could send me suggestions on how to solve the following
problem: I have a vector of arbitrary size, e.g.
data <- c(10,10,11,10,12,11,10,12,11,11,10,11), and use the table function,
which gives the following result:
10 11 12 
 5  5  2 
that's fine, but what I would
On 27/2, I got the following answer from Prof. Ripley (the question is at the
bottom):
>This is already fixed in R-devel. The answer is the same: don't use
>na.omit implicitly: use it explicitly.
I feel rather stupid at the moment, as I don't understand an answer that looks
very simple.
What'
R has amazing capabilities, but percentage tables are a weak spot
IMHO. There's prop.table, but that's rather unwieldy, especially for
multiway tables. CrossTable by Marc Schwartz in the gregmisc library
makes percentage tables a breeze but is limited to two-way tables. So
I decided to try my own h
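For reference, the `prop.table` route looks like this (a sketch on a built-in dataset; the `margin` argument picks row, column, or overall percentages):

```r
tab <- with(mtcars, table(cyl, gear))

round(100 * prop.table(tab, margin = 1), 1)  # row percentages
round(100 * prop.table(tab), 1)              # cell percentages of the total
```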
Dear all,
my function fn() (see code below) just takes a glm object and updates it by
including a function of a specified variable in the data frame.
x<-1:50
y<-rnorm(50)
d<-data.frame(yy=y,xx=x);rm(x,y)
o<-glm(yy~xx,data=d)
> fn(obj=o,x=xx)
Call: glm(formula = yy ~ xx + x1, data = obj$data)
...
I have a "small" problem ...
with the cluster library each time I try to use
the "agnes","pam","fanny" functions with more than 2 elements
I get the following error:
>Error in vector("double", length) : negative length vectors are not allowed
>In addition: Warning message:
>NAs in
Do read the help page. It says:
`fnscale' An overall scaling to be applied to the value of `fn'
and `gr' during optimization. If negative, turns the problem
into a maximization problem. Optimization is performed on
`fn(par)/fnscale'.
`parscale' A vector of
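So turning `optim` into a maximizer is just `control = list(fnscale = -1)` (a minimal sketch on a concave quadratic):

```r
# Maximize f(x) = -(x - 2)^2 + 5; fnscale = -1 flips optim to maximization
f <- function(x) -(x - 2)^2 + 5

res <- optim(par = 0, fn = f, method = "BFGS",
             control = list(fnscale = -1))
res$par    # close to 2
res$value  # close to 5
```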
> "Nobumoto" == Nobumoto Tomioka <[EMAIL PROTECTED]>
> on Thu, 27 Feb 2003 15:00:11 -0800 writes:
Nobumoto> Dear Stuff,
[you probably meant "staff" but that's not much better:
There's no paid staff for the R project.
It's all volunteering ! ]
Nobumoto> I am trying to enjoy "
Mona Riihimaki <[EMAIL PROTECTED]> writes:
> I've done lme-analysis with R; [...] I'd need also the mean squares.
AFAIK, lme doesn't calculate sum of squares (or mean squares). It
maximises the likelihood (or restricted likelihood) and uses tests
based on likelihood ratios.
--
Bjørn-Helge Mevi
Dear all,
I have a function MYFUN which depends on 3 positive parameters TETA[1],
TETA[2], and TETA[3]; x belongs to [0,1].
I integrate the function over [0,0.1], [0.1,0.2] and
[0.2,0.3] and want to choose the three parameters so that
these three integrals are as close to, resp., 2300, 4600 and 58
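One way to set this up is a least-squares objective over the three integrals, minimized with `optim` (a sketch: `myfun` below is a hypothetical stand-in since MYFUN is not shown, and because the third target is truncated in the original, all three targets here are illustrative placeholders):

```r
# Hypothetical integrand standing in for the poster's MYFUN
myfun <- function(x, teta) teta[1] * x^teta[2] + teta[3]

targets <- c(2300, 4600, 5000)  # illustrative only; third value is a placeholder
breaks  <- seq(0, 0.3, by = 0.1)

# Sum of squared gaps between the three integrals and their targets;
# optimizing on the log scale keeps all three parameters positive
obj <- function(log_teta) {
  teta <- exp(log_teta)
  ints <- vapply(1:3, function(i)
    integrate(myfun, breaks[i], breaks[i + 1], teta = teta)$value,
    numeric(1))
  sum((ints - targets)^2)
}

res <- optim(log(c(1, 1, 1)), obj)
exp(res$par)   # fitted positive parameters
```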
When running my WinXP Pro tests I used pixmap 0.3-1:
pixmap Bitmap Images (``Pixel Maps'')
Description:
Package: pixmap
Version: 0.3-1
Title: Bitmap Images (``Pixel Maps'')
Depends: R (>= 1.6.0)
Author: Friedrich Leisch and Roger Bivand
Maintainer: Friedrich Leisch <[EMAIL PROTECTED]>