Re: [Rd] Suggestion: help()
On Wed, 8 Jun 2005, Duncan Murdoch wrote:

> Torsten Hothorn wrote:
> > On Tue, 7 Jun 2005, Duncan Murdoch wrote:
> >
> > [...]
> >
> >> My proposal (modified following the suggestions I've heard so far) is as
> >> follows:
> >>
> >>  - to check that a couple of help topic aliases exist (.package and )
> >>  - to recommend that .package contain general information about
> >>    the package, and that be an alias for it, if it isn't used for
> >>    some other purpose.
> >>  - to write promptPackage() to help create an initial version of
> >>    .package.Rd. It can get some information from the DESCRIPTION
> >>    file; perhaps it could go looking for a vignette, or the INDEX, or
> >>  - to modify the other help system tools to make use of this (e.g. the
> >>    package: heading on a page would become a link to the .package
> >>    alias, etc.)
> >
> > as a package author who already provides help pages for general package
> > descriptions (`?multcomp' and `?coin' work and, if I remember correctly,
> > Martin suggested to include the advertisement this way) I must
> > admit that I never say `?foo' when I'm interested in a global overview
> > of a new package `foo'.
>
> I do occasionally, but usually it's a waste of time. This proposal is
> intended to address that.
>
> > Instead, `library(help = foo)' gives what I want to see, namely the title
> > and description of a package and all documented topics. One may argue that
> > asking `library' for help is not the most obvious thing to do. But people
> > able to recall that fitting an ANOVA model requires `aov' and comparing
> > two models needs `anova' should be able to have `library' in mind for
> > general package information.
>
> As I pointed out, this is okay for people who know R already, but not so
> good for beginners. The answer to the question "how do I get help on
> foo?" is too complex.
>
> > So, for me, having infrastructure for _automatically_ generated overviews
> > is very nice, but _forcing_ package authors to provide additional
> > meta-information would be less welcome.
>
> What do you think of Henrik's suggestion to generate a help topic giving
> information equivalent to library(help=)? I think this would
> happen at install time (not build time as he said; no need to put this
> in the source tarballs). If the .package alias wasn't defined, the
> installer would automatically create one.
>
> If we had this in place, I'd strengthen the advice in R-Exts not to
> bother with a manually created INDEX file: that information should go
> into a manually created .package topic instead.

yes, this sounds reasonable - as long as ?.package is a "link" to
library(help = ) (or help(package = ), as I learned only recently) when no
.package.Rd file exists in , this would be fine, of course.

Best,

Torsten

> Duncan Murdoch

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
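The alternatives discussed in this thread can be tried directly at the prompt; a minimal sketch, using `stats' as a stand-in for a package name (promptPackage() is the helper proposed above):

```
## package overview: title, description and index of documented topics
library(help = stats)

## the equivalent form mentioned at the end of the thread
help(package = "stats")

## the proposed helper: create an Rd skeleton for a package overview topic
## from the DESCRIPTION file (would write an .Rd file in the working dir)
## promptPackage("stats")
```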
Re: [Rd] Suggestion: help()
On Tue, 7 Jun 2005, Duncan Murdoch wrote:

[...]

> My proposal (modified following the suggestions I've heard so far) is as
> follows:
>
>  - to check that a couple of help topic aliases exist (.package and )
>  - to recommend that .package contain general information about
>    the package, and that be an alias for it, if it isn't used for
>    some other purpose.
>  - to write promptPackage() to help create an initial version of
>    .package.Rd. It can get some information from the DESCRIPTION
>    file; perhaps it could go looking for a vignette, or the INDEX, or
>  - to modify the other help system tools to make use of this (e.g. the
>    package: heading on a page would become a link to the .package
>    alias, etc.)

as a package author who already provides help pages for general package
descriptions (`?multcomp' and `?coin' work and, if I remember correctly,
Martin suggested to include the advertisement this way) I must admit that
I never say `?foo' when I'm interested in a global overview of a new
package `foo'.

Instead, `library(help = foo)' gives what I want to see, namely the title
and description of a package and all documented topics. One may argue that
asking `library' for help is not the most obvious thing to do. But people
able to recall that fitting an ANOVA model requires `aov' and comparing
two models needs `anova' should be able to have `library' in mind for
general package information.

So, for me, having infrastructure for _automatically_ generated overviews
is very nice, but _forcing_ package authors to provide additional
meta-information would be less welcome.

Best,

Torsten

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] Re: simtest with lm object: depends on order in formula
On Fri, 1 Apr 2005, Christoph Buser wrote:

> Hi
>
> I used the simtest function from the package multcomp. When
> using simtest with an lm object, it seems to depend on the order
> of the variables in the formula. See the code for an example:
>
> library(multcomp)
> set.seed(1)
> # response
> y <- rnorm(21)
> # one factor
> f1 <- factor(c(rep(c("A", "B", "C"), 7)))
> # and one continuous covariable
> x <- rnorm(21)
> testdata <- cbind(as.data.frame(y), f1, x)
>
> # the same model, just change the order in the formula
> reg1 <- lm(y ~ x + f1, data=testdata)
> reg2 <- lm(y ~ f1 + x, data=testdata)
>
> summary(simtest(reg1))
>
> Coefficients:
>             Estimate t value Std.Err. p raw p Bonf p adj
> (Intercept)    0.427  -1.670    0.347 0.113  0.453 0.307
> x              0.295  -0.599    0.256 0.557  1.000 0.848
> f1B            0.204  -0.406    0.503 0.690  1.000 0.866
> f1C            0.089  -0.255    0.493 0.802  1.000 0.866
>
> summary(simtest(reg2))
>
> Coefficients:
>             Estimate t value Std.Err. p raw p Bonf p adj
> (Intercept)    0.427  -1.670    0.347 0.113  0.453 0.307
> f1B            0.295  -0.599    0.503 0.557  1.000 0.848
> f1C            0.204  -0.406    0.493 0.690  1.000 0.866
> x              0.089  -0.255    0.256 0.802  1.000 0.866
>
> The table stays fixed, but the names of the variables are
> permuted; maybe a bug in extracting the formula from the lm
> object.
> You can avoid the problem by using a formula instead of an lm
> object, but the help page mentions that simtest works with an
> lm object, too.

you are right, thanks for the hint!

Torsten

> Regards,
>
> Christoph Buser
>
> --
> Christoph Buser <[EMAIL PROTECTED]>
> Seminar fuer Statistik, LEO C11
> ETH (Federal Inst. Technology)  8092 Zurich  SWITZERLAND
> phone: x-41-1-632-5414  fax: 632-1228
> http://stat.ethz.ch/~buser/
> --

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Use of htest class for different tests
On Sun, 13 Mar 2005, Gorjanc Gregor wrote:

> Hello!
>
> First of all I must apologize if this has been raised previously, but
> the search provided by Robert King at the University of Newcastle seems
> to be down these days. Additionally, let me know if such a question
> should be sent to R-help.
>
> I made a contribution to function hwe.hardy in package 'gap' during the
> weekend. That function performs a Hardy-Weinberg equilibrium test using
> MCMC. The return value of the function does not have the classical
> components of the htest class, so I was of course not successful in
> using it. However, I managed to copy and modify some parts of
> print.htest to accomplish the same task.
>
> Now my question is what to do in such cases? Just copy parts of
> print.htest and modify them for each test, or something else? Are such
> cases rare? If yes, then the mentioned approach is probably the easiest.

you can use print.htest directly for the components which _are_ elements
of objects of class `htest' and provide your own print method for all
others. If your class `foo' (essentially) extends `htest', a simple
version of `print.foo' could be

print.foo <- function(x, ...) {
    # generate an object of class `htest'
    y <- x
    class(y) <- "htest"
    # maybe modify some things like y$method ...
    # print y using `print.htest' without copying code
    print(y)
    # and now print additional information
    cat(x$whatsoever)
}

Torsten

> --
> Lep pozdrav / With regards,
> Gregor GORJANC
>
> University of Ljubljana
> Biotechnical Faculty      URI: http://www.bfro.uni-lj.si/MR/ggorjan
> Zootechnical Department   email: gregor.gorjanc bfro.uni-lj.si
> Groblje 3                 tel: +386 (0)1 72 17 861
> SI-1230 Domzale           fax: +386 (0)1 72 17 888
> Slovenia

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] promptMethods(foo) & \alias{foo}
Dear all,

`promptMethods(foo)' currently does not produce an Rd skeleton including
an `\alias{foo}' entry, which is required (at least R CMD check keeps
crying until the alias is added). Maybe one could simply add this line.

Thanks,

Torsten

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
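For illustration, the header of such a skeleton for a hypothetical generic `foo' could look like this, with the plain `\alias{foo}' line (the one R CMD check asks for) added next to the `-methods' alias:

```
\name{foo-methods}
\docType{methods}
\alias{foo}
\alias{foo-methods}
\title{Methods for Function `foo'}
```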
[Rd] R News: Call for Papers
Dear useRs and developeRs,

the next issue of `R News' is scheduled for the beginning of May, and we
are now accepting submissions for this first issue in 2005. For more
information see

  http://cran.r-project.org/doc/Rnews/

If you are the author of a package on CRAN and would like to promote it a
little bit, or if you simply have an interesting application using R, we
hope you can find some time to write a short article on it. We suggest
approximately 3 pages or less. The idea of the newsletter is that it be
interesting to R users without being too technical. For example, an
article describing a package could begin by briefly outlining the
statistical background and go on to demonstrate the usage on a typical
data set. Of course, graphics are more than welcome!

Bill Venables <[EMAIL PROTECTED]> is also encouraging submissions to the
more specialist Programmer's Niche column. In this case the technical
level could be a little higher, of course, but not necessarily:
ingeniousness is the key. The R Help Desk column is intended to present
answers to frequently asked questions as well as tricks that are useful
to the majority of useRs.

Please send submissions to Uwe Ligges <[EMAIL PROTECTED]>. The deadline
for submissions is April 10th, 2005.

Keep the contributions rolling in!

The Editorial Board,
Doug Bates, Paul Murrell and Torsten Hothorn

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
[Rd] typo in ?NotYetImplemented
The `examples' section says

  plot.mlm # to see how the "NotYetImplemented"
  # reference is made automagically
                            ^

Best,

Torsten

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] S3/S4 classes performance comparison
On Fri, 14 Jan 2005, Eric Lecoutre wrote:

> Hi R-devel,
>
> If you did read my survey on Rhelp about reporting, you may have seen that
> I am implementing a way to handle outputs for R (main target output
> destinations: xHTML and TeX).
> In fact, I do have something that works for basic objects, entirely done
> with S4 classes, with the results visible at:
> http://www.stat.ucl.ac.be/ROMA/sample.htm
> http://www.stat.ucl.ac.be/ROMA/sample.pdf
>
> To achieve this goal, I use intermediary objects that represent
> the structure of the output. Thus I defined classes for Vector, Tables,
> Rows, Cells, Sections, and so on. Most of those structures are recursive.
> Then, at a first attempt, a matrix would be represented as a Table
> containing Rows containing Cells containing Vectors, which finally is
> easy to export and which makes customisation easy (if you need to insert
> a footnote within a cell, for example).
> I know that this intermediary layout would be far easier to handle at
> C level, but I don't have any C skill for that...
>
> One of my problems is that this consumes a lot of memory/computation time.
> Too much, indeed...
> 20 sec. to export data(iris) on my PIV 3.2 Ghz 1Go RAM, which is not
> acceptable.
>
> I was intending to start properly, from scratch, with new code. I
> did write everything using S4 classes.
> Doing a simple test reveals crucial efficiency differences between S3 and
> S4 classes.
>
> Here is the test:
>
> ---
>
> ### S3 CLASSES
>
> S3content <- function(obj=NULL, add1=NULL, add2=NULL, type="", ...) {
>     out <- list(content=obj, add1=add1, add2=add2, type=type)
>     class(out) <- "S3Content"
>     return(out)
> }
>
> S3vector <- function(vec, ...) {
>     out <- S3content(obj=vec, type="Vector", ...)
>     class(out) <- "S3Vector"
>     return(out)
> }
>
> ### S4 classes
>
> setClass("S4content",
>          representation(content="ANY", add1="ANY", add2="ANY",
>                         type="character"))
>
> S4content <- function(obj=NULL, add1=NULL, add2=NULL, type="", ...) {
>     new("S4content", content=obj, add1=add1, add2=add2, type=type)
> }
>
> S4vector <- function(vec, ...) {
>     new("S4content", type="vector", content=vec, ...)
> }
>
> ### Now the test
>
> test <- rnorm(1)
>
> gc()
>          used (Mb) gc trigger (Mb)
> Ncells 169135  4.6     531268 14.2
> Vcells  75260  0.6     786432  6.0
>
> (system.time(lapply(test, S3vector)))
> [1] 0.17 0.00 0.19   NA   NA
>
> gc()
>          used (Mb) gc trigger (Mb)
> Ncells 169136  4.6     531268 14.2
> Vcells  75266  0.6     786432  6.0
>
> (system.time(lapply(test, S4vector)))
> [1] 15.08  0.00 15.13    NA    NA
>
> ---
>
> There is here a factor higher than 80!
>
> Is there something trivial I did overlook?
> Is this 80 factor normal?

my experience was that calling the constructor _with_ data is slow, so
the following performs a little bit better

R> S3content <- function(obj=NULL, add1=NULL, add2=NULL, type="", ...) {
+     out <- list(content=obj, add1=add1, add2=add2, type=type)
+     class(out) <- "S3Content"
+     return(out)
+ }
R>
R> S3vector <- function(vec, ...) {
+     out <- S3content(obj=vec, type="Vector", ...)
+     class(out) <- "S3Vector"
+     return(out)
+ }
R>
R> ### S4 classes
R>
R> setClass("S4content",
+           representation(content="ANY", add1="ANY", add2="ANY",
+                          type="character"))
[1] "S4content"
R>
R> S4vector <- function(vec, ...) {
+     RET <- new("S4content")
+     [EMAIL PROTECTED] <- "vector"
+     [EMAIL PROTECTED] <- vec
+     RET
+ }
R>
R> test <- rnorm(1)
R> gc()
         used (Mb) gc trigger (Mb)
Ncells 156181  4.2         35  9.4
Vcells  67973  0.6     786432  6.0
R> system.time(lapply(test, S3vector))
[1] 0.23 0.00 0.23 0.00 0.00
R> gc()
         used (Mb) gc trigger (Mb)
Ncells 156314  4.2         35  9.4
Vcells  68005  0.6     786432  6.0
R> system.time(lapply(test, S4vector))
[1] 6.04 0.00 6.04 0.00 0.00
R>

Torsten

> Is it still recommended (recommendable...) to use S4 classes,
> considering that?
>
> Eric
>
> Eric Lecoutre
> UCL / Institut de Statistique
> Voie du Roman Pays, 20
> 1348 Louvain-la-Neuve
> Belgium
>
> tel: (+32)(0)10473050
> [EMAIL PROTECTED]
> http://www.stat.ucl.ac.be/ISpersonnel/lecoutre
>
> If the statistics are boring, then you've got the wrong numbers. -Edward
> Tufte

__
R-devel@stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
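The workaround in the reply (call new() without data, then fill the slots) can be timed in a self-contained sketch; class and function names here are hypothetical, and absolute timings will depend on the R version and machine:

```
## S3: a plain list with a class attribute
S3make <- function(x)
    structure(list(content = x, type = "vector"), class = "S3content")

## S4: empty new() call first, then slot assignment, as in the reply above
setClass("S4content", representation(content = "ANY", type = "character"))
S4make <- function(x) {
    ret <- new("S4content")
    ret@type <- "vector"
    ret@content <- x
    ret
}

x <- rnorm(10)
system.time(for (i in 1:10000) S3make(x))
system.time(for (i in 1:10000) S4make(x))
```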
[Rd] fitting problems in coxph.fit
Dear Thomas & Dear List,

the fitting function `coxph.fit' called by `coxph' may fail to estimate
the regression coefficients when some values of the design matrix are
very large. For example

library(survival)
### load example data
load(url("http://www.imbe.med.uni-erlangen.de/~hothorn/coxph_fit.Rda"))
method <- "efron"
### copied from `coxph.fit'
coxfit <- .C("coxfit2",
             iter = as.integer(maxiter),
             as.integer(n),
             as.integer(nvar),
             stime,
             sstat,
             x = x[sorted, ],
             as.double(offset[sorted] - mean(offset)),
             as.double(weights),
             newstrat,
             means = double(nvar),
             coef = as.double(init),
             u = double(nvar),
             imat = double(nvar * nvar),
             loglik = double(2),
             flag = integer(1),
             double(2 * n + 2 * nvar * nvar + 3 * nvar),
             as.double(control$eps),
             as.double(control$toler.chol),
             sctest = as.double(method == "efron"),
             PACKAGE = "survival")

produces

R> coxfit$coef
 [1] NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN NaN

because (?)

R> summary(x[,1])
     Min.   1st Qu.    Median      Mean   3rd Qu.      Max.
2.378e-01 8.758e+00 5.872e+01 1.640e+04 2.732e+02 4.000e+06

On the other hand

x[,1] <- x[,1]/max(x[,1])
coxfit <- .C("coxfit2",
             iter = as.integer(maxiter),
             as.integer(n),
             as.integer(nvar),
             stime,
             sstat,
             x = x[sorted, ],
             as.double(offset[sorted] - mean(offset)),
             as.double(weights),
             newstrat,
             means = double(nvar),
             coef = as.double(init),
             u = double(nvar),
             imat = double(nvar * nvar),
             loglik = double(2),
             flag = integer(1),
             double(2 * n + 2 * nvar * nvar + 3 * nvar),
             as.double(control$eps),
             as.double(control$toler.chol),
             sctest = as.double(method == "efron"),
             PACKAGE = "survival")

looks much better

R> coxfit$coef
 [1]  0.25123203  0. -0.42595541 -0.04488913 -0.26061995  0.44426458
 [7] -0.38954286 -0.43081374  0.79573107  0.48234405  0.94636357 -0.25193465
[13]  1.15619712  0.32651765 -1.06731019 -0.24249939

I nailed the problem down to lines 261ff of `coxfit2.c' where

    zbeta = offset[person];
    for (i=0; i

__
https://stat.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] R2.0.0 bug in function vcov in library survival (PR#7266)
> > Have just compiled and installed R-2.0.0.tar.gz running SuSE9.0.
> > The function vcov does not accept a "coxph" object as input any longer.
> > The same R program running under R 1.9.1 does work. R program attached
> > below.
> > Exporting the coxph object from R2.0.0 to R1.9.1, I get vcov output in
> > R1.9.1. Exporting the coxph object from R1.9.1 to R2.0.0, I get errors
> > in R2.0.0.
> >
> > === Copy of the R-program ===
> > R : Copyright 2004, The R Foundation for Statistical Computing
> > Version 2.0.0 (2004-10-04), ISBN 3-900051-07-0
> >
> > R is free software and comes with ABSOLUTELY NO WARRANTY.
> > You are welcome to redistribute it under certain conditions.
> > Type 'license()' or 'licence()' for distribution details.
> >
> > R is a collaborative project with many contributors.
> > Type 'contributors()' for more information and
> > 'citation()' on how to cite R or R packages in publications.
> >
> > Type 'demo()' for some demos, 'help()' for on-line help, or
> > 'help.start()' for a HTML browser interface to help.
> > Type 'q()' to quit R.
> >
> > [Previously saved workspace restored]
> >
> > > rm(list=ls())
> > > load("~/R/wlh0301/data/breast.RData")
> > > library(survival)
> > Loading required package: splines
> > > u2 <- coxph(Surv(time,event) ~ age + cbmi + smoker + drink,
> > +             data=breast.1)
> > > tmp <- vcov(u2)
> > Error in vcov(u2) : no applicable method for "vcov"
> > > class(u2)
> > [1] "coxph"

S3method(vcov, coxph) is missing from the NAMESPACE file of package
`survival', I think. For the meantime you may use

R> survival:::vcov.coxph
function (object, ...)
{
    rval <- object$var
    dimnames(rval) <- list(names(coef(object)), names(coef(object)))
    rval
}

Torsten

> __
> [EMAIL PROTECTED] mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
[EMAIL PROTECTED] mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
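For reference, the registration suspected to be missing would be a single line in the NAMESPACE file of `survival'; a sketch (the surrounding entries shown here are hypothetical):

```
export(coxph, Surv)
S3method(print, coxph)
S3method(vcov, coxph)
```

With that line in place, the generic vcov() dispatches to vcov.coxph without the `survival:::` workaround.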
Re: [Rd] R and C++ code
> Hello,
> I want to use a C++ library under R, but I have problems.
> I have read the manual "Writing R Extensions" and the section
> "Interfacing C++ code".
> I have created:
>
> // X.cpp
> #include "X.h"
> #include <iostream>
>
> X::X()
> {
>     std::cout << "constructor X" << std::endl;
> }
>
> X::~X()
> {
> }
>
> // X.h
> #ifndef X_H
> #define X_H
>
> class X
> {
> public:
>     X();
>     ~X();
> };
> #endif
>
> // X_main.cpp
> #include "X.h"
>
> extern "C" {
>
> void X_main(){
>     X x;
> }
> }
>
> // essai.R

did you load the shared library via

  dyn.load("X_main.so")

?

Torsten

> foo <- function()
> {.C("X_main");}
>
> When I execute the file essai.R, the R console returns:
>
> Error in .C("X_main") : C function name not in load table.
>
> If you can help me.
> Thank you.
>
> Sandrine
>
> __
> [EMAIL PROTECTED] mailing list
> https://www.stat.math.ethz.ch/mailman/listinfo/r-devel

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Segfault: .Call and classes with logical slots
On Mon, 26 Apr 2004, John Chambers wrote:

> I think you need to PROTECT the vector you're putting in the slot as
> well as the overall object. At any rate, the problem goes away for me
> with the revised version of dummy.c below.

yes, and it seems that PROTECT'ing the logical variable is sufficient,
while PROTECT'ing the class but not the logical variable causes a
segfault again. I tried with numeric slots too: no problems.

> (Empirically, PROTECT'ing
> the class definition didn't seem to be needed, but experience suggests
> that too much protection is better than too little.)

I tried to save (UN)PROTECT calls for efficiency reasons. Anyway, this
helps me a lot, thanks!

Torsten

> #include <Rdefines.h>
>
> SEXP foo() {
>
>     SEXP ans, cl, el;
>
>     PROTECT(cl = MAKE_CLASS("test"));
>     PROTECT(ans = NEW_OBJECT(cl));
>     PROTECT(el = allocVector(LGLSXP, 1));
>     SET_SLOT(ans, install("lgl"), el);
>     LOGICAL(GET_SLOT(ans, install("lgl")))[0] = TRUE;
>     UNPROTECT(3);
>     return(ans);
> }
>
> Torsten Hothorn wrote:
> >
> > Hi,
> >
> > the following example, aiming at a class containing a logical slot,
> > segfaults under R-1.9.0 when `gctorture(on = TRUE)' is used:
> >
> > C code (dummy.c):
> >
> > #include <Rdefines.h>
> >
> > SEXP foo() {
> >
> >     SEXP ans;
> >
> >     PROTECT(ans = NEW_OBJECT(MAKE_CLASS("test")));
> >     SET_SLOT(ans, install("lgl"), allocVector(LGLSXP, 1));
> >     LOGICAL(GET_SLOT(ans, install("lgl")))[0] = TRUE;
> >     UNPROTECT(1);
> >     return(ans);
> > }
> >
> > R code (dummy.R):
> >
> > dyn.load("dummy.so")
> >
> > setClass("test", representation = representation(lgl = "logical"))
> >
> > a = .Call("foo")
> > a # OK
> >
> > gctorture(on = TRUE)
> > a = .Call("foo")
> > gctorture(on = FALSE)
> > a # segfault
> >
> > which gives
> >
> > R> dyn.load("dummy.so")
> > R>
> > R> setClass("test", representation = representation(lgl = "logical"))
> > [1] "test"
> > R>
> > R> a = .Call("foo")
> > R> a
> > An object of class "test"
> > Slot "lgl":
> > [1] TRUE
> >
> > R> gctorture(on = TRUE)
> > R> a = .Call("foo")
> > R> gctorture(on = FALSE)
> > Segmentation fault
> >
> > Best,
> >
> > Torsten
> >
> > R> version
> >          _
> > platform i686-pc-linux-gnu
> > arch     i686
> > os       linux-gnu
> > system   i686, linux-gnu
> > status
> > major    1
> > minor    9.0
> > year     2004
> > month    04
> > day      12
> > language R
> >
> > ___
> > | Dr. rer. nat. Torsten Hothorn |
> > | Institut fuer Medizininformatik, Biometrie und Epidemiologie |
> > | Waldstrasse 6, D-91054 Erlangen, Deutschland |
> > | Tel: ++49-9131-85-22707 (dienstl.) |
> > | Fax: ++49-9131-85-25740 |
> > | Email: [EMAIL PROTECTED] |
> > | PLEASE send emails cc to [EMAIL PROTECTED] |
> > | Web: http://www.imbe.med.uni-erlangen.de/~hothorn |
> > ___
> >
> > __
> > [EMAIL PROTECTED] mailing list
> > https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
>
> --
> John M. Chambers  [EMAIL PROTECTED]
> Bell Labs, Lucent Technologies  office: (908)582-2681
> 700 Mountain Avenue, Room 2C-282  fax: (908)582-3340
> Murray Hill, NJ 07974  web: http://www.cs.bell-labs.com/~jmc

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] Segfault: .Call and classes with logical slots
Hi,

the following example, aiming at a class containing a logical slot,
segfaults under R-1.9.0 when `gctorture(on = TRUE)' is used:

C code (dummy.c):

#include <Rdefines.h>

SEXP foo() {

    SEXP ans;

    PROTECT(ans = NEW_OBJECT(MAKE_CLASS("test")));
    SET_SLOT(ans, install("lgl"), allocVector(LGLSXP, 1));
    LOGICAL(GET_SLOT(ans, install("lgl")))[0] = TRUE;
    UNPROTECT(1);
    return(ans);
}

R code (dummy.R):

dyn.load("dummy.so")

setClass("test", representation = representation(lgl = "logical"))

a = .Call("foo")
a # OK

gctorture(on = TRUE)
a = .Call("foo")
gctorture(on = FALSE)
a # segfault

which gives

R> dyn.load("dummy.so")
R>
R> setClass("test", representation = representation(lgl = "logical"))
[1] "test"
R>
R> a = .Call("foo")
R> a
An object of class "test"
Slot "lgl":
[1] TRUE

R> gctorture(on = TRUE)
R> a = .Call("foo")
R> gctorture(on = FALSE)
Segmentation fault

Best,

Torsten

R> version
         _
platform i686-pc-linux-gnu
arch     i686
os       linux-gnu
system   i686, linux-gnu
status
major    1
minor    9.0
year     2004
month    04
day      12
language R

___
| Dr. rer. nat. Torsten Hothorn |
| Institut fuer Medizininformatik, Biometrie und Epidemiologie |
| Waldstrasse 6, D-91054 Erlangen, Deutschland |
| Tel: ++49-9131-85-22707 (dienstl.) |
| Fax: ++49-9131-85-25740 |
| Email: [EMAIL PROTECTED] |
| PLEASE send emails cc to [EMAIL PROTECTED] |
| Web: http://www.imbe.med.uni-erlangen.de/~hothorn |
___

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] `extends()'at C-level?
One can check easily if an object is of class "foo" via

  strcmp(CHAR(asChar(GET_CLASS(obj))), "foo")

and I wonder if there is high-level functionality to check whether `obj'
extends a class?

Best,

Torsten

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] Memory Protection & calling C-fun from C
Good morning!

The descriptions of memory protection all assume that one is calling a
C function directly from R. I'm not sure whether my understanding of
calling a C function from another C function is correct: suppose there
are two functions

  SEXP bar(SEXP y) {
      SEXP b;
      PROTECT(b = allocVector(...));
      /* ... computations on b ... */
      UNPROTECT(1);
      return b;
  }

and

  SEXP foo(SEXP x) {
      SEXP a;
      PROTECT(a = bar(x));   /* use bar to do low-level computations */
      /* ... further computations on a ... */
      UNPROTECT(1);
      return a;
  }

Of course,

  R> .Call("bar", x)

is safe, but is

  R> .Call("foo", x)

too? May it happen that the object `b' points to is destroyed before it
is protected by `PROTECT(a = bar(x))'? I searched for examples of that
in R and some packages but only found some where `bar' is defined by

  double* bar(double *y)

and the problems do not occur. Anyway, I want to be able to call both
`foo' and `bar' directly from R and at C level (say, for the sake of
writing tests in R for the low-level functions).

Any clarification is very welcome && thanks in advance,

Torsten

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] Global assignment with S4 objects in R 1.9.0 beta
Hi,

some change during the last 10 days breaks code where slots of an object
are changed globally:

setClass("mylist", contains = "list")

setClass("dummy", representation = representation(a = "mylist"))

foo1 = function(i, x) {
    [EMAIL PROTECTED] <<- x   ### change a slot
}

foo2 = function() {
    ### define an object
    mydummy <<- new("dummy")
    a <- vector(length = 10, mode = "list")
    class(a) <- "mylist"
    [EMAIL PROTECTED] <<- a
    for (i in 1:10) foo1(i, i+1)
    mydummy
}

try(thisdummy <- foo2())
unlist([EMAIL PROTECTED])

This one works as expected:

R : Copyright 2004, The R Foundation for Statistical Computing
Version 1.9.0 alpha (2004-03-18), ISBN 3-900051-00-3

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for a HTML browser interface to help.
Type 'q()' to quit R.

> invisible(options(echo = TRUE))
>
> setClass("mylist", contains = "list")
[1] "mylist"
>
> setClass("dummy", representation = representation(
+     a = "mylist"))
[1] "dummy"
>
> foo1 = function(i, x) {
+     [EMAIL PROTECTED] <<- x
+ }
>
> foo2 = function() {
+     mydummy <<- new("dummy")
+     a <- vector(length = 10, mode = "list")
+     class(a) <- "mylist"
+     [EMAIL PROTECTED] <<- a
+     for (i in 1:10) foo1(i, i+1)
+     mydummy
+ }
>
> try(thisdummy <- foo2())
> unlist([EMAIL PROTECTED])
 [1]  2  3  4  5  6  7  8  9 10 11
> proc.time()
[1] 1.99 0.06 2.03 0.00 0.00

and with yesterday's R-devel it fails:

R : Copyright 2004, The R Foundation for Statistical Computing
Version 1.9.0 beta (2004-03-28), ISBN 3-900051-00-3

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for a HTML browser interface to help.
Type 'q()' to quit R.

R> invisible(options(echo = TRUE))
R>
R> setClass("mylist", contains = "list")
[1] "mylist"
R>
R> setClass("dummy", representation = representation(
+     a = "mylist"))
[1] "dummy"
R>
R> foo1 = function(i, x) {
+     [EMAIL PROTECTED] <<- x
+ }
R>
R> foo2 = function() {
+     mydummy <<- new("dummy")
+     a <- vector(length = 10, mode = "list")
+     class(a) <- "mylist"
+     [EMAIL PROTECTED] <<- a
+     for (i in 1:10) foo1(i, i+1)
+     mydummy
+ }
R>
R> try(thisdummy <- foo2())
Error in foo1(i, i + 1) : Object "*tmp*" not found
R> unlist([EMAIL PROTECTED])
Error in unlist([EMAIL PROTECTED]) : Object "thisdummy" not found
Execution halted

Best,

Torsten

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] plot.dendrogram and expressions
Hi,

currently the "label" and "edgetext" attributes of a dendrogram are
coerced to character before they are added to a plot with `text'. Is
there a specific reason to do so (except for the determination of the
size of the character string to be plotted)? Otherwise one could plot
the attributes directly via

diff dendrogram.R /usr/src/R/src/library/stats/R/dendrogram.R
317c317
<             text(xBot, yBot + vln, attr(child, "label"))
---
>             text(xBot, yBot + vln, nodeText)
337c337
<             text(mx, my, attr(child, "edgetext"))
---
>             text(mx, my, edgeText)
344c344
<             text(xBot, my, attr(child, "edgetext"))
---
>             text(xBot, my, edgeText)

and one could use expressions for plotting symbols.

Best,

Torsten

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
[Rd] Re: [R] chisq.test freezing on certain inputs (PR#5701)
On Thu, 11 Dec 2003, Jeffrey Chang wrote:

> Hello everybody,
>
> I'm running R 1.8.1 on both Linux and OS X compiled with gcc 3.2.2 and
> 3.3, respectively. The following call seems to freeze the interpreter
> on both systems:
>
> chisq.test(matrix(c(233, 580104, 3776, 5786104), 2, 2),
>            simulate.p.value=TRUE)
>
> By freeze, I mean, the function call never returns (running > 10 hours
> so far), the process is unresponsive to SIGINT (but I can kill it with
> TERM), and the process still consumes cycles on the CPU.

This is due to calling `exp' with a very small value leading to a zero
return value in rcont2 (src/appl/rcont.c) line 70:

    x = exp(fact[iap - 1] + fact[ib] + fact[ic] + fact[idp - 1] -
            fact[ie] - fact[nlmp - 1] - fact[igp - 1] -
            fact[ihp - 1] - fact[iip - 1]);
    if (x >= dummy) {
        goto L160;
    }
    sumprb = x;
    y = x;

y is never checked for zero and later on

    L150:
        if (lsm) {
            goto L155;
        }
        /* Decrement entry in row L, column M */
        j = nll * (ii + nll);
        if (j == 0) {
            goto L154;
        }
        --nll;
        y = y * j / (double) ((id - nll) * (ia - nll));
        sumprb += y;
        if (sumprb >= dummy) {
            goto L159;
        }
        if (! lsp) {
            goto L140;
        }
        goto L150;

y has no chance of becoming larger than zero and we are in the goto trap
until the end of time. A simple fix would be checking for zero but I
don't know how one would proceed in this case ...

Best,

Torsten

__
[EMAIL PROTECTED] mailing list
https://www.stat.math.ethz.ch/mailman/listinfo/r-devel
Re: [Rd] Using log() on an openMosix cluster
On Fri, 21 Nov 2003, Roger D. Peng wrote: > Hi all, I was hoping to get some advice about a problem that I realize > will be difficult to reproduce for some people. I'm running R 1.7.1 on > an openMosix (Linux) cluster and have been experiencing some odd > slow-downs. If anyone has experience with such a setup (or a similar > one) I'd appreciate any help. Here's a simplified version of the problem. > > I'm trying to run the following code: > ## > N <- 10; a <- numeric(N); b <- numeric(N) > e <- rnorm(N) > > for(i in 1:N) { > a[i] <- exp(e[i]) > b[i] <- log(abs(a[i])) > } > ## > > When I run it on the head node, everything is fine. However, when I > send the R process off to one of the cluster nodes (i.e. using mosrun > from the head node) the program takes about 10 times longer (in > wall-clock time, cpu time is roughly the same). > Did you adapt the sig*jmp definitions in src/include/Defn.h? This was necessary until R-1.7.1 and is no longer needed, thanks to Luke's changes in 1.8.0: o On Unix-like systems interrupt signals now set a flag that is checked periodically rather than calling longjmp from the signal handler. This is analogous to the behavior on Windows. This reduces responsiveness to interrupts but prevents bugs caused by interrupting computations in a way that leaves the system in an inconsistent state. It also reduces the number of system calls, which can speed up computations on some platforms and make R more usable with systems like Mosix. I tried the example above with N = 1.000.000: N <- 100; a <- numeric(N); b <- numeric(N) e <- rnorm(N) for(i in 1:N) { a[i] <- exp(e[i]) b[i] <- log(abs(a[i])) } cat(proc.time()) with R-1.8.0 with Linux 2.4.22 and OpenMosix-Patch and started 10 processes which migrated immediately. 
[EMAIL PROTECTED]:~/tmp/log$ grep -1 cat *.Rout
log1.Rout-R>
log1.Rout:R> cat(proc.time())
log1.Rout-37.04 1.02 43.44 0 0.01R>
--
log10.Rout-R>
log10.Rout:R> cat(proc.time())
log10.Rout-34.25 0.45 40.21 0 0.01R>
--
log2.Rout-R>
log2.Rout:R> cat(proc.time())
log2.Rout-22.19 0.33 29.36 0 0R>
--
log3.Rout-R>
log3.Rout:R> cat(proc.time())
log3.Rout-24.46 0.42 32.96 0 0.03R>
--
log4.Rout-R>
log4.Rout:R> cat(proc.time())
log4.Rout-36.88 0.38 40.73 0 0.02R>
--
log5.Rout-R>
log5.Rout:R> cat(proc.time())
log5.Rout-34.79 0.52 42.83 0.02 0R>
--
log6.Rout-R>
log6.Rout:R> cat(proc.time())
log6.Rout-34.14 0.54 41.46 0 0.01R>
--
log7.Rout-R>
log7.Rout:R> cat(proc.time())
log7.Rout-35.21 0.66 43.4 0 0R>
--
log8.Rout-R>
log8.Rout:R> cat(proc.time())
log8.Rout-25.27 0.55 33.77 0 0.01R>
--
log9.Rout-R>
log9.Rout:R> cat(proc.time())
log9.Rout-36.69 0.44 43.16 0.01 0R>

So, everything is fine here. I guess using R-1.8.1 will fix your problem.

Torsten

> Interestingly, when I tried running the following code:
> ##
> N <- 10; a <- numeric(N); b <- numeric(N)
> e <- rnorm(N)
>
> for(i in 1:N) {
>     a[i] <- exp(e[i])
>     b[i] <- exp(abs(a[i]))
> }
> ##
>
> I didn't experience any slow-down! That is, the wall-clock time is the
> same when run on the head node or on the cluster nodes. The only
> difference between the two programs is that one takes a log in the
> for() loop and the other one takes an exponential.
>
> I guess my question is why would taking the log() produce a 10-fold
> increase in runtime? I know that Mosix clusters can experience serious
> performance hits if you make a lot of system calls or write out data to
> files, but I don't think I'm doing that here. Is there some major
> difference in the way that exp() and log() are implemented?
>
> I'm pretty sure this isn't an R problem but I'm wondering if R is doing
> something behind the scenes that's affecting performance in the
> openMosix setting.
>
> Thanks in advance for any help.
>
> -roger
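Independent of the signal-handling issue, the element-by-element loop in the timing example above can be vectorized, which avoids per-iteration interpreter overhead entirely. A sketch of the same computation in vectorized form:

```r
## Vectorized version of the benchmark loop: exp() and log() are
## applied to whole vectors at once instead of element by element.
N <- 1e6
e <- rnorm(N)
t1 <- proc.time()
a <- exp(e)
b <- log(abs(a))
t2 <- proc.time()
print(t2 - t1)                 # elapsed time, typically far below the loop

## sanity check: log(abs(exp(e))) recovers e, since exp(e) > 0
stopifnot(isTRUE(all.equal(b, e)))
```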
Re: [Rd] function 'density' in r-devel
> Hi,
>
> Just to report.
>
> In yesterday's r-devel I get:
>
> d <- density(rnorm(100))
> Error in var(x, na.rm = na.rm) : 3 arguments passed to "cov" which requires 4.

you are the third one discovering this :-) cov.R and var.R were removed
from the base/R directory but we all failed to update the sources
correctly. Just rebuild from a fresh checkout.

Best,

Torsten

> Regards,
>
> L.
Re: [Rd] dwilcox (PR#4212)
> Full_Name: Mark J. Lamias
> Version: 1.7.0
> OS: Windows 2000 Pro
> Submission from: (NULL) (65.222.84.72)
>
> I am running the qwilcox procedure and it is producing incorrect
> results. For example, dwilcox(.025, 3, 5)

not really:

    R> dwilcox(.025, 3, 5)
    [1] 0

which is natural since the statistic can take integer values only.

> should equal 6, but it is equal to 1. Similarly, dwilcox(.025, 3, 6)
> should equal 7, but it equals 2. The critical values being returned
> are not correct. I've verified this with a program that performs
> direct enumeration to determine the appropriate critical values for
> .05 (two-tail):
>
> n1 n2  n  critical_value
>  3  5  8  6

I think you failed to notice what `?qwilcox' tries to tell you:

     This distribution is obtained as follows. Let 'x' and 'y' be two
     random, independent samples of size 'm' and 'n'. Then the Wilcoxon
     rank sum statistic is the number of all pairs '(x[i], y[j])' for
     which 'y[j]' is not greater than 'x[i]'. This statistic takes
     values between '0' and 'm * n', and its mean and variance are
     'm * n / 2' and 'm * n * (m + n + 1) / 12', respectively.
Moreover, it is documented that `probabilities are P[X <= x]' and
therefore

    R> qwilcox(.025, 3, 5) + 3*4/2
    [1] 7

means "the smallest x with P(W <= x) >= 0.025 is 7", which you can check
easily:

    R> pwilcox(7 - 3*4/2, 3, 5)
    [1] 0.03571429

whereas following your calculations

    R> pwilcox(6 - 3*4/2, 3, 5)
    [1] 0.01785714

Best,

Torsten

>  3  6  9  7
>  3  7 10  7
>  3  8 11  8
>  3  9 12  8
>  3 10 13  9
>  3 11 14  9
>  3 12 15 10
>  3 13 16 10
>  3 14 17 11
>  3 15 18 11
>  3 16 19 12
>  3 17 20 12
>  3 18 21 13
>  3 19 22 13
>  3 20 23 14
>  3 21 24 14
>  3 22 25 15
>  3 23 26 15
>  3 24 27 16
>  3 25 28 16
>  3 26 29 17
>  3 27 30 17
>  3 28 31 18
>  3 29 32 19
>  3 30 33 19
>  4  4  8 10
>  4  5  9 11
>  4  6 10 12
>  4  7 11 13
>  4  8 12 14
>  4  9 13 14
>  4 10 14 15
>  4 11 15 16
>  4 12 16 17
>  4 13 17 18
>  4 14 18 19
>  4 15 19 20
>  4 16 20 21
>  4 17 21 21
>  4 18 22 22
>  4 19 23 23
>  4 20 24 24
>  4 21 25 25
>  4 22 26 26
>  4 23 27 27
>  4 24 28 27
>  4 25 29 28
>  4 26 30 29
>  4 27 31 30
>  4 28 32 31
>  4 29 33 32
>  4 30 34 33
>  5  5 10 17
>  5  6 11 18
>  5  7 12 20
>  5  8 13 21
>  5  9 14 22
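The shift by 3*4/2 in the reply above converts between R's parameterization of the rank-sum statistic (which ranges over 0..m*n) and the table-style value that includes the minimum rank sum m*(m+1)/2. A small consistency check of that relationship:

```r
## qwilcox(p, m, n) returns the smallest x with P(W <= x) >= p, where
## W is R's rank-sum statistic on 0..m*n; classical tables report
## W + m*(m+1)/2 instead.
m <- 3; n <- 5; p <- 0.025
q <- qwilcox(p, m, n)
stopifnot(pwilcox(q, m, n) >= p)                 # quantile definition
stopifnot(q == 0 || pwilcox(q - 1, m, n) < p)    # and it is the smallest such x
stopifnot(q + m * (m + 1) / 2 == 7)              # table-style critical value
```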
Re: [Rd] `var' broken in 1.8.0 alpha (2003-09-15)
> Yes these two are related, but the e-mail subject is wrong:
> It's *your* fault because you must have unpacked the new version
> on top of an older one.
> The official alpha versions (snapshots from here, or rsync) do
> not have your problem:
> Before a few days ago, there were files
>     src/library/base/R/var.R
>     src/library/base/R/cov.R
> the newer R-alpha do not have them -- and if you use the old
> files where you shouldn't --> you get the problem.

yes, that's the problem, however it is a problem with rsync. I update the
tree via

    rsync -rC rsync.r-project.org::r-devel R

and this does not remove those files:

    [EMAIL PROTECTED]:~/software/R/src/library/base/R$ ls -la cov.R
    -rw-r--r--   1 hothorn  users   375 Sep 12 04:00 cov.R
    [EMAIL PROTECTED]:~/software/R/src/library/base/R$ ls -la var.R
    -rw-r--r--   1 hothorn  users   361 Sep 12 04:00 var.R

A possible fix is using

    rsync -rC --delete rsync.r-project.org::r-devel R

(maybe one should add a hint in section 2.4 of the FAQ). Thank you for
pointing this out!

Best,

Torsten

> Regards,
> Martin Maechler <[EMAIL PROTECTED]> http://stat.ethz.ch/~maechler/
> Seminar fuer Statistik, ETH-Zentrum LEO C16    Leonhardstr. 27
> ETH (Federal Inst. Technology)    8092 Zurich    SWITZERLAND
> phone: x-41-1-632-3408    fax: ...-1228    <><
[Rd] `var' broken in 1.8.0 alpha (2003-09-15)
Hi,

in last night's alpha version, `var' is broken:

    R> var(rnorm(100))
    Error in var(rnorm(100)) : 3 arguments passed to "cov" which requires 4.

which I suspect is due to recent changes to `cov'. The same is true for

    R> cov(rnorm(100), rnorm(100))
    Error in cov(rnorm(100), rnorm(100)) : 3 arguments passed to "cov" which requires 4.

Best,

Torsten

    R> version
             _
    platform i686-pc-linux-gnu
    arch     i686
    os       linux-gnu
    system   i686, linux-gnu
    status   alpha
    major    1
    minor    8.0
    year     2003
    month    09
    day      15
    language R
[Rd] tracing nonstandardGenericFunction
Hi,

how can one trace a nonstandardGenericFunction, especially "initialize"?
An example:

    setClass("dummy", representation(a = "numeric"))

    setMethod("initialize", "dummy", function(.Object, a = 2) {
        ### I want to trace this function
        .Object@a <- a
        .Object
    })

    setMethod("show", "dummy", function(object) print(object@a))

    b <- new("dummy", a = 3)

    trace("show", browser, signature = "dummy")  ### trace method "show" for
                                                 ### class "dummy"
    show(b)                                      ### works fine

    trace("initialize", browser, signature = "dummy")

    R> b <- new("dummy")   ### does not trace the function of interest ...
    Tracing initialize(value, ...) on entry
    Called from: initialize(value, ...)

Best,

Torsten
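One blunt workaround, bypassing trace() entirely, is to redefine the method with an explicit browser() call at the point of interest and remove it when done; a sketch, reusing the class definition from the question:

```r
## Instead of trace(), temporarily redefine the method with browser()
## inserted where you want to stop; restore the clean definition later.
setClass("dummy", representation(a = "numeric"))
setMethod("initialize", "dummy", function(.Object, a = 2) {
  browser()          # drops into the debugger whenever new("dummy", ...) runs
  .Object@a <- a
  .Object
})
b <- new("dummy", a = 3)   # an interactive session stops in the method body
```

This is cruder than tracing but reliably hits the method body itself rather than the generic's dispatch wrapper.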
Re: [Rd] wilcox.test, CI (PR#3666)
> Full_Name: David Wooff
> Version: 1.7.0
> OS: i686-pc-linux-gnu
> Submission from: (NULL) (129.234.4.10)
>
> wilcox.test exits with an error message when a confidence interval is
> required, under some situations. I suspect this occurs when the data
> contain a zero and for some data lengths only:
>
> print(wilcox.test(c(2,1,4,3,6,-5,0),conf.int=T))

The conditional distribution cannot be computed for data with zeros in
'wilcox.test' and therefore the asymptotic distribution is used instead:

    R> wilcox.test(c(2,1,4,3,6,-5,0), conf.int = FALSE)

            Wilcoxon signed rank test with continuity correction

    data:  c(2, 1, 4, 3, 6, -5, 0)
    V = 16, p-value = 0.2945
    alternative hypothesis: true mu is not equal to 0

    Warning message:
    Cannot compute exact p-value with zeroes in:
    wilcox.test.default(c(2, 1, 4, 3, 6, -5, 0), conf.int = FALSE)

and in this special case 'uniroot' fails when searching for the
confidence limits (which I wouldn't trust anyway for such small sample
sizes). As a workaround you can use

    R> wilcox.exact(c(2,1,4,3,6,-5,0), conf.int = TRUE)

            Exact Wilcoxon signed rank test

    data:  c(2, 1, 4, 3, 6, -5, 0)
    V = 16, p-value = 0.3125
    alternative hypothesis: true mu is not equal to 0
    95 percent confidence interval:
     -5  6
    sample estimates:
    (pseudo)median
               2.5

from package 'exactRankTests', which derives the confidence limits from
the conditional distribution.

Best,

Torsten

> fails
>
> print(wilcox.test(c(2,1,4,3,6,-5,0,1),conf.int=T))
>
> works
>
> print(wilcox.test(c(2,1,4,3,6,-5,1),conf.int=T))
>
> works
>
> Apologies if this is known - did search bug lists first.
>
> David.
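The statistic V = 16 appearing in both outputs above can be reproduced by hand under the usual convention: zeros are dropped, the absolute values are ranked, and the ranks belonging to positive observations are summed. A sketch:

```r
## Signed-rank statistic by hand: drop zeros (they carry no sign
## information), rank |x|, and sum the ranks of the positive values.
x <- c(2, 1, 4, 3, 6, -5, 0)
x <- x[x != 0]
V <- sum(rank(abs(x))[x > 0])
stopifnot(V == 16)             # matches wilcox.test()'s reported V
```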
[Rd] typo in R-exts
Hi,

the definition of `fsign' in R-exts currently is

    double fsign (double x, double y)                          Function
         Performs "transfer of sign" and is defined as |x| * sign(x)

but should read

    double fsign (double x, double y)                          Function
         Performs "transfer of sign" and is defined as |x| * sign(y)
                                                              ^^^^^^^

Best,

Torsten
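For illustration, the corrected definition is the familiar Fortran SIGN intrinsic; a hypothetical R transcription (the real fsign lives in R's C API, not in R code):

```r
## fsign(x, y) transfers the sign of y onto the magnitude of x,
## i.e. |x| * sign(y) -- the Fortran SIGN(x, y) intrinsic.
## (Edge case: R's sign(0) is 0, unlike Fortran's SIGN, so this
## transcription returns 0 when y == 0.)
fsign <- function(x, y) abs(x) * sign(y)

stopifnot(fsign( 3, -1) == -3)
stopifnot(fsign(-3,  2) ==  3)
```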
[Rd] brackets in prompt.data.frame (PR#2676)
prompt.data.frame produces Rd code with unbalanced brackets:

    data(iris)
    prompt(iris, filename=NA)

gives

    ...
    $format
     [1] "\\format{"
     [2] "    A data frame with 150 observations on the following 5 variables."
     [3] "  \\describe{"
     [4] "\\item{Sepal.Length}{a numeric vector}"
     [5] "\\item{Sepal.Width}{a numeric vector}"
     [6] "\\item{Petal.Length}{a numeric vector}"
     [7] "\\item{Petal.Width}{a numeric vector}"
     [8] "\\item{Species}{a factor with levels}"
     [9] "\\item{Species}{\\code{setosa} }"
    [10] "\\item{Species}{\\code{versicolor} }"
    [11] "\\item{Species}{\\code{virginica} }"
    [12] "}"

where the closing bracket for "describe" is missing. The fix seems to be
simply changing

    fmt <- c(fmt, "}")

to

    fmt <- c(fmt, " } \n } ")

in line 208 of "prompt.R".

Torsten
[Rd] cex.axis in boxplot (PR#2628)
Hi,

the graphical parameter "cex.axis" does not have any effect in "boxplot":

    data(iris)
    boxplot(iris[,1:4], ylab="y", cex.lab=2, cex.axis=2)

The patch is simply adding "cex.axis" to the search for axis-relevant
parameters in "bxp":

    ax.pars <- pars[names(pars) %in% c("xaxt", "yaxt", "las", "cex.axis")]

Best,

Torsten

--please do not edit the information below--

Version:
 platform = i686-pc-linux-gnu
 arch = i686
 os = linux-gnu
 system = i686, linux-gnu
 status = Under development (unstable)
 major = 1
 minor = 7.0
 year = 2003
 month = 03
 day = 11
 language = R

Search Path:
 .GlobalEnv, package:methods, package:ctest, package:mva, package:modreg,
 package:nls, package:ts, Autoloads, package:base
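Until such a patch is in place, one possible workaround (an assumption on my part, relying on axis() reading its defaults from par()) is to set cex.axis globally rather than passing it to boxplot():

```r
## Workaround sketch: set cex.axis via par() so axis() picks it up
## even though bxp() does not forward the argument.
data(iris)
op <- par(cex.axis = 2)             # save the old settings
boxplot(iris[, 1:4], ylab = "y", cex.lab = 2)
par(op)                             # restore
```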
[Rd] class of integers
Hi,

I'm a little bit confused about the class of integers (with yesterday's
r-devel):

    R> a <- 1:10
    R> class(a)
    [1] "integer"
    R> inherits(a, "integer")
    [1] FALSE
    R> data.class(a)
    [1] "numeric"
    R> is.numeric(a)
    [1] TRUE
    R> inherits(a, "numeric")
    [1] FALSE

data.class is consistent with R-1.6.2, ok. The class of "a" is integer,
also ok. First: why does "inherits" state that "a" is of neither class
integer nor numeric? Second: one possible way of writing portable code
(between 1.6.2 and 1.7.0) for generics is using

    foo <- function(y, ...) {
        if (is.null(class(y)))
            class(y) <- data.class(y)
        UseMethod("foo", y, ...)
    }

    foo.default <- function(y, ...) {
        stop(paste("Do not know how to handle objects of class", class(y)))
    }

(the thread "[Rd] Methods package is now attached by default" around
Jan 20th discussed this). If I have

    foo.numeric <- function(y, ...) ...

this works with R-1.6.2, but since foo now dispatches on "class(y)" this
fails for integers with r-devel. Does this mean that I have to implement
foo.integer methods?

Best,

Torsten
Re: [Rd] package mvtnorm: sigma parameter in pmvnorm() (PR#2478)
On Tue, 21 Jan 2003 [EMAIL PROTECTED] wrote:

> Full_Name: Jerome Asselin
> Version: 1.6.2
> OS: RedHat Linux 7.2
> Submission from: (NULL) (142.103.173.179)
>
> pmvnorm() may fail for a univariate distribution when
> its parameter "sigma" is defined as a matrix. It will
> fail if sigma < 1.

confirmed and fixed in mvtnorm_0.5-8, soon at CRAN. Thanks for the
report!

Please do NOT file a bug report for contributed packages with priority
less than "recommended", since the package maintainers do not have write
access to the bugs database anyway. Moreover, cc the report to the
maintainer (which is the default way of reporting bugs in packages).

Best,

Torsten

> library(mvtnorm)
>
> # THIS WORKS
> > pmvnorm(lower=-Inf, upper=2, mean=0, sigma=matrix(1.5))
> [1] 0.9487648
> attr(,"error")
> [1] 0
> attr(,"msg")
> [1] "univariate: using pnorm"
>
> # THIS FAILS
> > pmvnorm(lower=-Inf, upper=2, mean=0, sigma=matrix(.5))
> Error in checkmvArgs(lower = lower, upper = upper, mean = mean, corr = corr, :
>         diag(sigma) and lower are of different length
>
> # THIS WORKS
> > pmvnorm(lower=-Inf, upper=2, mean=0, sigma=.5)
> [1] 0.9976611
> attr(,"error")
> [1] 0
> attr(,"msg")
> [1] "univariate: using pnorm"
Re: [Rd] Methods package is now attached by default
On Sun, 19 Jan 2003, Kurt Hornik wrote:

> >>>>> Prof Brian D Ripley writes:
>
> > On Fri, 17 Jan 2003, John Chambers wrote:
> >> There are two main known differences from having methods attached:
> >>
> >> - the definition of class() changes, in particular no object ever
> >> has a NULL class. If you have code that depends on tests such as
> >> `if(is.null(class(x)))...', there may be problems.
> >>
> >> Usually code with those sorts of tests is doing a workaround of the
> >> fact that not all objects had a class before. The best solution is
> >> usually to ask what the code really wants to do. If you do have to
> >> retain exactly the old behavior, one solution is to copy the
> >> versions of class and class<- from the base package and use those
> >> (as baseClass and baseClass<-, e.g.) instead of class and class<-.
>
> > Here is one example, which makes the MASS scripts fail.
>
> >> library(MASS)
> >> corresp
> > function (x, ...)
> > {
> >     if (is.null(class(x)))
> >         class(x) <- data.class(x)
> >     UseMethod("corresp", x, ...)
> > }
>
> > That used to work with matrices, and dispatch to corresp.matrix.
> > Now a matrix has reported class "matrix" but dispatch occurs to
> > corresp.default. The temporary fix is to remove the is.null
> > condition.
>
> > That's quite a common construction, and I think I should expect
> > UseMethod to dispatch on the class class() reports. So it looks to
> > me as if UseMethod needs to be altered to do so.

the same happens e.g. with class "numeric":

    ipredbagg <- function(y, ...) UseMethod("ipredbagg")

does NOT dispatch to ipredbagg.numeric if y IS of class "numeric". With
the "old" class(), this one worked:

    ipredbagg.default <- function(y, ...)
    {
        # "numeric" is not an S3 class: check for regression problems and
        # the method dispatch should work for class(y) == "numeric"
        if (is.numeric(y)) {
            class(y) <- "numeric"
            return(ipredbagg(y, ...))
        } else {
            stop(paste("Do not know how to handle objects of class",
                       class(y)))
        }
    }

but now I need to call

    return(ipredbagg.numeric(y, ...))

I think Brian is right with altering UseMethod() in a way taking care of
what class() tells us.

> > ...
>
> The daily check process on all CRAN packages shows that the packages
>
>     StatDataML geoR ipred xtable
>
> now fail running their examples with methods loaded by default. The
> three latter can be traced immediately to variations on the above; the
> first fails in
>
>     if (is.factor(x)) {
>         attr(xtmp, "levels") <- NULL
>         class(xtmp) <- class(xtmp)[!class(xtmp) %in% c("ordered", "factor")]
>     }
>
> with the error message
>
>     Error in "class<-"(*tmp*, value = class(xtmp)[!class(xtmp) %in%
>     c("ordered", : Invalid replacement object to be a class string
>
> I think I can tell David how to fix this, but wasn't quite sure about
> the wording in the error message ...
>
> I assume I should wait before contacting the authors of the other 3
> packages for changing their code---or do we already have an official
> recommendation?

I can fix the code for ipred. The only problem is making it work for
both R-1.6.2 AND R-devel :-)

Torsten

> -k
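A version-portable variant of the pattern discussed above (a sketch, not the actual ipred code) routes numeric input through an explicit method call from the default method, so it does not depend on what class() reports for integer or double vectors:

```r
## Portable S3 dispatch sketch: the default method routes numeric
## input (including integer vectors) to the numeric method by an
## explicit call, avoiding any reliance on class(y).
foo <- function(y, ...) UseMethod("foo")

foo.numeric <- function(y, ...) {
  mean(y)                           # stand-in for the real work
}

foo.default <- function(y, ...) {
  if (is.numeric(y))
    return(foo.numeric(y, ...))     # explicit call, no re-dispatch needed
  stop("Do not know how to handle objects of class ", class(y)[1])
}

foo(1:10)       # integer vector: handled directly or via the fallback
foo(rnorm(5))   # double vector
```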