Re: [R-pkg-devel] problem with donttest example
On 26/01/2021 5:05 a.m., Olivier Martin (BioSP) wrote: Hi all, R CMD check --as-cran now runs \donttest examples. So what is the right way to avoid it, and what is the right way to submit to CRAN? Why are you listing the examples as \donttest? If it's because they will fail on CRAN, wouldn't they fail on a user's system? Then you should fix the examples so they don't fail. If you marked them that way because they violate some CRAN policy (e.g. writing to a user's home directory), then you should fix them. Those policies are there to protect users. If you marked them that way because they take too long to run, then it seems like CRAN shouldn't complain about the run time. I don't know if they do or not. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
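For reference, a minimal Rd sketch of the run-time case Duncan mentions; slow_simulation() is a hypothetical placeholder, and note that R CMD check --as-cran now executes the \donttest block as well, so it still has to run cleanly:

    \examples{
    # quick example, always run
    summary(rnorm(10))

    \donttest{
      # long-running example: kept out of routine example() calls, but
      # --as-cran now runs this too, so it must still work without errors
      fit <- slow_simulation(n = 1e6)   # slow_simulation() is hypothetical
    }
    }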
Re: [R-pkg-devel] Rdmacros as Suggests rather than Imports
On 22/01/2021 4:51 p.m., Dirk Eddelbuettel wrote: Hi James, On 22 January 2021 at 15:42, James Pustejovsky wrote: | Thanks very much for your input! By "patch it out" do you mean modify my | package description for the Debian distribution only? Would it work to And code / content! Can't just alter the DESCRIPTION if the Rd still call mathjaxr. | create a separate branch of my Github repo with modifications for the | Debian distro, and then submit that directly to Debian maintainers? | | I'm sorry for my naivete here--I'm just not quite sure how to proceed. I attempted to lay out two alternatives: i) where you remove mathjaxr, the Debian package just follows ii) where you do nothing but the Debian package gets altered ("patched") and then differs in that only it no longer uses mathjaxr Hth, Dirk I wonder if there's a way to define stubs for the macros in the package, and use mathjaxr versions only conditional on having it available? I suspect that should be possible, but might require changes in R, not just in the individual packages. As to the workaround in rgl: since the mathjaxr macros are defined to be similar to the existing \eqn and \deqn macros in base R, it was easy to switch back to those and I (hopefully temporarily) dropped the mathjaxr dependency completely. For a package that made more extensive use of math in the help pages it would be more work. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
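As a concrete illustration of the fallback Duncan describes, switching a help page from mathjaxr markup back to the base macros looks roughly like this; the equation is invented, and the mathjaxr macro names (\loadmathjax, \mjdeqn) are quoted from memory rather than checked against the current package:

    % with mathjaxr (pulls in the extra dependency):
    %   \loadmathjax
    %   \mjdeqn{f(x) = \lambda e^{-\lambda x}}{f(x) = lambda exp(-lambda x)}
    % with base R markup only (first argument is LaTeX, second is the text fallback):
    \deqn{f(x) = \lambda e^{-\lambda x}}{f(x) = lambda * exp(-lambda * x)}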
Re: [R-pkg-devel] checking for detritus in the temp directory ... NOTE 'RtmpvRaBK1_copy'
On 14/01/2021 10:22 a.m., Joseph Thorley wrote: Hi All I’m getting the following Note for CRAN r-devel-linux-x86_64-debian-gcc * checking for detritus in the temp directory ... NOTE Found the following files/directories: ‘RtmpvRaBK1_copy’ but not on R-hub or any of the other versions/operating systems tested: • OSX (local) - release • OSX (actions) - release • Ubuntu (actions) - 3.6, oldrel, release and devel • Windows (actions) - release • Windows (winbuilder) - devel The package is https://github.com/poissonconsulting/sims The file names in other posts discussing similar issues are different in structure, suggesting different mechanisms https://www.mail-archive.com/r-package-devel@r-project.org/msg06089.html https://stackoverflow.com/questions/62456137/r-cran-check-detritus-in-temp-directory Any help on how to debug this is appreciated. I did a search for "_copy" on your Github source, and I see a dir containing that being created in a test (but apparently removed afterwards), and also being created in the sims.Rmd vignette without being removed (though a file in the directory is removed). No idea why you see the NOTE only on some platforms. Duncan Murdoch Thanks Joe Joe Thorley PhD, RPBio Computational Biologist Poisson Consulting Ltd. 4216 Shasheen Road Nelson, BC V1L 6X1, Canada Tel: +1 250 551 2194 Email: j...@poissonconsulting.ca Web: www.poissonconsulting.ca [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
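A sketch of the cleanup pattern that avoids this NOTE, assuming the vignette (or a test) copies a directory into tempdir(); the helper and its names are illustrative, not the sims package's actual code:

    run_with_copy <- function(src_dir, code) {    # illustrative helper
      copy_dir <- file.path(tempdir(), paste0(basename(src_dir), "_copy"))
      dir.create(copy_dir)
      # remove the directory itself (not just the files inside it) when done,
      # even on error; leftover directories are what the detritus check flags
      on.exit(unlink(copy_dir, recursive = TRUE), add = TRUE)
      file.copy(list.files(src_dir, full.names = TRUE), copy_dir)
      force(code)
    }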
Re: [R-pkg-devel] CRAN check texi2dvi failure
On 10/01/2021 9:12 a.m., Paul Gilbert wrote: Thanks Enrico for the great guess, and Georgi for the details. If I omit the space as seems to be implied in some documentation, changing \verb <https://www.bankofcanada.ca/2006/03/working-paper-2006-3> . to \verb<https://www.bankofcanada.ca/2006/03/working-paper-2006-3> . then the R CMD check error "\verb ended by end of line" happens on my linux machine. I did not try replacing the space with another delimiter, which I guess would now be the correct way to use \verb. Yes, you would need to do that. The solution of adding \usepackage{url} and changing to \url{https://www.bankofcanada.ca/2006/03/working-paper-2006-3}. does seem to work. (No "on CRAN" confirmation yet but I have not had the immediate pre-test rejection that I got previously.) The risk of this change is that the URL may fail in CRAN tests; see the recent "URL checks" thread on R-devel for a discussion of ways in which this can happen even when the URL works for you. Duncan Murdoch Paul On 2021-01-10 8:04 a.m., Georgi Boshnakov wrote: > The problem is not in the Warning from the example but from > the \verb commands in the references. > You use space to delimit the argument of \verb and I was surprised > that it worked since TeX ignores spaces after commands. > Apparently, this has been an exception for \verb but now this feature > is considered a bug and has been recently fixed; see the stackexchange > question below and the relevant paragraph from LaTeX News. Probably > the linux machines have updated their TeX installations. > > In short, changing the space to, say, a + delimiter for the \verb command should fix the issue. > > Georgi Boshnakov On 2021-01-09 6:52 p.m., Enrico Schumann wrote: When I run R CMD check on my Linux machine [*], I also do not get an error. But here is a guess: The error mentions \verb, and the LaTeX manual says that \verb should be followed by a nonspace character. But in the vignette it is followed by a space. Maybe using \url in the vignette could fix the error? kind regards Enrico [*] R version 4.0.3 (2020-10-10) Platform: x86_64-pc-linux-gnu (64-bit) Running under: Ubuntu 20.10 On Sat, 09 Jan 2021, Paul Gilbert writes: I am trying to debug a problem that is appearing in the linux and Solaris checks, but not Windows or Mac checks, of my package tsfa as reported at https://cran.r-project.org/web/checks/check_results_tsfa.html The problem is with re-building the vignette ... this is package 'tsfa' version '2014.10-1' ... checking re-building of vignette outputs ... [6s/9s] WARNING Error(s) in re-building vignettes: ... Running 'texi2dvi' on 'Guide.tex' failed. LaTeX errors: ! LaTeX Error: \verb ended by end of line. ... In responding to the threat of removal I have also fixed some long-standing warnings about adding imports to the NAMESPACE. The new version builds with --as-cran giving no errors or warnings with both R-devel on win-builder (2021-01-07 r79806) and on my linux machine (R 2021-01-08 r79812 on Linux Mint 19.3 Tricia). When I submit it to CRAN the Windows build is OK but the same error happens at the 'texi2dvi' step in the debian vignette re-build. This seems to happen after an example that correctly has a warning message (about Heywood cases). In my linux build the warning happens but the message does not appear in the pdf output, so one possibility is that the handling of the warning on the CRAN Unix check machines fails to produce clean tex or suppress output. 
Another possibility is that my build using --as-cran is different from the actual CRAN build options. For example, my 00check.log shows ... * checking package vignettes in ‘inst/doc’ ... OK * checking re-building of vignette outputs ... OK * checking PDF version of manual ... OK * checking for non-standard things in the check directory ... OK ... so I am not sure if it uses texi2dvi. (I haven't used dvi myself for a long time.) I'm not sure how to debug this when I can't reproduce the error. Suggestions would be appreciated. Paul Gilbert __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
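To reproduce the failing step outside a full check, one option is to run the same texi2dvi machinery by hand on the generated .tex file; a sketch, with placeholder paths:

    # from the directory holding the vignette's generated Guide.tex
    tools::texi2dvi("Guide.tex", pdf = TRUE, clean = FALSE)
    # or rebuild all vignettes the way R CMD build/check does
    tools::buildVignettes(dir = "path/to/tsfa")   # placeholder path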
Re: [R-pkg-devel] Package used unconditionally only in testing
On 08/01/2021 9:17 a.m., Greg Freedman Ellis wrote: Hi all, I'm trying to update a package so that it passes checks run with `_R_CHECK_DEPENDS_ONLY_=TRUE`. In this package, we only use the package `httptest` during testing, but the tests are (almost) meaningless if it is not installed, so I would like to indicate that it is a required package rather than skipping tests if it is not installed. This sounds wrong. I don't know the httptest package, but since you only use it for testing, I might be interested in using your package even if I have no interest in installing httptest. Making it a hard requirement would force me to install httptest. If I move `httptest` to Depends, then I get the CRAN check note: Namespace in Imports field not imported from: ‘httptest’ All declared Imports should be used. You *definitely* shouldn't include it in Depends: that would force it onto the search list, and potentially break other things that I'm running, e.g. if they have name conflicts with it. You *probably* shouldn't include it in Imports. Why force me to load another package if I'll never use it? There's a tiny chance that would push my memory use over the edge, and my own code would fail. It should almost certainly be included in Suggests, and nowhere else. If that means your tests are skipped, you should feel free to warn the user in your test messages: but it shouldn't cause your tests to fail. I think this would best be solved by a DESCRIPTION field that indicates a package is required, but only for tests, but I do not see such a field. The only solution I can think of is to have a trivial import of `httptest` in the main package to silence the NOTE. Is there a better solution? Most users aren't going to run your tests, so they shouldn't be forced to install software that would let them do so. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
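A minimal testthat sketch of the Suggests-only arrangement Duncan recommends, where a missing httptest skips the tests (with a message) rather than failing them; the file and test names are illustrative:

    # tests/testthat/test-api.R (illustrative)
    test_that("API tests are skipped cleanly when httptest is absent", {
      # skip, rather than fail, when the Suggests-only package is missing
      skip_if_not_installed("httptest")
      # ... tests that use httptest's mocking go here ...
      expect_true(requireNamespace("httptest", quietly = TRUE))
    })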
Re: [R-pkg-devel] Sweave vignette and bibtex
On 06/01/2021 6:20 a.m., Jarrod Hadfield wrote: Hi, I have a Sweave vignette in a package I have written. When building, the citations are not put into the pdf - perhaps because two passes of the tex file are required but only one is executed. Is there a way to force two passes of the tex file? Kind Regards, Jarrod The University of Edinburgh is a charitable body, registered in Scotland, with registration number SC005336. _ You should give a bit more detail (what exactly do you have in your vignettes directory? How are you processing it?), and maybe even a reproducible example if you can extract a minimal amount from the document and the .bib file. I haven't used Sweave in a few years (I recommend switching to knitr's implementation of .Rnw processing), but I didn't have problems like yours in the past. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
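For reference, the switch Duncan recommends (processing the same .Rnw with knitr's Sweave-compatible engine) looks roughly like this; the vignette file name is a placeholder:

    In vignettes/myvignette.Rnw (placeholder name), near the top:
      %\VignetteIndexEntry{My vignette}
      %\VignetteEngine{knitr::knitr}
    In DESCRIPTION:
      Suggests: knitr
      VignetteBuilder: knitr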
Re: [R-pkg-devel] Used package not updated - needs java < V 11
On 16/12/2020 10:21 a.m., Knut Krueger wrote: On 15.12.20 at 14:37, Duncan Murdoch wrote: Thank you for your answer. You should not have @importFrom XLConnect createSheet writeWorksheet saveWorkbook in your ROxygen comments; that results in an unconditional import. Instead, you should use fully qualified calls each time, i.e. XLConnect::createSheet, XLConnect::writeWorksheet, XLConnect::saveWorkbook #' @importFrom XLConnect::createSheet, XLConnect::writeWorksheet, XLConnect::saveWorkbook This causes the error "there is no package called ‘XLConnect::createSheet,’" No, you should drop the @importFrom comment completely, and in your R code use those fully qualified forms. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] R CMD check warning on Solaris
FALSE fails <<- c(fails, file) message(gettextf("Error: processing vignette '%s' failed with diagnostics:\n%s", file, conditionMessage(e)))}) #> 32: tools:::buildVignettes(dir = "/export/home/XvmX59V/rminizinc.Rcheck/vign_test/rminizinc", ser_elibs = "/export/home/XvmX59V/Rtemp/RtmpFgaaxZ/file66b85b0113ad.rds") #> An irrecoverable exception occurred. R is aborting now ... I think this warning is coming because of the compiler 'Oracle Developer Studio 12.6' and I was not getting this warning on other operating systems (including the Oracle Solaris 10, x86, 32 bit, R-release). It would be great if you could please give suggestions on how to resolve this warning. It will help me to resubmit the package. Add a test to see whether Pandoc (of a sufficient version) is available. If it isn't, your vignettes shouldn't try to execute any code. Here's the test I use in rgl:

    if (!requireNamespace("rmarkdown") || !rmarkdown::pandoc_available("1.14")) {
      warning(call. = FALSE,
              "These vignettes assume rmarkdown and pandoc version 1.14. These were not found. Older versions will not work.")
      knitr::knit_exit()
    }

This results in very short vignettes on systems that can't produce them, but since tar.gz files include the vignettes produced on my system, this shouldn't be a big problem. You should also list the Pandoc requirement as a system requirement in your DESCRIPTION file, e.g. SystemRequirements: pandoc (>=1.14, needed for vignettes) You may have other errors; the Pandoc issue probably wouldn't cause a segfault. But if the wrong version of rmarkdown is used, all bets are off. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Used package not updated - needs java < V 11
On 15/12/2020 8:02 a.m., Knut Krueger wrote: I am using XLConnect in my package. If the computer is running Java < 11 everything works, but if not, the package cannot be used. Inside the functions there is a #' @importFrom XLConnect createSheet writeWorksheet saveWorkbook, but it is only used for the additional comfort of writing Excel sheets. The package is usable without XLConnect: if I change Imports: igraph,chron,gdata, XLConnect to Imports: igraph,chron,gdata Suggests: XLConnect then I get the error checking package dependencies ... ERROR Namespace dependency not required: ‘XLConnect’ Just now the new version is on my private repository and works with or without XLConnect depending on the Java version. How can I submit the package to CRAN until XLConnect works with Java > 11? You should not have @importFrom XLConnect createSheet writeWorksheet saveWorkbook in your ROxygen comments; that results in an unconditional import. Instead, you should use fully qualified calls each time, i.e. XLConnect::createSheet, XLConnect::writeWorksheet, XLConnect::saveWorkbook in your code. You should also wrap every use of those functions in checks like if (requireNamespace("XLConnect")) { run code } else { report that you can't run that code } and make sure none of your examples or vignettes fail if XLConnect is not present. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
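Putting the two pieces of advice together, a minimal sketch of how such a helper could look with XLConnect only in Suggests; the function name is illustrative and the XLConnect argument order is quoted from memory, so check it against the XLConnect documentation:

    export_to_excel <- function(wb, data, sheet = "results") {   # illustrative name
      if (!requireNamespace("XLConnect", quietly = TRUE)) {
        # degrade gracefully when the Suggests-only package (or a usable Java)
        # is unavailable, instead of failing at load time
        message("XLConnect is not available; skipping the Excel export.")
        return(invisible(NULL))
      }
      XLConnect::createSheet(wb, sheet)
      XLConnect::writeWorksheet(wb, data, sheet)
      XLConnect::saveWorkbook(wb)
      invisible(wb)
    }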
Re: [R-pkg-devel] CRAN packages suggesting other packages but not using them conditionally
On 12/12/2020 6:01 p.m., Ben Bolker wrote: On 12/12/20 5:50 PM, Duncan Murdoch wrote: On 12/12/2020 4:08 p.m., Spencer Graves wrote: Hi, Ben et al.: On 2020-12-12 13:43, Ben Bolker wrote: Apologies if I'm telling you something you already know: By default, fda::CRAN() uses the presence of environment variables matched by the regexp "^_R_" as a heuristic to decide whether it's being running on CRAN. testthat::skip_on_cran() calls testthat::on_cran() to look for an environment variable called NOT_CRAN equal to "true". The devtools::check() machinery automatically sets this variable. > testthat::on_cran Error: 'on_cran' is not an exported object from 'namespace:testthat' Besides, on my Mac, I get: > testthat:::on_cran() [1] TRUE My Mac is NOT CRAN, and I don't want that function to return TRUE on my computer unless I explicitly run "R CMD check --on-cran". So: fda::CRAN() depends on breakable assumptions, defaults to FALSE in an empty environment. skip_on_cran() defaults to TRUE in an empty environment (but defaults to FALSE in a devtools::check() environment). If future changes break fda::CRAN, I will have to deal with it then. I'd be happier if the CRAN maintainers would develop a procedure to make it easier for package maintainers do two things: * Include tests in their package that run longer than the time limit permitted on CRAN. That's very easy now. Just put them in a "slowtests" directory, and tell R CMD check to use that. How could it be easier? How would you do that? In "R CMD check --help" I see that one can use --test-dir= to specify the test directory, but I don't see a way to specify _additional_ test directories; short of setting a tests/ directory with CRAN-specific tests and a slowtests/ directory with *both* CRAN-specific and CRAN-excluded tests (thus duplicating files, which seems clunky), I don't see how to do this within a standard R CMD check framework (without testing a CRAN-indicating environment variable, which gets us back where we started ...) Or would you run R CMD check twice, once without and once with --test-dir=slowtests ? What I would do is have the slowtests run the regular tests. So if I want both, I run slowtests. If I want just the fast ones, I don't specify. I can't think why I wouldn't want to run the slow ones without the fast ones, but if I did, it's not too hard to figure out a scheme that runs fast by default, slow when requested, and both if you request that instead. There doesn't seem to be very much in "Writing R Extensions" about testing - a little bit in https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Package-subdirectories What am I missing? Just to clarify, the ideal would be to be able to designate a separate set of tests that were *not* run on CRAN, and to be able to run them in the same "R CMD check" pass as the CRAN-specific tests. Yes, do that as described above. There are a bunch of ways to achieve this, but I think Spencer is saying (and I agree) that it would be nice if it were there an official mechanism that made this easier (and it seems pretty easy if the CRAN maintainers were agreeable to the idea ... There is such a mechanism, and I've just described it (and not for the first time; it's also described in WRE). I think the problem is that you and Spencer are looking for something that's more complicated. It doesn't need to be complicated. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
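A sketch of the layout Duncan describes, with illustrative file names; --test-dir is the standard R CMD check option quoted above:

    # Layout (illustrative file names):
    #   tests/fast.R      quick tests, run by a plain "R CMD check pkg_1.0.tar.gz"
    #   slowtests/all.R   long-running tests, run via
    #                     "R CMD check --test-dir=slowtests pkg_1.0.tar.gz"
    # As Duncan suggests, the slowtests scripts can also source() copies of the
    # fast tests so that a single slow run covers both tiers.
    #
    # slowtests/all.R might contain something like:
    res <- replicate(200, my_long_simulation())   # my_long_simulation() is hypothetical
    stopifnot(abs(mean(res) - 0.5) < 0.01)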
Re: [R-pkg-devel] CRAN packages suggesting other packages but not using them conditionally
On 12/12/2020 4:41 p.m., Ben Bolker wrote: On 12/12/20 4:08 PM, Spencer Graves wrote: Hi, Ben et al.: On 2020-12-12 13:43, Ben Bolker wrote: Apologies if I'm telling you something you already know: By default, fda::CRAN() uses the presence of environment variables matched by the regexp "^_R_" as a heuristic to decide whether it's being running on CRAN. testthat::skip_on_cran() calls testthat::on_cran() to look for an environment variable called NOT_CRAN equal to "true". The devtools::check() machinery automatically sets this variable. > testthat::on_cran Error: 'on_cran' is not an exported object from 'namespace:testthat' on_cran() is intended to be used via testthat::skip_on_cran() (which is exported, unlike on_cran()). Besides, on my Mac, I get: > testthat:::on_cran() [1] TRUE My Mac is NOT CRAN, and I don't want that function to return TRUE on my computer unless I explicitly run "R CMD check --on-cran". The assumption of testthat is that it's going to be deployed via devtools::check(), which automatically sets the environment variable NOT_CRAN equal to 'true'. For testing on your machine, you could use Sys.setenv(NOT_CRAN="true"); testthat:::on_cran() or you could put export NOT_CRAN=true in the shell/in your testing pipeline. So: fda::CRAN() depends on breakable assumptions, defaults to FALSE in an empty environment. skip_on_cran() defaults to TRUE in an empty environment (but defaults to FALSE in a devtools::check() environment). If future changes break fda::CRAN, I will have to deal with it then. I'd be happier if the CRAN maintainers would develop a procedure to make it easier for package maintainers do two things: * Include tests in their package that run longer than the time limit permitted on CRAN. * Give error messages that the package maintainer wants to see but that should be suppressed on CRAN or when the user decides to run "R CMD check --as-cran". I agree that this would be nice. A simple mechanism would be to set an official/sanctioned/stable environment variable such as _R_ON_CRAN in all CRAN testing pipelines. What's wrong with users setting NOT_CRAN on all non-CRAN testing pipelines? Most people want the same tests in both places. Those who like writing lots of time consuming tests are the ones who shouldn't mind a small step to control them. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] CRAN packages suggesting other packages but not using them conditionally
On 12/12/2020 4:08 p.m., Spencer Graves wrote: Hi, Ben et al.: On 2020-12-12 13:43, Ben Bolker wrote: Apologies if I'm telling you something you already know: By default, fda::CRAN() uses the presence of environment variables matched by the regexp "^_R_" as a heuristic to decide whether it's being running on CRAN. testthat::skip_on_cran() calls testthat::on_cran() to look for an environment variable called NOT_CRAN equal to "true". The devtools::check() machinery automatically sets this variable. > testthat::on_cran Error: 'on_cran' is not an exported object from 'namespace:testthat' Besides, on my Mac, I get: > testthat:::on_cran() [1] TRUE My Mac is NOT CRAN, and I don't want that function to return TRUE on my computer unless I explicitly run "R CMD check --on-cran". So: fda::CRAN() depends on breakable assumptions, defaults to FALSE in an empty environment. skip_on_cran() defaults to TRUE in an empty environment (but defaults to FALSE in a devtools::check() environment). If future changes break fda::CRAN, I will have to deal with it then. I'd be happier if the CRAN maintainers would develop a procedure to make it easier for package maintainers do two things: * Include tests in their package that run longer than the time limit permitted on CRAN. That's very easy now. Just put them in a "slowtests" directory, and tell R CMD check to use that. How could it be easier? Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] How to avoid R CMD check warning for documentation of non-package functions?
On 02/12/2020 12:57 p.m., Konrad Rudolph wrote: My package provides infrastructure support for callback functions defined in special environments in user code. They are conceptually similar (in fact, almost identical) to the `.onLoad` etc. “hooks for namespace events” in base R [1]. Now I’m adding documentation for these functions, via the following Rd code (or the equivalent roxygen2 code annotating a `NULL` value): ``` \name{topicname} \alias{topicname} \alias{onload} \title{Hooks for environment events} \usage{ onload(env) } \arguments{ \item{env}{the environment} } \description{ Short description } ``` Unfortunately, this causes an `R CMD check` warning: Functions or methods with usage in documentation object 'topicname' but not in code: ‘onload’ Right: this function does not exist in my package, and it *should not* exist in the package. Yet I do need to document it for users. What is the recommended way for doing so? In fact, from my reading of the R source, the base R documentation of ‘ns-hooks’ doesn’t seem to do anything special, and would presumably also cause this warning. I’m open to doing this differently, but I’d strongly prefer if these functions had their own help topic, with their own “usage” section. I don’t just want to add them as a custom section to the package documentation topic if this is at all avoidable. I haven't tried this, but I believe if you define functions with the right name and header in your package but don't export them the warning will go away. If that doesn't work (or defining those causes other issues), a more involved workaround would be to change the \docType{} declaration for the help page. \docType{package} is the most free-form, but you might get warned if you have two of them. \docType{data} might be flexible enough. If you do this, you won't use \usage{} or \arguments{}, you'll put together your own sections using \section{Usage}{ ... } and \section{Arguments}{ ... } and try to get the formatting right. Duncan Murdoch [1]: https://stat.ethz.ch/R-manual/R-devel/library/base/html/ns-hooks.html PS: A note on API design, I considered doing this differently: rather than define hooks via “special names”, users would define them by passing callbacks to a call to a package function; e.g. `mypkg::define_onload(function (env) { … })`. This would be conceptually similar to R’s `setHook` [2]. However, from the user’s point of view there’s no advantage to doing it this way, and it’s more verbose. Defining callbacks via special names has ample precedence, both in R and in other languages. And I don’t think `R CMD check` warnings should dictate API design in this manner. [2] https://stat.ethz.ch/R-manual/R-devel/library/base/html/userhooks.html __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
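A hedged Rd sketch of the second workaround Duncan mentions (he notes he has not tried these himself): change the \docType and replace \usage{}/\arguments{} with hand-written sections, so the checker does not look for a matching object in the package:

    \name{topicname}
    \alias{topicname}
    \alias{onload}
    \docType{data}
    \title{Hooks for environment events}
    \description{Short description.}
    \section{Usage}{\preformatted{onload(env)}}
    \section{Arguments}{
      \describe{
        \item{\code{env}}{the environment}
      }
    }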
Re: [R-pkg-devel] C++11 requirements for package dependencies
On 30/11/2020 11:54 a.m., Duncan Murdoch wrote: On 30/11/2020 11:31 a.m., Dirk Eddelbuettel wrote: On 30 November 2020 at 11:27, Duncan Murdoch wrote: | I think that C++11 isn't a requirement of RcppArmadillo, it's an option It is as of the 10.* series of Armadillo and hence RcppArmadillo 0.10.* I was going to complain that you should include SystemRequirements: C++11, but on reading more closely, your Makevars.in specifies this in a different way. So I guess the issue is with the Github Actions setup, which didn't spot the requirement. Sorry for the noise... I think the remotes::system_requirements() function is used in setting up the Github Action files; it doesn't appear to recognize the need of RcppArmadillo for C++11. I've posted this as an issue on Github. Hopefully I'm not wrong about this too... Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] C++11 requirements for package dependencies
On 30/11/2020 11:31 a.m., Dirk Eddelbuettel wrote: On 30 November 2020 at 11:27, Duncan Murdoch wrote: | I think that C++11 isn't a requirement of RcppArmadillo, it's an option It is as of the 10.* series of Armadillo and hence RcppArmadillo 0.10.* I was going to complain that you should include SystemRequirements: C++11, but on reading more closely, your Makevars.in specifies this in a different way. So I guess the issue is with the Github Actions setup, which didn't spot the requirement. Sorry for the noise... Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] C++11 requirements for package dependencies
I think that C++11 isn't a requirement of RcppArmadillo, it's an option that is used if available. (Assuming you are using the CRAN version, not an experimental/devel version.) A user of the header file can include #define ARMA_USE_CXX11 which would make it a system requirement of whatever package did that, even though it isn't a system requirement for RcppArmadillo. But I could be wrong about this... Duncan Murdoch On 30/11/2020 11:06 a.m., Mark Clements wrote: [Apologies for cross-posting] A colleague uses a package I maintain (rstpm2) as a dependency in their package (rsimsum) with testing using GitHub Actions. They found that testing failed against R versions 3.3, 3.4 and 3.5 because recent versions of RcppArmadillo (which is a dependency in rstpm2) require C++11. As a dependency diagram: rsimsum --> rstpm2 --> RcppArmadillo Should I update rstpm2 to include "CXX_STD = CXX11" in the Makevars and Makevars.win files and add "SystemRequirements: C++11" to the DESCRIPTION, or is there a simple way in GitHub Actions to use C++11 for older versions of R? Moreover, as a principle, should a package need to change the Makevars and DESCRIPTION files to suit the most recent updates of their dependencies? I would have thought that such a need would break many packages. Sincerely, Mark. När du skickar e-post till Karolinska Institutet (KI) innebär detta att KI kommer att behandla dina personuppgifter. Här finns information om hur KI behandlar personuppgifter<https://ki.se/medarbetare/integritetsskyddspolicy>. Sending email to Karolinska Institutet (KI) will result in KI processing your personal data. You can read more about KI’s processing of personal data here<https://ki.se/en/staff/data-protection-policy>. __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
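For reference, the two declarations under discussion, written out the way Writing R Extensions describes them:

    In DESCRIPTION:
      SystemRequirements: C++11
    In src/Makevars and src/Makevars.win:
      CXX_STD = CXX11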
Re: [R-pkg-devel] devtools::release() does not release
Why not use the CRAN submission web page, as documented here: https://cran.r-project.org/web/packages/policies.html#Submission? Duncan Murdoch On 26/11/2020 2:57 p.m., Gábor Csárdi wrote: Why not submit a bug report at the devtools repository? ❯ packageDescription("devtools")$BugReports [1] "https://github.com/r-lib/devtools/issues; Gabor On Thu, Nov 26, 2020 at 7:50 PM Spencer Graves wrote: Hi folks, devtools::release() gave the following errors: ... Ready to submit fda (5.1.7) to CRAN? 1: Not yet 2: Absolutely 3: U... Maybe? Selection: 2 Uploading package & comments Confirming submission Error in if (new_url$query$submit == "1") { : argument is of length zero In addition: Warning messages: 1: In charToRaw(enc2utf8(val)) : argument should be a character vector of length 1 all but the first element will be ignored 2: In charToRaw(enc2utf8(val)) : argument should be a character vector of length 1 all but the first element will be ignored Jim Ramsay got this after setwd to what seemed to be the appropriate working directory and then running devtools::release(), as noted above. As near as both of us can determine, we've followed essentially the same protocol here as with other package submissions. In case you want the entire devtools::release() transcript, it appears below running under Big Sur on a Mac. Suggestions? Thanks, Spencer Graves & Jim Ramsay —— devtools::release() Have you checked for spelling errors (with `spell_check()`)? 1: Not yet 2: Nope 3: Absolutely Selection: 3 Have you run `R CMD check` locally? 1: Yes 2: I forget 3: No Selection: 1 ── Running additional devtools checks for fda Checking version number has three components... OK Checking dependencies don't rely on dev versions... OK Checking DESCRIPTION doesn't have Remotes field... OK ── Were devtool's checks successful? 1: I forget 2: No 3: Absolutely Selection: 3 Have you fixed all existing problems at https://cran.rstudio.com//web/checks/check_results_fda.html ? 1: Yes 2: Not yet 3: No Selection: 1 Have you checked on R-hub (with `check_rhub()`)? 1: Not yet 2: I forget 3: Definitely Selection: 3 Have you checked on win-builder (with `check_win_devel()`)? 1: No 2: Nope 3: Absolutely Selection: 3 Have you checked the 67 reverse dependencies (with the revdepcheck package)? 1: Definitely 2: No 3: U... Maybe? Selection: 1 Have you updated `NEWS.md` file? 1: U... Maybe? 2: No 3: Yeah Selection: 3 Have you updated `DESCRIPTION`? 1: Yup 2: Nope 3: I forget Selection: 1 Have you updated `cran-comments.md?` 1: Nope 2: Not yet 3: Yeah Selection: 3 ── Running Git checks for fda Current branch: master Checking uncommitted files... WARNING: All files should be tracked and committed before release. Please add and commit. Checking synchronisation with remote branch... ERROR: Error in 'git2r_remote_fetch': unsupported URL protocol ── Were Git checks successful? 1: Not yet 2: Nope 3: Yup Selection: 3 Is your email address ram...@psych.mcgill.ca? 1: Yup 2: I forget 3: U... Maybe? Selection: 1 Building ✓ checking for file ‘/Users/jamesramsay/Documents/R/fda_work/fda/DESCRIPTION’ (437ms) ─ preparing ‘fda’: (5.8s) ✓ checking DESCRIPTION meta-information ... ─ checking for LF line-endings in source and make files and shell scripts (1.6s) ─ checking for empty or unneeded directories ─ building ‘fda_5.1.7.tar.gz’ Submitting file: /var/folders/sc/6zzfzqbs5w7dl8q5g18tlqfcgn/T//RtmpbcSyMD/fda_5.1.7.tar.gz File size: 102.9 Mb Ready to submit fda (5.1.7) to CRAN? 1: Not yet 2: Absolutely 3: U... Maybe? 
Selection: 2 Uploading package & comments Confirming submission Error in if (new_url$query$submit == "1") { : argument is of length zero In addition: Warning messages: 1: In charToRaw(enc2utf8(val)) : argument should be a character vector of length 1 all but the first element will be ignored 2: In charToRaw(enc2utf8(val)) : argument should be a character vector of length 1 all but the first element will be ignored __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Failing vignette engine for package rasciidoc on solaris
On 23/11/2020 7:07 a.m., Andreas Dominik Cullmann via R-package-devel wrote: Dear List, one of my packages, rasciidoc (a simple wrapper to 'knitr' and 'asciidoc'), includes a minimal vignette engine. When submitting the package with a vignette using that engine, the vignette is re-built on all platforms except for r-patched-solaris-x86: https://cran.r-project.org/web/checks/check_results_rasciidoc.html https://www.r-project.org/nosvn/R.check/r-patched-solaris-x86/rasciidoc-00check.html Maybe rasciidoc's vignette engine is somewhat superfluous, but since it works for all platforms except solaris, I kind of would like to use it. I cannot reproduce the error using rhub: ## Test environments - R-hub solaris-x86-patched-ods (r-release) ## R CMD check results ❯ On solaris-x86-patched-ods (r-release) checking CRAN incoming feasibility ... WARNING Maintainer: ‘Andreas Dominik Cullmann ’ Insufficient package version (submitted: 2.2.1, existing: 2.2.1) Days since last update: 5 ❯ On solaris-x86-patched-ods (r-release) checking top-level files ... NOTE Files ‘README.md’ or ‘NEWS.md’ cannot be checked without ‘pandoc’ being installed. ❯ On solaris-x86-patched-ods (r-release) checking examples ... NOTE Examples with CPU (user + system) or elapsed time > 5s user system elapsed rasciidoc 2.724 0.352 48.75 0 errors ✔ | 1 warning ✖ | 2 notes ✖ And I happen to have no solaris box at hand for testing. Does anybody have any suggestion on how I could tackle this? The error message says: "Can't find program `source-highlight`. Please install first (http://www.gnu.org/software/src-highlite/)." You have listed this in SystemRequirements as "recommended", but even if you had listed it as required, your build shouldn't fail if it is not found: it should test for that program, and continue on if it is not there. For example, it could output a vignette containing nothing except the warning message that source-highlight is needed to build the vignette. This is described in Section 1.6, "Writing portable packages", of Writing R Extensions. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
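A sketch of the guard Duncan describes; where exactly it belongs in rasciidoc's engine is an assumption, and the stub output file name is a placeholder:

    if (!nzchar(Sys.which("source-highlight"))) {
      warning("The 'source-highlight' program was not found; ",
              "writing a stub vignette instead of the rendered one.",
              call. = FALSE)
      writeLines("source-highlight is required to build this vignette.",
                 "stub-output.html")   # placeholder output file
    } else {
      # ... normal asciidoc / source-highlight processing ...
    }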
Re: [R-pkg-devel] Having shiny as an optional dependency
On 16/11/2020 4:55 a.m., Kamil Stachowski wrote: I did briefly wonder why your reply to me cited an email from Akshit Achara who asked about third party C++ API. I assumed it was because the solution to his problem and mine was the same. In the meantime, I got a reply from Kurt Hornik who said I have a top-level call soundcorrsGUI <- shiny::shinyApp (ui, server) so what I should really do is to move "shiny" from "Suggests" to "Imports". I don't want to do this, however, because the GUI is just an addition, the package is perfectly usable without it. I wrapped "soundcorrsGUI" in a function: soundcorrsGUI <- function () { if (!requireNamespace("shiny",quietly=T) || !requireNamespace("shinyjqui",quietly=T)) stop ("\"soundcorrsGUI\" requires packages \"shiny\" and \"shinyjqui\".") shiny::shinyApp (ui, server) } and this seems to solve the problem with "shiny", but now I have a problem with "shinyjqui". I actually only need one function from it: in the call to "shinyApp" above, ui <- shiny::navbarPage ([…] ui.soundchanges […]) where ui.soundchanges <- shiny::fluidPage ([…] shinyjqui::sortableCheckboxGroupInput […]) I tried wrapping "ui.soundchanges" in a function in the same way as I did with "soundcorrsGUI" but this doesn't help. Is there a way to make it work without turning "shiny" and "shinyjqui" into obligatory dependencies? If you never call shinyjqui::sortableCheckboxGroupInput except in that one place, then you should put that call within the requireNamespace test. Things I'd change there: - Do not use T, use TRUE. - Wrap *everything* that requires those packages in the requireNamespace test. - Make sure your example code in help pages never calls that function unless shiny and shinyjqui are present, by a test similar to the above but a positive one: if (requireNamespace("shiny") && requireNamespace("shinyjqui")) { # example code } Duncan Murdoch Best wishes, Kamil Stachowski On Mon, 16 Nov 2020, 00:55 Uwe Ligges, wrote: On 14.11.2020 19:53, Kamil Stachowski wrote: This is actually what I do. Before you replied, I was told to write to CRAN and explain, and they will decide whether the packages will be installed on CRAN machines. Thank you for your help nonetheless. No, I told you about a "third party C++ API" you asked for, not about an R package. R packages from BioC + CRAN should be availabe on CRAN, in that case it is a problem in your package. Best, Uwe Ligges Best, Kamil On Sat, 14 Nov 2020 at 19:39, Gábor Csárdi wrote: You probably import functions from shiny. Don't do that, refer to them with the `::` operator instead. Gabor On Sat, Nov 14, 2020 at 6:12 PM Kamil Stachowski wrote: I wrote a package that contains a graphical interface written with packages "shiny" and "shinyjqui". My package can also be used from the CLI, so I listed both "shiny" and "shinyjqui" as optional dependencies. I ran R CMD check --as-cran on my computer in R 3.6.3 and devel, and both passed without any errors or warnings. However, when I tried uploading the package to CRAN, I got this error: Error in loadNamespace(x) : there is no package called ‘shiny’ Error: unable to load R code in package ‘soundcorrs’ Execution halted ERROR: lazy loading failed for package ‘soundcorrs’ How can I fix this? 
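Pulling the advice in this thread together, a sketch of what the guarded GUI launcher could look like, with shiny and shinyjqui listed only in Suggests and nothing shiny-related created at the package's top level; the UI, server, and sortableCheckboxGroupInput() arguments are placeholders, not soundcorrs's actual code:

    soundcorrsGUI <- function() {
      if (!requireNamespace("shiny", quietly = TRUE) ||
          !requireNamespace("shinyjqui", quietly = TRUE))
        stop("soundcorrsGUI() requires the \"shiny\" and \"shinyjqui\" packages.",
             call. = FALSE)
      # build the UI inside the function, not at top level, so neither package
      # is touched when the package is installed or loaded
      ui <- shiny::fluidPage(
        shinyjqui::sortableCheckboxGroupInput("changes", "Sound changes",
                                              choices = c("a > b", "c > d"))  # placeholder choices
      )
      server <- function(input, output, session) {}   # placeholder server
      shiny::shinyApp(ui, server)
    }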
Re: [R-pkg-devel] Error in loadNamespace(x) : there is no package called 'formatR'
On 13/11/2020 4:32 p.m., Gábor Csárdi wrote: On Fri, Nov 13, 2020 at 9:02 PM Duncan Murdoch wrote: [...] Things may have changed since Henrik and I wrote the code, but his description matches my understanding as well (and I think he's contributed more recently than I have). The way non-Sweave vignettes work is that some packages register themselves to recognize vignette files in the vignettes directory. The default one recognizes .Rnw files as vignettes (and a few other extensions too); the knitr::rmarkdown one recognizes .Rmd files and some others. The only way for a package's registration code to be called is for it to be listed as a VignetteBuilder. See ?tools::vignetteEngine for details of the engine. Can one of you please fix WRE then? The part that says "...then knitr provides the engine but both knitr and rmarkdown are needed for using it, so both these packages need to be in the ‘VignetteBuilder’ field..." No, neither of us are members of R Core. Only R Core can edit the manuals. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Error in loadNamespace(x) : there is no package called 'formatR'
On 13/11/2020 1:48 p.m., Gábor Csárdi wrote: On Fri, Nov 13, 2020 at 6:10 PM Henrik Bengtsson wrote: I'm quite sure you want to use the following: Suggests: knitr, rmarkdown, formatR VignetteBuilder: knitr So this means that WRE is wrong? It says: > "Note that if, for example, a vignette has engine ‘knitr::rmarkdown’, > then knitr provides the engine but both knitr and rmarkdown are needed > for using it, so both these packages need to be in the > ‘VignetteBuilder’ field and at least suggested (as rmarkdown is only > suggested by knitr, and hence not available automatically along with > it). Many packages using knitr also need the package formatR which it > suggests and so the user package needs to do so too and include this > in ‘VignetteBuilder’." Things may have changed since Henrik and I wrote the code, but his description matches my understanding as well (and I think he's contributed more recently than I have). The way non-Sweave vignettes work is that some packages register themselves to recognize vignette files in the vignettes directory. The default one recognizes .Rnw files as vignettes (and a few other extensions too); the knitr::rmarkdown one recognizes .Rmd files and some others. The only way for a package's registration code to be called is for it to be listed as a VignetteBuilder. See ?tools::vignetteEngine for details of the engine. The rmarkdown package doesn't register any vignette engines, so it doesn't need to be in VignetteBuilder. Same for formatR. It's fairly common to have a package only used in the vignette, so you list it in Suggests. But that wouldn't apply only to rmarkdown and formatR, there are lots of other examples. However, I'd guess it's pretty common to forget to include rmarkdown and formatR, since they may not be explicitly used. Then putting them in the VignetteBuilder field will trigger an error if they are not also in Suggests. Duncan Murdoch My understanding is that R CMD build (and possibly other commands/functions as well) checks for the packages in 'VignetteBulider`, this is why you need to include rmarkdown and formatR as well. See the source code in build.R and e.g. https://bugs.r-project.org/bugzilla/show_bug.cgi?id=15775 etc. __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Import package countreg that is not on CRAN
Yes, thanks, I missed that option. On 13/11/2020 10:11 a.m., Henrik Bengtsson wrote: You can change your package from using: Imports: countreg to use: Suggests: countreg For this to work, you'll have to update your code, package tests, examples, and vignettes, to run conditionally on 'countreg' being installed, e.g. if (requireNamespace("countreg", quietly = TRUE)) { ... } Whether or not this makes sense depends on how central 'countreg' is to your package. If it is only needed optionally and for a small part of your functionality, then it's doable but if it's a mission-critical dependency, it might be too much of a hack. This is in compliance with CRAN and in agreement with what Uwe says too. I've used this myself in for instance https://cran.r-project.org/web/package=aroma.core where small parts of the functionality depend on packages not in the mainstream (= CRAN & Bioconductor) repositories. This allows those who wish to use alternative methods, if they want to go the extra mile to install them. Also, if they're popular enough it might nudge the maintainers of those enough to submit to CRAN or Bioconductor - a process that might take years, if at all. /Henrik On Fri, Nov 13, 2020 at 3:23 AM Duncan Murdoch wrote: On 13/11/2020 3:10 a.m., Jason Luo wrote: Hi, I'm submitting a new package (https://github.com/Penncil/pda/) to CRAN. It relies on some function (zerotrunc and hurdle in R/ODAP.R) from countreg ( https://rdrr.io/rforge/countreg/) , which is not on CRAN. The submission returns error as below https://win-builder.r-project.org/incoming_pretest/pda_1.0_20201113_083442/Debian/00check.log Seems the r-forge repo is identified in the DESCRIPTION Additional_repositories, but countreg is still not available. I assume this is not a rare problem but didn't find useful solutions online. Any suggestions? Thanks! If countreg is not in one of the mainstream repositories (CRAN or Bioconductor), then it may not have been subject to careful testing, so CRAN sees it as unreliable. Since your package depends on it, yours is also unreliable, so CRAN won't publish it. I don't know anything about the pda or countreg packages, so this is general advice on what you could do, and may not be applicable here: - you could take over maintenance of countreg (if its current maintainer agrees), and put in the work to get it on CRAN. - you could copy parts of countreg to your own package (if its license allows that), and drop your dependence on it. - you could substitute some other CRAN package that provides equivalent functionality and depend on that instead. - you could drop the parts of your package that need countreg, and submit a smaller package to CRAN without that dependency. - you could publicize that your package is on Github, and give up on publishing it on CRAN. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Import package countreg that is not on CRAN
On 13/11/2020 3:10 a.m., Jason Luo wrote: Hi, I'm submitting a new package (https://github.com/Penncil/pda/) to CRAN. It relies on some function (zerotrunc and hurdle in R/ODAP.R) from countreg ( https://rdrr.io/rforge/countreg/) , which is not on CRAN. The submission returns error as below https://win-builder.r-project.org/incoming_pretest/pda_1.0_20201113_083442/Debian/00check.log Seems the r-forge repo is identified in the DESCRIPTION Additional_repositories, but countreg is still not available. I assume this is not a rare problem but didn't find useful solutions online. Any suggestions? Thanks! If countreg is not in one of the mainstream repositories (CRAN or Bioconductor), then it may not have been subject to careful testing, so CRAN sees it as unreliable. Since your package depends on it, yours is also unreliable, so CRAN won't publish it. I don't know anything about the pda or countreg packages, so this is general advice on what you could do, and may not be applicable here: - you could take over maintenance of countreg (if its current maintainer agrees), and put in the work to get it on CRAN. - you could copy parts of countreg to your own package (if its license allows that), and drop your dependence on it. - you could substitute some other CRAN package that provides equivalent functionality and depend on that instead. - you could drop the parts of your package that need countreg, and submit a smaller package to CRAN without that dependency. - you could publicize that your package is on Github, and give up on publishing it on CRAN. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Strange error from CRAN on package submission
Actually I think it is a bug in the check code. I've just posted about this on the R-devel list. Duncan Murdoch On 12/11/2020 10:13 a.m., Martin Morgan wrote: This seems more like a problem with the CRAN test machine, with the movMF package installed with flexmix available but loaded with flexmix not available, maybe interacting with a caching mechanism used by the methods package to avoid re-computing methods tables? Otherwise how would movMF ever know to create the flexmix class / method? It seems like this could cause problems for the user if they installed movMV with flexmix available, but removed flexmix. This seems like a subtler variation of 'I installed package A but then removed dependency B and now A doesn't work', which could be a bug in R's remove.packages() but I tried to emulate the scenario of installing movMF and then removing flexmix in an interactive session, and then looking for the warning reported below. I was not successful, but the build report with the error is no longer available so I don't know what I'm looking for... Martin Morgan On 11/11/20, 4:44 PM, "R-package-devel on behalf of Duncan Murdoch" wrote: Here's what I think is happening. In the movMF:::.onLoad function there's a test whether flexmix is installed. If found, then it is loaded and some methods are set. (I'm not sure what caused flexmix to be installed: I didn't intentionally install it, but it ended up in there when I installed enough stuff to check Mercator.) In the R-devel --as-cran checks, some checks are run with only strong dependencies of your package visible. Somehow I think that .onLoad function sees flexmix and loads it, but then some other part of the check can't see it. A workaround is to add flexmix to your Imports clause. This is a strong enough dependency to make it visible, and the error goes away. HOWEVER, to me this is pretty clearly an R-devel bug: you have no control over methods set by packages that you don't even use, so you shouldn't have to change your dependency lists if one of them sets a method that you're using. Duncan Murdoch On 11/11/2020 3:31 p.m., Kevin R. Coombes wrote: > Oh, I forgot to mention explicitly that checking (with --as-cran) on the > development version of R on Windows also produces no errors or warnings. > > On 11/11/2020 1:39 PM, Kevin R. Coombes wrote: >> Hi Duncan, >> >> I just sent a longer version of this message, but it looks to me like >> the underlying issue is the fact that flexmix and Mercator both define >> and export "show" methods for their S4 classes. What confuses me is >> why the NAMESPACE of a package that is merely Suggest'ed by something >> several layers down the hierarchy should get attached and cause an >> issue like this one. (The attached NAMESPACE happens in current >> versions of R.) >> >> Thanks, >>Kevin >> >> On 11/11/2020 1:07 PM, Duncan Murdoch wrote: >>> Okay, I've tried testing on my Mac with R 4.0.3 and R-devel for the >>> new one, 4.0.3 for the CRAN version. >>> >>> I'm not seeing any check error with the CRAN version. I get an error >>> trying to check 0.11.4 from R-forge because I don't have flexmix >>> installed. If I take flexmix out of the Suggests list, it checks >>> with no error on 4.0.3, but I get the error you saw on R-devel when >>> checked with --as-cran. >>> >>> I tried debugging this, and narrowed it down a bit. It happens when >>> your package is installed, in particular in the do_install_source() >>> function in src/library/tools/R/install.R. 
But that function runs a >>> new R instance, and I didn't get to debugging that. I'll try again >>> later today if nobody else figures it out. >>> >>> Duncan Murdoch >>> >>> >>> >>> >>> On 11/11/2020 12:03 p.m., Kevin R. Coombes wrote: >>>> Hi Duncan, >>>> >>>> Oops; I didn't realize I had forgotten to push updates to the OOMPA web >>>> site. >>>> >>>> The code for Mercator is contained as part of the Thresher project in >>>> the subversion repository on R-Forge. >>>> (https://r-forge.r-project.org/projects/thresher/) It's under >>>> pkg/Mercator below that URL >>>> >>>> Thanks, >>>> Kevin >>>> >>>> On 11/11/202
Re: [R-pkg-devel] Strange error from CRAN on package submission
Okay, I've tried testing on my Mac with R 4.0.3 and R-devel for the new one, 4.0.3 for the CRAN version. I'm not seeing any check error with the CRAN version. I get an error trying to check 0.11.4 from R-forge because I don't have flexmix installed. If I take flexmix out of the Suggests list, it checks with no error on 4.0.3, but I get the error you saw on R-devel when checked with --as-cran. I tried debugging this, and narrowed it down a bit. It happens when your package is installed, in particular in the do_install_source() function in src/library/tools/R/install.R. But that function runs a new R instance, and I didn't get to debugging that. I'll try again later today if nobody else figures it out. Duncan Murdoch On 11/11/2020 12:03 p.m., Kevin R. Coombes wrote: Hi Duncan, Oops; I didn't realize I had forgotten to push updates to the OOMPA web site. The code for Mercator is contained as part of the Thresher project in the subversion repository on R-Forge. (https://r-forge.r-project.org/projects/thresher/) It's under pkg/Mercator below that URL Thanks, Kevin On 11/11/2020 11:30 AM, Duncan Murdoch wrote: Uwe suggested you suggest flexmix, but I see below you already tried that. I'd like to take a look, but I can't find your package. The existing version on CRAN gives the URL as http://oompa.r-forge.r-project.org/, but I can't see it mentioned there. Duncan Murdoch On 11/11/2020 8:44 a.m., Kevin R. Coombes wrote: Hi, I am trying to figure out how to fix warnings from two of the CRAN machines on the submission of an update to a package. The only change to my package was to add a "show" method to one of the S4 classes, which was requested by a reviewer of the paper we submitted. The inability to get this updated package into CRAN is the only thing holding up the revision (and probable acceptance) of the manuscript. The same "warnings"s were found in the previous version. The package is called Mercator, and the CRAN check results from the last version are here: https://cran.r-project.org/web/checks/check_results_Mercator.html I get warnings from the two fedora machine instances (clang and gcc). They both report Check: whether package can be installed. Result: WARN Found the following significant warnings: Warning: namespace ‘flexmix’ is not available and has been replaced > > Check: data for non-ASCII characters Result: WARN Warning: namespace 'flexmix' is not available and has been replaced by .GlobalEnv when processing object '' The relationships in the DESCRIPTION files are: 1. Mercator depends on Thresher 2. Thresher imports moVMF 3. moMVF suggests flexmix On my Windows machine, the package builds and installs with no errors or warnings even if flexmix is not available (which I believe to be the correct behavior). On R-Forge, both the Windows and LINUX versions build and install with no errors or warnings. On R-Hub, tested on multiple LINUX versions, the package builds and installs with no errors or warnings. And flexmix is still clearly available from CRAN: https://cran.r-project.org/web/packages/flexmix/index.html In the latest attempt to get things to work, I added Suggests: flexmix into the DESCRIPTION file for Mercator, but this didn't help fix the problem on CRAN. Is there anything I can do to fix this problem (other than moan here on this list and hope that CRAN can just install flexmix on those machines)? 
Thanks in advance for your help, Kevin [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Strange error from CRAN on package submission
Uwe suggested you suggest flexmix, but I see below you already tried that. I'd like to take a look, but I can't find your package. The existing version on CRAN gives the URL as http://oompa.r-forge.r-project.org/, but I can't see it mentioned there. Duncan Murdoch On 11/11/2020 8:44 a.m., Kevin R. Coombes wrote: Hi, I am trying to figure out how to fix warnings from two of the CRAN machines on the submission of an update to a package. The only change to my package was to add a "show" method to one of the S4 classes, which was requested by a reviewer of the paper we submitted. The inability to get this updated package into CRAN is the only thing holding up the revision (and probable acceptance) of the manuscript. The same "warnings"s were found in the previous version. The package is called Mercator, and the CRAN check results from the last version are here: https://cran.r-project.org/web/checks/check_results_Mercator.html I get warnings from the two fedora machine instances (clang and gcc). They both report Check: whether package can be installed. Result: WARN Found the following significant warnings: Warning: namespace ‘flexmix’ is not available and has been replaced > > Check: data for non-ASCII characters Result: WARN Warning: namespace 'flexmix' is not available and has been replaced by .GlobalEnv when processing object '' The relationships in the DESCRIPTION files are: 1. Mercator depends on Thresher 2. Thresher imports moVMF 3. moMVF suggests flexmix On my Windows machine, the package builds and installs with no errors or warnings even if flexmix is not available (which I believe to be the correct behavior). On R-Forge, both the Windows and LINUX versions build and install with no errors or warnings. On R-Hub, tested on multiple LINUX versions, the package builds and installs with no errors or warnings. And flexmix is still clearly available from CRAN: https://cran.r-project.org/web/packages/flexmix/index.html In the latest attempt to get things to work, I added Suggests: flexmix into the DESCRIPTION file for Mercator, but this didn't help fix the problem on CRAN. Is there anything I can do to fix this problem (other than moan here on this list and hope that CRAN can just install flexmix on those machines)? Thanks in advance for your help, Kevin [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Error during automatic check for vignette re-building - The magick package is required to crop
On 05/11/2020 10:55 a.m., Pablo Fonseca wrote: Dear all, Currently I am trying to update my package (GALLO) with some small edits in the code. However, the package is not passing the automatic check. I am getting only two warnings: The first one is a warning related to the maintainer email (my email, in this case), which is correct. I really think that this is not the cause of the package failure during the automatic checking. The first one is actually about your version number: you called it version 1.0, but CRAN already has 1.0. You need to increase the number. The second warning is a problem during the vignette rebuilding. The warning is "The magick package is required to crop". In my first submission I didn't experience this error. In the recent submissions for updates, I included the magick package as a dependency of the package. However, this didn't fix the issue. Additionally, I can't reproduce the errors on my machines, even when running R CMD check --as-cran. The error seems to be related to lines 465-495 of the vignette. I double checked the code and introduced some small changes (such as reducing the figure dimensions). I am not sure about this one; I'd need to look at the package to check. Is it on Github? Duncan Murdoch Has anyone ever faced a similar issue? However, it seems ineffective. Is there any possibility that this is a false negative? I am sending below the links for the check logs. package GALLO_1.0.tar.gz does not pass the incoming checks automatically, please see the following pre-tests: Windows: <https://win-builder.r-project.org/incoming_pretest/GALLO_1.0_20201105_154946/Windows/00check.log> Status: 2 WARNINGs Debian: <https://win-builder.r-project.org/incoming_pretest/GALLO_1.0_20201105_154946/Debian/00check.log> Status: 2 WARNINGs Last released version's CRAN status: ERROR: 1, WARN: 11 See: <https://CRAN.R-project.org/web/checks/check_results_GALLO.html> Kind regards. Pablo Augusto de Souza Fonseca, Ph.D. Postdoctoral Fellow at University of Guelph Centre for Genetic Improvement of Livestock E pfons...@uoguelph.ca <mailto:pfons...@uoguelph.ca> W http://animalbiosciences.uoguelph.ca/abscpeople/pfonseca <http://animalbiosciences.uoguelph.ca/abscpeople/pfonseca> [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
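If the "required to crop" warning is coming from knitr's figure-cropping hook (an assumption on my part; I haven't looked at the GALLO vignette), one robust approach is to enable cropping only when magick is actually available, e.g. in the vignette's setup chunk:

has_magick <- requireNamespace("magick", quietly = TRUE)
if (has_magick) knitr::knit_hooks$set(crop = knitr::hook_pdfcrop)
knitr::opts_chunk$set(crop = has_magick)

That way figures are simply left uncropped on machines without magick instead of generating a warning during vignette re-building.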
Re: [R-pkg-devel] import with except(ion)
On 30/10/2020 2:45 p.m., Göran Broström wrote: My CRAN package eha depends on the survival package, and that creates problems with innocent users: It is about the 'frailty' function (mainly). While (after 'library(eha)') f1 <- coxph(Surv(time, status) ~ age + frailty(inst), data = lung) produces what you would expect (a frailty survival analysis), the use of the coxreg function from eha f2 <- coxreg(Surv(time, status) ~ age + frailty(inst), data = lung) produces (almost) nonsense. That's because the survival::frailty function essentially returns its input and coxreg is happy with that, treats it as an ordinary numeric (or factor) covariate, and nonsense is produced, but some users think otherwise. (Maybe it would be better to introduce frailty in a separate argument?) I want to prevent this to happen, but I do not understand how to do it in the best way. I tried to move survival from Depends: to Imports: and adding import(survival, except = c(frailty, cluster)) to NAMESPACE. This had the side effect that a user must qualify the Surv function by survival::Surv, not satisfactory (similarly for other popular functions in survival). Another option I thought of was to define my own Surv function as Surv <- survival::Surv in my package, but it doesn't feel right. It seems to work, though. As you may understand from this, I am not very familiar with these issues. I have used Depends: survival for a long time and been happy with that. Any help on this is highly appreciated. I don't know if you received any private replies, but I don't see any public ones. It's not clear from your message what you would like to happen with expressions like f2 <- coxreg(Surv(time, status) ~ age + frailty(inst), data = lung) For example, would you like to generate an error, because you don't support frailty in this context? Could you clarify that? Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
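On the side effect Göran mentions (users having to type survival::Surv once survival moves from Depends to Imports): the usual pattern is to re-export the few survival functions that users are expected to call in eha model formulas. A minimal NAMESPACE sketch, not tested against eha itself:

importFrom(survival, Surv)
export(Surv)   # re-export, so library(eha) users can call Surv() without attaching survival

Note that a re-exported name normally also needs an Rd alias so that R CMD check does not complain about an undocumented export.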
Re: [R-pkg-devel] checking PDF version of manual without hyperrefs or index ... ERROR
I don't know the best solution, but one workaround would be to replace the "fl" ligature character (U+FB02) in your Rd files with the two plain characters "fl". Duncan Murdoch On 29/10/2020 10:34 a.m., Anthony Hammond wrote: Hello, I'm attempting to upload a package to CRAN and although it passes the R CMD checks that I run, it doesn't pass the CRAN check and the response I get is pasted below. I've looked online and found numerous solutions; some made no sense to me & others didn't work. I tried downloading and putting upquote.sty and inconsolata.sty files into the latex folder. I also tried typing --no-manual in the project options under R CMD check options. Still the CRAN submission email response provided the below error. What can I do to make this go away and have my package accepted? It probably doesn't help that I know next to nothing about LaTeX. Incidentally I don't mind not having a pdf manual, so if there is a simple way to avoid the check by requesting not to have one, that'll do. Any assistance would be greatly appreciated. Kind Regards Anthony * checking PDF version of manual ... WARNING LaTeX errors when creating PDF version. This typically indicates Rd problems. LaTeX errors found: ! Package inputenc Error: Unicode char fl (U+FB02) (inputenc) not set up for use with LaTeX. See the inputenc package documentation for explanation. Type H for immediate help. * checking PDF version of manual without hyperrefs or index ... ERROR * checking for detritus in the temp directory ... OK * DONE Status: 1 ERROR, 1 WARNING, 3 NOTEs [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
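A quick way to locate the offending character(s) before editing, using only the base tools package (a sketch; it assumes the standard man/ layout):

rd_files <- list.files("man", pattern = "[.]Rd$", full.names = TRUE)
for (f in rd_files) {
  cat("==", f, "==\n")
  tools::showNonASCIIfile(f)  # prints the lines that contain non-ASCII characters
}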
Re: [R-pkg-devel] Warning on r-oldrel-macos-x86_64
On 25/10/2020 6:10 a.m., Helmut Schütz wrote: Dear all, we faced a warning on r-oldrel-macos-x86_64: https://cran.r-project.org/web/checks/check_results_PowerTOST.html I'm not concerned about "Pandoc (>= 1.12.3) and/or pandoc-citeproc not available" in the first 5 vignettes since there was no problem in the previous releases. However, I found this on stackoverflow: https://stackoverflow.com/questions/50789125/how-to-get-an-rmarkdown-vignette-for-r-package-to-escape-cran-warnings-on-solari RolandAsc commented: "In my understanding you can only see this warning if there is an error (either because warnings are being converted to errors or because there is another subsequent error), else it just doesn't come through. This is not an answer but maybe an explanation." Seems to be correct. In my code a data.frame is assigned with a column "foo" and _without_ stringsAsFactors = FALSE. Later a function is called which requires "foo" as character. The default stringsAsFactors was changed to FALSE in R 4.0.0. Hence, my question: How "old" is the old release on CRAN's test machines / macOS? oldrel on Windows is OK. What shall we do? Re-submit with the same release number (either adding stringsAsFactors = FALSE or calling the function with as.character("foo")), with an explanation? By not specifying stringsAsFactors = FALSE, your vignette depends on R >= 4.0.0, so you should - state that dependency in the DESCRIPTION file, or - test for it in the vignette, or - remove the dependency by being explicit about stringsAsFactors = FALSE. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
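Of those three options, the last is usually the least intrusive, since the code then behaves identically on old and new R. A minimal sketch of the first and third options:

# option 1 -- declare the dependency in DESCRIPTION:
#     Depends: R (>= 4.0.0)
# option 3 -- remove the dependency by being explicit in the vignette/package code:
d <- data.frame(foo = c("x", "y"), stringsAsFactors = FALSE)
is.character(d$foo)   # TRUE on both R 3.x and R 4.x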
Re: [R-pkg-devel] Licenses
On 22/10/2020 12:56 p.m., Marc Schwartz wrote: On Oct 22, 2020, at 12:12 PM, Duncan Murdoch wrote: On 22/10/2020 11:55 a.m., Marc Schwartz wrote: On Oct 22, 2020, at 11:19 AM, Marc Schwartz wrote: On Oct 22, 2020, at 10:21 AM, Kevin R. Coombes wrote: Hi, I am developing a package and getting a NOTE from R CMD check about licenses and ultimate dependencies on a restrictive license, which I can't figure out how to fix. My package imports flowCore, which has an Artistic-2.0 license. But flowCore imports cytolib, which has a license from the Fred Hutchinson Cancer Center that prohibits commercial use. I tried using the same license as flowCore, but still get the NOTE. Does anyone know which licenses can be used to be compatible with the Fred Hutch license? Or can I just do what flowCore apparently does and ignore the NOTE? Thanks, Kevin Hi Kevin, I have not looked at BioC's licensing requirements, but presumably, they are ok with the non-commercial use restrictions placed on users of cytolib, thus also on flowCore. If you want your package to be on CRAN, those restrictions on users are not allowed by CRAN's policy: https://cran.r-project.org/web/packages/policies.html "Such packages are not permitted to require (e.g., by specifying in ‘Depends’, ‘Imports’ or ‘LinkingTo’ fields) directly or indirectly a package or external software which restricts users or usage." Thus, you would seem to need to make a decision on hosting your package on CRAN, but without the need to import from flowCore/cytolib, or consider hosting your package on BioC, with the attendant restrictions on commercial use. Regards, Marc Schwartz Well Now that I look at: https://svn.r-project.org/R/trunk/share/licenses/license.db there are a few licenses listed there that do place restrictions on commercial use. These include some Creative Commons Non-Commercial use variants and the ACM license. Is the license DB file out of date, or is there an apparent conflict with the CRAN policy that I quoted above? Anyone with an ability to comment? Presumably CRAN would not accept the non-FOSS licenses that are listed in license.db, but R could still do computations on them, as described in ?library in the "Licenses" section. Duncan Murdoch Duncan, That is a reasonable distinction. However, upon searching CRAN with available.packages(), I came up with a list of packages that do include Non-Commercial restrictions, including CC BY-NC* and ACM licenses. There may be others that I missed visually scanning the output. There also appear to be some conflicts/inconsistencies with the 'License_restricts_use' field entry and the 'License' field in some cases, where, for example, most that have "CC BY-NC-SA 4.0" as the license, have "NA" as the entry for restricted use, rather than "yes". I am not going to list them here, as I don't want to pick on any particular package, but this does seem to point to an inconsistency between packages that are hosted on CRAN and the articulated policy... Perhaps those packages were accepted before this became a policy, and now that others depend on them, it would be too disruptive to remove them, and users are warned via the 'License_restricts_use' field entry. Why does it sometimes contain errors? That I don't know, other than blaming it on Murphy's Law. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
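For the record, the kind of scan Marc describes can be reproduced roughly like this (a sketch; the pattern matching is deliberately crude and will have false positives):

db <- available.packages(repos = "https://cloud.r-project.org",
                         fields = c("License", "License_restricts_use"))
nc <- db[grepl("NC|ACM", db[, "License"]), c("License", "License_restricts_use"), drop = FALSE]
head(nc)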
Re: [R-pkg-devel] Licenses
On 22/10/2020 11:55 a.m., Marc Schwartz wrote: On Oct 22, 2020, at 11:19 AM, Marc Schwartz wrote: On Oct 22, 2020, at 10:21 AM, Kevin R. Coombes wrote: Hi, I am developing a package and getting a NOTE from R CMD check about licenses and ultimate dependencies on a restrictive license, which I can't figure out how to fix. My package imports flowCore, which has an Artistic-2.0 license. But flowCore imports cytolib, which has a license from the Fred Hutchinson Cancer Center that prohibits commercial use. I tried using the same license as flowCore, but still get the NOTE. Does anyone know which licenses can be used to be compatible with the Fred Hutch license? Or can I just do what flowCore apparently does and ignore the NOTE? Thanks, Kevin Hi Kevin, I have not looked at BioC's licensing requirements, but presumably, they are ok with the non-commercial use restrictions placed on users of cytolib, thus also on flowCore. If you want your package to be on CRAN, those restrictions on users are not allowed by CRAN's policy: https://cran.r-project.org/web/packages/policies.html "Such packages are not permitted to require (e.g., by specifying in ‘Depends’, ‘Imports’ or ‘LinkingTo’ fields) directly or indirectly a package or external software which restricts users or usage." Thus, you would seem to need to make a decision on hosting your package on CRAN, but without the need to import from flowCore/cytolib, or consider hosting your package on BioC, with the attendant restrictions on commercial use. Regards, Marc Schwartz Well Now that I look at: https://svn.r-project.org/R/trunk/share/licenses/license.db there are a few licenses listed there that do place restrictions on commercial use. These include some Creative Commons Non-Commercial use variants and the ACM license. Is the license DB file out of date, or is there an apparent conflict with the CRAN policy that I quoted above? Anyone with an ability to comment? Presumably CRAN would not accept the non-FOSS licenses that are listed in license.db, but R could still do computations on them, as described in ?library in the "Licenses" section. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] CRAN pending status , left up in the air
On 19/10/2020 1:05 p.m., Spencer Graves wrote: On 2020-10-19 10:34, Rafael H. M. Pereira wrote: Thank you Dirk and Hugo for your responses. I guess I'll just have to be patient and wait. I can only imagine how the CRAN team is overwhelmed by the exponential growth of package submissions. I wonder, though, whether the CRAN maintainers and the R community more broadly are thinking about alternatives to deal with such growing demand without compromising the speed and consistency/quality of package development. Expanding the team of CRAN maintainers would be the most obvious solution but I confess I'm not familiar enough with the whole process to assess what would be the best routes of action to tackle this bottleneck. From my experience, it looks to me like their primary approach to handling the increased volume has been to improve automation. In the spirit of brainstorming, I'd like to share other ideas on this: MAKE IT EASY FOR A USER TO CHECK A DIFF FILE OF "Writing R Extensions" COMPARING THE CURRENT VERSION WITH ANY PREVIOUS ONE. That's already pretty easy on the sources, using svn diff. The user just needs to be comfortable using svn. For example, assuming you have R-devel checked out, run this to see what's changed since Jan 1, 2020: svn diff -r {2020-01-01} doc/manual/R-exts.texi You can do it without checking out a copy with some more typing: svn diff -r {2020-01-01} --old=https://svn.r-project.org/R/trunk/doc/manual/R-exts.texi There are probably online web services that do this, but I do have it checked out, so I'm not very interested in them. For example, every article on Wikipedia has a "View History" tab. That lists the dates of all the revisions with a terse summary of what was changed in each. I can click on any two and then click "Compare" to see all the changes in that period. I'm not going to reread every word of "Writing R Extensions" every time I submit something to CRAN. However, I would be willing to review a diff file if it were easy for me to do that. (And I'm NOT going to create my own private file copy of "Writing R Extensions" and manually create such a diff file.) Now you've got it. IMPROVE THE COLLABORATION BETWEEN THE CRAN TEAM AND OTHER DOCUMENTATION OF HOW TO PREPARE A PACKAGE FOR CRAN I know two sources of information on that: * Wickham and Bryan, R Packages (https://r-pkgs.org). I created a "cran-comments.md" file based on their recommendations, and missed their comment that it should be in ".Rbuildignore". My latest CRAN submission was rejected partly because of that. * Colin Fay, "Preparing your package for a CRAN submission" (https://github.com/ThinkR-open/prepare-for-cran). These instructions follow Wickham and Bryan in recommending "devtools::revdep_check()". Sadly, "revdep_check" is not currently in devtools but in a package called revdepcheck. Worse, that package is not available on CRAN and appears twice on GitHub. The original by bbolker has not been updated in 5 years. The version that is currently maintained is "https://github.com/r-lib/revdepcheck;. Fortunately, Hadley Wickham is the leading contributor to the latter, so writing him may help correct that infelicity, but I should also write to Colin Fay. Keeping documentation up to date is hard, and maintaining a productive collaboration is even harder. I don't think even writing the suggestion in ALL CAPS is enough to bring this about ;-). CRAN REVIEW GROUPS: There are now 41 different "CRAN Task Views". 
We could ask the maintainer of each Task View to try to recruit a committee around each one to discuss coverage and integration. Each committee could be asked to coordinate via email and in virtual meetings. They could be asked to pick 3 standard times for their virtual meetings, so anyone in the world would not always be excluded from a meeting that was 3 AM their time. Each package maintainer would be asked to specify at least one "Task View" for each package and be willing to discuss overlap, etc., with others. This might be a topic for the next useR conference. I would suggest a more modest goal: pick one task view in which you have an interest, and work to improve it. Then move on to the next one... Most of the contributors to R are reasonable people, but they have their own priorities. If you can make it easier for them to achieve their priorities, they'll appreciate it. If you ask them to change their priorities, they might not. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] puzzling CRAN rejection
On 13/10/2020 5:33 a.m., Iñaki Ucar wrote: On Tue, 13 Oct 2020 at 01:47, Ben Bolker wrote: On 10/12/20 7:37 PM, Duncan Murdoch wrote: On 12/10/2020 6:51 p.m., Ben Bolker wrote: On 10/12/20 6:36 PM, Duncan Murdoch wrote: On 12/10/2020 6:14 p.m., Ben Bolker wrote: I'd say a mismatch in saved output isn't a small problem, it's either a too-sensitive test or something serious. Duncan Murdoch That's fair enough, but it would be nice if (1) this were a NOTE and I don't think so. As I said, I think it should be marked as an ERROR. OK. But it would probably be wise (if the CRAN maintainers actually wanted to do this) to crank it up from silent -> NOTE -> WARNING -> ERROR over the course of several releases so as not to have widespread test failures on CRAN right away ... Do you think so? Why would you put saved results into the package unless you want to test if they match? My point was just that it would be disruptive to switch the severity of such mismatches from 'message, no NOTE' to 'ERROR' in a single step - I'd imagine it could lead to a very large number of CRAN packages suddenly failing their tests. Hold on, are we sure this is detected at all? The result of the tests is reported as OK. The "singular fit" message goes to stderr, so my guess is that it is not compared against the saved output at all. It is reported in the 00check.log file; I gave the link to the report. I think it's a bug in the check code that the check log reports OK at the end, when (what should be) a fatal error has been displayed. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] puzzling CRAN rejection
On 12/10/2020 6:51 p.m., Ben Bolker wrote: On 10/12/20 6:36 PM, Duncan Murdoch wrote: On 12/10/2020 6:14 p.m., Ben Bolker wrote: I'd say a mismatch in saved output isn't a small problem, it's either a too-sensitive test or something serious. Duncan Murdoch That's fair enough, but it would be nice if (1) this were a NOTE and I don't think so. As I said, I think it should be marked as an ERROR. OK. But it would probably be wise (if the CRAN maintainers actually wanted to do this) to crank it up from silent -> NOTE -> WARNING -> ERROR over the course of several releases so as not to have widespread test failures on CRAN right away ... Do you think so? Why would you put saved results into the package unless you want to test if they match? Honestly, I thought this had always been a fatal error. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] puzzling CRAN rejection
On 12/10/2020 6:14 p.m., Ben Bolker wrote: I'd say a mismatch in saved output isn't a small problem, it's either a too-sensitive test or something serious. Duncan Murdoch That's fair enough, but it would be nice if (1) this were a NOTE and I don't think so. As I said, I think it should be marked as an ERROR. Duncan Murdoch (2) it were made explicit in the CRAN policy that, *except by special exception*, an unresolved NOTE is grounds for rejection. This is broadly understood by experienced package maintainers but must sometimes come as a shock to newbies ... __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] puzzling CRAN rejection
On 12/10/2020 5:17 p.m., Ben Bolker wrote: On 10/12/20 4:40 PM, Duncan Murdoch wrote: There's this one in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/00check.log: Comparing 'lmer-1.Rout' to 'lmer-1.Rout.save' ...428d427 < boundary (singular) fit: see ?isSingular 430d428 < boundary (singular) fit: see ?isSingular Those messages about the singular fit show up in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/examples_and_tests/tests_i386/lmer-1.Rout but not in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/examples_and_tests/tests_i386/lmer-1.Rout.save The difference also doesn't show up in the x64 versions of the files. OK, thanks. I did notice this in passing (I think), but I got confused by the format. (Also, it doesn't even rise to the level of a NOTE ...) Yes, failing to match saved test output should be a fatal error, but isn't marked as one. It took me a while to localize the problem (line numbers have to be computing _after_ throwing away the R header info, see source code of tools::Rdiff()). One of my favourite programs back in the days when I used Windows was Beyond Compare (https://www.scootersoftware.com/). They've had a Mac version for a while now; it works well too (though I kind of prefer the old Windows UI a bit for some reason). It made it really easy to find this difference, once I figured out which files to compare. I didn't even recognize the line numbers in the CRAN report as line numbers at first. Having spent this long reading tea leaves, I think I'm going to write to the CRAN maintainers for clarification. * Refactoring all the tests to decrease the testing time significantly is certainly possible (at worst I can make a lot of stuff conditionally skipped on CRAN), but would be a nuisance that I'd rather save for the next release if possible. * Eliminating the two lines of variable output is easy, but it's mildly annoying to update the version number for this small a correction ... I'd say a mismatch in saved output isn't a small problem, it's either a too-sensitive test or something serious. Duncan Murdoch Looks like from now on there will only be odd-numbered releases of lme4 on CRAN, since I seem guaranteed to make trivial errors with my first (odd-numbered) try each time and have to bump the version number when fixing them ... Ben Bolker Duncan Murdoch On 12/10/2020 4:03 p.m., Ben Bolker wrote: Before I risk wasting the CRAN maintainers' time with a query, can anyone see what I'm missing here? Everything I can see looks OK, with the possible exception of the 'NA' result for "CRAN incoming feasibility" on r-devel-windows-ix86+x86_64 (which surely isn't my fault???) Any help appreciated, as always. Ben Bolker = Dear maintainer, package lme4_1.1-24.tar.gz does not pass the incoming checks automatically, please see the following pre-tests: Windows: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/00check.log> Status: OK Debian: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Debian/00check.log> Status: OK Last released version's CRAN status: ERROR: 2, NOTE: 5, OK: 5 See: <https://CRAN.R-project.org/web/checks/check_results_lme4.html> Last released version's additional issues: gcc-UBSAN <https://www.stats.ox.ac.uk/pub/bdr/memtests/gcc-UBSAN/lme4> CRAN Web: <https://cran.r-project.org/package=lme4> Please fix all problems and resubmit a fixed version via the webform. 
If you are not sure how to fix the problems shown, please ask for help on the R-package-devel mailing list: <https://stat.ethz.ch/mailman/listinfo/r-package-devel> If you are fairly certain the rejection is a false positive, please reply-all to this message and explain. More details are given in the directory: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/> The files will be removed after roughly 7 days. *** Strong rev. depends ***: afex agRee altmeta aods3 arm ARTool bapred bayesammi BayesLN BayesSenMC baystability BBRecapture BClustLonG BFpack blme blmeco blocksdesign BradleyTerry2 buildmer cAIC4 car carcass cgam chngpt ciTools clickR climwin CLME clusteredinterference clusterPower CMatching cpr cvms DClusterm dfmeta DHARMa diagmeta difR doremi eda4treeR EdSurvey effects embed epr ESTER ez faraway faux fence finalfit fishmethods fullfact gamm4 geex GHap glmertree glmmEP GLMMRR glmmsr glmmTMB GLMpack gorica groupedstats gtheory gvcR HelpersMG HeritSeq hmi iccbeta IDmeasurer IMTest inferference influence.ME intRvals isni jlctree joineRmeta joineRML JointModel jomo jstable JWileymisc KenSyn lefko3 lmem.qtler LMERConvenienceFunctions lmerTest lmSupport longpower LSAmitR macc MAGNAMWAR manymodelr MargCond marked
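For anyone who wants to reproduce the "Comparing 'lmer-1.Rout' to 'lmer-1.Rout.save'" step locally rather than reading tea leaves in the check log, the same machinery is exposed as tools::Rdiff(). A minimal sketch, assuming the test has already been run so that tests/lmer-1.Rout exists next to lmer-1.Rout.save:

res <- tools::Rdiff("tests/lmer-1.Rout", "tests/lmer-1.Rout.save", Log = TRUE)
res$status   # 0 if the outputs match once the R startup header is dropped
res$out      # the diff-like lines, e.g. the two 'boundary (singular) fit' messages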
Re: [R-pkg-devel] puzzling CRAN rejection
There's this one in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/00check.log: Comparing 'lmer-1.Rout' to 'lmer-1.Rout.save' ...428d427 < boundary (singular) fit: see ?isSingular 430d428 < boundary (singular) fit: see ?isSingular Those messages about the singular fit show up in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/examples_and_tests/tests_i386/lmer-1.Rout but not in https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/examples_and_tests/tests_i386/lmer-1.Rout.save The difference also doesn't show up in the x64 versions of the files. Duncan Murdoch On 12/10/2020 4:03 p.m., Ben Bolker wrote: Before I risk wasting the CRAN maintainers' time with a query, can anyone see what I'm missing here? Everything I can see looks OK, with the possible exception of the 'NA' result for "CRAN incoming feasibility" on r-devel-windows-ix86+x86_64 (which surely isn't my fault???) Any help appreciated, as always. Ben Bolker = Dear maintainer, package lme4_1.1-24.tar.gz does not pass the incoming checks automatically, please see the following pre-tests: Windows: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Windows/00check.log> Status: OK Debian: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/Debian/00check.log> Status: OK Last released version's CRAN status: ERROR: 2, NOTE: 5, OK: 5 See: <https://CRAN.R-project.org/web/checks/check_results_lme4.html> Last released version's additional issues: gcc-UBSAN <https://www.stats.ox.ac.uk/pub/bdr/memtests/gcc-UBSAN/lme4> CRAN Web: <https://cran.r-project.org/package=lme4> Please fix all problems and resubmit a fixed version via the webform. If you are not sure how to fix the problems shown, please ask for help on the R-package-devel mailing list: <https://stat.ethz.ch/mailman/listinfo/r-package-devel> If you are fairly certain the rejection is a false positive, please reply-all to this message and explain. More details are given in the directory: <https://win-builder.r-project.org/incoming_pretest/lme4_1.1-24_20201012_210730/> The files will be removed after roughly 7 days. *** Strong rev. 
depends ***: afex agRee altmeta aods3 arm ARTool bapred bayesammi BayesLN BayesSenMC baystability BBRecapture BClustLonG BFpack blme blmeco blocksdesign BradleyTerry2 buildmer cAIC4 car carcass cgam chngpt ciTools clickR climwin CLME clusteredinterference clusterPower CMatching cpr cvms DClusterm dfmeta DHARMa diagmeta difR doremi eda4treeR EdSurvey effects embed epr ESTER ez faraway faux fence finalfit fishmethods fullfact gamm4 geex GHap glmertree glmmEP GLMMRR glmmsr glmmTMB GLMpack gorica groupedstats gtheory gvcR HelpersMG HeritSeq hmi iccbeta IDmeasurer IMTest inferference influence.ME intRvals isni jlctree joineRmeta joineRML JointModel jomo jstable JWileymisc KenSyn lefko3 lmem.qtler LMERConvenienceFunctions lmerTest lmSupport longpower LSAmitR macc MAGNAMWAR manymodelr MargCond marked mbest MDMR mediation MEMSS merDeriv merTools meta metamisc metan metaplus Metatron micemd MiRKAT misty mixAK MixedPsy MixMAP MixRF MLID mlma mlmRev mlVAR MMeM multiDimBio multil evelTools MultiRR MultisiteMediation mumm mvMISE MXM nanny omics OptimClassifier pamm panelr paramhetero PBImisc pbkrtest pcgen pedigreemm Phenotype phyr piecewiseSEM Plasmode PLmixed powerbydesign powerlmm predictmeans PrevMap prLogistic psfmi ptmixed qape r2mlm raincin Rcmdr refund reghelper regplot REndo reproducer rewie RLRsim robustBLME robustlmm rockchalk rosetta rpql rptR rr2 RRreg rsq rstanarm rstap rties RVAideMemoire RVFam sae semEff siland simr sjstats skpr SlaPMEG smicd SoyNAM SPCDAnalyze specr SPreFuGED squid stability standardize statgenGxE statgenSTA StroupGLMM structree Surrogate surrosurv swissMrP TcGSA themetagenomics tidygate tidyMicro tramME tukeytrend userfriendlyscience varTestnlme VCA VetResearchLMM warpMix WebPower welchADF WeMix Best regards, CRAN teams' auto-check service Flavor: r-devel-windows-ix86+x86_64 Check: CRAN incoming feasibility, Result: NA Maintainer: 'Ben Bolker ' Flavor: r-devel-windows-ix86+x86_64 Check: Overall checktime, Result: NOTE Overall checktime 23 min > 10 min Flavor: r-devel-linux-x86_64-debian-gcc Check: CRAN incoming feasibility, Result: Note_to_CRAN_maintainers Maintainer: 'Ben Bolker ' __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Procedure for the transfer or an R package maintainership
On 11/10/2020 10:20 a.m., Al T wrote: Dear R-devel-community! The current maintainer of the 'nephro' R-package (https://cran.r-project.org/web/packages/nephro/index.html), Cristian Pattaro, asked me if I'm willing to take over this work. I'm very interested in stepping in but since I've never done something similar before I'd like to ask for some advice. I read on stackoverflow (https://stackoverflow.com/questions/39223320/transferring-maintainership-of-an-r-package-on-cran) that the standard procedure is (in the following order): * a message from the old maintainers' email address stating the change in maintainership * a change in the DESCRIPTION file with the address of the new maintainer information. But I'm interested in the steps afterwards (see also https://stackoverflow.com/questions/64299121/what-do-i-have-to-do-if-im-taking-over-the-maintenance-of-a-gnu-r-package-publi): * How can I 'link' this package to a new repository on the version control platform of my choice (e.g. https://gitlab.com/ or https://github.com)? Just change any documentation in the package that points to the wrong place. In particular, this will include the URL: field in the DESCRIPTION file, but it may show up in help pages or vignettes as well. * How do the package update procedures on CRAN work (e.g. if I want to create a new version of the package)? That's described quite clearly here: https://cran.r-project.org/web/packages/policies.html And you *must* create a new version of the package for the maintainer change to take effect. Depending on how many other changes you are planning for it, you could make the minimal changes now, and bigger ones a month or two later, or wait until everything is ready. * How does the update procedure for the repository on r-forge (in my case: https://r-forge.r-project.org/projects/nephro/) work? Is this identical to the CRAN-package? r-forge uses Subversion, so to update the package there, just change your working copy and commit the changes. You'll need to be registered as a project member with sufficient privileges to do this; that registration needs to be done by someone with admin permissions. I looked into https://cran.r-project.org/web/packages/policies.html as well https://cran.r-project.org/doc/manuals/R-exts.html but I couldn't find an answer to my questions. Hopefully I've added enough info for you. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
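For concreteness, the visible part of a handover is small; a sketch of the DESCRIPTION fields that typically change (the values below are placeholders, not the real nephro entries):

Maintainer: New Maintainer <new.maintainer@example.org>
URL: https://github.com/new-maintainer/nephro
BugReports: https://github.com/new-maintainer/nephro/issues

together with a bump of the Version: field, since CRAN only records the new maintainer once the new version is published.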
Re: [R-pkg-devel] Static vignettes / bibtex warning in Debian
That WARNING is unlikely to be related to the vignette. The problem is that your package Depends on / Imports / Suggests some R package that does the same for the bibtex package (possibly through one or more further indirect steps). Since this is only on one platform, it's probably not something you need to worry about: that other package has probably been updated to drop the bibtex dependence, but Debian hasn't got the update yet. I don't know how you determine which is the "guilty" package. Maybe there are more hints in the check log? Duncan Murdoch On 10/10/2020 12:53 p.m., Candia, Julian (NIH/NCI) [E] via R-package-devel wrote: Hello, I'm trying to include a static vignette by embedding a pdf document in latex. The procedure is very simple and is discussed here: http://www.markvanderloo.eu/yaRb/2019/01/11/add-a-static-pdf-vignette-to-an-r-package It essentially boils down to creating a .Rnw file with the following content: \documentclass{article} \usepackage{pdfpages} %\VignetteIndexEntry{author2019mypaper} \begin{document} \includepdf[pages=-, fitpaper=true]{mypaper.pdf} \end{document} I built and checked the package with no issues. However, upon uploading to CRAN, Debian generates 1 warning: Flavor: r-devel-linux-x86_64-debian-gcc Check: package dependencies, Result: WARNING Requires (indirectly) orphaned package: 'bibtex' I don't know how to get around this issue. Essentially: how to add a static (pdf) vignette to a package that will pass all CRAN checks? Your advice is much appreciated. Thanks, Julián [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
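One way to hunt for the "guilty" package is to ask the CRAN metadata which of your direct dependencies pulls in bibtex; a sketch (replace "yourpkg" with the real package name, or list the direct dependencies by hand if the package is not on CRAN yet):

db <- available.packages(repos = "https://cloud.r-project.org")
direct <- tools::package_dependencies("yourpkg", db,
                                      which = c("Depends", "Imports", "Suggests"))[["yourpkg"]]
culprits <- Filter(function(p)
  "bibtex" %in% unlist(tools::package_dependencies(p, db, recursive = TRUE)),
  direct)
culprits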
Re: [R-pkg-devel] rlang not available on Windows builder machine with R-devel
On 09/10/2020 8:05 a.m., Ayala Hernandez, Rafael wrote: Dear all, My NutrienTrackeR package has recently failed to install on the Windows builder machine with R-devel, with an error saying 'rlang' is not available (https://www.r-project.org/nosvn/R.check/r-devel-windows-ix86+x86_64/NutrienTrackeR-00install.html). Is this something I could fix on my side, or is it just on the builder machine's side? Looks to me like it's just the CRAN machine that had a glitch. I wouldn't worry about it unless CRAN asks you to do something, or you're submitting a new version. In the latter case, mention the failure in your submission message and say that you're assuming it's a problem on their end. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] is R CMD build --compact-vignettes working as expected?
When I tried that on MacOS, it did the gs compression with gs_quality set to "none", which does nothing. I don't know what quality CRAN uses, but for me setting the environment variable GS_QUALITY=screen made a big difference. Duncan Murdoch On 08/10/2020 11:10 a.m., John Fox wrote: Dear Ben, Actually, what I used was --compact-vignettes="both", with qpdf and gs installed on my Windows and Mac machines, and that apparently didn't work for me. Best, John On 2020-10-07 10:06 p.m., Ben Bolker wrote: OK, I think I see the problem. tl;dr use --compact-vignettes="both" when building the vignettes. --compact-vignettes by default only tries qpdf. when the R CMD check --as-cran is run, it tries both qpdf and gs. Since gs (apparently, in this case) compresses more aggressively than qpdf, it succeeds in compressing further, and the check complains. From R CMD build --help: --compact-vignettes= try to compact PDF files under inst/doc: "no" (default), "qpdf", "gs", "gs+qpdf", "both" --compact-vignettes same as --compact-vignettes=qpdf I don't know if this is worth documenting somewhere, or modifying the behaviour to make "both" the default for --compact-vignettes ? On 10/7/20 8:35 PM, Duncan Murdoch wrote: On 07/10/2020 8:32 p.m., Ben Bolker wrote: Thanks for the tip, I'll take a look. Given that three relatively experienced package authors all seem to have experienced similar issues, it seems that maybe this is worth figuring out/maybe I'm not just doing something boneheaded. Just to clarify: I've never noticed the problem you mention. I just know how to debug R CMD build. Duncan cheers Ben On 10/7/20 8:31 PM, Duncan Murdoch wrote: I don't know the answer to your question, but you can debug the --compact-vignettes option as follows. debug(tools::compactPDF) tools:::.build_packages(c("--compact-vignettes", "pkgdir")) where "pkgdir" is the directory of the source of your package. Add extra options to the build as separate elements of the argument to .build(): this function is called after args have been parsed. When I do that, I see that it rejects the compaction, because none of mine benefit much from it: it wants at least a 10% and 10K reduction. But Ben's example met those criteria. When I trick it into accepting the compaction, it does put the compacted PDF into the tarball. Duncan Murdoch On 07/10/2020 6:03 p.m., John Fox wrote: Dear Ben, On 2020-10-07 5:26 p.m., Ben Bolker wrote: I hope so too. The (annoying) workaround is to compact the vignette yourself (using qpdf directly or using tools::compactPDF), then use no-build-vignettes. The problem there is whatever's supposed to happen with building vignette indices. The ugly workaround, I guess, is to build the tarball, compact the vignettes oneself, then *replace* them in the tarball. (Obviously I can automate that, but it seems as though it would be unnecessary if I knew what was going on ...) I've used both of these workarounds and agree that it would be nice to avoid them. After all, what is the --compact-vignettes argument for? Best, John cheers Ben On 10/7/20 4:10 PM, John Fox wrote: Dear Ben, I was hoping that someone would pick up on this problem, because I've experienced the same issue of --compact-vignettes apparently ignored, e.g., with the Rcmdr package under R 4.0.2 on both macOS and Windows. Best, John John Fox, Professor Emeritus McMaster University Hamilton, Ontario, Canada web: https://socialsciences.mcmaster.ca/jfox/ On 2020-10-05 1:09 p.m., Ben Bolker wrote: Am I confused, or doing something wrong, or ... ? 
I have qpdf installed, and am running R CMD build with --compact-vignettes, but the PDF in the tarball doesn't seem to be compressed despite the fact that the output messages say "compacting vignettes ..." $ R CMD build --compact-vignettes lme4 * checking for file ‘lme4/DESCRIPTION’ ... OK * preparing ‘lme4’: * checking DESCRIPTION meta-information ... OK * cleaning src * installing the package to process help pages * saving partial Rd database * creating vignettes ... OK Warning: ‘inst/doc’ files ‘lmerperf.html’, ‘lmer.pdf’, ‘PLSvGLS.pdf’, ‘Theory.pdf’ ignored as vignettes have been rebuilt. Run R CMD build with --no-build-vignettes to prevent rebuilding. * compacting vignettes and other PDF files * cleaning src * checking for LF line-endings in source and make files and shell scripts * checking for empty or unneeded directories * building ‘lme4_1.1-24.tar.gz’ The copy of lmer.pdf in the resulting tarball is 900K or so: $ tar ztvf lme4_1.1-24.tar.gz lme4/inst/doc/lmer.pdf -rw-r--r-- bolker/bolker 907022 2020-10-05 12:59 lme4/inst/doc/lmer.pdf The previously built (and manually compacte
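As an aside on the gs step Duncan mentions above (GS_QUALITY): you can also run the ghostscript compaction by hand on the already-built PDFs and see how much it saves, without going through R CMD build at all. A minimal sketch, assuming ghostscript is installed and the vignette PDFs sit in inst/doc:

tools::compactPDF("inst/doc", gs_quality = "screen")   # "screen" compresses hardest; "ebook" keeps higher-resolution figures

Setting the GS_QUALITY environment variable before running R CMD build/check has the same effect on the automatic compaction step.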
Re: [R-pkg-devel] is R CMD build --compact-vignettes working as expected?
On 07/10/2020 8:32 p.m., Ben Bolker wrote: Thanks for the tip, I'll take a look. Given that three relatively experienced package authors all seem to have experienced similar issues, it seems that maybe this is worth figuring out/maybe I'm not just doing something boneheaded. Just to clarify: I've never noticed the problem you mention. I just know how to debug R CMD build. Duncan cheers Ben On 10/7/20 8:31 PM, Duncan Murdoch wrote: I don't know the answer to your question, but you can debug the --compact-vignettes option as follows. debug(tools::compactPDF) tools:::.build_packages(c("--compact-vignettes", "pkgdir")) where "pkgdir" is the directory of the source of your package. Add extra options to the build as separate elements of the argument to .build(): this function is called after args have been parsed. When I do that, I see that it rejects the compaction, because none of mine benefit much from it: it wants at least a 10% and 10K reduction. But Ben's example met those criteria. When I trick it into accepting the compaction, it does put the compacted PDF into the tarball. Duncan Murdoch On 07/10/2020 6:03 p.m., John Fox wrote: Dear Ben, On 2020-10-07 5:26 p.m., Ben Bolker wrote: I hope so too. The (annoying) workaround is to compact the vignette yourself (using qpdf directly or using tools::compactPDF), then use no-build-vignettes. The problem there is whatever's supposed to happen with building vignette indices. The ugly workaround, I guess, is to build the tarball, compact the vignettes oneself, then *replace* them in the tarball. (Obviously I can automate that, but it seems as though it would be unnecessary if I knew what was going on ...) I've used both of these workarounds and agree that it would be nice to avoid them. After all, what is the --compact-vignettes argument for? Best, John cheers Ben On 10/7/20 4:10 PM, John Fox wrote: Dear Ben, I was hoping that someone would pick up on this problem, because I've experienced the same issue of --compact-vignettes apparently ignored, e.g., with the Rcmdr package under R 4.0.2 on both macOS and Windows. Best, John John Fox, Professor Emeritus McMaster University Hamilton, Ontario, Canada web: https://socialsciences.mcmaster.ca/jfox/ On 2020-10-05 1:09 p.m., Ben Bolker wrote: Am I confused, or doing something wrong, or ... ? I have qpdf installed, and am running R CMD build with --compact-vignettes, but the PDF in the tarball doesn't seem to be compressed despite the fact that the output messages say "compacting vignettes ..." $ R CMD build --compact-vignettes lme4 * checking for file ‘lme4/DESCRIPTION’ ... OK * preparing ‘lme4’: * checking DESCRIPTION meta-information ... OK * cleaning src * installing the package to process help pages * saving partial Rd database * creating vignettes ... OK Warning: ‘inst/doc’ files ‘lmerperf.html’, ‘lmer.pdf’, ‘PLSvGLS.pdf’, ‘Theory.pdf’ ignored as vignettes have been rebuilt. Run R CMD build with --no-build-vignettes to prevent rebuilding. 
* compacting vignettes and other PDF files * cleaning src * checking for LF line-endings in source and make files and shell scripts * checking for empty or unneeded directories * building ‘lme4_1.1-24.tar.gz’ The copy of lmer.pdf in the resulting tarball is 900K or so: $ tar ztvf lme4_1.1-24.tar.gz lme4/inst/doc/lmer.pdf -rw-r--r-- bolker/bolker 907022 2020-10-05 12:59 lme4/inst/doc/lmer.pdf The previously built (and manually compacted) version of lmer.pdf in the tarball is 500K: $ ls -l lme4/inst/doc/lmer.pdf -rw-r--r-- 1 bolker bolker 495199 Oct 3 22:15 lme4/inst/doc/lmer.pdf Is 'R CMD build' confused by the presence of a pre-built PDF in the inst/doc directory? Or am I somehow mistaken about how this is supposed to work? I would just use --no-build-vignettes and submit the tarball with the previously built/compressed PDF, but I'm trying to avoid a "Package has a VignetteBuilder field but no prebuilt vignette index" NOTE, which I assume is missing because I built without building vignettes ... ? As always, enlightenment is welcome. cheers Ben Bolker __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] is R CMD build --compact-vignettes working as expected?
I don't know the answer to your question, but you can debug the --compact-vignettes option as follows. debug(tools::compactPDF) tools:::.build_packages(c("--compact-vignettes", "pkgdir")) where "pkgdir" is the directory of the source of your package. Add extra options to the build as separate elements of the argument to .build(): this function is called after args have been parsed. When I do that, I see that it rejects the compaction, because none of mine benefit much from it: it wants at least a 10% and 10K reduction. But Ben's example met those criteria. When I trick it into accepting the compaction, it does put the compacted PDF into the tarball. Duncan Murdoch On 07/10/2020 6:03 p.m., John Fox wrote: Dear Ben, On 2020-10-07 5:26 p.m., Ben Bolker wrote: I hope so too. The (annoying) workaround is to compact the vignette yourself (using qpdf directly or using tools::compactPDF), then use no-build-vignettes. The problem there is whatever's supposed to happen with building vignette indices. The ugly workaround, I guess, is to build the tarball, compact the vignettes oneself, then *replace* them in the tarball. (Obviously I can automate that, but it seems as though it would be unnecessary if I knew what was going on ...) I've used both of these workarounds and agree that it would be nice to avoid them. After all, what is the --compact-vignettes argument for? Best, John cheers Ben On 10/7/20 4:10 PM, John Fox wrote: Dear Ben, I was hoping that someone would pick up on this problem, because I've experienced the same issue of --compact-vignettes apparently ignored, e.g., with the Rcmdr package under R 4.0.2 on both macOS and Windows. Best, John John Fox, Professor Emeritus McMaster University Hamilton, Ontario, Canada web: https://socialsciences.mcmaster.ca/jfox/ On 2020-10-05 1:09 p.m., Ben Bolker wrote: Am I confused, or doing something wrong, or ... ? I have qpdf installed, and am running R CMD build with --compact-vignettes, but the PDF in the tarball doesn't seem to be compressed despite the fact that the output messages say "compacting vignettes ..." $ R CMD build --compact-vignettes lme4 * checking for file ‘lme4/DESCRIPTION’ ... OK * preparing ‘lme4’: * checking DESCRIPTION meta-information ... OK * cleaning src * installing the package to process help pages * saving partial Rd database * creating vignettes ... OK Warning: ‘inst/doc’ files ‘lmerperf.html’, ‘lmer.pdf’, ‘PLSvGLS.pdf’, ‘Theory.pdf’ ignored as vignettes have been rebuilt. Run R CMD build with --no-build-vignettes to prevent rebuilding. * compacting vignettes and other PDF files * cleaning src * checking for LF line-endings in source and make files and shell scripts * checking for empty or unneeded directories * building ‘lme4_1.1-24.tar.gz’ The copy of lmer.pdf in the resulting tarball is 900K or so: $ tar ztvf lme4_1.1-24.tar.gz lme4/inst/doc/lmer.pdf -rw-r--r-- bolker/bolker 907022 2020-10-05 12:59 lme4/inst/doc/lmer.pdf The previously built (and manually compacted) version of lmer.pdf in the tarball is 500K: $ ls -l lme4/inst/doc/lmer.pdf -rw-r--r-- 1 bolker bolker 495199 Oct 3 22:15 lme4/inst/doc/lmer.pdf Is 'R CMD build' confused by the presence of a pre-built PDF in the inst/doc directory? Or am I somehow mistaken about how this is supposed to work? 
I would just use --no-build-vignettes and submit the tarball with the previously built/compressed PDF, but I'm trying to avoid a "Package has a VignetteBuilder field but no prebuilt vignette index" NOTE, which I assume is missing because I built without building vignettes ... ? As always, enlightenment is welcome. cheers Ben Bolker __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] A note from CRAN package check
On 28/09/2020 11:10 a.m., Yicheng Yang wrote: Dear all, I notice that there is a note from CRAN package check results for our package as follows: Version: 1.4.1 Check: installed package size Result: NOTE installed size is 5.3Mb sub-directories of 1Mb or more: libs 5.2Mb The link to our package is here: https://cran.r-project.org/web/packages/FHDI/index.html So I know the problem is that our package exceeds the maximum size of 5MB. But I don't understand where we can trim our package to meet this requirement. We cannot modify the source code (the C++ and R files) because we may lose some functionality of the package. Do we have to trim R documents? Any suggestions? Reducing documents isn't going to be enough: most of the problem is in libs. You should investigate whether the libs you need are provided by other packages, or (if you wrote them yourself) whether it's really true that you can't shrink them. You could consider splitting your package into two packages, if there are several libs. You could try to explain to CRAN why you need to use all that space. There are packages on CRAN that exceed the 5Mb limit. Finally, you could choose to distribute your package in some other way besides CRAN. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
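To see exactly where the installed megabytes go before deciding what to cut, a small sketch run against the installed copy of the package:

pkg_dir <- find.package("FHDI")
dirs <- list.dirs(pkg_dir, recursive = FALSE)
sizes <- vapply(dirs, function(d) {
  files <- list.files(d, recursive = TRUE, full.names = TRUE)
  sum(file.info(files)$size)
}, numeric(1))
round(sort(sizes, decreasing = TRUE) / 2^20, 1)   # per-subdirectory sizes in Mb, largest first

In this case it will just confirm that libs dominates, but the same check is useful when deciding whether data, doc or help could be slimmed instead.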
Re: [R-pkg-devel] CRAN Windows failure due to old pandoc ?
On 26/09/2020 12:54 p.m., Dirk Eddelbuettel wrote: On 26 September 2020 at 11:50, Duncan Murdoch wrote: | On 26/09/2020 9:14 a.m., Dirk Eddelbuettel wrote: | > | > I had a submission fail and bomb with this error on Windows: | > | >Flavor: r-devel-windows-ix86+x86_64 | >Check: re-building of vignette outputs, Result: WARNING | > Error(s) in re-building vignettes: | > --- re-building 'vignettefilename.Rmd' using rmarkdown | > pandoc.exe: unrecognized option `--lua-filter' | > unrecognized option `--lua-filter' | > unrecognized option `--lua-filter' | > Try pandoc.exe --help for more information. | > Error: processing vignette 'vignettefilename.Rmd' failed with diagnostics: | > pandoc document conversion failed with error 2 | > --- failed re-building 'vignettefilename.Rmd' | > | > SUMMARY: processing the following file failed: | >'vignettefilename.Rmd' | > | > Error: Vignette re-building failed. | > Execution halted | > | > This looks like a host configuration problem: | > | >edd@rob:~$ pandoc --help | grep lua | > -L SCRIPTPATH --lua-filter=SCRIPTPATH | >edd@rob:~$ pandoc --version | head -1 | >pandoc 2.9.2.1 | >edd@rob:~$ | > | > Can we expect CRAN to update its pandoc binary? Or will we have to 'for now' | > rely on 'reply-all', explaining to CRAN that the failure is from their end? | > | > As they in the press, I had reached out to CRAN but they 'have not yet | > responded to requests for comments' as we know they're busy. Anybody seen | > this error though? | | I haven't seen that one, but I regularly get errors in rgl and tables | because of missing or insufficient pandoc on some systems. I added | lines like | |SystemRequirements: pandoc (>= 1.12.3) for vignettes | | to DESCRIPTION to state the Pandoc version, added rmarkdown to the | Suggests list, and added code like this to the start of HTML vignettes: | | ```{r echo = FALSE} | if (!requireNamespace("rmarkdown") || | !rmarkdown::pandoc_available("1.12.3")) { |warning("This vignette requires pandoc version 1.12.3; code will not | run in older versions.") |knitr::opts_chunk$set(eval = FALSE) | } | ``` Good point! Brooke and I actually recommend exactly that---conditional vignette builds---in our R Journal paper on drat for data repos (via Suggests). But (because of the overall fragility of these pipelines as well as preference for generally lighter setups) my vignettes actually tend to not even run code. I mostly just use three backticks followed by the language for which I desire highlighting from pandoc, i.e. ```r or ```c++. Hmmm, that's strange. From what I can see you only get the --lua-filter if pandoc 2.0 is available: https://github.com/rstudio/rmarkdown/blob/66d27e09befd5f0579f0f4e27c4b9325284b9b15/R/pandoc.R#L719 I think this is the current rmarkdown version. Duncan Murdoch Maybe we would need to pass the minimum version check into rmarkdown as an option so that rmarkdown knows not to ask for `--lua-filter` on setups where rmarkdown knows pandoc is too old? Or maybe make the vignette builder barf? | This makes the test happy, though it also makes the vignette pretty | useless on systems that don't meet the stated requirements. Since | SystemRequirements is free-form, I can see why CRAN doesn't do automatic | interpretation of it, but it would be nice if they did. Alas, the free-form requirement has been a constraint for a long time indeed. Dirk __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] CRAN Windows failure due to old pandoc ?
On 26/09/2020 9:14 a.m., Dirk Eddelbuettel wrote: I had a submission fail and bomb with this error on Windows: Flavor: r-devel-windows-ix86+x86_64 Check: re-building of vignette outputs, Result: WARNING Error(s) in re-building vignettes: --- re-building 'vignettefilename.Rmd' using rmarkdown pandoc.exe: unrecognized option `--lua-filter' unrecognized option `--lua-filter' unrecognized option `--lua-filter' Try pandoc.exe --help for more information. Error: processing vignette 'vignettefilename.Rmd' failed with diagnostics: pandoc document conversion failed with error 2 --- failed re-building 'vignettefilename.Rmd' SUMMARY: processing the following file failed: 'vignettefilename.Rmd' Error: Vignette re-building failed. Execution halted This looks like a host configuration problem: edd@rob:~$ pandoc --help | grep lua -L SCRIPTPATH --lua-filter=SCRIPTPATH edd@rob:~$ pandoc --version | head -1 pandoc 2.9.2.1 edd@rob:~$ Can we expect CRAN to update its pandoc binary? Or will we have to 'for now' rely on 'reply-all', explaining to CRAN that the failure is from their end? As they in the press, I had reached out to CRAN but they 'have not yet responded to requests for comments' as we know they're busy. Anybody seen this error though? I haven't seen that one, but I regularly get errors in rgl and tables because of missing or insufficient pandoc on some systems. I added lines like SystemRequirements: pandoc (>= 1.12.3) for vignettes to DESCRIPTION to state the Pandoc version, added rmarkdown to the Suggests list, and added code like this to the start of HTML vignettes: ```{r echo = FALSE} if (!requireNamespace("rmarkdown") || !rmarkdown::pandoc_available("1.12.3")) { warning("This vignette requires pandoc version 1.12.3; code will not run in older versions.") knitr::opts_chunk$set(eval = FALSE) } ``` This makes the test happy, though it also makes the vignette pretty useless on systems that don't meet the stated requirements. Since SystemRequirements is free-form, I can see why CRAN doesn't do automatic interpretation of it, but it would be nice if they did. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Dependency needs to be loaded manually even its specified in the package
On 18/09/2020 12:52 p.m., Dirk Eddelbuettel wrote: On 18 September 2020 at 18:38, Nuria Perez-Zanon wrote: | I am maintaining a package call CSTools which is aimed for | post-processing climate simulations. [...] | library(CSTools) | library(qmap) You never use library() in a package. Rather, you declare dependency relationsships via DESCRIPTION (and likely NAMESPACE). See "Writing R Extensions" for all the details. I think you misread the post: this was an example of code a user would run, not code from the package. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Dependency needs to be loaded manually even its specified in the package
On 18/09/2020 12:38 p.m., Nuria Perez-Zanon wrote: Dear all, I am maintaining a package called CSTools which is aimed at post-processing climate simulations. The package is already published on CRAN with all dependencies correctly stated in DESCRIPTION, NAMESPACE and roxygen2 headers. However, when using one specific function which depends on the 'qmap' package, I have to load both packages by executing: library(CSTools) library(qmap) In case I don't load the second library, I get the error Error in doQmap(x = sample_cor, fobj = adjust, ...) : doQmap not defined for class(fobj) ==fitQmapQUANT Does anyone have an idea why a dependency needs to be loaded manually? I provide code below if someone wants to test it. Thanks in advance, Núria P.S.: Here is the code: library(CSTools) exp <- lonlat_data$exp obs <- lonlat_data$obs res <- CST_QuantileMapping(exp, obs) That's a design flaw in the doQmap function. It looks like this: function (x, fobj, ...) { cc <- class(fobj) ffun <- substring(cc, 4, nchar(cc)) ffun <- paste("do", ffun, sep = "") test <- sapply(ffun, exists, mode = "function") if (all(test)) { ffun <- match.fun(ffun) } else { stop("doQmap not defined for class(fobj) ==", class(fobj)) } ffun(x, fobj, ...) } There are at least a couple of errors there: - It appears to assume class(fobj) is a single-element character string. This wouldn't have caused your problem, but it will probably cause problems sometime. - It tries to do something like S3 method dispatch without using S3, by looking up "doQmapQUANT" in the line producing "test", but not saying where to look for it. You could probably fix this by adding the envir argument to exists() in that call, e.g. test <- sapply(ffun, exists, mode = "function", envir = parent.env(environment())) but it would be better not to try to invent a new object system. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
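The cleaner alternative hinted at above — ordinary S3 dispatch on fobj instead of a hand-rolled exists()/match.fun() lookup — could look roughly like this. This is only a sketch: the class name comes from the error message in the post, and the method bodies are placeholders, not qmap's actual code.

  doQmap <- function(x, fobj, ...) UseMethod("doQmap", fobj)

  doQmap.fitQmapQUANT <- function(x, fobj, ...) {
    # the real method would apply the fitted quantile mapping to x
    x
  }

  doQmap.default <- function(x, fobj, ...) {
    stop("doQmap not defined for class(fobj) == ",
         paste(class(fobj), collapse = "/"))
  }

With the methods registered via S3method() in the package NAMESPACE, dispatch works even when the package is only loaded (as a dependency of CSTools) rather than attached with library().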
Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster
On 14/09/2020 11:56 a.m., Wang, Zhu wrote: Yes, mypkg is different from pkg, and I am the maintainer of mypkg, but not pkg. Otherwise, things can be easier. Sorry for not clear enough. Then you should not call foo, for the reasons I stated. Alternatives are to contact the maintainer of pkg and explain why you would like them to export foo, or (if the license permits it) just copy the source of foo into your own package, giving appropriate credit to the original author. Duncan Murdoch Thanks to Duncan for a practical solution. Best, Zhu -Original Message- From: Duncan Murdoch Sent: Monday, September 14, 2020 10:49 AM To: Wang, Zhu ; David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 14/09/2020 10:30 a.m., Wang, Zhu wrote: In mypkg, I want to call a function foo from pkg, and foo is not exported. I thought I should use pkg:: or pkg:::, but R CMD check provided a warning. I'm assuming that mypkg is not the same as pkg; Jeff Newmiller's answer assumes the opposite. In that case where they are different, there is only one circumstance where you should be calling foo, and you'll have to do it using foo:::pkg. That circumstance is that you are the maintainer of both packages. You should explain this in your submission message, and ask CRAN to ignore the warning if there is one. The reason for this is the following. If someone else is maintaining pkg, then they are free to change the behaviour of foo without any consideration for you, because as an internal function, they have no contract with you to maintain its behaviour. On the other hand, if you maintain both packages, then you should be ready to modify mypkg as soon as you modify pkg:::foo. Duncan Murdoch Thanks, Zhu You don't need either pkg:: or pkg::: if you are calling the function from within the package. You may need one of those if the call is coming from a user script. -Original Message- From: Duncan Murdoch Sent: Monday, September 14, 2020 7:17 AM To: Wang, Zhu ; David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 13/09/2020 8:47 p.m., Wang, Zhu wrote: Apologize if I hijack this thread, but the use of ::: is something I was puzzled. I tried Duncan's solution in my R package mypkg, something like: pkg::callInternal("foo", args) R CMD check mypkg * checking dependencies in R code ... WARNING '::' or ':::' import not declared from: ‘pkg' I probably missed something here. You don't need either pkg:: or pkg::: if you are calling the function from within the package. You may need one of those if the call is coming from a user script. If you use pkg:: from mypkg, you need to declare that mypkg Imports pkg. (This is a line in its DESCRIPTION file.) I think that's what the WARNING is telling you. Duncan Murdoch Thanks, Zhu -Original Message- From: R-package-devel On Behalf Of Duncan Murdoch Sent: Sunday, September 13, 2020 3:20 PM To: David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 13/09/2020 3:51 p.m., David Kepplinger wrote: Dear list members, I submitted an update for my package and got automatically rejected by the incoming checks (as expected from my own checks) for using `:::` calls to access the package's namespace. "There are ::: calls to the package's namespace in its code. 
A package *almost* never needs to use ::: for its own objects:…" (emphasis mine) This was a conscious decision on my part as the package runs code on a user-supplied parallel cluster and I consider cluster-exporting the required functions a no-go as it would potentially overwrite objects in the clusters R sessions. The package code does not own the cluster and hence the R sessions. Therefore overwriting objects could potentially lead to unintended behaviour which is opaque to the user and difficult to debug. Another solution to circumvent the R CMD check note is to export the functions to the public namespace but mark them as internal. This was also suggested in another thread on this mailing list (c.f. "Etiquette for package submissions that do not automatically pass checks?"). I do not agree with this work-around as the methods are indeed internal and should never be used by users. Exporting truly internal functions for the sake of satisfying R CMD check is a bad argument, in particular if there is a clean, well-documented, solution by using `:::` Who is calling this function: package code or user code? I assume it's a bit of a mix: your package writes a script that calls the function when it runs in user space. (It would help if you gave an explicit example of when you need to use this technique.) If my assumption is correct, there are other simple workarounds besides exporting the functions. Instead of
Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster
On 14/09/2020 10:30 a.m., Wang, Zhu wrote: In mypkg, I want to call a function foo from pkg, and foo is not exported. I thought I should use pkg:: or pkg:::, but R CMD check provided a warning. I'm assuming that mypkg is not the same as pkg; Jeff Newmiller's answer assumes the opposite. In that case where they are different, there is only one circumstance where you should be calling foo, and you'll have to do it using foo:::pkg. That circumstance is that you are the maintainer of both packages. You should explain this in your submission message, and ask CRAN to ignore the warning if there is one. The reason for this is the following. If someone else is maintaining pkg, then they are free to change the behaviour of foo without any consideration for you, because as an internal function, they have no contract with you to maintain its behaviour. On the other hand, if you maintain both packages, then you should be ready to modify mypkg as soon as you modify pkg:::foo. Duncan Murdoch Thanks, Zhu You don't need either pkg:: or pkg::: if you are calling the function from within the package. You may need one of those if the call is coming from a user script. -Original Message- From: Duncan Murdoch Sent: Monday, September 14, 2020 7:17 AM To: Wang, Zhu ; David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 13/09/2020 8:47 p.m., Wang, Zhu wrote: Apologize if I hijack this thread, but the use of ::: is something I was puzzled. I tried Duncan's solution in my R package mypkg, something like: pkg::callInternal("foo", args) R CMD check mypkg * checking dependencies in R code ... WARNING '::' or ':::' import not declared from: ‘pkg' I probably missed something here. You don't need either pkg:: or pkg::: if you are calling the function from within the package. You may need one of those if the call is coming from a user script. If you use pkg:: from mypkg, you need to declare that mypkg Imports pkg. (This is a line in its DESCRIPTION file.) I think that's what the WARNING is telling you. Duncan Murdoch Thanks, Zhu -Original Message- From: R-package-devel On Behalf Of Duncan Murdoch Sent: Sunday, September 13, 2020 3:20 PM To: David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 13/09/2020 3:51 p.m., David Kepplinger wrote: Dear list members, I submitted an update for my package and got automatically rejected by the incoming checks (as expected from my own checks) for using `:::` calls to access the package's namespace. "There are ::: calls to the package's namespace in its code. A package *almost* never needs to use ::: for its own objects:…" (emphasis mine) This was a conscious decision on my part as the package runs code on a user-supplied parallel cluster and I consider cluster-exporting the required functions a no-go as it would potentially overwrite objects in the clusters R sessions. The package code does not own the cluster and hence the R sessions. Therefore overwriting objects could potentially lead to unintended behaviour which is opaque to the user and difficult to debug. Another solution to circumvent the R CMD check note is to export the functions to the public namespace but mark them as internal. This was also suggested in another thread on this mailing list (c.f. "Etiquette for package submissions that do not automatically pass checks?"). I do not agree with this work-around as the methods are indeed internal and should never be used by users. 
Exporting truly internal functions for the sake of satisfying R CMD check is a bad argument, in particular if there is a clean, well-documented, solution by using `:::` Who is calling this function: package code or user code? I assume it's a bit of a mix: your package writes a script that calls the function when it runs in user space. (It would help if you gave an explicit example of when you need to use this technique.) If my assumption is correct, there are other simple workarounds besides exporting the functions. Instead of putting pkg:::foo(args) into your script, put pkg::callInternal("foo", args) where pkg::callInternal is an exported function that can look up unexported functions in the namespace. You may argue that you prefer pkg:::foo for some reason: to which I'd respond that you are being rude to the CRAN volunteers. I've offered two options (one in the previous thread, a different one here), and there was a third one in that thread offered by Ivan Krylov. Surely one of these is good enough for your needs, and you shouldn't force CRAN to handle you specially. Duncan I argue `:::` is the only clean solution to this problem and no dirty work-arounds are necessary. This is a prime example of where `:::` is actually useful and needed inside a package. If
Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster
On 13/09/2020 8:47 p.m., Wang, Zhu wrote: Apologize if I hijack this thread, but the use of ::: is something I was puzzled. I tried Duncan's solution in my R package mypkg, something like: pkg::callInternal("foo", args) R CMD check mypkg * checking dependencies in R code ... WARNING '::' or ':::' import not declared from: ‘pkg' I probably missed something here. You don't need either pkg:: or pkg::: if you are calling the function from within the package. You may need one of those if the call is coming from a user script. If you use pkg:: from mypkg, you need to declare that mypkg Imports pkg. (This is a line in its DESCRIPTION file.) I think that's what the WARNING is telling you. Duncan Murdoch Thanks, Zhu -Original Message- From: R-package-devel On Behalf Of Duncan Murdoch Sent: Sunday, September 13, 2020 3:20 PM To: David Kepplinger ; R Package Devel Subject: Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster On 13/09/2020 3:51 p.m., David Kepplinger wrote: Dear list members, I submitted an update for my package and got automatically rejected by the incoming checks (as expected from my own checks) for using `:::` calls to access the package's namespace. "There are ::: calls to the package's namespace in its code. A package *almost* never needs to use ::: for its own objects:…" (emphasis mine) This was a conscious decision on my part as the package runs code on a user-supplied parallel cluster and I consider cluster-exporting the required functions a no-go as it would potentially overwrite objects in the clusters R sessions. The package code does not own the cluster and hence the R sessions. Therefore overwriting objects could potentially lead to unintended behaviour which is opaque to the user and difficult to debug. Another solution to circumvent the R CMD check note is to export the functions to the public namespace but mark them as internal. This was also suggested in another thread on this mailing list (c.f. "Etiquette for package submissions that do not automatically pass checks?"). I do not agree with this work-around as the methods are indeed internal and should never be used by users. Exporting truly internal functions for the sake of satisfying R CMD check is a bad argument, in particular if there is a clean, well-documented, solution by using `:::` Who is calling this function: package code or user code? I assume it's a bit of a mix: your package writes a script that calls the function when it runs in user space. (It would help if you gave an explicit example of when you need to use this technique.) If my assumption is correct, there are other simple workarounds besides exporting the functions. Instead of putting pkg:::foo(args) into your script, put pkg::callInternal("foo", args) where pkg::callInternal is an exported function that can look up unexported functions in the namespace. You may argue that you prefer pkg:::foo for some reason: to which I'd respond that you are being rude to the CRAN volunteers. I've offered two options (one in the previous thread, a different one here), and there was a third one in that thread offered by Ivan Krylov. Surely one of these is good enough for your needs, and you shouldn't force CRAN to handle you specially. Duncan I argue `:::` is the only clean solution to this problem and no dirty work-arounds are necessary. This is a prime example of where `:::` is actually useful and needed inside a package. 
If the R community disagrees, I think R CMD check should at least emit a WARNING instead of a NOTE and elaborate on the problem and accepted work-arounds in "Writing R extensions". Or keep emitting a NOTE but listing those nebulous reasons where `:::` would be tolerated inside a package. Having more transparent criteria for submitting to CRAN would be really helpful to the entire R community and probably also reduce the traffic on this mailing list. Best, David [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
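For the WARNING quoted above ('::' or ':::' import not declared from 'pkg'), the missing piece is a declaration in mypkg itself; a minimal sketch with the placeholder names used in this thread:

  In mypkg/DESCRIPTION:
    Imports: pkg

  and, if exported functions are to be called without the pkg:: prefix, in mypkg/NAMESPACE:
    importFrom(pkg, someExportedFunction)

(someExportedFunction is a made-up name; ::: calls to unexported objects in another maintainer's package remain problematic for the reasons given above, whatever is declared.)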
Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster
On 13/09/2020 3:51 p.m., David Kepplinger wrote: Dear list members, I submitted an update for my package and got automatically rejected by the incoming checks (as expected from my own checks) for using `:::` calls to access the package's namespace. "There are ::: calls to the package's namespace in its code. A package *almost* never needs to use ::: for its own objects:…" (emphasis mine) This was a conscious decision on my part as the package runs code on a user-supplied parallel cluster and I consider cluster-exporting the required functions a no-go as it would potentially overwrite objects in the clusters R sessions. The package code does not own the cluster and hence the R sessions. Therefore overwriting objects could potentially lead to unintended behaviour which is opaque to the user and difficult to debug. Another solution to circumvent the R CMD check note is to export the functions to the public namespace but mark them as internal. This was also suggested in another thread on this mailing list (c.f. "Etiquette for package submissions that do not automatically pass checks?"). I do not agree with this work-around as the methods are indeed internal and should never be used by users. Exporting truly internal functions for the sake of satisfying R CMD check is a bad argument, in particular if there is a clean, well-documented, solution by using `:::` Who is calling this function: package code or user code? I assume it's a bit of a mix: your package writes a script that calls the function when it runs in user space. (It would help if you gave an explicit example of when you need to use this technique.) If my assumption is correct, there are other simple workarounds besides exporting the functions. Instead of putting pkg:::foo(args) into your script, put pkg::callInternal("foo", args) where pkg::callInternal is an exported function that can look up unexported functions in the namespace. You may argue that you prefer pkg:::foo for some reason: to which I'd respond that you are being rude to the CRAN volunteers. I've offered two options (one in the previous thread, a different one here), and there was a third one in that thread offered by Ivan Krylov. Surely one of these is good enough for your needs, and you shouldn't force CRAN to handle you specially. Duncan I argue `:::` is the only clean solution to this problem and no dirty work-arounds are necessary. This is a prime example of where `:::` is actually useful and needed inside a package. If the R community disagrees, I think R CMD check should at least emit a WARNING instead of a NOTE and elaborate on the problem and accepted work-arounds in "Writing R extensions". Or keep emitting a NOTE but listing those nebulous reasons where `:::` would be tolerated inside a package. Having more transparent criteria for submitting to CRAN would be really helpful to the entire R community and probably also reduce the traffic on this mailing list. Best, David [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
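A minimal sketch of what the suggested pkg::callInternal() could look like. The name and the idea come from the message above; this is not an existing API, so each package would define and export its own version:

  # Exported helper: look up an unexported function in the package's own
  # namespace and call it, so scripts run on cluster nodes can write
  #   pkg::callInternal("foo", args)
  # instead of pkg:::foo(args).
  callInternal <- function(name, ...) {
    f <- get(name, envir = asNamespace("pkg"), mode = "function")
    f(...)
  }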
Re: [R-pkg-devel] fixing problems in submitted R package
On 04/09/2020 12:41 p.m., Vitor Hugo Moreau wrote: Hello, all. I have submitted a R package and it returns some NOTES from the automatic checks. It is the first time I create and deposit a package so I would like to know what to do. My package was built with Rstudio and it gave some of these NOTES too, bby I don't know how to solve them. These are the NOTES: 1. The build time stamp is missing. This is because you didn't use R CMD build to build your package. 2. * checking CRAN incoming feasibility ... NOTE Maintainer: ‘The package maintainer ’ "I don't even know what is wrong here, this is my professional email. Should I just ignore it?" If the text "The package maintainer" is in your DESCRIPTION file, you should replace it with your name. 3. * checking DESCRIPTION meta-information ... NOTE Checking should be performed on sources prepared by ‘R CMD build’. Same as 1. 4. checking R code for possible problems ... [4s/4s] NOTE likelihood: no visible binding for global variable ‘xi’ likelihood: no visible binding for global variable ‘yi’ weibull4.fit: no visible binding for '<<-' assignment to ‘xi’ weibull4.fit: no visible binding for '<<-' assignment to ‘yi’ weibull4.fit: no visible binding for global variable ‘yi’ weibull4.fit: no visible binding for global variable ‘xi’ Undefined global functions or variables: xi yi "There are two variable defined globally (xi and yi) and they are used for some - not all - functions. Is this really a problem or can I ignore it too?" You can't ignore it. You need to make sure xi and yi are defined in the source of your package, and they shouldn't be the top level if you are modifying them. It's possible but tricky to have global in packages, and it would be best to just avoid the problem by not having them. 5. * checking Rd files ... NOTE prepare_Rd: likelihood.Rd:15-16: Dropping empty section \details prepare_Rd: likelihood.Rd:25-26: Dropping empty section \note prepare_Rd: likelihood.Rd:28-29: Dropping empty section \seealso prepare_Rd: posterior.Rd:17-18: Dropping empty section \details prepare_Rd: posterior.Rd:28-29: Dropping empty section \note prepare_Rd: posterior.Rd:31-32: Dropping empty section \seealso prepare_Rd: prior.Rd:17-18: Dropping empty section \details prepare_Rd: prior.Rd:28-29: Dropping empty section \note prepare_Rd: prior.Rd:31-32: Dropping empty section \seealso prepare_Rd: proposalfunction.Rd:15-16: Dropping empty section \details prepare_Rd: proposalfunction.Rd:26-27: Dropping empty section \note prepare_Rd: proposalfunction.Rd:29-30: Dropping empty section \seealso prepare_Rd: run_metropolis_MCMC.Rd:20-21: Dropping empty section \details prepare_Rd: run_metropolis_MCMC.Rd:31-32: Dropping empty section \note prepare_Rd: run_metropolis_MCMC.Rd:34-35: Dropping empty section \seealso prepare_Rd: weibull4.Rd:37-38: Dropping empty section \seealso prepare_Rd: weibull4.fit.Rd:57-58: Dropping empty section \seealso "Is it a real problem to drop some empty sections in some MAN files. These functions will not be direct used by users" If they aren't functions that the users will see, then they don't need to be exported and documented. Duncan Murdoch Thank you a lot and sorry for so basic questions. Prof. Vitor Hugo Moreau, Ph.D. Departamento de Biotecnologia Universidade Federal da Bahia - UFBA [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
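For NOTE 4, one common way to avoid '<<-' and true globals is to keep state in an environment owned by the package; a rough sketch (xi and yi are the names from the NOTE, everything else is illustrative):

  .weibull4_state <- new.env(parent = emptyenv())

  set_xy <- function(x, y) {
    assign("xi", x, envir = .weibull4_state)
    assign("yi", y, envir = .weibull4_state)
  }

  get_xi <- function() get("xi", envir = .weibull4_state)
  get_yi <- function() get("yi", envir = .weibull4_state)

If likelihood() and weibull4.fit() read and write through accessors like these, both the '<<-' assignments and the 'no visible binding' notes go away, because xi and yi are then looked up in a known environment.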
Re: [R-pkg-devel] visible binding for '<<-' assignment
On 03/09/2020 4:31 p.m., Dan Zigmond wrote: Hi, all. I am developing a package that includes some global variables. Because these are non-ASCII, I have escaped them. But then because these are difficult to read, I want to provide an easy way for users to unescape all of them up front. Thus I have code like to create and save the data in global variables in one file: pali_vowels <- c("a", "\u0101", "i", "\u012b", "u", "\u016b", "e", "o") pali_consonants <- c("k", "kh", "g", "gh", "\u1e45", "c", "ch", "j", "jh", "\u00f1", "\u1e6d", "\u1e6dh", "\u1e0d", "\u1e0dh", "\u1e47", "t", "th", "d", "dh", "n", "p", "ph", "b", "bh", "m", "y", "r", "l", "v", "s", "h", "\u1e37", "\u1e43") pali_alphabet <-c(pali_vowels, pali_consonants) use_data(pali_alphabet, overwrite = TRUE) and then I try to export a function like this in another file: pali_string_fix <- function() { pali_alphabet <<- stringi::stri_unescape_unicode(pali_alphabet) # Several more of these... } The idea is that users can run pali_string_fix() once when they load the package and then they won't need to deal with all the Unicode escape sequences after that. You shouldn't be doing that. Write a function that returns those results, and tell the user that if they store them in a global variable named "string_fixes" (or whatever), then your function will use their values instead of your own built in ones. You should never write to the global environment, but you can read from it. Duncan Murdoch However, this is getting rejected by the CRAN checks with the message: * checking R code for possible problems ... [4s] NOTE pali_string_fix: no visible binding for '<<-' assignment to 'pali_alphabet' I'm guessing this is because the data and the function are defined in different files, so even though those globals are defined by my package, that isn't obvious when the check is run on this code. Does anyone have advice for how to fix this? Dan . - Dan Zigmond d...@shmonk.com [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
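A sketch of the pattern described above: return the converted values, and only read (never write) an optional user override in the global environment. The override name is made up for illustration:

  pali_alphabet_fixed <- function() {
    if (exists("my_pali_alphabet", envir = globalenv())) {
      # read a user-supplied value, but never assign into the global environment
      get("my_pali_alphabet", envir = globalenv())
    } else {
      stringi::stri_unescape_unicode(pali_alphabet)
    }
  }

  # the user then does, once per session:
  # alphabet <- pali_alphabet_fixed()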
Re: [R-pkg-devel] Modeltime 0.1.0 Submission Failing - Not Sure What To Do
On 01/09/2020 11:52 a.m., Max Turgeon wrote: Hi Matt, If you scroll down just below the Windows and Debian OKs, you'll see these lines: Last released version's CRAN status: ERROR: 1, WARN: 3, NOTE: 1, OK: 7 See: <https://CRAN.R-project.org/web/checks/check_results_modeltime.html <https://cran.r-project.org/web/checks/check_results_modeltime.html>> It seems one of your vignettes is causing issues when trying to use phantomJS. That message is about the version that's on CRAN, not the new submission. I think Matt should just follow this directions: "If you are fairly certain the rejection is a false positive, please reply-all to this message and explain." Having OK results rejected sure seems like a false positive! Duncan Murdoch Best, Max Turgeon Assistant Professor Department of Statistics Department of Computer Science University of Manitoba maxturgeon.ca<http://maxturgeon.ca> From: R-package-devel on behalf of Matt Dancho Sent: September 1, 2020 10:26:06 AM To: r-package-devel@r-project.org Subject: [R-pkg-devel] Modeltime 0.1.0 Submission Failing - Not Sure What To Do Caution: This message was sent from outside the University of Manitoba. Hi, I'm having a difficult time with the pre-tests for Modeltime 0.1.0, a new version that I'm trying to get onto CRAN. The pre-tests indicate Windows & Debian are OK. Any help would be much appreciated. Thanks, Matt *CRAN Pretest Results* Dear maintainer, package modeltime_0.1.0.tar.gz does not pass the incoming checks automatically, please see the following pre-tests: Windows: < https://win-builder.r-project.org/incoming_pretest/modeltime_0.1.0_20200901_161424/Windows/00check.log Status: OK Debian: < https://win-builder.r-project.org/incoming_pretest/modeltime_0.1.0_20200901_161424/Debian/00check.log Status: OK Last released version's CRAN status: ERROR: 1, WARN: 3, NOTE: 1, OK: 7 See: <https://CRAN.R-project.org/web/checks/check_results_modeltime.html <https://cran.r-project.org/web/checks/check_results_modeltime.html>> CRAN Web: <https://cran.r-project.org/package=modeltime> Please fix all problems and resubmit a fixed version via the webform. If you are not sure how to fix the problems shown, please ask for help on the R-package-devel mailing list: <https://stat.ethz.ch/mailman/listinfo/r-package-devel> If you are fairly certain the rejection is a false positive, please reply-all to this message and explain. More details are given in the directory: < https://win-builder.r-project.org/incoming_pretest/modeltime_0.1.0_20200901_161424/ The files will be removed after roughly 7 days. No strong reverse dependencies to be checked. Best regards, CRAN teams' auto-check service Flavor: r-devel-windows-ix86+x86_64 Check: CRAN incoming feasibility, Result: NA Maintainer: 'Matt Dancho ' Flavor: r-devel-windows-ix86+x86_64 Check: Overall checktime, Result: NOTE Overall checktime 16 min > 10 min Flavor: r-devel-linux-x86_64-debian-gcc Check: CRAN incoming feasibility, Result: Note_to_CRAN_maintainers Maintainer: 'Matt Dancho ' *Matt Dancho | Founder, CEO | Business Science* p: 570-419-4337 <(570)%20419-4337> | www.business-science.io<http://www.business-science.io> Want to learn Data Science for Business? Get started today with Business Science <https://university.business-science.io/>. 
<http://www.business-science.io/> <https://twitter.com/bizScienc> [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] R CMD check "unable to verify current time"
R tries to get the time from http://worldtimeapi.org/api/timezone/UTC or http://worldclockapi.com/api/json/utc/now The first one doesn't accept UTC as a timezone; it appears to want etc/UTC instead. The second one is offline. Duncan Murdoch If both of those fail, you'll get the message you saw. On 27/08/2020 1:23 p.m., John Fox wrote: Dear r-package-devel list members, I got the following note when checking two different packages today --as-cran, both under R 4.0.2 and under R-devel, on my Windows 10 and macOS Catalina systems, and on several platforms on rhub: * checking for future file timestamps ... NOTE unable to verify current time I'm writing to inquire about the note because no one else has mentioned this problem recently, in case it's produced by something that I'm doing. There is a discussion of a similar problem from 2019 at <https://stat.ethz.ch/pipermail/r-package-devel/2019q1/003577.html>. Both packages that I was checking are close to CRAN releases and so I'd like to know whether I can disregard the note. Any help would be appreciated. John __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
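For local runs of R CMD check where this NOTE is a nuisance, the check can usually be switched off via an environment variable (to the best of my knowledge this is the variable the check consults; set it before the check starts):

  Sys.setenv("_R_CHECK_SYSTEM_CLOCK_" = "FALSE")
  # then run R CMD check / devtools::check() from that session

This of course does not change what CRAN's own machines do.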
Re: [R-pkg-devel] Etiquette for package submissions that do not automatically pass checks?
On 14/08/2020 3:08 p.m., Cesko Voeten wrote: A while ago, I submitted an update to my package 'buildmer' that does not pass R CMD check. This is deliberate. The package contains functionality to run on cluster nodes that were set up by the user and needs to access its own internal functions from there. In previous versions of the package, I had maintained a list of those functions and clusterExport()ed them, but that had the side effect of overwriting any same-named user objects on the user-provided cluster nodes, which I thought was poor form. The update therefore accesses these functions using ':::', which triggers a check warning. I thought the etiquette was to explain this in the 'Comments' box when submitting, but this gave me the same automated message that the package does not pass checks and that I should fix it or reply-all and explain. This led me to believe that I should not have used the 'Comments' box for this purpose, hence I resubmitted the package leaving the comments box empty, and I replied-all to the subsequent e-mail I got with an explanation similar to the above. It seems to me that what you should have done is "reply-all and explain", as the automated message said. It has now been a while since I sent that e-mail (ten days), and I have yet to hear back. I was wondering if the message had gotten lost, if they simply haven't gotten around to it yet (I have no idea how much mail they receive on a daily basis, but I'd think it's a lot more than I do), or if I should have handled this differently. Only CRAN can answer the first two questions, but before I bother them: was this the correct procedure, or should I simply have done something differently? You can see the state of your submission using the foghorn package. cran_incoming("buildmer") currently shows your package is in the "archive", which means "package rejected: it does not pass the checks cleanly and the problems are unlikely to be false positives". I only see version 1.7 there, which may indicate that you resubmitted exactly the same package (down to the version number). As the instructions at https://cran.r-project.org/web/packages/policies.html#Re_002dsubmission say, "Increasing the version number at each submission reduces confusion so is preferred even when a previous submission was not accepted." What I'd suggest now is that you do nothing more for a day or two, because CRAN members who aren't on holiday might read and respond to your message. If you don't hear anything, then I'd start over again, with a new version number, and an explanation in the comments, and likely a followup reply-all. Alternatively, you could export those troublesome functions from your package but document them as for internal use only. Renaming them with a name starting with "." will make them harder for users to stumble upon, but you can still access them using buildmer::.something, you shouldn't need clusterExport(). Then you will meet the technical requirement and not need any explanation. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
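The "export but keep internal" alternative could look roughly like this; the function name is hypothetical and only illustrates the shape of the change:

  # R/workers.R: exported, but documented with @keywords internal so users
  # are warned off; cluster nodes can then call it without clusterExport():
  .buildmer_fit_one <- function(spec) {
    # real fitting code would go here
    spec
  }

  # NAMESPACE:            export(.buildmer_fit_one)
  # on the cluster side:  buildmer::.buildmer_fit_one(spec)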
Re: [R-pkg-devel] Package Anchored Links with R-Dev
On 09/08/2020 3:15 p.m., Ben Bolker wrote: On 8/9/20 3:08 PM, Duncan Murdoch wrote: On 09/08/2020 2:59 p.m., John Mount wrote: Firstly: thanks to Ben for the help/fix. I know nobody asked, but. Having to guess where the documentation is just to refer to it is just going to be really brittle going forward. Previous: if the function you referred to existed in the package you were fine. That's not correct. The system could often work around the error, but not always. I may be missing something. It may well be that referring to a cross-package link by alias rather than by the name of the Rd page actually never worked, but: would there be a big barrier to making cross-package documentation links be able to follow aliases? I can imagine there may be technical hurdles but it seems like a well-defined problem ... To link to "?utils::dump.frames", you need to construct a URL that leads to the HTML file containing that help page on static builds of the help system. If utils is available, no problem, just look up the fact that the page you want is debugger.html at the time you construct the link. But it was documented that such links should work even if the target package was not available at the time the link was being made. So you need a link that you are sure will be available when the referenced package is eventually installed. Obviously that's going to be brittle. Perhaps the new requirement that referenced packages be mentioned in the DESCRIPTION file is a step towards addressing this. If everything that's referenced is present when the help system is being built, there will be an easy solution. Nowadays, it would probably be reasonable to put in stub pages for every alias, so that when you try to load dump.frames.html, the page itself redirects you to debugger.html. Or maybe you could just have a single page in each package that handles aliases via Javascript. Or R could just no longer support static copies of the help system. When you are using dynamically generated help pages, the link can be resolved by the server, and that's why those links have appeared to work for so long, even though the requirement to link to the filename instead of the alias has been there since before I wrote the dynamic help server. Duncan Murdoch Duncan Murdoch Future: if don't correctly specify where the help is you are wrong. Going forward: reorganizing a package's help can break referring packages. This sort of brittleness is going to be a big time-waster going forward. It seems like really the wrong direction in packaging, isolation, and separation of concerns (SOLID style principles). On Aug 9, 2020, at 11:04 AM, Ben Bolker wrote: This might have to be \link[utils:debugger]{dump.frames} now, i.e. explicitly linking to the man page on which dump.frames is found rather than following aliases? On Sun, Aug 9, 2020 at 2:01 PM John Mount wrote: With "R Under development (unstable) (2020-07-05 r78784)" (Windows) documentation references such as "\link[utils]{dump.frames}" trigger "Non-file package-anchored link(s) in documentation object" warnings even if the package is in your "Imports." Is that not the right form? Is there any way to fix this other than the workaround of just removing links from the documentation? I kind of don't want to do that as the links were there to help the user. 
__ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Package Anchored Links with R-Dev
On 09/08/2020 2:59 p.m., John Mount wrote: Firstly: thanks to Ben for the help/fix. I know nobody asked, but. Having to guess where the documentation is just to refer to it is just going to be really brittle going forward. Previous: if the function you referred to existed in the package you were fine. That's not correct. The system could often work around the error, but not always. Duncan Murdoch Future: if don't correctly specify where the help is you are wrong. Going forward: reorganizing a package's help can break referring packages. This sort of brittleness is going to be a big time-waster going forward. It seems like really the wrong direction in packaging, isolation, and separation of concerns (SOLID style principles). On Aug 9, 2020, at 11:04 AM, Ben Bolker wrote: This might have to be \link[utils:debugger]{dump.frames} now, i.e. explicitly linking to the man page on which dump.frames is found rather than following aliases? On Sun, Aug 9, 2020 at 2:01 PM John Mount wrote: With "R Under development (unstable) (2020-07-05 r78784)" (Windows) documentation references such as "\link[utils]{dump.frames}" trigger "Non-file package-anchored link(s) in documentation object" warnings even if the package is in your "Imports." Is that not the right form? Is there any way to fix this other than the workaround of just removing links from the documentation? I kind of don't want to do that as the links were there to help the user. __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Package Anchored Links with R-Dev
On 09/08/2020 2:04 p.m., Ben Bolker wrote: This might have to be \link[utils:debugger]{dump.frames} now, i.e. explicitly linking to the man page on which dump.frames is found rather than following aliases? It's always had to be that way for reliable links. It's just that the QC checks are finally warning about the unreliability. Duncan Murdoch On Sun, Aug 9, 2020 at 2:01 PM John Mount wrote: With "R Under development (unstable) (2020-07-05 r78784)" (Windows) documentation references such as "\link[utils]{dump.frames}" trigger "Non-file package-anchored link(s) in documentation object" warnings even if the package is in your "Imports." Is that not the right form? Is there any way to fix this other than the workaround of just removing links from the documentation? I kind of don't want to do that as the links were there to help the user. __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
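For reference, the reliable forms both name the Rd file (without extension) in the target package; two sketches based on the example in this thread:

  \link[utils]{debugger}               % links to utils' debugger.Rd, displayed as "debugger"
  \link[utils:debugger]{dump.frames}   % same target file, displayed as "dump.frames"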
Re: [R-pkg-devel] NOTE in r-devel-linux-x86_64-fedora-clang
On 07/08/2020 6:53 p.m., Helmut Schütz wrote: Hi Duncan, Duncan Murdoch wrote on 2020-08-07 21:55: On 07/08/2020 3:22 p.m., Helmut Schütz wrote: I see. However, the HTML-source states This manual is for R, version 4.1.0 Under development (2020-08-06). I was relying on the PDF-version (4.0.2 of 2020-06-22) which does *not* contain this sentence. Hence, I fell into the trap. Should be stated in the PDF as well, IMHO. Oh, c'mon. It will be a new requirement in R 4.1.0 It's not a requirement in 4.0.2, and you didn't get a NOTE about it there, you only got the note in one of the r-devel platforms. Yep, but why are the others not configured in the same way (with setenv _R_CHECK_SUGGESTS_ONLY_ false)? Doesn't sound consistent to me. What has this got to do with your suggestion that changes that will be released in R 4.1.0 next year should be documented in R 4.0.2? Duncan Murdoch The general way things work in R is that changes get announced well in advance of release *by putting them in R-devel*. That's why you're asked to check your package against R-devel before submitting: so that it meets upcoming announced changes to requirements as well as ones that are in the current release. Of course, we checked the package on winbuilder... Are you suggesting to set up a multi-boot system for all those OSs? Even if one -- not us -- would aim at that: Where to get Solaris v10? Buy a Mac to run checks on maxOS? At least I understand now the differences between r-devel-linux-x86_64-debian-gcc and r-devel-linux-x86_64-debian-gcc (given in the last line there: https://www.stats.ox.ac.uk/pub/bdr/Rconfig/r-devel-linux-x86_64-fedora-gcc). Helmut __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] NOTE in r-devel-linux-x86_64-fedora-clang
On 07/08/2020 3:22 p.m., Helmut Schütz wrote: Hi Duncan, Duncan Murdoch wrote on 2020-08-07 18:39: You're reading the wrong version of the manual. This is in the R-devel manual: "Packages referred to by these ‘other forms’ should be declared in the DESCRIPTION file, in the ‘Depends’, ‘Imports’, ‘Suggests’ or ‘Enhances’ fields. " This is at the end of section 2.5 in https://cran.r-project.org/doc/manuals/r-devel/R-exts.html. I see. However, the HTML-source states This manual is for R, version 4.1.0 Under development (2020-08-06). I was relying on the PDF-version (4.0.2 of 2020-06-22) which does *not* contain this sentence. Hence, I fell into the trap. Should be stated in the PDF as well, IMHO. Oh, c'mon. It will be a new requirement in R 4.1.0 It's not a requirement in 4.0.2, and you didn't get a NOTE about it there, you only got the note in one of the r-devel platforms. The general way things work in R is that changes get announced well in advance of release *by putting them in R-devel*. That's why you're asked to check your package against R-devel before submitting: so that it meets upcoming announced changes to requirements as well as ones that are in the current release. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] NOTE in r-devel-linux-x86_64-fedora-clang
On 07/08/2020 11:02 a.m., Brian G. Peterson wrote: On Fri, 2020-08-07 at 15:46 +0100, Gábor Csárdi wrote: If you want to link to a package in the documentation, you'll have to add it to Suggests. This doesn't make any sense. If you don't use the code from that package anywhere, then a cross-reference to that package should not require the extra dependency in Suggests. Cross references should be able to point to other functionality that might be useful to the user, or might add extra depth of understanding to a concept. If the user doesn't have the package installed, no worries, it is just a cross reference. The requirement you are suggesting is also not discussed in Writing R Extensions: https://cran.r-project.org/doc/manuals/r-patched/R-exts.html#Cross_002dreferences In fact, it explicitly allows links to potentially uninstalled packages. You're reading the wrong version of the manual. This is in the R-devel manual: "Packages referred to by these ‘other forms’ should be declared in the DESCRIPTION file, in the ‘Depends’, ‘Imports’, ‘Suggests’ or ‘Enhances’ fields. " This is at the end of section 2.5 in https://cran.r-project.org/doc/manuals/r-devel/R-exts.html. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
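Concretely, under the R-devel wording a package whose help pages merely \link to another package's documentation now needs that package declared, typically in Suggests; for example (survival is just a stand-in here):

  Suggests: survival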
Re: [R-pkg-devel] Error in CHECK caused by dev.off()
On 23/07/2020 5:11 p.m., Helmut Schütz wrote: Hi Dirk, Dirk Eddelbuettel wrote on 2020-07-23 15:16: Helmut, For previous uploads you affirmed that you read the CRAN Repository Policy which states [...] Your package appears to violate that requirement. As I wrote previously the statement continues with "Limited exceptions may be allowed in interactive sessions if the package obtains confirmation from the user." I'm not a native speaker of English but for me "should not write" != "must not write". And "may be allowed" is not "will be allowed". I would fix the package. I removed the automatic switch to "~/" if no path is given. As before I recommend in the man-pages to give the full path. However, I _still_ state that "~/" _can_ be used for convenience. This is a little unclear (perhaps the language issue again). It's fine if your documentation recommends the user choose that. That's a variation on what I recommended to you (that you generate an error message that suggests how to avoid the error). I don't agree with it if you mean to say the CRAN policy gets it wrong, and you should be allowed to have your original automatic fallback. The package is used in a regulated environment. The output file contains an entire audit-trail (name of the user and system, version and date of the OS, R, all packages, functions used, time of execution, blablah). If the package would write to tempdir() I would have to add a warning in bold font of every man-page like "Execute the command tempdir() to find out where your result files reside. Copy the files to a safe location before you quit the R-session. If you fail to do so, your results will be lost." Off topic: I don't like it that on Windows tempdir is located at "C:/Users/{Username}/AppData/Local/Temp/Rtmp..." I strictly separate my OSes (on C), software (on D), data (on E), backups (on F). Writing to the system disk is not what I prefer. However, in my installation "~/" resolves to "E:/Users/{Username}/Documents/"... It can resolve anywhere you like (as long as its writable), by specifying the TMPDIR environment variable. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
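A sketch of the TMPDIR mechanism mentioned above (the path is only an example; the variable has to be set in the environment R is started from, e.g. as a Windows user environment variable or exported in the shell):

  # with TMPDIR=E:/Users/me/Rtemp set before R starts:
  Sys.getenv("TMPDIR")
  tempdir()   # the session temporary directory is then created under that path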
Re: [R-pkg-devel] Check Error Due to Unicode in Documentation
On 23/07/2020 4:14 p.m., b...@denney.ws wrote: Hello, I have a personal package that I'd eventually like to clean up and either find other packages to be homes for the functions or perhaps eventually release it on CRAN. To that end, I try to keep package checks working. One of the functions that I use tries to simplify Unicode text to ASCII. I tend to receive scientifically-focused data, so the mu character should be converted to a "u" instead of the standard conversion to "m". On top of that, there are at least two Unicode characters that are visually the mu character: one is the micro character and the other is an actual lowercase mu. This function converts both of those to "u" as desired. I generate the documentation using roxygen2, but the text in the documentation aligns with the expected Unicode character, so I think the issue is not with roxygen. The issue is that Codoc gives the following error: * checking for code/documentation mismatches ... WARNING Codoc mismatches from documentation object 'unicode_to_ascii': unicode_to_ascii.character Code: function(x, verbose = FALSE, pattern = c("μ", "µ"), replacement = c("u", "u"), general_ But, the code and documentation appear to be the same. I think that the issue relates to something with Unicode support in Codoc, but I'm not sure how to test for that. The code is here: https://github.com/billdenney/bsd.report/blob/454caf217c5b333af1d65c7e63bbad4194320e07/R/unicode_to_ascii.R#L28-L31 And the documentation is here: https://github.com/billdenney/bsd.report/blob/454caf217c5b333af1d65c7e63bbad4194320e07/man/unicode_to_ascii.Rd#L17-L24 Do you have any suggestions on how to make this code/documentation work with Codoc? If you change the source to include the explicit characters (i.e. use pattern = c("μ", "µ") instead of pattern=c("\u03bc", "\u00b5")), does that help? It may cause other issues: WRE recommends against including UTF-8 chars in source code. If that doesn't solve the problem, then it looks like an issue with Roxygen2. I don't know if there's a way to tell it not to convert \u escapes into the corresponding character. If there isn't, it seems like that's something they should add. As a workaround, is there a way to say that this one particular .Rd file should be edited by hand, instead of auto-generated? Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
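One quick sanity check when chasing this kind of mismatch: after parsing, an escape and the literal character are the same string, so any difference Codoc reports comes from how the two files were generated, not from the values themselves (this assumes the literals below survive as UTF-8):

  identical("\u03bc", "\u00b5")   # FALSE: Greek mu and the micro sign are different characters
  identical("\u03bc", "μ")        # TRUE: the escape and the literal are the same string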
Re: [R-pkg-devel] Error in CHECK caused by dev.off()
On 23/07/2020 8:18 a.m., Helmut Schütz wrote: Hi David, David Cortes wrote on 2020-07-23 13:16: It is explained here: https://cran.r-project.org/web/packages/policies.html Section about source packages: "Packages should not write in the user’s home filespace (including clipboards), nor anywhere else on the file system apart from the R session’s temporary directory (or during installation in the location pointed to by TMPDIR: and such usage should be cleaned up)." THX; I missed that! However, the policy continues: "Limited exceptions may be allowed in interactive sessions if the package obtains confirmation from the user." IMHO, this is applicable here (i.e., the user _should_ specify a directory (as recommended in the man-pages), and only if none is provided, a warning is issued and confirmation obtained). If I would use tempdir() and the user forgets to copy the result files to another location and closes the console, everything would be lost and the user would have to start again from scratch. I think that this is not very user-friendly. I would issue an error instead of a warning, and in the error message, suggest some code that should work. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
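A sketch of the error-instead-of-warning idea (the argument and message are illustrative, not the package's actual interface):

  write_results <- function(..., path) {
    if (missing(path)) {
      stop("No output directory given. Please supply one explicitly, ",
           "e.g. path = tempdir() or a folder of your own choosing.",
           call. = FALSE)
    }
    # ... write the result files under 'path' ...
  }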
Re: [R-pkg-devel] Error in CHECK caused by dev.off()
On 22/07/2020 5:40 p.m., Helmut Schütz wrote: Duncan Murdoch wrote on 2020-07-22 21:42: On 22/07/2020 1:25 p.m., Helmut Schütz wrote: [...] The problem is that I cannot reproduce it as well. Only CHECK laments about dev.off() which I changed to graphics.off() in the meantime. library(grDevices) foo <- TRUE # shall we plot? png.path <- "~/" # user's home folder png.path <- normalizePath(png.path) if (foo) { png(paste0(png.path, "test.png"), width = 480, height = 480, pointsize = 12) } plot(x = 0:1, y = 0:1, type = "l", xlab = "x", ylab = "y") if (foo) { graphics.off() } You don't test whether the call to png() succeeded. Correct. However, if (file.exists(paste0(png.path, "test.png"))) graphics.off() worked in the example but not in the package... During a check, it probably wouldn't, because you aren't allowed to write to "~/". Your package should be writing to tempdir(), or a location entered by the user. The user is asked to provide a certain path indeed. Only if none is provided, the fallback is "~/" (which is always writable). That disqualifies your package from inclusion on CRAN. If no destination is provided and tempdir() isn't acceptable, you shouldn't write anything. The user may be keeping an irreplaceable piece of information in "~/test.png", and your package would destroy it. It's not your decision to make to trespass on the user's file space. Duncan Murdoch The package is intended for "common" users and not "R-geeks". If I would write to tempdir(), I guess chances are pretty low that a user will be able to locate the file. What I still fail to understand: CHECK laments about grDevices::dev.off() in a certain man page, where I removed the argument foo completely in one example (which is within \donttest{}). Hence, the entire plotting routine is not executed at all. Furthermore, dev.off() is nowhere used, only graphics.off() - now after file.exists(). Regards, Helmut -- Ing. Helmut Schütz BEBAC – Consultancy Services for Bioequivalence and Bioavailability Studies Neubaugasse 36/11 1070 Vienna, Austria ehelmut.schu...@bebac.at Whttps://bebac.at/ Fhttps://forum.bebac.at/ __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Error in CHECK caused by dev.off()
On 22/07/2020 1:25 p.m., Helmut Schütz wrote: Hi Serguei, Serguei Sokol wrote on 2020-07-22 15:51: Hmm... I see 2 possibilities for still getting an error while the concerned part of code is not supposed to be run: - either you are running not updated version of your package; I _can_ built the package and it runs as intended. Only the CHECK throws the error. - or the error comes from some other place of the code. Closing the device is required only _once_ in the entire package. In my NAMESPACE I have (and had in all previous versions): importFrom(grDevices, png, graphics.off, dev.list, dev.off) Sorry but without a minimal reproducible example I cannot help more. The problem is that I cannot reproduce it as well. Only CHECK laments about dev.off() which I changed to graphics.off() in the meantime. library(grDevices) foo <- TRUE # shall we plot? png.path <- "~/" # user's home folder png.path <- normalizePath(png.path) if (foo) { png(paste0(png.path, "test.png"), width = 480, height = 480, pointsize = 12) } plot(x = 0:1, y = 0:1, type = "l", xlab = "x", ylab = "y") if (foo) { graphics.off() } You don't test whether the call to png() succeeded. During a check, it probably wouldn't, because you aren't allowed to write to "~/". Your package should be writing to tempdir(), or a location entered by the user. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
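A sketch of the defensive version Duncan is describing: write somewhere permitted (tempdir() or a user-chosen directory), and only close a device that was actually opened. The sizes are taken from the example above; the tryCatch() wrapper is an addition:

  library(grDevices)
  target <- file.path(tempdir(), "test.png")
  opened <- FALSE
  tryCatch({
    png(target, width = 480, height = 480, pointsize = 12)
    opened <- TRUE
  }, error = function(e) message("could not open png device: ", conditionMessage(e)))
  if (opened) {
    plot(x = 0:1, y = 0:1, type = "l", xlab = "x", ylab = "y")
    dev.off()
  }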
Re: [R-pkg-devel] Error in CHECK caused by dev.off()
On 22/07/2020 8:36 a.m., Helmut Schütz wrote: Dear all, I have two variables, foo and bar. The first is TRUE if a png should be created and the second is TRUE if an already existing one should be overwritten. At the end of the plot I had if (foo | (foo & bar)) dev.off() This worked as expected in all versions of my package built in R up to v3.6.3. However, when I CHECK the package in v4.0.2 I get: > grDevices::dev.off() Error in grDevices::dev.off() : cannot shut down device 1 (the null device) Execution halted I tried: if (foo | (foo & bar)) { Assuming that foo and bar are each length one variables, this test is logically equivalent to if (foo) { Is that really what you intended? Duncan Murdoch dev <- dev.list() if (!is.null(dev)) { if (dev == 2) invisible(dev.off()) } } without success (same error). Even the more general if (foo | (foo & bar)) { graphics.off() } did not work. The plot is called only in an example of one man-page -- though embedded in \donttest{}. Even if I set both foo and bar to FALSE (i.e., the respective part of the code should not be executed at all), I get the same error. Any ideas/suggestions? Regards, Helmut __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
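The simplification is easy to verify directly: for logical foo and bar, foo | (foo & bar) is TRUE exactly when foo is TRUE, whatever bar is.

  foo <- c(TRUE, TRUE, FALSE, FALSE)
  bar <- c(TRUE, FALSE, TRUE, FALSE)
  identical(foo | (foo & bar), foo)   # TRUE over every combination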
Re: [R-pkg-devel] Getting two independent packages with identical S3 generics to dispatch each other's methods
On 11/07/2020 7:52 a.m., Pavel N. Krivitsky wrote: Dear Duncan et al., Firstly, my apologies for the duplicated query. It seems that I had searched everywhere but the mailing list where I asked the question. Secondly, I was wondering if I could get some indication whether CRAN would accept a package with the following code and documentation (and only that): I think it would depend on the documentation and the submission message. You'll need to convince them not to reject your package under the "A package’s contribution has to be non-trivial" rule. Explain why you can't put the generic in one of the existing packages and import it from there into the other one. (I'd make ergm import lme4, since that only adds 5 packages that wouldn't otherwise be present: "minqa" "nloptr""statmod" "Rcpp" "RcppEigen", and those are all reasonably popular packages.) Duncan Murdoch 1) A number of exported generics of the form summary_formula(object, ..., lhs), simulate_formula(object, nsim=1, seed=NULL, ..., lhs), etc., which expect a formula as their first argument, evaluate the LHS of the formula, and dispatch based on the class of the result, which can also be overridden by the lhs= argument. 2) Corresponding S3 methods summary.formula(), simulate.formula(), etc. methods, that call the corresponding *_formula() generic. I am familiar with the generics package, but I don't think it's a good place for this functionality, because this is not the typical dispatching behaviour, and because *.formula() exports are not technically generics but S3 methods. In particular, as far as I know, existing mechanisms make it easy to cherry-pick generics, but they don't make it easy to cherry-pick methods. Best Regards, Pavel On Sat, 2020-07-11 at 07:29 -0400, Duncan Murdoch wrote: If the semantics of the two generics must remain identical in the future, then there is an implicit dependency between the code in the packages. You should formalize this by using one of the dependency mechanisms that the language provides, i.e. the clean solution. Duncan Murdoch On 10/07/2020 7:51 p.m., Pavel N. Krivitsky wrote: Dear All, I would like to have two packages that do not depend on each other that have an identical generic to be able to dispatch to each other's (non- conflicting) methods. If it is of interest, the background for why this is needed is given at the end of this e-mail. As it is, it looks like two packages that do not depend on each other both define a generic, they do not see each other's S3 methods. For example, in the two attached minimal packages, which define and export generic foo() (identical in both packages) and methods foo.character() and foo.numeric() that are exported via S3method(), we get, library(test.character) foo("a") foo.character() called. library(test.numeric) Attaching package: ‘test.numeric’ The following object is masked from ‘package:test.character’: foo foo(1) foo.numeric() called. foo("a") Error in UseMethod("foo") : no applicable method for 'foo' applied to an object of class "character" That is, test.numeric::foo() doesn't "see" test.character:::foo.character() and vice versa. Is there a way to make them see each other? This issue has arisen before, e.g. at https://stackoverflow.com/questions/25251136/how-to-conditionally-define-a-generic-function-in-r-namespace . The "clean" solution is, of course, to create a third package to define the generic that the two packages could import (and, if necessary, reexport). 
However, that involves creating an almost-empty package that then has to be submitted to CRAN, maintained, and add some amount of storage and computational overhead. Is there another way to do this that is transparent to the end user? # Background This arose as a result of two packages (lme4 and ergm) both wanting to implement a simulate.formula() method, causing conflicts when the user wanted to use both at the same time. ergm has a mechanism for dispatching based on the class of the LHS of the formula. It does so by defining a generic, simulate_formula() which evaluates the formula's LHS and dispatches a method (e.g., simulate_formula.()) based on that. Since lme4 and ergm generally use different LHSs, we are thinking of resolving the conflict by copying the LHS dispatching mechanism from ergm to lme4, and then defining our own summary_formula methods as needed. Thank you in advance, Pavel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
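For illustration, a minimal sketch of the "clean" third-package arrangement discussed in this thread; the package names (pkgC holding the generic, pkgA and pkgB holding methods) and the roxygen2 tags are hypothetical, shown only as one way to generate the NAMESPACE entries:

    ## In pkgC, the small shared package: define and export the generic.
    #' @export
    foo <- function(x, ...) UseMethod("foo")

    ## In pkgA: import the generic and register a method (no second generic).
    #' @importFrom pkgC foo
    #' @export
    foo.character <- function(x, ...) message("foo.character() called.")

    ## In pkgB: the same pattern for its own class.
    #' @importFrom pkgC foo
    #' @export
    foo.numeric <- function(x, ...) message("foo.numeric() called.")

With both packages attached, foo("a") and foo(1) each find the right method, because both packages register methods on the single generic owned by pkgC.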
Re: [R-pkg-devel] Getting two independent packages with identical S3 generics to dispatch each other's methods
If the semantics of the two generics must remain identical in the future, then there is an implicit dependency between the code in the packages. You should formalize this by using one of the dependency mechanisms that the language provides, i.e. the clean solution. Duncan Murdoch On 10/07/2020 7:51 p.m., Pavel N. Krivitsky wrote: Dear All, I would like to have two packages that do not depend on each other that have an identical generic to be able to dispatch to each other's (non- conflicting) methods. If it is of interest, the background for why this is needed is given at the end of this e-mail. As it is, it looks like two packages that do not depend on each other both define a generic, they do not see each other's S3 methods. For example, in the two attached minimal packages, which define and export generic foo() (identical in both packages) and methods foo.character() and foo.numeric() that are exported via S3method(), we get, library(test.character) foo("a") foo.character() called. library(test.numeric) Attaching package: ‘test.numeric’ The following object is masked from ‘package:test.character’: foo foo(1) foo.numeric() called. foo("a") Error in UseMethod("foo") : no applicable method for 'foo' applied to an object of class "character" That is, test.numeric::foo() doesn't "see" test.character:::foo.character() and vice versa. Is there a way to make them see each other? This issue has arisen before, e.g. at https://stackoverflow.com/questions/25251136/how-to-conditionally-define-a-generic-function-in-r-namespace . The "clean" solution is, of course, to create a third package to define the generic that the two packages could import (and, if necessary, reexport). However, that involves creating an almost-empty package that then has to be submitted to CRAN, maintained, and add some amount of storage and computational overhead. Is there another way to do this that is transparent to the end user? # Background This arose as a result of two packages (lme4 and ergm) both wanting to implement a simulate.formula() method, causing conflicts when the user wanted to use both at the same time. ergm has a mechanism for dispatching based on the class of the LHS of the formula. It does so by defining a generic, simulate_formula() which evaluates the formula's LHS and dispatches a method (e.g., simulate_formula.()) based on that. Since lme4 and ergm generally use different LHSs, we are thinking of resolving the conflict by copying the LHS dispatching mechanism from ergm to lme4, and then defining our own summary_formula methods as needed. Thank you in advance, Pavel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 02/07/2020 10:41 a.m., Dr. Jens Oehlschlägel wrote: Thank you for the advice Duncan, But let's not get carried away here: we are talking about a *warning* that only arises if two packages are checked together that are never meant to be installed together. The new packages contain weeks of improvement-work, and I am not going to add back-and-forth-trick-work just to circumvent some warnings that arise only at the point of switching from old to new. That's a choice, but your package might face auto-rejection on CRAN. Asking CRAN to handle your update manually means extra work for them; if there's a simple way to handle it without that (and I've given you two to choose from), it seems kind of selfish not to use it. BTW, assuming you got the two packages on CRAN, it wouldn't be hard for a user to install the new bit without updating ff: just have both installed before your update, then ask to update bit. Since it doesn't depend on ff, it won't trigger an ff update. You can't expect R to know they were never meant to be installed together unless you record that fact in the dependencies in the DESCRIPTION file. Duncan Murdoch If there is a problem when checking the new packages together that's a different story and worth taking care about. I didn't find such problems. Kind regards Jens Gesendet: Donnerstag, 02. Juli 2020 um 15:23 Uhr Von: "Duncan Murdoch" An: "Dr. Jens Oehlschlägel" , r-package-devel@r-project.org Betreff: Re: Aw: Re: Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s) On 02/07/2020 7:49 a.m., Dr. Jens Oehlschlägel wrote: Duncan, One way is to make bit depend on a particular version of ff. That may cause a deadlock if both are being updated at once, but I think CRAN should be able to deal with it if they are informed of the issue. Exactly that I have done: I submitted all three packages bit/bit64/ff in version 4.0.2 and made them dependend on Version >= 4.0.0. And yes, the maintainers have been informed about the issue. I'm not sure that's what I meant, but I can't be sure, since I haven't seen your source. What I meant is a package dependency, i.e. the existing ff on CRAN is version 2.2-14.2 and it depends on bit without saying what version of bit is needed. The existing bit is 1.1-15.2 with no dependency on ff. So you can force the new ff to use the new bit by giving the version number, e.g. Depends: bit (>= 2.0) but it's not so obvious how to make the new bit depend on the new ff. There's no way to say that the dependency is only to a help page, and circular strong dependencies are messy, so I'd suggest you use one of the other options I offered: a dynamic link in the Rd file, or no link at all. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 02/07/2020 7:49 a.m., Dr. Jens Oehlschlägel wrote: Duncan, One way is to make bit depend on a particular version of ff. That may cause a deadlock if both are being updated at once, but I think CRAN should be able to deal with it if they are informed of the issue. Exactly that I have done: I submitted all three packages bit/bit64/ff in version 4.0.2 and made them dependend on Version >= 4.0.0. And yes, the maintainers have been informed about the issue. I'm not sure that's what I meant, but I can't be sure, since I haven't seen your source. What I meant is a package dependency, i.e. the existing ff on CRAN is version 2.2-14.2 and it depends on bit without saying what version of bit is needed. The existing bit is 1.1-15.2 with no dependency on ff. So you can force the new ff to use the new bit by giving the version number, e.g. Depends: bit (>= 2.0) but it's not so obvious how to make the new bit depend on the new ff. There's no way to say that the dependency is only to a help page, and circular strong dependencies are messy, so I'd suggest you use one of the other options I offered: a dynamic link in the Rd file, or no link at all. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
There are a few ways to deal with this, but waiting for ff to be updated is probably easiest. However, maybe ff can't be updated until bit is updated. Here are some possibilities: One way is to make bit depend on a particular version of ff. That may cause a deadlock if both are being updated at once, but I think CRAN should be able to deal with it if they are informed of the issue. Another way is to use R code in the Rd file to select which link to use. For example, instead of \link[ff]{clone.ff}, you could use \Sexpr[results=rd,stage=render]{clonelink()} where clonelink() is a function that generates either "\link[ff]{clone.ff}" or "\link[ff:clone]{clone.ff}" depending on the detected installed version of ff. Another choice that is nearly as easy as doing nothing is to include no link at all in this update, and make it a link again in the next update when the new ff is available. Duncan Murdoch On 02/07/2020 6:47 a.m., Dr. Jens Oehlschlägel wrote: Thanks Gabor and Duncan, It's actually in ff/man/clone.rd, not clone.ff.rd. There is no ff/man/clone.ff.rd file. but there *is* clone.ff.rd in the >= 4.0.0 versions of the packages bit/bit64/ff. Hence the check warning is a false alarm resulting from checking bit 4.0.2 (GitHub.com/truecluster) against ff 2.2-14.2 (CRAN) instead of checking it against the also submitted ff 4.0.2 (GitHub.com/truecluster). So all I can and will do is waiting that CRAN maintainers install and check again. Best Jens Duncan Murdoch Best regards Jens On 16.06.20 22:31, Gábor Csárdi wrote: This is how to look up the filename. The first "sp" is the topic name, the second is the package name. help("sp", "sp")[[1]] [1] "C:/Users/csard/R/win-library/4.0/sp/help/00sp" So you need to link to the "00sp.Rd" file: \link[sp:00sp]{sp} Gabor On Tue, Jun 16, 2020 at 9:09 PM Wayne Oldford wrote: Hi I got caught by this new test this week in trying to push an updated release of the loon package to CRAN. By following this thread, I corrected my cross-references to external packages but I got stymied by the one I hoped to give to the "sp" package for Spatial data _ Here is the history: I tried \link[sp:sp]{sp} which failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200616_162128/Debian/00check.log> Status: 1 WARNING That was meant to correct an earlier attempt (it did for other links to "scales" for example) where I had tried \link[sp]{sp} and failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200615_213749/Debian/00check.log> Status: 1 WARNING So to complete the possibilities as I understand them, I just now tried \link{sp} which, as might be expected, failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200616_213921/Debian/00check.log> Status: 1 WARNING As expected, error here was different: "Missing link" as opposed to "Non-file package-anchored link" _ I am not sure whether I have missed a subtlety in WRE or that the peculiar circumstance where the package, the topic, and the file name are all identical (sp) is some weird boundary case. Without further advice, I think I am just going to remove the link to "sp". It really is just a courtesy link to the package description for "sp". Thanks in advance for your thoughts. 
Wayne -Original Message- From: R-package-devel on behalf of Georgi Boshnakov Date: Tuesday, June 16, 2020 at 9:27 AM To: Gábor Csárdi , Duncan Murdoch Cc: List r-package-devel Subject: Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s) I think that the current behaviour is documented in WRE: "...There are two other forms of optional argument specified as \link[pkg]{foo} and \link[pkg:bar]{foo} to link to the package pkg, to files foo.html and bar.html respectively. These are rarely needed, perhaps to refer to not-yet-installed packages (but there the HTML help system will resolve the link at run time) or in the normally undesirable event that more than one package offers help on a topic (in which case the present package has precedence so this is only needed to refer to other packages). They are currently only used in HTML help (and ignored for hyperlinks in LaTeX conversions of help pages), and link to the file rather than the topic (since there is no way to know which topics are in which files in an uninstalled package) ... Because they have been frequently misused, the HTML help system looks for topic foo in package pkg if it does not find file foo.html." Unless I am missing something, it seems that it would be relatively painless to reverse the logic of the current behaviour of the help system, i.e. to start looking first for the topic and then for a file. Georgi Boshnakov
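For illustration, a sketch of the \Sexpr[results=rd,stage=render] approach described above; clonelink() is a hypothetical helper inside the bit package, and the 4.0.0 version cutoff is an assumption based on this thread:

    clonelink <- function() {
      ## New ff (>= 4.0.0) documents clone.ff in clone.ff.Rd; older ff keeps it in clone.Rd.
      newff <- requireNamespace("ff", quietly = TRUE) &&
        utils::packageVersion("ff") >= "4.0.0"
      if (newff) "\\link[ff]{clone.ff}" else "\\link[ff:clone]{clone.ff}"
    }

The Rd file would then contain \Sexpr[results=rd,stage=render]{bit:::clonelink()} in place of the hard-coded link.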
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 01/07/2020 1:46 p.m., Dr. Jens Oehlschlägel wrote: Good evening, My package bit 4.0.2 (https://github.com/truecluster/bit) is being rejected by CRAN checks with warning: >Check: Rd cross-references, Result: WARNING > Non-file package-anchored link(s) in documentation object 'clone.Rd': >'[ff]{clone.ff}' > > See section 'Cross-references' in the 'Writing R Extensions' manual. although clone.ff is in clone.ff.rd as confirmed by > help("clone.ff","ff")[[1]] [1] "/home/mypc/R/x86_64-pc-linux-gnu-library/4.0/ff/help/clone.ff" I asked the maintainers to explain what is wrong and what to do and got no answer. Does someone here can help? It's actually in ff/man/clone.rd, not clone.ff.rd. There is no ff/man/clone.ff.rd file. Duncan Murdoch Best regards Jens On 16.06.20 22:31, Gábor Csárdi wrote: This is how to look up the filename. The first "sp" is the topic name, the second is the package name. help("sp", "sp")[[1]] [1] "C:/Users/csard/R/win-library/4.0/sp/help/00sp" So you need to link to the "00sp.Rd" file: \link[sp:00sp]{sp} Gabor On Tue, Jun 16, 2020 at 9:09 PM Wayne Oldford wrote: Hi I got caught by this new test this week in trying to push an updated release of the loon package to CRAN. By following this thread, I corrected my cross-references to external packages but I got stymied by the one I hoped to give to the "sp" package for Spatial data _ Here is the history: I tried \link[sp:sp]{sp} which failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200616_162128/Debian/00check.log> Status: 1 WARNING That was meant to correct an earlier attempt (it did for other links to "scales" for example) where I had tried \link[sp]{sp} and failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200615_213749/Debian/00check.log> Status: 1 WARNING So to complete the possibilities as I understand them, I just now tried \link{sp} which, as might be expected, failed here: Debian: <https://win-builder.r-project.org/incoming_pretest/loon_1.3.1_20200616_213921/Debian/00check.log> Status: 1 WARNING As expected, error here was different: "Missing link" as opposed to "Non-file package-anchored link" _ I am not sure whether I have missed a subtlety in WRE or that the peculiar circumstance where the package, the topic, and the file name are all identical (sp) is some weird boundary case. Without further advice, I think I am just going to remove the link to "sp". It really is just a courtesy link to the package description for "sp". Thanks in advance for your thoughts. Wayne -Original Message- From: R-package-devel on behalf of Georgi Boshnakov Date: Tuesday, June 16, 2020 at 9:27 AM To: Gábor Csárdi , Duncan Murdoch Cc: List r-package-devel Subject: Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s) I think that the current behaviour is documented in WRE: "...There are two other forms of optional argument specified as \link[pkg]{foo} and \link[pkg:bar]{foo} to link to the package pkg, to files foo.html and bar.html respectively. These are rarely needed, perhaps to refer to not-yet-installed packages (but there the HTML help system will resolve the link at run time) or in the normally undesirable event that more than one package offers help on a topic7 (in which case the present package has precedence so this is only needed to refer to other packages). 
They are currently only used in HTML help (and ignored for hyperlinks in LATEX conversions of help pages), and link to the file rather than the topic (since there is no way to know which topics are in which files in an uninstalled package) ... Because they have been frequently misused, the HTML help system looks for topic foo in package pkg if it does not find file foo.html." Unless I am missing something, it seems that it would be relatively painless to reverse the logic of the current behaviour of the help system, i.e. to start looking first for the topic and then for a file. Georgi Boshnakov -Original Message- From: R-package-devel On Behalf Of Gábor Csárdi Sent: 16 June 2020 13:44 To: Duncan Murdoch Cc: List r-package-devel Subject: Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s) On Mon, Jun 15, 2020 at 5:30 PM Duncan Murdoch wrote: > > On 15/06/2020 12:05 p.m., Martin Maechler wrote: > >>>>>> Duncan Murdoch on Sun, 14 Jun 2020 07:28:03 -0400 writes: > > > > > I agree with almost everything you wrote, ex
Re: [R-pkg-devel] package CatDataAnalysis
It's easier to install a package from CRAN than from Github, but not all that much easier. If it's too much trouble to satisfy CRAN, then don't bother. Just post instructions on how to do a Github install of it and move on to other things. (I'd post those instructions in a README.md file on the Github site, so when people follow your link they'll find the description.) Duncan Murdoch On 28/06/2020 12:07 p.m., Charles Geyer wrote: I have a package that has the datasets for Categorical Data Analysis by Agresti that do not appear in the book. The whole package is a github repo https://github.com/cjgeyer/CatDataAnalysis. All of the data were translated mechanically using the R script foo.R included in the repo (but not in the package) from Agresti's web site http://www.stat.ufl.edu/~aa/cda/data.html. This package seems to be a useful service to students and teachers. The data are much simpler to use with this package than trying to get the data from Agresti's web page (foo.R has 277 lines of code). When I submitted the package to CRAN, I got the following response. The Description field of the DESCRIPTION file is intended to be a (one paragraph) description of what the package does and why it may be useful. Please elaborate. Tell the users what the datasets are about and what they contain so they can use them even when they haven't read your book. Please fix and resubmit, and document what was changed in the submission comments. In an alternate universe without copyright law this seems a reasonable request. In this universe it seems to be asking for trouble. I know about fair use, but I am not a lawyer and do not want to walk the borderline between fair use and copyright violation. The package as it is seems OK because it comes from the author's public web site and these data were never in the book. Please note that I made Alan Agresti (with his acquiescence) the author of the package because it is his book and his data, but I (or rather foo.R) did all the work. I replied to cran.r-project.org, but that was apparently sent to /dev/null. This book is IMHO the authoritative textbook on the subject. Amazon sales rank agrees. The book is used for many courses. So this package would be very helpful as is to many students and teachers. So what to do? Is there any way to get this package on CRAN? __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
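The GitHub-install instructions suggested above could be as short as the following in a README.md; remotes is just one of several installers that would work:

    # install.packages("remotes")
    remotes::install_github("cjgeyer/CatDataAnalysis")
    library(CatDataAnalysis)
    data(package = "CatDataAnalysis")   # list the datasets shipped with the package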
Re: [R-pkg-devel] [External] Re: Two packages with the same generic function
On 23/06/2020 10:28 a.m., Viechtbauer, Wolfgang (SP) wrote: Still, if pkgA imports the generic from pkgB, then installing pkgA installs pkgB and all of its dependencies (Depends and Imports). Which of course makes sense (as otherwise pkgB cannot be installed). But all of this just for importing foo <- function(x, ...) UseMethod("foo") from pkgB. I think you'd need to be more specific about the two packages before I would believe this is much of a problem. If pkgA and pkgB both contain methods for the same generic, then they are probably working in the same problem domain, and already share many dependencies. It seems it would be rare that pkgA's dependencies and pkgB's dependencies are both big sets that don't have a lot of overlap. If it's only pkgB that has the big dependency set, then just put the generic in pkgA. And if you really are in that rare case where they both have big non-overlapping dependency sets, then create a tiny pkgC to hold the generic. On the other hand, if both packages were allowed to declare foo as a generic, and R should think of it as the same generic, confusion would follow: Think about the case of the filter functions in stats and dplyr. It's not a generic in stats, but obviously could be. In stats, the name is used to talk about linear filtering on a time series. (There are several different representations of time series in R, so it might make sense for stats::filter to be a generic to allow it to work on all of them.) In dplyr, the same name is used to describe subsetting a dataset. Those are both valid uses of the word "filter", but they have nothing to do with each other. It's perfectly reasonable to think that a user might want to do both kinds of filtering. If stats::filter was a generic and someone wrote a method for dplyr::filter, clearly a call to stats::filter should not use that method. It's even possible that some package doing time series analysis in the tidyverse framework would want to have methods for both generics. Duncan Murdoch Best, Wolfgang -Original Message- From: R-package-devel [mailto:r-package-devel-boun...@r-project.org] On Behalf Of Duncan Murdoch Sent: Tuesday, 23 June, 2020 12:25 To: Guido Schwarzer; r-package-devel@r-project.org Subject: Re: [R-pkg-devel] [External] Re: Two packages with the same generic function On 23/06/2020 4:22 a.m., Guido Schwarzer wrote: Am 23.06.20 um 10:00 schrieb Viechtbauer, Wolfgang (SP): [...] @Neal: A separate package with generic functions that pkgA and pkgB could import is an interesting suggestion, thanks! What makes this interesting is that there is no dependency on other packages in generics. Remains the question which help page would be shown for help(foo). If a package imports and then exports the generic, it would normally create a help page referring to the original one where the generic is defined. So both pkgA and pkgB would probably both create help pages, and the help system would show a page listing them both plus the generic one, and asking the user to choose. An example happens if you use library(broom) ?tidy The broom package links to a page that says "These objects are imported from other packages. Follow the links below to see their documentation." One of the links is to the ?tidy page in the generics package. You are allowed to say ?broom::tidy, and then you don't go to the list of possibilities, you go directly to the one you asked for. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
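A small illustration of the point about the two unrelated filter()s: explicit namespace qualification selects the intended function regardless of which packages are attached or in what order.

    library(dplyr)                         # masks stats::filter and stats::lag
    stats::filter(ldeaths, rep(1/3, 3))    # linear filtering of a time series
    dplyr::filter(mtcars, cyl == 6)        # row subsetting of a data frame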
Re: [R-pkg-devel] Proper CRAN way - How to handle dependency of java jar file?
Your assumption that .jar files are not allowed is wrong: a number of packages contain them: rscala, J4R, Rbgs, bartMachine, OpenStreetMap, ... There's a specific mention about how to include them in the CRAN policy document: "For Java .class and .jar files, the sources should be in a top-level java directory in the source package (or that directory should explain how they can be obtained)." If you still decide not to include your .jar file (maybe it is too big, for example), then I think your option 1 is unusable for those people who can't write to the library location because of permission problems. (Admin privileges are often necessary to install packages in the main library.) Generally I think everyone can install packages somewhere, but users do really get confused when they have multiple library locations, possibly each containing a different version of a package. Duncan Murdoch On 23/06/2020 8:18 a.m., Rainer M Krug wrote: Hi I have a package called `plantuml` (https://github.com/rkrug/plantuml) which converts plantuml code to UML graphs. It uses for this the java program https://plantuml.com which is Open Source. As it is not allowed to distribute a binary with an R package, I use the approach of a function which downloads the jar file into the directory `system.file("jar/plantuml.jar", package = "plantuml”)`. This works nicely, and at the moment, the function is called automatically before the plantuml.jar is used. Now I would like to submit the package to CRAN. I can’t find the guidelines anymore, so I am asking here: What is the appropriate way of handling this? I can think of a at least two ways of making it obvious to the user, that a binary is downloaded: 1) if the file plantuml.jar is not present, ask the user to run the function `updatePlantumlJar()` which downloads the jar to the original location in the package directory 2) tell the user to download the file manually and to put it somewhere, where the package will find it I would prefer the first version, as the plantuml.jar would be in the package directory, where usually nobody but the package is doing stuff. Any suggestions on how I could make this “CRAN conform”? Thanks a lot, Rainer -- Rainer M. Krug, PhD (Conservation Ecology, SUN), MSc (Conservation Biology, UCT), Dipl. Phys. (Germany) Orcid ID: -0002-7490-0066 Department of Evolutionary Biology and Environmental Studies University of Zürich Office Y34-J-74 Winterthurerstrasse 190 8075 Zürich Switzerland Office: +41 (0)44 635 47 64 Cell: +41 (0)78 630 66 57 email: rainer.k...@uzh.ch rai...@krugs.de Skype: RMkrug PGP: 0x0F52F982 -- Rainer M. Krug, PhD (Conservation Ecology, SUN), MSc (Conservation Biology, UCT), Dipl. Phys. (Germany) Orcid ID: -0002-7490-0066 Department of Evolutionary Biology and Environmental Studies University of Zürich Office Y34-J-74 Winterthurerstrasse 190 8075 Zürich Switzerland Office: +41 (0)44 635 47 64 Cell: +41 (0)78 630 66 57 email: rainer.k...@uzh.ch rai...@krugs.de Skype: RMkrug PGP: 0x0F52F982 __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
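One way to keep option 1 while avoiding writes into the package library (the permission problem noted above) is to cache the jar in a per-user directory. A sketch only: the download URL is a placeholder, and the function name follows the original post.

    updatePlantumlJar <- function() {
      cache <- tools::R_user_dir("plantuml", which = "cache")  # user-writable, R >= 4.0
      dir.create(cache, recursive = TRUE, showWarnings = FALSE)
      jar <- file.path(cache, "plantuml.jar")
      if (!file.exists(jar)) {
        utils::download.file("https://example.org/plantuml.jar", destfile = jar, mode = "wb")
      }
      jar  # callers use this path instead of system.file("jar/plantuml.jar", ...)
    }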
Re: [R-pkg-devel] [External] Re: Two packages with the same generic function
On 23/06/2020 4:22 a.m., Guido Schwarzer wrote: Am 23.06.20 um 10:00 schrieb Viechtbauer, Wolfgang (SP): [...] @Neal: A separate package with generic functions that pkgA and pkgB could import is an interesting suggestion, thanks! What makes this interesting is that there is no dependency on other packages in generics. Remains the question which help page would be shown for help(foo). If a package imports and then exports the generic, it would normally create a help page referring to the original one where the generic is defined. So both pkgA and pkgB would probably both create help pages, and the help system would show a page listing them both plus the generic one, and asking the user to choose. An example happens if you use library(broom) ?tidy The broom package links to a page that says "These objects are imported from other packages. Follow the links below to see their documentation." One of the links is to the ?tidy page in the generics package. You are allowed to say ?broom::tidy, and then you don't go to the list of possibilities, you go directly to the one you asked for. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Two packages with the same generic function
On 22/06/2020 10:17 p.m., Bert Gunter wrote: "Users don't get warned about overriding names in packages they've loaded, because that would just be irritating." Is that also true if the package or generic is imported by another that they load; or is a dependency of a package they load? If so, I would not call it "just irritating" because if silent, how would they know? I can't think of an example where this would be a problem. If a package imports objects from another package, it doesn't affect the user's search list. Maybe it affects what methods are available, but I can't see how it would change what generics are available. Can you give an example of what you're worried about? Duncan Murdoch Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip ) On Mon, Jun 22, 2020 at 5:58 PM Mark Leeds <mailto:marklee...@gmail.com>> wrote: Hi Duncan: I maintain dynlm and your example is the exact reason I've been getting emails from people regarding it not working correctly. I've been telling them to load dplyr by using library(dplyr, exclude = c("filter", "lag")) On Mon, Jun 22, 2020 at 7:57 PM Duncan Murdoch mailto:murdoch.dun...@gmail.com>> wrote: > On 22/06/2020 3:48 p.m., Tom Wainwright wrote: > > Yet another alternative is simply to prevent your second package from > > overriding the previously defined generic. The basic problem is the ease > > with which R allows overriding prior generic definitions (one of those > bits > > of bad behavior we in the USA used to call "a Bozo No-No"), which hides > all > > the previous methods, as demonstrated by the following code: > > > >> plot(1:3) > >>> plot <- function(x, ...) UseMethod("plot") > >>> plot(1:3) > >> Error in UseMethod("plot") : > >> no applicable method for 'plot' applied to an object of class > >> "c('integer', 'numeric')" > >>> rm(plot) > >>> plot(1:3) > > > > > > (Despite Murdoch's suggestion that overriding the generic SHOULD issue a > > warning, it doesn't seem to in R 4.0.1.) > > Sure it does, if pkgA and pkgB both export the same name, then you get a > warning when you attach the second one. For example, > > > library(MASS) > > library(dplyr) > > Attaching package: ‘dplyr’ > > The following object is masked from ‘package:MASS’: > > select > > The following objects are masked from ‘package:stats’: > > filter, lag > > The following objects are masked from ‘package:base’: > > intersect, setdiff, setequal, union > > Users don't get warned about overriding names in packages they've > loaded, because that would just be irritating. > > Duncan Murdoch > > > > > So, we might try protecting the generic definitions of "foo" in both > > packages by enclosing them in something like: > > > > tryCatch(invisible(methods("foo")), error = {foo <- function(x,...) > >> UseMethod("foo")}, finally=NULL) > > > > > > There's probably a more elegant way to accomplish this. This relies on > > "methods" returning an error if "foo" has no defined methods, so it is > not > > redefined if their are previous methods. I haven't had time to try this > in > > the two-package example, but it might work, although I'm not sure how to > > handle the Namespace declarations. > > > > Tom Wainwright > > > > On Mon, Jun 22, 2020 at 10:41 AM Bert Gunter mailto:bgunter.4...@gmail.com>> > wrote: > > > >> ... 
> >> and just to add to the query, assume the author of pkg B did (does) not > >> know of pkg A and so, for example, could (did) not import any of pkg A's > >> content into B. Given that there are at the moment ~20,000 packages out > >> there, this does not seem to be an unreasonable assumption. One may even > >> further assume that the user may not know that (s)he has package B > loaded, > >> as it may be a dependency of another package that (s)he uses. I
Re: [R-pkg-devel] Two packages with the same generic function
On 22/06/2020 3:48 p.m., Tom Wainwright wrote: Yet another alternative is simply to prevent your second package from overriding the previously defined generic. The basic problem is the ease with which R allows overriding prior generic definitions (one of those bits of bad behavior we in the USA used to call "a Bozo No-No"), which hides all the previous methods, as demonstrated by the following code: plot(1:3) plot <- function(x, ...) UseMethod("plot") plot(1:3) Error in UseMethod("plot") : no applicable method for 'plot' applied to an object of class "c('integer', 'numeric')" rm(plot) plot(1:3) (Despite Murdoch's suggestion that overriding the generic SHOULD issue a warning, it doesn't seem to in R 4.0.1.) Sure it does, if pkgA and pkgB both export the same name, then you get a warning when you attach the second one. For example, > library(MASS) > library(dplyr) Attaching package: ‘dplyr’ The following object is masked from ‘package:MASS’: select The following objects are masked from ‘package:stats’: filter, lag The following objects are masked from ‘package:base’: intersect, setdiff, setequal, union Users don't get warned about overriding names in packages they've loaded, because that would just be irritating. Duncan Murdoch So, we might try protecting the generic definitions of "foo" in both packages by enclosing them in something like: tryCatch(invisible(methods("foo")), error = {foo <- function(x,...) UseMethod("foo")}, finally=NULL) There's probably a more elegant way to accomplish this. This relies on "methods" returning an error if "foo" has no defined methods, so it is not redefined if their are previous methods. I haven't had time to try this in the two-package example, but it might work, although I'm not sure how to handle the Namespace declarations. Tom Wainwright On Mon, Jun 22, 2020 at 10:41 AM Bert Gunter wrote: ... and just to add to the query, assume the author of pkg B did (does) not know of pkg A and so, for example, could (did) not import any of pkg A's content into B. Given that there are at the moment ~20,000 packages out there, this does not seem to be an unreasonable assumption. One may even further assume that the user may not know that (s)he has package B loaded, as it may be a dependency of another package that (s)he uses. I certainly don't keep track of all the dependencies of packages I use. Under these assumptions, is there any more convenient alternative to Wolfgang's pkgA:foo(x) explicit call under such assumptions? If pkgA has a long name, what might one do? Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip ) On Mon, Jun 22, 2020 at 10:00 AM Viechtbauer, Wolfgang (SP) < wolfgang.viechtba...@maastrichtuniversity.nl> wrote: Hi All, Let's say there are two packages pkgA and pkgB, both of which have a generic function foo <- function(x, ...) UseMethod("foo") and pkgA has a method for objects of class "A": foo.A <- function(x, ...) print(x) and pkgB has a method for objects of class "B": foo.B <- function(x, ...) plot(x) Both packages export foo and their method and declare their respective S3 methods, so: export(foo) export(foo.A) S3method(foo, A) in NAMESPACE of pkgA and export(foo) export(foo.B) S3method(foo, B) in NAMESPACE of pkgB. 
If a user loads pkgA first and then pkgB, this fails: library(pkgA) library(pkgB) x <- 1:4 class(x) <- "A" foo(x) Error in UseMethod("foo") : no applicable method for 'foo' applied to an object of class "A" and vice-versa. Of course, pkgA::foo(x) works. Aside from pkgA importing foo() or vice-versa, is there some other clever way to make this work? In earlier versions of R (at least in 3.6.3), this used to work (i.e., the generic foo() from pkgB would find method foo.A() and vice-versa), but not since 4.0.0. Best, Wolfgang __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
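The library(dplyr, exclude = ...) workaround mentioned earlier in this thread, spelled out: attach dplyr without its two masking verbs so the stats functions keep dispatching unqualified.

    library(dplyr, exclude = c("filter", "lag"))
    filter(presidents, rep(1/4, 4))    # still stats::filter()
    dplyr::filter(mtcars, cyl == 6)    # dplyr's verb, called explicitly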
Re: [R-pkg-devel] Two packages with the same generic function
On 22/06/2020 1:40 p.m., Bert Gunter wrote: ... and just to add to the query, assume the author of pkg B did (does) not know of pkg A and so, for example, could (did) not import any of pkg A's content into B. Given that there are at the moment ~20,000 packages out there, this does not seem to be an unreasonable assumption. One may even further assume that the user may not know that (s)he has package B loaded, as it may be a dependency of another package that (s)he uses. I certainly don't keep track of all the dependencies of packages I use. Under these assumptions, is there any more convenient alternative to Wolfgang's pkgA:foo(x) explicit call under such assumptions? If pkgA has a long name, what might one do? It's always possible to make a new name, e.g. fooA <- pkgA::foo fooB <- pkgB::foo If you are writing a package, this can be done in the NAMESPACE file, e.g. importFrom(pkgA, fooA = foo) though this doesn't appear to be documented in the usual places. Duncan Murdoch Bert Gunter "The trouble with having an open mind is that people keep coming along and sticking things into it." -- Opus (aka Berkeley Breathed in his "Bloom County" comic strip ) On Mon, Jun 22, 2020 at 10:00 AM Viechtbauer, Wolfgang (SP) < wolfgang.viechtba...@maastrichtuniversity.nl> wrote: Hi All, Let's say there are two packages pkgA and pkgB, both of which have a generic function foo <- function(x, ...) UseMethod("foo") and pkgA has a method for objects of class "A": foo.A <- function(x, ...) print(x) and pkgB has a method for objects of class "B": foo.B <- function(x, ...) plot(x) Both packages export foo and their method and declare their respective S3 methods, so: export(foo) export(foo.A) S3method(foo, A) in NAMESPACE of pkgA and export(foo) export(foo.B) S3method(foo, B) in NAMESPACE of pkgB. If a user loads pkgA first and then pkgB, this fails: library(pkgA) library(pkgB) x <- 1:4 class(x) <- "A" foo(x) Error in UseMethod("foo") : no applicable method for 'foo' applied to an object of class "A" and vice-versa. Of course, pkgA::foo(x) works. Aside from pkgA importing foo() or vice-versa, is there some other clever way to make this work? In earlier versions of R (at least in 3.6.3), this used to work (i.e., the generic foo() from pkgB would find method foo.A() and vice-versa), but not since 4.0.0. Best, Wolfgang __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Two packages with the same generic function
On 22/06/2020 1:00 p.m., Viechtbauer, Wolfgang (SP) wrote: Hi All, Let's say there are two packages pkgA and pkgB, both of which have a generic function foo <- function(x, ...) UseMethod("foo") and pkgA has a method for objects of class "A": foo.A <- function(x, ...) print(x) and pkgB has a method for objects of class "B": foo.B <- function(x, ...) plot(x) Both packages export foo and their method and declare their respective S3 methods, so: export(foo) export(foo.A) S3method(foo, A) in NAMESPACE of pkgA and export(foo) export(foo.B) S3method(foo, B) in NAMESPACE of pkgB. If a user loads pkgA first and then pkgB, this fails: library(pkgA) library(pkgB) Wouldn't the user have got a warning at this point about pkgB::foo masking pkgA::foo? x <- 1:4 class(x) <- "A" foo(x) Error in UseMethod("foo") : no applicable method for 'foo' applied to an object of class "A" and vice-versa. Of course, pkgA::foo(x) works. Aside from pkgA importing foo() or vice-versa, is there some other clever way to make this work? In earlier versions of R (at least in 3.6.3), this used to work (i.e., the generic foo() from pkgB would find method foo.A() and vice-versa), but not since 4.0.0. Can't one of the packages import the generic from the other package, and then declare the method as a method of that other generic? That seems like the only thing that would make sense. There's no reason to believe that pkgA::foo has anything whatsoever to do with pkgB::foo without this. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
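A sketch of the import-the-generic arrangement suggested above, written with roxygen2 tags; pkgA, pkgB and class "B" are the hypothetical names from this thread:

    ## In pkgB: do not define foo() again; import pkgA's generic and register a method.
    #' @importFrom pkgA foo
    #' @export
    foo.B <- function(x, ...) plot(x)

    ## Generated NAMESPACE entries:
    ##   importFrom(pkgA, foo)
    ##   S3method(foo, B)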
Re: [R-pkg-devel] how to prevent a small package from yielding a large installed size?
On 15/06/2020 1:24 p.m., Ivan Krylov wrote: On Mon, 15 Jun 2020 12:52:20 -0400 Duncan Murdoch wrote: maybe someone else can suggest how to read an object from the .rdb file using R code. Internally R uses C code for this. This function seems to work for me: # filename: the .rdb file # offset, size: the pair of values from the .rdx # type: 'gzip' if $compressed is TRUE, 'bzip2' for 2, 'xz' for 3 readRDB <- function(filename, offset, size, type = 'gzip') { f <- file(filename, 'rb') on.exit(close(f)) seek(f, offset + 4) unserialize(memDecompress(readBin(f, 'raw', size - 4), type)) } Thanks, though it didn't work for me. I get Error in unserialize(memDecompress(readBin(f, "raw", size - 4), type)) : no restore method available on every object I tried. However, maybe Dan will have better luck. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] how to prevent a small package from yielding a large installed size?
On 15/06/2020 12:30 p.m., Daniel Kelley wrote: Duncan, thanks very much for that very helpful hint. I got as follows. My guess is that the first column in rdx$variables is an address offset, and so it seems that the lion's share of the storage is dedicated to items with names starting with a decimal point. For example, the "[[" item is at offset of nearly 4M. I may try fiddling with my code in which I specialize that method, to see whether I can reduce the memory footprint. From what I can gather, both linux and windows build argoFloats into a package with R directory of about 2.5M size, which is a lot better than what I get in macOS but still over the warning threshold (I think) and therefore I worry about CRAN acceptance. The second column is the size, so actually the lion's share is dedicated to things that are not being shown. They are indexed in the rdx$references list, and are probably going to be harder to track down, because they probably don't have names assigned by you. For example, in the rgl package, I see > rdx$references $`env::1` [1] 661 1037 $`env::10` [1] 123952221 $`env::11` [1] 126378224 $`env::12` [1] 128575226 [ many more deleted ] Presumably `env::1` is an environment which might be referenced by several of the functions, and I'm guessing that one of yours is really big. This can happen accidentally: you have a temporary local variable in a function and create and save another function, or a formula, or some other environment-using object, and save the useless local variable along with it. I don't have a good suggestion for figuring out what's in the bad environment; maybe someone else can suggest how to read an object from the .rdb file using R code. Internally R uses C code for this. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
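A minimal illustration (with made-up names) of the accidental capture described above: a function created inside another function drags its enclosing environment, including large temporaries, into whatever is later serialized, unless that environment is reset.

    make_adder <- function() {
      big <- rnorm(1e6)              # temporary scratch data
      f <- function(x) x + 1         # f's enclosure currently contains 'big'
      environment(f) <- globalenv()  # drop the heavy enclosure before saving f
      f
    }
    ## Without the reset, saving f (or lazy-loading it from a package) also saves 'big'.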
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 15/06/2020 12:05 p.m., Martin Maechler wrote: Duncan Murdoch on Sun, 14 Jun 2020 07:28:03 -0400 writes: > I agree with almost everything you wrote, except one thing: this isn't > newly enforced, it has been enforced since the help system began. What > I think is new is that there are now tests for it. Previously those > links just wouldn't work. > Duncan Murdoch Yes, to all... including Duncan's agreement with Gábor. Also, Duncan M earlier did mention that he had wanted to *change* the link-to-file behavior for these cases (when he wrote most of the Rd2html source code) but somehow did not get it. Actually, I don't think I pushed for this change at the time (or at least I didn't push much). I just wish now that I had, because I think it will be harder to do it now than it would have been then. Duncan And that's why we had partial workarounds (as the dynamic server still finding the links under some circumstances). My personal opinions was also that "we" (the R community; i.e., people providing good patches to the R sources / collaborating with R core / ...) should rather work to fix the current design/implementation "infelicity" than the current checks starting to enforce something which is really a wart in my view, and indeed, as Gábor also notes, will create R source documentation that depends on implementation details of other package's documentation. I don't like it either, not at all. Martin > On 14/06/2020 6:26 a.m., Gábor Csárdi wrote: >> On Sun, Jun 14, 2020 at 10:44 AM Duncan Murdoch >> wrote: >> [...] >>> >>> I think the argument was that static builds of the help pages would have >>> trouble resolving the links. With the current system, you can build a >>> help page that links to a page in package foo even if package foo is not >>> installed yet, and have the link work later after you install foo. >> >> That is true, but it is also not a big problem, I think. The CRAN >> Windows R installer does indeed build static help pages by default. >> But the built-in web server that serves these works around broken >> links by treating them as help topics instead of files. As you know. >> :) So this would only be a problem if you wanted to serve the static >> help pages with another web server. (Which is not a bad use case, but >> then maybe Rd2HTML() can just resolve them as topics and avoid the >> broken links.) >> >> Btw. the problem of linking to the wrong page is even worse with >> static builds of help pages, because if a link w/o a package (e.g. >> \link{filter}) picks up the wrong package at install time, then the >> wrong link is hard-coded in the html. If you are building binary >> packages, then they will link to the wrong help pages. >> >> WRE says that specifying the package in the link is rarely needed. >> This was probably the case some time ago, especially when packages did >> not have (compulsory) namespaces. But I am not sure if it still holds. >> I would argue that it is better to specify the package you are linking >> to. But the newly enforced requirement that we need to link to files >> instead of topics makes this more error prone. >> >> Gabor >> >> [...] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] how to prevent a small package from yielding a large installed size?
I can't install your package (I don't have an up to date GDAL), but maybe this is some help: - Package dependencies aren't included, except possibly for static linking of C/Fortran/C++ code. Those normally won't end up in an .rdb file. - .rdb files are part of the lazy load mechanism. You can read the corresponding .rdx file using readRDS(); it contains information on where to look in the .rdb file to find the source of an object. For example, if you have foo.rdb and foo.rdx, then this will tell you what's big in your foo.rdb file: rdx <- readRDS("foo.rdx") sizes <- sapply(rdx$variables, function(n) n[2]) Now sizes will be a named vector of objects contained in the rdb. You should find that sum(sizes) is similar to the size of the .rdb file, but probably a bit smaller, because there are some objects missed by this count: the ones contained in rdx$references. Duncan Murdoch On 15/06/2020 7:13 a.m., Daniel Kelley wrote: I am working on a package (https://github.com/ArgoCanada/argoFloats) that has a 412K source tarball (most of which is data; the R code is 176K), but that creates a library .rdb file of MUCH larger size, namely 7.2M. This file causes a build NOTE, being over the threshold of 1M, and that concerns me in terms of hoped-for submission to CRAN during this summer. My goal in writing this email is to get some advice regarding reducing the size of the .rds file, if indeed this is a general problem and not an artifact of my (macOS) development environment. Here's some more detail: argoFloats depends on some other packages, and so I am wondering whether the large multiplier between R source and .rdb file is because the other sources are dragged in. I could try moving everything to "Suggests", and use requireNamespace(), but that seems to go against recommendations, if I interpret Wickham and Bryan (https://r-pkgs.org/description.html) correctly. A possible clue is that I get a large-file note on macOS, but not when I use rhub for test linux builds, or winbuilder for a windows build. I do not have ready access to either linux or windows machines, to examine those builds in detail. My thinking is that examination of the .rdb file might help me to learn about problems (e.g. if it holds code from packages I "import" from, that might motivate me to move from "import" to "suggest"). Unfortunately, I have not been able to discover a way to examine that file, which seems to be designed for internal R use. I am attaching below my signature line the output from sessionInfo(), in case that helps. The URL I reference in my second paragraph has my DESCRIPTION file, and I will admit that I do not fully understand its nuances. Note that I use roxygen2 to build documentation and NAMESPACE. Any advice would be greatly appreciated, and indeed I thank anyone who got to the bottom of this long email. Dan E. 
Kelley [he/him/his 314ppm] Department of Oceanography Dalhousie University Halifax, NS, Canada R version 4.0.1 (2020-06-06) Platform: x86_64-apple-darwin17.0 (64-bit) Running under: macOS Catalina 10.15.6 Matrix products: default BLAS: /System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib LAPACK: /Library/Frameworks/R.framework/Versions/4.0/Resources/lib/libRlapack.dylib locale: [1] en_CA.UTF-8/en_CA.UTF-8/en_CA.UTF-8/C/en_CA.UTF-8/en_CA.UTF-8 attached base packages: [1] stats graphics grDevices utils datasets methods base other attached packages: [1] argoFloats_0.1.3 loaded via a namespace (and not attached): [1] Rcpp_1.0.4.6 pillar_1.4.4 compiler_4.0.1 plyr_1.8.6 class_7.3-17 [6] tools_4.0.1 testthat_2.3.2 digest_0.6.25 bit_1.1-15.2 ncdf4_1.17 [11] oce_1.2-1 memoise_1.1.0 RSQLite_2.2.0 lifecycle_0.2.0 tibble_3.0.1 [16] gtable_0.3.0 lattice_0.20-41 gsw_1.0-6 pkgconfig_2.0.3 rlang_0.4.6 [21] DBI_1.1.0 rstudioapi_0.11 curl_4.3 e1071_1.7-3 dplyr_1.0.0 [26] stringr_1.4.0 raster_3.1-5 generics_0.0.2 vctrs_0.3.1 classInt_0.4-3 [31] bit64_0.9-7 grid_4.0.1 tidyselect_1.1.0 glue_1.4.1 sf_0.9-4 [36] R6_2.4.1 sp_1.4-2 marmap_1.0.4 adehabitatMA_0.3.14 blob_1.2.1 [41] ggplot2_3.3.1 purrr_0.3.4 reshape2_1.4.4 magrittr_1.5 units_0.6-6 [46] scales_1.1.1 codetools_0.2-16 ellipsis_0.3.1 shape_1.4.4 colorspace_1.4-1 [51] KernSmooth_2.23-17 stringi_1.4.6 munsell_0.5.0 crayon_1.3.4.9000 __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
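Duncan's suggestion above, in a directly runnable form against the installed package (the file names assume the argoFloats package from this thread):

    rdx   <- readRDS(system.file("R", "argoFloats.rdx", package = "argoFloats"))
    sizes <- sort(sapply(rdx$variables, `[`, 2), decreasing = TRUE)
    head(sizes, 10)   # the largest named objects in argoFloats.rdb
    sum(sizes)        # a bit less than the .rdb size; rdx$references holds the rest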
Re: [R-pkg-devel] [CRAN-pretest-archived] CRAN submission gwsem 2.0.3
On 14/06/2020 10:59 a.m., Joshua N Pritikin wrote: On Sun, Jun 14, 2020 at 09:17:32AM -0400, Duncan Murdoch wrote: Since you're using Rmarkdown, you can use a variable for the eval chunk option, e.g. put this in your setup chunk: knitr::opts_chunk$set(eval = !is_CRAN) This works except for in-text `r 1+1` code fragments. Is there an option to make these evaluate to "[MISSING]" or similar? To suppress evaluation of those, you can use this: knitr::knit_hooks$set(evaluate.inline = function(x, envir) x) This will show the code in place of evaluating it and showing its value. Replace the function value with "[MISSING]" if you don't want to see the code. Duncan __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
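Combining the two suggestions in this thread into a single vignette setup chunk; NOT_CRAN is the environment-variable convention used earlier in the thread, and the placeholder text is arbitrary:

    is_CRAN <- !identical(Sys.getenv("NOT_CRAN"), "true")
    knitr::opts_chunk$set(eval = !is_CRAN)   # skip chunk evaluation on CRAN
    if (is_CRAN) {
      # show a fixed placeholder instead of evaluating inline `r ...` expressions
      knitr::knit_hooks$set(evaluate.inline = function(x, envir) "[not run on CRAN]")
    }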
Re: [R-pkg-devel] [CRAN-pretest-archived] CRAN submission gwsem 2.0.3
On 13/06/2020 1:29 p.m., Joshua N Pritikin wrote: I'm trying to include vignettes that take much too long for CRAN check. At the beginning of the Rmarkdown vignette, I use is_CRAN <- !identical(Sys.getenv("NOT_CRAN"), "true") if (is_CRAN) q() And then I use export NOT_CRAN=true when I build locally. But CRAN check still complains, You shouldn't call q() from a vignette. Since you're using Rmarkdown, you can use a variable for the eval chunk option, e.g. put this in your setup chunk: knitr::opts_chunk$set(eval = !is_CRAN) Duncan Murdoch On Mon, Jun 08, 2020 at 04:24:43PM +0200, lig...@statistik.tu-dortmund.de wrote: Flavor: r-devel-linux-x86_64-debian-gcc Check: re-building of vignette outputs, Result: WARNING Error(s) in re-building vignettes: ... --- re-building 'GeneEnvironmentInteraction.Rmd' using rmarkdown --- re-building 'OneFactorModel.Rmd' using rmarkdown --- re-building 'PostGWASprocessing.Rmd' using rmarkdown --- re-building 'ResidualsModel.Rmd' using rmarkdown --- re-building 'StandardGWAS.Rmd' using rmarkdown --- re-building 'TwoFactorModel.Rmd' using rmarkdown --- re-building 'UserSpecifiedGWASModels.Rmd' using rmarkdown --- re-building 'growth.Rmd' using rmarkdown Error: Vignette re-building failed. Execution halted What's the correct way to avoid CRAN complaints? __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
I agree with almost everything you wrote, except one thing: this isn't newly enforced, it has been enforced since the help system began. What I think is new is that there are now tests for it. Previously those links just wouldn't work. Duncan Murdoch On 14/06/2020 6:26 a.m., Gábor Csárdi wrote: On Sun, Jun 14, 2020 at 10:44 AM Duncan Murdoch wrote: [...] I think the argument was that static builds of the help pages would have trouble resolving the links. With the current system, you can build a help page that links to a page in package foo even if package foo is not installed yet, and have the link work later after you install foo. That is true, but it is also not a big problem, I think. The CRAN Windows R installer does indeed build static help pages by default. But the built-in web server that serves these works around broken links by treating them as help topics instead of files. As you know. :) So this would only be a problem if you wanted to serve the static help pages with another web server. (Which is not a bad use case, but then maybe Rd2HTML() can just resolve them as topics and avoid the broken links.) Btw. the problem of linking to the wrong page is even worse with static builds of help pages, because if a link w/o a package (e.g. \link{filter}) picks up the wrong package at install time, then the wrong link is hard-coded in the html. If you are building binary packages, then they will link to the wrong help pages. WRE says that specifying the package in the link is rarely needed. This was probably the case some time ago, especially when packages did not have (compulsory) namespaces. But I am not sure if it still holds. I would argue that it is better to specify the package you are linking to. But the newly enforced requirement that we need to link to files instead of topics makes this more error prone. Gabor [...] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 13/06/2020 9:00 p.m., Gábor Csárdi wrote: On Sat, Jun 13, 2020 at 8:10 PM Duncan Murdoch wrote: On 13/06/2020 1:17 p.m., Zhian Kamvar wrote: [...] Is this a new policy? Zhian, it seems that some of the problematic links are created by roxygen2. You can fix these using this PR: https://github.com/r-lib/roxygen2/pull/1109 You can install this branch with remotes::install_github("r-lib/roxygen2@fix/link-to-file") To fix the remaining ones, you need to modify your \link[]{} macros or switch them to roxygen [pkg::fun()] links. [...] This is probably a new test of the long-time requirement that links should be to filenames, not aliases, when they go to other packages. I really like how the help system resolves links based on _topics_, even for cross-package links. So I admit that I did not follow this requirement too closely. It is better to link to topics instead of files, because then links are independent of how the manual is organized into files. E.g. it is not uncommon to split up a help file that used to document multiple functions, into several files, because the functions gain more functionality, or need more examples, etc. and the manual page is getting less focused and harder to follow. It is perfectly natural that the manual of a package is evolving together with the code. With enforcing this requirement, such documentation changes are considered as breaking changes. If we need to link to files, then what we get is more broken links, and more forced package updates, just to fix the broken links. In particular, if a package moves a help topic to another help file in a new release, then other packages linking to this topic have to update their links, and if these happened to be installed together with the old version of the linked package, they'll have a broken link. This is a pity, because finding the right help files is easy to automate, and in fact the help system already supports it perfectly well. As far as I can tell the only alternative of linking to the file is using an unqualified link, i.e. not specifying the target package at all. The help system can do the lookup at render time, and this is usually OK, but for a non-trivial number of cases it is not, because the resolution of the link depends on what packages are currently loaded. If the right package is not loaded, then the link will potentially go to the wrong help file, which is absurd. Or, if multiple packages have the required topic, the user is presented with a menu, which is also confusing. OTOH the manual authors know perfectly well which package they want to link to, they just can't specify it any more... I am probably missing something, but what do we gain from linking to files, instead of topics? Especially that linking to topics already works perfectly well? I think the argument was that static builds of the help pages would have trouble resolving the links. With the current system, you can build a help page that links to a page in package foo even if package foo is not installed yet, and have the link work later after you install foo. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] check cross-references error: Non-file package-anchored link(s)
On 13/06/2020 1:17 p.m., Zhian Kamvar wrote: Hello, I noticed a strange error pop up for R-devel (2020-06-12 r78687) check on travis: https://travis-ci.org/github/grunwaldlab/poppr/jobs/697831376#L4653-L4654 * checking Rd cross-references ... WARNING Non-file package-anchored link(s) in documentation object 'aboot.Rd': ‘[ape:phylo]{ape::phylo()}’ I looked at the Cross-reference section of WRE, but I couldn't find any mention of non-file package-anchored links being a problem: https://cran.r-project.org/doc/manuals/r-devel/R-exts.html#Cross_002dreferences Is this a new policy? This is probably a new test of the long-time requirement that links should be to filenames, not aliases, when they go to other packages. WRE says "There are two other forms of optional argument specified as \link[pkg]{foo} and \link[pkg:bar]{foo} to link to the package pkg, to files foo.html and bar.html respectively." The problem is that `phylo` is documented in the read.tree.Rd file in ape, so your link needs to be \link[ape:read.tree]{ape::phylo()} I wish I had fixed this inconsistency years ago when I rewrote the Rd code, but I didn't. It would have been painful at the time (there were already thousands of CRAN packages, and lots would have needed fixing), but would be much worse now. Another design flaw that I didn't fix is that you can have an Rd with \name{foo} and no \alias{foo}, and searching ?foo won't find it. In that case I failed to convince other R core members that it would have been a good change. Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
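For concreteness, a small Rd sketch of the link forms involved here (the % comments are added for illustration):

    % the flagged link: anchored to ape, but interpreted as pointing at a file
    % phylo.html, which ape does not have
    \link[ape:phylo]{ape::phylo()}
    % the fix: phylo is documented in read.tree.Rd, so anchor the link to that file
    \link[ape:read.tree]{ape::phylo()}
    % the unanchored form: resolved by topic when the page is viewed, but it may
    % pick up another package's "phylo" topic if one is available
    \link{phylo}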
Re: [R-pkg-devel] email misleading: checking CRAN incoming feasibility ... NOTE Maintainer
The built-in way is to put those tests in a separate directory, and specify that directory when you run R CMD check. For example, rgl has directory "inst/slowTests", and I use this option to R CMD check: --test-dir=inst/slowTests Duncan Murdoch On 08/06/2020 10:52 a.m., Spencer Graves wrote: Hi, Uwe et al.: What's the preferred way to eliminate tests on CRAN that the maintainer still wants to run on other platforms? For several years, I've been using "if(!fda::CRAN()){...}". I've been told that I should NOT do that, but it has worked for me, and I haven't found anything better. I've recently seen "testthat::skip_on_cran(...)", but I have yet to understand enough of how it works to actually use it. Thanks, Spencer Graves On 2020-06-08 09:43, stefano wrote: Hello Uwe, OK sorry for that. Best wishes. *Stefano* Stefano Mangiola | Postdoctoral fellow Papenfuss Laboratory The Walter Eliza Hall Institute of Medical Research +61 (0)466452544 On Tue, 9 Jun 2020 at 00:40, Uwe Ligges <lig...@statistik.tu-dortmund.de> wrote: On 08.06.2020 16:26, stefano wrote: Hello, I would like to point out that I (and others in various forums) find that the CRAN check with the note: *checking CRAN incoming feasibility ... NOTE Maintainer* Not true, it also says Flavor: r-devel-windows-ix86+x86_64 Check: running examples for arch 'x64', Result: NOTE Examples with CPU (user + system) or elapsed time > 10s user system elapsed lower_triangular-methods 11.48 0 11.5 Please reduce each example to less than 5 sec. Best, Uwe Ligges Triggers an email saying 1) *package nanny_0.1.7.tar.gz does not pass the incoming checks automatically* 2) *Please fix all problems and resubmit a fixed version via the webform* While apparently nothing should be done, at least according to a forum post https://stackoverflow.com/questions/23829978/checking-cran-incoming-feasibility-note-maintainer It would be nice to avoid this from the test side or the email side. It is pretty confusing for developers who think that they have to act. Best wishes. *Stefano* Stefano Mangiola | Postdoctoral fellow Papenfuss Laboratory The Walter Eliza Hall Institute of Medical Research +61 (0)466452544 [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
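On Spencer's question about testthat::skip_on_cran(): a minimal sketch, assuming the package already uses testthat; the test name and the slow_simulation() helper are hypothetical. skip_on_cran() skips the enclosing test unless the NOT_CRAN environment variable is set to "true" (devtools and the usual CI setups set it locally), so the test runs everywhere except on CRAN:

    library(testthat)
    test_that("long-running simulation matches the reference result", {
      skip_on_cran()                    # skipped on CRAN, run when NOT_CRAN = "true"
      res <- slow_simulation(n = 1e6)   # hypothetical expensive computation
      expect_length(res, 1e6)
    })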
Re: [R-pkg-devel] Vignette depends on BH
On 02/06/2020 3:45 p.m., Tim Keitt wrote: I have an RMarkdown vignette in a package that uses Rcpp and depends on BH. I added BH to the Suggests list. It is still generating warnings about BH. What is the correct way to handle this? If a package is listed in Suggests, your package should still work (possibly with diminished capabilities) when the suggested package is not present. So at the start of your vignette you could have something like if (!require("BH")) { message("This vignette needs BH; since it is not installed, code will not be executed.") # Set default for chunk option "eval" to FALSE in all code chunks. } Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
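A minimal sketch of that pattern for a knitr/rmarkdown vignette; the chunk below would be an early, always-evaluated setup chunk (knitr's chunk option is eval, and opts_chunk$set() changes its default for all following chunks):

    has_BH <- requireNamespace("BH", quietly = TRUE)
    if (!has_BH)
      message("This vignette needs BH; since it is not installed, code will not be executed.")
    # turn off evaluation of all later chunks when BH is missing
    knitr::opts_chunk$set(eval = has_BH)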
Re: [R-pkg-devel] [R] a question of etiquette
On 02/06/2020 11:14 a.m., Adelchi Azzalini wrote: Thanks to all the people who contributed to this discussion, which turned out to be interesting, definitely not something which I expected at the beginning. To avoid verbosity, I restrict myself to two more points. (1) In case one adopts the indication that all the authors of a portion of code (irrespective of the file extension, even in other languages) are "of course now also copyright holders and authors", how should this be translated into the nomenclature of Writing R Extensions (Section 1.1.1 The DESCRIPTION file)? In this view, "ctb" is the more appropriate role, I believe. The "aut" label is not the right option. Otherwise, dozens of CRAN packages where "ctb" is extensively used should be amended. (2) Having read the pertinent portions of the manuals and pondered the messages, I have come to the conclusion that the terminology set up in the above-quoted paragraph of "Writing R Extensions" is not always ideal. This issue would take too much time and space, so I only indicate one point: the role ‘"cre"’ (creator) for the package maintainer. There are many cases where this description does not fit. For instance, I have seen packages where an author has designed the package, written the entire code and documentation alone, maintained the package for some years, and then passed on the mere maintenance to somebody else; definitely, I would not describe the second person as the "creator". I think the issue was that those roles were not invented by R; they are standard MARC roles (reference listed in the ?person help page), and the full list contains nothing that is particularly close to the role of maintainer. Creator [cre] is defined as "A person or organization responsible for the intellectual or artistic content of a resource". The common English use of the word creator would match that with "originally responsible", whereas a maintainer is "currently responsible", so it's not completely off-base. Maybe R shouldn't have tried to use MARC roles, or should have invented an additional one. It's a bit late for that now, though. Duncan Murdoch So, I think the safe way is to include the original authors in the author list (and check their license carefully). In general, "check the license" is a very sensible indication. In the specific case, the Matlab code comes with no licence indication - nothing. I have now submitted mnormt_2.0.0.tar.gz to CRAN, with a comment/query about this issue. Let us see what "The CRAN" says. In case you want to see the conclusion, the outcome should appear at https://cran.r-project.org/package=mnormt in a few days. Best regards, Adelchi Azzalini __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
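For reference, a DESCRIPTION sketch of the roles under discussion (the names, email and comment are hypothetical):

    Authors@R: c(
        person("Ada", "Author", email = "ada@example.org",
               role = c("aut", "cre")),
        person("Carl", "Coder", role = "ctb",
               comment = "author of the original Matlab routines"))

Here "cre" marks the current maintainer (the address CRAN will contact), "aut" the principal authors, and "ctb" smaller contributions such as adapted code.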
Re: [R-pkg-devel] Workflow for Javascript in package?
On 01/06/2020 9:15 a.m., Bryan Hanson wrote: Duncan, I don’t think it is a particularly clever or efficient example, but my package exCon does some of what you describe. Additional disclaimer: I’m not very good at serious Javascript, and find the scoping rules challenging. Even so, exCon works pretty well on large data sets. But you may be more interested in the infrastructure aspects. Thanks! I hadn't thought of using js::uglify_optimize(). I'd like to do the minification during package install rather than when used, and this appears to work in the R source code (at top level): text <- readLines(system.file("htmlwidgets/lib/rglClass/rglClass.src.js", package = "rgl")) if (requireNamespace("js", quietly = TRUE)) text <- js::uglify_optimize(text) writeLines(text, file.path(system.file("htmlwidgets/lib/rglClass", package = "rgl"), "rglClass.js")) rm(text) I'm not completely sure this is supported/allowed, though I don't see a CRAN rule to disallow it. Now I can work on splitting up the big file, and doing a somewhat more complicated install of rglClass.js. Duncan Murdoch https://cran.r-project.org/package=exCon The gathering of separate .js files, minifying and deployment are handled in the main function exCon.R. Javascript and html files in inst/extdata. Bryan Prof. Bryan Hanson (emeritus) Dept of Chemistry & Biochemistry DePauw University Greencastle IN 46135 USA Web: academic.depauw.edu/~hanson/index.html <http://academic.depauw.edu/~hanson/index.html> Repo: github.com/bryanhanson <http://github.com/bryanhanson> Nerdy Blog: ChemoSpec.org <http://ChemoSpec.org> The Twit: @ProfBryanHanson I’m usually @ -4 GMT/UTC On Jun 1, 2020, at 7:44 AM, Duncan Murdoch <mailto:murdoch.dun...@gmail.com>> wrote: The rgl package includes a large amount of Javascript source to handle the display of output in browsers using rglwidget(). Currently this is mostly in one big file (inst/htmlwidgets/lib/rglClass/rglClass.src.js), but I'd like to make some improvements: - splitting it into separate files with related functions - automatically "compiling" it into a single file stripped of white space and comments, for faster loading. Unfortunately, I have no other experience writing a Javascript library, so I really don't know what I'm doing. Can anyone point me to other R packages containing htmlwidgets that do this "properly", or point to instructions for the standard workflow for Javascript library development outside of R that I could adapt? Duncan Murdoch __ R-package-devel@r-project.org <mailto:R-package-devel@r-project.org> mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
[R-pkg-devel] Workflow for Javascript in package?
The rgl package includes a large amount of Javascript source to handle the display of output in browsers using rglwidget(). Currently this is mostly in one big file (inst/htmlwidgets/lib/rglClass/rglClass.src.js), but I'd like to make some improvements: - splitting it into separate files with related functions - automatically "compiling" it into a single file stripped of white space and comments, for faster loading. Unfortunately, I have no other experience writing a Javascript library, so I really don't know what I'm doing. Can anyone point me to other R packages containing htmlwidgets that do this "properly", or point to instructions for the standard workflow for Javascript library development outside of R that I could adapt? Duncan Murdoch __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
Re: [R-pkg-devel] Error creating union class: object ‘.__C__compMatrix’ not found
On 29/04/2020 1:16 p.m., renozao wrote: Thank you Martin, Looks like the mMatrix class defined in Matrix is not exported in 3.6.3, maybe it is now exported in the current R-devel. Matrix doesn't have the same versions as R: it's a recommended package, not a base package. It can be updated independently of R. The current version on CRAN is 1.2-18, but the R-forge version is 1.3-0. Neither one exports mMatrix. Duncan Murdoch In this case I'd rather use a conditional import and definition. I'll try that route. Bests, Renaud Sent with ProtonMail Secure Email. ‐‐‐ Original Message ‐‐‐ On Saturday, April 18, 2020 12:36 PM, Martin Maechler wrote: renozao on Wed, 8 Apr 2020 16:19:59 + writes: Thank you William for the reproducible example. > Currently I using the following (same as in William's example): > setClassUnion("mMatrix", c("Matrix", "matrix")) Martin, are the changes made in the union class handling affecting the way we should declare them? Thank you. Bests, > Renaud Thank you, Renaud, and Bill Dunlap. There is obviously a bit of a problem there, but it may well be "only" a problem in error handling. As Bill's trace(get, ..) shows, R tries to get "#HAS_DUPLICATE_CLASS_NAMES" which is indeed an indication of the problem: You are trying to REdefine a class union that already exists identically in the Matrix package: In the R-forge development version of 'Matrix', it is line 717 of Matrix/R/AllClass.R (and that will be close also in the CRAN version of Matrix). So at least for you, Renaud, the solution to the problem is easy: Just don't do what you should not do: What you want is already part of Matrix and after searching: It's been part of Matrix since ca. July 2012 ... - But yes, there's a buglet in the 'methods' package currently, which leads to a misleading error message (It's arguable if it should give an error which it did not previously; I think it would be quite a good idea to give at least a warning as you are masking class definition of the Matrix package which is in your search() path or at least among the loaded namespaces at this time). Best, Martin ‐‐‐ Original Message ‐‐‐ > On Wednesday, April 8, 2020 11:19 AM, William Dunlap wrote: Use trace() to get a bit more detail - .__C_compMatrix is looked for in the wrong environment with inherits=FALSE. >> >>> setClassUnion("mMatrix", c("Matrix", "matrix")) >> Tracing get(name, envir = env) on entry >> x=".AllMTable", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".MTable", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".AllMTable", envir="", topenv="", inherits=TRUE >> Tracing get("#HAS_DUPLICATE_CLASS_NAMES", envir = .classTable) on entry >> x="#HAS_DUPLICATE_CLASS_NAMES", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".AllMTable", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".MTable", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".MTable", envir="", topenv="", inherits=TRUE >> Tracing get(".SigLength", envir = env) on entry >> x=".SigLength", envir="", topenv="", inherits=TRUE >> Tracing get(".SigLength", envir = env) on entry >> x=".SigLength", envir="", topenv="", inherits=TRUE >> Tracing get(name, envir = env) on entry >> x=".MTable", envir="", topenv="", inherits=TRUE >> Tracing get(".SigLength", envir = env) on entry >> x
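A hedged sketch of the conditional definition Renaud mentions, using the union name "mMatrix" from this thread; the idea is to define the union only when no definition is already visible (for instance from Matrix, which has carried it since ca. 2012). Whether the existing definition is visible depends on what is loaded when this code runs, so treat it as a sketch rather than a drop-in fix:

    library(methods)
    if (!isClass("mMatrix")) {
      setClassUnion("mMatrix", c("Matrix", "matrix"))
    }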
Re: [R-pkg-devel] Package submission failed with two warnings
For your first warning, see the CRAN policy: "Updates to previously-published packages must have an increased version. Increasing the version number at each submission reduces confusion so is preferred even when a previous submission was not accepted." For the second, it's hard to say what's happening. Is your package online somewhere so we could try it on different machines? Duncan Murdoch On 15/05/2020 12:46 a.m., FARSHAD TABASINEJAD wrote: Dear R-package-devel experts, This is the first time I’m writing to this platform about a package I recently submitted to CRAN (my first package). While the package is already available on CRAN, I still must fix a few warnings and one error as reported on the CRAN package check results page: https://cran.r-project.org/web/checks/check_results_Rpvt.html I recently resubmitted a modified version of the package to CRAN, however, it didn’t pass the automatic incoming checks. Windows: https://win-builder.r-project.org/incoming_pretest/Rpvt_0.1.0_20200512_051950/Windows/00check.log Status: 1 WARNING Debian: https://win-builder.r-project.org/incoming_pretest/Rpvt_0.1.0_20200512_051950/Debian/00check.log Status: 2 WARNINGs 1) The first warning that appears on both Windows and Debian results is “Insufficient package version (submitted: 0.1.0, existing: 0.1.0)”. Since the package is conditionally available on CRAN, do I need to modify the package version to 0.1.1 to get rid of this warning? Is there any way to fix this problem with the current version of the package(0.1.0)? 2) The second problem appears on the Debian test results: * checking re-building of vignette outputs ... [8s/8s] WARNING Error(s) in re-building vignettes: ... --- re-building ‘Rpvt.Rmd’ using rmarkdown pandoc-citeproc: Error in $: Incompatible API versions: encoded with [1,20] but attempted to decode with [1,17,5,4]. CallStack (from HasCallStack): error, called at ./Text/Pandoc/JSON.hs:111:48 in pandoc-types-1.17.5.4 5tHZ3B61A58JaKOMxwGQR4:Text.Pandoc.JSON Error running filter /usr/bin/pandoc-citeproc: Filter returned error status 1 Error: processing vignette 'Rpvt.Rmd' failed with diagnostics: pandoc document conversion failed with error 83 --- failed re-building ‘Rpvt.Rmd’ SUMMARY: processing the following file failed: ‘Rpvt.Rmd’ Error: Vignette re-building failed. Execution halted I’ve created this package on a Windows operating system with no problem in creating the ‘Rpvt.Rmd’ file as is the case with the CRAN tests on x86_64-w64-mingw32 (64-bit). Why does it fail to create the “Rpvt.Rmd” file on x86_64-pc-linux-gnu (Debian)? Is this something related to the citation style of the library.bib file that I’ve used in my vignettes folder? It is the “apa-6th-edition.csl” file downloaded from https://raw.githubusercontent.com/citation-style-language/styles/master/apa-6th-edition.csl Thank you in advance! Regards, Farshad [[alternative HTML version deleted]] __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel __ R-package-devel@r-project.org mailing list https://stat.ethz.ch/mailman/listinfo/r-package-devel
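For the first warning, the fix is simply to increase the Version field in DESCRIPTION before resubmitting, even though the CRAN copy carries the same number; a sketch (version numbers illustrative):

    Package: Rpvt
    Version: 0.1.1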