[Rd] MS Windows: R does not escape quotes in CLI options the same way as Rterm and Rscript

2021-12-15 Thread Henrik Bengtsson
On MS Windows 10, the following works:

> Rscript --vanilla -e "\"abc\""
[1] "abc"

and also:

> Rterm --vanilla --no-echo -e "\"abc.txt\""
[1] "abc.txt"

whereas attempting the same with 'R' fails:

> R --vanilla --no-echo -e "\"abc.txt\""
Error: object 'abc' not found
Execution halted

I get this with R 4.1.2 and R Under development (unstable) (2021-12-14
r81376 ucrt).

Is this a bug?

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] capturing multiple warnings in tryCatch()

2021-12-02 Thread Henrik Bengtsson
Simon's suggestion with withCallingHandlers() is the correct way.
Also, note that if you use tryCatch() to catch warnings, you're
*interrupting* the evaluation of the expression of interest, e.g.

> res <- tryCatch({ message("hey"); warning("boom"); message("there"); 42 },
+   warning = function(w) { message("Warning caught: ", conditionMessage(w)); 3.14 })
hey
Warning caught: boom
> res
[1] 3.14

Note how it never completes your expression.
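For completeness, the calling-handler pattern can be wrapped into a small helper. This is only a sketch in base R; the helper name `catch_all_warnings` is my own, not an established API:

```r
# Collect every warning raised by `expr` while letting it run to completion.
catch_all_warnings <- function(expr) {
  ws <- list()
  value <- withCallingHandlers(expr, warning = function(w) {
    ws[[length(ws) + 1L]] <<- w        # record the condition object
    invokeRestart("muffleWarning")     # and keep evaluation going
  })
  list(value = value, warnings = ws)
}

res <- catch_all_warnings({ warning("warning 1"); warning("warning 2"); 42 })
res$value                                   # 42 -- the expression completed
vapply(res$warnings, conditionMessage, "")  # "warning 1" "warning 2"
```

Unlike tryCatch(), the handler here runs without unwinding the stack, so the expression still returns its value.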

/Henrik

On Thu, Dec 2, 2021 at 1:14 PM Simon Urbanek
 wrote:
>
>
> Adapted from demo(error.catching):
>
> > W=list()
> > withCallingHandlers(foo(), warning=function(w) { W <<- c(W, list(w)); invokeRestart("muffleWarning") })
> > str(W)
> List of 2
>  $ :List of 2
>   ..$ message: chr "warning 1"
>   ..$ call   : language foo()
>   ..- attr(*, "class")= chr [1:3] "simpleWarning" "warning" "condition"
>  $ :List of 2
>   ..$ message: chr "warning 2"
>   ..$ call   : language foo()
>   ..- attr(*, "class")= chr [1:3] "simpleWarning" "warning" "condition"
>
> Cheers,
> Simon
>
>
> > On Dec 3, 2021, at 10:02 AM, Fox, John  wrote:
> >
> > Dear R-devel list members,
> >
> > Is it possible to capture more than one warning message using tryCatch()? 
> > The answer may be in ?conditions, but, if it is, I can't locate it.
> >
> > For example, in the following only the first warning message is captured 
> > and reported:
> >
> >> foo <- function(){
> > +   warning("warning 1")
> > +   warning("warning 2")
> > + }
> >
> >> foo()
> > Warning messages:
> > 1: In foo() : warning 1
> > 2: In foo() : warning 2
> >
> >> bar <- function(){
> > +   tryCatch(foo(), warning=function(w) print(w))
> > + }
> >
> >> bar()
> > 
> >
> > Is there a way to capture "warning 2" as well?
> >
> > Any help would be appreciated.
> >
> > John
> >
> > --
> > John Fox, Professor Emeritus
> > McMaster University
> > Hamilton, Ontario, Canada
> > Web: http://socserv.mcmaster.ca/jfox/
> >
> >
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



[Rd] R-devel: as.vector(x, mode = "list") drops attributes despite documented not to

2021-12-01 Thread Henrik Bengtsson
Hi,

in R 4.1.2 we have:

> x <- structure(as.list(1:2), dim = c(1,2))
> x
     [,1] [,2]
[1,] 1    2
> as.vector(x, mode = "list")
     [,1] [,2]
[1,] 1    2

whereas in recent versions of R-devel (4.2.0) we have:

> x <- structure(as.list(1:2), dim = c(1,2))
> x
     [,1] [,2]
[1,] 1    2
> as.vector(x, mode = "list")
[[1]]
[1] 1

[[2]]
[1] 2

However, as I read ?as.vector, dropping of attributes should _not_
happen for non-atomic results such as lists.  Is the new behavior a
mistake?

Specifically, ?as.vector says:

'as.vector', a generic, attempts to coerce its argument into a vector
of mode 'mode' (the default is to coerce to whichever vector mode is
most convenient): if the result is atomic all attributes are removed.

[...]

Details:

The atomic modes are "logical", "integer", "numeric" (synonym
"double"), "complex", "character" and "raw".

[...] On the other hand, as.vector removes all attributes including
names for results of atomic mode (but not those of mode "list" nor
"expression").

Value:

[...]

For as.vector, a vector (atomic or of type list or expression). All
attributes are removed from the result if it is of an atomic mode, but
not in general for a list result. The default method handles 24 input
types and 12 values of type: the details of most coercions are
undocumented and subject to change.
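The atomic-mode half of that rule is stable across versions and easy to check directly (a small base-R illustration of my own, for contrast with the list case under discussion):

```r
# For results of atomic mode, as.vector() drops all attributes, per the docs:
xa <- structure(1:2, dim = c(1, 2))   # an integer matrix
v <- as.vector(xa, mode = "integer")
v                                     # [1] 1 2
attributes(v)                         # NULL -- the dim attribute was removed
```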

/Henrik



[Rd] mapply(): Special case of USE.NAMES=TRUE with recent R-devel updates

2021-11-30 Thread Henrik Bengtsson
Hi,

in R-devel (4.2.0), we now get:

> mapply(paste, "A", character(), USE.NAMES = TRUE)
named list()

Now, in ?mapply we have:

USE.NAMES: logical; use the names of the first ... argument, or if
that is an unnamed character vector, use that vector as the names.

This basically says we should get:

> answer <- list()
> first <- "A"
> names(answer) <- first

which obviously is an error. The help is not explicit about what should
happen when the length of "the first ... argument" is zero, but the
above behavior effectively does something like:

> answer <- list()
> first <- "A"
> names(answer) <- first[seq_along(answer)]
> answer
named list()

Is there a need for the docs to be updated, or should the result be an
unnamed empty list?
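The two snippets above can be made directly runnable (base R only):

```r
answer <- list()
first <- "A"

# Literal reading of ?mapply: use the character vector as the names.
# For a zero-length result this fails:
r <- try(names(answer) <- first, silent = TRUE)
inherits(r, "try-error")                  # TRUE

# What the observed behavior effectively does: truncate the names
# to the length of the result:
names(answer) <- first[seq_along(answer)]
answer                                    # named list()
```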

/Henrik



Re: [Bioc-devel] Use set.seed inside function

2021-11-29 Thread Henrik Bengtsson
The easiest is to use withr::with_seed(), e.g.

> withr::with_seed(seed = 42L, randomcoloR::distinctColorPalette(6))
[1] "#A0E1BC" "#B8E363" "#D686BE" "#DEA97F" "#B15CD8" "#A2B9D5"
> withr::with_seed(seed = 42L, randomcoloR::distinctColorPalette(6))
[1] "#A0E1BC" "#B8E363" "#D686BE" "#DEA97F" "#B15CD8" "#A2B9D5"

It works by restoring globalenv()$.Random.seed after the random number
generator state has been updated.  If you want to roll your own version of this,
you need to make sure to handle the special case when there is no
pre-existing .Random.seed in globalenv().
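A rough base-R sketch of that roll-your-own version, including the special case of no pre-existing .Random.seed (the helper name `local_seed` is mine; `withr::with_seed()` remains the battle-tested option):

```r
# Evaluate `expr` with a fixed seed, then restore the caller's RNG state.
local_seed <- function(seed, expr) {
  g <- globalenv()
  has_seed <- exists(".Random.seed", envir = g, inherits = FALSE)
  if (has_seed) old <- get(".Random.seed", envir = g)
  on.exit(
    if (has_seed) assign(".Random.seed", old, envir = g)
    else rm(".Random.seed", envir = g)   # set.seed() created it; undo that
  )
  set.seed(seed)
  expr  # lazily evaluated here, after the seed has been set
}

x <- local_seed(42L, runif(3))
y <- local_seed(42L, runif(3))
identical(x, y)  # TRUE -- reproducible, without leaking RNG state
```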

Regarding packages and functions changing the random seed via a
set.seed() [without undoing it]: this should *never* be done, because
it will wreak havoc on analyses and studies that rely on random
numbers.  My rule of thumb: only the end-user should be allowed to use
set.seed(), which should typically be done at the top of their R
scripts.

/Henrik

On Mon, Nov 29, 2021 at 1:23 PM Meng Chen  wrote:
>
> Thanks. I think it may work in theory, generating "enough" distinct colors
> is fairly easy. Then the problem will be how to find a subset of colors of
> size n, and the selected colors are still most distinguishable. I think I
> will do this with my eyes if no other methods, a tedious job.
> But at least for my curiosity, I still want to know if there are other ways
> to achieve this. I feel like 80% of people who use the distinctColorPallete
> function actually don't need the "random" feature :) Thanks.
>
> On Mon, Nov 29, 2021 at 9:39 PM James W. MacDonald  wrote:
>
> > It appears that you don't actually want random colors, but instead you
> > want the same colors each time. Why not just generate the vector of 'random
> > distinct colors' one time and save the vector of colors?
> >
> > -Original Message-
> > From: Bioc-devel  On Behalf Of Meng Chen
> > Sent: Monday, November 29, 2021 3:21 PM
> > To: bioc-devel@r-project.org
> > Subject: [Bioc-devel] Use set.seed inside function
> >
> > Dear BioC team and developers,
> >
> > I am using BiocCheck to check my package, it returns a warning:
> > " Remove set.seed usage in R code"
> >
> > I am using "set.seed" inside my functions, before calling function
> > distinctColorPalette (randomcoloR package) in order to generate
> > reproducible "random distinct colors". So what would be the best practice
> > to solve this warning? I think 1. use set.seed and don't change anything.
> > 2. use the set.seed function, but include something like below inside the
> > function *gl.seed <- .Random.seed* *on.exit(assign(".Random.seed", gl.seed,
> > envir = .GlobalEnv))* 3. use some other functions for the purpose
> >
> > Any suggestions will be appreciated. Thanks.
> > --
> > Best Regards,
> > Chen
> >
> > [[alternative HTML version deleted]]
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
> >
>
>
> --
> Best Regards,
> Chen
>
> [[alternative HTML version deleted]]
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel



Re: [Rd] How can a package be aware of whether it's on CRAN

2021-11-23 Thread Henrik Bengtsson
On Tue, Nov 23, 2021 at 12:06 PM Gábor Csárdi  wrote:
>
> On Tue, Nov 23, 2021 at 8:49 PM Henrik Bengtsson
>  wrote:
> >
> > > Is there any reliable way to let packages to know if they are on CRAN, so 
> > > they can set omp cores to 2 by default?
> >
> > Instead of testing for "on CRAN" or not, you can test for 'R CMD
> > check' running or not. 'R CMD check' sets environment variable
> > _R_CHECK_LIMIT_CORES_=TRUE. You can use that to limit your code to run
> > at most two (2) parallel threads or processes.
>
> AFAICT this is only set with --as-cran and many CRAN machines don't
> use that and I am fairly sure that some of them don't set this env var
> manually, either.

Oh my - yes & yes, especially on the second part - I totally forgot.
So, that alone is not sufficient. It's not meant to be easy, eh?

So, parallelly::availableCores() tries to account for this as well by
detecting that 'R CMD check' runs, cf.
https://github.com/HenrikBengtsson/parallelly/blob/3e403f600e7181423b9d77c739373d36b4fe34df/R/zzz.R#L42-L47

/Henrik

>
> Gabor
>
> [...]



Re: [Rd] How can a package be aware of whether it's on CRAN

2021-11-23 Thread Henrik Bengtsson
> Is there any reliable way to let packages to know if they are on CRAN, so 
> they can set omp cores to 2 by default?

Instead of testing for "on CRAN" or not, you can test for 'R CMD
check' running or not. 'R CMD check' sets environment variable
_R_CHECK_LIMIT_CORES_=TRUE. You can use that to limit your code to run
at most two (2) parallel threads or processes.
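A minimal sketch of such a guard (the helper name and the fallback are my own; note the caveat raised elsewhere in this thread that the variable is only set under 'R CMD check --as-cran'):

```r
# Cap the number of workers when 'R CMD check' signals a core limit.
limited_cores <- function(default = 2L) {
  flag <- Sys.getenv("_R_CHECK_LIMIT_CORES_", "")
  if (nzchar(flag) && !identical(toupper(flag), "FALSE")) 2L else default
}

Sys.setenv("_R_CHECK_LIMIT_CORES_" = "TRUE")
limited_cores(default = 8L)  # 2 -- limit detected
Sys.unsetenv("_R_CHECK_LIMIT_CORES_")
limited_cores(default = 8L)  # 8 -- no limit in effect
```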

The parallelly::availableCores() function is agile to this and many
other settings, i.e. it'll return 2 when running via 'R CMD check'. As
the author, I obviously suggest using that function to query what
amount of CPU resources your R process is allowed to use. For more
info, see 
.

/Henrik

PS. I'm in the camp of *not* having R packages parallelize by default.
At least not until R and its community have figured out how to avoid
ending up with nested parallelization (e.g. via dependencies) by
mistake.  We would also need a standard for end-users (and the sysadms
on the machines they're running) to control the default number of CPU
cores the R session may use.  Right now we only have a few scattered
settings for separate purposes, e.g. option 'mc.cores'/env var
'MC_CORES', and option 'Ncpus', which is not enough for establishing a
de facto standard.

On Tue, Nov 23, 2021 at 11:11 AM Dipterix Wang  wrote:
>
> Dear R wizards,
>
> I recently received an email from Prof. Ripley. He pointed out that my 
> package seriously violates the CRAN policy: "using 8 threads is a serious 
> violation of the CRAN policy”. By default the number of cores my package uses 
> is determined from system CPU cores. After carefully reading all the CRAN 
> policies, now I understand that CRAN does not allow a package to use more 
> than 2 CPU cores when checking a package. I can easily change my code to let 
> my tests comply to that constraint.
>
> However, this warning worries me because my package uses OpenMP. I got 
> “caught" partially because I printed the number of cores used in the package 
> startup message, and one of my test exceeded the time limit (which leads to 
> manual inspection). However, what if I develop a package that imports on 
> those openmp-dependent packages? (For example, data.table, fst…) These 
> packages use more than 2 cores by default. If not carefully treated, it’ll be 
> very easy to exceed that limit, and it’s very hard for CRAN to detect it.
>
> Is there any reliable way to let packages to know if they are on CRAN, so 
> they can set omp cores to 2 by default?
>
> Best,
> - Dipterix
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



[Rd] DOCS: Exactly when, in the signaling process, is option 'warn' applied?

2021-11-18 Thread Henrik Bengtsson
Hi,

the following question sprung out of a package settings option warn=-1
to silence warnings, but those warnings were still caught by
withCallingHandlers(..., warning), which the package author did not
anticipate. The package has been updated to use suppressWarnings()
instead, but as I see a lot of packages on CRAN [1] use
options(warn=-1) to temporarily silence warnings, I wanted to bring
this one up. Even base R itself [2] does this, e.g.
utils::assignInMyNamespace().

Exactly when is the value of 'warn' options used when calling warning("boom")?

I think the docs, including ?options, would benefit from clarifying
that. To the best of my understanding, it should also mention that
options 'warn' is meant to be used by end-users, and not in package
code where suppressWarnings() should be used.

To clarify, if we do:

> options(warn = -1)
> tryCatch(warning("boom"), warning = function(w) stop("Caught warning: ",
+   conditionMessage(w), call. = FALSE))
Error: Caught warning: boom

we see that the warning is indeed signaled.  However, in Section '8.2
warning' of the 'R Language Definition' [3], we can read:

"The function `warning` takes a single argument that is a character
string. The behaviour of a call to `warning` depends on the value of
the option `"warn"`. If `"warn"` is negative warnings are ignored.
[...]"

The way this is written, it may suggest that warnings are
ignored/silenced already early on when calling warning(), but the
above example shows that that is not the case.

From the same section, we can also read:

"[...] If it is zero, they are stored and printed after the top-level
function has completed. [...]"

which may hint that the 'warn' option is applied only when a warning
condition is allowed to "bubble up" all the way to the top level.
(FWIW, this is how I always thought it worked, but it's only now that
I have looked into the docs and see that they are ambiguous on this.)
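The package scenario that started this thread can be reduced to a few lines (base R; my own demonstration): options(warn = -1) does not stop a calling handler from seeing the condition.

```r
old <- options(warn = -1)   # "silence" warnings, package-style
caught <- FALSE
res <- withCallingHandlers(
  { warning("boom"); "completed" },
  warning = function(w) { caught <<- TRUE; invokeRestart("muffleWarning") }
)
options(old)
caught  # TRUE -- the handler ran despite warn = -1
res     # "completed"
```

This is why suppressWarnings() (which muffles the condition itself) is the right tool in package code, while options(warn = ...) only governs what happens to warnings that reach the top level.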

/Henrik

[1] 
https://github.com/search?q=org%3Acran+language%3Ar+R%2F+in%3Afile%2Cpath+options+warn+%22-1%22=Code
[2] 
https://github.com/wch/r-source/blob/0a31ab2d1df247a4289efca5a235dc45b511d04a/src/library/utils/R/objects.R#L402-L405
[3] https://cran.r-project.org/doc/manuals/R-lang.html#warning



[Rd] gettext(msgid, domain="R") doesn't work for some 'msgid':s

2021-11-05 Thread Henrik Bengtsson
I'm trying to reuse some of the translations available in base R by using:

  gettext(msgid, domain="R")

This works great for most 'msgid's, e.g.

$ LANGUAGE=de Rscript -e 'gettext("cannot get working directory", domain="R")'
[1] "kann das Arbeitsverzeichnis nicht ermitteln"

However, it does not work for all.  For instance,

$ LANGUAGE=de Rscript -e 'gettext("Execution halted\n", domain="R")'
[1] "Execution halted\n"

This despite that 'msgid' existing in:

$ grep -C 2 -F 'Execution halted\n' src/library/base/po/de.po

#: src/main/main.c:342
msgid "Execution halted\n"
msgstr "Ausführung angehalten\n"

It could be that the trailing newline causes problems, because the
same happens also for:

$ LANGUAGE=de Rscript --vanilla -e 'gettext("error during cleanup\n",
domain="R")'
[1] "error during cleanup\n"

Is this meant to work, and if so, how do I get it to work, or is it a bug?

Thanks,

Henrik



Re: [Rd] BUG?: R CMD check with --as-cran *disables* checks for unused imports otherwise performed

2021-11-02 Thread Henrik Bengtsson
I've just posted this to BugZilla as PR18229
(https://bugs.r-project.org/show_bug.cgi?id=18229) to make sure it's
tracked.

/Henrik

On Wed, Oct 20, 2021 at 8:08 PM Jeffrey Dick  wrote:
>
> FWIW, I also encountered this issue and posted on R-pkg-devel about it, with 
> no resolution at the time (May 2020). See "Dependencies NOTE lost with 
> --as-cran" (https://stat.ethz.ch/pipermail/r-package-devel/2020q2/005467.html)
>
> On Wed, Oct 20, 2021 at 11:55 PM Henrik Bengtsson 
>  wrote:
>>
>> ISSUE:
>>
>> Using 'R CMD check' with --as-cran sets
>> _R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_=TRUE, whereas the
>> default is FALSE, which you get if you don't add --as-cran.
>> I would expect --as-cran to check more things and be more conservative
>> than without.  So, is this behavior a mistake?  Could it be a thinko
>> around the negating "IGNORE", and the behavior is meant to be vice
>> versa?
>>
>> Example:
>>
>> $ R CMD check QDNAseq_1.29.4.tar.gz
>> ...
>> * using R version 4.1.1 (2021-08-10)
>> * using platform: x86_64-pc-linux-gnu (64-bit)
>> ...
>> * checking dependencies in R code ... NOTE
>> Namespace in Imports field not imported from: ‘future’
>>   All declared Imports should be used.
>>
>> whereas, if I run with --as-cran, I don't get that NOTE;
>>
>> $ R CMD check --as-cran QDNAseq_1.29.4.tar.gz
>> ...
>> * checking dependencies in R code ... OK
>>
>>
>> TROUBLESHOOTING:
>>
>> In src/library/tools/R/check.R [1], the following is set if --as-cran is 
>> used:
>>
>>   Sys.setenv("_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_" = "TRUE")
>>
>> whereas, if not set, the default is:
>>
>> ignore_unused_imports <-
>> config_val_to_logical(Sys.getenv("_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_",
>> "FALSE"))
>>
>> [1] 
>> https://github.com/wch/r-source/blob/b50e3f755674cbb697a4a7395b766647a5cfeea2/src/library/tools/R/check.R#L6335
>> [2] 
>> https://github.com/wch/r-source/blob/b50e3f755674cbb697a4a7395b766647a5cfeea2/src/library/tools/R/QC.R#L5954-L5956
>>
>> /Henrik
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] Fwd: Using existing envars in Renviron on friendly Windows

2021-11-02 Thread Henrik Bengtsson
Oh, I see, I misunderstood.  Thanks for clarifying.

One more thing: you can mix and match environment variables and strings
with escaped characters, mimicking how a POSIX shell does it, by
combining double- and single-quoted strings. For example, with:

$ cat .Renviron
APPDATA='C:\Users\foobar\AppData\Roaming'
R_LIBS_USER="${APPDATA}"'\R-library'

we get:

$ Rscript --no-init-file --quiet -e 'cat(sprintf("R_LIBS_USER=[%s]\n",
Sys.getenv("R_LIBS_USER")))'
R_LIBS_USER=[C:\Users\foobar\AppData\Roaming\R-library]

and

$ source .Renviron
$ echo "R_LIBS_USER=[${R_LIBS_USER}]"
R_LIBS_USER=[C:\Users\foobar\AppData\Roaming\R-library]

/Henrik

On Sun, Oct 31, 2021 at 2:59 AM Tomas Kalibera  wrote:
>
>
> On 10/31/21 2:55 AM, Henrik Bengtsson wrote:
> >> ... If one still needed backslashes,
> >> they could then be entered in single quotes, e.g. VAR='c:\users'.
> > I don't think it matters whether you use single or double quotes -
> > both will work.  Here's a proof of concept on Linux with R 4.1.1:
> >
> > $ cat ./.Renviron
> > A=C:\users
> > B='C:\users'
> > C="C:\users"
> >
> > $ Rscript -e "Sys.getenv(c('A', 'B', 'C'))"
> >A   B   C
> >"C:users" "C:\\users" "C:\\users"
>
> Yes, but as I wrote "I think the Renviron files should be written in a
> way so that they would work the same in a POSIX shell". This is why
> single quotes. With double quotes, backslashes are interpreted
> differently from a POSIX shell.
>
> Tomas
>
>
> >
> > /Henrik
> >
> > On Wed, Oct 27, 2021 at 11:45 AM Tomas Kalibera
> >  wrote:
> >>
> >> On 10/21/21 5:18 PM, Martin Maechler wrote:
> >>>>>>>> Michał Bojanowski
> >>>>>>>>   on Wed, 20 Oct 2021 16:31:08 +0200 writes:
> >>>   > Hello Tomas,
> >>>   > Yes, that's accurate although rather terse, which is perhaps the
> >>>   > reason why I did not realize it applies to my case.
> >>>
> >>>   > How about adding something in the direction of:
> >>>
> >>>   > 1. Continuing the cited paragraph with:
> >>>   > In particular, on Windows it may be necessary to quote references 
> >>> to
> >>>   > existing environment variables, especially those containing file 
> >>> paths
> >>>   > (which include backslashes). For example: `"${WINVAR}"`.
> >>>
> >>>   > 2. Add an example (not run):
> >>>
> >>>   > # On Windows do quote references to variables containing paths, 
> >>> e.g.:
> >>>   > # If APPDATA=C:\Users\foobar\AppData\Roaming
> >>>   > # to point to a library tree inside APPDATA in .Renviron use
> >>>   > R_LIBS_USER="${APPDATA}"/R-library
> >>>
> >>>   > Incidentally the last example is on backslashes too.
> >>>
> >>>
> >>>   > What do you think?
> >>>
> >>> I agree that adding an example really helps a lot in such cases,
> >>> in my experience, notably if it's precise enough to be used +/- directly.
> >> Yes, I agree as well. I think the Renviron files should be written in a
> >> way so that they would work the same in a POSIX shell, so e.g.
> >> VAR="${VAR0}" or VAR="${VAR0}/subdir" are the recommended ways to
> >> preserve backslashes in VAR0. It is better to use forward slashes in
> >> string literals, e.g. VAR="c:/users". If one still needed backslashes,
> >> they could then be entered in single quotes, e.g. VAR='c:\users'.
> >>
> >> The currently implemented parsing of Renviron files differs in a number
> >> of details from POSIX shells, some are documented and some are not.
> >> Relying only on the documented behavior that is the same as in POSIX
> >> shells is the best choice for future compatibility.
> >>
> >> Tomas
> >>
> >>>
> >>>   > On Mon, Oct 18, 2021 at 5:02 PM Tomas Kalibera 
> >>>  wrote:
> >>>   >>
> >>>   >>
> >>>   >> On 10/15/21 6:44 PM, Michał Bojanowski wrote:
> >>>   >> > Perhaps a small update to ?.Renviron would be in order to 
> >>> mention that...
> >>>   >>
> >>>   >> Would you have a more spe

Re: [Rd] Fwd: Using existing envars in Renviron on friendly Windows

2021-10-30 Thread Henrik Bengtsson
> ... If one still needed backslashes,
> they could then be entered in single quotes, e.g. VAR='c:\users'.

I don't think it matters whether you use single or double quotes -
both will work.  Here's a proof of concept on Linux with R 4.1.1:

$ cat ./.Renviron
A=C:\users
B='C:\users'
C="C:\users"

$ Rscript -e "Sys.getenv(c('A', 'B', 'C'))"
        A           B           C 
"C:users" "C:\\users" "C:\\users" 

/Henrik

On Wed, Oct 27, 2021 at 11:45 AM Tomas Kalibera
 wrote:
>
>
> On 10/21/21 5:18 PM, Martin Maechler wrote:
> >> Michał Bojanowski
> >>  on Wed, 20 Oct 2021 16:31:08 +0200 writes:
> >  > Hello Tomas,
> >  > Yes, that's accurate although rather terse, which is perhaps the
> >  > reason why I did not realize it applies to my case.
> >
> >  > How about adding something in the direction of:
> >
> >  > 1. Continuing the cited paragraph with:
> >  > In particular, on Windows it may be necessary to quote references to
> >  > existing environment variables, especially those containing file 
> > paths
> >  > (which include backslashes). For example: `"${WINVAR}"`.
> >
> >  > 2. Add an example (not run):
> >
> >  > # On Windows do quote references to variables containing paths, e.g.:
> >  > # If APPDATA=C:\Users\foobar\AppData\Roaming
> >  > # to point to a library tree inside APPDATA in .Renviron use
> >  > R_LIBS_USER="${APPDATA}"/R-library
> >
> >  > Incidentally the last example is on backslashes too.
> >
> >
> >  > What do you think?
> >
> > I agree that adding an example really helps a lot in such cases,
> > in my experience, notably if it's precise enough to be used +/- directly.
>
> Yes, I agree as well. I think the Renviron files should be written in a
> way so that they would work the same in a POSIX shell, so e.g.
> VAR="${VAR0}" or VAR="${VAR0}/subdir" are the recommended ways to
> preserve backslashes in VAR0. It is better to use forward slashes in
> string literals, e.g. VAR="c:/users". If one still needed backslashes,
> they could then be entered in single quotes, e.g. VAR='c:\users'.
>
> The currently implemented parsing of Renviron files differs in a number
> of details from POSIX shells, some are documented and some are not.
> Relying only on the documented behavior that is the same as in POSIX
> shells is the best choice for future compatibility.
>
> Tomas
>
> >
> >
> >  > On Mon, Oct 18, 2021 at 5:02 PM Tomas Kalibera 
> >  wrote:
> >  >>
> >  >>
> >  >> On 10/15/21 6:44 PM, Michał Bojanowski wrote:
> >  >> > Perhaps a small update to ?.Renviron would be in order to mention 
> > that...
> >  >>
> >  >> Would you have a more specific suggestion how to update the
> >  >> documentation? Please note that it already says
> >  >>
> >  >> "‘value’ is then processed in a similar way to a Unix shell: in
> >  >> particular the outermost level of (single or double) quotes is 
> > stripped,
> >  >> and backslashes are removed except inside quotes."
> >  >>
> >  >> Thanks,
> >  >> Tomas
> >  >>
> >  >> > On Fri, Oct 15, 2021 at 6:43 PM Michał Bojanowski 
> >  wrote:
> >  >> >> Indeed quoting works! Kevin suggested the same, but he didnt 
> > reply to the list.
> >  >> >> Thank you all!
> >  >> >> Michal
> >  >> >>
> >  >> >> On Fri, Oct 15, 2021 at 6:40 PM Ivan Krylov 
> >  wrote:
> >  >> >>> Sorry for the noise! I wasn't supposed to send my previous 
> > message.
> >  >> >>>
> >  >> >>> On Fri, 15 Oct 2021 16:44:28 +0200
> >  >> >>> Michał Bojanowski  wrote:
> >  >> >>>
> >  >>  AVAR=${APPDATA}/foo/bar
> >  >> 
> >  >>  Which is a documented way of referring to existing environment
> >  >>  variables. Now, with that in R I'm getting:
> >  >> 
> >  >>  Sys.getenv("APPDATA")# That works OK
> >  >>  [1] "C:\\Users\\mbojanowski\\AppData\\Roaming"
> >  >> 
> >  >>  so OK, but:
> >  >> 
> >  >>  Sys.getenv("AVAR")
> >  >>  [1] "C:UsersmbojanowskiAppDataRoaming/foo/bar"
> >  >> >>> Hmm, a function called by readRenviron does seem to remove 
> > backslashes,
> >  >> >>> but not if they are encountered inside quotes:
> >  >> >>>
> >  >> >>> 
> > https://github.com/r-devel/r-svn/blob/3f8b75857fb1397f9f3ceab6c75554e1a5386adc/src/main/Renviron.c#L149
> >  >> >>>
> >  >> >>> Would AVAR="${APPDATA}"/foo/bar work?
> >  >> >>>
> >  >> >>> --
> >  >> >>> Best regards,
> >  >> >>> Ivan
> >  >> > __
> >  >> > R-devel@r-project.org mailing list
> >  >> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
> >  > __
> >  > R-devel@r-project.org mailing list
> >  > https://stat.ethz.ch/mailman/listinfo/r-devel
>

Re: [Bioc-devel] Package name

2021-10-22 Thread Henrik Bengtsson
For CRAN packages it's easy. Packages on CRAN are eternal. They may be
archived, but they are never removed, so in a sense they're always
"currently on CRAN". Archived packages may still be installed, but only
with some efforts of the user. Some packages go in an out of "archived"
status depending how quick the maintainer fixes issues. Because of this, I
cannot really see how a CRAN package name can be "reused" by anyone else
without a formal handover agreement between old and new maintainers. Even
so, I think CRAN needs to approve the "update" in order to unarchive it.

Personally, I'd argue the same should apply to Bioconductor packages.
Reusing package names for other purposes/different APIs is just asking for
troubles, e.g. when it comes to future scientists trying to reproduce
legacy results.

/Henrik

On Fri, Oct 22, 2021, 03:02 Wolfgang Huber  wrote:

> This is probably a niche concern, but  I’d find it a pity if a good
> package name (*) became unavailable forever, esp. if it refers to a
> real-world concept not owned by the authors of the original package.
> Perhaps we could allow re-using a name after a grace period (say 1 or 2
> years)?
> To be extra safe, one could also require the first version number of the
> new package be much higher than the last version of the old (dead) package.
>
> (*) One example I have in mind where we re-used the name of an extinct
> project is rhdf5.
>
> Kind regards
> Wolfgang
>
> > Il giorno 21ott2021, alle ore 13:39, Kern, Lori
>  ha scritto:
> >
> > Good point.  I'll open an issue on the github to fix.
> >
> >
> > Lori Shepherd
> >
> > Bioconductor Core Team
> >
> > Roswell Park Comprehensive Cancer Center
> >
> > Department of Biostatistics & Bioinformatics
> >
> > Elm & Carlton Streets
> >
> > Buffalo, New York 14263
> >
> > 
> > From: Bioc-devel  on behalf of
> Laurent Gatto 
> > Sent: Thursday, October 21, 2021 12:53 AM
> > To: bioc-devel@r-project.org 
> > Subject: [Bioc-devel] Package name
> >
> > The Package Guidelines for Developers and Reviewers say that:
> >
> > A package name should be descriptive and should not already exist as a
> current package (case-insensitive) in Bioconductor nor CRAN.
> >
> > The sentences says current packages - does that imply that names of
> packages that have been archived (on CRAN) or deprecated (on Bioconductor)
> are available? This is likely to lead to serious confusion.
> >
> > Laurent
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> >
> https://stat.ethz.ch/mailman/listinfo/bioc-devel
> >
> >
> >
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel
>




Re: [Rd] Fwd: Using existing envars in Renviron on friendly Windows

2021-10-20 Thread Henrik Bengtsson
Two comments/suggestions:

1. What about recommending to always quote the value in Renviron
files, e.g. ABC="Hello world" and DEF="${APPDATA}/R-library"?  This
would be a practice that works on all platforms.

2. What about having readRenviron() escape strings it imports via
environment variables?  See the example below.  Is there ever a use case
where someone wants, needs, or even relies on the current behavior? (I
would even like to argue that the current behavior is a design bug that
should be fixed.)  As an analogue from the shell world, Bash escapes
its input.

To illustrate the latter, with:

A=C:\\ABC
B=${A}
C="${A}"

or equivalently:

A="C:\ABC"
B=${A}
C="${A}"

we currently get:

$ Rscript -e "Sys.getenv(c('A', 'B', 'C'))"
        A       B         C 
"C:\\ABC" "C:ABC" "C:\\ABC" 

If base::readRenviron() would escape "input" environment variables, we
would get identical values for both 'B' and 'C', which I think is what
most people would expect.

To be clear, this is a problem that occurs on all platforms, but it's
more likely to be revealed on MS Windows since paths use backslashes,
but you could imagine a Linux user using something like
A="Hello\nworld\n" and would also be surprised about the above
behavior, when they end up with B="Hellonworldn".
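For anyone who wants to reproduce this locally, here is a minimal editorial sketch (the file is a temporary Renviron file; variable names and values are illustrative, and the commented output matches what is reported above):

```r
## Minimal reproduction of the Renviron quoting behavior described above.
renviron <- tempfile(fileext = ".Renviron")
writeLines(c('A="C:\\ABC"',   # quoted value: backslash preserved
             'B=${A}',        # unquoted reference: backslash stripped
             'C="${A}"'),     # quoted reference: backslash preserved
           renviron)
readRenviron(renviron)
Sys.getenv(c("A", "B", "C"))
##         A         B         C
## "C:\\ABC"   "C:ABC" "C:\\ABC"
```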

/Henrik

On Wed, Oct 20, 2021 at 7:31 AM Michał Bojanowski  wrote:
>
> Hello Tomas,
>
> Yes, that's accurate although rather terse, which is perhaps the
> reason why I did not realize it applies to my case.
>
> How about adding something in the direction of:
>
> 1. Continuing the cited paragraph with:
> In particular, on Windows it may be necessary to quote references to
> existing environment variables, especially those containing file paths
> (which include backslashes). For example: `"${WINVAR}"`.
>
> 2. Add an example (not run):
>
> # On Windows do quote references to variables containing paths, e.g.:
> # If APPDATA=C:\Users\foobar\AppData\Roaming
> # to point to a library tree inside APPDATA in .Renviron use
> R_LIBS_USER="${APPDATA}"/R-library
>
> Incidentally the last example is on backslashes too.
>
> What do you think?
>
> On Mon, Oct 18, 2021 at 5:02 PM Tomas Kalibera  
> wrote:
> >
> >
> > On 10/15/21 6:44 PM, Michał Bojanowski wrote:
> > > Perhaps a small update to ?.Renviron would be in order to mention that...
> >
> > Would you have a more specific suggestion how to update the
> > documentation? Please note that it already says
> >
> > "‘value’ is then processed in a similar way to a Unix shell: in
> > particular the outermost level of (single or double) quotes is stripped,
> > and backslashes are removed except inside quotes."
> >
> > Thanks,
> > Tomas
> >
> > > On Fri, Oct 15, 2021 at 6:43 PM Michał Bojanowski  
> > > wrote:
> > >> Indeed quoting works! Kevin suggested the same, but he didnt reply to 
> > >> the list.
> > >> Thank you all!
> > >> Michal
> > >>
> > >> On Fri, Oct 15, 2021 at 6:40 PM Ivan Krylov  
> > >> wrote:
> > >>> Sorry for the noise! I wasn't supposed to send my previous message.
> > >>>
> > >>> On Fri, 15 Oct 2021 16:44:28 +0200
> > >>> Michał Bojanowski  wrote:
> > >>>
> >  AVAR=${APPDATA}/foo/bar
> > 
> >  Which is a documented way of referring to existing environment
> >  variables. Now, with that in R I'm getting:
> > 
> >  Sys.getenv("APPDATA")# That works OK
> >  [1] "C:\\Users\\mbojanowski\\AppData\\Roaming"
> > 
> >  so OK, but:
> > 
> >  Sys.getenv("AVAR")
> >  [1] "C:UsersmbojanowskiAppDataRoaming/foo/bar"
> > >>> Hmm, a function called by readRenviron does seem to remove backslashes,
> > >>> but not if they are encountered inside quotes:
> > >>>
> > >>> https://github.com/r-devel/r-svn/blob/3f8b75857fb1397f9f3ceab6c75554e1a5386adc/src/main/Renviron.c#L149
> > >>>
> > >>> Would AVAR="${APPDATA}"/foo/bar work?
> > >>>
> > >>> --
> > >>> Best regards,
> > >>> Ivan
> > > __
> > > R-devel@r-project.org mailing list
> > > https://stat.ethz.ch/mailman/listinfo/r-devel
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] BUG?: R CMD check with --as-cran *disables* checks for unused imports otherwise performed

2021-10-20 Thread Henrik Bengtsson
ISSUE:

Using 'R CMD check' with --as-cran sets
_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_=TRUE, whereas the
default is FALSE, which is what you get without --as-cran.
I would expect --as-cran to check more things and be more conservative
than without.  So, is this behavior a mistake?  Could it be a thinko
around the negating "IGNORE", with the behavior meant to be vice
versa?

Example:

$ R CMD check QDNAseq_1.29.4.tar.gz
...
* using R version 4.1.1 (2021-08-10)
* using platform: x86_64-pc-linux-gnu (64-bit)
...
* checking dependencies in R code ... NOTE
Namespace in Imports field not imported from: ‘future’
  All declared Imports should be used.

whereas, if I run with --as-cran, I don't get that NOTE;

$ R CMD check --as-cran QDNAseq_1.29.4.tar.gz
...
* checking dependencies in R code ... OK


TROUBLESHOOTING:

In src/library/tools/R/check.R [1], the following is set if --as-cran is used:

  Sys.setenv("_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_" = "TRUE")

whereas, if not set, the default is:

ignore_unused_imports <-
    config_val_to_logical(Sys.getenv("_R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_",
                                     "FALSE"))

[1] 
https://github.com/wch/r-source/blob/b50e3f755674cbb697a4a7395b766647a5cfeea2/src/library/tools/R/check.R#L6335
[2] 
https://github.com/wch/r-source/blob/b50e3f755674cbb697a4a7395b766647a5cfeea2/src/library/tools/R/QC.R#L5954-L5956
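Until this is resolved, a possible workaround (sketch; whether it takes effect depends on whether the --as-cran code in check.R overrides a pre-set value, so verify on your own setup):

```shell
## Hypothetical workaround: request the unused-imports check explicitly.
export _R_CHECK_PACKAGES_USED_IGNORE_UNUSED_IMPORTS_=FALSE
R CMD check --as-cran QDNAseq_1.29.4.tar.gz
```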

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Bioc-devel] Too many dependencies / MultiAssayExperiment + rtracklayer

2021-10-19 Thread Henrik Bengtsson
If you're willing to depend on R (>= 4.0.0), then tools::R_user_dir() can
replace the 'rappdirs' package.
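A quick sketch of the replacement (the package name is illustrative):

```r
## tools::R_user_dir() (base R >= 4.0.0) as a drop-in for rappdirs:
tools::R_user_dir("netDx", which = "data")    # persistent per-package data
tools::R_user_dir("netDx", which = "config")  # configuration files
tools::R_user_dir("netDx", which = "cache")   # cached downloads etc.
```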

/Henrik

On Mon, Oct 18, 2021, 09:05 Shraddha Pai  wrote:

> Hi all,
> Despite moving rarely-used packages to Suggests and eliminating some (e.g.
> TCGAutils), the number of dependencies is still listed as 200 for our
> package netDx.
> https://www.bioconductor.org/packages/devel/bioc/html/netDx.html#since
> Is there anything else we can do to cut down on dependencies?
>
> Thank you,
> Shraddha
>
> On Tue, Sep 21, 2021 at 5:35 PM Shraddha Pai 
> wrote:
>
> > Hi Michael,
> > Thanks! Looks like the package trying to load 'rtracklayer' was
> > 'TCGAutils' (see graph from Zugang above, generated using pkgndep - looks
> > to be quite useful). Turns out TCGAutils really wasn't necessary for my
> > package so I just took it out and removed all associated dependencies -
> > mercifully an easier fix.
> >
> > Thanks for your help,
> > Shraddha
> >
> > On Mon, Sep 20, 2021 at 2:57 PM Michael Lawrence <
> > lawrence.mich...@gene.com> wrote:
> >
> >> Hi Shraddha,
> >>
> >> From the rtracklayer perspective, it sounds like Rsamtools is
> >> (indirectly) bringing in those system libraries. I would have expected
> >> zlibbioc to cover the zlib dependency, and perhaps bz2 and lzma
> >> support is optional. Perhaps a core member could comment on that.
> >>
> >> In the past, I've used this package
> >> https://github.com/Bioconductor/codetoolsBioC to identify missing
> >> NAMESPACE imports. In theory, you could remove the rtracklayer import
> >> and run functions in that package to identify the symbol-level
> >> dependencies. The output is a bit noisy though.
> >>
> >> Btw, using @importFrom only allows you to be selective of symbol-level
> >> dependencies, not package-level.
> >>
> >> Michael
> >>
> >> On Mon, Sep 20, 2021 at 11:37 AM Shraddha Pai  >
> >> wrote:
> >> >
> >> > Hello again,
> >> > I'm trying to simplify the dependencies for my package "netDx", make
> it
> >> > easier to install. It's currently got over 200(!) + some Unix
> libraries
> >> > that need to be installed.
> >> >
> >> > 1. I ran pkgDepMetrics() from BiocPkgTools to find less-needed pkgs,
> and
> >> > the package with the most dependencies is MultiAssayExperiment (see
> >> below
> >> > email). I'm using MAE to construct a container - is there a way to use
> >> > @importFrom calls to reduce MAE dependencies?
> >> >
> >> > 2. Another problem package is rtracklayer which requires Rhtslib,
> which
> >> > requires some unix libraries: zlib1g-dev libbz2-dev liblzma-dev. I'm
> not
> >> > sure which functionality in the package requires rtracklayer - how
> can I
> >> > tell? Is there a way to simplify / reduce these deps so the user
> doesn't
> >> > have to install all these unix packages?
> >> >
> >> > 3. Are there other "problem packages" you can see that I can remove?
> >> Let's
> >> > assume for now ggplot2 stays because people find it useful to have
> >> plotting
> >> > functions readily available.
> >> >
> >> > Thanks very much in advance,
> >> > Shraddha
> >> > ---
> >> > "ImportedAndUsed" "Exported" "Usage" "DepOverlap" "DepGainIfExcluded"
> >> > "igraph" 1 782 0.13 0.05 0
> >> > "ggplot2" 1 520 0.19 0.19 0
> >> > "pracma" 1 448 0.22 0.03 0
> >> > "plotrix" 1 160 0.62 0.03 1
> >> > "S4Vectors" 2 283 0.71 0.03 0
> >> > "grDevices" 1 112 0.89 0.01 0
> >> > "httr" 1 91 1.1 0.05 0
> >> > "scater" 1 85 1.18 0.4 0
> >> > "utils" 3 217 1.38 0.01 0
> >> > "GenomeInfoDb" 1 60 1.67 0.06 0
> >> > "stats" 12 449 2.67 0.01 0
> >> > "bigmemory" 1 35 2.86 0.03 3
> >> > "RCy3" 12 386 3.11 0.32 18
> >> > "BiocFileCache" 1 29 3.45 0.23 3
> >> > "glmnet" 1 24 4.17 0.07 2
> >> > "parallel" 2 33 6.06 0.01 0
> >> > "combinat" 1 13 7.69 0.01 1
> >> > "MultiAssayExperiment" 4 46 8.7 0.22 1
> >> > "foreach" 2 23 8.7 0.02 0
> >> > "graphics" 8 87 9.2 0.01 0
> >> > "GenomicRanges" 15 106 14.15 0.08 0
> >> > "rappdirs" 1 7 14.29 0.01 0
> >> > "reshape2" 1 6 16.67 0.05 0
> >> > "RColorBrewer" 1 4 25 0.01 0
> >> > "netSmooth" 1 3 33.33 0.82 3
> >> > "Rtsne" 1 3 33.33 0.02 0
> >> > "doParallel" 1 2 50 0.03 0
> >> > "ROCR" 2 3 66.67 0.05 4
> >> > "clusterExperiment" NA 122 NA 0.74 0
> >> > "IRanges" NA 255 NA 0.04 0
> >> >
> >> >
> >> > --
> >> >
> >> > *Shraddha Pai, PhD*
> >> > Principal Investigator, OICR
> >> > Assistant Professor, Department of Molecular Biophysics, University of
> >> > Toronto
> >> > shraddhapai.com; @spaiglass on Twitter
> >> > https://pailab.oicr.on.ca
> >> >
> >> >
> >> > *Ontario Institute for Cancer Research*
> >> > MaRS Centre, 661 University Avenue, Suite 510, Toronto, Ontario,
> Canada
> >> M5G
> >> > 0A3
> >> > *@OICR_news*  | *www.oicr.on.ca*
> >> > 
> >> >
> >> >
> >> >
> >> > *Collaborate. Translate. Change lives.*
> >> >
> >> >
> >> >

Re: [Bioc-devel] Strange "internal logical NA value has been modified" error

2021-10-12 Thread Henrik Bengtsson
In addition to checking with Valgrind, the ASan/UBsan and rchk
platforms on R-Hub (https://builder.r-hub.io/) can probably also be
useful;

> rhub::check(platform = "linux-x86_64-rocker-gcc-san")
> rhub::check(platform = "ubuntu-rchk")

/Henrik



On Tue, Oct 12, 2021 at 4:54 PM Martin Morgan  wrote:
>
> It is from base R
>
>   
> https://github.com/wch/r-source/blob/a984cc29b9b8d8821f8eb2a1081d9e0d1d4df56e/src/main/memory.c#L3214
>
> and likely indicates memory corruption, not necessarily in the code that 
> triggers the error (this is when the garbage collector is triggered...). 
> Probably in *your* C code :) since it's the least tested. Probably writing 
> out of bounds.
>
> This could be quite tricky to debug. I'd try to get something close to a 
> minimal reproducible example.
>
> I'd try to take devtools out of the picture, maybe running the test/testhat.R 
> script from the command line using Rscript, or worst case creating a shell 
> package that adds minimal code and can be checked with R CMD build 
> --no-build-vignettes / R CMD check.
>
> You could try inserting gc() before / after the unit test; it might make it 
> clear that the unit test isn't the problem. You could also try 
> gctorture(TRUE); this will make your code run extremely painfully slowly, 
> which puts a big premium on having a minimal reproducible example; you could 
> put this near the code chunks that are causing problems.
>
> You might have success running under valgrind, something like R -d valgrind 
> -f minimal_script.R.
>
> Hope those suggestions help!
>
> Martin
>
>
> On 10/12/21, 6:43 PM, "Bioc-devel on behalf of Pariksheet Nanda" 
>  
> wrote:
>
> Hi folks,
>
> I've been told to ask some of my more fun questions on this mailing list
> instead of Slack.  I'm climbing the ladder of submitting my first
> Bioconductor package (https://gitlab.com/coregenomics/tsshmm) and feel
> like there are gremlins that keep adding rungs to the top of the ladder.
>   The latest head scratcher from running devtools::check() is a unit
> test for a  trivial 2 line function failing with this gem of an error:
>
>
>  > test_check("tsshmm")
> ══ Failed tests ════════════════════════════════════════════════
> ── Error (test-tss.R:11:5): replace_unstranded splits unstranded into +
> and - ──
> Error in `tryCatchOne(expr, names, parentenv, handlers[[1L]])`: internal
> logical NA value has been modified
> Backtrace:
>   █
>1. ├─testthat::expect_equal(...) test-tss.R:11:4
>2. │ └─testthat::quasi_label(enquo(expected), expected.label, arg =
> "expected")
>3. │   └─rlang::eval_bare(expr, quo_get_env(quo))
>4. └─GenomicRanges::GRanges(c("chr:100:+", "chr:100:-"))
>5.   └─methods::as(seqnames, "GRanges")
>6. └─GenomicRanges:::asMethod(object)
>7.   └─GenomicRanges::GRanges(ans_seqnames, ans_ranges, ans_strand)
>8. └─GenomicRanges:::new_GRanges(...)
>9.   └─S4Vectors:::normarg_mcols(mcols, Class, ans_len)
>   10. └─S4Vectors::make_zero_col_DFrame(x_len)
>   11.   └─S4Vectors::new2("DFrame", nrows = nrow, check = 
> FALSE)
>   12. └─methods::new(...)
>   13.   ├─methods::initialize(value, ...)
>   14.   └─methods::initialize(value, ...)
>   15. └─methods::validObject(.Object)
>   16.   └─base::try(...)
>   17. └─base::tryCatch(...)
>   18.   └─base:::tryCatchList(expr, classes,
> parentenv, handlers)
>   19. └─base:::tryCatchOne(expr, names,
> parentenv, handlers[[1L]])
> [ FAIL 1 | WARN 0 | SKIP 0 | PASS 109 ]
>
>
> The full continuous integration log is here:
> https://gitlab.com/coregenomics/tsshmm/-/jobs/1673603868
>
> The function in question is:
>
>
> replace_unstranded <- function (gr) {
>  idx <- strand(gr) == "*"
>  if (length(idx) == 0L)
>  return(gr)
>  sort(c(
>  gr[! idx],
>  `strand<-`(gr[idx], value = "+"),
>  `strand<-`(gr[idx], value = "-")))
> }
>
>
> Also online here:
> 
> https://gitlab.com/coregenomics/tsshmm/-/blob/ef5e19a0e2f68fca93665bc417afbcfb6d437189/R/hmm.R#L170-178
>
> ... and the unit test is:
>
>
> test_that("replace_unstranded splits unstranded into + and -", {
>  expect_equal(replace_unstranded(GRanges("chr:100")),
>   GRanges(c("chr:100:+", "chr:100:-")))
>  expect_equal(replace_unstranded(GRanges(c("chr:100", "chr:200:+"))),
>   sort(GRanges(c("chr:100:+", "chr:100:-", "chr:200:+"))))
> })
>
>
> Also online here:
> 
> 

Re: [Rd] R-devel: as.character() for hexmode no longer pads with zeros

2021-09-23 Thread Henrik Bengtsson
Thanks for confirming and giving details on the rationale (... and
I'll update R.utils to use format() instead).

Regarding as.character(x)[j] === as.character(x[j]): I agree with this
- is that property of as.character()/subsetting explicitly
stated/documented somewhere?  I wonder if this is a property we should
all strive for for other types of objects?
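To make the contrast concrete (an editorial sketch; the commented outputs are the behaviors reported in this thread, with the as.character() change in R-devel at the time of writing):

```r
## format() pads to a common width; as.character() is now element-wise:
x <- as.hexmode(c(0L, 8L, 16L, 24L, 32L))
format(x)        # "00" "08" "10" "18" "20"  (common width)
as.character(x)  # "0"  "8"  "10" "18" "20"  (per-element, R-devel)
```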

/Henrik

On Thu, Sep 23, 2021 at 12:46 AM Martin Maechler
 wrote:
>
> >>>>> Henrik Bengtsson
> >>>>> on Wed, 22 Sep 2021 20:48:05 -0700 writes:
>
> > The update in rev 80946
> > 
> (https://github.com/wch/r-source/commit/d970867722e14811e8ba6b0ba8e0f478ff482f5e)
> > caused as.character() on hexmode objects to no longer pads with zeros.
>
> Yes -- very much on purpose; by me, after discussing a related issue
> within R-core which showed "how wrong" the previous (current R)
> behavior of the as.character() method is for
> hexmode and octmode objects :
>
> If you look at the whole rev 80946 , you also read NEWS
>
>  * as.character() for "hexmode" or "octmode" objects now
>fulfills the important basic rule
>
>   as.character(x)[j] === as.character(x[j])
>   ^
>
> rather than just calling format().
>
> The format() generic (notably for "atomic-alike" objects) should indeed
> return a character vector where each string has the same "width",
> however, the result of  as.character(x) --- at least for all
> "atomic-alike" / "vector-alike" objects --
> for a single x[j] should not be influenced by other elements in x.
>
>
>
>
> > Before:
>
> >> x <- structure(as.integer(c(0,8,16,24,32)), class="hexmode")
> >> x
> > [1] "00" "08" "10" "18" "20"
> >> as.character(x)
> > [1] "00" "08" "10" "18" "20"
>
> > After:
>
> >> x <- structure(as.integer(c(0,8,16,24,32)), class="hexmode")
> >> x
> > [1] "00" "08" "10" "18" "20"
> >> as.character(x)
> > [1] "0"  "8"  "10" "18" "20"
>
> > Was that intended?
>
> Yes!
> You have to explore your example a bit to notice how "illogical"
> the behavior before was:
>
> > as.character(as.hexmode(0:15))
>  [1] "0" "1" "2" "3" "4" "5" "6" "7" "8" "9" "a" "b" "c" "d" "e" "f"
> > as.character(as.hexmode(0:16))
>  [1] "00" "01" "02" "03" "04" "05" "06" "07" "08" "09" "0a" "0b" "0c" "0d" 
> "0e"
> [16] "0f" "10"
>
> > as.character(as.hexmode(16^(0:2)))
> [1] "001" "010" "100"
> > as.character(as.hexmode(16^(0:3)))
> [1] "0001" "0010" "0100" "1000"
> > as.character(as.hexmode(16^(0:4)))
> [1] "00001" "00010" "00100" "01000" "10000"
>
> all breaking the rule in the NEWS  and given above.
>
> If you want format()  you should use format(),
> but as.character() should never have used format() ..
>
> Martin
>
> > /Henrik
>
> > PS. This breaks R.utils::intToHex()
> > [https://cran.r-project.org/web/checks/check_results_R.utils.html]
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] R-devel: as.character() for hexmode no longer pads with zeros

2021-09-22 Thread Henrik Bengtsson
The update in rev 80946
(https://github.com/wch/r-source/commit/d970867722e14811e8ba6b0ba8e0f478ff482f5e)
caused as.character() on hexmode objects to no longer pads with zeros.

Before:

> x <- structure(as.integer(c(0,8,16,24,32)), class="hexmode")
> x
[1] "00" "08" "10" "18" "20"
> as.character(x)
[1] "00" "08" "10" "18" "20"

After:

> x <- structure(as.integer(c(0,8,16,24,32)), class="hexmode")
> x
[1] "00" "08" "10" "18" "20"
> as.character(x)
[1] "0"  "8"  "10" "18" "20"

Was that intended?

/Henrik

PS. This breaks R.utils::intToHex()
[https://cran.r-project.org/web/checks/check_results_R.utils.html]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] WISH: set.seed(seed) to produce error if length(seed) != 1 (now silent)

2021-09-17 Thread Henrik Bengtsson
> I'd say a more serious problem would be using set.seed(.Random.seed) ...

Exactly, I'm pretty sure I also tried that at some point.  This leads
to another thing I wanted to get to, which is to add support for
exactly that case.  So, instead of having poke around with:

globalenv()$.Random.seed <- new_seed

where 'new_seed' is a valid ".Random.seed" seed, it would be
convenient to be able to do just set.seed(new_seed), which comes handy
in parallel processing.
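Until such support exists, restoring a full RNG state has to go through `.Random.seed` directly; a minimal sketch:

```r
## Sketch: save and restore the complete RNG state, which set.seed()
## cannot currently accept.
invisible(runif(1L))                      # ensure .Random.seed exists
old_seed <- .Random.seed                  # capture the full RNG state
x1 <- runif(3L)
assign(".Random.seed", old_seed, envir = globalenv())  # restore it
x2 <- runif(3L)
stopifnot(identical(x1, x2))              # the stream replays identically
```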

/Henrik

On Fri, Sep 17, 2021 at 3:10 PM Duncan Murdoch  wrote:
>
> I'd say a more serious problem would be using set.seed(.Random.seed),
> because the first entry codes for RNGkind, it hardly varies at all.  So
> this sequence could really mislead someone:
>
>  > set.seed(.Random.seed)
>  > sum(.Random.seed)
> [1] 24428993419
>
> # Use it to get a new .Random.seed value:
>  > runif(1)
> [1] 0.3842704
>
>  > sum(.Random.seed)
> [1] -13435151647
>
> # So let's make things really random, by using the new seed as a seed:
>  > set.seed(.Random.seed)
>  > sum(.Random.seed)
> [1] 24428993419
>
> # Back to the original!
>
> Duncan Murdoch
>
>
> On 17/09/2021 8:38 a.m., Henrik Bengtsson wrote:
> >> I’m curious, other than proper programming practice, why?
> >
> > Life's too short for troubleshooting silent mistakes - mine or others.
> >
> > While at it, searching the interwebs for use of set.seed(), gives
> > mistakes/misunderstandings like using set.seed(), e.g.
> >
> >> set.seed(6.1); sum(.Random.seed)
> > [1] 73930104
> >> set.seed(6.2); sum(.Random.seed)
> > [1] 73930104
> >
> > which clearly is not what the user expected.  There are also a few
> > cases of set.seed(), e.g.
> >
> >> set.seed("42"); sum(.Random.seed)
> > [1] -2119381568
> >> set.seed(42); sum(.Random.seed)
> > [1] -2119381568
> >
> > which works just because as.numeric("42") is used.
> >
> > /Henrik
> >
> > On Fri, Sep 17, 2021 at 12:55 PM GILLIBERT, Andre
> >  wrote:
> >>
> >> Hello,
> >>
> >> A vector with a length >= 2 to set.seed would probably be a bug. An error 
> >> message will help the user to fix his R code. The bug may be accidental or 
> >> due to bad understanding of the set.seed function. For instance, a user 
> >> may think that the whole state of the PRNG can be passed to set.seed.
> >>
> >> The "if" instruction, emits a warning when the condition has length >= 2, 
> >> because it is often a bug. I would expect a warning or error with 
> >> set.seed().
> >>
> >> Validating inputs and emitting errors early is a good practice.
> >>
> >> Just my 2 cents.
> >>
> >> Sincerely.
> >> Andre GILLIBERT
> >>
> >> -----Original Message-----
> >> From: R-devel [mailto:r-devel-boun...@r-project.org] On behalf of Avraham
> >> Adler
> >> Sent: Friday, September 17, 2021 12:07
> >> To: Henrik Bengtsson
> >> Cc: R-devel
> >> Subject: Re: [Rd] WISH: set.seed(seed) to produce error if length(seed) !=
> >> 1 (now silent)
> >>
> >> Hi, Henrik.
> >>
> >> I’m curious, other than proper programming practice, why?
> >>
> >> Avi
> >>
> >> On Fri, Sep 17, 2021 at 11:48 AM Henrik Bengtsson <
> >> henrik.bengts...@gmail.com> wrote:
> >>
> >>> Hi,
> >>>
> >>> according to help("set.seed"), argument 'seed' to set.seed() should be:
> >>>
> >>>a single value, interpreted as an integer, or NULL (see ‘Details’).
> >>>
> >>>  From code inspection (src/main/RNG.c) and testing, it turns out that
> >>> if you pass a 'seed' with length greater than one, it silently uses
> >>> seed[1], e.g.
> >>>
> >>>> set.seed(1); sum(.Random.seed)
> >>> [1] 4070365163
> >>>> set.seed(1:3); sum(.Random.seed)
> >>> [1] 4070365163
> >>>> set.seed(1:100); sum(.Random.seed)
> >>> [1] 4070365163
> >>>
> >>> I'd like to suggest that set.seed() produces an error if length(seed)
> >>>> 1.  As a reference, for length(seed) == 0, we get:
> >>>
> >>>> set.seed(integer(0))
> >>> Error in set.seed(integer(0)) : supplied seed is not a valid integer
> >>>
> >>> /Henrik
> >>>
> >>> __
> >>> R-devel@r-project.org mailing list
> >>> https://stat.ethz.ch/mailman/listinfo/r-devel
> >>>
> >> --
> >> Sent from Gmail Mobile
> >>
> >>  [[alternative HTML version deleted]]
> >>
> >> __
> >> R-devel@r-project.org mailing list
> >> https://stat.ethz.ch/mailman/listinfo/r-devel
> >>
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] WISH: set.seed(seed) to produce error if length(seed) != 1 (now silent)

2021-09-17 Thread Henrik Bengtsson
> I’m curious, other than proper programming practice, why?

Life's too short for troubleshooting silent mistakes - mine or others.

While at it, searching the interwebs for uses of set.seed() gives
mistakes/misunderstandings like calling set.seed() with a non-integer seed, e.g.

> set.seed(6.1); sum(.Random.seed)
[1] 73930104
> set.seed(6.2); sum(.Random.seed)
[1] 73930104

which clearly is not what the user expected.  There are also a few
cases of set.seed() being called with a string, e.g.

> set.seed("42"); sum(.Random.seed)
[1] -2119381568
> set.seed(42); sum(.Random.seed)
[1] -2119381568

which works just because as.numeric("42") is used.

/Henrik

On Fri, Sep 17, 2021 at 12:55 PM GILLIBERT, Andre
 wrote:
>
> Hello,
>
> A vector with a length >= 2 to set.seed would probably be a bug. An error 
> message will help the user to fix his R code. The bug may be accidental or 
> due to bad understanding of the set.seed function. For instance, a user may 
> think that the whole state of the PRNG can be passed to set.seed.
>
> The "if" instruction, emits a warning when the condition has length >= 2, 
> because it is often a bug. I would expect a warning or error with set.seed().
>
> Validating inputs and emitting errors early is a good practice.
>
> Just my 2 cents.
>
> Sincerely.
> Andre GILLIBERT
>
> -----Original Message-----
> From: R-devel [mailto:r-devel-boun...@r-project.org] On behalf of Avraham
> Adler
> Sent: Friday, September 17, 2021 12:07
> To: Henrik Bengtsson
> Cc: R-devel
> Subject: Re: [Rd] WISH: set.seed(seed) to produce error if length(seed) != 1
> (now silent)
>
> Hi, Henrik.
>
> I’m curious, other than proper programming practice, why?
>
> Avi
>
> On Fri, Sep 17, 2021 at 11:48 AM Henrik Bengtsson <
> henrik.bengts...@gmail.com> wrote:
>
> > Hi,
> >
> > according to help("set.seed"), argument 'seed' to set.seed() should be:
> >
> >   a single value, interpreted as an integer, or NULL (see ‘Details’).
> >
> > From code inspection (src/main/RNG.c) and testing, it turns out that
> > if you pass a 'seed' with length greater than one, it silently uses
> > seed[1], e.g.
> >
> > > set.seed(1); sum(.Random.seed)
> > [1] 4070365163
> > > set.seed(1:3); sum(.Random.seed)
> > [1] 4070365163
> > > set.seed(1:100); sum(.Random.seed)
> > [1] 4070365163
> >
> > I'd like to suggest that set.seed() produces an error if length(seed)
> > > 1.  As a reference, for length(seed) == 0, we get:
> >
> > > set.seed(integer(0))
> > Error in set.seed(integer(0)) : supplied seed is not a valid integer
> >
> > /Henrik
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
> --
> Sent from Gmail Mobile
>
> [[alternative HTML version deleted]]
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] WISH: set.seed(seed) to produce error if length(seed) != 1 (now silent)

2021-09-17 Thread Henrik Bengtsson
Hi,

according to help("set.seed"), argument 'seed' to set.seed() should be:

  a single value, interpreted as an integer, or NULL (see ‘Details’).

From code inspection (src/main/RNG.c) and testing, it turns out that
if you pass a 'seed' with length greater than one, it silently uses
seed[1], e.g.

> set.seed(1); sum(.Random.seed)
[1] 4070365163
> set.seed(1:3); sum(.Random.seed)
[1] 4070365163
> set.seed(1:100); sum(.Random.seed)
[1] 4070365163

I'd like to suggest that set.seed() produces an error if
length(seed) > 1.  As a reference, for length(seed) == 0, we get:

> set.seed(integer(0))
Error in set.seed(integer(0)) : supplied seed is not a valid integer
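A user-side guard with the proposed behavior could look like this (an editorial sketch; the wrapper name is made up):

```r
## Hypothetical wrapper enforcing a scalar seed, as proposed above:
strict_set_seed <- function(seed, ...) {
  if (!is.null(seed) && length(seed) != 1L)
    stop("'seed' must be a single value or NULL, not length ",
         length(seed))
  set.seed(seed, ...)
}
strict_set_seed(42)        # OK
try(strict_set_seed(1:3))  # error instead of silently using seed[1]
```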

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Bioc-devel] git: lost write access to some repos + what is my BiocCredentials email address?

2021-08-19 Thread Henrik Bengtsson
Thank you, confirming git push now works and

$ ssh -T g...@git.bioconductor.org | grep -E
"(affxparser|aroma.light|illuminaio|QDNAseq)$"
X11 forwarding request failed on channel 0
 R W  packages/QDNAseq
 R W  packages/affxparser
 R W  packages/aroma.light
 R W  packages/illuminaio

/Henrik

On Thu, Aug 19, 2021 at 2:38 PM Nitesh Turaga  wrote:
>
> Hi Henrik,
>
> You should have access to these packages again.
>
> Please try again.
>
> Best
>
> Nitesh Turaga
> Scientist II, Department of Data Science,
> Bioconductor Core Team Member
> Dana Farber Cancer Institute
>
> > On Aug 19, 2021, at 8:08 AM, Henrik Bengtsson  
> > wrote:
> >
> > Hi,
> >
> > I seem to have "lost" write access to several Bioconductor git
> > repositories that I had git push access for before;
> >
> > $ ssh -T g...@git.bioconductor.org | grep -E
> > "(affxparser|aroma.light|illuminaio|QDNAseq)$"
> > X11 forwarding request failed on channel 0
> > R  packages/QDNAseq
> > R  packages/affxparser
> > R  packages/aroma.light
> > R W  packages/illuminaio
> >
> > Using `ssh -v ...`, I see that my git+ssh "offers" the server an RSA
> > public key (B...PwYDZ), which is accepted.  Since this gives me
> > write access to one of the repositories, I either have lost write
> > access to the others, or I somehow have ended up with different SSH
> > keys associated with different repositories (since I had write
> > permissions in the past).
> >
> > For example, with:
> >
> > $ git clone g...@git.bioconductor.org:packages/aroma.light
> > $ cd aroma.light
> > $ git remote -v
> > origin  g...@git.bioconductor.org:packages/aroma.light (fetch)
> > origin  g...@git.bioconductor.org:packages/aroma.light (push)
> >
> > I get:
> >
> > $ git push
> > X11 forwarding request failed on channel 0
> > FATAL: W any packages/aroma.light h.bengtsson DENIED by fallthru
> > (or you mis-spelled the reponame)
> > fatal: Could not read from remote repository.
> >
> > Please make sure you have the correct access rights and the repository 
> > exists.
> >
> > I followed FAQ #15 to check what SSH key I have on BiocCredentials,
> > but when I try to activate the account on
> > https://git.bioconductor.org/BiocCredentials/account_activation/ using
> > the email address I have in the DESCRIPTION file, I get
> > "henr...@braju.com is not associated with a maintainer of a
> > Bioconductor package. Please check the spelling or contact
> > bioc-devel@r-project.org for help."(*) I suspect it's another email
> > address I should use, possibly one from the SVN era. How can I find
> > out which email address I should use?
> >
> > (*) FYI, the webpage hint reading "Enter the email associated with
> > your Bioconductor package" might be ambiguous; Is it really specific
> > to a particular package?  Should it say something like "Enter the
> > email associated with your Bioconductor developer account"?
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
>

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


[Bioc-devel] git: lost write access to some repos + what is my BiocCredentials email address?

2021-08-19 Thread Henrik Bengtsson
Hi,

I seem to have "lost" write access to several Bioconductor git
repositories that I had git push access for before;

$ ssh -T g...@git.bioconductor.org | grep -E
"(affxparser|aroma.light|illuminaio|QDNAseq)$"
X11 forwarding request failed on channel 0
 R  packages/QDNAseq
 R  packages/affxparser
 R  packages/aroma.light
 R W  packages/illuminaio

Using `ssh -v ...`, I see that my git+ssh "offers" the server an RSA
public key (B...PwYDZ), which is accepted.  Since this gives me
write access to one of the repositories, I either have lost write
access to the others, or I somehow have ended up with different SSH
keys associated with different repositories (since I had write
permissions in the past).

For example, with:

$ git clone g...@git.bioconductor.org:packages/aroma.light
$ cd aroma.light
$ git remote -v
origin  g...@git.bioconductor.org:packages/aroma.light (fetch)
origin  g...@git.bioconductor.org:packages/aroma.light (push)

I get:

$ git push
X11 forwarding request failed on channel 0
FATAL: W any packages/aroma.light h.bengtsson DENIED by fallthru
(or you mis-spelled the reponame)
fatal: Could not read from remote repository.

Please make sure you have the correct access rights and the repository exists.

I followed FAQ #15 to check what SSH key I have on BiocCredentials,
but when I try to activate the account on
https://git.bioconductor.org/BiocCredentials/account_activation/ using
the email address I have in the DESCRIPTION file, I get
"henr...@braju.com is not associated with a maintainer of a
Bioconductor package. Please check the spelling or contact
bioc-devel@r-project.org for help."(*) I suspect it's another email
address I should use, possibly one from the SVN era. How can I find
out which email address I should use?

(*) FYI, the webpage hint reading "Enter the email associated with
your Bioconductor package" might be ambiguous; Is it really specific
to a particular package?  Should it say something like "Enter the
email associated with your Bioconductor developer account"?

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Rd] Force quitting a FORK cluster node on macOS and Solaris wreaks havoc

2021-08-16 Thread Henrik Bengtsson
Thank you Simon, this is helpful.  I take it this is specific to quit(),
so it's a poor choice for emulating crashed parallel workers, and
Sys.kill() is much better for that.

I was focusing on that odd extra execution/output, but as you say,
there are lots of other things that are done by quit() here, e.g.
regardless of platform quit() damages the main R process too:

> f <- parallel::mcparallel(quit("no"))
> v <- parallel::mccollect(f)
Warning message:
In parallel::mccollect(f) : 1 parallel job did not deliver a result
> file.exists(tempdir())
[1] FALSE


Would it be sufficient to make quit() fork safe by, conceptually,
doing something like:

quit <- function(save = "default", status = 0, runLast = TRUE) {
  if (parallel:::isChild())
    stop("quit() must not be called in a forked process")
  .Internal(quit(save, status, runLast))
}

This would protect against calling quit() in forked code by mistake,
e.g. when someone parallelizes over code/scripts they don't have full
control over, and the authors of those scripts might not be aware
that they may be run in forks.
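
For reference, here's a minimal sketch of emulating a crashed FORK
worker with tools::pskill() instead of quit(), as discussed above.
(FORK clusters are unavailable on MS Windows, so this assumes a
Unix-alike.)

```r
# Kill the worker with a signal; the main R session and its tempdir()
# survive, unlike with quit() in the worker.
cl <- parallel::makeCluster(1L, type = "FORK")
pid <- parallel::clusterEvalQ(cl, Sys.getpid())[[1]]
tools::pskill(pid)                         # emulate a worker crash
res <- try(parallel::clusterEvalQ(cl, 42), silent = TRUE)
stopifnot(inherits(res, "try-error"))      # dead worker => connection error
stopifnot(file.exists(tempdir()))          # main session is unharmed
try(parallel::stopCluster(cl), silent = TRUE)
```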

Thanks,

Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Force quitting a FORK cluster node on macOS and Solaris wreaks havoc

2021-08-12 Thread Henrik Bengtsson
The following smells like a bug in R to me, because it puts the main R
session into an unstable state.  Consider the following R script:

a <- 42
message("a=", a)
cl <- parallel::makeCluster(1L, type="FORK")
try(parallel::clusterEvalQ(cl, quit(save="no")))
message("parallel:::isChild()=", parallel:::isChild())
message("a=", a)
rm(a)

The purpose of this was to emulate what happens when a parallel
worker crashes.

Now, if you source() the above on macOS, you might(*) end up with:

> a <- 42
> message("a=", a)
a=42
> cl <- parallel::makeCluster(1L, type="FORK")
> try(parallel::clusterEvalQ(cl, quit(save="no")))
Error: Error in unserialize(node$con) : error reading from connection
> message("parallel:::isChild()=", parallel:::isChild())
parallel:::isChild()=FALSE
> message("a=", a)
a=42
> rm(a)
> try(parallel::clusterEvalQ(cl, quit(save="no")))
Error: Error in unserialize(node$con) : error reading from connection
> message("parallel:::isChild()=", parallel:::isChild())
parallel:::isChild()=FALSE
> message("a=", a)
Error: Error in message("a=", a) : object 'a' not found
Execution halted

Note how 'rm(a)' is supposed to be the last line of code to be
evaluated.  However, the force quitting of the FORK cluster node
appears to result in the main code being evaluated twice (in
parallel?).

(*) This does not happen on all macOS variants. For example, it works
fine on CRAN's 'r-release-macos-x86_64' but it does give the above
behavior on 'r-release-macos-arm64'.  I can reproduce it on GitHub
Actions 
(https://github.com/HenrikBengtsson/teeny/runs/3309235106?check_suite_focus=true#step:10:219)
but not on R-hub's 'macos-highsierra-release' and
'macos-highsierra-release-cran'.  I can also reproduce it on R-hub's
'solaris-x86-patched' and 'solaris-x86-patched-ods' machines.  However,
I still haven't found a Linux machine where this happens.

If one replaces quit(save="no") with tools::pskill(Sys.getpid()) or
parallel:::mcexit(0L), this behavior does not take place (at least not
on GitHub Actions and R-hub).

I don't have access to a macOS or a Solaris machine, so I cannot
investigate further myself. For example, could it be an issue with
quit(), or is it possible to trigger it by other means? And more
importantly, should this be fixed? Also, I'd be curious what happens
if you run the above in an interactive R session.

/Henrik



Re: [R-pkg-devel] find functions with missing Rd tags

2021-06-23 Thread Henrik Bengtsson
$ grep -L -F "\value{" man/*.Rd
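
(grep -L lists the files that do _not_ contain a match.)  For a
cross-platform alternative, the same scan can be done from within R —
a quick sketch, not an official check:

```r
# Report Rd files under man/ that lack a \value{} section.
rd_files <- list.files("man", pattern = "\\.Rd$", full.names = TRUE)
has_value <- vapply(rd_files, function(f) {
  any(grepl("\\value{", readLines(f, warn = FALSE), fixed = TRUE))
}, logical(1))
print(rd_files[!has_value])
```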

/Henrik

On Wed, Jun 23, 2021 at 10:58 AM Alex Chubaty  wrote:
>
> During a recent package submission process, a CRAN maintainer showed one of
> their checks found missing \value{} documentation in some package Rd files,
> and asked us to ensure all exported functions have their return values
> described.
>
> This check (for missing Rd values) is not run by the default checks, so I
> have no idea how to quickly identify which functions are missing those
> components, without manually inspecting everything. I am hoping that
> someone here can tell me which special R CMD check incantation, or similar
> I can use to find _exported_ functions with missing Rd tags.
>
> Thank you,
> Alex
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Bioc-devel] Build error "failure: length > 1 in coercion to logical" not reproducible

2021-06-17 Thread Henrik Bengtsson
On Thu, Jun 17, 2021 at 2:32 AM  wrote:
>
> Dear colleagues,
>
> It seems to me that, starting with the latest BioC devel branch (3.14), the
> build systems have become more pedantic about logical vectors of length > 1
> in conditions. Two of the packages I am maintaining, 'kebabs' and 'procoil'
> currently fail to build. Surely I want to fix this. However, I cannot
> reproduce these errors on my local system (R 4.1.0 alpha on Ubuntu 18.04
> LTS). The discussions https://support.bioconductor.org/p/9137605/ and
> https://github.com/Bioconductor/BBS/issues/71  have pointed me to the
> setting "_R_CHECK_LENGTH_1_CONDITION_=verbose".
>
> First question: Can anybody confirm that this has been changed in the recent
> devel?

Not a Bioc maintainer, but yes, the Bioc build system added this on
May 22, 2021 in order to catch similar bugs in package vignettes, cf.
https://community-bioc.slack.com/archives/CLUJWDQF4/p1622062783020300?thread_ts=1622053611.008100=CLUJWDQF4

>
> Second question: I have tried to include
> "_R_CHECK_LENGTH_1_CONDITION_=verbose" in my .Renviron file and it seems
> that my R session respects that. However, when I run 'R CMD build' on the
> aforementioned packages, they still build fine. The suggestions in
> https://github.com/Bioconductor/BBS/issues/71 don't work for me either
> (maybe I have done something wrong). I would actually like to reproduce the
> errors in my local system, since this will help me fixing the errors and
> testing the changes. So can anybody give me advice how I can make my local
> installation to check for logical vectors of length > 1 in conditions more
> strictly?

You want to set:

_R_CHECK_LENGTH_1_LOGIC2_=verbose

That one catches bugs where x && y or x || y is called with
length(x) > 1 or length(y) > 1.

Using:

_R_CHECK_LENGTH_1_CONDITION_=verbose

catches bugs where if (x) { ... } and similar conditions are called
with length(x) > 1.
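
A minimal trigger for that check (in recent R versions a length > 1
condition is an unconditional error, so the environment variable is
only needed on older versions):

```r
Sys.setenv("_R_CHECK_LENGTH_1_CONDITION_" = "verbose")
x <- c(TRUE, FALSE)
if (x) "yes"   # error: the condition has length > 1
```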

In your case, a reproducible minimal example is:

Sys.setenv("_R_CHECK_LENGTH_1_LOGIC2_"="verbose")
files <- c("a", "b")
files <- Rsubread:::.check_and_NormPath(files)
...
Error in is.na(files) || is.null(files) :
  'length(x) = 2 > 1' in coercion to 'logical(1)'

The problem is that there's an is.na(files) || is.null(files) in the code, where

> is.na(files)
[1] FALSE FALSE
> is.null(files)
[1] FALSE

so, we have an x || y case with length(x) > 1.
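
A possible fix — a sketch, not the actual package code — is to test
for NULL first and collapse the vectorized NA test to a scalar with
anyNA():

```r
files <- c("a", "b")
# Both sides of || now have length 1, so this passes the check:
bad <- is.null(files) || anyNA(files)
stopifnot(identical(bad, FALSE))
```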

/Henrik




>
> Thanks a lot in advance,
> Ulrich
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel



[Rd] R for Windows leaves detritus in the temp directory

2021-06-15 Thread Henrik Bengtsson
ISSUE:

The TMPDIR validation done in src/gnuwin32/system.c:

/* in case getpid() is not unique -- has been seen under Windows */
snprintf(ifile, 1024, "%s/Rscript%x%x", tm, getpid(),
 (unsigned int) GetTickCount());
ifp = fopen(ifile, "w+b");
if(!ifp) R_Suicide(_("creation of tmpfile failed -- set TMPDIR suitably?"));
  }

does _not_ clean up after itself, i.e. there's a missing

unlink(ifile);

In contrast, ditto in src/unix/system.c does this.


BACKGROUND:

When running R CMD check --as-cran on my 'future' package, I get:

* checking for detritus in the temp directory ... NOTE
Found the following files/directories:
  'Rscript171866c62e'

when checked on R Under development (unstable) (2021-06-13 r80496),
including on win-builder.  I can reproduce this with a package
'tests/detritus.R':

  cl <- parallel::makeCluster(1)
  dummy <- parallel::clusterEvalQ(cl, {
cl <- parallel::makeCluster(1)
on.exit(parallel::stopCluster(cl))
parallel::clusterEvalQ(cl, Sys.getpid())
  })
  print(dummy)
  parallel::stopCluster(cl)


I believe it requires a nested PSOCK cluster to reproduce the 'R CMD
check' NOTE, e.g. it does _not_ happen with:

  cl <- parallel::makeCluster(1)
  dummy <- parallel::clusterEvalQ(cl, {
Sys.getpid()
  })
  print(dummy)
  parallel::stopCluster(cl)

/Henrik



Re: [R-pkg-devel] Old version of rtracklayer on a single check server

2021-05-18 Thread Henrik Bengtsson
I stand corrected about the R-devel version (and now R 4.1.0) on CRAN.

However, isn't it the case that it will never be solved for R 4.0.0 (to
become R-oldrel on CRAN) and CRAN will keep reporting an error on MS
Windows there because
https://bioconductor.org/packages/3.12/bioc/html/rtracklayer.html provides
only an older version for MS Windows?

If so, an alternative to relying on Suggests is to make the package depend
on R (>= 4.1.0).

/Henrik

On Tue, May 18, 2021, 09:08 Martin Morgan  wrote:

> That’s not correct Henrik.
>
> CRAN follows CRAN rules for installing packages, so uses
> tools:::BioC_version_for_R_version(). For R-devel we have
>
> > R.version.string
> [1] "R Under development (unstable) (2021-05-18 r80323)"
> > tools:::.BioC_version_associated_with_R_version()
> [1] '3.13'
>
> For this version of Bioconductor, the rtracklayer version (from
> https://bioconductor.org/packages/3.13/rtracklayer, or
> `available.packages(repos = 
> "https://bioconductor.org/packages/3.13/bioc;)["rtracklayer",
> "https://bioconductor.org/packages/3.13/bioc")["rtracklayer",
>
> So the r-devel-windows-ix86+x86_64 builder mentioned in the post has the
> wrong version of rtracklayer for R-devel.
>
> Martin Morgan
>
> On 5/18/21, 11:49 AM, "R-package-devel on behalf of Henrik Bengtsson" <
> r-package-devel-boun...@r-project.org on behalf of
> henrik.bengts...@gmail.com> wrote:
>
> It's a problem with Bioconductor and a broken release history of
> 'rtracklayer' on MS Windows (e.g.
> https://bioconductor.org/packages/3.12/bioc/html/rtracklayer.html)
> plus how each Bioconductor version is tied to a specific R version.
> In other words, even if they fix it in Bioconductor 3.13 (for R
> 4.1.0), it can't be fixed in Bioconductor 3.12 (for R 4.0.0), so
your package will keep failing on Windows for R 4.0.0.  The reason
> why it can't be fixed in Bioconductor 3.12 is that they have now
> frozen that release forever.
>
> Because of this, I suspect the only solution is to make 'rtracklayer'
> an optional package, i.e. move it to Suggests: and update all your
> code to run conditionally of that package being available. I recommend
> you reach out to the bioc-devel mailing list for advice.
>
> /Henrik
>
> On Tue, May 18, 2021 at 4:33 AM Dalgleish, James (NIH/NCI) [F] via
> R-package-devel  wrote:
> >
> > To any who might have an idea:
> >
> > I've been reading several posts in the digest about dependency
> version issues on the check servers and I'm having my own issue, which I
> can't solve because I can't upgrade the check server's package version:
> > * installing *source* package 'CNVScope' ...
> > ** using staged installation
> > ** R
> > ** data
> > *** moving datasets to lazyload DB
> > ** inst
> > ** byte-compile and prepare package for lazy loading
> > Warning: multiple methods tables found for 'export'
> > Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()),
> versionCheck = vI[[j]]) :
> >   namespace 'rtracklayer' 1.48.0 is already loaded, but >= 1.51.5 is
> required
> > Calls:  ... namespaceImportFrom -> asNamespace ->
> loadNamespace
> > Execution halted
> > ERROR: lazy loading failed for package 'CNVScope'
> > * removing 'd:/RCompile/CRANguest/R-devel/lib/CNVScope'
> >
> > These errors tend to be check server dependent (only occurs on
> r-devel-windows-ix86+x86_64<
> https://cran.r-project.org/web/checks/check_flavors.html#r-devel-windows-ix86_x86_64>)
> and I'm just trying to make the small change to closeAllConnections() that
> was asked earlier of maintainers by Kurt Hornik and the CRAN team, but I
> can't because of this old package version on the devel check server, which
> has the same error:
> >
> https://win-builder.r-project.org/incoming_pretest/CNVScope_3.5.7_20210518_062953/Windows/00check.log
> >
> https://win-builder.r-project.org/incoming_pretest/CNVScope_3.5.7_20210518_062953/Windows/00install.out
> >
> > Is there any way around this? I notice the maintainer of the
> 'gtsummary' package had a similar issue:
> >
> > "> I am trying to make a release that depends on gt v0.3.0, but I
> get an error
> >
> > > when I test the package on Windows Dev
> `devtools::check_win_devel()` that
> >
> > > the gt package is available but it's an unsuitable version.  Does
> anyone
> >
> > > know why the gt v0.3.0 is unavailable?"
> >
> >
> >
> >

Re: [R-pkg-devel] Old version of rtracklayer on a single check server

2021-05-18 Thread Henrik Bengtsson
It's a problem with Bioconductor and a broken release history of
'rtracklayer' on MS Windows (e.g.
https://bioconductor.org/packages/3.12/bioc/html/rtracklayer.html)
plus how each Bioconductor version is tied to a specific R version.
In other words, even if they fix it in Bioconductor 3.13 (for R
4.1.0), it can't be fixed in Bioconductor 3.12 (for R 4.0.0), so
your package will keep failing on Windows for R 4.0.0.  The reason
why it can't be fixed in Bioconductor 3.12 is that they have now
frozen that release forever.

Because of this, I suspect the only solution is to make 'rtracklayer'
an optional package, i.e. move it to Suggests: and update all your
code to run conditionally of that package being available. I recommend
you reach out to the bioc-devel mailing list for advice.
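
The conditional-use pattern looks roughly like this (the input file
name is purely illustrative):

```r
if (requireNamespace("rtracklayer", quietly = TRUE)) {
  gr <- rtracklayer::import("regions.bed")  # hypothetical example input
} else {
  stop("Package 'rtracklayer' is required for this feature")
}
```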

/Henrik

On Tue, May 18, 2021 at 4:33 AM Dalgleish, James (NIH/NCI) [F] via
R-package-devel  wrote:
>
> To any who might have an idea:
>
> I've been reading several posts in the digest about dependency version issues 
> on the check servers and I'm having my own issue, which I can't solve because 
> I can't upgrade the check server's package version:
> * installing *source* package 'CNVScope' ...
> ** using staged installation
> ** R
> ** data
> *** moving datasets to lazyload DB
> ** inst
> ** byte-compile and prepare package for lazy loading
> Warning: multiple methods tables found for 'export'
> Error in loadNamespace(j <- i[[1L]], c(lib.loc, .libPaths()), versionCheck = 
> vI[[j]]) :
>   namespace 'rtracklayer' 1.48.0 is already loaded, but >= 1.51.5 is required
> Calls:  ... namespaceImportFrom -> asNamespace -> loadNamespace
> Execution halted
> ERROR: lazy loading failed for package 'CNVScope'
> * removing 'd:/RCompile/CRANguest/R-devel/lib/CNVScope'
>
> These errors tend to be check server dependent (only occurs on 
> r-devel-windows-ix86+x86_64)
>  and I'm just trying to make the small change to closeAllConnections() that 
> was asked earlier of maintainers by Kurt Hornik and the CRAN team, but I 
> can't because of this old package version on the devel check server, which 
> has the same error:
> https://win-builder.r-project.org/incoming_pretest/CNVScope_3.5.7_20210518_062953/Windows/00check.log
> https://win-builder.r-project.org/incoming_pretest/CNVScope_3.5.7_20210518_062953/Windows/00install.out
>
> Is there any way around this? I notice the maintainer of the 'gtsummary' 
> package had a similar issue:
>
> "> I am trying to make a release that depends on gt v0.3.0, but I get an error
>
> > when I test the package on Windows Dev `devtools::check_win_devel()` that
>
> > the gt package is available but it's an unsuitable version.  Does anyone
>
> > know why the gt v0.3.0 is unavailable?"
>
>
>
> I'm open to any suggestions, but can't see a way around this issue from my 
> end without the ability to service the check server.
>
>
> Thanks,
> James Dalgleish
> Cancer Genetics Branch,
> Center for Cancer Research,
> National Cancer Institute,
> National Institutes of Health,
> Bethesda, MD
>
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel



Re: [Rd] Testing R build when using --without-recommended-packages?

2021-05-05 Thread Henrik Bengtsson
On Wed, May 5, 2021 at 2:13 AM Martin Maechler
 wrote:
>
> > Gabriel Becker
> > on Tue, 4 May 2021 14:40:22 -0700 writes:
>
> > Hmm, that's fair enough Ben, I stand corrected.  I will say that this 
> seems
> > to be a pretty "soft" recommendation, as these things go, given that it
> > isn't tested for by R CMD check, including with the -as-cran 
> extensions. In
> > principle, it seems like it could be, similar checks are made in package
> > code for inappropriate external-package-symbol usage/
>
> > Either way, though, I suppose I have a number of packages which have 
> been
> > invisibly non-best-practices compliant for their entire lifetimes (or at
> > least, the portion of that where they had tests/vignettes...).
>
> > Best,
> > ~G
>
> > On Tue, May 4, 2021 at 2:22 PM Ben Bolker  wrote:
>
> >> Sorry if this has been pointed out already, but some relevant text
> >> from
> >>
> >> 
> https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Suggested-packages
> >>
> >> > Note that someone wanting to run the examples/tests/vignettes may not
> >> have a suggested package available (and it may not even be possible to
> >> install it for that platform). The recommendation used to be to make
> >> their use conditional via if(require("pkgname")): this is OK if that
> >> conditioning is done in examples/tests/vignettes, although using
> >> if(requireNamespace("pkgname")) is preferred, if possible.
> >>
> >> ...
> >>
> >> > Some people have assumed that a ‘recommended’ package in ‘Suggests’
> >> can safely be used unconditionally, but this is not so. (R can be
> >> installed without recommended packages, and which packages are
> >> ‘recommended’ may change.)
>
>
> Thank you all (Henrik, Gabe, Dirk & Ben) !
>
> I think it would be a good community effort  and worth the time
> also of R core to further move into the right direction
> as Dirk suggested.
>
> I think we all agree it would be nice if Henrik (and anybody)
> could use  'make check' on R's own sources after using
>  --without-recommended-packages
>
> Even one more piece of evidence is the   tests/README   file in
> the R sources.  It has much more but simply starts with
>
> ---
> There is a hierarchy of check targets:
>
>  make check
>
> for all builders.  If this works one can be reasonably happy R is working
> and do `make install' (or the equivalent).
>
> make check-devel
>
> for people changing the code: this runs things like the demos and
> no-segfault which might be broken by code changes, and checks on the
> documentation (effectively R CMD check on each of the base packages).
> This needs recommended packages installed.
>
> make check-all
>
> runs all the checks, those in check-devel plus tests of the recommended
> packages.
>
> Note that for complete testing you will need a number of other
> ..
> ..
>
> ---
>
> So, our (R core) own intent has been that   'make check'  should
> run w/o rec.packages  but further checking not.
>
> So, yes, please, you are encouraged to send patches against the
> R devel trunk  to fix such examples and tests.

Thanks Martin!  Thanks for confirming and for being open to patches.
This encourages me to try to patch what we've got so that 'make check'
and 'make check-devel' can complete also without 'recommended'
packages.

/Henrik

>
> Best,
> Martin
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] Testing R build when using --without-recommended-packages?

2021-05-04 Thread Henrik Bengtsson
Two questions to R Core:

1. Is R designed so that 'recommended' packages are optional, or
should that be considered uncharted territories?

2. Can such an R build/installation be validated using existing check methods?


--

Dirk, it's not clear to me whether you know for sure, or you draw
conclusions based on your long experience and reading. I think it's very
important that others don't find this thread later on and read your
comments as if they're the "truth" (unless they are).  I haven't
re-read it from start to finish, but there are passages in 'R
Installation and Administration' suggesting you can build and install
R without 'recommended' packages.  For example, post-installation,
Section 'Testing an Installation' suggests you can run (after first
running `make install-tests`):

cd tests
../bin/R CMD make check

but they fail the same way.  The passage continues "... and other
useful targets are test-BasePackages and test-Recommended to run tests
of the standard and recommended packages (if installed) respectively."
(*).  So, to me that hints that 'recommended' packages are optional, just
as they're "Priority: recommended".  Further down, there's also a
mentioning of:

$ R_LIBS_USER="" R --vanilla
> Sys.setenv(LC_COLLATE = "C", LC_TIME = "C", LANGUAGE = "en")
> tools::testInstalledPackages(scope = "base")

which also produces errors when 'recommended' packages are missing,
e.g. "Failed with error:  'there is no package called 'nlme'".

(*) BTW, '../bin/R CMD make test-BasePackages' gives "make: *** No
rule to make target 'test-BasePackages'.  Stop."

Thanks,

/Henrik

On Tue, May 4, 2021 at 12:22 PM Dirk Eddelbuettel  wrote:
>
>
> On 4 May 2021 at 11:25, Henrik Bengtsson wrote:
> | FWIW,
> |
> | $ ./configure --help
> | ...
> |   --with-recommended-packages
> |   use/install recommended R packages [yes]
>
> Of course. But look at the verb in your Subject: no optionality _in testing_ 
> there.
>
> You obviously need to be able to build R itself to then build the recommended
> packages you need for testing.
>
> Dirk
>
> --
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [Rd] Testing R build when using --without-recommended-packages?

2021-05-04 Thread Henrik Bengtsson
FWIW,

$ ./configure --help
...
  --with-recommended-packages
  use/install recommended R packages [yes]

/Henrik

On Tue, May 4, 2021 at 11:17 AM Dirk Eddelbuettel  wrote:
>
>
> On 4 May 2021 at 11:07, Henrik Bengtsson wrote:
> | Thanks, but I don't understand. That's what I usually do when I build
> | R with 'recommended' packages.  But here, I explicitly do *not* want
> | to build and install 'recommended' packages with the R installation.
> | So, I'm going down the --without-recommended-packages path on purpose
> | and I'm looking for a way to validate such an installation.
>
> I understand the desire, and am sympathetic, but for all+ years I have been
> building R (or R-devel) from source this has never been optional. Nor has any
> optionality (for the build of R has a whole) been documented, at least as far
> as I know.
>
> Dirk
>
> --
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [Rd] Testing R build when using --without-recommended-packages?

2021-05-04 Thread Henrik Bengtsson
Thanks, but I don't understand. That's what I usually do when I build
R with 'recommended' packages.  But here, I explicitly do *not* want
to build and install 'recommended' packages with the R installation.
So, I'm going down the --without-recommended-packages path on purpose
and I'm looking for a way to validate such an installation.

If your comment is on the 'stats' examples' hard dependency on 'MASS'
despite it's being a suggested packages, I still don't follow.

/Henrik

On Tue, May 4, 2021 at 10:16 AM Dirk Eddelbuettel  wrote:
>
>
> On 4 May 2021 at 09:31, Henrik Bengtsson wrote:
> | I'm on Linux (Ubuntu 18.04). How do I check an R build when using
> | --without-recommended-packages? 'make check' assumes 'recommended'
> | packages are installed, so that fails without them available.
>
> [...]
>
> | BTW, isn't this a bug? Shouldn't this example run conditionally on
> | 'MASS' being installed, because 'MASS' is a suggested package here;
>
> The 'R-admin' manual in Section 1.2 "Getting patched and development
> versions" ends on
>
>   If downloading manually from CRAN, do ensure that you have the correct
>   versions of the recommended packages: if the number in the file VERSION
>   is ‘x.y.z’ you need to download the contents of
>   ‘https://CRAN.R-project.org/src/contrib/dir’, where dir is
>   ‘x.y.z/Recommended’ for r-devel or x.y-patched/Recommended for r-patched,
>   respectively, to directory src/library/Recommended in the sources you
>   have unpacked. After downloading manually you need to execute
>   tools/link-recommended from the top level of the sources to make the
>   requisite links in src/library/Recommended. A suitable incantation from
>   the top level of the R sources using wget might be (for the correct
>   value of dir)
>
>   wget -r -l1 --no-parent -A\*.gz -nd -P src/library/Recommended \
> https://CRAN.R-project.org/src/contrib/dir
>   ./tools/link-recommended
>
> Dirk
>
> --
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



[Rd] Testing R build when using --without-recommended-packages?

2021-05-04 Thread Henrik Bengtsson
I'm on Linux (Ubuntu 18.04). How do I check an R build when using
--without-recommended-packages? 'make check' assumes 'recommended'
packages are installed, so that fails without them available.

DETAILS:

When I build R from source without 'recommended' packages:

curl -O https://cran.r-project.org/src/base-prerelease/R-latest.tar.gz
tar xvfz R-latest.tar.gz
cd R-beta
./configure --enable-memory-profiling --enable-R-shlib --prefix="$PREFIX"
make

I cannot figure out how to validate the build.  Following Section
'Installation' of 'R Installation and Administration', I run:

make check

results in:

Testing examples for package ‘stats’
Error: testing 'stats' failed
Execution halted

This is because those tests assume 'MASS' is installed;

$ cat /path/to/tests/Examples/stats-Ex.Rout.fail

> utils::data(muscle, package = "MASS")
Error in find.package(package, lib.loc, verbose = verbose) :
  there is no package called ‘MASS’
Calls:  -> find.package
Execution halted

BTW, isn't this a bug? Shouldn't this example run conditionally on
'MASS' being installed, because 'MASS' is a suggested package here;

Package: stats
Version: 4.1.0
...
Imports: utils, grDevices, graphics
Suggests: MASS, Matrix, SuppDists, methods, stats4

/Henrik



Re: [Rd] R compilation on old(ish) CentOS

2021-04-30 Thread Henrik Bengtsson
Ben, it's most likely what Peter says.  I can confirm it works; I just
installed https://cran.r-project.org/src/base-prerelease/R-latest.tar.gz
on an up-to-date CentOS 7.9.2009 system using the vanilla gcc (GCC)
4.8.5 that comes with that version and R compiles just fine and it
passes 'make check' too.

Since R is trying to move toward C++14 support by default, I agree
with Iñaki: you might want to build and run R with a newer version of
gcc.  gcc 4.8.5 will only give you C++11 support.  Red Hat's Software
Collections (SCL) devtoolsets are the easiest way to do this. I've
done this too and can confirm that the gcc 7.3.1 that comes with SCL
devtoolset/7 is sufficient to get C++14 support.  I'm sharing my
installation with lots of users, so I make it all transparent to the
end-user with environment modules, i.e. 'module load r/4.1.0' is all
the user needs to know.

/Henrik

On Thu, Apr 29, 2021 at 7:28 AM Peter Dalgaard  wrote:
>
> You may want to check out your checkout
>
> I see:
>
> Peter-Dalgaards-iMac:R pd$ grep newsock src/main/connections.c
> con = R_newsock(host, port, server, serverfd, open, timeout, options);
>
> but your file seems to have lost the ", options" bit somehow. Also, mine is 
> line 3488, not 3477.
>
> Maybe you have an old file getting in the way?
>
> - Peter
>
> > On 29 Apr 2021, at 15:58 , Ben Bolker  wrote:
> >
> >  I probably don't want to go down this rabbit hole very far, but if anyone 
> > has any *quick* ideas ...
> >
> >  Attempting to build R from scratch with a fresh SVN checkout on a somewhat 
> > out-of-date CentOS system (for which I don't have root access, although I 
> > can bug people if I care enough).
> >
> >  ../r-devel/configure; make
> >
> > ends with
> >
> > gcc -std=gnu99 -I../../../r-devel/trunk/src/extra  -I. -I../../src/include 
> > -I../../../r-devel/trunk/src/include -I/usr/local/include 
> > -I../../../r-devel/trunk/src/nmath -DHAVE_CONFIG_H  -fopenmp  -g -O2  -c 
> > ../../../r-devel/trunk/src/main/connections.c -o connections.o
> > ../../../r-devel/trunk/src/main/connections.c: In function ‘do_sockconn’:
> > ../../../r-devel/trunk/src/main/connections.c:3477:5: error: too few 
> > arguments to function ‘R_newsock’
> > con = R_newsock(host, port, server, serverfd, open, timeout);
> > ^
> > In file included from ../../../r-devel/trunk/src/main/connections.c:80:0:
> > ../../../r-devel/trunk/src/include/Rconnections.h:83:13: note: declared here
> > Rconnection R_newsock(const char *host, int port, int server, int serverfd, 
> > const char * const mode, int timeout, int options);
> > ^
> > make[3]: *** [connections.o] Error 1
> >
> >  Any suggestions for a quick fix/diagnosis?
> >
> >  cheers
> >Ben Bolker
> >
> > 
> >
> >
> > $ gcc --version
> > gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-39)
> >
> > $ lsb_release -a
> > LSB Version: 
> > :core-4.1-amd64:core-4.1-noarch:cxx-4.1-amd64:cxx-4.1-noarch:desktop-4.1-amd64:desktop-4.1-noarch:languages-4.1-amd64:languages-4.1-noarch:printing-4.1-amd64:printing-4.1-noarch
> > Distributor ID:   CentOS
> > Description:  CentOS Linux release 7.8.2003 (Core)
> > Release:  7.8.2003
> > Codename: Core
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>
> --
> Peter Dalgaard, Professor,
> Center for Statistics, Copenhagen Business School
> Solbjerg Plads 3, 2000 Frederiksberg, Denmark
> Phone: (+45)38153501
> Office: A 4.23
> Email: pd@cbs.dk  Priv: pda...@gmail.com
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [R-pkg-devel] Package fails to build on CRAN Windows server: rtracklayer version too new

2021-04-19 Thread Henrik Bengtsson
Not that it solves your problem per se, but here's why I think CRAN
only has rtracklayer 1.48.0 installed on Windows, i.e. why you get
error "namespace 'rtracklayer' 1.48.0 is being loaded, but >= 1.51.5
is required.":

* The current Bioconductor "release" is version 3.12
* As you say, Bioconductor 3.12 provides rtracklayer 1.50.0
(https://www.bioconductor.org/packages/3.12/bioc/html/rtracklayer.html)
* However, if you look at that page, for Windows it only provides
rtracklayer 1.49.5 (this is because rtracklayer never successfully
checked on Bioconductor 3.12)
* Since the minor version component of rtracklayer 1.49.5 is _odd_, it
means it's from a Bioconductor 'devel' branch, i.e. when Bioconductor
3.12 was in devel.
* Thus, the most recent Bioconductor "release" where rtracklayer was
successfully distributed on Windows is Bioconductor 3.11.
* Bioconductor 3.11 provides rtracklayer 1.48.0
(https://www.bioconductor.org/packages/3.11/bioc/html/rtracklayer.html)

Clear as mud?
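
The odd-minor-equals-devel convention mentioned above can be checked
mechanically — a throwaway sketch:

```r
ver <- "1.49.5"                    # the Windows build of rtracklayer
minor <- as.integer(strsplit(ver, ".", fixed = TRUE)[[1]][2])
stopifnot(minor %% 2 == 1)         # odd minor => built from a devel branch
```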

I suspect you're using Suggests: rtracklayer
(https://github.com/timoast/signac/blob/develop/DESCRIPTION) in your
new version.  If so, that's good.  I also don't think you're doing
anything wrong here.  Instead, I think it's one of your other
Bioconductor dependencies under 'Imports:', or possibly under
'Suggests:', that depends on 'rtracklayer'.  The fact that the error says
"namespace 'rtracklayer' 1.48.0 is being loaded, but >= 1.51.5 is
required.", means that one of those Bioconductor dependencies, or
their dependencies, has an explicit 'rtracklayer (>= 1.51.5)' under
either Depends: or Imports:.  My recommendation is that you start
identifying that Bioconductor package, and in what way your 'Signac'
depends on that package.  That will help you decide on what to do
next.
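
For what it's worth, one way to narrow down the culprit is to query the
package databases for reverse dependencies of 'rtracklayer'.  A rough
sketch (assuming the 'BiocManager' package is installed; this only lists
candidates - you'd still inspect their DESCRIPTION files for the actual
'rtracklayer (>= 1.51.5)' requirement):

```r
## Sketch: which packages declare a dependency on 'rtracklayer', and which
## of those are (recursive) dependencies of 'Signac'?  Assumes BiocManager
## is installed; inspect the candidates' DESCRIPTION files afterwards.
db  <- available.packages(repos = BiocManager::repositories())
rev <- tools::package_dependencies("rtracklayer", db = db,
                                   which = c("Depends", "Imports"),
                                   reverse = TRUE)[["rtracklayer"]]
own <- tools::package_dependencies("Signac", db = db,
                                   which = c("Depends", "Imports", "Suggests"),
                                   recursive = TRUE)[["Signac"]]
intersect(rev, own)  # candidate packages pulling in rtracklayer
```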

/Henrik

PS. This can happen because Bioconductor allows a package to propagate
to a new "release" branch even if it fails on one of the operating
systems.

On Fri, Apr 16, 2021 at 1:59 PM Tim Stuart  wrote:
>
> Hi all,
>
> I am trying to submit an update to my package
> (https://cran.r-project.org/package=Signac) to CRAN, but it is
> currently failing to install on the Windows server. I see the
> following installation error: "namespace 'rtracklayer' 1.48.0 is being
> loaded, but >= 1.51.5 is required."
>
> The current release version for rtracklayer is 1.50.0
> (https://www.bioconductor.org/packages/release/bioc/html/rtracklayer.html).
> My package does not set a version requirement for rtracklayer, so this
> installation error must be caused by having a pre-release version of
> one of my bioconductor dependencies installed on the windows server.
>
> Does anyone know how I might be able to solve this issue? I have tried
> contacting the CRAN team a few times by reply-all to the CRAN emails,
> but have not had any response from them. This seems like an issue that
> could only be solved by fixing the installation of packages on the
> CRAN Windows server.
>
> Incoming test results:
> https://win-builder.r-project.org/incoming_pretest/Signac_1.1.2_20210415_225235/
>
> Thanks,
> Tim
>
> --
>
> This message is for the recipient’s use only, and may contain
> confidential, privileged or protected information. Any unauthorized use or
> dissemination of this communication is prohibited. If you received this
> message in error, please immediately notify the sender and destroy all
> copies of this message. The recipient should check this email and any
> attachments for the presence of viruses, as we accept no liability for any
> damage caused by any virus transmitted by this email.
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] help(".libPaths"): Paragraph lacks mentioning of R_LIBS_SITE

2021-04-19 Thread Henrik Bengtsson
In ?base::.libPaths, there's a paragraph saying:

The library search path is initialized at startup from the environment
variable R_LIBS (which should be a colon-separated list of directories
at which R library trees are rooted) followed by those in environment
variable R_LIBS_USER. Only directories which exist at the time will be
included.

Shouldn't R_LIBS_SITE also be mentioned in that passage?  Something like:

...followed by those in environment variables R_LIBS_USER and R_LIBS_SITE. ...
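
For reference, a minimal ~/.Renviron sketch with all three variables set
(the paths are illustrative; only directories that exist at startup end
up on the search path):

```
# ~/.Renviron (illustrative paths)
R_LIBS=/opt/R/extra-library
R_LIBS_USER=~/R/%p-library/%v
R_LIBS_SITE=/usr/local/lib/R/site-library
```

With all three set, these directories would typically appear on
.libPaths() in that order, ahead of the system library.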

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Bioc-devel] BiocCheck and version requirement

2021-04-08 Thread Henrik Bengtsson
On Thu, Apr 8, 2021 at 11:22 AM Kasper Daniel Hansen
 wrote:
>
> R CMD check sometimes wants dependency on specific R versions, for example 
> when the file format changed (forgot which version .. was it 3.6). But that 
> warning is emitted when it tests the package utilizes version-specific stuff.

Yes, serialization format v3 was introduced in R 3.5.0.  If there are
R binary files with that version, then 'R CMD build' will detect this,
warn about it, and automatically inject an explicit Depends: R
(>= 3.5.0) in the built tarball. Example:

$ R CMD build teeny
...
* checking for empty or unneeded directories
  NB: this package now depends on R (>= 3.5.0)
  WARNING: Added dependency on R >= 3.5.0 because serialized objects
in  serialize/load version 3 cannot be read in older versions of R.
File(s) containing such objects: ‘teeny/inst/pi.rds’
* building ‘teeny_0.1.0.tar.gz’

But that's not a *check* error.  If you'd submit to CRAN, they would
not notice because it's fixed in the source tarball.  On Bioconductor,
you'd see it in the build logs, because that builds from the package
source folder.  I don't know if there are additional checks for this
in 'R CMD check' - could be - but I guess you would have to do some
tricks to circumvent 'R CMD build' in order to get a source tarball
without the dependency.

Note that if the R binary files are in serialization version 2, then
there is no such warning.  In other words, a package developer can
very well keep their package backward compatible with R (< 3.5.0) by
not updating/resaving/regenerating binary files in version 3.  To stay
with version 2, one can use, e.g. saveRDS(x, file, version=2).
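
As a small sketch of the two formats (file names are illustrative):

```r
## The default serialization format is version 3 since R 3.6.0, which is
## what triggers the automatic 'Depends: R (>= 3.5.0)' above.
x <- pi
saveRDS(x, "pi_v3.rds")               # version 3: readable only in R >= 3.5.0
saveRDS(x, "pi_v2.rds", version = 2)  # version 2: also readable in R < 3.5.0
```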

/Henrik

>
> I get that Bioc doesn't check under older versions. That is intentional. But 
> that is different from the check asking for a specific version for no 
> particular reason (as far as I can ascertain).
>
> Best,
> Kasper
>
> On Thu, Apr 8, 2021 at 8:13 PM Henrik Bengtsson  
> wrote:
>>
>> > I believe in the past we tried suggesting removing a R dependency 
>> > altogether but that R CMD check complained not having an R dependency?  
>> > Maybe someone remembers more about this.
>>
>> There are no such requirements in 'R CMD check'.  The only requirement
>> around 'Depends: R (>= x.y.z)', if specified, is with 'R CMD check
>> --as-cran' that requires z == 0. In other words, you can submit a
>> package to CRAN that depends on a specific patch version, e.g. either
>> R (>= 4.0.0) or R (>= 4.1.0) but not say R (>= 4.0.2) resulting in:
>>
>> * checking DESCRIPTION meta-information ... WARNING
>> Dependence on R version ‘4.0.2’ not with patchlevel 0
>>
>>
>> In the bigger picture: I argue that Bioconductor's dependency on R
>> version is a major, unnecessary(*) disadvantage and adds barriers
>> between the Bioconductor community and the rest of the R community
>> resulting in many lost opportunities and cross-pollination. (*) I know
>> the arguments, but I think there are better solutions already
>> available that do not enforce specific R versions on users and thereby
>> limit them, e.g. when they want to a single Bioconductor package with
>> zero dependencies in their workflow.  As a way forward, I'd suggest
>> that Bioconductor makes it a long-term target to make it possible for
>> any R user to install Bioc packages with a regular install.packages()
>> call - I think that can be achieved.
>>
>> /Henrik
>>
>> On Thu, Apr 8, 2021 at 10:39 AM Kern, Lori
>>  wrote:
>> >
>> > From a Bioconductor R perspective, we can't tell if the package would work 
>> > with a lesser version of R or lower versions of any package dependencies.  
>> > We accept packages and have requirements to ensure packages can run.
>> > You can always have another github branch
>> >
>> > I believe in the past we tried suggesting removing a R dependency 
>> > altogether but that R CMD check complained not having an R dependency?  
>> > Maybe someone remembers more about this.
>> >
>> >
>> >
>> >
>> > Lori Shepherd
>> >
>> > Bioconductor Core Team
>> >
>> > Roswell Park Comprehensive Cancer Center
>> >
>> > Department of Biostatistics & Bioinformatics
>> >
>> > Elm & Carlton Streets
>> >
>> > Buffalo, New York 14263
>> >
>> > 
>> > From: Kasper Daniel Hansen 
>> > Sent: Thursday, April 8, 2021 1:33 PM
>> > To: Kern, Lori 
>> > Cc: bioc-devel 
>> > Subject: Re: [Bioc-devel] BiocCheck and version requirement
>> >

Re: [Bioc-devel] BiocCheck and version requirement

2021-04-08 Thread Henrik Bengtsson
> I believe in the past we tried suggesting removing a R dependency altogether 
> but that R CMD check complained not having an R dependency?  Maybe someone 
> remembers more about this.

There are no such requirements in 'R CMD check'.  The only constraint
on 'Depends: R (>= x.y.z)', if specified, comes from 'R CMD check
--as-cran', which requires z == 0. In other words, you can submit a
package to CRAN that depends on a specific minor version, e.g. either
R (>= 4.0.0) or R (>= 4.1.0), but not on a patch level such as
R (>= 4.0.2), which results in:

* checking DESCRIPTION meta-information ... WARNING
Dependence on R version ‘4.0.2’ not with patchlevel 0


In the bigger picture: I argue that Bioconductor's dependency on R
version is a major, unnecessary(*) disadvantage and adds barriers
between the Bioconductor community and the rest of the R community
resulting in many lost opportunities and cross-pollination. (*) I know
the arguments, but I think there are better solutions already
available that do not enforce specific R versions on users and thereby
limit them, e.g. when they want to a single Bioconductor package with
zero dependencies in their workflow.  As a way forward, I'd suggest
that Bioconductor makes it a long-term target to make it possible for
any R user to install Bioc packages with a regular install.packages()
call - I think that can be achieved.

/Henrik

On Thu, Apr 8, 2021 at 10:39 AM Kern, Lori
 wrote:
>
> From a Bioconductor R perspective, we can't tell if the package would work 
> with a lesser version of R or lower versions of any package dependencies.  We 
> accept packages and have requirements to ensure packages can run.
> You can always have another github branch
>
> I believe in the past we tried suggesting removing a R dependency altogether 
> but that R CMD check complained not having an R dependency?  Maybe someone 
> remembers more about this.
>
>
>
>
> Lori Shepherd
>
> Bioconductor Core Team
>
> Roswell Park Comprehensive Cancer Center
>
> Department of Biostatistics & Bioinformatics
>
> Elm & Carlton Streets
>
> Buffalo, New York 14263
>
> 
> From: Kasper Daniel Hansen 
> Sent: Thursday, April 8, 2021 1:33 PM
> To: Kern, Lori 
> Cc: bioc-devel 
> Subject: Re: [Bioc-devel] BiocCheck and version requirement
>
> But why is it even a warning? The submission policy is that warnings are 
> discouraged. This means that developers will adapt to this warning.
>
> The check is also against the policies on the website which state you don't 
> need a formal dependency.
>
> Best,
> Kasper
>
> On Thu, Apr 8, 2021 at 3:53 PM Kern, Lori 
> mailto:lori.sheph...@roswellpark.org>> wrote:
> This requirement has been around for awhile.  New package submissions are 
> checked against the devel version of Bioconductor. At the moment this is R 
> devel (R 4.1) and Bioconductor packages in devel for 3.13.  Given that 
> Bioconductor releases are closely tied to a R release,  and changes in R can 
> (and have) had great consequences for package functionality, we can not 
> guarantee a package will work on any previous version of R or with previous 
> versions of packages.
> It is why it is a warning and not an error
>
>
> Lori Shepherd
>
> Bioconductor Core Team
>
> Roswell Park Comprehensive Cancer Center
>
> Department of Biostatistics & Bioinformatics
>
> Elm & Carlton Streets
>
> Buffalo, New York 14263
>
> 
> From: Bioc-devel 
> mailto:bioc-devel-boun...@r-project.org>> 
> on behalf of Kasper Daniel Hansen 
> mailto:kasperdanielhan...@gmail.com>>
> Sent: Thursday, April 8, 2021 9:44 AM
> To: bioc-devel mailto:bioc-devel@r-project.org>>
> Subject: [Bioc-devel] BiocCheck and version requirement
>
> The latest BiocCheck (well, it may have been around for a bit) _requires_
> the package to have a R >= 4.1 dependency.
>
> That seems new to me. Right now it's a bit irritating, because if you're
> submitting a package that works with latest stable release, you are now
> prohibited from installing it from Github into your stable Bioc version.
>
> Traditionally, we have not enforced this I think, even though we all know
> of the implicit dependency.
>
> --
> Best,
> Kasper
>
> [[alternative HTML version deleted]]
>
> ___
> Bioc-devel@r-project.org mailing list
> https://secure-web.cisco.com/1Zqtuoo0O2aKBea_yHofM_QCv72B3JNIupD47xAVitntUD9FgXVvT4yX66u57RWFhMonvou61R_vk6u1LgIM5J8qpHXw4gXWyAxGlZFJEH--5tT-UESMe6_L4bbB6jIcOfYl0J5FI0gucNH0boaxPdv4-It-V5j3TPd2bd5Er3K7MHNVFhqgA5bs84nYYGnvBuOVns86_d2q_mkKzVTHay7GQUxlJhDGVxQbxlwyKvaVPNraVZJKI3lQzwTpavNpm7CpFuIOaDv9a9-euSOlKn3NYMdkPxNfAHv3u2sI1vZ_1ww4KU4c5TgGsp-ard5Ix/https%3A%2F%2Fstat.ethz.ch%2Fmailman%2Flistinfo%2Fbioc-devel
>
>

Re: [Bioc-devel] Suppressing messages from Rprofile

2021-04-07 Thread Henrik Bengtsson
Correcting: the env var is R_LIBS_USER and nothing else.

/Henrik

On Wed, Apr 7, 2021 at 1:20 PM Henrik Bengtsson
 wrote:
>
> Can you go via a temporary file instead, i.e. output what you want to
> grab to a temporary file and pull the info from that instead?
>
> Side tracking ...
>
> > I thought about using --vanilla but I use .Rprofile to set my library path,
> > so ignoring that file completely defeats the point in my setup.
>
> I'm a proponent of customizing the library path via .Renviron instead
> of via .Rprofile.  In your case, you can have a line in ~/.Renviron
> with:
>
>   R_USER_LIBS=~/R/%p-library/%v-bioc_3.12
>
> Alternatively, you can set it elsewhere, e.g. ~/.bashrc, in a Linux
> environment module that is loaded, and so on.
>
> BTW, using Rscript --no-init-file ... would skip .Rprofile while still
> parsing .Renviron.
>
> /Henrik
>
> On Wed, Apr 7, 2021 at 8:16 AM Mike Smith  wrote:
> >
> > I have the following line in the configure.ac for rhdf5filters, which
> > returns the location of the HDF5 headers distributed with Rhdf5lib:
> >
> > RHDF5_INCLUDE=`"${R_HOME}/bin${R_ARCH_BIN}/Rscript" -e
> > 'cat(system.file("include", package="Rhdf5lib"))'`
> >
> > For me the output is a path like
> >  /mnt/data/R-lib/4.0.3-bioc_3.12/Rhdf5lib/include, which gets inserted into
> > the package Makevars file, and the package compilation works.
> >
> > However I've had multiple reports (
> > https://github.com/grimbough/rhdf5filters/issues/11) where this doesn't
> > work, all of which seem to relate to messages printed when an Rprofile is
> > loaded.  They have well intentioned messages like below, which don't work
> > so well when passed as compiler flags
> >
> > [1] "[BMRC] You have sourced the BMRC Rprofile provided at
> > /apps/misc/R/bmrc-r-user-tools/Rprofile"
> > [1] "[BMRC] Messages coming from this file (like this one) will be
> > prefixed with [BMRC]"
> > [1] "[BMRC] You are running R on host 
> > with CPU "
> > [1] "[BMRC] While running on this host, local R packages will be
> > sourced from and installed to
> > /well/combat/users/ifl143/R/4.0/ivybridge"
> > /gpfs3/well/combat/users/ifl143/R/4.0/ivybridge/Rhdf5lib/include
> >
> > I thought about using --vanilla but I use .Rprofile to set my library path,
> > so ignoring that file completely defeats the point in my setup.  Is anyone
> > aware of either a more reliable way of getting the information I want
> > (maybe suppressing messages, different mechanism entirely, etc)?
> > Alternatively, is there anything definitive in WRE or the like that
> > suggests printing messages Rprofile is a bad idea that I can pass on to the
> > users?
> >
> > Cheers,
> > Mike
> >
> > [[alternative HTML version deleted]]
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Suppressing messages from Rprofile

2021-04-07 Thread Henrik Bengtsson
Can you go via a temporary file instead, i.e. output what you want to
grab to a temporary file and pull the info from that instead?
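
To make that concrete, here is an untested configure.ac sketch of the
temporary-file variant of the Rhdf5lib lookup:

```sh
# configure.ac sketch (untested): have Rscript write the path straight to a
# temporary file, so stray .Rprofile output on stdout is never captured.
TMPF=`mktemp`
"${R_HOME}/bin${R_ARCH_BIN}/Rscript" -e \
  'cat(system.file("include", package="Rhdf5lib"), file = commandArgs(TRUE))' \
  "${TMPF}"
RHDF5_INCLUDE=`cat "${TMPF}"`
rm -f "${TMPF}"
```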

Side tracking ...

> I thought about using --vanilla but I use .Rprofile to set my library path,
> so ignoring that file completely defeats the point in my setup.

I'm a proponent of customizing the library path via .Renviron instead
of via .Rprofile.  In your case, you can have a line in ~/.Renviron
with:

  R_USER_LIBS=~/R/%p-library/%v-bioc_3.12

Alternatively, you can set it elsewhere, e.g. ~/.bashrc, in a Linux
environment module that is loaded, and so on.

BTW, using Rscript --no-init-file ... would skip .Rprofile while still
parsing .Renviron.

/Henrik

On Wed, Apr 7, 2021 at 8:16 AM Mike Smith  wrote:
>
> I have the following line in the configure.ac for rhdf5filters, which
> returns the location of the HDF5 headers distributed with Rhdf5lib:
>
> RHDF5_INCLUDE=`"${R_HOME}/bin${R_ARCH_BIN}/Rscript" -e
> 'cat(system.file("include", package="Rhdf5lib"))'`
>
> For me the output is a path like
>  /mnt/data/R-lib/4.0.3-bioc_3.12/Rhdf5lib/include, which gets inserted into
> the package Makevars file, and the package compilation works.
>
> However I've had multiple reports (
> https://github.com/grimbough/rhdf5filters/issues/11) where this doesn't
> work, all of which seem to relate to messages printed when an Rprofile is
> loaded.  They have well intentioned messages like below, which don't work
> so well when passed as compiler flags
>
> [1] "[BMRC] You have sourced the BMRC Rprofile provided at
> /apps/misc/R/bmrc-r-user-tools/Rprofile"
> [1] "[BMRC] Messages coming from this file (like this one) will be
> prefixed with [BMRC]"
> [1] "[BMRC] You are running R on host 
> with CPU "
> [1] "[BMRC] While running on this host, local R packages will be
> sourced from and installed to
> /well/combat/users/ifl143/R/4.0/ivybridge"
> /gpfs3/well/combat/users/ifl143/R/4.0/ivybridge/Rhdf5lib/include
>
> I thought about using --vanilla but I use .Rprofile to set my library path,
> so ignoring that file completely defeats the point in my setup.  Is anyone
> aware of either a more reliable way of getting the information I want
> (maybe suppressing messages, different mechanism entirely, etc)?
> Alternatively, is there anything definitive in WRE or the like that
> suggests printing messages Rprofile is a bad idea that I can pass on to the
> users?
>
> Cheers,
> Mike
>
> [[alternative HTML version deleted]]
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Updates to BiocFileCache, AnnotationHub, and ExperimentHub

2021-04-07 Thread Henrik Bengtsson
FWIW, I ran into a similar problem when I moved R.cache
(https://cran.r-project.org/package=R.cache) from using ~/.Rcache to
~/.cache/R/R.cache (etc).  I decided on making it 100%-backward
compatible, i.e. if there's already a legacy ~/.Rcache cache folder,
it'll keep using that, otherwise the new standard.  That way nothing
breaks, and it's not a biggie if it keeps writing to the legacy cache
folder.  For now, it's silent, but I'll eventually deprecate
~/.Rcache, e.g. by producing one-time warning per session and in a
later release be more aggressive, and eventually make it defunct (but
not rushing there).  Here's what I wrote in my NEWS release:

Version: 0.14.0 [2019-12-05]

SIGNIFICANT CHANGES:

 * Now R.cache uses a default cache path that adheres to the standard cache
   location on the current operating system, whereas in previous versions it
   defaulted to ~/.Rcache.  On Unix, the 'XDG Base Directory Specification'
   is followed, which means that the R.cache folder will typically be
   ~/.cache/R/R.cache/.  On macOS, it will be ~/Library/Caches/R/R.cache/.
   On modern versions of Microsoft Windows, environment variables such
   as 'LOCALAPPDATA' will be used, which typically resolves to
   '%USERPROFILE%/AppData/Local, e.g. 'C:/Users/alice/AppData/Local'.
    If R.cache fails to find a proper OS-specific cache folder, it will fall
   back to using ~/.Rcache as previously done.
   Importantly, if ~/.Rcache already exists, then that will be used by
   default.  This is done in order to not lose previously cached files.
   Users with an existing folder who wish to move to the new standard need
   to move or remove the ~/.Rcache folder manually.

/Henrik

On Wed, Apr 7, 2021 at 9:41 AM Aaron Lun
 wrote:
>
> > There is no guarantee we would be under the right user to have permissions 
> > to move the cache automatically and would not want to leave it in a broken 
> > state.
>
> Well, can't you try? If people follow your 4.1 instructions and they
> don't have permissions, the cache will be broken anyway.
>
> But let's say you can't move it, and your worst-case scenario comes to
> pass. EVEN THEN: I would expect a deprecation warning, no error, and
> BiocFileCache continuing to pull from the old cache for 6 months.
>
> Every previous non-transparent change to BioC's core infrastructure
> has come with a deprecation warning. I don't see why this is any
> different. An error is particularly galling given that the package was
> working fine before, it's not like you're doing some kind of critical
> bugfix.
>
> > This should not affect any cache that is explicitly stated with a different 
> > name in the constructor or using environment variables;  only in the case 
> > of BiocFileCache() .  Most package specific caches created their own cache 
> > in the constructor so it should not cause the ERROR in that case.
>
> If Vince's last email is any indication, and calling ExperimentHub()
> or AnnotationHub() causes an error... this will be a disaster. I'm
> going to get a lot of emails, unnecessary emails, from users wondering
> why scRNAseq and celldex don't work anymore. It'll be like our
> AWS-China problems multiplied by 10.
>
> Why not just make a new cache and populate it? Well, I don't really
> care what you do, as long as I don't get an error.
>
> -A
>
> > 
> > From: Aaron Lun 
> > Sent: Wednesday, April 7, 2021 11:41 AM
> > To: Kern, Lori 
> > Cc: bioc-devel@r-project.org 
> > Subject: Re: [Bioc-devel] Updates to BiocFileCache, AnnotationHub, and 
> > ExperimentHub
> >
> > Woah, I missed the part where you said that there would be an error.
> >
> > This does not sound good. Users are going to flip out, especially when
> > EHub and AHub are not visible dependencies (e.g., scRNAseq, celldex).
> > It also sounds completely unnecessary for EHub and AHub given that the
> > new cache can just be populated by fresh downloads. Similarly,
> > BiocFileCache::bfcrpath should not be affected, and people using that
> > shouldn't be getting an error.
> >
> > Why not just move the old default cache into the new location
> > automatically? This seems like the simplest solution given that
> > everyone accessing BFC resources should be doing so through the BFC
> > API. And most files are not position-dependent, unless people are
> > putting shared libraries in there.
> >
> > But even if you can't, an error is just too much. We use BiocFileCache
> > a lot in our company infrastructure and the brown stuff will hit the
> > fan if we have to find every old default cache and delete it. The
> > package should handle this for us.
> >
> > -A
> >
> > On Wed, Apr 7, 2021 at 4:46 AM Kern, Lori  
> > wrote:
> > >
> > > Mostly to lighten the dependency tree using tools that is built in with R 
> > > would remove one additional dependency.  Also clarity; the tools 
> > > directory adds an R folder for distinction that they are used with R 
> > > packages which seemed like if a user was ever 

Re: [R-pkg-devel] Vignettes from LaTeX files.

2021-03-05 Thread Henrik Bengtsson
Thank you. Glad to hear it's useful.

This plain TeX/LaTeX vignette engine is implemented using base R.  If
someone is willing to drive the efforts, I think it's not too much
work to refactor it and propose it for base R itself, where I think it
belongs, e.g. in the 'utils' package where Sweave lives, or in 'tools'
where tools::texi2dvi() lives.  The vignette engine could be called
'utils::tex', e.g. %\VignetteEngine{utils::tex}.  To me it meets the
standards for reproducible documentation from source, although,
obviously there's no tangled code output.  If we want the latter,
we're getting into literate programming, and we already have several
vignette engines for that.

Just a thought,

Henrik

On Thu, Mar 4, 2021 at 7:20 PM Rolf Turner  wrote:
>
>
>
> I have now tried out building vignettes from plain LaTeX source,
> following the admirably lucid instructions provided by Henrik Bengtsson
> in:
>
>  
> https://cran.r-project.org/web/packages/R.rsp/vignettes/R_packages-LaTeX_vignettes.pdf
>
> It all went unbelievably smoothly; no problems whatever.
>
> I am deeply grateful to Henrik.
>
> cheers,
>
> Rolf Turner
>
> --
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Phone: +64-9-373-7599 ext. 88276
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Why .Rbuildignore doesn't ignore?

2021-03-04 Thread Henrik Bengtsson
FYI, and in case others wonder or search for this later, the warnings:

Warning: invalid uid value replaced by that for user 'nobody'
Warning: invalid gid value replaced by that for user 'nobody'

are harmless and have nothing to do with the problem you reported here.
They happen because, on the system you're running on, your user account
has a user ID (UID) and a group ID (GID) greater than 32767, e.g.

$ id
uid=34002(alice) gid=34001(alicelab)

Since 'R CMD build' aims to build a *.tar.gz file that also works on
older systems, whose older tar software only supports UID/GID
<= 32767, it created the *.tar.gz file with the mockup UID=32767,
GID=32767 (the user 'nobody') instead of your actual UID/GID.

/Henrik

On Thu, Mar 4, 2021 at 11:59 AM Duncan Murdoch  wrote:
>
> I don't have any strong suggestions.  Things I'd look for:
>
> - is the .Rbuildignore filename spelled correctly?  It was in your
> message, but maybe there's something different in the actual file on
> disk.  Running file.exists(".Rbuildignore") should return TRUE if your
> working directory is the main package directory.
>
> - is it encoded properly?  I don't know that this would cause trouble,
> but it might.  Run tools::showNonASCIIfile(".Rbuildignore")  to see if
> there are any non-ASCII chars in it.
>
> - are the filename patterns entered correctly?  Spaces or punctuation
> before or after the patterns will be taken to be part of the pattern.
> If your working directory is the top level package directory, this
> emulates the test R CMD build uses:
>
>   files <- list.files()
>   files[tools:::inRbuildignore(files, ".")]
>
> It should list all the files that you want to ignore.
>
> Duncan Murdoch
>
> On 04/03/2021 2:06 p.m., Jose Barrera wrote:
> > Dear Duncan and Jeff,
> >
> > Yes, I see those files when tar xvf miclust_1.2.6.tar.gz.
> >
> > As usual, I am both building and checking in a terminal (outside RStudio):
> >
> > $ R CMD build miclust --resave-data
> > * checking for file ‘miclust/DESCRIPTION’ ... OK
> > * preparing ‘miclust’:
> > * checking DESCRIPTION meta-information ... OK
> > * checking for LF line-endings in source and make files and shell scripts
> > * checking for empty or unneeded directories
> > * building ‘miclust_1.2.6.tar.gz’
> > Warning: invalid uid value replaced by that for user 'nobody'
> > Warning: invalid gid value replaced by that for user 'nobody'
> >
> > Then
> >
> > $ R CMD check --as-cran miclust_1.2.6.tar.gz
> >
> > If I move miclust.Rproj and README.Rmd outside the development directory,
> > then there is no problem, but I find that solution somewhat dirty. Any
> > ideas?
> >
> > Thanks,
> >
> > Jose Barrera
> > Statistician, Associate Lecturer
> >
> > *IS**Global*
> > Barcelona Institute for Global Health - Campus MAR
> > Barcelona Biomedical Research Park (PRBB) (Room Hypatia)
> >
> > Doctor Aiguader, 88
> > 08003 Barcelona, Spain
> > Tel. +34 93 2147383
> > jose.barr...@isglobal.org
> > 
> > Personal website: sites.google.com/view/josebarrera
> > www.isglobal.org
> >
> > This message is intended exclusively for its addressee and may contain
> > information that is CONFIDENTIAL and protected by professional privilege.
> > If you are not the intended recipient you are hereby notified that any
> > dissemination, copy or disclosure of this communication is strictly
> > prohibited by law. If this message has been received in error, please
> > immediately notify us via e-mail and delete it.
> >
> > DATA PROTECTION. We inform you that your personal data, including your
> > e-mail address and data included in your email correspondence, are included
> > in the ISGlobal Foundation filing system. Your personal data will be used
> > for the purpose of contacting you and sending information on the activities
> > of the above foundations. You can exercise your rights to  access to
> > personal data, rectification, erasure, restriction of processing, data
> > portability and object by contacting the following address: 
> > *l...@isglobal.org
> > *. ISGlobal Privacy Policy at *www.isglobal.org
> > *.
> >
> >

Re: [R-pkg-devel] Using the amsmath package in a vignette.

2021-03-03 Thread Henrik Bengtsson
The R.rsp package has a vignette engine for plain LaTeX sources. See
https://cran.r-project.org/web/packages/R.rsp/vignettes/R_packages-LaTeX_vignettes.pdf
for how. It's straightforward. Maybe that helps.
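
In short, per that document, a plain LaTeX vignette only needs the
metadata comments below, plus 'Suggests: R.rsp' and 'VignetteBuilder:
R.rsp' in DESCRIPTION (file name and title are illustrative):

```latex
% vignettes/my-article.ltx (illustrative file name)
%\VignetteIndexEntry{My plain LaTeX vignette}
%\VignetteEngine{R.rsp::tex}
\documentclass{article}
\begin{document}
% ... plain LaTeX content; no R code is evaluated ...
\end{document}
```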

Henrik

On Wed, Mar 3, 2021, 18:51 Rolf Turner  wrote:

>
> I am trying to create a vignette in a package (basically just using
> LaTeX code; no R calculations or data are involved).
>
> The LaTeX code involves the use of the align* environment from
> the amsmath package.  When I try to run Sweave() on the *.Rnw file
> I get a corresponding *.tex file, but then when I run pdflatex on
> that file I get an error:
>
> > ! LaTeX Error: Command \iint already defined.
> >Or name \end... illegal, see p.192 of the manual.
> >
> > See the LaTeX manual or LaTeX Companion for explanation.
> > Type  H   for immediate help.
> >  ...
> >
> > l.649 ...d{\iint}{\DOTSI\protect\MultiIntegral{2}}
>
> which is completely opaque to me.
>
> If I don't have \usepackage{amsmath} in the *.Rnw file, I get
> (unsurprisingly) an error message to the effect that the align*
> environment is undefined.
>
> Is there any way to make use of amsmath facilities in a vignette?
>
> Alternatively, is there any way to simply use the pdf output obtained
> by processing an ordinary LaTeX file as a vignette?  I have done a
> bit of web searching on this, but all of the hits that I get seem to be
> substantially out of date.  They refer to putting vignettes in
> /inst/doc and I'm pretty sure that this is no longer how it's done.
> (But I find all of the vignette business rather bewildering and
> confusing.)
>
> Grateful for any advice.
>
> cheers,
>
> Rolf Turner
>
> --
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Phone: +64-9-373-7599 ext. 88276
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Bioc-devel] Remotes in DESCRIPTION not supported for BioC?

2021-03-02 Thread Henrik Bengtsson
Related: Does Bioconductor support 'Additional_repositories'? From WRE
(Writing R Extensions):

The ‘Additional_repositories’ field is a comma-separated list of
repository URLs where the packages named in the other fields may be
found. It is currently used by R CMD check to check that the packages
can be found, at least as source packages (which can be installed on
any platform).

FWIW, CRAN allows/supports them (I've only used them for packages
under 'Suggests:').

/Henrik
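
If it is supported, using it is just one more DESCRIPTION field (a
sketch; the package name and repository URL are hypothetical):

```
Suggests: someOptionalPkg
Additional_repositories: https://example.org/drat
```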

On Tue, Mar 2, 2021 at 4:02 AM Rainer Johannes
 wrote:
>
> thanks for the quick reply Lori!
>
> OK, so we will have to submit the package to Bioc ASAP.
>
> cheers, jo
>
> > On 2 Mar 2021, at 12:51, Kern, Lori  wrote:
> >
> > Currently the builders do not allow the user of remotes and you would have 
> > to wait for it to be submitted to CRAN or Bioconductor.  All Bioconductor 
> > dependencies must be on CRAN or Bioconductor.
> >
> > Get Outlook for iOS
> > From: Bioc-devel  on behalf of Rainer 
> > Johannes 
> > Sent: Tuesday, March 2, 2021 4:36:05 AM
> > To: bioc-devel 
> > Subject: [Bioc-devel] Remotes in DESCRIPTION not supported for BioC?
> >
> > Dear All!
> >
> > in one of the xcms vignettes we are using a R package which is not yet in 
> > Bioconductor and I added
> >
> > Remotes:
> > RforMassSpectrometry/MsBackendMgf
> >
> > to the DESCRIPTION thinking that on the BioC build machines the package 
> > might get installed (from github). It does apparently not as we get ERROR 
> > for this vignette. Is there any way I can use a non-Bioconductor and 
> > non-CRAN package or do I have to wait until the package is included in 
> > Bioconductor?
> >
> > thanks for any feedback!
> >
> > cheers, jo
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
> >
> >
> > This email message may contain legally privileged and/or confidential 
> > information. If you are not the intended recipient(s), or the employee or 
> > agent responsible for the delivery of this message to the intended 
> > recipient(s), you are hereby notified that any disclosure, copying, 
> > distribution, or use of this email message is prohibited. If you have 
> > received this message in error, please notify the sender immediately by 
> > e-mail and delete this email message from your computer. Thank you.
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Rsync to Bioconductor

2021-02-25 Thread Henrik Bengtsson
FYI, I just sent an email to 'maintai...@bioconductor.org' regarding
this but it bounced with the below errors (I've truncated your
address):

From: Mail Delivery System 
Date: Thu, Feb 25, 2021 at 12:39 PM
Subject: Undelivered Mail Returned to Sender
To: ...


This is the mail system at host delivery.bioconductor.org.

I'm sorry to have to inform you that your message could not
be delivered to one or more recipients. It's attached below.

For further assistance, please send mail to postmaster.

If you do so, please include this problem report. You can
delete your own text from the attached returned message.

   The mail system

: host mx1.roswellpark.iphmx.com[68.232.137.170]
said: 550 #5.7.1 Your access to submit messages to this e-mail system has
been rejected. (in reply to DATA command)

: host mx1.roswellpark.iphmx.com[68.232.137.170]
said: 550 #5.7.1 Your access to submit messages to this e-mail system has
been rejected. (in reply to DATA command)

: host
mx1.roswellpark.iphmx.com[68.232.137.170] said: 550 #5.7.1 Your access to
submit messages to this e-mail system has been rejected. (in reply to DATA
command)

: host mx1.roswellpark.iphmx.com[68.232.137.170]
said: 550 #5.7.1 Your access to submit messages to this e-mail system has
been rejected. (in reply to DATA command)

/Henrik

On Wed, Feb 24, 2021 at 8:03 AM Kern, Lori
 wrote:
>
> In the next few weeks we will be limiting the ability to rsync to 
> Bioconductor resources.  If you currently are using rsync please reach out to 
> maintai...@bioconductor.org as we will need additional information to 
> continue to allow rsync on your machines.
>
> Cheers,
>
>
> Lori Shepherd
>
> Bioconductor Core Team
>
> Roswell Park Comprehensive Cancer Center
>
> Department of Biostatistics & Bioinformatics
>
> Elm & Carlton Streets
>
> Buffalo, New York 14263
>
>
> [[alternative HTML version deleted]]
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [R-pkg-devel] Testing on old R versions

2021-01-28 Thread Henrik Bengtsson
Hi,

you're probably already aware of it, but 'rgl' depends on 'magrittr'
which depends on 'rlang', and the latter requires R (>= 3.3.0).


BTW, I find Singularity (https://github.com/hpcng/singularity/) more
convenient to work with than Docker, e.g. you run as host $USER also
"inside" the container, $PWD and your $HOME folder are automatically
mounted, the built container is a single executable (just like any
other software you run).   Here's how you can use Rocker's r-base with
Singularity on Linux (Ubuntu 18.04):

# Download and build

{host}$ cd /tmp
{host}$ singularity build r-3.3.0.sif docker://r-base:3.3.0

{host}$ ls -l r-3.3.0.sif
-rwxr-xr-x 1 hb hb 385630208 Jan 28 16:05 r-3.3.0.sif


# Run as-is

{host}$ ./r-3.3.0.sif

R version 3.3.0 (2016-05-03) -- "Supposedly Educational"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
...

> quit("no")

Comment: There's also a 'singularity {run,exec,eval} ...' similar to
'docker' which comes with more features but lots can be done without
it.


# Persistent package installs to host via custom R_LIBS_USER

{host}$ cd /tmp
{host}$ mkdir R-libs-3.3.0
{host}$ R_LIBS_USER=R-libs-3.3.0 ./r-3.3.0.sif

R version 3.3.0 (2016-05-03) -- "Supposedly Educational"
Copyright (C) 2016 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
...

> .libPaths()[1]
[1] "/tmp/R-libs-test"
> install.packages("rgl")
Installing package into ‘/tmp/R-libs-test’
(as ‘lib’ is unspecified)
also installing the dependencies ‘ps’, ‘fs’, ‘rappdirs’, ‘processx’,
‘Rcpp’, ‘BH’, ‘sass’, ‘jquerylib’, ‘callr’, ‘yaml’, ‘digest’,
‘base64enc’, ‘rlang’, ‘evaluate’, ‘highr’, ‘markdown’, ‘xfun’,
‘httpuv’, ‘mime’, ‘xtable’, ‘R6’, ‘sourcetools’, ‘later’, ‘promises’,
‘crayon’, ‘fastmap’, ‘withr’, ‘commonmark’, ‘glue’, ‘bslib’, ‘cachem’,
‘ellipsis’, ‘lifecycle’, ‘lazyeval’, ‘miniUI’, ‘webshot’,
‘htmlwidgets’, ‘htmltools’, ‘knitr’, ‘jsonlite’, ‘shiny’, ‘crosstalk’,
‘manipulateWidget’
...

** preparing package for lazy loading
Error : object ‘hcl.colors’ is not exported by 'namespace:grDevices'
ERROR: lazy loading failed for package ‘rgl’
* removing ‘/tmp/R-libs-3.3.0/rgl’

The downloaded source packages are in
‘/tmp/hb/RtmpzBZ0FJ/downloaded_packages’
Warning message:
In install.packages("rgl") :
  installation of package ‘rgl’ had non-zero exit status

> str(rownames(installed.packages()))
 chr [1:77] "base64enc" "BH" "bslib" "cachem" "callr" ...

> quit("no")


# The packages are indeed installed to the host folder and survives restarts

{host}$ R_LIBS_USER=R-libs-3.3.0 ./r-3.3.0.sif
...
> str(rownames(installed.packages()))
 Named chr [1:77] "base64enc" "BH" "bslib" "cachem" "callr" ...
 - attr(*, "names")= chr [1:77] "" "" "" "" ...

So, running R via Singularity is almost the same as running R directly
on the host.

/Henrik


On Thu, Jan 28, 2021 at 2:29 PM Dirk Eddelbuettel  wrote:
>
>
> On 28 January 2021 at 16:50, Duncan Murdoch wrote:
> | Thanks Dirk, Neal and Nathan.  I ended up going with Dirk's suggestion.
> |
> | So far I haven't got it to work in 3.2.0; I probably won't put much
> | effort into supporting that old version.  But it's fine in 3.4.0 and
> | 3.5.0, and I'm trying 3.3.0 now.  Rocker rocks!
>
> Thanks, I would have to agree here :)
>
> | P.S. rgl now installs in the basic r-base container, without adding
> | additional libs.  This is due to some config checks that let it run on a
> | barebones machine:  it can still produce WebGL output for a browser, it
> | just can't show it on screen.
>
> Most excellent, didn't know that part.  Ping we off-list if you're interested
> in automating this, I also use these containers (or sometimes extensions with
> additional build-dependencies) to test builds in CI settings. Setting that
> can be pretty straightforward and self-contained.  It's easiest to instrument
> when you don't even need other build dependencies.
>
> Dirk
>
> --
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package used unconditionally only in testing

2021-01-08 Thread Henrik Bengtsson
R CMD check --as-cran will give an ERROR if not all Suggest:ed
packages are installed and available when it runs.

/Henrik
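
If the intent is to skip cleanly rather than fail when the suggested
package is missing, testthat has a helper for exactly this (a minimal
sketch; with_mock_api() is httptest's mechanism for replaying recorded
responses):

```r
# At the top of a test file: skip the whole file unless httptest is installed
testthat::skip_if_not_installed("httptest")

testthat::test_that("API responses are handled", {
  httptest::with_mock_api({
    # ... assertions against recorded API fixtures ...
  })
})
```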

On Fri, Jan 8, 2021 at 11:25 AM Greg Freedman Ellis  wrote:
>
> > It should almost certainly be included in Suggests, and nowhere else.
> > If that means your tests are skipped, you should feel free to warn the
> > user in your test messages:  but it shouldn't cause your tests to fail.
>
> I guess my worry is that my upstream dependencies will break something
> without knowing it because the tests won't fail unless httptest is
> installed. I really wish there was a way to indicate dependencies for
> testing purposes only.
>
> (And, oops, you're right, I meant that I was considering including httptest
> in Imports not Depends)
>
> On Fri, Jan 8, 2021 at 12:04 PM David Bosak  wrote:
>
> > >> Namespace in Imports field not imported from: ‘httptest’
> >
> > >>All declared Imports should be used.
> >
> >
> >
> > I’ve seen that note before when I forgot to remove a @import reference in
> > the Roxygen comments.  Or it is still in the NAMESPACE file when I don’t
> > want it to be.
> >
> >
> >
> > Sent from Mail  for
> > Windows 10
> >
> >
> >
> > *From: *Duncan Murdoch 
> > *Sent: *Friday, January 8, 2021 11:15 AM
> > *To: *Greg Freedman Ellis ;
> > r-package-devel@r-project.org
> > *Subject: *Re: [R-pkg-devel] Package used unconditionally only in testing
> >
> >
> >
> > On 08/01/2021 9:17 a.m., Greg Freedman Ellis wrote:
> >
> > > Hi all,
> >
> > >
> >
> > > I'm trying to update a package to conform to pass tests given
> >
> > > `_R_CHECK_DEPENDS_ONLY_=TRUE`.
> >
> > >
> >
> > > In this package, we only use the package `httptest` during testing, but
> > the
> >
> > > tests are (almost) meaningless if it is not installed, so I would like to
> >
> > > indicate that it is a required package rather than skipping tests if it
> > is
> >
> > > not installed.
> >
> >
> >
> > This sounds wrong.  I don't know the httptest package, but I assume that
> >
> > since you were only using it for testing, I might be interested in using
> >
> > your package even if I wasn't interested in installing httptest.  Making
> >
> > it a hard requirement would force me to install httptest.
> >
> >
> >
> > >
> >
> > > If I move `httptest` to Depends, then I get the error CRAN check note:
> >
> > >> Namespace in Imports field not imported from: ‘httptest’
> >
> > >>All declared Imports should be used.
> >
> >
> >
> > You *definitely* shouldn't include it in Depends:  that would force it
> >
> > onto the search list, and potentially break other things that I'm
> >
> > running, e.g. if they have name conflicts with it.
> >
> >
> >
> > You *probably* shouldn't include it in Imports.  Why force me to load
> >
> > another package if I'll never use it?  There's a tiny chance that would
> >
> > push my memory use over the edge, and my own code would fail.
> >
> >
> >
> > It should almost certainly be included in Suggests, and nowhere else.
> >
> > If that means your tests are skipped, you should feel free to warn the
> >
> > user in your test messages:  but it shouldn't cause your tests to fail.
> >
> >
> >
> > > I think this would best be solved by a DESCRIPTION field that indicates a
> >
> > > package is required, but only for tests, but I do not see such a field.
> > The
> >
> > > only solution I can think of is to have a trivial import of `httptest` in
> >
> > > the main package to silence the NOTE. Is there a better solution?
> >
> >
> >
> > Most users aren't going to run your tests, so they shouldn't be forced
> >
> > to install software that would let them do so.
> >
> >
> >
> > Duncan Murdoch
> >
> >
> >
> > __
> >
> > R-package-devel@r-project.org mailing list
> >
> > https://stat.ethz.ch/mailman/listinfo/r-package-devel
> >
> >
> >
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Accessing features in R4.0.

2020-12-16 Thread Henrik Bengtsson
BTW, 'backports' provides a backport for tools::R_user_dir() also for
R (< 4.0.0), so an alternative solution in this case is:

Imports: backports

and

importFrom(backports, R_user_dir)

The 'backports' package takes the same approach as I did in my previous message.

Henrik
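
In code that boils down to the following (a sketch; it assumes backports
behaves as described, i.e. that its R_user_dir() defers to
tools::R_user_dir() on R >= 4.0.0 and supplies its own implementation on
older versions):

```r
# With 'importFrom(backports, R_user_dir)' in NAMESPACE, the same call
# works on both old and new R versions:
cache_dir <- R_user_dir("oysteR", which = "cache")
```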

On Wed, Dec 16, 2020 at 12:27 PM Henrik Bengtsson
 wrote:
>
> Colin, you can do:
>
> Imports: tools
>
> NAMESPACE:
> if (getRversion() >= 4) importFrom(tools,R_user_dir)
>
>
> R/000.namespace.R:
> ## Create a dummy R_user_dir() for R (< 4.0.0)
> ## to please R CMD check
> if (getRversion() < 4) {
>   R_user_dir <- function(...) NULL
> }
>
> and then use:
>
> if (getRversion() < 4) {
>   # do something
> } else {
>   R_user_dir("oysteR", which = "cache")
> }
>
> An advantage of this approach is that it's clear from Imports: and the
> NAMESPACE file what you're importing and when.  When using Suggests:
> and pkg::fcn() you can't peek at NAMESPACE to see what's actually
> used.
>
>
> Finally, if '#do something' is basically trying to do the same as
> tools::R_user_dir(), you could of course also consider writing your
> own forward-compatible wrapper for R (< 4.0.0), e.g.
>
> if (getRversion() < 4) {
>   R_user_dir <- function(...) {
> # do something
>}
> }
>
> and then use R_user_dir() as if you're running R (>= 4.0.0).  That's
> the cleanest version.
>
> Hope this helps,
>
> Henrik
>
>
> On Wed, Dec 16, 2020 at 11:12 AM Jeff Newmiller
>  wrote:
> >
> > For "obvious" reasons? I don't see this kind of avoidance as "obviously" 
> > correct at all. You have a dependency... it should be declared. There are 
> > various ways to proceed, with Imports or Depends or Suggests or pulling the 
> > code into your package... but trying to subvert the dependency management 
> > is counterproductive.
> >
> > On December 16, 2020 8:28:15 AM PST, Colin Gillespie 
> >  wrote:
> > >Hi,
> > >
> > >I'm planning on using the tools::R_user_dir function in a package. But
> > >for obvious reasons, I don't want to set the dependency on R 4.
> > >
> > >My code is basically
> > >
> > >if (as.numeric(R.version$major) < 4) # do something
> > >else tools::R_user_dir("oysteR", which = "cache")
> > >
> > >When checking on win-builder R3.6 I get the note
> > >
> > >* checking dependencies in R code ... NOTE
> > >Missing or unexported object: 'tools::R_user_dir'
> > >
> > >## Question
> > >
> > >Is my code correct and can I ignore this note?
> > >
> > >Thanks
> > >
> > >Colin
> > >
> > >
> > >Dr Colin Gillespie
> > >http://www.mas.ncl.ac.uk/~ncsg3/
> > >
> > >__
> > >R-package-devel@r-project.org mailing list
> > >https://stat.ethz.ch/mailman/listinfo/r-package-devel
> >
> > --
> > Sent from my phone. Please excuse my brevity.
> >
> > __
> > R-package-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Accessing features in R4.0.

2020-12-16 Thread Henrik Bengtsson
Colin, you can do:

Imports: tools

NAMESPACE:
if (getRversion() >= 4) importFrom(tools,R_user_dir)


R/000.namespace.R:
## Create a dummy R_user_dir() for R (< 4.0.0)
## to please R CMD check
if (getRversion() < 4) {
  R_user_dir <- function(...) NULL
}

and then use:

if (getRversion() < 4) {
  # do something
} else {
  R_user_dir("oysteR", which = "cache")
}

An advantage of this approach is that it's clear from Imports: and the
NAMESPACE file what you're importing and when.  When using Suggests:
and pkg::fcn() you can't peek at NAMESPACE to see what's actually
used.


Finally, if '#do something' is basically trying to do the same as
tools::R_user_dir(), you could of course also consider writing your
own forward-compatible wrapper for R (< 4.0.0), e.g.

if (getRversion() < 4) {
  R_user_dir <- function(...) {
# do something
   }
}

and then use R_user_dir() as if you're running R (>= 4.0.0).  That's
the cleanest version.

Hope this helps,

Henrik


On Wed, Dec 16, 2020 at 11:12 AM Jeff Newmiller
 wrote:
>
> For "obvious" reasons? I don't see this kind of avoidance as "obviously" 
> correct at all. You have a dependency... it should be declared. There are 
> various ways to proceed, with Imports or Depends or Suggests or pulling the 
> code into your package... but trying to subvert the dependency management is 
> counterproductive.
>
> On December 16, 2020 8:28:15 AM PST, Colin Gillespie  
> wrote:
> >Hi,
> >
> >I'm planning on using the tools::R_user_dir function in a package. But
> >for obvious reasons, I don't want to set the dependency on R 4.
> >
> >My code is basically
> >
> >if (as.numeric(R.version$major) < 4) # do something
> >else tools::R_user_dir("oysteR", which = "cache")
> >
> >When checking on win-builder R3.6 I get the note
> >
> >* checking dependencies in R code ... NOTE
> >Missing or unexported object: 'tools::R_user_dir'
> >
> >## Question
> >
> >Is my code correct and can I ignore this note?
> >
> >Thanks
> >
> >Colin
> >
> >
> >Dr Colin Gillespie
> >http://www.mas.ncl.ac.uk/~ncsg3/
> >
> >__
> >R-package-devel@r-project.org mailing list
> >https://stat.ethz.ch/mailman/listinfo/r-package-devel
>
> --
> Sent from my phone. Please excuse my brevity.
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Conditional timeout for httr request when running R CMD check

2020-11-30 Thread Henrik Bengtsson
The problem here concerned user-facing example() calls, not package tests.

In order to keep a neat example for the user to play with, I'd
probably wrap the example code in a \donttest{} statement.  Though, I
don't remember if CRAN tests with R CMD check --run-donttest, or not.
There's also \dontrun{}.

My $.02

/Henrik
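
In the Rd file that could look like this (a sketch; the function name
and arguments are hypothetical):

```
\examples{
\donttest{
  ## Network-dependent and potentially slow; skipped in routine checks
  states <- getAircraftStates(timeout = 60)
}
}
```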

On Mon, Nov 30, 2020 at 11:09 AM Jeff Newmiller
 wrote:
>
> Don't test against a live website for most of your testing... use recorded or 
> simulated input. If your package functional interface doesn't allow for that, 
> then re-factor it so it does.
>
> For those tests that actually have to interact with the live website, only 
> run them if you know you are not on CRAN. The testthat package has support 
> for this if you don't want to roll your own.
>
> On November 30, 2020 10:45:01 AM PST, "Ayala Hernandez, Rafael" 
>  wrote:
> >Dear all,
> >
> >My package openSkies includes a set of functions to retrieve
> >information from the OpenSky API.
> >
> >The examples for these functions can, on rare occasions, take
> >anomalously longer to complete than usual because of issues on
> >the API side.
> >
> >I have already included a timeout parameter to prevent endless attempts
> >to retrieve the result when, for example, the API is down for
> >maintenance.
> >
> >I have recently been asked by CRAN to reduce the execution time of each
> >example to below 5 s. In order to do this, I can set the timeout
> >parameter to something below 5s, and return NULL and a message
> >indicating the resource is not currently available. However, I think
> >this might not be the best option for examples that demonstrate
> >important functionalities to users.
> >
> >Therefore, I was wondering if there is anyway to set up the timeout
> >parameter to a different value than usual based on the condition that
> >examples are being run as part of R CMD check?
> >
> >Thanks a lot in advance
> >
> >Best wishes,
> >
> >Rafa
> >
> >   [[alternative HTML version deleted]]
> >
> >__
> >R-package-devel@r-project.org mailing list
> >https://stat.ethz.ch/mailman/listinfo/r-package-devel
>
> --
> Sent from my phone. Please excuse my brevity.
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] return (x+1) * 1000

2020-11-20 Thread Henrik Bengtsson
Without having dug into the details, it could be that one could update
the parser by making a 'return' a keyword and require it to be
followed by a parenthesis that optionally contains an expression
followed by end of statement (newline or semicolon).  Such a
"promotion" of the 'return' statement seems backward compatible and
would end up throwing syntax errors on:

function() return
function() return 2*x
function() return (2*x) + 1

while still accepting:

function() return()
function() return(2*x)
function() return((2*x) + 1)

Just my two Friday cents

/Henrik

On Fri, Nov 20, 2020 at 3:37 PM Dénes Tóth  wrote:
>
> Yes, the behaviour of return() is absolutely consistent. I am wondering
> though how many experienced R developers would predict the correct
> return value just by looking at those code snippets.
>
> On 11/21/20 12:33 AM, Gabriel Becker wrote:
> > And the related:
> >
> > > f = function() stop(return("lol"))
> >
> > > f()
> >
> > [1] "lol"
> >
> >
> > I have a feeling all of this is just return() performing correctly
> > though. If there are already R CMD CHECK checks for this kind of thing
> > (I wasnt sure but I'm hearing from others there may be/are) that may be
> > (and/or may need to be) sufficient.
> >
> > ~G
> >
> > On Fri, Nov 20, 2020 at 3:27 PM Dénes Tóth  > > wrote:
> >
> > Or even more illustratively:
> >
> > uneval_after_return <- function(x) {
> > return(x) * stop("Not evaluated")
> > }
> > uneval_after_return(1)
> > # [1] 1
> >
> > On 11/20/20 10:12 PM, Mateo Obregón wrote:
> >  > Dear r-developers-
> >  >
> >  > After many years of using and coding in R and other languages, I
> > came across
> >  > something that I think should be flagged by the parser:
> >  >
> >  > bug <- function (x) {
> >  >   return (x + 1) * 1000
> >  > }
> >  >> bug(1)
> >  > [1] 2
> >  >
> >  > The return() call is not like any other function call that
> > returns a value to
> >  > the point where it was called from. I think this should
> > straightforwardly be
> >  > handled in the parser by flagging it as a syntactic error.
> >  >
> >  > Thoughts?
> >  >
> >  > Mateo.
> >  > --
> >  > Mateo Obregón.
> >  >
> >  > __
> >  > R-devel@r-project.org  mailing list
> >  > https://stat.ethz.ch/mailman/listinfo/r-devel
> >  >
> >
> > __
> > R-devel@r-project.org  mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] return (x+1) * 1000

2020-11-20 Thread Henrik Bengtsson
FWIW, 'R CMD check --as-cran' in R-devel checks for "bogus return"
statements but I think that's only for the case when one forgets the
parentheses, e.g. 'return' instead of 'return()'.

I don't think it catches this case but I'm also not sure. Though, I can
imagine it might be possible to enhance the current check to include also
this case.

It could be that setting _R_CHECK_BOGUS_RETURN_=true will enable this check
also in earlier versions of R; not sure when it was introduced.

/Henrik

On Fri, Nov 20, 2020, 13:58 Gabriel Becker  wrote:

> Hi all,
>
> I can confirm this occurs for me as well.
>
> The one thing that comes to mind is that there are certain larger
> expressions that contain calls to return which we absolutely don't want to
> be an error, e.g
>
> if(somestuff)
> return(TRUE)
>
>
> That said, the actual expression Mateo pointed out certainly does look like
> an error (it definitely isn't going to do what the developer intended).
>
> I haven't looked at the parser much, to be honest. I assume there is
> perhaps enough differentiation of if/else that return() could be allowed
> within that but not inside a larger expression without it?
>
> There would be things that are legal (though horrifying) now that would
> stop working though, such as:
>
> f = function(a) {
>
> ret = switch(a,
>
>  "1"= return("haha got 1!"),
>
>  "2" = "regular ole 2")
>
> ret
>
> }
>
>
> Whether it would be a problem or not that such insanity wouldn't work is
> less clear. Are there valid non-if embedded return() cases that are
> important to allow? If so (and if they're not differentiated by the parser,
> which I somewhat doubt switch is, for example, though I'm not certain), I'm
> skeptical we'd be able to do as he suggests.
>
> It does seem worth considering though. If it can't be a hard parse error
> but we agree many/most cases are problematic, perhaps adding detecting this
> to the static checks that R CMD CHECK performs is another way forward.
>
> Best,
> ~G
>
> On Fri, Nov 20, 2020 at 1:34 PM Mateo Obregón 
> wrote:
>
> > Dear r-developers-
> >
> > After many years of using and coding in R and other languages, I came
> > across
> > something that I think should be flagged by the parser:
> >
> > bug <- function (x) {
> >  return (x + 1) * 1000
> > }
> > > bug(1)
> > [1] 2
> >
> > The return() call is not like any other function call that returns a
> value
> > to
> > the point where it was called from. I think this should straightforwardly
> > be
> > handled in the parser by flagging it as a syntactic error.
> >
> > Thoughts?
> >
> > Mateo.
> > --
> > Mateo Obregón.
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>
> [[alternative HTML version deleted]]
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Error in loadNamespace(x) : there is no package called 'formatR'

2020-11-13 Thread Henrik Bengtsson
I'm quite sure you want to use the following:

Suggests: knitr, rmarkdown, formatR
VignetteBuilder: knitr

Here are the details.  For the 'VignetteBuilder' field, you want to
put all packages that provide the **vignette engines** you are using.
For example, if your package vignettes use either of

%\VignetteEngine{knitr::knitr}
%\VignetteEngine{knitr::rmarkdown}

your package is using a vignette engine from the 'knitr' package, so
you need to specify:

VignetteBuilder: knitr

Next, with 'knitr' listed in 'VignetteBuilder', you need to make sure
'knitr' is listed in either 'Depends' or 'Suggests' (or possibly
'Imports' - not sure).  If 'knitr' is only used for your vignettes, it
is sufficient to specify it under 'Suggests', which is also the most
common way to use it, i.e.

Suggests: knitr

The above settles the **vignette-engine package**.  Then your vignette
engine might depend on additional packages.  Your package needs to
depend on those too, typically also listed under 'Suggests'.  For
example, when you use %\VignetteEngine{knitr::rmarkdown}, that
vignette engine requires the 'rmarkdown' package (can be guessed from
them name but reading the docs is the only way to be sure - I think
there's work in 'tools' to improve on this).So, this means you
need to use:

Suggests: knitr, rmarkdown

Finally, if your vignettes make use of additional, optional features
from other packages, you need to make sure your package depends on
those too.  Since you make use of 'formatR' features, you need to add
that to Suggests as well;

Suggests: knitr, rmarkdown, formatR

/Henrik

PS. Vignettes are a bit of special creatures. Their dependencies are
only needed during 'R CMD build' and 'R CMD check', which most
end-users never perform. I think it could be favorable if we could
declare vignette dependencies separate from install/run-time
dependencies, e.g.

VignetteBuilder: knitr
VignetteDepends: rmarkdown, formatR

It should make the above process a bit clearer.  It would also make it
clear to those who are only interested in viewing vignettes, but have
no interest in rebuilding vignettes, what packages they need to
install in order to access all of the package's functions.  Just an
idea.
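
Putting the pieces together, a typical R Markdown vignette header
matching the DESCRIPTION fields above looks like this (the title and
index entry are illustrative):

```
---
title: "Using mypkg"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Using mypkg}
  %\VignetteEngine{knitr::rmarkdown}
  %\VignetteEncoding{UTF-8}
---
```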

On Fri, Nov 13, 2020 at 7:55 AM Joseph Park  wrote:
>
> Thank you.
>
> On 11/13/20 10:31 AM, Gábor Csárdi wrote:
> > >From WRE:
> >
> > "Note that if, for example, a vignette has engine ‘knitr::rmarkdown’,
> > then knitr provides the engine but both knitr and rmarkdown are needed
> > for using it, so both these packages need to be in the
> > ‘VignetteBuilder’ field and at least suggested (as rmarkdown is only
> > suggested by knitr, and hence not available automatically along with
> > it). Many packages using knitr also need the package formatR which it
> > suggests and so the user package needs to do so too and include this
> > in ‘VignetteBuilder’."
> >
> > So I think you need
> >
> > Suggests: knitr, rmarkdown, formatR
> > VignetteBuilder: knitr, rmarkdown, formatR
> >
> > On Fri, Nov 13, 2020 at 3:23 PM Joseph Park  wrote:
> >> Ah, yes... I see it now in Writing R Extensions.  Apologies for the
> >> oversight.
> >>
> >> Regarding rmarkdown, is it redundant to include rmarkdown in
> >> VignetteBuilder if it is in Suggests, or is perhaps needed in the build
> >> config as a separate entity?
> >>
> >> e.g:
> >>
> >> Suggests: knitr, rmarkdown
> >> VignetteBuilder: knitr, formatR
> >>
> >> or
> >>
> >> Suggests: knitr, rmarkdown
> >> VignetteBuilder: knitr, rmarkdown, formatR
> >>
> >> Thank you.
> >>
> >> J Park
> >>
> >> On 11/13/20 8:58 AM, Gábor Csárdi wrote:
> >>> I think you need to Suggest the formatR package, because your
> >>> vignettes use it. From 'Writing R extensions':
> >>>
> >>> "Many packages using knitr also need the package formatR which it
> >>> suggests and so the user package needs to do so too and include this
> >>> in ‘VignetteBuilder’."
> >>>
> >>> Gabor
> >>>
> >>> On Fri, Nov 13, 2020 at 1:49 PM Joseph Park  wrote:
>  Dear r-package-devel,
> 
>  The rEDM package is failing the automated check, as noted here:
> 
>  https://win-builder.r-project.org/incoming_pretest/rEDM_1.7.0_20201113_131811/Windows/00check.log
> 
>  When running rhub::check_for_cran(), disk file errors were reported.
> 
>  The automated check seems to be failing with:
> 
>  Error in loadNamespace(x) : there is no package called 'formatR'
> 
>  This package does not explicitly use formatR:
> 
>  Imports: methods, Rcpp (>= 1.0.1)
>  LinkingTo: Rcpp, RcppThread
>  Suggests: knitr, rmarkdown
>  VignetteBuilder: knitr
> 
>  Could it be these errors (disk full, no formatR) are related?  If not,
>  does formatR need to be listed as a dependency?
> 
>  If the former (R server config/resource build errors), do I need to
>  resubmit the package?
> 
>  Thank you.
> 
>  J Park
> 
> 
>    [[alternative HTML version deleted]]

Re: [R-pkg-devel] Import package countreg that is not on CRAN

2020-11-13 Thread Henrik Bengtsson
You can change your package from using:

  Imports: countreg

to use:

  Suggests: countreg

For this to work, you'll have to update your code, package tests,
examples, and vignettes, to run conditionally on 'countreg' being
installed, e.g.

if (requireNamespace("countreg", quietly = TRUE)) {
  ...
}

Whether or not this makes sense depends on how central 'countreg' is
to your package.  If it is only needed optionally and for a small part
of your functionality, then it's doable but if it's a mission-critical
dependency, it might be too much of a hack.
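
For completeness, here is a slightly fuller sketch of this pattern,
wrapping the suggested package with an informative error when it is
missing.  (The function name 'fit_zerotrunc' is just a placeholder for
illustration.)

```r
## Hypothetical wrapper around a function from a suggested package.
## 'countreg' is only used if installed; otherwise we fail early with
## a message telling the user how to get it.
fit_zerotrunc <- function(...) {
  if (!requireNamespace("countreg", quietly = TRUE)) {
    stop("Package 'countreg' is needed for this function. Install it ",
         "with: install.packages('countreg', ",
         "repos = 'http://R-Forge.R-project.org')")
  }
  countreg::zerotrunc(...)
}
```

This way 'R CMD check' passes without 'countreg' installed, and
examples and tests can be guarded with the same requireNamespace()
condition.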

This is in compliance with CRAN and in agreement with what Uwe says too.

I've used this myself in, for instance,
https://cran.r-project.org/package=aroma.core where small parts of
the functionality depend on packages not in the mainstream (= CRAN &
Bioconductor) repositories. This allows those who wish to use the
alternative methods to do so, if they are willing to go the extra
mile to install them.  Also, if those packages are popular enough, it
might nudge their maintainers to submit to CRAN or Bioconductor - a
process that might take years, if it happens at all.

/Henrik

On Fri, Nov 13, 2020 at 3:23 AM Duncan Murdoch  wrote:
>
> On 13/11/2020 3:10 a.m., Jason Luo wrote:
> > Hi,
> >
> > I'm submitting a new package (https://github.com/Penncil/pda/) to CRAN. It
> > relies on some function (zerotrunc and hurdle in R/ODAP.R) from countreg (
> > https://rdrr.io/rforge/countreg/) , which is not on CRAN. The submission
> > returns error as below
> >
> > https://win-builder.r-project.org/incoming_pretest/pda_1.0_20201113_083442/Debian/00check.log
> >
> > Seems the r-forge repo is identified in the DESCRIPTION
> > Additional_repositories, but countreg is still not available. I assume this
> > is not a rare problem but didn't find useful solutions online. Any
> > suggestions? Thanks!
>
> If countreg is not in one of the mainstream repositories (CRAN or
> Bioconductor), then it may not have been subject to careful testing, so
> CRAN sees it as unreliable.  Since your package depends on it, yours is
> also unreliable, so CRAN won't publish it.
>
> I don't know anything about the pda or countreg packages, so this is
> general advice on what you could do, and may not be applicable here:
>
>- you could take over maintenance of countreg (if its current
> maintainer agrees), and put in the work to get it on CRAN.
>- you could copy parts of countreg to your own package (if its
> license allows that), and drop your dependence on it.
>- you could substitute some other CRAN package that provides
> equivalent functionality and depend on that instead.
>- you could drop the parts of your package that need countreg, and
> submit a smaller package to CRAN without that dependency.
>- you could publicize that your package is on Github, and give up on
> publishing it on CRAN.
>
> Duncan Murdoch
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Process to Incorporate Functions from {parallely} into base R's {parallel} package

2020-11-07 Thread Henrik Bengtsson
FWIW, there are indeed a few low hanging bug fixes in 'parallelly'
that should be easy to incorporate into 'parallel' without adding
extra maintenance.  For example, in parallel::makePSOCKcluster(), it
is not possible to disable SSH option '-l USER' so that it can be set
in ~/.ssh/config.  The remote user name will be the user name of your
local machine and if you try to set user=NULL, you'll end up with an
invalid SSH call.   The current behavior means that you are forced to
specify the remote user name in your R code.  All it takes to fix
this is to update:

  cmd <- paste(rshcmd, "-l", user, machine, cmd)

to something like:

  cmd <- paste(rshcmd, if (length(user) == 1L) paste("-l", user), machine, cmd)
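
To see why, here is a quick sketch (not part of the actual 'parallel'
sources) of what happens with user = NULL:

```r
## paste() recycles zero-length arguments to "", so with user = NULL
## the original code produces a dangling "-l" flag:
user <- NULL
paste("ssh", "-l", user, "machine", "cmd")
#> [1] "ssh -l  machine cmd"   # SSH would take 'machine' as the user name

## The guarded version drops the whole "-l USER" part instead:
paste("ssh", if (length(user) == 1L) paste("-l", user), "machine", "cmd")
#> [1] "ssh  machine cmd"
```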

This is one example of what I've patched in
parallelly::makeClusterPSOCK() over the years.  Another is the use of
reverse tunneling in SSH - that completely avoids the need to know and
specify your public IP and reconfiguring the firewalls from the remote
server back to your local machine so that the worker can connect back
to your local machine.  Not many users have the permission to
reconfigure firewalls and it's also extremely tedious.  Reverse SSH
tunneling is super simple; all you need to do is something like:

rshopts <- c(sprintf("-R %d:%s:%d", rscript_port, master, port), rshopts)

/Henrik

On Fri, Nov 6, 2020 at 4:37 PM Duncan Murdoch  wrote:
>
> On 06/11/2020 4:47 p.m., Balamuta, James Joseph wrote:
> > Hi all,
> >
> > Henrik Bengtsson has done some fantastic work with {future} and, more 
> > importantly, greatly improved constructing and deconstructing a 
> > parallelized environment within R. It was with great joy that I saw Henrik 
> > slowly split off some functionality of {future} into {parallelly} package. 
> > Reading over the package’s README, he states:
> >
> >> The functions and features added to this package are written to be 
> >> backward compatible with the parallel package, such that they may be 
> >> incorporated there later.
> >> The parallelly package comes with an open invitation for the R Core Team 
> >> to adopt all or parts of its code into the parallel package.
> >
> > https://github.com/HenrikBengtsson/parallelly
> >
> > I’m wondering what the appropriate process would be to slowly merge some 
> > functions from {parallelly} into the base R {parallel} package. Should this 
> > be done with targeted issues on Bugzilla for different fields Henrik has 
> > identified? Or would an omnibus patch bringing in all suggested 
> > modifications be preferred? Or is it best to discuss via the list-serv 
> > appropriate contributions?
>
> One way is to convince R Core that incorporating this into the parallel
> package would
>
>   - make less work for them, or
>   - add a lot to R that couldn't happen if it was a contributed package.
>
> The fact that it's good isn't a good reason to put it into a base
> package, which would largely mean transferring Henrik's workload to R
> Core.  There are lots of good packages, and their maintainers should
> continue to maintain them.
>
> Duncan Murdoch
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Delays in CRAN Windows Binaries?

2020-10-21 Thread Henrik Bengtsson
Related to this:

It would be neat to have a dashboard that reports what the current
latency is for the different CRAN "queues", e.g. how long the average
waiting time is when submitting a new package ("newbies") until you
get a manual reply or it's on CRAN, when submitting an update with all
OK before it hits CRAN, the waiting time for building Windows or macOS
binaries, etc.  Some, but not all, of this information can already be
guesstimated from the info on ftp://cran.r-project.org/incoming/, or
more easily from https://lockedata.github.io/cransays/articles/dashboard.html.
I think this could be a great contributor project - it doesn't have to
be hosted by CRAN.

/Henrik

On Wed, Oct 21, 2020 at 11:08 AM Marc Schwartz  wrote:
>
> Hi All,
>
> Just thought that I would check to see if there are any known issues/delays 
> for CRAN in creating R release and devel binaries for Windows for updated 
> packages.
>
> It has been four days since I submitted an update and the other binaries were 
> created within a couple of days. The two Windows binaries are the only 
> outstanding updates pending.
>
> Thanks!
>
> Marc Schwartz
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Bioc-devel] Git pack file greater than 5MB

2020-10-01 Thread Henrik Bengtsson
I understood that it's a submission. I just wanted to make sure it's
clear there might be side effects, e.g. people clone and collaborate
even before submitting to Bioc, and a history rewrite might surprise
existing collaborators.

/H

On Thu, Oct 1, 2020, 09:04 Nitesh Turaga  wrote:

> This package isn’t yet a Bioconductor package Henrik. It will break other
> forks most likely. This package hasn’t been submitted to the Contributions
> either to be reviewed. So this is the time to break what needs to be broken
> before it’s submitted to Bioconductor and gets into the Bioconductor git
> repository.
>
> Nitesh
>
> On Oct 1, 2020, at 11:57 AM, Henrik Bengtsson 
> wrote:
>
> Doesn't a git rewrite break all existing clones, forks out there? I'm
> happy to be corrected, if this is not the case.
>
> /Henrik
>
> On Thu, Oct 1, 2020, 08:16 Nitesh Turaga  wrote:
>
>> Hi,
>>
>> The BiocCheck will complain on the build system about the > 5MB package
>> size.
>>
>> The rewrite of the history with BFG cleaner (
>> https://rtyley.github.io/bfg-repo-cleaner/ <
>> https://rtyley.github.io/bfg-repo-cleaner/>) is not as severe as you
>> think it is to be honest. It is just removing these pack files which don’t
>> have a place in the tree structure. These are more often than not, orphan
>> files.
>>
>> If you are suspect of this solution, I would suggest you make a backup
>> clone of your repo and try it on that first before you touch the main repo.
>> Check the history (git log) to see if anything important is missing.
>>
>> But usually a software package has to be below 5MB. If you have some data
>> in there which is needed for the package, consider Experiment Hub.
>>
>> Best,
>>
>> Nitesh
>>
>> > On Sep 30, 2020, at 12:46 PM, McGrath, Max 
>> wrote:
>> >
>> > Hi all,
>> >
>> > We have a package that is ready for submission, but when running
>> BiocCheck a warning is generated noting that "The following files are over
>> 5MB in size: '.git/objects/pack/pack-xxx...". I've pruned, repacked, and
>> run git gc which reduced the file size from 5.2 to 5.1MB, but I have been
>> unable to reduce it further.
>> >
>> > I'm reaching out to determine if this is an issue, and if so to ask for
>> recommendations for solving it. Currently, the only solution I've come up
>> with is to rewrite the repository's history using a tool like
>> "git-filter-repo", but this is a more drastic action than I would prefer to
>> take. I would greatly appreciate any advice on the matter.
>> >
>> > Thank you,
>> > Max McGrath
>> >
>> >   [[alternative HTML version deleted]]
>> >
>> > ___
>> > Bioc-devel@r-project.org mailing list
>> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
>>
>>
>> [[alternative HTML version deleted]]
>>
>> ___
>> Bioc-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/bioc-devel
>>
>
>

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Git pack file greater than 5MB

2020-10-01 Thread Henrik Bengtsson
Doesn't a git rewrite break all existing clones, forks out there? I'm happy
to be corrected, if this is not the case.

/Henrik

On Thu, Oct 1, 2020, 08:16 Nitesh Turaga  wrote:

> Hi,
>
> The BiocCheck will complain on the build system about the > 5MB package
> size.
>
> The rewrite of the history with BFG cleaner (
> https://rtyley.github.io/bfg-repo-cleaner/ <
> https://rtyley.github.io/bfg-repo-cleaner/>) is not as severe as you
> think it is to be honest. It is just removing these pack files which don’t
> have a place in the tree structure. These are more often than not, orphan
> files.
>
> If you are suspect of this solution, I would suggest you make a backup
> clone of your repo and try it on that first before you touch the main repo.
> Check the history (git log) to see if anything important is missing.
>
> But usually a software package has to be below 5MB. If you have some data
> in there which is needed for the package, consider Experiment Hub.
>
> Best,
>
> Nitesh
>
> > On Sep 30, 2020, at 12:46 PM, McGrath, Max 
> wrote:
> >
> > Hi all,
> >
> > We have a package that is ready for submission, but when running
> BiocCheck a warning is generated noting that "The following files are over
> 5MB in size: '.git/objects/pack/pack-xxx...". I've pruned, repacked, and
> run git gc which reduced the file size from 5.2 to 5.1MB, but I have been
> unable to reduce it further.
> >
> > I'm reaching out to determine if this is an issue, and if so to ask for
> recommendations for solving it. Currently, the only solution I've come up
> with is to rewrite the repository's history using a tool like
> "git-filter-repo", but this is a more drastic action than I would prefer to
> take. I would greatly appreciate any advice on the matter.
> >
> > Thank you,
> > Max McGrath
> >
> >   [[alternative HTML version deleted]]
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
>
>
> [[alternative HTML version deleted]]
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel
>

[[alternative HTML version deleted]]

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [R-pkg-devel] Anyone Know How To Setup Wine for Windows Testing?

2020-09-21 Thread Henrik Bengtsson
Getting R to run well on Wine would be really neat.  Here are some
additional comments that might encourage someone to move this forward.

On Ubuntu 18.04, you can install and run R 3.6.0 on Wine as:

curl -O https://cloud.r-project.org/bin/windows/base/old/3.6.0/R-3.6.0-win.exe
wine R-3.6.0-win.exe /SILENT
wine start "%ProgramW6432%/R/R-3.6.0/bin/i386/R"

However, for me, newer versions of R (e.g. R 3.6.1) fail to install
with an error "This program does not support the version of Windows
your computer is running."  This is with the current stable version of
Wine, 5.0.2 (August 2020), available from https://www.winehq.org/.
(If you struggle to install Wine (>= 4.5) on Ubuntu 18.04, see also
https://askubuntu.com/questions/1145473/how-do-i-install-libfaudio0).
I observed the same with Wine 3.0.


Some more comments: Back in 2016 I was looking into running R on Wine
via futures.  This could make it possible to access features in
Windows-only R packages from R sessions running Linux.  I only spent
so much time on it before concluding it wasn't stable and not worth my
time to investigate further.  For instance, I managed to launch a Wine
background worker as in:

rscript <- c("wine", "start", "/Unix",
             normalizePath("~/.wine/drive_c/Program Files/R/R-3.5.1/bin/i386/Rscript.exe",
                           mustWork = TRUE))
cl <- future::makeClusterPSOCK(1L, rscript = rscript, verbose = TRUE)

but only when running 32-bit Wine.  On 64-bit Wine it stalled.

/Henrik

On Mon, Sep 21, 2020 at 7:16 AM Tomas Kalibera  wrote:
>
> On 9/18/20 11:56 PM, Steve Bronder wrote:
> > inline
> >
> >
> > On Wed, Sep 9, 2020 at 3:35 AM Tomas Kalibera
> > mailto:tomas.kalib...@gmail.com>> wrote:
> >
> > On 7/16/20 7:57 PM, Steve Bronder wrote:
> > > On Wed, Jul 15, 2020 at 2:22 PM Neal Fultz  > > wrote:
> > >
> > >> If you don't mind multi-gig docker containers, this can be helpful:
> > >>
> > >> https://github.com/scottyhardy/docker-wine
> > >>
> > >> It doesn't work with 64 bit versions of R as far as I could
> > tell, but 32
> > >> bit did install and start correctly in a few clicks when I
> > tried last year.
> > >>
> > > Thanks! I'm hoping if I can get this all working locally I can put
> > > everything into a docker container for other folks. At this
> > point I have R
> > > up and running and it can install binary packages, but there are
> > some
> > > terrible terrible Cygwin/Rtools errors I can't figure out. In
> > particular
> > > this warning / message seems worrisome
> > >
> > > Cygwin WARNING:
> > >Couldn't compute FAST_CWD pointer.  This typically occurs if
> > you're using
> > >an older Cygwin version on a newer Windows.  Please update to
> > the latest
> > >available Cygwin version from https://cygwin.com/. If the problem
> > > persists,
> > >please see https://cygwin.com/problems.html
> > >
> > > If anyone has interest I can post a script for setting up the
> > wine instance
> > > as far as I can get atm.
> >
> > Did you have any update on this? If there is a problem with MinGW-w64
> > version, you can (just for experimentation) try UCRT demo build,
> > which
> > uses a newer version of MinGW-w64 than RTools 4:
> >
> >
> > So this worked with a patched version of wine. that fixed the above
> > error. I'm waiting for that patch to go in and for the next wine
> > release to try this again.
> Thanks!
> >
> >
> > 
> > https://developer.r-project.org/Blog/public/2020/07/30/windows/utf-8-build-of-r-and-cran-packages/index.html
> > (reference at very bottom)
> >
> > > On Wed, Jul 15, 2020 at 10:56 AM J C Nash  > > wrote:
> > >
> > >>> Are you sure you want to try to run R etc. under Wine?
> > >>>
> > > Do I want to? No :-). But we (Stan) want to use flto and are
> > seeing errors
> > > on windows I want to be able to debug locally.
> >
> > Please note Brian Ripley has added/improved support for
> > cross-compiling
> > on Linux for Windows, so that it can be used for diagnosing LTO
> > warnings
> > (see NEWS in R-devel and src/gnuwin32/README.compilation). The
> > cross-compilers and needed cross-compiled libraries for R itself are
> > part of at least some Linux distributions (he tested on Fedora).
> >
> >
> > Yes very exciting!
> >
> >
> > >>> - If you have Windows running, either directly or in a VM, you
> > can run R
> > >>> there.
> > >>>
> > > Sadly I don't have access to a windows machine. If I can't
> > figure this out
> > > then I'll probably just get a windows aws instance. But it would
> > be nice
> > > for people to have a wine setup they could test locally on.
> >
> > Microsoft is giving for free time-limited VMs primarily for testing
> > Edge/MSIE, you can run them in e.g. in Virtualbox. As instructed
> >

Re: [R-pkg-devel] Use of `:::` in a package for code run in a parallel cluster

2020-09-14 Thread Henrik Bengtsson
Without having read all of the comments already made here, but my
understanding why ::: is not allowed is because you are reaching into
the internal API that the package owner does not guarantee will exist
in the next release.  If you rely on the internal code of another CRAN
package in your CRAN package, your CRAN package might break outside
of your control.  This might trigger an avalanche of reverse package
dependencies failing on CRAN.

The only thing you can safely rely on is the API that is explicitly
*exported* by an R package.  In order for the maintainer to break that
API for reverse dependent packages, they need to go through a process
of deprecating and defuncting what they want to break/remove - a
process that involves multiple releases and often reaching out to
package maintainers and asking them to update accordingly.   CRAN runs
reverse package dependency checks making sure that a package does not
break its exported API.  If it does, it will not roll out on CRAN.
So, in that sense CRAN helps uphold the contract of the exported APIs.
In contrast, a maintainer can do whatever they want whenever they want
with their internal code/API.

With more and more packages being infrastructure packages, I think
there is room for "protected" API, which is not exported to avoid
cluttering up the search path for end-users while it yet provides a
contract toward package developers relying on it.  There are various
ways to emulate such protected APIs but we don't have a standard and
there's a risk that 'R CMD check' fails to detect when the contract is
broken (resulting in delayed run-time errors on the user end).
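
For example, one common way to emulate such a protected API today is a
run-time namespace lookup (a sketch; note this is semantically
equivalent to ::: and, as mentioned, 'R CMD check' will not detect
when the contract is broken):

```r
## Run-time lookup of an unexported object, instead of pkg:::name.
## This avoids the 'R CMD check' NOTE, but the contract is just as
## fragile: if the internal object is removed upstream, this fails
## only at run time.  ('defaultClusterOptions' of 'parallel' is used
## purely as an illustration here.)
opts <- utils::getFromNamespace("defaultClusterOptions", "parallel")
```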

My $.02

Henrik

On Mon, Sep 14, 2020 at 12:06 PM David Kepplinger
 wrote:
>
> Yes, my view is certainly rigid and I agree that in the cases where the
> function is actually used directly by the user, exporting it is the correct
> step.
>
> However, it seems some packages actually need to access internal functions
> from an outside context, but the code that accesses the function is
> logically contained completely inside the package. In these cases, package
> maintainers seem to be looking for alternatives to `:::` for the sake of
> avoiding the R CMD check note. I argue that the work arounds, however,
> either (a) achieve the exact same result as `:::`, but in a less
> transparent and likely more error prone way, or (b) unnecessarily making an
> internal function available to the user.
>
> I also agree with the CRAN team that package maintainers need to be made
> aware of the issue when using `:::` inside their package as it is most
> likely unnecessary. But the phrasing of the note ("almost never needs to
> use :::") combined with a lack of transparent guidelines on when it is
> acceptable leads to maintainers looking for alternatives mimicking the
> behavior of `:::`. I haven't found any official instructions in Writing R
> extensions or on the mailing list under what circumstances `:::` is deemed
> to be acceptable by the CRAN team (I have to admit searching for `:::` in
> the archives yields so many results I haven't looked at all of them). It's
> probably impossible to conceive every possible use case for `:::`, but a
> good start may be to have something in the documentation explicitly
> mentioning commonly observed patterns where `:::` is not acceptable, and
> the common exceptions to the rule (if there are any).
>
> Maybe this issue is so miniscule and almost never comes up that it's not
> worth mentioning in the documentation.
>
> Best,
> David
>
>
>
> On Mon, Sep 14, 2020 at 3:19 AM Georgi Boshnakov <
> georgi.boshna...@manchester.ac.uk> wrote:
>
> > You may have a case to argue to CRAN that you can get the "almost"
> > exemption (can't say without details) but your views look overly rigid.
> >
> > Exporting an object and marking it as internal is not a "work around",
> > even less a "dirty trick".
> > Export makes the object available outside the package's namespace and
> > makes it clear that this is intentional.
> > If you can't drop the 'package:::' prefix in your use case, this means
> > that this is what you actually do (i.e. use those objects outside the
> > namespace of the package). I would be grateful to CRAN for asking me to
> > export and hence document this.
> >
> >
> > Georgi Boshnakov
> >
> > PS Note that there is no such thing as "public namespace".
> >
> >
> > -Original Message-
> > From: R-package-devel  On Behalf
> > Of David Kepplinger
> > Sent: 13 September 2020 20:52
> > To: R Package Devel 
> > Subject: [R-pkg-devel] Use of `:::` in a package for code run in a
> > parallel cluster
> >
> > Dear list members,
> >
> > I submitted an update for my package and got automatically rejected by the
> > incoming checks (as expected from my own checks) for using `:::` calls to
> > access the package's namespace.
> > "There are ::: calls to the package's namespace in its code. A package
> > *almost* never needs to use ::: for its own objects:…" (emphasis mine)
> >
> > 

Re: [Rd] Rgui never processes ~/.Renviron

2020-09-09 Thread Henrik Bengtsson
I've "moved" this to
https://bugs.r-project.org/bugzilla/show_bug.cgi?id=17919 to make sure
it's tracked.  /Henrik

On Thu, Sep 3, 2020 at 7:25 AM Dirk Eddelbuettel  wrote:
>
>
> On 2 September 2020 at 23:38, Henrik Bengtsson wrote:
> | WORKAROUND:
> | Setting R_USER or HOME prior to calling Rgui will cause Rgui to
> | process ~/.Renviron, e.g.
>
> AFAICR one _always_ had to manually set $HOME on Windows as the convention of
> having it comes from the some other OSs and is not native.
>
> In short I don't think this is new. A quick Google search seems to confirm
> this with a SuperUser answer from 2013:
>
> https://superuser.com/questions/607105/is-the-home-environment-variable-normally-set-in-windows
>
> Dirk
>
> --
> https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Rgui never processes ~/.Renviron

2020-09-03 Thread Henrik Bengtsson
ISSUE:
It looks like Rgui.exe never processes ~/.Renviron - only ./.Renviron.

REPRODUCIBLE EXAMPLE:
On Windows, create the following ~/.Renviron and ~/.Rprofile files:

C:\Users\alice> Rscript -e "cat('FOO=123\n', file='~/.Renviron')"
C:\Users\alice> Rscript -e "cat('print(Sys.getenv(\'FOO\'))',
file='~/.Rprofile')"

and launch Rgui (from a folder other that ~):
C:\Users\alice> Rgui

and you'll see that FOO is reported as "" (empty), whereas with R or
Rscript, it is reported as "123".


TROUBLESHOOTING:
From code inspection
():

#ifdef Win32
  {
char buf[1024]; /* MAX_PATH is less than this */
/* R_USER is not necessarily set yet, so we have to work harder */
s = getenv("R_USER");
if(!s) s = getenv("HOME");
if(!s) return;
snprintf(buf, 1024, "%s/.Renviron", s);
s = buf;
  }
#endif

I think it happens because neither R_USER nor HOME is set when the
Rgui startup process calls process_user_Renviron().

WORKAROUND:
Setting R_USER or HOME prior to calling Rgui will cause Rgui to
process ~/.Renviron, e.g.

C:\User\alice> set R_USER=%UserProfile%\Documents
C:\User\alice> Rgui

The background for finding this is R-help thread '[R] tempdir() does
not respect TMPDIR' on 2020-08-29
(https://stat.ethz.ch/pipermail/r-help/2020-August/468573.html).

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Bug in stats:::`[.formula`: (~ NULL)[2] gives Error ... missing value where TRUE/FALSE needed

2020-08-14 Thread Henrik Bengtsson
Hi, it looks like:

> stats:::`[.formula`
function (x, i)
{
ans <- NextMethod("[")
if (length(ans) == 0L || as.character(ans[[1L]])[1L] == "~") {
class(ans) <- "formula"
environment(ans) <- environment(x)
}
ans
}



doesn't like to extract NULL components on either the LHS or RHS.  For
example, with

> (~ NULL)[2]
Error in if (length(ans) == 0L || as.character(ans[[1L]])[1L] == "~") { :
  missing value where TRUE/FALSE needed

this despite:

> str(as.list(~ NULL))
List of 2
 $ : symbol ~
 $ : NULL
 - attr(*, "class")= chr "formula"
 - attr(*, ".Environment")=

> length(~ NULL)
[1] 2

> (~ NULL)[[2]]
NULL

Other examples are:

> (NULL ~ .)[2]
Error in if (length(ans) == 0L || as.character(ans[[1L]])[1L] == "~") { :
  missing value where TRUE/FALSE needed

> (NULL ~ NULL)[3]
Error in if (length(ans) == 0L || as.character(ans[[1L]])[1L] == "~") { :
  missing value where TRUE/FALSE needed


TROUBLESHOOTING:

The reason is that ans[[1L]] becomes NULL;

> trace(stats:::`[.formula`, tracer = quote(utils::str(as.list(ans))), at = 3L)
Tracing function "[.formula" in package "stats (not-exported)"
[1] "[.formula"

> (~ NULL)[2]
Tracing `[.formula`((~NULL), 2) step 3
List of 1
 $ : NULL
Error in if (length(ans) == 0L || as.character(ans[[1L]])[1L] == "~") { :
  missing value where TRUE/FALSE needed

which causes 'as.character(ans[[1L]])[1L] == "~"' to resolve to NA.
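
The NA comparison can be reproduced in isolation (a minimal sketch):

```r
## as.character(NULL) is character(0), so indexing with [1L] gives NA,
## and an NA condition is exactly what if() refuses to handle:
as.character(NULL)[1L]         #> NA
as.character(NULL)[1L] == "~"  #> NA
try(if (NA) "yes")             #> Error ... missing value where TRUE/FALSE needed
```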


PATCH:

A minimal backward compatible fix would be:

`[.formula` <- function(x,i) {
ans <- NextMethod("[")
## as.character gives a vector.
if(length(ans) == 0L || (is.null(ans[[1L]]) && i <= length(x)) ||
   as.character(ans[[1L]])[1L] == "~") {
class(ans) <- "formula"
environment(ans) <- environment(x)
}
ans
}

A better fix would probably be to also detect out of range as in:

`[.formula` <- function(x,i) {
if (i > length(x))
stop(gettext("index out of range"))
ans <- NextMethod("[")
## as.character gives a vector.
if(length(ans) == 0L || is.null(ans[[1L]]) ||
   as.character(ans[[1L]])[1L] == "~") {
class(ans) <- "formula"
environment(ans) <- environment(x)
}
ans
}

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Bioc-devel] sapply and vapply

2020-08-12 Thread Henrik Bengtsson
FWIW,

> sapply(X, length) - always numeric(1) (integer(1) or double(1) for vectors of 
> more than 2^31 - 1 elements)

Actually, the length of length(x) may not be 1L, e.g.

> x <- Formula::Formula(~ x)
> length(x)
[1] 0 1

From help("length", package = "base"):

"Warning: Package authors have written methods that return a result of
length other than one (Formula) and that return a vector of type
double (Matrix), even with non-integer values (earlier versions of
sets). Where a single double value is returned that can be represented
as an integer it is returned as a length-one integer vector."

I/we recently learned this the hard way
(https://github.com/HenrikBengtsson/future/issues/395).  It's rather
unfortunate that not even length() is strictly defined here, I'd say.
I think we could move away from this if lengths() were a generic, so
that lengths(x) could be used above.  But that's a discussion for
R-devel.
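
Tying this back to the sapply()/vapply() question: vapply() is exactly
what catches such a violation (a sketch, assuming the 'Formula'
package is installed):

```r
## length() on a Formula object returns a vector of length 2, so
## vapply()'s length-1 guarantee fails loudly, whereas sapply() would
## silently fall back to returning a list:
if (requireNamespace("Formula", quietly = TRUE)) {
  x <- list(f = Formula::Formula(~ x), v = 1:3)
  res <- try(vapply(x, length, integer(1)), silent = TRUE)
  print(inherits(res, "try-error"))  # TRUE
}
```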

/Henrik


On Wed, Aug 12, 2020 at 9:33 AM Laurent Gatto
 wrote:
>
> Dear all,
>
> I have a quick question regarding the usage of vapply and sapply. The former 
> is recommended to insure that the output is always a vector of a specific 
> type. For example:
>
> > df1 <- data.frame(x = 1:3, y = LETTERS[1:3]) ## OK test
> > df2 <- data.frame(x = 1:3, y = Sys.time() + 1:3) ## Not OK test
> > sapply(df1, class) ## vector of chars, OK
>   x   y
>   "integer" "character"
> > sapply(df2, class) ## ouch, not a vector
> $x
> [1] "integer"
>
> $y
> [1] "POSIXct" "POSIXt"
>
> > vapply(df2, class, character(1)) ## prefer an error rather than a list
> Error in vapply(df2, class, character(1)) : values must be length 1,
>  but FUN(X[[2]]) result is length 2
>
> There are cases, however, were FUN ensures that the output will be of length 
> 1 and of a expected type. For example
>
> - sapply(X, all) - all() always returns logical(1)
> - sapply(X, length) - always numeric(1) (integer(1) or double(1) for vectors 
> of more than 2^31 - 1 elements)
>
> or more generally
>
> - sapply(X, slot, "myslot") - slot() will always return a character(1) 
> because @myslot is always character(1) (as defined by the class)
>
> Would you still recommend to use vapply() in such cases?
>
> Thank you in advance.
>
> Laurent
>
>
>
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [R-pkg-devel] Package Anchored Links with R-Dev

2020-08-10 Thread Henrik Bengtsson
These recently introduced 'R CMD check' WARNINGs were disabled again
in R-devel on July 8, 2020:

r78793: 
https://github.com/wch/r-source/commit/ccd903e47ab42e1c181396256aea56454a2e70c9
r78794: 
https://github.com/wch/r-source/commit/b70f90ae11dd516c1c954ed15eb5a7c2a304b37f

This is because there's a plan to add support for what we're all
asking for in one way or the other. That's all I know.

/Henrik

On Mon, Aug 10, 2020 at 8:51 AM John Mount  wrote:
>
> To all: I'd like to apologize for my negative tone and imprecision in my 
> points. Thanks for the discussion and the effort you put into these systems.
>
> > On Aug 9, 2020, at 12:35 PM, Duncan Murdoch  
> > wrote:
> >
> > On 09/08/2020 3:15 p.m., Ben Bolker wrote:
> >> On 8/9/20 3:08 PM, Duncan Murdoch wrote:
> >>> On 09/08/2020 2:59 p.m., John Mount wrote:
>  Firstly: thanks to Ben for the help/fix.
> 
>  I know nobody asked, but.
> 
>  Having to guess where the documentation is just to refer to it is
>  just going to be really brittle going forward. Previous: if the
>  function you referred to existed in the package you were fine.
> >>>
> >>> That's not correct.  The system could often work around the error, but
> >>> not always.
> >>I may be missing something. It may well be that referring to a
> >> cross-package link by alias rather than by the name of the Rd page
> >> actually never worked, but: would there be a big barrier to making
> >> cross-package documentation links be able to follow aliases? I can
> >> imagine there may be technical hurdles but it seems like a well-defined
> >> problem ...
> >
> > To link to "?utils::dump.frames", you need to construct a URL that leads to 
> > the HTML file containing that help page on static builds of the help system.
> >
> > If utils is available, no problem, just look up the fact that the page you 
> > want is debugger.html at the time you construct the link.  But it was 
> > documented that such links should work even if the target package was not 
> > available at the time the link was being made.  So you need a link that you 
> > are sure will be available when the referenced package is eventually 
> > installed.  Obviously that's going to be brittle.
> >
> > Perhaps the new requirement that referenced packages be mentioned in the 
> > DESCRIPTION file is a step towards addressing this.  If everything that's 
> > referenced is present when the help system is being built, there will be an 
> > easy solution.
> >
> > Nowadays, it would probably be reasonable to put in stub pages for every 
> > alias, so that when you try to load dump.frames.html, the page itself 
> > redirects you to debugger.html.  Or maybe you could just have a single page 
> > in each package that handles aliases via Javascript.
> >
> > Or R could just no longer support static copies of the help system.
> >
> > When you are using dynamically generated help pages, the link can be 
> > resolved by the server, and that's why those links have appeared to work 
> > for so long, even though the requirement to link to the filename instead of 
> > the alias has been there since before I wrote the dynamic help server.
> >
> > Duncan Murdoch
> >
> >>>
> >>> Duncan Murdoch
> >>>
> >>>
> >>>  Future: if don't correctly specify where the help is you are wrong.
> >>> Going forward: reorganizing a package's help can break referring
> >>> packages. This sort of brittleness is going to be a big time-waster
> >>> going forward. It seems like really the wrong direction in packaging,
> >>> isolation, and separation of concerns (SOLID style principles).
> 
> > On Aug 9, 2020, at 11:04 AM, Ben Bolker  wrote:
> >
> > This might have to be \link[utils:debugger]{dump.frames} now, i.e.
> > explicitly linking to the man page on which dump.frames is found
> > rather than following aliases?
> >
> > On Sun, Aug 9, 2020 at 2:01 PM John Mount 
> > wrote:
> >>
> >> With "R Under development (unstable) (2020-07-05 r78784)" (Windows)
> >> documentation references such as "\link[utils]{dump.frames}"
> >> trigger "Non-file package-anchored link(s) in documentation object"
> >> warnings even if the package is in your "Imports."
> >>
> >> Is that not the right form? Is there any way to fix this other than
> >> the workaround of just removing links from the documentation? I
> >> kind of don't want to do that as the links were there to help the
> >> user.
> >>
> >> __
> >> R-package-devel@r-project.org mailing list
> >> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
>  __
>  R-package-devel@r-project.org mailing list
>  https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> >>>
> >
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

Re: [Bioc-devel] Possible problems with BiocParallel and R cmd check on Github

2020-07-30 Thread Henrik Bengtsson
(I assume you just forgot to 'Reply All' so I've added bioc-devel back
to the cc:)

> Unfortunately what I'm mentioning only happens in Github actions, which
> are the standard ones (we used usethis::use_github_action_check_standard),

I'd say that that 'usethis' setup did not anticipate package tests
that run parallel code.  So, don't assume it's perfect and that it
covers all use cases.

> we haven't modified the action ...

That's actually my suggestion - did you try my one-line addition?  I'm
95% certain it will solve your check errors on Windows as it did for
me.

> ... It also appears same actions performed on MacOS have
> some kind of problems since they stop even before checking the package.

Yes, there seem to be some hiccups.  I also get those since a few
days back.  These are out of our control and we just have to wait for
them to be resolved upstream/elsewhere.

/Henrik

On Thu, Jul 30, 2020 at 2:05 AM Giulia Pais  wrote:
>
> Thanks for the reply,
>
> Unfortunately what I'm mentioning only happens in Github actions, which
> are the standard ones (we used
> usethis::use_github_action_check_standard), we haven't modified the
> action, errors I'm mentioning do not happen in local R environments on
> windows machines. It also appears same actions performed on MacOS have
> some kind of problems since they stop even before checking the package.
>
> Is the result of these GitHub action reports relevant for a Bioconductor
> reviewer? We would like to submit the package for moderation soon
> but we're unsure if we can due to this problem. Thank you.
>
> Il 29/07/2020 19:25, Henrik Bengtsson ha scritto:
> >  From a very quick look at this, I think you also need to explicitly
> > install the package itself for it to be available in external R
> > sessions (contrary to when using forked processing as on Linux and
> > macOS).  Something like this:
> >
> > - name: Install dependencies
> >  run: |
> >remotes::install_deps(dependencies = TRUE)
> >remotes::install_cran("rcmdcheck")
> >install.packages(".", repos=NULL, type="source") ## needed by
> > parallel workers
> >shell: Rscript {0}
> >
> > That's what I had to do when testing 'future'
> > (https://github.com/HenrikBengtsson/future/blob/1835a122764bbc0196c78be6da6973c8063922b3/.github/workflows/R-CMD-check.yaml#L69).
> >
> > /Henrik
> >
> > On Wed, Jul 29, 2020 at 10:18 AM Giulia Pais  wrote:
> >> Hi bioconductor team,
> >>
> >> we are currently developing a package for future submission on
> >> bioconductor which you can find here
> >> https://github.com/calabrialab/ISAnalytics.
> >>
> >> We use Github actions to keep track of R cmd checks at every commit, and
> >> this time, surprisingly for us, we had a failure on R cmd checks for
> >> windows (which is odd, since we develop on windows and performing check
> >> on 2 different windows machines didn't raise any error or warning) and
> >> we suspect this could be tied to the use of BiocParallel. For Windows,
> >> in fact, we use SnowParam instead of MulticoreParam and as the vignette
> >> and manual of BiocParallel specifies we must ensure that every worker is
> >> loaded with the proper dependencies, but apparently this is not
> >> necessary if the function to execute in bplapply belongs to the package
> >> in question. Here is the code of the function that might raise some
> >> problems:
> >>
> >> .import_type <- function(q_type, files, workers) {
> >> files <- files %>% dplyr::filter(.data$Quantification_type == q_type)
> >> # Register backend according to platform
> >> if (.Platform$OS.type == "windows") {
> >>   p <- BiocParallel::SnowParam(workers = workers, stop.on.error = 
> >> FALSE)
> >> } else {
> >>   p <- BiocParallel::MulticoreParam(workers = workers, stop.on.error
> >> = FALSE)
> >> }
> >> # Import every file
> >> suppressMessages(suppressWarnings({
> >>   matrices <- BiocParallel::bptry(
> >> BiocParallel::bplapply(files$Files_chosen, FUN = function(x) {
> >>   matrix <- ISAnalytics::import_single_Vispa2Matrix(x)
> >> }, BPPARAM = p)
> >>   )
> >>   }))
> >> BiocParallel::bpstop(p)
> >> correct <- BiocParallel::bpok(matrices)
> >> imported_files <- files %>% dplyr::mutate(Imported = correct)
> >> matrices <- matrices[correct]
> >> # Bind rows in single tibble

Re: [Bioc-devel] Possible problems with BiocParallel and R cmd check on Github

2020-07-29 Thread Henrik Bengtsson
From a very quick look at this, I think you also need to explicitly
install the package itself for it to be available in external R
sessions (contrary to when using forked processing as on Linux and
macOS).  Something like this:

- name: Install dependencies
run: |
  remotes::install_deps(dependencies = TRUE)
  remotes::install_cran("rcmdcheck")
  install.packages(".", repos=NULL, type="source") ## needed by
parallel workers
  shell: Rscript {0}

That's what I had to do when testing 'future'
(https://github.com/HenrikBengtsson/future/blob/1835a122764bbc0196c78be6da6973c8063922b3/.github/workflows/R-CMD-check.yaml#L69).

/Henrik

On Wed, Jul 29, 2020 at 10:18 AM Giulia Pais  wrote:
>
> Hi bioconductor team,
>
> we are currently developing a package for future submission on
> bioconductor which you can find here
> https://github.com/calabrialab/ISAnalytics.
>
> We use Github actions to keep track of R cmd checks at every commit, and
> this time, surprisingly for us, we had a failure on R cmd checks for
> windows (which is odd, since we develop on windows and performing check
> on 2 different windows machines didn't raise any error or warning) and
> we suspect this could be tied to the use of BiocParallel. For Windows,
> in fact, we use SnowParam instead of MulticoreParam and as the vignette
> and manual of BiocParallel specifies we must ensure that every worker is
> loaded with the proper dependencies, but apparently this is not
> necessary if the function to execute in bplapply belongs to the package
> in question. Here is the code of the function that might raise some
> problems:
>
> .import_type <- function(q_type, files, workers) {
>files <- files %>% dplyr::filter(.data$Quantification_type == q_type)
># Register backend according to platform
>if (.Platform$OS.type == "windows") {
>  p <- BiocParallel::SnowParam(workers = workers, stop.on.error = FALSE)
>} else {
>  p <- BiocParallel::MulticoreParam(workers = workers, stop.on.error
> = FALSE)
>}
># Import every file
>suppressMessages(suppressWarnings({
>  matrices <- BiocParallel::bptry(
>BiocParallel::bplapply(files$Files_chosen, FUN = function(x) {
>  matrix <- ISAnalytics::import_single_Vispa2Matrix(x)
>}, BPPARAM = p)
>  )
>  }))
>BiocParallel::bpstop(p)
>correct <- BiocParallel::bpok(matrices)
>imported_files <- files %>% dplyr::mutate(Imported = correct)
>matrices <- matrices[correct]
># Bind rows in single tibble for all files
>if (purrr::is_empty(matrices)) {
>  return(NULL)
>}
>matrices <- purrr::reduce(matrices, function(x, y) {
>  x %>% dplyr::bind_rows(y) %>% dplyr::distinct()
>})
>list(matrices, imported_files)
> }
>
> The report of the Github action can be found here:
> https://github.com/calabrialab/ISAnalytics/runs/923261561
>
> The check apparently fails with these warnings: Warning - namespace
> 'ISAnalytics' is not available and has been replaced. We tried adding
> 'library(ISAnalytics)' and 'require(ISAnalytics)' but if we do that
> BiocCheck fails with a warning, prompting for removal of these lines.
> Could this be a real issue with our package or just a problem with
> Github actions?
>
> Thanks in advance,
>
> Giulia Pais
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Rd] Restrict package to load-only access - prevent attempts to attach it

2020-07-17 Thread Henrik Bengtsson
Thanks. Though, AFAIU, that addresses another use case/need.

I want reverse package dependencies to be able to import functions
from my package using standard R namespace mechanisms, e.g. import()
and importFrom().  The only thing I want to prevent is relying on it
being *attached* to the search() path and access functions that way.
So, basically, all usage should be via import(), importFrom()
NAMESPACE statements or pkg::fcn() calls.  All for the purpose of
avoiding the package being used outside of other packages.

I've got a few suggestions offline in addition to the above comments
including allowing the package to be attached but having .onAttach()
wipe the attached environment so it effectively adds zero objects to
the search() path.  This is a non-critical feature for me but
nevertheless an interesting one.
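
For reference, a minimal sketch of that last suggestion (untested; it
assumes .onAttach() runs before the attached "package:<pkgname>"
environment is locked, which is how library() currently behaves):

```r
.onAttach <- function(libname, pkgname) {
  ## Empty the attached "package:<pkgname>" environment so that
  ## attaching the package adds nothing usable to the search() path.
  env <- as.environment(paste0("package:", pkgname))
  rm(list = ls(envir = env, all.names = TRUE), envir = env)
}
```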

/Henrik

On Fri, Jul 17, 2020 at 1:01 PM Iñaki Ucar  wrote:
>
> Hi Henrik,
>
> A bit late, but you can take a look at smbache's {import} package [1]
> in case you didn't know it. I believe it does what you are describing.
>
> [1] https://github.com/smbache/import
>
> Iñaki
>
> On Tue, 23 Jun 2020 at 22:21, Henrik Bengtsson
>  wrote:
> >
> > Hi,
> >
> > I'm developing a package whose API is only meant to be used in other
> > packages via imports or pkg::foo().  There should be no need to attach
> > this package so that its API appears on the search() path. As a
> > maintainer, I want to avoid having it appear in search() conflicts by
> > mistake.
> >
> > This means that, for instance, other packages should declare this
> > package under 'Imports' or 'Suggests' but never under 'Depends'.  I
> > can document this and hope that's how it's going to be used.  But, I'd
> > like to make it explicit that this API should be used via imports or
> > ::.  One approach I've considered is:
> >
> > .onAttach <- function(libname, pkgname) {
> >if (nzchar(Sys.getenv("R_CMD"))) return()
> >stop("Package ", sQuote(pkgname), " must not be attached")
> > }
> >
> > This would produce an error if the package is attached.  It's
> > conditioned on the environment variable 'R_CMD' set by R itself
> > whenever 'R CMD ...' runs.  This is done to avoid errors in 'R CMD
> > INSTALL' and 'R CMD check' "load tests", which formally are *attach*
> > tests.  The above approach passes all the tests and checks I'm aware
> > of and on all platforms.
> >
> > Before I ping the CRAN team explicitly, does anyone know whether this
> > is a valid approach?  Do you know if there are alternatives for
> > asserting that a package is never attached.  Maybe this is more
> > philosophical where the package "contract" is such that all packages
> > should be attachable and, if not, then it's not a valid R package.
> >
> > This is a non-critical topic but if it can be done it would be useful.
> >
> > Thanks,
> >
> > Henrik
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>
>
>
> --
> Iñaki Úcar

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Interpret feedback: not write testthat-tests in examples

2020-07-16 Thread Henrik Bengtsson
If the point of having, say,

stopifnot(add(1, 2) == sum(c(1, 2)))

is to make it explicit to the reader that your add() function gives
the same results as sum(), then I argue that is valid to use in an
example.  I'm pretty sure I've used that in some of my examples.  For
the purpose, there should be no reason why you can't use other
"assert" functions for this purpose, e.g.

testthat::expect_equal(add(1, 2), sum(c(1, 2)))

Now, if the point of your "assert" statement is only to validate your
package/code, then I agree it should not be in the example code
because it adds clutter.  Such validation should be in a package test.
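
If the latter, the same assertion could live in a regular test file
instead, e.g. (the file name and the add() function are hypothetical):

```r
## tests/testthat/test-add.R
test_that("add() agrees with sum()", {
  expect_equal(add(1, 2), sum(c(1, 2)))
})
```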

So, if the former, I suggest you reply to the CRAN Team and explain this.

/Henrik

On Thu, Jul 16, 2020 at 6:28 AM Richel Bilderbeek
 wrote:
>
> Dear R package developers,
>
> I would enjoy some help regarding some feedback I got on my package from a 
> CRAN volunteer, as I am unsure how to interpret this correctly.
>
> This is the feedback I got (I added '[do]'):
>
> > Please [do] not write testthat-tests in your examples.
>
> I wonder if this is about using `testthat` or using tests in general.
>
> To simplify the context, say I wrote a package with a function called `add`, 
> that adds two numbers. My example code would then be something like this:
>
> ```
> library(testthat)
>
> expect_equal(add(1, 2), 3)
> ```
>
> The first interpretation is about using `testthat`: maybe I should use base R 
> (`stopifnot`) or another testing library (`testit`) or hand-craft it myself?
>
> The second interpretation is about using tests in example code. I like to 
> actively demonstrate that my code works as expected. I checked the policies 
> regarding examples, and I could not find a rule that I should refrain from 
> doing so.
>
> What is the correct response to this feedback?
>
> Thanks for your guidance, Richel Bilderbeek
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How do you Rd reference parallel::mclapply() so it works also on Windows?

2020-07-08 Thread Henrik Bengtsson
On Wed, Jul 8, 2020 at 12:11 AM Martin Maechler
 wrote:
...
> Indeed, another example, why we should move away from the
> 'file' instead of 'topic' (<-> \alias{.}) paradigm and requirement.
>
> I'm optimistic we will do that soonish ...

On Wed, Jul 8, 2020 at 1:17 AM Kurt Hornik  wrote:
...
> But perhaps simply wait for a few more days ...

Oh, this is good news.

I see that these new warnings have already been disabled in R-devel.

Thxs,

Henrik


On Wed, Jul 8, 2020 at 1:17 AM Kurt Hornik  wrote:
>
> >>>>> Henrik Bengtsson writes:
>
> > Here's another "Non-file package-anchored link(s) ..." issue.  I'd
> > like to reference parallel::mclapply() in my help pages.  With the new
> > R-devel requirements, you have to link to the file and not the topic.
> > However, there is no cross-platform stable file reference for
> > parallel::mclapply();
>
> > # According to R on Linux
> >> basename(help("mclapply", package="parallel"))
> > [1] "mclapply"
>
> > # According to R on Windows
> >> basename(help("mclapply", package="parallel"))
> > [1] "mcdummies"
>
> > How can I provide a \link{} reference to the help("mclapply",
> > package="parallel") documentation?
>
> See R-exts, either "Platform-specific documentation" or "Dynamic pages".
>
> But perhaps simply wait for a few more days ...
>
> Best
> -k
>
>
> > Thanks,
>
> > Henrik

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] How do you Rd reference parallel::mclapply() so it works also on Windows?

2020-07-07 Thread Henrik Bengtsson
Here's another "Non-file package-anchored link(s) ..." issue.  I'd
like to reference parallel::mclapply() in my help pages.  With the new
R-devel requirements, you have to link to the file and not the topic.
However, there is no cross-platform stable file reference for
parallel::mclapply();

# According to R on Linux
> basename(help("mclapply", package="parallel"))
[1] "mclapply"

# According to R on Windows
> basename(help("mclapply", package="parallel"))
[1] "mcdummies"

How can I provide a \link{} reference to the help("mclapply",
package="parallel") documentation?

Thanks,

Henrik

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] Restrict package to load-only access - prevent attempts to attach it

2020-06-23 Thread Henrik Bengtsson
Hi,

I'm developing a package whose API is only meant to be used in other
packages via imports or pkg::foo().  There should be no need to attach
this package so that its API appears on the search() path. As a
maintainer, I want to avoid having it appear in search() conflicts by
mistake.

This means that, for instance, other packages should declare this
package under 'Imports' or 'Suggests' but never under 'Depends'.  I
can document this and hope that's how it's going to be used.  But, I'd
like to make it explicit that this API should be used via imports or
::.  One approach I've considered is:

.onAttach <- function(libname, pkgname) {
   if (nzchar(Sys.getenv("R_CMD"))) return()
   stop("Package ", sQuote(pkgname), " must not be attached")
}

This would produce an error if the package is attached.  It's
conditioned on the environment variable 'R_CMD' set by R itself
whenever 'R CMD ...' runs.  This is done to avoid errors in 'R CMD
INSTALL' and 'R CMD check' "load tests", which formally are *attach*
tests.  The above approach passes all the tests and checks I'm aware
of and on all platforms.

Before I ping the CRAN team explicitly, does anyone know whether this
is a valid approach?  Do you know if there are alternatives for
asserting that a package is never attached.  Maybe this is more
philosophical where the package "contract" is such that all packages
should be attachable and, if not, then it's not a valid R package.

This is a non-critical topic but if it can be done it would be useful.

Thanks,

Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] r-project.org SSL certificate issues

2020-06-09 Thread Henrik Bengtsson
Was this resolved upstream or is this something that R should/could
fix? If the latter, could this also go into the "emergency release" R
4.0.2 that is scheduled for 2020-06-22?

My $.02

/Henrik


On Sun, May 31, 2020 at 8:13 AM Gábor Csárdi  wrote:
>
> Btw. it would be also possible to create a macOS R installer that
> embeds a static or dynamic libcurl with Secure Transport, instead of
> the Apple default LibreSSL.
>
> This might be too late for R 4.0.1, I don't know.
>
> Gabor
>
> On Sun, May 31, 2020 at 4:09 PM Gábor Csárdi  wrote:
> >
> > On Sat, May 30, 2020 at 11:32 PM Gábor Csárdi  
> > wrote:
> > [...]
> > > Btw. why does this affect openssl? That root cert was published in
> > > 2010, surely openssl should know about it? Maybe libcurl / openssl
> > > only uses the chain provided by the server? Without trying to use an
> > > alternate chain?
> >
> > Yes, indeed it seems that old OpenSSL versions cannot handle
> > alternative certificate chains. This has been fixed in OpenSSL in
> > 2015, so modern Linux systems should be fine. However, macOS uses
> > LibreSSL, and LibreSSL never fixed this issue. E.g.
> > https://github.com/libressl-portable/portable/issues/595
> >
> > r-project.org can be updated to send the new root certificate, which
> > will solve most of our problems, but we'll probably have issues with
> > other web sites that'll update slower or never.
> >
> > FWIW I built macOS binaries for the curl package, using a static
> > libcurl and macOS Secure Transport, so these binaries does not have
> > this issue.
> >
> > They are at https://files.r-hub.io/curl-macos-static and they can be
> > installed with
> > install.packages("curl", repos =
> > "https://files.r-hub.io/curl-macos-static;, type = "binary")
> >
> > They support R 3.2 and up, including R 4.1, and should work on all
> > macOS versions that the given R release supports.
> >
> > Gabor
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Bioc-devel] How I hide non-exported function from the manual

2020-06-04 Thread Henrik Bengtsson
On Wed, Jun 3, 2020 at 10:40 PM Vincent Carey
 wrote:
>
> On Wed, Jun 3, 2020 at 11:48 PM stefano  wrote:
>
> > Hello Community,
> >
> > I am used to documenting functions even though they are not exported
> >
> >
> I suppose you are talking about tidybulk?  I am somewhat mystified by
> the behavior of
>
> %vjcair> R CMD build tidybulk
>
> * checking for file ‘tidybulk/DESCRIPTION’ ... OK
>
> * preparing ‘tidybulk’:
>
> * checking DESCRIPTION meta-information ... OK
>
> * installing the package to process help pages
>
> * building the PDF package manual
>
> Hmm ... looks like a package
>
> Converting Rd files to LaTeX .
>
> Creating pdf output from LaTeX ...
>
>
> which produces a 147 page manual, and as you note many
>
> non-user-visible symbols are documented in the manual.  I've
>
> never seen the "process help pages" phase in any
> of my packages, and I don't know why.

This step happens iff you have \Sexpr{} macros in your man/*.Rd files
(see 
https://github.com/wch/r-source/blob/5bd6e3ce1430374105ebf02101f9d55173496cfe/src/library/tools/R/build.R#L582-L586)

/Henrik

>
>
> I don't have experience with the RdMacros setting in
>
> DESCRIPTION, and the way S3 methods are being handled
>
> in the package leads, I think, to an excess of Rd
>
> files relative to what you have as visible symbols
>
> in the package namespace.
>
>
> Perhaps some tidyverse experts can comment.
>
>
> > ```
> > #' Get differential transcription information to a tibble using edgeR.
> > #'
> > #' @import dplyr
> > #' @import tidyr
> > #' @import tibble
> > #' @importFrom magrittr set_colnames
> > #' @importFrom stats model.matrix
> > #' @importFrom utils installed.packages
> > #' @importFrom utils install.packages
> > #' @importFrom purrr when
> > #'
> > #'
> > #' @param .data A tibble
> > #' @param .formula a formula with no response variable, referring only to
> > numeric variables
> > #' @param .sample The name of the sample column
> > #' @param .transcript The name of the transcript/gene column
> > #' @param .abundance The name of the transcript/gene abundance column
> > #' @param .contrasts A character vector. See edgeR makeContrasts
> > specification for the parameter `contrasts`. If contrasts are not present
> > the first covariate is the one the model is tested against (e.g., ~
> > factor_of_interest)
> > #' @param method A string character. Either "edgeR_quasi_likelihood" (i.e.,
> > QLF), "edgeR_likelihood_ratio" (i.e., LRT)
> > #' @param significance_threshold A real between 0 and 1
> > #' @param minimum_counts A positive integer. Minimum counts required for at
> > least some samples.
> > #' @param minimum_proportion A real positive number between 0 and 1. It is
> > the threshold of proportion of samples for each transcripts/genes that have
> > to be characterised by a cmp bigger than the threshold to be included for
> > scaling procedure.
> > #' @param fill_missing_values A boolean. Whether to fill missing
> > sample/transcript values with the median of the transcript. This is rarely
> > needed.
> > #' @param scaling_method A character string. The scaling method passed to
> > the backend function (i.e., edgeR::calcNormFactors;
> > "TMM","TMMwsp","RLE","upperquartile")
> > #' @param omit_contrast_in_colnames If just one contrast is specified you
> > can choose to omit the contrast label in the colnames.
> > #'
> > #' @return A tibble with edgeR results
> > #'
> > get_differential_transcript_abundance_bulk <- function
> > [...]
> > ```
> >
> > However this leads to 2 problems
> >
> > 1) The PDF manual includes many function that are not accessible to the
> > user. How can I hide documented non-exported function from the manual
> > 2) I receive the BiocCheck note. "You have  initialised objects".
> > Again how can I document an object without initialising it?
> >
> > Thanks a lot.
> >
> > Best wishes.
> >
> > *Stefano *
> >
> >
> >
> > Stefano Mangiola | Postdoctoral fellow
> >
> > Papenfuss Laboratory
> >
> > The Walter Eliza Hall Institute of Medical Research
> >
> > +61 (0)466452544
> >
> >
> > ___
> > Bioc-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/bioc-devel
> >
>
> --

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Bioc-devel] Reducing dependencies

2020-06-02 Thread Henrik Bengtsson
RStudio provides pre-built R packages for Linux and, since a few weeks
back, they can be used on GitHub Actions
(https://github.com/r-lib/actions).  In addition, the run-time limit
on GitHub Actions is several hours compared to the 50 minutes you've
got on Travis, so even if you install from source, you're less likely
to hit these limits on GitHub Actions.

Also, it could be that you could tweak/trick Travis into installing the
above Linux binary packages.

My $.02

/Henrik

On Tue, Jun 2, 2020 at 2:45 PM Koen Van den Berge
 wrote:
>
> Dear All,
>
> We have recently extended our Bioconductor package tradeSeq 
>  to allow 
> different input formats and accommodate extended downstream analyses, by 
> building on other R/Bioconductor packages.
> However this has resulted in a significant increase in the number of 
> dependencies due to relying on other packages that also have many 
> dependencies, for example causing very long build times on Travis 
> .
>
> We are therefore wondering about current recommendations to reduce the 
> dependency load. We have moved some larger packages from ‘Imports’ to 
> ‘Suggests’, but to no avail.
>
> Best,
> Koen
>
> ___
> Bioc-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/bioc-devel

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [Rd] Compatibility issues caused by new simplify argument in apply function

2020-05-22 Thread Henrik Bengtsson
Interesting problem.  I'm very rusty on S4 but would one solution be
to, already now, add 'simplify = TRUE' to the S4 method and document
it;

setMethod("apply", signature(X = "Speclib"),
  function(X,
   FUN,
   bySI = NULL,
   ...,
   simplify = TRUE) {

?

Henrik

On Fri, May 22, 2020 at 6:26 AM Duncan Murdoch  wrote:
>
> You didn't explained what the error is.  This is what it looks like to
> me, but I'm probably wrong in some details:
>
> 1. R-devel added an argument to the apply() function, so the header has
> changed from
>
>function (X, MARGIN, FUN, ...)
>
> to
>
>function(X, MARGIN, FUN, ..., simplify = TRUE)
>
> 2. Your package converted the function apply() to an S4 generic.
>
> 3. Now the signatures of your methods for this generic need to have the
> simplify argument, but if you do that, they won't work in previous
> versions of R.
>
> You'd like to have conditional code and documentation to depend on the
> version of R.
>
> Is that all correct?
>
> I don't think it's possible, for the reasons you found.  Certainly you
> can have conditional code, but the docs are going to fail.
>
> One thing that might work is in versions of R before this change, export
> your own version of apply, with the change in place, i.e.
>
> if(!("simplify" %in% names(formals(base::apply
>apply <- function(X, MARGIN, FUN, ..., simplify = TRUE) {
>  base::apply(X, MARGIN, FUN, ...)
>}
>
> and then conditionally export "apply" in these old versions.  Then your
> docs could match the new version everywhere.
>
> Another thing is to maintain two versions of your package, one for R
> versions before the change, another for versions after the change.  Add
> appropriate entries in the DESCRIPTION file, e.g.
>
> Depends:  R (> 4.0)
>
> Another is to argue with R Core that this change to a really old
> function is too hard to accommodate, and they should back it out, maybe
> by making a new function with the new signature.
>
> Or you could make a new function with the old signature, and use that
> instead of apply().
>
> Duncan Murdoch
>
>
>
> On 22/05/2020 6:26 a.m., Lukas Lehnert via R-devel wrote:
> > Dear R Developers,
> >
> > the new  simplify argument in apply causes that my package (hsdar) does not
> > pass the
> > checks in R-devel.
> >
> > The workaround, Kurt Hornik send me, is working for the R-code:
> > if("simplify" %in% names(formals(base::apply)))
> >   do something
> > else
> >   do something else
> >
> > Unfortunately, I cannot conditionalize the man pages of the functions. I get
> > the message
> > that "applySpeclib.Rd:12-14: Section \Sexpr is unrecognized and will be
> > dropped" if I try to
> > dynamically define the entire usage section. If I try to use \Sexpr inside 
> > the
> > \usage section,
> > I get the following warning: "applySpeclib.Rd:13-15: Tag \Sexpr is invalid 
> > in
> > a \usage block"
> >
> > Does anybody have an idea how to proceed. The full code is available below.
> >
> > Thanks
> >
> > Lukas
> >
> >
> > *1. Code for full usage section:*
> > ..
> > \description{
> > Apply function over all spectra or a subset of spectra in a \code{Speclib}.
> > }
> >
> > \Sexpr[echo=TRUE,results=rd,stage=install]{
> >hsdar:::.applyInHelp1("Speclib", usage = TRUE)
> > }
> >
> > \arguments{
> > ..
> >
> > *Function .applyInHelp1*
> > .applyInHelp1 <- function(fun_name, usage)
> > {
> >if (usage)
> >{
> >  if ("simplify" %in% names(formals(base::apply)))
> >  {
> >return(paste0("\\usage{\n",
> >  "\\S4method{apply}{", fun_name, "}(X, MARGIN, FUN, ...,
> > simplify = TRUE)\n",
> >  "}"))
> >  } else {
> >return(paste0("\\usage{\n",
> >  "\\S4method{apply}{", fun_name, "}(X, MARGIN, FUN, ...)
> > \n",
> >  "}"))
> >  }
> >} else {
> >  if ("simplify" %in% names(formals(base::apply)))
> >  {
> >return("}\n\\item{simplify}{Currently ignored")
> >  } else {
> >return("")
> >  }
> >}
> > }
> >
> >
> > *2. Using \Sexpr inside the \usage block*
> > \usage{
> > \S4method{apply}{Speclib}(X, FUN, bySI = NULL, ...
> > \Sexpr[echo=TRUE,results=rd,stage=install]{
> >hsdar:::.applyInHelp2(usage = TRUE)
> > }
> > )
> > }
> >
> >
> > *Function .applyInHelp2*
> > .applyInHelp2 <- function(usage)
> > {
> >if (usage)
> >{
> >  if ("simplify" %in% names(formals(base::apply)))
> >  {
> >return(", simplify = TRUE)")
> >  }
> >} else {
> >  if ("simplify" %in% names(formals(base::apply)))
> >  {
> >return("}\n\\item{simplify}{Currently ignored")
> >  } else {
> >return("")
> >  }
> >}
> > }
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>

Re: [Rd] Precision of function mean,bug?

2020-05-20 Thread Henrik Bengtsson
On Wed, May 20, 2020 at 11:10 AM brodie gaslam via R-devel
 wrote:
>
>  > On Wednesday, May 20, 2020, 7:00:09 AM EDT, peter dalgaard 
>  wrote:
> >
> > Expected, see FAQ 7.31.
> >
> > You just can't trust == on FP operations. Notice also
>
> Additionally, since you're implementing a "mean" function you are testing
> against R's mean, you might want to consider that R uses a two-pass
> calculation[1] to reduce floating point precision error.

This one is important.

FWIW, matrixStats::mean2() provides argument refine=TRUE/FALSE to
calculate mean with and without this two-pass calculation;

> a <- c(x[idx],y[idx],z[idx]) / 3
> b <- mean(c(x[idx],y[idx],z[idx]))
> b == a
[1] FALSE
> b - a
[1] 2.220446e-16

> c <- matrixStats::mean2(c(x[idx],y[idx],z[idx]))  ## default to refine=TRUE
> b == c
[1] TRUE
> b - c
[1] 0

> d <- matrixStats::mean2(c(x[idx],y[idx],z[idx]), refine=FALSE)
> a == d
[1] TRUE
> a - d
[1] 0
> c == d
[1] FALSE
> c - d
[1] 2.220446e-16

Not surprisingly, the two-pass higher-precision version (refine=TRUE)
takes roughly twice as long as the one-pass quick version
(refine=FALSE).
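
For reference, the two-pass refinement that brodie points to can be sketched in plain R (a simplification of the C code in summary.c, not the actual implementation):

```r
mean_1pass <- function(x) sum(x) / length(x)   # naive single pass

mean_2pass <- function(x) {
  m <- sum(x) / length(x)       # first pass: provisional mean
  m + sum(x - m) / length(x)    # second pass: correct by the mean residual
}
```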

/Henrik

>
> Best,
>
> Brodie.
>
> [1] https://github.com/wch/r-source/blob/tags/R-4-0-0/src/main/summary.c#L482
>
> > > a2=(z[idx]+x[idx]+y[idx])/3
> > > a2==a
> > [1] FALSE
> > > a2==b
> > [1] TRUE
> >
> > -pd
> >
> > > On 20 May 2020, at 12:40 , Morgan Morgan  
> > > wrote:
> > >
> > > Hello R-dev,
> > >
> > > Yesterday, while I was testing the newly implemented function pmean in
> > > package kit, I noticed a mismatch in the output of the below R 
> > > expressions.
> > >
> > > set.seed(123)
> > > n=1e3L
> > > idx=5
> > > x=rnorm(n)
> > > y=rnorm(n)
> > > z=rnorm(n)
> > > a=(x[idx]+y[idx]+z[idx])/3
> > > b=mean(c(x[idx],y[idx],z[idx]))
> > > a==b
> > > # [1] FALSE
> > >
> > > For idx= 1, 2, 3, 4 the last line is equal to TRUE. For 5, 6 and many
> > > others the difference is small but still.
> > > Is that expected or is it a bug?
>


Re: [Rd] defining r audio connections

2020-05-06 Thread Henrik Bengtsson
What's the gist of the problem of making/having this part of the public
API? Is it security, is it stability, is it that the current API is under
construction, is it a worry about maintenance load for R Core, ...? Do we
know why?

It sounds like it's a feature that is useful. I think we missed out on
some great enhancements in the past because of it not being part of the
public API.

/Henrik

On Wed, May 6, 2020, 16:26 Martin Morgan  wrote:

> yep, you're right, after some initial clean-up and running with or without
> --as-cran R CMD check gives a NOTE
>
>   *  checking compiled code
>   File ‘socketeer/libs/socketeer.so’:
> Found non-API calls to R: ‘R_GetConnection’,
>‘R_new_custom_connection’
>
>   Compiled code should not call non-API entry points in R.
>
>   See 'Writing portable packages' in the 'Writing R Extensions' manual.
>
> Connections in general seem more useful than ad-hoc functions, though
> perhaps for Frederick's use case Duncan's suggestion is sufficient. For
> non-CRAN packages I personally would implement a connection.
>
> (I mistakenly thought this was a more specialized mailing list; I wouldn't
> have posted to R-devel on this topic otherwise)
>
> Martin Morgan
>
> On 5/6/20, 4:12 PM, "Gábor Csárdi"  wrote:
>
> AFAIK that API is not allowed on CRAN. It triggers a NOTE or a
> WARNING, and your package will not be published.
>
> Gabor
>
> On Wed, May 6, 2020 at 9:04 PM Martin Morgan 
> wrote:
> >
> > The public connection API is defined in
> >
> >
> https://github.com/wch/r-source/blob/trunk/src/include/R_ext/Connections.h
> >
> > I'm not sure of a good pedagogic example; people who want to write
> their own connections usually want to do so for complicated reasons!
> >
> > This is my own abandoned attempt
> https://github.com/mtmorgan/socketeer/blob/b0a1448191fe5f79a3f09d1f939e1e235a22cf11/src/connection.c#L169-L192
> where connection_local_client() is called from R and _connection_local()
> creates and populates the appropriate structure. Probably I have done
> things totally wrong (e.g., by not checking the version of the API, as
> advised in the header file!)
> >
> > Martin Morgan
> >
> > On 5/6/20, 2:26 PM, "R-devel on behalf of Duncan Murdoch" <
> r-devel-boun...@r-project.org on behalf of murdoch.dun...@gmail.com>
> wrote:
> >
> > On 06/05/2020 1:09 p.m., frede...@ofb.net wrote:
> > > Dear R Devel,
> > >
> > > Since Linux moved away from using a file-system interface for
> audio, I think it is necessary to write special libraries to interface with
> audio hardware from various languages on Linux.
> > >
> > > In R, it seems like the appropriate datatype for a `snd_pcm_t`
> handle pointing to an open ALSA source or sink would be a "connection".
> Connection types are already defined in R for "file", "url", "pipe",
> "fifo", "socketConnection", etc.
> > >
> > > Is there a tutorial or an example package where a new type of
> connection is defined, so that I can see how to do this properly in a
> package?
> > >
> > > I can see from the R source that, for example, `do_gzfile` is
> defined in `connections.c` and referenced in `names.c`. However, I thought
> I should ask here first in case there is a better place to start, than
> trying to copy this code.
> > >
> > > I only want an object that I can use `readBin` and `writeBin`
> on, to read and write audio data using e.g. `snd_pcm_writei` which is part
> of the `alsa-lib` package.
> >
> > I don't think R supports user-defined connections, but probably
> writing
> > readBin and writeBin equivalents specific to your library
> wouldn't be
> > any harder than creating a connection.  For those, you will
> probably
> > want to work with an "external pointer" (see Writing R
> Extensions).
> > Rcpp probably has support for these if you're working in C++.
> >
> > Duncan Murdoch
> >
>



Re: [Rd] mclapply returns NULLs on MacOS when running GAM

2020-04-28 Thread Henrik Bengtsson
> > we're not talking about
> > that as that is always the first option. Multicore works well in cases 
> > where there is no easy native solution and you need to share a lot of data 
> > for small results. If the data is small, or you need to read it first, then 
> > other methods like PSOCK may be preferable. In any case, parallelization 
> > only makes sense for code that you know will take a long time to run.
> >
> > Cheers,
> > Simon
> >
> >
> >> On 29/04/2020, at 11:54 AM, Shian Su  wrote:
> >>
> >> Thanks Henrik,
> >>
> >> That clears things up significantly. I did see the warning but failed to 
> >> include it my initial email. It sounds like an RStudio issue, and it seems 
> >> like that it’s quite intrinsic to how forks interact with RStudio. Given 
> >> this code is eventually going to be a part of a package, should I expect 
> >> it to fail mysteriously in RStudio for my users? Is the best solution here 
> >> to migrate all my parallelism to PSOCK for the foreseeable future?
> >>
> >> Thanks,
> >> Shian
> >>
> >>> On 29 Apr 2020, at 2:08 am, Henrik Bengtsson  
> >>> wrote:
> >>>
> >>> Hi, a few comments below.
> >>>
> >>> First, from my experience and troubleshooting similar reports from
> >>> others, a returned NULL from parallel::mclapply() is often because the
> >>> corresponding child process crashed/died. However, when this happens
> >>> you should see a warning, e.g.
> >>>
> >>>> y <- parallel::mclapply(1:2, FUN = function(x) if (x == 2) quit("no") 
> >>>> else x)
> >>> Warning message:
> >>> In parallel::mclapply(1:2, FUN = function(x) if (x == 2) quit("no") else 
> >>> x) :
> >>> scheduled core 2 did not deliver a result, all values of the job
> >>> will be affected
> >>>> str(y)
> >>> List of 2
> >>> $ : int 1
> >>> $ : NULL
> >>>
> >>> This warning is produced on R 4.0.0 and R 3.6.2 on Linux, but I would
> >>> assume that warning is also produced on macOS.  It's not clear from
> >>> your message whether you also got that warning or not.
> >>>
> >>> Second, forked processing, as used by parallel::mclapply(), is advised
> >>> against when using the RStudio Console [0].  Unfortunately, there's no
> >>> way to disable forked processing in R [1].  You could add the
> >>> following to your ~/.Rprofile startup file:
> >>>
> >>> ## Warn when forked processing is used in the RStudio Console
> >>> if (Sys.getenv("RSTUDIO") == "1" && !nzchar(Sys.getenv("RSTUDIO_TERM"))) {
> >>>   invisible(trace(parallel:::mcfork, tracer = quote(warning(
> >>>     "parallel::mcfork() was used. Note that forked processes, e.g. parallel::mclapply(), may be unstable when used from the RStudio Console [https://github.com/rstudio/rstudio/issues/2597#issuecomment-482187011]",
> >>>     call. = FALSE))))
> >>> }
> >>>
> >>> to detect when forked processing is used in the RStudio Console -
> >>> either by you or by some package code that you use directly or
> >>> indirectly.  You could even use stop() here if you wanna be
> >>> conservative.
> >>>
> >>> [0] https://github.com/rstudio/rstudio/issues/2597#issuecomment-482187011
> >>> [1] https://stat.ethz.ch/pipermail/r-devel/2020-January/078896.html
> >>>
> >>> /Henrik
> >>>
> >>> On Tue, Apr 28, 2020 at 2:39 AM Shian Su  wrote:
> >>>>
> >>>> Yes I am running on Rstudio 1.2.5033. I was also running this code 
> >>>> without error on Ubuntu in Rstudio. Checking again on the terminal and 
> >>>> it does indeed work fine even with large data.frames.
> >>>>
> >>>> Any idea as to what interaction between Rstudio and mclapply causes this?
> >>>>
> >>>> Thanks,
> >>>> Shian
> >>>>
> >>>> On 28 Apr 2020, at 7:29 pm, Simon Urbanek 
> >>>> mailto:simon.urba...@r-project.org>> wrote:
> >>>>
> >>>> Sorry, the code works perfectly fine for me in R even for 1e6 
> >>>> observations (but I was testing with R 4.0.0). Are you using some kind 
> >>>> of GUI?
> >>>>

Re: [Rd] mclapply returns NULLs on MacOS when running GAM

2020-04-28 Thread Henrik Bengtsson
Hi, a few comments below.

First, from my experience and troubleshooting similar reports from
others, a returned NULL from parallel::mclapply() is often because the
corresponding child process crashed/died. However, when this happens
you should see a warning, e.g.

> y <- parallel::mclapply(1:2, FUN = function(x) if (x == 2) quit("no") else x)
Warning message:
In parallel::mclapply(1:2, FUN = function(x) if (x == 2) quit("no") else x) :
  scheduled core 2 did not deliver a result, all values of the job
will be affected
> str(y)
List of 2
 $ : int 1
 $ : NULL

This warning is produced on R 4.0.0 and R 3.6.2 on Linux, but I would
assume that warning is also produced on macOS.  It's not clear from
your message whether you also got that warning or not.

Second, forked processing, as used by parallel::mclapply(), is advised
against when using the RStudio Console [0].  Unfortunately, there's no
way to disable forked processing in R [1].  You could add the
following to your ~/.Rprofile startup file:

## Warn when forked processing is used in the RStudio Console
if (Sys.getenv("RSTUDIO") == "1" && !nzchar(Sys.getenv("RSTUDIO_TERM"))) {
  invisible(trace(parallel:::mcfork, tracer = quote(warning(
    "parallel::mcfork() was used. Note that forked processes, e.g. parallel::mclapply(), may be unstable when used from the RStudio Console [https://github.com/rstudio/rstudio/issues/2597#issuecomment-482187011]",
    call. = FALSE))))
}

to detect when forked processing is used in the RStudio Console -
either by you or by some package code that you use directly or
indirectly.  You could even use stop() here if you wanna be
conservative.

[0] https://github.com/rstudio/rstudio/issues/2597#issuecomment-482187011
[1] https://stat.ethz.ch/pipermail/r-devel/2020-January/078896.html

/Henrik
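
As a hedged illustration of the PSOCK route mentioned elsewhere in this thread (no forking, so it sidesteps the RStudio Console issue; assumes mgcv is installed):

```r
library(parallel)

df <- data.frame(x = 1:1000, y = 1:1000)

cl <- makePSOCKcluster(2L)                   # spawn workers, no fork()
fits <- parLapply(cl, 1:2, function(i, df) {
  mgcv::gam(y ~ s(x, bs = "cs"), data = df)  # runs on the worker
}, df = df)
stopCluster(cl)
```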

On Tue, Apr 28, 2020 at 2:39 AM Shian Su  wrote:
>
> Yes I am running on Rstudio 1.2.5033. I was also running this code without 
> error on Ubuntu in Rstudio. Checking again on the terminal and it does indeed 
> work fine even with large data.frames.
>
> Any idea as to what interaction between Rstudio and mclapply causes this?
>
> Thanks,
> Shian
>
> On 28 Apr 2020, at 7:29 pm, Simon Urbanek 
> mailto:simon.urba...@r-project.org>> wrote:
>
> Sorry, the code works perfectly fine for me in R even for 1e6 observations 
> (but I was testing with R 4.0.0). Are you using some kind of GUI?
>
> Cheers,
> Simon
>
>
> On 28/04/2020, at 8:11 PM, Shian Su 
> mailto:s...@wehi.edu.au>> wrote:
>
> Dear R-devel,
>
> I am experiencing issues with running GAM models using mclapply, it fails to 
> return any values if the data input becomes large. For example here the code 
> runs fine with a df of 100 rows, but fails at 1000.
>
> library(mgcv)
> library(parallel)
>
> df <- data.frame(
> + x = 1:100,
> + y = 1:100
> + )
>
> mclapply(1:2, function(i, df) {
> + fit <- gam(y ~ s(x, bs = "cs"), data = df)
> + },
> + df = df,
> + mc.cores = 2L
> + )
> [[1]]
>
> Family: gaussian
> Link function: identity
>
> Formula:
> y ~ s(x, bs = "cs")
>
> Estimated degrees of freedom:
> 9  total = 10
>
> GCV score: 0
>
> [[2]]
>
> Family: gaussian
> Link function: identity
>
> Formula:
> y ~ s(x, bs = "cs")
>
> Estimated degrees of freedom:
> 9  total = 10
>
> GCV score: 0
>
>
>
> df <- data.frame(
> + x = 1:1000,
> + y = 1:1000
> + )
>
> mclapply(1:2, function(i, df) {
> + fit <- gam(y ~ s(x, bs = "cs"), data = df)
> + },
> + df = df,
> + mc.cores = 2L
> + )
> [[1]]
> NULL
>
> [[2]]
> NULL
>
> There is no error message returned, and the code runs perfectly fine in 
> lapply.
>
> I am on a MacBook 15 (2016) running MacOS 10.14.6 (Mojave) and R version 
> 3.6.2. This bug could not be reproduced on my Ubuntu 19.10 running R 3.6.1.
>
> Kind regards,
> Shian Su
> 
> Shian Su
> PhD Student, Ritchie Lab 6W, Epigenetics and Development
> Walter & Eliza Hall Institute of Medical Research
> 1G Royal Parade, Parkville VIC 3052, Australia
>
>


Re: [Rd] suggestion: "." in [lsv]apply()

2020-04-16 Thread Henrik Bengtsson
I'm sure this exists elsewhere, but, as a trade-off, could you achieve
what you want with a separate helper function F(expr) that constructs
the function you want to pass to [lsv]apply()?  Something that would
allow you to write:

sapply(split(mtcars, mtcars$cyl), F(summary(lm(mpg ~ wt,.))$r.squared))

Such an F() function would apply elsewhere too.
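
One hypothetical way to write such an F() (the name and semantics are illustrative, not from an existing package): capture the expression and return a one-argument function of `.`, evaluated with the caller's environment as enclosure:

```r
F <- function(expr) {
  e <- substitute(expr)
  env <- parent.frame()
  function(.) eval(e, list(. = .), env)
}

sapply(split(mtcars, mtcars$cyl),
       F(summary(lm(mpg ~ wt, .))$r.squared))
##         4         6         8
## 0.5086326 0.4645102 0.4229655
```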

/Henrik

On Thu, Apr 16, 2020 at 9:30 AM Michael Mahoney
 wrote:
>
> This syntax is already implemented in the {purrr} package, more or
> less -- you need to add a tilde before your function call for it to
> work exactly as written:
>
> purrr::map_dbl(split(mtcars, mtcars$cyl), ~ summary(lm(wt ~ mpg, 
> .))$r.squared)
>
> is equivalent to
>
> sapply(split(mtcars, mtcars$cyl), function(d) summary(lm(mpg ~ wt,
> d))$r.squared)
>
> Seems like using this package is probably an easier solution for this
> wish than adding a reserved variable and adding additional syntax to
> the apply family as a whole.
>
> Thanks,
>
> -Mike
>
> > From: Sokol Serguei 
> > Date: Thu, Apr 16, 2020 at 12:03 PM
> > Subject: Re: [Rd] suggestion: "." in [lsv]apply()
> > To: William Dunlap 
> > Cc: r-devel 
> >
> >
> > Thanks Bill,
> >
> > Clearly, my first proposition for wsapply() is quick and dirty one.
> > However, if "." becomes a reserved variable with this new syntax,
> > wsapply() can be fixed (at least for your example and alike) as:
> >
> > wsapply=function(l, fun, ...) {
> >  .=substitute(fun)
> >  if (is.name(.) || is.call(.) && .[[1]]==as.name("function")) {
> >  sapply(l, fun, ...)
> >  } else {
> >  sapply(l, function(d) eval(., list(.=d)), ...)
> >  }
> > }
> >
> > Will it do the job?
> >
> > Best,
> > Serguei.
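
A sketch of a wsapply() variant that also survives Bill Dunlap's free-variable example in this thread, by using the caller's environment as the enclosure when evaluating the expression (hypothetical name wsapply2):

```r
wsapply2 <- function(l, fun, ...) {
  e <- substitute(fun)
  env <- parent.frame()
  if (is.name(e) || (is.call(e) && identical(e[[1L]], as.name("function")))) {
    sapply(l, fun, ...)   # legacy call: an actual function was passed
  } else {
    sapply(l, function(d) eval(e, list(. = d), env), ...)
  }
}

s <- "free variable"
wsapply2(list(1, 2:3), paste(., ":", s))  # `s` now resolves in the caller
```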
> >
> > On 16/04/2020 at 17:07, William Dunlap wrote:
> > > Passing in a function passes not only an argument list but also an
> > > environment from which to get free variables. Since your function
> > > doesn't pay attention to the environment you get things like the
> > > following.
> > >
> > > > wsapply(list(1,2:3), paste(., ":", deparse(s)))
> > > [[1]]
> > > [1] "1 : paste(., \":\", deparse(s))"
> > >
> > > [[2]]
> > > [1] "2 : paste(., \":\", deparse(s))" "3 : paste(., \":\", deparse(s))"
> > >
> > > Bill Dunlap
> > > TIBCO Software
> > > wdunlap tibco.com 
> > >
> > >
> > > On Thu, Apr 16, 2020 at 7:25 AM Sokol Serguei  > > > wrote:
> > >
> > > Hi,
> > >
> > > I would like to make a suggestion for a small syntactic
> > > modification of
> > > FUN argument in the family of functions [lsv]apply(). The idea is to
> > > allow one-liner expressions without typing "function(item) {...}" to
> > > surround them. The argument to the anonymous function is simply
> > > referred
> > > as ".". Let take an example. With this new feature, the following call
> > >
> > > sapply(split(mtcars, mtcars$cyl), function(d) summary(lm(mpg ~ wt,
> > > d))$r.squared)
> > > #4 6 8
> > > #0.5086326 0.4645102 0.4229655
> > >
> > >
> > > could be rewritten as
> > >
> > > sapply(split(mtcars, mtcars$cyl), summary(lm(mpg ~ wt, .))$r.squared)
> > >
> > > "Not a big saving in typing" you can say but multiplied by the
> > > number of
> > > [lsv]apply usage and a neater look, I think, the idea merits to be
> > > considered.
> > > To illustrate a possible implementation, I propose a wrapper
> > > example for
> > > sapply():
> > >
> > > wsapply=function(l, fun, ...) {
> > >  s=substitute(fun)
> > >  if (is.name (s) || is.call(s) &&
> > > s[[1]]==as.name ("function")) {
> > >  sapply(l, fun, ...) # legacy call
> > >  } else {
> > >  sapply(l, function(d) eval(s, list(.=d)), ...)
> > >  }
> > > }
> > >
> > > Now, we can do:
> > >
> > > wsapply(split(mtcars, mtcars$cyl), summary(lm(mpg ~ wt, .))$r.squared)
> > >
> > > or, traditional way:
> > >
> > > wsapply(split(mtcars, mtcars$cyl), function(d) summary(lm(mpg ~ wt,
> > > d))$r.squared)
> > >
> > > the both work.
> > >
> > > How do you feel about that?
> > >
> > > Best,
> > > Serguei.
> > >
> > >
> >
> >


Re: [Rd] Add a new environment variable switch for the 'large version' check

2020-04-16 Thread Henrik Bengtsson
I'd second Jim's feature request - it would be useful to be able to
disable this in CI and elsewhere. The concept of using an "unusual"
version component such as a very large number does a nice job of
indicating "unusual" and serves as a blocker for submitting
work-in-progress to CRAN by mistake (hence the validation in 'R CMD
check').

Another point, which I don't think Jim made, is that this would make
it possible to run R CMD check --as-cran on your work-in-progress and
get all OKs.  This in turn would allow us to trigger a non-zero exit
status also for NOTEs (not just ERRORs and WARNINGs).  Currently, the
warning on -9000 is a false positive in this sense.  This will allow
developers to be more conservative without risking to treat NOTEs as
something to expect as normal.  CI services are typically configured
to alert the developer on ERRORs and WARNINGs but, AFAIK, not on
NOTEs.

On the topic of unusual version numbers: I'd like to suggest that
CRAN(*) makes an unusual version bump whenever they orphan a package,
e.g. to suffix -1. CRAN already updates/modifies the package
tarball for orphaned packages by setting 'Maintainer: ORPHANED' in the
DESCRIPTION file. By also bumping the version of orphaned packages it
would it stand out in sessionInfo(), which helps in troubleshooting
and bug reports, etc.  But more importantly, the most recent stable
CRAN release would remain untouched, which I think has a value by itself for
scientific purposes.

/Henrik

(*) Yes, I should email CRAN about this, but I think it's worth
vetting it here first.
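
For reference, both version conventions discussed in this thread sort where one would want under R's package_version() ordering:

```r
package_version("1.2.3.1")    > package_version("1.2.3")  # TRUE
package_version("1.2.3.9000") > package_version("1.2.3")  # TRUE
package_version("1.2.3.9000") < package_version("1.2.4")  # TRUE
```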

On Thu, Apr 16, 2020 at 7:44 AM Dirk Eddelbuettel  wrote:
>
>
> Or you use a fourth component to signal a development version as Rcpp has
> done for years (and, IIRC, for longer than devtools et al used '9000').
>
> There is no functional difference between 1.2.3.1 and 1.2.3.9000. They are
> both larger than 1.2.3 (in the package_version() sense) and signal an
> intermediate version between 1.2.3 and 1.2.4.
>
> But one requires a patch. ¯\_(ツ)_/¯.
>
> Dirk
>
> --
> http://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>


Re: [Rd] stringsAsFactors

2020-04-13 Thread Henrik Bengtsson
While at it, would it be worth mentioning in the NEWS for R 4.0.0 that
options 'stringsAsFactors' is being deprecated, e.g.

$ options(stringsAsFactors = TRUE)
Warning message:
In options(stringsAsFactors = TRUE) :
  'options(stringsAsFactors = TRUE)' is deprecated and will be disabled

?  Currently, the news only says:

* R now uses a stringsAsFactors = FALSE default, and hence by default
no longer converts strings to factors in calls to data.frame() and
read.table().

/Henrik

On Mon, Apr 13, 2020 at 5:23 AM Martin Maechler
 wrote:
>
> > Hugh Parsonage
> > on Mon, 13 Apr 2020 21:20:26 +1000 writes:
>
> > Further, in addition to the `val <- FALSE` patch a few hours ago by
> > Martin, the line after should also be changed
>
> > - if(!is.logical(val) || is.na(val) || length(val) != 1L)
> > + if(!is.logical(val) || length(val) != 1L || is.na(val))
>
> > ## Consider
> > Sys.setenv("_R_CHECK_LENGTH_1_LOGIC2_" = "TRUE")
> > options(stringsAsFactors = c(TRUE, FALSE))
>
> In R-devel and R 4.0.0 alpha/beta, you have
>
> > options(stringsAsFactors = c(TRUE, FALSE))
> Error in options(stringsAsFactors = c(TRUE, FALSE)) :
>   invalid value for 'stringsAsFactors'
>
>
> > default.stringsAsFactors()  # correct error message
>
> Note that the default.stringsAsFactors() function is also
> deprecated, of course.  Not "formally", in the sense that its
> use would give a deprecation warning  {which would be *bad* as
>   it's still used for the default argument e.g. of read.table()},
> but the help page (in R-devel and R 4.0.0 "pre-release")
> has been saying for a while now
>
> 1)
>
> Usage:
>
>  data.frame(  . )
>
>  default.stringsAsFactors() # << this is deprecated !
> ^^
>
> and 2)   in  'Details:'
>
>  default.stringsAsFactors is a utility 
>  ...   This function is *deprecated* now and will
>  no longer be available in the future.
>
>
> and so it'd be a waste to change it unnecessarily.
> Martin
>
> > On Mon, 13 Apr 2020 at 18:02, Martin Maechler
> >  wrote:
> >>
> >> > Duncan Murdoch
> >> > on Sun, 12 Apr 2020 08:57:14 -0400 writes:
> >>
> >> > The NEWS for R 4.0.0 says "R now uses a stringsAsFactors = FALSE
> >> > default, and hence by default no longer converts strings to factors 
> in
> >> > calls to data.frame() and read.table()."
> >>
> >> > This seems to have been implemented by setting 
> options(stringsAsFactors
> >> > = FALSE) in the main R profile.  However, setting
> >>
> >> > options(stringsAsFactors = NULL)
> >>
> >> > reverts to the same behavior as the old options(stringsAsFactors =
> >> > TRUE).  Is this intentional?
> >>
> >>
> >> No!  Thanks a lot for testing R 4.0.0 alpha/beta, noticing and
> >> alerting us about it.
> >>
> >> This will be changed ASAP.
> >>
> >> ... and it will benefit the whole R user community if quite a
> >> few good R users (as most readers of 'R-devel') would start
> >> using 'R 4.0.0 beta' routinely now --- thanks a lot in advance!
> >>
> >> Martin
> >>


Re: [Rd] WISH: Sys.setlocale() to return value invisibly

2020-03-25 Thread Henrik Bengtsson
I case someone runs into this topic.  I just found the following
comment from 2012 on BugZilla explaining why Sys.setlocale() does
*not* return invisibly contrary to most++ other setters in R:

PR#15128: Sys.setlocale() - return previous setting invisibly?

Brian Ripley on 2012-12-09 16:53:43 UTC:
> It was a deliberate decision. Unlike options() the locale is usually set at 
> startup and it is major thing to change it in a session--and it is usually 
> only done recording the previous value to return to. The author certainly 
> wanted to see what he was changing from in a session.

https://bugs.r-project.org/bugzilla/show_bug.cgi?id=15128#c1

/Henrik

On Tue, Mar 20, 2018 at 2:11 PM Henrik Bengtsson
 wrote:
>
> Contrary to, say, Sys.setenv(), Sys.setlocale() returns its value
> visibly.  This means that if you for instance add:
>
> Sys.setlocale("LC_COLLATE", "C")
>
> to your .Rprofile file, it will print:
>
> [1] "C"
>
> at startup. The workaround is to wrap the call in invisible(), but I'd
> argue that any "setter" function should return invisibly.
>
> Some more details:
>
> > withVisible(Sys.setlocale("LC_COLLATE", "C"))
> $value
> [1] "C"
>
> $visible
> [1] TRUE
>
> > withVisible(Sys.setenv(FOO = "C"))
> $value
> [1] TRUE
>
> $visible
> [1] FALSE
>
> /Henrik



Re: [Rd] R CMD check --as-cran attempts to hide R_LIBS_USER but fails

2020-03-24 Thread Henrik Bengtsson
This has been fixed in R-devel:

r78046 | ripley | 2020-03-24 06:51:35 -0700 (Tue, 24 Mar 2020) | 1 line

handle Renviron files in the same way as POSIX shells

(diff: 
https://github.com/wch/r-source/commit/1658c8491e9cdc6d2fe61603ed23ae56232b6727)

I've verified that 'R CMD check --as-cran' now hides user's personal
library (R_LIBS_USER) such that the check environment, including test
scripts won't pick up packages from there, e.g. test scripts now
report:

> print(.libPaths())
[1] "/tmp/hb/Rtmpy6mBCg/RLIBS_1e6465250309"
[2] "/home/hb/software/R-devel/trunk/lib/R/library"

This is important because, previously, your package might not have
produced check errors even if some of its dependencies were not
declared in your DESCRIPTION file.

The above is only new for '--as-cran' on Linux and macOS - it already
worked as wanted on Windows (see, there are some advantages to being on
that OS).

/Henrik
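
To see this in your own package checks, a tiny (hypothetical) test script can log the effective library paths, matching the transcript above:

```r
## tests/show-libs.R (sketch): make the check log show which libraries
## the test environment actually sees
print(.libPaths())
print(Sys.getenv(c("R_LIBS", "R_LIBS_USER", "R_LIBS_SITE")))
```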



On Wed, Mar 18, 2020 at 9:38 PM Henrik Bengtsson
 wrote:
>
> On Wed, Mar 18, 2020 at 8:04 PM Dirk Eddelbuettel  wrote:
> >
> >
> > On 18 March 2020 at 19:19, Henrik Bengtsson wrote:
> > | AFAIU, 'R CMD check --as-cran' tries to hide any site and user package
> > | libraries by setting R_LIBS_SITE and R_LIBS_USER.  However, contrary
> >
> > What makes you think that? AFAIK --as-cran just sets a bunch of the (nearly
> > countless) environment variables (all described in R Inst+Admin, as I 
> > recall)
> > to a set of values "close to" values CRAN uses.
>
> 1. --as-cran sets R_LIBS_SITE='no_such_dir' and R_LIBS_USER='',
> whereas without --as-cran they're not set.
>
> 2. --as-cran sets R_LIBS_USER='no_such_dir' on Windows and there it is
> masked, i.e. tests scripts do NOT see user's personal library because
> print(Sys.getenv("R_LIBS_USER")) is reported as "'no_such_dir'"
> (sic!).
>
Re: [Rd] R CMD check --as-cran attempts to hide R_LIBS_USER but fails

2020-03-18 Thread Henrik Bengtsson
On Wed, Mar 18, 2020 at 8:04 PM Dirk Eddelbuettel  wrote:
>
>
> On 18 March 2020 at 19:19, Henrik Bengtsson wrote:
> | AFAIU, 'R CMD check --as-cran' tries to hide any site and user package
> | libraries by setting R_LIBS_SITE and R_LIBS_USER.  However, contrary
>
> What makes you think that? AFAIK --as-cran just sets a bunch of the (nearly
> countless) environment variables (all described in R Inst+Admin, as I recall)
> to a set of values "close to" values CRAN uses.

1. --as-cran sets R_LIBS_SITE='no_such_dir' and R_LIBS_USER='',
whereas without --as-cran they're not set.

2. --as-cran sets R_LIBS_USER='no_such_dir' on Windows, and there it is
masked, i.e. test scripts do NOT see the user's personal library;
print(Sys.getenv("R_LIBS_USER")) reports "'no_such_dir'" (sic!).

The only other interpretation I can imagine for using R_LIBS_USER=''
on Linux is that it exists there to force the default settings in case
the variable is already set externally by the user.  For example, if one does

  export R_LIBS_USER="$PWD"
  R --vanilla CMD check --as-cran teeny_0.1.0.tar.gz

then test scripts still get R_LIBS_USER="~/R/x86_64-pc-linux-gnu-library/3.6".

But then why is there a difference between Windows and Linux in this
essential behavior?  To me, this suggests there is a mistake
somewhere.  OTOH, I know that lots of oddities in R exist for a reason.

/Henrik

>
> | to R_LIBS_SITE, it fails for R_LIBS_USER and the user's personal
> | library is still available for test scripts.  Should I revise my
> | assumptions, or is that intentional?
>
> I would place a nickel on the former if betting was allowed in Illinois.
>
>   edd@rob:~$ Rscript --vanilla -e ".libPaths()"
>   [1] "/usr/local/lib/R/site-library" "/usr/lib/R/site-library"
>   [3] "/usr/lib/R/library"
>   edd@rob:~$ R_LIBS_USER='' Rscript --vanilla -e ".libPaths()"
>   [1] "/usr/local/lib/R/site-library" "/usr/lib/R/site-library"
>   [3] "/usr/lib/R/library"
>   edd@rob:~$
>
> I happen to turn per-user libraries off by default, which may affect things.
> That said, I actually quite like having the same paths. Your mileage, as they
> say, may vary.
>
> Dirk
>
>
> | The short version. Shouldn't:
> |
> | $ R_LIBS_USER='' Rscript --vanilla -e ".libPaths()"
> | [1] "/home/hb/R/x86_64-pc-linux-gnu-library/4.0"
> | [2] "/home/hb/software/R-devel/trunk/lib/R/library"
> |
> | give the same output as:
> |
> | $ R_LIBS_USER="no_such_dir" Rscript --vanilla -e ".libPaths()"
> | [1] "/home/hb/software/R-devel/trunk/lib/R/library"
> |
> | ?
> |
> | The long version:
> |
> | R_LIBS_SITE='no_such_dir' and R_LIBS_USER=''  is set up at the very
> | end of tools:::setRlibs():
> |
> | setRlibs <-
> | ...
> | c(paste0("R_LIBS=", rlibs),
> |   if(WINDOWS) " R_ENVIRON_USER='no_such_file'" else "R_ENVIRON_USER=''",
> |   if(WINDOWS) " R_LIBS_USER='no_such_dir'" else "R_LIBS_USER=''",
> |   " R_LIBS_SITE='no_such_dir'")
> | }
> |
> | Monitoring with 'pstree' confirms this. On Linux with R 3.6.3, the
> | call stack of a 'R CMD check --as-cran teeny_0.1.0.tar.gz' call looks
> | like this when a test script is running:
> |
> | `-sh /usr/lib/R/bin/check --as-cran teeny_0.1.0.tar.gz
> |  `-R --no-restore --slave --args nextArg--as-crannextArgteeny_0.1.0.tar.gz
> |   `-sh -c LANGUAGE=en _R_CHECK_INTERNALS2_=1
> | R_LIBS=/tmp/hb/RtmpQj4hXb/RLIBS_26e766e32c18 R_ENVIRON_USER=''
> | R_LIBS_USER=''  R_LIBS_SITE='no_such_dir' '/usr/lib/R/bin/R' --vanilla
> | --slave < '/tmp/hb/RtmpQj4hXb/file26e763770b6a'
> |`-R --vanilla --slave
> | `-sh -c LANGUAGE=C R_TESTS=startup.Rs '/usr/lib/R/bin/R' CMD BATCH
> | --vanilla  'env.R' 'env.Rout'
> |  `-sh /usr/lib/R/bin/BATCH --vanilla env.R env.Rout
> |   `-R -f env.R --restore --save --no-readline --vanilla
> |`-sh -c 'pstree' --arguments --long --show-parents 10558
> | `-pstree --arguments --long --show-parents 10558
> |
> | However, if I call print(Sys.getenv("R_LIBS_USER")) in my tests/env.R,
> | I'll find that it is no longer empty but it is indeed set to my
> | personal library "~/R/x86_64-pc-linux-gnu-library/3.6".
> |
> |
> | TROUBLESHOOTING:
> |
> | It looks like R_LIBS_USER is set if and only if it's empty by Renviron
> | in my system folder:
> |
> | $ grep R_LIBS < "$(Rscript -e "cat(file.path(R.home('etc'), 'Renviron'))")"
> | R_LIBS_USER=${R_LIBS_USER-'~/R/x86_64-pc-linux-gnu-library/3.6'}
> | #R_LIBS_USER=${R_LIBS_USER-'~/Library/R

[Rd] R CMD check --as-cran attempts to hide R_LIBS_USER but fails

2020-03-18 Thread Henrik Bengtsson
AFAIU, 'R CMD check --as-cran' tries to hide any site and user package
libraries by setting R_LIBS_SITE and R_LIBS_USER.  However, contrary
to R_LIBS_SITE, it fails for R_LIBS_USER and the user's personal
library is still available for test scripts.  Should I revise my
assumptions, or is that intentional?

The short version. Shouldn't:

$ R_LIBS_USER='' Rscript --vanilla -e ".libPaths()"
[1] "/home/hb/R/x86_64-pc-linux-gnu-library/4.0"
[2] "/home/hb/software/R-devel/trunk/lib/R/library"

give the same output as:

$ R_LIBS_USER="no_such_dir" Rscript --vanilla -e ".libPaths()"
[1] "/home/hb/software/R-devel/trunk/lib/R/library"

?

The long version:

R_LIBS_SITE='no_such_dir' and R_LIBS_USER='' are set up at the very
end of tools:::setRlibs():

setRlibs <-
...
c(paste0("R_LIBS=", rlibs),
  if(WINDOWS) " R_ENVIRON_USER='no_such_file'" else "R_ENVIRON_USER=''",
  if(WINDOWS) " R_LIBS_USER='no_such_dir'" else "R_LIBS_USER=''",
  " R_LIBS_SITE='no_such_dir'")
}

Monitoring with 'pstree' confirms this. On Linux with R 3.6.3, the
call stack of a 'R CMD check --as-cran teeny_0.1.0.tar.gz' call looks
like this when a test script is running:

`-sh /usr/lib/R/bin/check --as-cran teeny_0.1.0.tar.gz
 `-R --no-restore --slave --args nextArg--as-crannextArgteeny_0.1.0.tar.gz
  `-sh -c LANGUAGE=en _R_CHECK_INTERNALS2_=1
R_LIBS=/tmp/hb/RtmpQj4hXb/RLIBS_26e766e32c18 R_ENVIRON_USER=''
R_LIBS_USER=''  R_LIBS_SITE='no_such_dir' '/usr/lib/R/bin/R' --vanilla
--slave < '/tmp/hb/RtmpQj4hXb/file26e763770b6a'
   `-R --vanilla --slave
`-sh -c LANGUAGE=C R_TESTS=startup.Rs '/usr/lib/R/bin/R' CMD BATCH
--vanilla  'env.R' 'env.Rout'
 `-sh /usr/lib/R/bin/BATCH --vanilla env.R env.Rout
  `-R -f env.R --restore --save --no-readline --vanilla
   `-sh -c 'pstree' --arguments --long --show-parents 10558
`-pstree --arguments --long --show-parents 10558

However, if I call print(Sys.getenv("R_LIBS_USER")) in my tests/env.R,
I'll find that it is no longer empty but it is indeed set to my
personal library "~/R/x86_64-pc-linux-gnu-library/3.6".


TROUBLESHOOTING:

It looks like R_LIBS_USER is set if and only if it's empty by Renviron
in my system folder:

$ grep R_LIBS < "$(Rscript -e "cat(file.path(R.home('etc'), 'Renviron'))")"
R_LIBS_USER=${R_LIBS_USER-'~/R/x86_64-pc-linux-gnu-library/3.6'}
#R_LIBS_USER=${R_LIBS_USER-'~/Library/R/3.6/library'}
# edd Jul 2007  Now use R_LIBS_SITE, not R_LIBS
R_LIBS_SITE=${R_LIBS_SITE-'/usr/local/lib/R/site-library:/usr/lib/R/site-library:/usr/lib/R/library'}
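One likely explanation for the Linux behavior: the `${VAR-default}`
expansion above is performed by R's own Renviron parser, which (unlike
a POSIX shell) appears to fall back to the default when the variable is
set but *empty*, not only when it is unset -- so R_LIBS_USER='' gets
silently replaced.  A minimal sketch using a throw-away variable (MYVAR
is made up for illustration):

```r
# Mimic the Renviron idiom R_LIBS_USER=${R_LIBS_USER-'...'} with a
# scratch variable: R's ${VAR-default} expansion seems to use the
# default when VAR is unset OR empty (a POSIX shell would keep '').
f <- tempfile()
writeLines("MYVAR=${MYVAR-'fallback'}", f)

Sys.setenv(MYVAR = "")   # set, but empty -- analogous to R_LIBS_USER=''
readRenviron(f)
Sys.getenv("MYVAR")      # here: "fallback", i.e. the empty value is overridden
```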

This is from installing R on Ubuntu 18.04 using 'apt install
r-base-core'.  To make sure it's not an issue with that distribution,
I also check a 'configure/make/make install' from SVN trunk and there
I see the same:

$ grep R_LIBS < "$(Rscript -e "cat(file.path(R.home('etc'), 'Renviron'))")"
R_LIBS_USER=${R_LIBS_USER-'~/R/x86_64-pc-linux-gnu-library/4.0'}
#R_LIBS_USER=${R_LIBS_USER-'~/Library/R/4.0/library'}

Printing it during tests/env.R confirms that it is indeed set to
"~/R/x86_64-pc-linux-gnu-library/4.0".

/Henrik

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] note about mispelled words

2020-03-17 Thread Henrik Bengtsson
You can single quote them to avoid them being spell checked, e.g. ...
using methods of 'Kruskall' and 'Brainerd'.  This is a common and
accepted practice.
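For illustration, a Description field following that convention might
read like this (the wording is invented for this example, not taken
from the package in question):

```
Description: Provides measures of categorical association,
    including coefficients attributed to 'Kruskall' and 'Brainerd'.
```

Words inside single quotes are treated as technical terms or proper
names and should be left alone by the spell check.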

This is hinted at in "Writing R Extensions" (e.g. help.start());

The mandatory ‘Description’ field should give a comprehensive
description of what the package does. One can use several (complete)
sentences, but only one paragraph. It should be intelligible to all
the intended readership (e.g. for a CRAN package to all CRAN users).
It is good practice not to start with the package name, ‘This package’
or similar. As with the ‘Title’ field, double quotes should be used
for quotations (including titles of books and articles), and single
quotes for non-English usage, including names of other packages and
external software. This field should also be used for explaining the
package name if necessary. URLs should be enclosed in angle brackets,
e.g. ‘’: see also Specifying URLs.

/Henrik

On Tue, Mar 17, 2020 at 8:30 AM Gianmarco Alberti
 wrote:
>
> Hello,
> I am checking a package of mine, and I got only 1 note regarding possibly 
> misspelled words in the DESCRIPTION.
>
> The issue I am facing is that those 6 words are not actually misspelled, 
> being either first or last names of individuals (actually, statistician; 
> e.g., Kruskall, Brainerd).
>
> Shall I have to do something (removing those; which does not make sense), or 
> upon submitting my new version of the package there is a way to make clear 
> that that note can be ignored?
>
> By the way, those names were already there in earlier versions of the package 
> and no note cropped out in those occasions.
>
> Thank you
> Best
> GmA
>
> 
> Dr Gianmarco Alberti (PhD Udine)
> Lecturer in Spatial Forensics
> Coordinator of the BA course in Criminology
> Department of Criminology
> Faculty for Social Wellbeing
> Room 332, Humanities B (FEMA)
> University of Malta, Msida, Malta (Europe) - MSD 2080
> tel +356 2340 3718
>
> Academic profiles
> https://www.researchgate.net/profile/Gianmarco_Alberti4
> https://malta.academia.edu/GianmarcoAlberti
>
> Google Scholar profile
> https://scholar.google.com/citations?user=tFrJKQ0J=en
>
> Correspondence Analysis website
> http://cainarchaeology.weebly.com/
>
> R packages on CRAN:
> CAinterprTools
> https://cran.r-project.org/web/packages/CAinterprTools/index.html
>
> GmAMisc
> https://cran.r-project.org/package=GmAMisc
>
> movecost
> https://cran.r-project.org/web/packages/movecost/index.html
> 
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Bioc-devel] proper way to define an S4 method for 'plot'

2020-03-16 Thread Henrik Bengtsson
Maybe it's related to:

* The plot() S3 generic function is now in package base rather than
package graphics, as it is reasonable to have methods that do not use
the graphics package. The generic is currently re-exported from the
graphics namespace to allow packages importing it from there to
continue working, but this may change in future.

mentioned in the R 4.0.0 NEWS
(https://cran.r-project.org/doc/manuals/r-devel/NEWS.html)?
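If that's the cause, one robust pattern in package code is to create
the S4 generic explicitly from the existing function before registering
methods, so it works whichever package plot() is re-exported from.  A
generic sketch (the 'Track' class is made up; this is not MLInterfaces'
actual code):

```r
library(methods)

# Hypothetical class for illustration.
setClass("Track", slots = c(x = "numeric", y = "numeric"))

# Turn the existing plot() function into an S4 generic explicitly, then
# register a method; this avoids relying on where the generic is found.
setGeneric("plot")
setMethod("plot", signature(x = "Track", y = "missing"),
          function(x, y, ...) plot(x@x, x@y, ...))

existsMethod("plot", c("Track", "missing"))  # TRUE
```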

/Henrik

On Mon, Mar 16, 2020 at 3:52 PM Vincent Carey
 wrote:
>
> I just updated my R and I am getting into trouble with MLInterfaces
> maintenance.
>
> > BiocManager::install("MLInterfaces")
>
> *Bioconductor version 3.11 (BiocManager 1.30.10), R Under development
> (unstable)*
>
> *  (2020-03-15 r77975)*
>
> *Installing package(s) 'MLInterfaces'*
>
> *Warning: unable to access index for repository
> https://cran.rstudio.com/bin/macosx/el-capitan/contrib/4.0
> :*
>
> *  cannot open URL
> 'https://cran.rstudio.com/bin/macosx/el-capitan/contrib/4.0/PACKAGES
> '*
>
>
>   There is a binary version available but the source version is later:
>
>  binary source needs_compilation
>
> MLInterfaces 1.67.2 1.67.5 FALSE
>
>
> *installing the source package ‘MLInterfaces’*
>
>
> *trying URL
> 'https://bioconductor.org/packages/3.11/bioc/src/contrib/MLInterfaces_1.67.5.tar.gz
> '*
>
> *Content type 'application/x-gzip' length 1071876 bytes (1.0 MB)*
>
> *==*
>
> *downloaded 1.0 MB*
>
>
> * installing *source* package ‘MLInterfaces’ ...
>
> ** using staged installation
>
> ** R
>
> ** inst
>
> ** byte-compile and prepare package for lazy loading
>
> Error in getGeneric(f, TRUE, envir, package) :
>
>   no generic function found for ‘plot’
>
> Calls:  ... namespaceImportFrom -> .mergeImportMethods ->
>  -> getGeneric
>
> recover called non-interactively; frames dumped, use debugger() to view
>
> ** help
>
> *** installing help indices
>
> ** building package indices
>
> ** installing vignettes
>
> ** testing if installed package can be loaded from temporary location
>
> Error: package or namespace load failed for ‘MLInterfaces’ in getGeneric(f,
> TRUE, envir, package):
>
>  no generic function found for ‘plot’
>
>
> ...
>
>
> > sessionInfo()
>
> R Under development (unstable) (2020-03-15 r77975)
>
> Platform: x86_64-apple-darwin15.6.0 (64-bit)
>
> Running under: macOS Mojave 10.14.6
>
>
> Matrix products: default
>
> BLAS:
> /Library/Frameworks/R.framework/Versions/4.0/Resources/lib/libRblas.0.dylib
>
> LAPACK:
> /Library/Frameworks/R.framework/Versions/4.0/Resources/lib/libRlapack.dylib
>
>
> locale:
>
> [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
>
>
> attached base packages:
>
> [1] stats graphics  grDevices utils datasets  methods   base
>
>
> other attached packages:
>
> [1] rmarkdown_2.1
>
>
> loaded via a namespace (and not attached):
>
>  [1] BiocManager_1.30.10 compiler_4.0.0  startup_0.14.0
>
>  [4] tools_4.0.0 htmltools_0.4.0 Rcpp_1.0.3
>
>  [7] codetools_0.2-16knitr_1.28  xfun_0.12
>
> [10] digest_0.6.25   rlang_0.4.5 evaluate_0.14
>
> --
> The information in this e-mail is intended only for th...{{dropped:6}}

___
Bioc-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/bioc-devel


Re: [R-pkg-devel] Fwd: Re: [CRAN-pretest-archived] CRAN submission SSLR 0.9.0

2020-03-11 Thread Henrik Bengtsson
'fpalomares', it's not clear if you can reproduce this locally or not,
but make sure you can reproduce it locally.  I could using:

R CMD check --as-cran SSLR_0.9.0.tar.gz

with R 3.6.3 on Ubuntu 18.04.  In other words, this is neither a
problem with the CRAN incoming checks nor win-builder.

Yes, I agree, at first glance this looks odd because:

> library(SSLR)
> fit
function (object, ...)
{
UseMethod("fit")
}


> methods("fit")
[1] fit,onlearn-method fit.model_sslr
see '?methods' for accessing help and source code

Neither

$ Rscript -e 'example("COREG", package = "SSLR")'

nor

> example("COREG", package = "SSLR")

in an interactive session produce the error.

For troubleshooting, I'd added a:

print(methods("fit"))

before

m <- COREG(max.iter = 2) %>% fit(class ~ ., data = train)

I'd also try rewriting that without %>%, e.g. something like

a <- COREG(max.iter = 2)
b <- fit(a, class ~ ., data = train)

to see if that makes a difference.
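For background on why the %>% form matters here: S3 dispatch is driven
by the class of the generic's first argument, and `x %>% f(y)`
evaluates as `f(x, y)`.  A minimal sketch with toy names mirroring the
fit()/fit.model_sslr setup (not SSLR's actual code):

```r
# Toy S3 generic/method pair.
fit <- function(object, ...) UseMethod("fit")
fit.model_sslr <- function(object, ...) "fit.model_sslr"

m <- structure(list(), class = "model_sslr")

fit(m)       # dispatches on class(m) -> "fit.model_sslr"
# fit(~ x)   # would dispatch on class "formula" -> 'no applicable method'
```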

My $.02

/Henrik

On Wed, Mar 11, 2020 at 8:13 AM Duncan Murdoch  wrote:
>
> Uwe Ligges answered you yesterday on this question.
>
> Duncan Murdoch
>
> On 11/03/2020 7:39 a.m., fpaloma...@correo.ugr.es wrote:
> > Hi,
> >
> > I dont know how to fix this problem:
> > https://win-builder.r-project.org/VK8P07E1QHFA/
> >
> > The error is:
> >
> > Error in UseMethod("fit") :
> > no applicable method for 'fit' applied to an object of class
> > "model_sslr"
> >
> > I check with my platform Windows 10 Home and I dont have problems
> >
> > The code used is:
> >
> > #' fit
> > #' @title fit object
> > #' @param object object
> > #' @param ... other parameters to be passed
> > #' @export
> > fit <- function(object, ...){
> > UseMethod("fit")
> > }
> >
> > #' @title Fit with formula and data
> > #' @description Funtion to fit through the formula
> > #' @param object is the model
> > #' @param formula is the formula
> > #' @param data is the total data train
> > #' @param ... unused in this case
> > #' @importFrom rlang quos
> > #' @export fit.model_sslr
> > #' @export
> > fit.model_sslr <- function(object, formula = NULL, data = NULL, ...) {
> >
> > dots <- quos(...)
> >
> >
> > if (all(c("x", "y") %in% names(dots)))
> >   rlang::abort("`fit.model_sslr()` is for the formula methods. Use
> > `fit_xy()` instead.")
> >
> > fit_form_interface <- check_form_interface(formula, data)
> >
> > x_and_y <- get_x_y(formula, data)
> >
> > eval_env <- rlang::env()
> > eval_env$x <- x_and_y$x
> > eval_env$y <- x_and_y$y
> >
> > fit_interface <- check_xy_interface(eval_env$x, eval_env$y)
> >
> > elapsed <- system.time(model <- object$fit_function(eval_env$x,
> > eval_env$y))
> >
> >
> > new_model_sslr_fitted(model,class(model),object$args,colnames(eval_env$x),elapsed,formula)
> >
> > }
> >
> >
> > In my NAMESPACE we have this:
> >
> > # Generated by roxygen2: do not edit by hand
> >
> > S3method(fit,model_sslr)
> > S3method(fit_x_u,model_sslr)
> > S3method(fit_xy,model_sslr)
> > S3method(predict,OneNN)
> > S3method(predict,model_sslr_fitted)
> > S3method(predict,snnrceG)
> > export(COREG)
> > export(EMLeastSquaresClassifierSSLR)
> > export(EMNearestMeanClassifierSSLR)
> > export(EntropyRegularizedLogisticRegressionSSLR)
> > export(LaplacianSVMSSLR)
> > export(LinearTSVMSSLR)
> > export(MCNearestMeanClassifierSSLR)
> > export(SSLRDecisionTree)
> > export(SSLRRandomForest)
> > export(TSVMSSLR)
> > export(USMLeastSquaresClassifierSSLR)
> > export(WellSVMSSLR)
> > export(coBC)
> > export(coBCCombine)
> > export(coBCG)
> > export(democratic)
> > export(democraticCombine)
> > export(democraticG)
> > export(fit)
> > export(fit.model_sslr)
> > export(fit_x_u)
> > export(fit_x_u.model_sslr)
> > export(fit_xy)
> > export(fit_xy.model_sslr)
> > export(newDecisionTree)
> > export(oneNN)
> > export(selfTraining)
> > export(selfTrainingG)
> > export(setred)
> > export(setredG)
> > export(snnrce)
> > export(train_generic)
> > export(triTraining)
> > export(triTrainingCombine)
> > export(triTrainingG)
> > exportClasses(nullOrNumericOrCharacter)
> > exportMethods(predict)
> > exportMethods(predict_inputs)
> > import(stats)
> > importFrom(RANN,nn2)
> > importFrom(RSSL,EMLeastSquaresClassifier)
> > importFrom(RSSL,EMNearestMeanClassifier)
> > importFrom(RSSL,EntropyRegularizedLogisticRegression)
> > importFrom(RSSL,LaplacianSVM)
> > importFrom(RSSL,LinearTSVM)
> > importFrom(RSSL,MCNearestMeanClassifier)
> > importFrom(RSSL,TSVM)
> > importFrom(RSSL,USMLeastSquaresClassifier)
> > importFrom(RSSL,WellSVM)
> > importFrom(dplyr,as_tibble)
> > importFrom(dplyr,tibble)
> > importFrom(foreach,"%dopar%")
> > importFrom(magrittr,"%>%")
> > importFrom(methods,new)
> > importFrom(parsnip,nearest_neighbor)
> > importFrom(parsnip,set_engine)
> > importFrom(plyr,is.formula)
> > importFrom(purrr,map)
> > importFrom(rlang,quos)
> > importFrom(stats,predict)
> > importFrom(utils,tail)
> > useDynLib(SSLR)
> >
> > We have:
> > S3method(fit,model_sslr)
> > ...
> > export(fit)
> > 

Re: [R-pkg-devel] Winbuilder queues jammed again?

2020-03-05 Thread Henrik Bengtsson
1. I'd guess it helps Uwe a bit if you clarify exactly which queue you
think is stuck - otherwise he has to check them all. They're independent.

2. You can look at the different win-builder queues yourself via ftp, see
https://stat.ethz.ch/pipermail/r-package-devel/2020q1/005098.html

/Henrik

On Thu, Mar 5, 2020, 14:14 Ben Bolker  wrote:

> Maybe there's something queriable similar to the CRAN queue?  (In an
> ideal world this could even be incorporated into F Michonneau's
> foghorn package ...)
>
>It's probably been suggested already in this thread, but perhaps
> rhub would work for you as an alternative?
>
> On Thu, Mar 5, 2020 at 4:35 PM Rolf Turner 
> wrote:
> >
> >
> > Sorry to be a pest, but I submitted a package to winbuilder more than 24
> > hours ago, and nothing has come back to me.  For a while (a few days
> > ago) I was getting about a 20 minute turnaround.
> >
> > One gets dependant on facilities such as winbuilder and gets frustrated
> > when they don't perform quite as expected.
> >
> > I know this sounds demanding (in respect of a free service that is
> > provided entirely due to Uwe's good graces), but, said he plaintively,
> > is there any way that some sort of indication of the expected wait time
> > could be made available?  Or an indication of the length of the queue,
> > or something like that?
> >
> > Even an indication that one's submission is still in the queue and
> > hasn't disappeared into a black hole in cyberspace, would be reassuring.
> >
> > Apropos of the latter --- I say that I submitted a package, but in my
> > state of advanced senility I can't be sure.  Maybe I just *intended* to,
> > but then forgot, or messed up the submission procedure, or buggered
> > something else up.  There seems to be no way to check that I really did
> > make a submission.  Such a facility would be, uh, nice.  Said he,
> > plaintively.
> >
> > cheers,
> >
> > Rolf Turner
> >
> > --
> > Honorary Research Fellow
> > Department of Statistics
> > University of Auckland
> > Phone: +64-9-373-7599 ext. 88276
> >
> > __
> > R-package-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-package-devel
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] [FORGED] Re: win-builder down?

2020-02-29 Thread Henrik Bengtsson
FYI, in the past when queues got stuck, Uwe told me it's often a package
that launches an external process (e.g. java.exe) that doesn't terminate
and prevents R CMD check from terminating/timing out. He has watchdogs to
kill such stray processes, but there are false negatives.

A couple of weeks ago I proposed to add a placeholder for the package
currently tested and Uwe said he might look into it, e.g.

$ curl ftp://win-builder.r-project.org/R-devel/
02-05-20  03:10PM  2359737 bayestestR_0.5.2.tar.gz
02-05-20  11:42AM  0 some.pkg_0.0.1.tar.gz-PROCESSING
02-05-20  11:56AM  5053108 catdata_1.2.2.tar.gz

Even if it doesn't solve the problem, it'll communicate to "us" that
something is stuck and what it is.

Anyway, we can help out by reporting (like this) when a queue looks
stuck.  Then I think it helps to clarify which queue/builder is stuck.

Henrik


On Sat, Feb 29, 2020, 15:13 Rolf Turner  wrote:

>
> On 1/03/20 11:44 am, Max Kuhn wrote:
>
> > On February 29, 2020 at 5:06:35 PM, Rolf Turner (r.tur...@auckland.ac.nz
> > ) wrote:
> >>
> >> On 1/03/20 2:23 am, Hadley Wickham wrote:
> >>
> >> > Is it down again? I'm seeing the same problem again.
> >> > Hadley
> >> >
> >> > On Sat, Feb 22, 2020 at 2:41 PM Hadley Wickham  > wrote:
> >> >>
> >> >> Hi all,
> >> >>
> >> >> Is win-builder down? I submitted a couple of packages >24 hours ago,
> >> >> and haven't heard back.
> >> >>
> >> >> Hadley
> >>
> >> Me too. Submitted a package about 18 hours ago, and so far not a
> >> sausage. Although it's churlish to complain, one gets used to a service
> >> that has been provided and gets annoyed when the service disappears.
> >
> > True, but it would be unnecessary if `R CMD check —as-cran` did exactly
> > the same thing as win-builder (as well as the extra, informal checks
> > done on a first submission).
>
> Not entirely true.  The package that I am currently trying to get built
> is for use by some consulting clients and is not (yet) for public
> release.  So I can't/don't submit it to CRAN so as to get a Windoze
> binary that way.
>
> cheers,
>
> Rolf
>
> P.S. I note that the winbuilder web site says:
>
> >  Please do not upload Bioconductor packages or CRAN packages.
> >  Both Bioconductor and CRAN do have build systems 
>
> I presume that this exhortation is of some antiquity and is "no longer
> operative".
>
> R.
>
> --
> Honorary Research Fellow
> Department of Statistics
> University of Auckland
> Phone: +64-9-373-7599 ext. 88276
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] tcl problem with R-3.6.3?

2020-02-29 Thread Henrik Bengtsson
Here's a simpler example that should reproduce that error for you:

  ans <- utils::select.list(c("hello", "world", "again"), graphics=TRUE)

Does it?

FYI, I installed R 3.6.3 from source on CentOS 7 a few hours ago, and
for me the above works just fine.

For your immediate needs of selecting a CRAN mirror, you can set:

options(menu.graphics = FALSE)

as a workaround to skip Tcl-based menus.

/Henrik

On Sat, Feb 29, 2020 at 10:01 AM Charles Geyer  wrote:
>
> Just built 3.6.3 from source and tcl doesn't work.  Worked fine with the
> same laptop in 3.6.2.  Here's the exact error.
>
> blurfle$ R --vanilla
>
> R version 3.6.3 (2020-02-29) -- "Holding the Windsock"
> Copyright (C) 2020 The R Foundation for Statistical Computing
> Platform: x86_64-pc-linux-gnu (64-bit)
>
> R is free software and comes with ABSOLUTELY NO WARRANTY.
> You are welcome to redistribute it under certain conditions.
> Type 'license()' or 'licence()' for distribution details.
>
>   Natural language support but running in an English locale
>
> R is a collaborative project with many contributors.
> Type 'contributors()' for more information and
> 'citation()' on how to cite R or R packages in publications.
>
> Type 'demo()' for some demos, 'help()' for on-line help, or
> 'help.start()' for an HTML browser interface to help.
> Type 'q()' to quit R.
>
> > sessionInfo()
> R version 3.6.3 (2020-02-29)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 18.04.4 LTS
>
> Matrix products: default
> BLAS:   /home/geyer/local/current/lib/R/lib/libRblas.so
> LAPACK: /home/geyer/local/current/lib/R/lib/libRlapack.so
>
> locale:
>  [1] LC_CTYPE=en_US.UTF-8   LC_NUMERIC=C
>  [3] LC_TIME=en_US.UTF-8LC_COLLATE=en_US.UTF-8
>  [5] LC_MONETARY=en_US.UTF-8LC_MESSAGES=en_US.UTF-8
>  [7] LC_PAPER=en_US.UTF-8   LC_NAME=C
>  [9] LC_ADDRESS=C   LC_TELEPHONE=C
> [11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
>
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base
>
> loaded via a namespace (and not attached):
> [1] compiler_3.6.3
> > install.packages("aster")
> --- Please select a CRAN mirror for use in this session ---
> Error in structure(.External(.C_dotTclObjv, objv), class = "tclObj") :
>   [tcl] grab failed: window not viewable.
> > q()
>
> What's up with that?
>
> --
> Charles Geyer
> Professor, School of Statistics
> Resident Fellow, Minnesota Center for Philosophy of Science
> University of Minnesota
> char...@stat.umn.edu
>
> [[alternative HTML version deleted]]
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] win-builder down?

2020-02-29 Thread Henrik Bengtsson
Could be.  FYI, I think the different win-builder queues are
independent, i.e. when one get stuck it does not affect the others.
For example, I just verified that R-devel_gcc8 works.


$ date --iso-8601=seconds
2020-02-29T07:41:53-08:00


$ curl -s ftp://win-builder.r-project.org/R-oldrel/ | sed -E
's/(AM|PM)/\t\1/g' | sort -k 1 -k 3 -k 2
[empty]


$ curl -s ftp://win-builder.r-project.org/R-release/ | sed -E
's/(AM|PM)/\t\1/g' | sort -k 1 -k 3 -k 2
02-28-20  06:35 PM   129480 miraculix_0.9.19.1.tar.gz
02-28-20  06:41 PM   964211 wbstats_1.0.0.tar.gz
02-28-20  07:16 PM  2805713 SWMPrExtension_1.1.3.tar.gz
...
02-29-20  12:25 PM  2256595 lidR_2.2.3.tar.gz


$ curl -s ftp://win-builder.r-project.org/R-devel/ | sed -E
's/(AM|PM)/\t\1/g' | sort -k 1 -k 3 -k 2
02-28-20  07:59 PM   861233 tune_0.1.0.tar.gz
02-28-20  08:02 PM  4259662 qtl_1.46-2.tar.gz
02-28-20  08:07 PM   259716 forcats_0.4.0.9000.tar.gz
...
02-29-20  12:44 PM32053 PostcodesioR_0.2.0.tar.gz


$ curl -s ftp://win-builder.r-project.org/R-devel_gcc8/ | sed -E
's/(AM|PM)/\t\1/g' | sort -k 1 -k 3 -k 2
[empty]

/Henrik

On Sat, Feb 29, 2020 at 5:23 AM Hadley Wickham  wrote:
>
> Is it down again? I'm seeing the same problem again.
> Hadley
>
> On Sat, Feb 22, 2020 at 2:41 PM Hadley Wickham  wrote:
> >
> > Hi all,
> >
> > Is win-builder down? I submitted a couple of packages >24 hours ago,
> > and haven't heard back.
> >
> > Hadley
> >
> > --
> > http://hadley.nz
>
>
>
> --
> http://hadley.nz
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] methods package: A _R_CHECK_LENGTH_1_LOGIC2_=true error

2020-02-14 Thread Henrik Bengtsson
I still observe this error and just want to ping this thread so we
don't forget it. Should I add this to
https://bugs.r-project.org/bugzilla/ so it's tracked there?

This thread in the archives:

* https://stat.ethz.ch/pipermail/r-devel/2019-June/078049.html
* https://stat.ethz.ch/pipermail/r-devel/2019-July/078115.html
* https://stat.ethz.ch/pipermail/r-devel/2019-July/078126.html
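For reference, the failing expression can be reproduced in isolation,
which also shows the elementwise fix (`&` instead of `&&`) that
conformMethod() presumably needs -- a sketch, not a patch:

```r
# The operands from the conformMethod() failure above, in isolation.
omittedSig <- c(FALSE, TRUE, TRUE, FALSE)
signature  <- c("TilingFeatureSet", "ANY", "ANY", "array")

# Elementwise `&` is well defined for length-4 operands:
omittedSig & (signature != "missing")   # FALSE TRUE TRUE FALSE

# Scalar `&&` with a length-4 LHS is what _R_CHECK_LENGTH_1_LOGIC2_
# flags (and what recent R versions reject unconditionally):
res <- tryCatch(omittedSig && (signature[omittedSig] != "missing"),
                error = function(e) conditionMessage(e))
```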

/Henrik

On Sat, Jun 29, 2019 at 1:45 PM Martin Maechler
 wrote:
>
> >>>>> Martin Maechler
> >>>>> on Sat, 29 Jun 2019 12:05:49 +0200 writes:
>
> >>>>> Martin Maechler
> >>>>> on Sat, 29 Jun 2019 10:33:10 +0200 writes:
>
> >>>>> peter dalgaard
> >>>>> on Fri, 28 Jun 2019 16:20:03 +0200 writes:
>
> >>> > On 28 Jun 2019, at 16:03 , Martin Maechler 
>  wrote:
> >>> >
> >>> >>>>>> Henrik Bengtsson
> >>> >>>>>>on Thu, 27 Jun 2019 16:00:39 -0700 writes:
> >>> >
> >>> >> Using:
> >>> >>
> >>> >> untrace(methods::conformMethod)
> >>> >> at <- c(12,4,3,2)
> >>> >> str(body(methods::conformMethod)[[at]])
> >>> >> ## language omittedSig <- omittedSig && (signature[omittedSig] != 
> "missing")
> >>> >> cc <- 0L
> >>> >> trace(methods::conformMethod, tracer = quote({
> >>> >>  cc <<- cc + 1L
> >>> >>  print(cc)
> >>> >>  if (cc == 31) {  ## manually identified
> >>> >>untrace(methods::conformMethod)
> >>> >>trace(methods::conformMethod, at = list(at), tracer = quote({
> >>> >>  str(list(signature = signature, mnames = mnames, fnames = 
> fnames))
> >>> >>  print(ls())
> >>> >>  try(str(list(omittedSig = omittedSig, signature = signature)))
> >>> >>}))
> >>> >>  }
> >>> >> }))
> >>> >> loadNamespace("oligo")
> >>> >>
> >>> >> gives:
> >>> >>
> >>> >> Untracing function "conformMethod" in package "methods"
> >>> >> Tracing function "conformMethod" in package "methods"
> >>> >> Tracing conformMethod(signature, mnames, fnames, f, fdef, 
> definition)
> >>> >> step 12,4,3,2
> >>> >> List of 3
> >>> >> $ signature: Named chr [1:4] "TilingFeatureSet" "ANY" "ANY" "array"
> >>> >>  ..- attr(*, "names")= chr [1:4] "object" "subset" "target" "value"
> >>> >>  ..- attr(*, "package")= chr [1:4] "oligoClasses" "methods" 
> "methods" "methods"
> >>> >> $ mnames   : chr [1:2] "object" "value"
> >>> >> $ fnames   : chr [1:4] "object" "subset" "target" "value"
> >>> >> [1] "f"  "fdef"   "fnames" "fsig"   "imf"
> >>> >> [6] "method" "mnames" "omitted""omittedSig" "sig0"
> >>> >> [11] "sigNames"   "signature"
> >>> >> List of 2
> >>> >> $ omittedSig: logi [1:4] FALSE TRUE TRUE FALSE
> >>> >> $ signature : Named chr [1:4] "TilingFeatureSet" "ANY" "ANY" 
> "array"
> >>> >>  ..- attr(*, "names")= chr [1:4] "object" "subset" "target" "value"
> >>> >>  ..- attr(*, "package")= chr [1:4] "oligoClasses" "methods" 
> "methods" "methods"
> >>> >> Error in omittedSig && (signature[omittedSig] != "missing") :
> >>> >>  'length(x) = 4 > 1' in coercion to 'logical(1)'
> >>> >> Error: unable to load R code in package 'oligo'
> >>> >>
> >>> >
> >>> > Thank you, Henrik, nice piece of using trace() .. and the above
> >>> > is useful for solving the issue --  I c
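The `'length(x) = 4 > 1' in coercion to 'logical(1)'` error in the traced
output above comes from `&&` receiving a length-4 operand; recent versions
of R reject that rather than silently using only the first element. Below
is a minimal sketch of the difference between `&` and `&&`, reusing the
values from the str() output — an illustration only, not the actual fix
applied in the methods package:

```r
# Values taken from the str() output above.
omittedSig <- c(FALSE, TRUE, TRUE, FALSE)
signature  <- c(object = "TilingFeatureSet", subset = "ANY",
                target = "ANY", value = "array")

# `&` is vectorized, so this evaluates elementwise:
ok <- omittedSig & (signature[omittedSig] != "missing")
print(ok)

# `&&` requires length-1 operands; here the left-hand side has length 4,
# which recent versions of R reject with the error shown above:
res <- tryCatch(
  omittedSig && (signature[omittedSig] != "missing"),
  error = function(e) conditionMessage(e)
)
print(res)
```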

Re: [Bioc-devel] Bioconductor Git: Online interface

2020-02-11 Thread Henrik Bengtsson
On Tue, Feb 11, 2020 at 2:42 PM Martin Morgan  wrote:
>
> (sending again from / to an appropriate email address, sorry for the noise)
>
> Henrik -- I appreciate the ease with which gitea can be deployed in this 
> one-off solution but cynically think that a real deployment would introduce 
> significant work, e.g., re-tooling our approach to new package addition, 
> management of user credentials, and integration of the nightly build system.

Among those examples, the only thing I see would be the workload of
adding/removing packages from Gitea so that it reflects what's on the
Git server.  Since Gitea can automatically sync with a Git server via
a URL, I don't see how things such as nightly builds come into play.
User credentials should also not matter.  There should be no backup
needs other than possibly one admin account with Swagger API access.
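As a sketch of how little automation the sync would need: Gitea's
repository-migration endpoint (POST /api/v1/repos/migrate) can register a
read-only pull-mirror of an existing repository.  The host and token below
are placeholders, and the curl call itself is commented out since it needs
a live Gitea instance:

```shell
#!/bin/sh
# Sketch: register a read-only pull-mirror of a Bioconductor package
# on a Gitea instance.  Host and token are placeholders.
payload='{
  "clone_addr": "https://git.bioconductor.org/packages/aroma.light",
  "repo_name": "aroma.light",
  "mirror": true
}'

# Against a live instance this would be something like:
# curl -X POST "https://gitea.example.org/api/v1/repos/migrate" \
#   -H "Authorization: token $GITEA_TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$payload"

printf '%s\n' "$payload"
```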

>
> If you'd like to work on a more complete (off-line, so as not to present a 
> confusion of interfaces for the user and developer) implementation I'd be 
> happy to provide some pointers to the major challenges.

I proposed Gitea because I know it well now, it has GitHub-like
features that people already know about (e.g. linking to code
snippets:
https://gitea.com/hb/aroma.light/src/branch/master/R/iwpca.R#L147-L152),
and it offers more potential features if you choose to go down that
route. I know there are other Git-to-webpage tools out there that are
purely designed for viewing. It might be that those are a safer route
to go down. Given how easy it is to set up Gitea for this, and that
Gitea has many more features, I'd be surprised if it weren't just as
easy for those tools too.  They might even be zero-config, e.g. point
them to the folder where the git repositories live and you're ready
to go live.

Feel free to forward discussions/ideas (better if already public
somewhere; there might be others who can pitch in too), but
unfortunately I don't have many spare cycles to work on this.

Being able to browse code and link to code when reaching out to
maintainers is something Bioconductor is really missing right now.
This would lower the friction for contributing to packages that you
don't work on on a regular basis, e.g. typos, bug fixes, etc.

Henrik

>
> Martin
>
> On 2/11/20, 5:16 PM, "Bioc-devel on behalf of Henrik Bengtsson" 
>  
> wrote:
>
> I wanna revive this old thread.
>
> I've used Gitea for internal git/issue trackers at the UCSF for quite
> a while now and it works really well.  I've also looked into how easy
> it would be to use it for pure code exposure and it's pretty
> straightforward.  Gitea even has built-in tools for automatically
> synchronizing with an existing git server *and* keeping it up-to-date,
> which makes the process even easier(*).  An example what this looks
> like is:
>
>   https://gitea.com/hb/aroma.light
>
> That took me literally 15 seconds to set up.  Note how this is almost
> a bare-bones *read-only* git code browser, e.g. there's no issue
> tracker.  Right now, it would require a teeny hack on the Gitea server
> to have the 'git clone' URL to map to the official Bioconductor URL
> (or to hide it completely).  It's also possible to link the 'Issue
> Tracker' to, say, GitHub issues based on what's in BugReports:, e.g.
>
>   https://gitea.com/hb/QDNAseq
>
> Note that the above read-only approach completely avoids having to map
> or maintain user accounts; it's a pure read-only online viewer of the
> Bioconductor git repositories.  (Depending on your setup, it might
> even be that you can expose the Gitea interface as
> https://git.bioconductor.org/ and have
> https://git.bioconductor.org/packages/ take you to the Gitea
> package page.)  There's a Swagger API that makes it possible to
> automate everything, e.g. creating new repositories for new Bioc
> packages.
>
> I'd classify this as a low risk and straightforward project to implement.
>
> /Henrik
>
> (*) Technical details: The most efficient approach would probably be
> to link to the existing git repositories via the file system, rather
> than going over ssh/https. Linking via the file system avoids any
> duplication.  The storage load of running such a Gitea instance would
> be very, very small.  The Gitea instance doesn't need write permissions
> to the repositories, so there is no risk of Gitea messing up existing
> git repositories.
>
> On Thu, Oct 26, 2017 at 1:51 AM Martin Morgan
>  wrote:
> >
> > There has been previous discussion about this.
> >
> >https://stat.ethz.ch/pipermail/bioc-devel/2017-September/011455.html
> >
> > It is not in our short-term plan
