Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-22 Thread Dirk Eddelbuettel


On 22 May 2024 at 13:54, Nixon, Michelle Pistner wrote:
| Thank you both for your responses and help! Kurt-- your message makes a lot of
| sense. I'll try to debug soon and will reach out if I have more questions.

Interesting.

Kurt, is there a recommended way to test for this (rare, I may add) case of
'OpenMP present but usage verboten by R' ?  I do not see an option for 'R CMD
config' jumping out, and there is no pkg-config file either.

Testing via 'nm' as you show is possible but not exactly 'portable'.  So any
suggestions as to what to condition on here?  Michelle did AFAICT the Right
Thing (TM) by 'borrowing' from the fairly mature check in RcppArmadillo.
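One possibility (an untested sketch of my own, Linux-centric, not an established idiom) would be to extend the compile test with a load test: `R CMD SHLIB` may link despite unresolved symbols, but `dyn.load()` binds symbols eagerly by default and will error out if `omp_*` cannot be resolved, which is exactly the failure mode seen on that machine:

```sh
## Hypothetical configure fragment: compile as before, then also try to
## load the resulting shared object from R itself
cat <<EOF > test-omp.cpp
#include <omp.h>
int main() {
  return omp_get_num_threads();
}
EOF
openmp_already_works="no"
if "${R_HOME}/bin/R" CMD SHLIB test-omp.cpp >/dev/null 2>&1; then
    ## dyn.load() resolves symbols immediately, so an unresolved
    ## omp_get_num_threads makes this exit non-zero even though linking succeeded
    if "${R_HOME}/bin/Rscript" -e 'dyn.load("test-omp.so")' >/dev/null 2>&1; then
        openmp_already_works="yes"
    fi
fi
rm -f test-omp.cpp test-omp.o test-omp.so
```

The shared object suffix would need the usual platform adjustment (.so here; other platforms differ).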

Dirk

| 
| Thanks,
| Michelle
| 
━━━
| From: Kurt Hornik 
| Sent: Wednesday, May 22, 2024 3:57 AM
| To: Dirk Eddelbuettel 
| Cc: Nixon, Michelle Pistner ; r-package-devel@r-project.org
| ; Kurt Hornik 
| Subject: Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang
| with OpenMP
|  
| 
| >>>>> Dirk Eddelbuettel writes:
| 
| Friends, the Debian pretest check system uses LLVM 18 and has the
| corresponding OpenMP headers and libraries installed, but R is
| configured not to use these.  However, the configure test in fido does
| 
| cat <<EOF > test-omp.cpp
| #include <omp.h>
| int main() {
|   return omp_get_num_threads();
| }
| EOF
| 
| ## Execute R CMD SHLIB.
| "${R_HOME}/bin/R" CMD SHLIB test-omp.cpp >/dev/null 2>&1
| if test x"$?" = x"0"; then
| AC_MSG_RESULT([yes])
| openmp_already_works="yes"
| else
| AC_MSG_RESULT([no])
| fi
| 
| which does not work as intended: R CMD SHLIB will happily link but the
| .so will end up with an unresolved symbol:
| 
| $ nm test-omp.so | grep omp_
|  U omp_get_num_threads
| 
| Hth
| -k
| 
| > Hi Michelle,
| 
| > On 21 May 2024 at 13:46, Nixon, Michelle Pistner wrote:
| > | Hi all,
| > |
| > | I'm running into build issues for my package (fido:
| https://github.com/jsilve24/fido) on the
| r-devel-linux-x86_64-debian-clang system on CRAN (full check log here:
| https://win-builder.r-project.org/incoming_pretest/fido_1.1.0_20240515_211644/Debian/00install.out).
| fido relies on several of the Rcpp packages, and I think the error is due to how
| OpenMP is set up in our package. The error in question states:
| > |
| > | "Error: package or namespace load failed for 'fido' in dyn.load(file,
| DLLpath = DLLpath, ...):
| > |  unable to load shared object
| '/home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/00new/fido/libs/fido.so':
| > |   /home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/
| 00new/fido/libs/fido.so: undefined symbol: omp_get_thread_num"
| > |
| > | I've had a hard time recreating the error, as I can successfully get the
| package to build on other systems (GitHub action results here:
| https://github.com/jsilve24/fido/actions) including a
| system using the same version of R/clang as the failing CRAN check. Looking at
| the logs between the two, the major difference is the lack of -fopenmp in the
| compiling function on the CRAN version (which is there on the r-hub check
| version with the same specifications):
| > |
| > | (From the CRAN version) clang++-18 -std=gnu++17 -shared
| -L/home/hornik/tmp/R-d-clang-18/lib -Wl,-O1 -o fido.so ConjugateLinearModel.o
| MaltipooCollapsed_LGH.o MaltipooCollapsed_Optim.o MatrixAlgebra.o
| PibbleCollapsed_LGH.o PibbleCollapsed_Optim.o PibbleCollapsed_Uncollapse.o
| PibbleCollapsed_Uncollapse_sigmaKnown.o RcppExports.o SpecialFunctions.o
| test_LaplaceApproximation.o test_MultDirichletBoot.o test_utils.o
| -L/home/hornik/tmp/R-d-clang-18/lib -lR

Re: [R-pkg-devel] handling documentation build tools

2024-05-21 Thread Dirk Eddelbuettel


As lyx is not listed in 'Writing R Extensions', the one (authoritative) manual
describing how to build packages for R, I would not assume it to be present
on every CRAN machine building packages. Also note that several users recently
had to ask here how to deal with less common fonts for style files for
(pdf)latex.

So I would recommend 'localising' the pdf creation to your own machine, and
to ship the resulting pdf. You can have pre-made pdfs as the core of a vignette,
a trick I quite like to make package building simpler and more robust.  See
https://www.r-bloggers.com/2019/01/add-a-static-pdf-vignette-to-an-r-package/
for details.
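From memory (the blog post is the authoritative reference), the recipe amounts to building the PDF once locally, shipping it under vignettes/, and registering it via the 'asis' engine of the R.rsp package; file names below are illustrative:

```text
# DESCRIPTION (excerpt)
Suggests: R.rsp
VignetteBuilder: R.rsp

# vignettes/ contains the pre-built PDF plus a two-line stub:
#   vignettes/intro.pdf
#   vignettes/intro.pdf.asis
# where intro.pdf.asis holds only the vignette metadata:
#   %\VignetteIndexEntry{Introduction to the package}
#   %\VignetteEngine{R.rsp::asis}
```

At build time R.rsp simply copies the PDF through, so no latex toolchain is needed on the build machine.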

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Compile issues on r-devel-linux-x86_64-debian-clang with OpenMP

2024-05-21 Thread Dirk Eddelbuettel

Hi Michelle,

On 21 May 2024 at 13:46, Nixon, Michelle Pistner wrote:
| Hi all,
| 
| I'm running into build issues for my package (fido: 
https://github.com/jsilve24/fido) on the r-devel-linux-x86_64-debian-clang 
system on CRAN (full check log here: 
https://win-builder.r-project.org/incoming_pretest/fido_1.1.0_20240515_211644/Debian/00install.out).
 fido relies on several of the Rcpp packages, and I think the error is due to 
how OpenMP is set up in our package. The error in question states:
| 
| "Error: package or namespace load failed for 'fido' in dyn.load(file, DLLpath 
= DLLpath, ...):
|  unable to load shared object 
'/home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/00new/fido/libs/fido.so':
|   
/home/hornik/tmp/R.check/r-devel-clang/Work/build/Packages/00LOCK-fido/00new/fido/libs/fido.so:
 undefined symbol: omp_get_thread_num"
| 
| I've had a hard time recreating the error, as I can successfully get the 
package to build on other systems (GitHub action results here: 
https://github.com/jsilve24/fido/actions) including a system using the same 
version of R/clang as the failing CRAN check. Looking at the logs between the 
two, the major difference is the lack of -fopenmp in the compiling function on 
the CRAN version (which is there on the r-hub check version with the same 
specifications):
| 
| (From the CRAN version) clang++-18 -std=gnu++17 -shared 
-L/home/hornik/tmp/R-d-clang-18/lib -Wl,-O1 -o fido.so ConjugateLinearModel.o 
MaltipooCollapsed_LGH.o MaltipooCollapsed_Optim.o MatrixAlgebra.o 
PibbleCollapsed_LGH.o PibbleCollapsed_Optim.o PibbleCollapsed_Uncollapse.o 
PibbleCollapsed_Uncollapse_sigmaKnown.o RcppExports.o SpecialFunctions.o 
test_LaplaceApproximation.o test_MultDirichletBoot.o test_utils.o 
-L/home/hornik/tmp/R-d-clang-18/lib -lR
| 
| My initial thought was an issue in the configure scripts (which we borrowed 
heavily from RcppArmadillo but made slight changes to, which is the most likely 
cause if there is an issue here) or that there is some mismatch somewhere as to 
whether or not OpenMP is available, but there isn't an obvious bug to me.
| 
| Any guidance on how to debug would be greatly appreciated!

I seem to recall that that machine is 'known-bad' for OpenMP due to the
reliance on clang-18 which cannot (?) build with it.  Might be best to
contact Kurt Hornik (CC'ed) and/or CRAN.

Best, Dirk

 
| Thanks,
| Michelle
| 
| Michelle Nixon, PhD
| 
| Assistant Research Professor
| College of Information Sciences and Technology
| The Pennsylvania State University
| 
|   [[alternative HTML version deleted]]
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN packages dependency on bioconductor packages

2024-05-16 Thread Dirk Eddelbuettel


On 16 May 2024 at 05:34, Duncan Murdoch wrote:
| I forget now, but presumably the thinking at the time was that Suggested 
| packages would always be available for building and checking vignettes.

Yes. I argued for years (cf https://dirk.eddelbuettel.com/blog/2017/03/22/
from seven (!!) years ago) and CRAN is slowly moving away from that implicit
'always there' guarantee to preferring explicit enumerations -- and now even
tests via the NoSuggests flavour.

As Uwe stated in this thread, having the vignette dependencies both in
Suggests as well as in the VignetteHeader should do. And it is the Right
Thing (TM) to do.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Overcoming CRAN's 5mb vendoring requirement

2024-05-09 Thread Dirk Eddelbuettel


Software Heritage (see [1] for their website and [2] for a brief intro I gave
at useR! 2019 in Toulouse) covers GitHub and CRAN [3]. It is by now 'in
collaboration with UNESCO', supported by a long and posh list of sponsors [4]
and about as good as it gets to 'ensure longevity of artifacts'.

It is of course not meant for downloads during frequent builds.

But given the 'quasi-institutional nature' and sponsorship, we could think of
using GitHub as an 'active cache'. But CRAN is CRAN and as it now stands
GitHub is not trusted.  ¯\_(ツ)_/¯

Dirk


[1] https://www.softwareheritage.org/
[2] https://dirk.eddelbuettel.com/papers/useR2019_swh_cran_talk.pdf
[3] https://www.softwareheritage.org/faq/ question 2.1
[4] https://www.softwareheritage.org/support/sponsors/
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Fast Matrix Serialization in R?

2024-05-08 Thread Dirk Eddelbuettel


On 9 May 2024 at 03:20, Sameh Abdulah wrote:
| I need to serialize and save a 20K x 20K matrix as a binary file.

Hm that is an incomplete specification: _what_ do you want to do with it?
Read it back in R?  Share it with other languages (like Python) ? I.e. what
really is your use case?  Also, you only seem to use readBin / writeBin. Why
not readRDS / saveRDS which at least give you compression?

If it is to read/write from/to R, look into the qs package. It is good. The
README.md at its repo has benchmarks: https://github.com/traversc/qs  If you
want to index into the stored data, look into fst. Else also look at databases.
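To make the round trip concrete, here is a small illustrative sketch (my own, not from the original mail; it uses a 2000 x 2000 matrix rather than the 20K x 20K one to keep memory modest, and assumes the qs package is installed):

```r
## Serialize a matrix with base R, then with qs, and check both round-trip
m <- matrix(rnorm(2000 * 2000), nrow = 2000)

saveRDS(m, "m.rds")          # base R, compressed by default
m2 <- readRDS("m.rds")

library(qs)
qsave(m, "m.qs")             # qs: typically much faster at comparable sizes
m3 <- qread("m.qs")

stopifnot(identical(m, m2), identical(m, m3))
```

For cross-language use (e.g. with Python) a typed format such as HDF5 or Arrow would be the more natural choice, which is why the use case matters.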

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Overcoming CRAN's 5mb vendoring requirement

2024-05-08 Thread Dirk Eddelbuettel


On 8 May 2024 at 11:02, Josiah Parry wrote:
| CRAN has rejected this package with:
| 
| *   Size of tarball: 18099770 bytes*
| 
| *Please reduce to less than 5 MB for a CRAN package.*

Are you by chance confusing a NOTE (issued, but can be overruled) with a
WARNING (more severe, likely a must-be-addressed) or ERROR?

There are lots and lots of packages larger than 5mb -- see eg

   https://cran.r-project.org/src/contrib/?C=S;O=D

which has a top-5 of

   rcdklibs   19mb
   fastrmodels15mb
   prqlr  15mb
   RFlocalfdr 14mb
   acss.data  14mb

and at least one of those is also Rust-using and hence a possible template.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Patches for CVE-2024-27322

2024-04-30 Thread Dirk Eddelbuettel


On 30 April 2024 at 11:59, peter dalgaard wrote:
| svn diff -c 86235 ~/r-devel/R

Which is also available as
  
https://github.com/r-devel/r-svn/commit/f7c46500f455eb4edfc3656c3fa20af61b16abb7

Dirk

| (or 86238 for the port to the release branch) should be easily backported.
| 
| (CC Luke in case there is more to it)
| 
| - pd
| 
| > On 30 Apr 2024, at 11:28 , Iñaki Ucar  wrote:
| > 
| > Dear R-core,
| > 
| > I just received notification of CVE-2024-27322 [1] in RedHat's Bugzilla. We
| > updated R to v4.4.0 in Fedora rawhide, F40, EPEL9 and EPEL8, so no problem
| > there. However, F38 and F39 will stay at v4.3.3, and I was wondering if
| > there's a specific patch available, or if you could point me to the commits
| > that fixed the issue, so that we can cherry-pick them for F38 and F39.
| > Thanks.
| > 
| > [1] https://nvd.nist.gov/vuln/detail/CVE-2024-27322
| > 
| > Best,
| > -- 
| > Iñaki Úcar
| > 
| > [[alternative HTML version deleted]]
| > 
| > __
| > R-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-devel
| 
| -- 
| Peter Dalgaard, Professor,
| Center for Statistics, Copenhagen Business School
| Solbjerg Plads 3, 2000 Frederiksberg, Denmark
| Phone: (+45)38153501
| Office: A 4.23
| Email: pd@cbs.dk  Priv: pda...@gmail.com
| 
| __
| R-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Problem with loading package "devtools" from CRAN.

2024-04-29 Thread Dirk Eddelbuettel


On 30 April 2024 at 01:21, Rolf Turner wrote:
| On Mon, 29 Apr 2024 06:30:20 -0500
| Dirk Eddelbuettel  wrote:
| 
| 
| 
| > These days, I strongly recommend r2u [1].  As you already use R via
| > CRAN through apt, r2u adds one more repository after which _all_ R
| > packages are handled via the same apt operations that you already
| > trust to get you R from CRAN (as well as anything else on your
| > machine).  This covers all 20+ thousand CRAN packages along with 400
| > key BioC packages. Handling your packages with your system package
| > manager guarantees all dependencies are resolved reliably and
| > quickly. It makes installing, upgrading, and managing CRAN packages
| > easier, faster and more reliable.
| 
| 
| 
| > [1] https://eddelbuettel.github.io/r2u
| 
| 
| 
| Sounds promising, but I cannot follow what "r2u" is actually
| all about.  What *is* r2u?  And how do I go about using it?  Do I
| invoke it (or invoke something) from within R?  Or do I invoke
| something from the OS?  E.g. something like
| 
| sudo apt-get install 
| 
| ???

You could peruse the documentation at

  https://eddelbuettel.github.io/r2u

and / or the blogposts I have especially below

  https://dirk.eddelbuettel.com/blog/code/r4/

(and you may have to read 'in reverse order').

| I have downloaded the file add_cranapt_jammy.sh and executed
| 
|sudo sh add_cranapt_jammy.sh
| 
| which seemed to run OK.  What now?

Briefly, when you set up r2u you set up a new apt repo AND a new way to
access it from R (using the lovely `bspm` package).  So in R saying
`install.packages("devtools")` will seamlessly fetch r-cran-devtools and
about 100 other files it depends upon (if you start from an 'empty' system as
I did in a container last eve before replying to you). That works in mere
seconds. You can then say `library(devtools)` as if you had compiled locally.

Naturally, using binaries is both way faster and easier when it works (as this
generally does). See the blog posts, see the demos, see the r2u site, try it
(risklessly !!) in a container or at gitpod or in continuous integration or
in codespaces or ...
 
The docs try to get at that. Maybe start small and aim `install.packages()`
at a package you know you do not have and see what happens?

Follow-ups may be more appropriate for r-sig-debian, and/or an issue ticket
at the r2u github repo depending on nature of the follow-up.

Good luck,  Dirk


-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Problem with loading package "devtools" from CRAN.

2024-04-29 Thread Dirk Eddelbuettel


Rolf,

This question might have been more appropriate for r-sig-debian than here.
But as Simon noted, the lack of detail makes it difficult to say anything to
aid. It likely was an issue local to your setup and use.

These days, I strongly recommend r2u [1].  As you already use R via CRAN
through apt, r2u adds one more repository after which _all_ R packages are
handled via the same apt operations that you already trust to get you R from
CRAN (as well as anything else on your machine).  This covers all 20+
thousand CRAN packages along with 400 key BioC packages. Handling your
packages with your system package manager guarantees all dependencies are
resolved reliably and quickly. It makes installing, upgrading, and managing CRAN
packages easier, faster and more reliable.

To double-check, I just spot-checked 'devtools' on an r2u container (on top
of Ubuntu 22.04) and of course devtools installs and runs fine (as a binary).
So maybe give r2u a go. "Sixteen million packages served" in two years ...

Cheers, Dirk

[1] https://eddelbuettel.github.io/r2u

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Question regarding .make_numeric_version with non-character input

2024-04-25 Thread Dirk Eddelbuettel


Hi Kurt,

On 25 April 2024 at 08:07, Kurt Hornik wrote:
| > Hervé Pagès writes:
| 
| > Hi Kurt,
| > Is it intended that numeric_version() returns an error by default on
| > non-character input in R 4.4.0? 
| 
| Dear Herve, yes, that's the intention.
| 
| > It seems that I can turn this into a warning by setting
| > _R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_=false but I don't
| > seem to be able to find any of this mentioned in the NEWS file.
| 
| That's what I added for smoothing the transition: it will be removed
| from the trunk shortly.

It would actually be nice to have a more robust variant for non-CRAN
versions. For example, I just had to do a local hack to be able to use what
the QuantLib 'rc' 1.34-rc reported (when I then used the R facilities to
condition code and tests on whether I was dealing with code before or after
an API transition).  So as a wishlist: could you envision an extension to
package_version() casting that, say, removes all [a-zA-Z]+ first (if opted
into) ?
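For illustration, the wished-for leniency could look like this (my own sketch, not an existing R facility; the regex only targets a trailing alphabetic tag such as "-rc"):

```r
## Drop a trailing pre-release tag before parsing as a version
ver <- sub("[-.]?[A-Za-z]+$", "", "1.34-rc")   # "1.34"
stopifnot(package_version(ver) >= "1.34")      # now comparable as usual
```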

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] read.csv

2024-04-16 Thread Dirk Eddelbuettel


As an aside, the odd format does not seem to bother data.table::fread() which
also happens to be my personally preferred workhorse for these tasks:

> fname <- "/tmp/r/filename.csv"
> read.csv(fname)
   Gene SNP prot log10p
1 YWHAE 13:62129097_C_T 1433   7.35
2 YWHAE 4:72617557_T_TA 1433   7.73
> data.table::fread(fname)
 Gene SNP   prot log10p

1:  YWHAE 13:62129097_C_T  1433E   7.35
2:  YWHAE 4:72617557_T_TA  1433E   7.73
> readr::read_csv(fname)
Rows: 2 Columns: 4
── Column specification 
──
Delimiter: ","
chr (2): Gene, SNP
dbl (2): prot, log10p

ℹ Use `spec()` to retrieve the full column specification for this data.
ℹ Specify the column types or set `show_col_types = FALSE` to quiet this 
message.
# A tibble: 2 × 4
  Gene  SNP  prot log10p

1 YWHAE 13:62129097_C_T  1433   7.35
2 YWHAE 4:72617557_T_TA  1433   7.73
> 

That's on Linux, everything current but dev version of data.table.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] read.csv

2024-04-16 Thread Dirk Eddelbuettel


On 16 April 2024 at 10:46, jing hua zhao wrote:
| Dear R-developers,
| 
| I came to a somewhat unexpected behaviour of read.csv() which is trivial but 
worthwhile to note -- my data involves a protein named "1433E" but to save 
space I drop the quote so it becomes,
| 
| Gene,SNP,prot,log10p
| YWHAE,13:62129097_C_T,1433E,7.35
| YWHAE,4:72617557_T_TA,1433E,7.73
| 
| Both read.csv() and readr::read_csv() consider the prot(ein) name as (possibly 
confused by scientific notation) numeric 1433 which only alerts me when I tried 
to combine data,
| 
| all_data <- data.frame()
| for (protein in proteins[1:7])
| {
|cat(protein,":\n")
|f <- paste0(protein,".csv")
|if(file.exists(f))
|{
|  p <- read.csv(f)
|  print(p)
|  if(nrow(p)>0) all_data  <- bind_rows(all_data,p)
|}
| }
| 
| proteins[1:7]
| [1] "1433B" "1433E" "1433F" "1433G" "1433S" "1433T" "1433Z"
| 
| dplyr::bind_rows() failed to work due to incompatible types nevertheless 
rbind() went ahead without warnings.

You may need to reconsider aiding read.csv() (and alternate reading
functions) by supplying column-type info instead of relying on educated
heuristic guesses which appear to fail here due to the nature of your data.

Other storage formats can store type info. That is generally safer and may be
an option too.
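For instance, pinning just the affected column (an illustrative sketch, where `f` is one of the per-protein csv files from the original mail):

```r
## base R: declare prot as character so "1433E" survives intact
p <- read.csv(f, colClasses = c(prot = "character"))

## readr equivalent
p <- readr::read_csv(f,
                     col_types = readr::cols(prot = readr::col_character()))
```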

I think this was more of an email for r-help than r-devel.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] RSS Feed of NEWS needs a hand

2024-04-02 Thread Dirk Eddelbuettel


On 2 April 2024 at 09:41, Duncan Murdoch wrote:
| On 02/04/2024 8:50 a.m., Dirk Eddelbuettel wrote:
| > On 2 April 2024 at 07:37, Dirk Eddelbuettel wrote:
| > blosxom, simple as it is, takes (IIRC) filesystem ctime as the posting
| > timestamp so would be best if you had a backup with the old timestamps.
| > 
| 
| Looks like those dates are gone -- the switch from svn to git involved 
| some copying, and I didn't preserve timestamps.

You can recreate them. Nobody cares too much about the hour or minute within a
day as there (always ? generally ?) was only one post per day.  But preserving
the overall sort order would be nice, as would not spamming the recent posts
with old ones.

| I'll see about regenerating the more recent ones.  I don't think there's 
| much historical interest in the pre-4.0 versions, so maybe I'll just 
| nuke those.

I suspect you will have to do it programmatically too. You could even take
the old timestamps of the svn and/or git commits and then touch the ctime (or
maybe it was mtime, I forget but 'touch --time=  file' works). "Been
there done that" for part of my 20+ year old blog infrastructure too.
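A programmatic version of that could look like the following sketch (file name and the git-checkout assumption are mine):

```shell
# Sketch: restore each file's mtime, either from the last git commit
# touching it (assumes a git checkout) or from an explicit timestamp
restore_mtime() {
    f="$1"
    # ISO-8601 author date of the most recent commit for this file
    d=$(git log -1 --format=%aI -- "$f")
    [ -n "$d" ] && touch --date="$d" "$f"
}

# touch also accepts explicit timestamps directly:
touch --date='2020-04-24 12:00:00' NEWS-4.0.0
date -r NEWS-4.0.0 +%F    # prints 2020-04-24
```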

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] RSS Feed of NEWS needs a hand

2024-04-02 Thread Dirk Eddelbuettel


On 2 April 2024 at 07:37, Dirk Eddelbuettel wrote:
| 
| On 2 April 2024 at 08:21, Duncan Murdoch wrote:
| | I have just added R-4-4-branch to the feeds.  I think I've also fixed 
| | the \I issue, so today's news includes a long list of old changes.
| 
| These feeds can be fussy: looks like you triggered many updates. Feedly
| currently greets me with 569 new posts (!!) in that channel.

Now 745 -- and the bigger issue seems to be that the 'posted at' timestamp is
wrong and 'current' so all the old posts are now seen as 'fresh'. Hence the
flood ... of unsorted posts.

blosxom, simple as it is, takes (IIRC) filesystem ctime as the posting
timestamp so would be best if you had a backup with the old timestamps.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] RSS Feed of NEWS needs a hand

2024-04-02 Thread Dirk Eddelbuettel


On 2 April 2024 at 08:21, Duncan Murdoch wrote:
| I have just added R-4-4-branch to the feeds.  I think I've also fixed 
| the \I issue, so today's news includes a long list of old changes.

These feeds can be fussy: looks like you triggered many updates. Feedly
currently greets me with 569 new posts (!!) in that channel.

Easy enough to mark as all read -- first off thanks for updating the service!

Dirk, a loyal reader since day one

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-04-02 Thread Dirk Eddelbuettel


On 1 April 2024 at 17:44, Uwe Ligges wrote:
| Untested:
| 
| install.packages() calls available.packages() to find out which packages 
| are available - and passes a "filters" argument if supplied.
| That can be a user defined filter. It should be possible to write a user 
| defined filter which prefers the packages in your local repo.

Intriguing.  Presumably that would work for update.packages() too?

(We actually have a use case at work, and as one way out I created another
side-repo to place a package with an incremented version number so it would
'win' on hightest version; this is due to some non-trivial issues with the
underlying dependencies.)
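Uwe's idea could be sketched roughly as follows (untested, my own illustration; the local repo URL is a placeholder, and note that a user filter must run before the built-in "duplicates" filter, which otherwise collapses to the highest version first):

```r
## Keep the local repo's entry for any package it carries, dropping the
## same package's rows from other repos
prefer_local <- function(ap) {
    local <- ap[, "Repository"] == "http://localhost/myrepo/src/contrib"
    keep  <- local | !(ap[, "Package"] %in% ap[local, "Package"])
    ap[keep, , drop = FALSE]
}

ap <- available.packages(
    filters = list("R_version", "OS_type", "subarch",
                   prefer_local, "duplicates"))
```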

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-03-31 Thread Dirk Eddelbuettel


On 31 March 2024 at 11:43, Martin Morgan wrote:
| So all repositories are consulted and then the result filtered to contain just
| the most recent version of each. Does it matter then what order the
| repositories are visited?

Right. I fall for that too often, as I did here.  The order matters for
.libPaths() where the first match is used; for package installation the highest
version number (from any entry in getOption("repos")) wins.

Thanks for catching my thinko.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Order of repo access from options("repos")

2024-03-31 Thread Dirk Eddelbuettel


Greg,

There are AFAICT two issues here: how R unrolls the named vector that is the
'repos' element in the list 'options', and how your computer resolves DNS for
localhost vs 172.17.0.1.  I would try something like

   options(repos = c(CRAN = "http://localhost:3001/proxy",
 C = "http://localhost:3002",
 B = "http://localhost:3003/proxy",
 A = "http://localhost:3004"))

or the equivalent with 172.17.0.1. When I do that here I get errors from
first to last as we expect:

   > options(repos = c(CRAN = "http://localhost:3001/proxy",
 C = "http://localhost:3002",
 B = "http://localhost:3003/proxy",
 A = "http://localhost:3004"))
   > available.packages()
   Warning: unable to access index for repository 
http://localhost:3001/proxy/src/contrib:
 cannot open URL 'http://localhost:3001/proxy/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3002/src/contrib:
 cannot open URL 'http://localhost:3002/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3003/proxy/src/contrib:
 cannot open URL 'http://localhost:3003/proxy/src/contrib/PACKAGES'
   Warning: unable to access index for repository 
http://localhost:3004/src/contrib:
 cannot open URL 'http://localhost:3004/src/contrib/PACKAGES'
Package Version Priority Depends Imports LinkingTo Suggests Enhances 
License License_is_FOSS License_restricts_use OS_type Archs MD5sum 
NeedsCompilation File Repository
   > 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Question regarding .make_numeric_version with non-character input

2024-03-29 Thread Dirk Eddelbuettel


On 29 March 2024 at 17:56, Andrea Gilardi via R-devel wrote:
| Dear all,
| 
| I have a question regarding the R-devel version of .make_numeric_version() 
function. As far as I can understand, the current code 
(https://github.com/wch/r-source/blob/66b91578dfc85140968f07dd4e72d8cb8a54f4c6/src/library/base/R/version.R#L50-L56)
 runs the following steps in case of non-character input:
| 
| 1. It creates a message named msg using gettextf.
| 2. Such object is then passed to stop(msg) or warning(msg) according to the 
following condition
| 
| tolower(Sys.getenv("_R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_") != 
"false")
| 
| However, I don't understand the previous code since the output of 
Sys.getenv("_R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_") != "false" is 
just a boolean value and tolower() will just return "true" or "false". Maybe 
the intended code is 
tolower(Sys.getenv("_R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_")) != 
"false" ? Or am I missing something? 

Yes, agreed -- good catch.  In full, the code is (removing leading
whitespace, and putting it back onto single lines)

  msg <- gettextf("invalid non-character version specification 'x' (type: %s)", 
typeof(x))
  if(tolower(Sys.getenv("_R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_") != 
"false"))
  stop(msg, domain = NA)
  else
  warning(msg, domain = NA, immediate. = TRUE)  

where msg is constant (but reflecting language settings via standard i18n)
and as you note the parentheses appear wrong.  What was intended is likely

  msg <- gettextf("invalid non-character version specification 'x' (type: %s)", 
typeof(x))
  if(tolower(Sys.getenv("_R_CHECK_STOP_ON_INVALID_NUMERIC_VERSION_INPUTS_")) != 
"false")
  stop(msg, domain = NA)
  else
  warning(msg, domain = NA, immediate. = TRUE)  

If you have used bugzilla before and have a handle, maybe file a bug report with
this as a patch at https://bugs.r-project.org/

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] using portable simd instructions

2024-03-27 Thread Dirk Eddelbuettel


On 27 March 2024 at 08:48, jesse koops wrote:
| Thank you, I was not aware of the easy way to search CRAN. I looked at
| rcppsimdjson of course, but couldn't figure it out since it is done in
| the simdjson library if interpret it correclty, not within the R
| ecosystem and I didn't know how that would change things. Writing R
| extensions assumes a lot of  prior knowledge so I will have to work my
| way up to there first.

I think I have (at least) one other package doing something like this _in the
library layer too_ as suggested by Tomas, namely crc32c as used by digest.
You could study how crc32c [0] does this for x86_64 and arm64 to get hardware
optimization. (This may be more specific cpu hardware optimization but at
least the library and cmake files are small.)

I decided as a teenager that assembler wasn't for me and haven't looked back,
but I happily take advantage of it when bundled well. So strong second for
the recommendation by Tomas to rely on this being done in an external and
tested library.

(Another interesting one there is highway [1]. Just packaging that would
likely be an excellent contribution.)

Dirk

[0] repo: https://github.com/google/crc32c
[1] repo: https://github.com/google/highway
docs: https://google.github.io/highway/en/master/


| 
| Op di 26 mrt 2024 om 15:41 schreef Dirk Eddelbuettel :
| >
| >
| > On 26 March 2024 at 10:53, jesse koops wrote:
| > | How can I make this portable and CRAN-acceptable?
| >
| > By writing (or borrowing?) some hardware detection via either configure /
| > autoconf or cmake. This is no different from other tasks decided at
| > install-time.
| >
| > Start with 'Writing R Extensions', as always, and work your way up from
| > there. And if memory serves there are already a few other packages with SIMD
| > at CRAN so you can also try to take advantage of the search for a 'token'
| > (here: 'SIMD') at the (unofficial) CRAN mirror at GitHub:
| >
| >https://github.com/search?q=org%3Acran%20SIMD&type=code
| >
| > Hth, Dirk
| >
| > --
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] paths capability FALSE on devel?

2024-03-27 Thread Dirk Eddelbuettel


On 27 March 2024 at 11:03, Prof Brian Ripley via R-devel wrote:
| On 27/03/2024 10:28, Alexandre Courtiol wrote:
| > Hi all,
| > 
| > I don't know if it is a local issue on my hands or not, but after
| > installing R-devel the output of grDevices::dev.capabilities()$paths is
| > FALSE, while it is TRUE for R 4.3.3.
| > Relatedly, I have issues with plotting paths on devel.
| > 
| > At this stage, I simply would like to know if others running R devel and R
| > 4.3.3 can replicate this behaviour and if there are obvious reasons why the
| > observed change would be expected.
| 
| The help says
| 
|   Query the capabilities of the current graphics device.
| 
| You haven't told us what that was.  See the posting guide for the "at a 
| minimum" information you also did not provide 

Yes, with that I see

> x11()
> grDevices::dev.capabilities()$paths
[1] TRUE
>
> getRversion()
[1] ‘4.5.0’
>
> R.version
   _ 
platform   x86_64-pc-linux-gnu   
arch   x86_64
os linux-gnu 
system x86_64, linux-gnu 
status Under development (unstable)  
major  4 
minor  5.0   
year   2024  
month  03
day27
svn rev86214 
language   R 
version.string R Under development (unstable) (2024-03-27 r86214)
nickname   Unsuffered Consequences   
> 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Check results on r-devel-windows claiming error but tests seem to pass?

2024-03-26 Thread Dirk Eddelbuettel


On 26 March 2024 at 09:37, Dirk Eddelbuettel wrote:
| 
| Avi,
| 
| That was a hiccup and is now taken care of. When discussing this (off-line)
| with Jeroen we (rightly) suggested that keeping an eye on

Typo, as usual, "he (rightly) suggested".  My bad.

D.

| 
|https://contributor.r-project.org/svn-dashboard/
| 
| is one possibility to keep track while we have no status alert system from
| CRAN.  I too was quite confused because a new upload showed errors, and
| win-builder for r-devel just swallowed any uploads.
| 
| Cheers, Dirk
| 
| -- 
| dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] using portable simd instructions

2024-03-26 Thread Dirk Eddelbuettel


On 26 March 2024 at 10:53, jesse koops wrote:
| How can I make this portable and CRAN-acceptable?

By writing (or borrowing?) some hardware detection via either configure /
autoconf or cmake. This is no different from other tasks decided at
install-time.

Start with 'Writing R Extensions', as always, and work your way up from
there. And if memory serves there are already a few other packages with SIMD
at CRAN so you can also try to take advantage of the search for a 'token'
(here: 'SIMD') at the (unofficial) CRAN mirror at GitHub:

   https://github.com/search?q=org%3Acran%20SIMD&type=code

Hth, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Check results on r-devel-windows claiming error but tests seem to pass?

2024-03-26 Thread Dirk Eddelbuettel


Avi,

That was a hiccup and is now taken care of. When discussing this (off-line)
with Jeroen we (rightly) suggested that keeping an eye on

   https://contributor.r-project.org/svn-dashboard/

is one possibility to keep track while we have no status alert system from
CRAN.  I too was quite confused because a new upload showed errors, and
win-builder for r-devel just swallowed any uploads.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] How to store large data to be used in an R package?

2024-03-26 Thread Dirk Eddelbuettel


On 25 March 2024 at 11:12, Jairo Hidalgo Migueles wrote:
| I'm reaching out to seek some guidance regarding the storage of relatively
| large data, ranging from 10-40 MB, intended for use within an R package.
| Specifically, this data consists of regression and random forest models
| crucial for making predictions within our R package.
| 
| Initially, I attempted to save these models as internal data within the
| package. While this approach maintains functionality, it has led to a
| package size exceeding 20 MB. I'm concerned that this would complicate
| submitting the package to CRAN in the future.
| 
| I would greatly appreciate any suggestions or insights you may have on
| alternative methods or best practices for efficiently storing and accessing
| this data within our R package.

Brooke and I wrote a paper on one way of addressing it via a 'data' package
accessible via an Additional_repositories: entry supported by a drat repo.

See https://journal.r-project.org/archive/2017/RJ-2017-026/index.html for the
paper which contains a nice slow walkthrough of all the details.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Request for assistance: error in installing on Debian (undefined symbol: omp_get_num_procs) and note in checking the HTML versions (no command 'tidy' found, package 'V8' unavailable

2024-03-21 Thread Dirk Eddelbuettel


Salut Annaig,

On 21 March 2024 at 09:26, Annaig De-Walsche wrote:
| Dear R-package-devel Community,
| 
| I hope this email finds you well. I am reaching out to seek assistance 
regarding package development in R.
| 
| Specifically, I am currently developing an R package for querying composite 
hypotheses using Rccp. 

My preferred typo. The package is actually called Rcpp (pp as in plus-plus).
 
| Skipping checking HTML validation: no command 'tidy' found
| Skipping checking math rendering: package 'V8' unavailable
| 
| I have searched through the available documentation and resources, but I 
still need help understanding the error and note messages. Hence, I am turning 
to this community, hoping that some of you have encountered similar issues.
| 
| Thank you very much for considering my request. I would be grateful if anyone 
could provide me with some help.
| 
| Best regards,
| Annaïg De Walsche
| Quantitative Genetics and Evolution unit of INRAE
| Gif-sur-Yvette, France
| 

Could you share with us which actual Docker container you started?

| Installing package into ‘/home/docker/R’
| (as ‘lib’ is unspecified)
| 'getOption("repos")' replaces Bioconductor standard repositories, see
| 'help("repositories", package = "BiocManager")' for details.
| Replacement repositories:
| CRAN: https://cloud.r-project.org
| * installing *source* package ‘qch’ ...
| ** using staged installation
| ** libs
| using C++ compiler: ‘g++ (Debian 13.2.0-7) 13.2.0’
| using C++11
| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-I"/usr/local/lib/R/include" -DNDEBUG  -I'/home/docker/R/Rcpp/include' 
-I'/home/docker/R/RcppArmadillo/include' -I/usr/local/include -fpic  -g -O2 
-Wall -pedantic -mtune=native  -c RcppExports.cpp -o RcppExports.o
| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-I"/usr/local/lib/R/include" -DNDEBUG  -I'/home/docker/R/Rcpp/include' 
-I'/home/docker/R/RcppArmadillo/include' -I/usr/local/include -fpic  -g -O2 
-Wall -pedantic -mtune=native  -c updatePrior_rcpp.cpp -o updatePrior_rcpp.o
| updatePrior_rcpp.cpp:55: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|55 |#pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:65: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|65 |  #pragma omp for
|   |
| updatePrior_rcpp.cpp:92: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|92 |  #pragma omp critical
|   |
| updatePrior_rcpp.cpp:178: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   178 |   #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:190: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   190 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:289: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   289 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:301: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   301 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:341: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|   341 | #pragma omp critical
|   |
| updatePrior_rcpp.cpp:409: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   409 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:423: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   423 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:527: warning: ignoring ‘#pragma omp parallel’ 
[-Wunknown-pragmas]
|   527 | #pragma omp parallel num_threads(threads_nb)
|   |
| updatePrior_rcpp.cpp:539: warning: ignoring ‘#pragma omp for’ 
[-Wunknown-pragmas]
|   539 | #pragma omp for
|   |
| updatePrior_rcpp.cpp:580: warning: ignoring ‘#pragma omp critical’ 
[-Wunknown-pragmas]
|   580 | #pragma omp critical
|   |

You seem to be using a number of OpenMP directives. That is good and
performant. But OpenMP cannot be assumed as given; some OSs more or less skip
it altogether, and some platforms or compilers may not have it. I ran into the
same issue earlier trying to test something with clang on Linux: it would not
find the OpenMP library gcc happily finds. I moved on in that (local) use case.

In short, you probably want to condition your use.

| g++ -fsanitize=undefined,bounds-strict -fno-omit-frame-pointer -std=gnu++11 
-shared -L/usr/local/lib/R/lib -L/usr/local/lib -o qch.so RcppExports.o 
updatePrior_rcpp.o -L/usr/local/lib/R/lib -lRlapack -L/usr/local/lib/R/lib 
-lRblas -lgfortran -lm -lubsan -lquadmath -L/usr/local/lib/R/lib -lR
| installing to /home/docker/R/00LOCK-qch/00new/qch/libs
| ** R
| ** data
| *** moving datasets to lazyload DB
| ** byte-compile and prepare package for lazy loading
| 'getOption("repos")' replaces Bioconductor standard repositories, see
| 'help("repositories", package = "BiocManager")' for details.
| Replacement repositories:
| CRAN: https://cloud.r-project.org
| Note: wrong 

Re: [R-pkg-devel] new maintainer for CRAN package XML

2024-03-19 Thread Dirk Eddelbuettel


Dear Uwe,

Did CRAN ever reach a decision here with a suitable volunteer (or group of
volunteers) ?  The state of XML came up again recently on mastodon, and it
might be helpful to share an update if there is one.

Thanks, as always, for all you and the rest of the team do for CRAN.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] RSS Feed of NEWS needs a hand

2024-03-15 Thread Dirk Eddelbuettel


Years ago Duncan set up a nightly job to feed RSS based off changes to NEWS,
borrowing some setup parts from CRANberries as for example the RSS 'compiler'.

That job is currently showing the new \I{...} curly protection in an
unfavourable light. Copying from the RSS reader I had pointed at this since
the start [1], for today I see (indented by four spaces)

CHANGES IN R-devel INSTALLATION on WINDOWS

The makefiles and installer scripts for Windows have been tailored to
\IRtools44, an update of the \IRtools43 toolchain. It is based on GCC 13
and newer versions of \IMinGW-W64, \Ibinutils and libraries (targeting
64-bit Intel CPUs). R-devel can no longer be built using \IRtools43
without changes. 

\IRtools44 has experimental suport for 64-bit ARM (aarch64) CPUs via LLVM
17 toolchain using lld, clang/flang-new and libc++. 

Can some kind soul put a filter over it to remove the \I ?

Thanks,  Dirk

[1] Feedly. Unless we set this up so early that I once used Google
Reader. It's been a while...

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 15:12, Duncan Murdoch wrote:
| On 05/03/2024 2:26 p.m., Dirk Eddelbuettel wrote:
| > The default behaviour is to build after every commit to the main branch.  
But
| > there are options. On the repo I mentioned we use
| > 
| >  "branch": "*release",
| 
| Where do you put that?  I don't see r2u on R-universe, so I guess you're 
| talking about a different repo; which one?

In the (optional) control repo that can drive your 'r-universe', and the file
has to be named 'packages.json'. For you the repo would

https://github.com/dmurdoch/dmurdoch.r-universe.dev

(and the naming rule was tightened by Jeroen recently -- we used to call
these just 'universe', now it has to match your r-universe name)

The file packages.json would then have a block

  {
    "package": "rgl",
    "maintainer": "Duncan Murdoch ",
    "url": "https://github.com/dmurdoch/rgl",
    "available": true,
    "branch": "*release"
  }

The reference I mentioned is our package 'tiledbsoma' (joint work of TileDB
and CZI, in https://github.com/single-cell-data/TileDB-SOMA) and described here:

https://github.com/TileDB-Inc/tiledb-inc.r-universe.dev/blob/master/packages.json
 

(and you can ignore the '"subdir": "apis/r"' which is a facet local to that 
repo).

Note that 'my' packages.json in my eddelbuettel.r-universe.dev, ie

https://github.com/eddelbuettel/eddelbuettel.r-universe.dev/blob/master/packages.json

also describes the package but without the '"branch": "*release"', so that
build runs with every merge to the main branch by my choice; that build is
mine and 'unofficial', giving us two.

| > It is under your control. You could document how to install via `remotes`
| > from that branch.  As so often, it's about trading one thing off for 
another.
| 
| I do that, but my documentation falls off the bottom of the screen, and 
| the automatic docs generated by R-universe are at the top.

I always get lost in the r-universe docs too. Some, as Jeroen kindly reminded
me the other day, are here:  https://github.com/r-universe-org

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 13:28, Duncan Murdoch wrote:
| What I'm seeing is that the tags are ignored, and it is distributing the 
| HEAD of the main branch.  I don't think most users should be using that 
| version:  in my packages it won't have had full reverse dependency 
| checks, I only do that before CRAN releases.  And occasionally it hasn't 
| even passed R CMD check, though that's not my normal workflow.  On the 
| other hand, I like that it's available and easy to install, it just 
| shouldn't be the default install.

The default behaviour is to build after every commit to the main branch.  But
there are options. On the repo I mentioned we use

"branch": "*release",

and now builds occur on tagged releases only. The above is AFAIUI a meta
declaration understood by `remotes`; it was an option suggested by a
colleague.  Naming actual branches also works.
 
| I suppose I could do all development on a "devel" branch, and only merge 
| it into the main branch after I wanted to make a release, but then the 
| R-universe instructions would be no good for getting the devel code.

It is under your control. You could document how to install via `remotes`
from that branch.  As so often, it's about trading one thing off for another.

| I don't know anything about dpkg, but having some options available to 
| package authors would be a good thing.

Yes but you know {install,available}.packages and have some understanding of
how R identifies and installs packages. I merely illustrated a different use
pattern of giving "weights" to repos. If "we all" want different behaviour,
someone has to sit down and write it. Discussing some possible specs and
desired behavior may help. 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 11:56, Duncan Murdoch wrote:
| I have mixed feelings about r-universe.  On the one hand, it is really 
| nicely put together, and it offers the service described above.  On the 
| other, it's probably a bad idea to follow its advice and use 
| install.packages() with `repos` as shown:  that will install development 
| versions of packages, not releases.

Yup. It's a point I raised right at the start as I really do believe in
curated releases but clearly a lot of people prefer the simplicity of
'tagging a release' at GitHub and then getting a build.

r-universe is indeed good at what it does and reliable. There are limited
choices in 'driving' what you can do with it.  We rely quite heavily on it in
a large project for work.  As each 'repo' can appear only once in a universe,
we resorted to having the 'official' build follow GitHub 'releases', as well
as (optional, additional) builds against the main branch from another
universe.  This example is for a non-CRAN package.

With CRAN packages, r-universe can be useful too. For some of my packages, I
now show multiple 'badges' at the README: for the released CRAN version as
well as for the current 'rc' in the main branch sporting a differentiating
final digit.  RcppArmadillo had a pre-release available to test that way for
a few weeks until the new release this week.  So in effect, this gives you what
`drat` allows yet also automagically adds builds. It's quite useful when you
are careful about it.
 
| Do you know if it's possible for a package to suggest the CRAN version 
| first, with an option like the above only offered as a pre-release option?

In the language of Debian and its dpkg and tools, one solution to that would
be 'repository pinning' to declare a 'value' on a repository.  There, the
default is 500, and e.g. for r2u I set this to 700 as you usually want its
versions.
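For reference, an apt pin is a short stanza under /etc/apt/preferences.d/; this is a hedged sketch with a placeholder origin value (the real value for a given repository can be read off `apt-cache policy`):

```
Package: *
Pin: release o=ExampleRepoOrigin
Pin-Priority: 700
```

A priority above 500 makes apt prefer that repository's versions even when another repository carries the same package.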

We do not have this for R, but it could be added (eventually) as a new value
in PACKAGES, or as a new supplementary attribute.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suggesting an archived package in the DESCRIPTION file

2024-03-05 Thread Dirk Eddelbuettel


On 5 March 2024 at 06:25, Duncan Murdoch wrote:
| You could make a compatible version of `survivalmodels` available on a 
| non-CRAN website, and refer to that website in the 
| Additional_repositories field of DESCRIPTION.

Every r-universe sub-site fits that requirement. For this package Google's
first hit was https://raphaels1.r-universe.dev/survivalmodels and it carries
the same line on install.packages() that Jeroen adds to every page:

 install.packages('survivalmodels',
                  repos = c('https://raphaels1.r-universe.dev',
                            'https://cloud.r-project.org'))

So doing all three of
- adding a line 'Additional_repositories: https://raphaels1.r-universe.dev'
- adding a 'Suggests: survivalmodels'
- ensuring conditional use only, as Suggests != Depends
should do.
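In DESCRIPTION the first two items would look roughly like this (a sketch, with the field values taken from this thread):

```
Suggests: survivalmodels
Additional_repositories: https://raphaels1.r-universe.dev
```

The third item means guarding every use in R code with something like `if (requireNamespace("survivalmodels", quietly = TRUE))` so the package still checks and runs when the suggested package is absent.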

| It would be best if you fixed whatever issue caused survivalmodels to be 
| archived when you do this.
| 
| Looking here: 
| 
https://cran-archive.r-project.org/web/checks/2024/2024-03-02_check_results_survivalmodels.html
| that appears very easy to do.  The source is here: 
| https://github.com/RaphaelS1/survivalmodels/ .

The author may even take a PR fixing this going forward.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unable to access log operator in C

2024-02-28 Thread Dirk Eddelbuettel


On 28 February 2024 at 19:05, Avraham Adler wrote:
| I am hoping the solution to this question is simple, but I have not
| been able to find one. I am building a routine in C to be called from
| R. I am including Rmath.h. However, when I have a call to "log", I get
| the error "called object 'log' is not a function or a function
| pointer". When I "trick" it by calling log1p(x - 1), which I *know* is
| exported from Rmath.h, it works.
| 
| More completely, my includes are:
| #include 
| #include 
| #include 
| #include 
| #include  // for NULL
| #include 
| 
| The object being logged is a double, passed into C as an SEXP, call it
| "a", which for now will always be a singleton. I initialize a pointer
| double *pa = REAL(a). I eventually call log(pa[0]), which does not
| compile and throws the error listed above. Switching the call to
| log1p(pa[0] - 1.0) works and returns the proper answer.
| 
| Even including math.h explicitly does not help, which makes sense as
| it is included by Rmath.h.

Can you show the actual line?  Worst case rename your source file to end in
.cpp, include <cmath> and call std::log.

  > Rcpp::cppFunction("double mylog(double x) { return std::log(x); }")
  > mylog(exp(42))
  [1] 42
  > 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available: ‘arrow’

2024-02-25 Thread Dirk Eddelbuettel


On 26 February 2024 at 09:19, Simon Urbanek wrote:
| [requiring an increased version is the] best way [..] and certainly the only good practice.

No, not really. Another viewpoint, which is implemented in another project I
contribute to, is where a version + build_revision tuple exists if, and only
if, the underlying upload was accepted. Until then upload iterations are fine.

Hence s/only good practice/one possible way/.

Anyway: `arrow` is long back at CRAN (yay!) so this thread is done.

Dirk
 
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available: ‘arrow’

2024-02-23 Thread Dirk Eddelbuettel


On 23 February 2024 at 15:53, Leo Mada wrote:
| Dear Dirk & R-Members,
| 
| It seems that the version number is not incremented:
| # Archived
| arrow_14.0.2.1.tar.gz   2024-02-08 11:57  3.9M
| # Pending
| arrow_14.0.2.1.tar.gz   2024-02-08 18:24  3.9M
| 
| Maybe this is the reason why it got stuck in "pending".

No it is not.

The hint to increase version numbers on re-submission is a weaker 'should' or
'might', not a strong 'must'.

I have uploaded a few packages to CRAN over the last two decades, and like
others have made mistakes requiring iterations. I have not once increased a
version number.  If/when CRAN sees an error in its (automated, largely)
processing, the package is moved and the space is cleared allowing a fresh
upload. (Of course you cannot upload under the same filename twice _before_
the initial processing. By default uploads do not overwrite.)  Archive/ is
distinct from pending.  POSIX semantics on times also help: your example
clearly shows that the one in archived is older by about 6 1/2 hours. 

That said, in case there are multiple rounds of email and discussion having
distinct numbers may ease identification of the particular package and
discussion thread. But it still makes sense to have this be a suggestion, not
a requirement.
 
Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package required but not available:‘arrow’

2024-02-22 Thread Dirk Eddelbuettel


On 22 February 2024 at 04:01, Duncan Murdoch wrote:
| For you to deal with this, you should make arrow into a suggested 
| package,

For what it is worth, that is exactly what package tiledb does.

Yet the Suggests: still lead to a NOTE requiring a human to override which
did not happen until I gently nudged after the 'five work days' had lapsed.

So full agreement that 'in theory' a Suggests: should help and is the weaker
and simpler dependency.  However 'in practice' it can still lead to being
held up when the weak-dependency package does not build.

[ As for Dénes's point, most if not all the internals in package tiledb are
  actually on nanoarrow but we offer one code path returning an Arrow Table
  object and that requires 'arrow' the package for the instantiation.

  So it really all boils down to 'Lightweight is the right weight' as we say
  over at www.tinyverse.org.  But given that the public API offers an Arrow
  accessor, it is a little late to pull back from it.  And Arrow is a powerful
  and useful tool. Building it, however, can have its issues... ]

Anyway, while poking around the issue when waiting, I was also told by Arrow
developers that the issue (AFAICT a missing header) is fixed, and looking at
CRAN's incoming reveals the package has been sitting there since Feb 8 (see
https://cran.r-project.org/incoming/pending/).  So would be good to hear from
CRAN what if anything is happening here.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Compiling libR as a standalone C library for java+jni (-fPIC)

2024-02-20 Thread Dirk Eddelbuettel


Salut Pierre,

On 20 February 2024 at 10:33, Pierre Lindenbaum wrote:
| (cross-posted on SO: https://stackoverflow.com/questions/78022766)
| 
| Hi all,
| 
| I'm trying to compile R as a static library with the -fPIC flag so I can use 
it within java+JNI (is it only possible ?), but I cannot find the right flags 
in '.configure' to compile R this way.
| 
| I tested various flags but I cannot find the correct syntax.
| 
| for now, my latest attempt was
| 
| ```
| rm -rvf  "TMP/R-4.3.2" TMP/tmp.tar.gz
| mkdir -p TMP/R-4.3.2/lib/
| wget -O TMP/tmp.tar.gz 
"https://pbil.univ-lyon1.fr/CRAN/src/base/R-4/R-4.3.2.tar.gz;
| cd TMP && tar xfz tmp.tar.gz && rm tmp.tar.gz &&  cd R-4.3.2 && \
|      CPICFLAGS=fpic FPICFLAGS=fpic CXXPICFLAGS=fpic SHLIB_LDFLAGS=shared  
SHLIB_CXXLDFLAGS=shared  ./configure --enable-R-static-lib 
--prefix=/path/to/TMP --with-x=no --disable-BLAS-shlib && make
| 
| ```

Looks like you consistently dropped the '-' from '-fPIC'.
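With the dashes restored the invocation would look roughly like this (an untested sketch; I have not verified that this flag combination produces a usable libR.a for JNI):

```
CPICFLAGS=-fPIC FPICFLAGS=-fPIC CXXPICFLAGS=-fPIC \
SHLIB_LDFLAGS=-shared SHLIB_CXXLDFLAGS=-shared \
./configure --enable-R-static-lib --with-x=no --disable-BLAS-shlib
```

The linker error you quote ("recompile with -fPIC") is exactly what one expects when the static archive was built without position-independent code, so this is the first thing to fix.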

FWIW the Debian (and hence Ubuntu and other derivatives) binaries contain a
libR you can embed. And littler and RInside have done so for maybe 15 years.

Cannot help with JNI but note that the history of the headless (and generally
excellent) Rserve (and its clients) started on Java. Might be worth a try.

Good luck, Dirk

| witch gives the following error during configure:
| 
| 
| ```
| configure: WARNING: you cannot build info or HTML versions of the R manuals
| configure: WARNING: you cannot build PDF versions of the R manuals
| configure: WARNING: you cannot build PDF versions of vignettes and help pages
| make[1]: Entering directory 'R-4.3.2'
| configure.ac:278: error: possibly undefined macro: AM_CONDITIONAL
|    If this token and others are legitimate, please use m4_pattern_allow.
|    See the Autoconf documentation.
| configure.ac:870: error: possibly undefined macro: AC_DISABLE_STATIC
| configure.ac:2226: error: possibly undefined macro: AM_LANGINFO_CODESET
| configure.ac:2876: error: possibly undefined macro: AM_NLS
| configure.ac:2880: error: possibly undefined macro: AM_GNU_GETTEXT_VERSION
| configure.ac:2881: error: possibly undefined macro: AM_GNU_GETTEXT
| make[1]: *** [Makefile:49: configure] Error 1
| 
| ```
| removing the XXXFLAGS=YYY and --prefix (?) allows R to be compiled but it's 
not loaded into Java.
| 
| ```
| gcc  -ITMP -I${JAVA_HOME}/include/ -I${JAVA_HOME}/include/linux \
|      -LTMP/R-4.3.2/lib `TMP/R-4.3.2/bin/R CMD config --cppflags` -shared 
-fPIC -o TMP/libRSession.so  -g RSession.c TMP/R-4.3.2/lib/libR.a
| /usr/bin/ld: TMP/R-4.3.2/lib/libR.a(objects.o): warning: relocation against 
`R_dot_Method' in read-only section `.text'
| /usr/bin/ld: TMP/R-4.3.2/lib/libR.a(altrep.o): relocation R_X86_64_PC32 
against symbol `R_NilValue' can not be used when making a shared object; 
recompile with -fPIC
| /usr/bin/ld: final link failed: bad value
| ```
| 
| Any idea ? Thanks
| 
| Pierre
| 
| __
| R-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Tcl socket server (tcltk) does not work any more on R 4.3.2

2024-02-20 Thread Dirk Eddelbuettel


On 20 February 2024 at 12:27, webmail.gandi.net wrote:
| Dear list,
| 
| It seems that something changed between R 4.2.3 and R 4.3 (tested with 4.3.2) 
that broke the Tcl socket server. Here is a reproducible example:
| 
| - R process #1 (Tcl socket server):
| 
| library(tcltk)
| cmd <- r"(
|  proc accept {chan addr port} { ;# Make a proc to accept connections
|puts "$addr:$port says [gets $chan]" ;# Receive a string
|puts $chan goodbye   ;# Send a string
|close $chan  ;# Close the socket (automatically 
flushes)
| }   ;#
| socket -server accept 12345 ;# Create a server socket)"
| .Tcl(cmd)
| 
| - R process #2 (socket client):
| 
| con <- socketConnection(host = "localhost", port = 12345, blocking = FALSE)
| writeLines("Hello, world!", con) # Should print something in R #1 stdout
| readLines(con) # Should receive "goodbye"
| close(con)
| 
| When R process #1 is R 4.2.3, it works as expected (whatever version of R 
#2). When R process #1 is R 4.3.2, nothing is sent or received through the 
socket apparently, but no error is issued and process #2 seems to be able to 
connect to the socket.
| 
| I am stuck with this. Thanks in advance for help.

From a quick check this issue seems to persist in the (current) R-devel
2024-02-20 r85951 too.

Dirk

| Regards,
| 
| Philippe
| 
| > .Tcl("puts [info patchlevel]")
| 8.6.13
|   
| 
| > sessionInfo()
| R version 4.3.2 (2023-10-31)
| Platform: aarch64-apple-darwin20 (64-bit)
| Running under: macOS Sonoma 14.2.1
| 
| Matrix products: default
| BLAS:   
/System/Library/Frameworks/Accelerate.framework/Versions/A/Frameworks/vecLib.framework/Versions/A/libBLAS.dylib
 
| LAPACK: 
/Library/Frameworks/R.framework/Versions/4.3-arm64/Resources/lib/libRlapack.dylib;
  LAPACK version 3.11.0
| 
| locale:
| [1] en_US.UTF-8/en_US.UTF-8/en_US.UTF-8/C/en_US.UTF-8/en_US.UTF-8
| 
| time zone: Europe/Brussels
| tzcode source: internal
| 
| attached base packages:
| [1] tcltk stats graphics  grDevices utils datasets  methods   base
| 
| loaded via a namespace (and not attached):
| [1] compiler_4.3.2 tools_4.3.2glue_1.7.0  
| __
| R-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] CRAN uses an old version of clang

2024-02-09 Thread Dirk Eddelbuettel


On 9 February 2024 at 08:59, Marcin Jurek wrote:
| I recently submitted an update to my package. Its previous version relied on
| Boost for Bessel and gamma functions but a colleague pointed out to me that
| they are included in the standard library beginning with the C++17
| standard.

There is an often overlooked bit of 'fine print': _compiler support_ for a
C++ standard is not the same as the _compiler shipping a complete library_
for that same standard. This can be frustrating. See the release notes for
gcc/g++ and clang/clang++, IIRC they usually have a separate entry for C++
library support.

In this case, you can probably rely on LinkingTo: BH which has been helping with
Boost headers for over a decade.

Writing R Extensions is also generally careful in reminding us that such
language standard support is always dependent on the compiler at hand. So
package authors ought to check, just like R does via its extensive configure
script when it builds.
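A configure probe can separate the two cases by compiling and linking code that actually *uses* a C++17 library feature, rather than only asking the compiler to accept `-std=c++17`. A minimal sketch, assuming a POSIX shell and (optionally) g++ on the path; the conftest filename and messages are illustrative, and the feature-test macros are the standard ones for the special math functions:

```shell
# Write a test program that fails to compile unless the standard *library*
# ships the C++17 special math functions (e.g. std::cyl_bessel_j).
wd=$(mktemp -d)
cat > "$wd/conftest.cpp" <<'EOF'
#include <cmath>
#if !defined(__cpp_lib_math_special_functions) && !defined(__STDCPP_MATH_SPEC_FUNCS__)
#error "standard library lacks the C++17 special math functions"
#endif
int main() { return std::cyl_bessel_j(0.0, 1.0) > 0.0 ? 0 : 1; }
EOF
# Only a successful compile *and* link proves library support.
if command -v g++ >/dev/null 2>&1 && g++ -std=c++17 "$wd/conftest.cpp" -o "$wd/conftest" 2>/dev/null; then
    msg="standard library provides the C++17 special math functions"
else
    msg="compiler accepts -std=c++17 but the library (or g++ itself) is missing pieces"
fi
echo "$msg"
rm -rf "$wd"
```

The same pattern ports directly into an autoconf `AC_MSG_CHECKING`/`AC_MSG_RESULT` block in a package's configure script.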

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] failing CRAN checks due to problems with dependencies

2024-02-08 Thread Dirk Eddelbuettel


On 8 February 2024 at 13:28, Marcin Jurek wrote:
| Ok, this makes sense! I saw that Rcpp was failing the checks too but I
| wasn't sure if I should resubmit or wait. Thanks!

For completeness, it was not caused by Rcpp but rather by a mix of new clang
and gcc versions which somehow got into each other's way on that machine; and
this has by now been fixed.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] r-oldrel-linux- not in CRAN checks?

2024-02-07 Thread Dirk Eddelbuettel


On 7 February 2024 at 09:15, Vincent van Hees wrote:
| Thanks Ivan, In that case I will conclude that it is time to upgrade my
| Ubuntu 18 machine. I just wasn't sure whether there is still a need for
| keeping my own package Ubuntu 18 compatible, but if dependencies like Rfast
| do not do it and if it is even not in the CRAN checks anymore then there is
| also limited value in me making the effort.

I think that is a good conclusion.  A few more observations:

- for #r2u I package / build all of CRAN for Ubuntu (both 20.04 and 22.04),
  there are a handful of CRAN packages I cannot build on Ubuntu 20.04 (!!!)
  because they use C++20 which the default compiler on 20.04 does not support

- in my dayjob (also behind one large CRAN package I maintain) we had to move
  all CI jobs from 20.04 to 22.04 for the same reason. I think this will be
  more, not less, common.

- your premise in your initial email was not quite supported by either
  "Writing R Extensions" or the "CRAN Repository Policy": Neither stipulates
  a minimum 'old' environment.

FWIW I am a fairly happy camper with 22.04 for deployment, 23.10 for my use
and will likely move to 24.04 fairly soon this summer after it is released.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [Rd] Advice debugging M1Mac check errors

2024-02-04 Thread Dirk Eddelbuettel


On 4 February 2024 at 20:41, Holger Hoefling wrote:
| I wanted to ask if people have good advice on how to debug M1Mac package
| check errors when you don´t have a Mac? Is a cloud machine the best option
| or is there something else?

a) Use the 'mac builder' CRAN offers:
   https://mac.r-project.org/macbuilder/submit.html 

b) Use the newly added M1 runners at GitHub Actions,
   
https://github.blog/changelog/2024-01-30-github-actions-introducing-the-new-m1-macos-runner-available-to-open-source/

Option a) is pretty good as the machine is set up for CRAN and builds
fast. Option b) gives you more control should you need it.
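For option b), a workflow file might look roughly like the following sketch. The `macos-14` label is the Apple-silicon (M1) runner per the linked changelog; the file name is arbitrary, and the r-lib actions are a common but assumed choice, not the only way to run a check:

```yaml
# .github/workflows/mac-arm64-check.yaml  (hypothetical name)
on: [push, pull_request]
jobs:
  R-CMD-check:
    runs-on: macos-14            # GitHub's M1 (arm64) macOS runner
    steps:
      - uses: actions/checkout@v4
      - uses: r-lib/actions/setup-r@v2
      - uses: r-lib/actions/setup-r-dependencies@v2
      - uses: r-lib/actions/check-r-package@v2
```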

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] Bioconductor reverse dependency checks for a CRAN package

2024-01-30 Thread Dirk Eddelbuettel


Ivan,

On 30 January 2024 at 18:56, Ivan Krylov via R-package-devel wrote:
| Hello R-package-devel,
| 
| What would you recommend in order to run reverse dependency checks for
| a package with 182 direct strong dependencies from CRAN and 66 from
| Bioconductor (plus 3 more from annotations and experiments)?
| 
| Without extra environment variables, R CMD check requires the Suggested
| packages to be available, which means installing...
| 
| revdepdep <- package_dependencies(revdep, which = 'most')
| revdeprest <- package_dependencies(
|  unique(unlist(revdepdep)),
|  which = 'strong', recursive = TRUE
| )
| length(setdiff(
|  unlist(c(revdepdep, revdeprest)),
|  unlist(standard_package_names())
| ))
| 
| ...up to 1316 packages. 7 of these suggested packages aren't on CRAN or
| Bioconductor (because they've been archived or have always lived on
| GitHub), but even if I filter those out, it's not easy. Some of the
| Bioconductor dependencies are large; I now have multiple gigabytes of
| genome fragments and mass spectra, but also a 500-megabyte arrow.so in
| my library. As long as a data package declares a dependency on your
| package, it still has to be installed and checked, right?
| 
| Manually installing the SystemRequirements is no fun at all, so I've
| tried the rocker/r2u container. It got me most of the way there, but
| there were a few remaining packages with newer versions on CRAN. For

If that happens, please file an issue ticket at the r2u site.  CRAN should be
current as I update each business day whenever p3m does, and r2u hence will be
as current as approaches using it (and encodes the genuine system dependencies).

BioConductor in r2u is both more manual (I try to update "every few
days") and of course not complete, so if you miss a package _from BioConductor_,
again please just file an issue ticket.

| these, I had to install the system packages manually in order to build
| them from source.

For what it is worth, my own go-to for many years has been a VM in which I
install 'all packages needed' for the rev.dep to be checked. Doing it with
on-demand 'lambda functions' (one per package tested) based on r2u would be a
nice alternative but I don't have the aws credits to try it...

| Someone told me to try the rocker/r-base container together with pak.
| It was more proactive at telling me about dependency conflicts and
| would have got me most of the way there too, except it somehow got me a
| 'stringi' binary without the corresponding libicu*.so*, which stopped
| the installation process. Again, nothing that a bit of manual work
| wouldn't fix, but I don't feel comfortable setting this up on a CI
| system. (Not on every commit, of course - that would be extremely
| wasteful - but it would be nice if it was possible to run these checks
| before release on a different computer and spot more problems this way.)
| 
| I can't help but notice that neither install.packages() nor pak() is
| the recommended way to install Bioconductor packages. Could that
| introduce additional problems with checking the reverse dependencies?

As Martin already told you, BioConductor has always had their own
installation wrapper because they are a 'little different' with the bi-annual
release cycle.
 
| Then there's the check_packages_in_dir() function itself. Its behaviour
| about the reverse dependencies is not very helpful: they are removed
| altogether or at least moved away. Something may be wrong with my CRAN
| mirror, because some of the downloaded reverse dependencies come out
| with a size of zero and subsequently fail the check very quickly.
| 
| I am thinking of keeping a separate persistent library with all the
| 1316 dependencies required to check the reverse dependencies and a

As stated above, that is what works for me. I used to use a chroot directory
on a 'big server', now I use a small somewhat underpowered VM.

| persistent directory with the reverse dependencies themselves. Instead
| of using the reverse=... argument, I'm thinking of using the following
| scheme:
| 
| 1. Use package_dependencies() to determine the list of packages to test.
| 2. Use download.packages() to download the latest version of everything
| if it doesn't already exist. Retry if got zero-sized or otherwise
| damaged tarballs. Remove old versions of packages if a newer version
| exists.
| 3. Run check_packages_in_dir() on the whole directory with the
| downloaded reverse dependencies.
| 
| For this to work, I need a way to run step (3) twice, ensuring that one
| of the runs is performed with the CRAN version of the package in the
| library and the other one is performed with the to-be-released version
| of the package in the library. Has anyone already come up with an
| automated way to do that?
| 
| No wonder nobody wants to maintain the XML package.

Well a few of us maintain packages with quite a tail and cope. Rcpp has 2700,
RcppArmadillo has over 100, BH a few hundred. These aren't 'light'. I wrote
myself the `prrd` package (on 
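The two-pass scheme from the question — checking one set of reverse-dependency tarballs once against the released package and once against the candidate — can be sketched in a few lines of shell. All paths and library names below are hypothetical, and the R command is printed rather than executed so the sketch stands alone:

```shell
# (sketch) two libraries, one per version of the package under test; the
# tarball directory would hold the downloaded reverse dependencies
BASE=$(mktemp -d)
for lib in lib-cran lib-candidate; do
    mkdir -p "$BASE/$lib"
    # install the corresponding version of the package into $BASE/$lib first,
    # then each pass would run something like:
    echo "R_LIBS=$BASE/$lib R -q -e 'tools::check_packages_in_dir(\"$BASE/tarballs\")'"
done
```

Diffing the two `checks` output directories afterwards then isolates regressions introduced by the candidate version.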

Re: [R-pkg-devel] lost braces note on CRAN pretest related to \itemize

2024-01-23 Thread Dirk Eddelbuettel


On 23 January 2024 at 19:39, Patrick Giraudoux wrote:
| Has anyone an idea about what is going wrong ?

\item has no braces following it.  From a package I submitted today and for
which I still have NEWS.Rd in the editor (indented here):

  \section{Changes in version 0.0.22 (2024-01-23)}{
\itemize{
  \item Replace empty examples macros to satisfy CRAN request.
}
  }

Hth,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] New Package Removal because Shared Library Too Large from Debugging Symbols

2024-01-20 Thread Dirk Eddelbuettel


Johann,

On 20 January 2024 at 14:38, Johann Gaebler wrote:
| Hi everyone,
| 
| I received the following message regarding `rar`, a package that I put up
| on CRAN two days ago:
| 
| > Dear maintainer,
| > 
| > Please see the problems shown on
| > .
| > 
| > Please correct before 2024-02-02 to safely retain your package on CRAN.
| 
| The issue is that the compiled libraries are too large. The Mac CRAN checks 
turned up the following note:

No, you read that wrong. That is NOT the issue. That is a mere 'Note'.

Your issue is the bright red link labeled 'LTO' on that page, going to

  https://www.stats.ox.ac.uk/pub/bdr/LTO/rar.out
  
where it details an error on that platform / compilation choice:

  g++-13 -std=gnu++17 -shared -L/usr/local/gcc13/lib64 -L/usr/local/lib64 -o 
rar.so cpp11.o distpt.o iter.o max.o min.o regdata.o sens.o test-distpt.o 
test-iter.o test-max.o test-min.o test-regdata.o test-runner.o test-sens.o
  cpp11.cpp:18:13: warning: 'run_testthat_tests' violates the C++ One 
Definition Rule [-Wodr]
 18 | extern SEXP run_testthat_tests(void *);
| ^
  /data/gannet/ripley/R/test-dev/testthat/include/testthat/testthat.h:172:17: 
note: 'run_testthat_tests' was previously declared here
172 | extern "C" SEXP run_testthat_tests(SEXP use_xml_sxp) {
| ^
  make[2]: Leaving directory '/data/gannet/ripley/R/packages/tests-LTO/rar/src'
  installing to 
/data/gannet/ripley/R/packages/tests-LTO/Libs/rar-lib/00LOCK-rar/00new/rar/libs

This 'violates the C++ One Definition Rule' is something that started with
g++-13, if memory serves. Without looking at the code, I think you did
something that led to a symbol being included multiple times, and it should
not be.

Cheers, Dirk

 
| > installed size is  8.9Mb
| > sub-directories of 1Mb or more:
| >  libs   8.7Mb
| 
| I have not been able to reproduce the issue either locally or on any machine 
I have ready access to. I have built it on some of the Rhub and R-Project build 
systems, and the same issue (with very different `libs` sizes) came up on some 
of them:
| 
| • (RHub) Ubuntu Linux 20.04.1 LTS, R-release, GCC: 18.2Mb,
| • (RHub) Fedora Linux, R-devel, clang, gfortran: 6.8Mb,
| • (R-Project) r-release-macosx-arm64: 8.5Mb.
| 
| Based on trying to read up about this, it seems that this is a pretty common
| problem for compiled packages because of debugging symbols getting inserted
| into the shared library file. Using the fix from that blog post where you
| modify the Makevars to strip debugging symbols from the shared library seems
| to solve the issue on those build systems, so I feel reasonably confident that
| this is what's going on.
| 
| Apparently many, many existing packages on CRAN have the same issue. However, 
I’m very new to R package development, so I’m not exactly sure what to do. I 
have two questions:
| 
| 1. Is there anything I need to “fix” here, or should I just make contact with 
the CRAN folks and bring the fact that this is being caused by debugging 
symbols to their attention?
| 2. Regardless of whether or not I have to fix this issue for CRAN, is there a 
way to strip out the debugging symbols that comports with CRAN policies? The 
method suggested in the blog post above (adding a phony target in `Makevars` 
that strips the shared library) seems not to be CRAN-compliant, but I could be 
mistaken about that. (In particular, I had to modify it locally to get it to 
run, so I’m not sure what the platform-independent version of it looks like.)
| 
| Thanks in advance for the help!
| 
| Sincerely,
| Johann D. Gaebler
|   [[alternative HTML version deleted]]
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel
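On the stripping question raised above: the widely circulated Makevars fragment looks roughly like the sketch below. It assumes GNU make, binutils `strip`, and a Linux toolchain, and — as the question itself notes — it is not obviously CRAN-portable, so treat it as an illustration rather than an endorsed recipe:

```make
# src/Makevars (sketch): strip debug symbols once the shared object is built
strippedLib: $(SHLIB)
	if test -e "/usr/bin/strip" && test -e "/bin/uname" && [ "`uname`" = "Linux" ]; then \
		/usr/bin/strip --strip-debug $(SHLIB); \
	fi
.phony: strippedLib
```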

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] current docker image for ASAN

2024-01-16 Thread Dirk Eddelbuettel


On 16 January 2024 at 15:54, Steven Scott wrote:
| Greetings everyone, though I expect this message is mainly for Dirk.
| 
| CRAN checks of my bsts/Boom package generate an ASAN error that the CRAN
| maintainers have asked me to look into.  I recall doing this before (this
| error has been there for several years now) via a docker image that Dirk
| had set up.  I have two questions.
| 
| 1) Which docker image should I use?  I imagine it has been updated since
| the last time I tried.
| 2) Is the image built with an asan-appropriate libc++?  I'm asking because
| the last time I tried tracking down this error, ASAN identified that there
| was an error, but pointed to an irrelevant section of code.  Brian thinks
| libc++ is the culprit.

Thanks -- maybe see prior messages. The container is on a scheduled weekly
rebuild, but for reasons that long escape me I also switched at some point to
relying on the 'sumo' container Winston builds by collating several such
containers and have used this myself for several debugging trips:

   https://github.com/wch/r-debug

You want RDsan and/or RDcsan therein.
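In concrete terms, a debugging session with that image might start as below. This is a sketch: the Docker Hub name `wch1/r-debug` and the `RDsan` launcher come from that repository's README, and the commands are printed rather than executed so the example does not require Docker:

```shell
# (sketch) pull the collated 'sumo' image and start the ASAN/UBSAN build of
# R-devel; commands are echoed instead of run
img="wch1/r-debug"   # provides RD, RDvalgrind, RDsan, RDcsan launchers
echo "docker pull $img"
echo "docker run --rm -ti -v \"\$PWD\":/work -w /work $img RDsan"
# inside the container, RDsan starts the instrumented R in which one can
# install.packages('Boom') and re-run the failing tests under ASAN
```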

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] checking CRAN incoming feasibility

2024-01-16 Thread Dirk Eddelbuettel


On 17 January 2024 at 09:42, Simon Urbanek wrote:
| that check always hangs for me (I don't think it likes NZ ;)), so I just use
| 
| _R_CHECK_CRAN_INCOMING_REMOTE_=0 R CMD check --as-cran ...

You can also set it in Renviron files consulted just for checks:

  $ grep INCOMING_= ~/.R/check.Renviron*
  /home/edd/.R/check.Renviron:_R_CHECK_CRAN_INCOMING_=FALSE
  /home/edd/.R/check.Renviron-Rdevel:_R_CHECK_CRAN_INCOMING_=TRUE
  $ 

Best, Dirk

| 
| Cheers,
| Simon
| 
| 
| > On Jan 16, 2024, at 6:49 PM, Rolf Turner  wrote:
| > 
| > 
| > On Tue, 16 Jan 2024 16:24:59 +1100
| > Hugh Parsonage  wrote:
| > 
| >>> Surely the software just has to check
| >> that there is web connection to a CRAN mirror.
| >> 
| >> Nope! The full code is in tools:::.check_package_CRAN_incoming  (the
| >> body of which filled up my entire console), but to name a few checks
| >> it has to do: check that the name of the package is not the same as
| >> any other, including archived packages (which means that it has to
| >> download the package metadata), make sure the licence is ok, see if
| >> the version number is ok. 10 minutes is quite a lot though. I suspect
| >> the initial connection may have been faulty.
| > 
| > Well, it may not have been 10 minutes, but it was at least 5.  The
| > problem is persistent/repeatable.  I don't believe that there is any
| > faulty connection.
| > 
| > Thanks for the insight.
| > 
| > cheers,
| > 
| > Rolf Turner
| > 
| > -- 
| > Honorary Research Fellow
| > Department of Statistics
| > University of Auckland
| > Stats. Dep't. (secretaries) phone:
| > +64-9-373-7599 ext. 89622
| > Home phone: +64-9-480-4619
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| >
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] test failure: oldrel

2024-01-16 Thread Dirk Eddelbuettel


On 16 January 2024 at 10:28, Josiah Parry wrote:
| Oddly making the change has made CI happy. 
| 
https://github.com/R-ArcGIS/arcgisutils/actions/runs/7543315551/job/20534063601
| 
| It may be that the issue was OS related but I'm unsure since only oldrel for
| windows and macos check results are published https://cran.r-project.org/web/
| checks/check_results_arcgisutils.html

Seb solved the puzzle (in direct email to me). It has to do with the fact
that _the container_ defaults to UTC.  If I add '-e TZ=America/Chicago' to
the invocation we do indeed see a difference between r-release and r-oldrel
(and I also brought the version string display inside R):

edd@rob:~$ for v in 4.3.2 4.2.2; do docker run --rm -ti -e TZ=America/Chicago r-base:${v} Rscript -e 'cat(format(getRversion()), format(as.POSIXct(Sys.Date(), tz = "UTC")), Sys.getenv("TZ"), "\n")'; done
4.3.2 2024-01-16 America/Chicago 
4.2.2 2024-01-15 18:00:00 America/Chicago 
edd@rob:~$ 

Thanks to Seb for the cluebat wave.

Dirk

| 
| 
| On Tue, Jan 16, 2024 at 9:59 AM Dirk Eddelbuettel  wrote:
| 
| 
| Doesn't seem to be the case as it is moderately easy to check (especially when
| you happen to have local images of r-base around anyway):
| 
| edd@rob:~$ for v in 4.3.2 4.2.2 4.1.3 4.0.5 3.6.3 3.5.3 3.4.4 3.3.3; do
| echo -n "R ${v}: "; docker run --rm -ti r-base:${v} Rscript -e 'as.POSIXct(Sys.Date(), tz = "UTC")'; done
| R 4.3.2: [1] "2024-01-16 UTC"
| R 4.2.2: [1] "2024-01-16 UTC"
| R 4.1.3: [1] "2024-01-16 UTC"
| R 4.0.5: [1] "2024-01-16 UTC"
| R 3.6.3: [1] "2024-01-16 UTC"
| R 3.5.3: [1] "2024-01-16 UTC"
| R 3.4.4: [1] "2024-01-16 UTC"
| R 3.3.3: [1] "2024-01-16 UTC"
| edd@rob:~$
| 
| Dirk
| 
| --
| dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



[R-pkg-devel] test failure: oldrel

2024-01-16 Thread Dirk Eddelbuettel


Doesn't seem to be the case as it is moderately easy to check (especially when
you happen to have local images of r-base around anyway):

edd@rob:~$ for v in 4.3.2 4.2.2 4.1.3 4.0.5 3.6.3 3.5.3 3.4.4 3.3.3; do echo -n "R ${v}: "; docker run --rm -ti r-base:${v} Rscript -e 'as.POSIXct(Sys.Date(), tz = "UTC")'; done
R 4.3.2: [1] "2024-01-16 UTC"
R 4.2.2: [1] "2024-01-16 UTC"
R 4.1.3: [1] "2024-01-16 UTC"
R 4.0.5: [1] "2024-01-16 UTC"
R 3.6.3: [1] "2024-01-16 UTC"
R 3.5.3: [1] "2024-01-16 UTC"
R 3.4.4: [1] "2024-01-16 UTC"
R 3.3.3: [1] "2024-01-16 UTC"
edd@rob:~$ 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] Suggests with non-CRAN packages

2024-01-10 Thread Dirk Eddelbuettel


On 10 January 2024 at 16:25, Uwe Ligges wrote:
| 
| 
| On 10.01.2024 15:35, Josiah Parry wrote:
| > Thanks, all. As it goes, the package submission failed. The package that
| > is suggested is available at https://r.esri.com/bin/ and as such provided
| > `https://r.esri.com` as the url in `Additional_repositories`.
| 
| There is no
| 
| https://r.esri.com/src
| 
| hence it is obviously not a standard repository.

And how to set one up is described very patiently over ten pages in

   Hosting Data Packages via drat: A Case Study with Hurricane Exposure Data

at

   https://journal.r-project.org/archive/2017/RJ-2017-026/index.html

which does

   Abstract Data-only packages offer a way to provide extended functionality
   for other R users. However, such packages can be large enough to exceed
   the package size limit (5 megabytes) for the Comprehensive R Archive
   Network (CRAN). As an alternative, large data packages can be posted to
   additional repositories beyond CRAN itself in a way that allows smaller
   code packages on CRAN to access and use the data. The drat package
   facilitates creation and use of such alternative repositories and makes it
   particularly simple to host them via GitHub. CRAN packages can draw on
   packages posted to drat repositories through the use of the
   'Additional_repositories' field in the DESCRIPTION file. This paper
   describes how R users can create a suite of coordinated packages, in which
   larger data packages are hosted in an alternative repository created with
   drat, while a smaller code package that interacts with this data is
   created that can be submitted to CRAN.

for the use case of a 'too large for CRAN' suggested data package.

| > The request was to remove the additional repositories and provide
| > instructions for package installation in the Description field. This
| > package, arcgisbinding, is used in one line of the entire package
| > (https://github.com/R-ArcGIS/arcgisutils/blob/64093dc1a42fa28010cd45bb6ae8b8c57835cb40/R/arc-auth.R#L123)
| > to extract an authorization token. It is provided for compatibility with a
| > semi-closed-source R package. The installation instructions for it are
| > lengthy (https://r.esri.com/r-bridge-site/arcgisbinding/installing-arcgisbinding.html)
| > and /only/ available as a windows binary. Providing an explicit call out for
| > installation in the "Description" field of the DESCRIPTION feels like it is
| > co-opting the Description to describe the installation process for a function
| > that I anticipate /very few/ people to use.
| 
| So you can either remove the need for that package or say something like 
| " and if an authorization token is to be extracted on Windows, the 
| 'arcgisbinding' package is needed that can be installed as explained at 
| ."

Additional_repositories is great, and you have 134 examples at CRAN:

> D <- data.table(tools::CRAN_package_db())
> D[is.na(Additional_repositories)==FALSE, .(Package, Additional_repositories)]
                  Package                                 Additional_repositories
  1:            archiDART                       https://archidart.github.io/drat/
  2:           aroma.core  https://henrikbengtsson.r-universe.dev,\nhttps://r-forge.r-project.org
  3:             asteRisk                    https://rafael-ayala.github.io/drat/
  4:            BayesfMRI              https://inla.r-inla-download.org/R/testing
  5:                bigDM               https://inla.r-inla-download.org/R/stable
 ---
130:    TreatmentPatterns                            https://ohdsi.github.io/drat
131:             TreeDist                       https://ms609.github.io/packages/
132:         triplesmatch                         https://errickson.net/rrelaxiv/
133: USA.state.boundaries                          https://iembry.gitlab.io/drat/
134:                  voi              https://inla.r-inla-download.org/R/stable/
>

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [Rd] eval(parse()) within mutate() returning same value for all rows

2023-12-29 Thread Dirk Eddelbuettel


On 29 December 2023 at 22:31, Mateo Obregón wrote:
| Thanks Gabor, I like your solution that splits the args into separate columns,
| in turn making the sprintf() call more interpretable.

Well you may also like `tstrsplit()`, a gem inside data.table:

> suppressMessages(library(data.table))
>
> D <- data.table(words="%s plus %s equals %s", args=c("1,1,2", "2,2,4", 
> "3,3,6"))
> D
  words   args
  
1: %s plus %s equals %s  1,1,2
2: %s plus %s equals %s  2,2,4
3: %s plus %s equals %s  3,3,6
>
> D[, c('a','b','c') := tstrsplit(args, ",")]
> D
  words   args  a  b  c
 
1: %s plus %s equals %s  1,1,2  1  1  2
2: %s plus %s equals %s  2,2,4  2  2  4
3: %s plus %s equals %s  3,3,6  3  3  6
>
> D[, res := sprintf(words, a, b, c)]
> D
  words   args  a  b  c   res
 
1: %s plus %s equals %s  1,1,2  1  1  2 1 plus 1 equals 2
2: %s plus %s equals %s  2,2,4  2  2  4 2 plus 2 equals 4
3: %s plus %s equals %s  3,3,6  3  3  6 3 plus 3 equals 6
> 

so all we do here is a one-liner in data.table if you're so inclined:


> D <- data.table(words="%s plus %s equals %s", args=c("1,1,2", "2,2,4", 
> "3,3,6"))
> D[, c('a','b','c') := tstrsplit(args, ",")][, res := sprintf(words, a, b, 
> c)][, .(res)]
 res
  
1: 1 plus 1 equals 2
2: 2 plus 2 equals 4
3: 3 plus 3 equals 6
> 

data.table is very powerful and expressive. It is well worth getting into,
which I really only did ten or so years into using R.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [Rd] eval(parse()) within mutate() returning same value for all rows

2023-12-29 Thread Dirk Eddelbuettel


On 29 December 2023 at 14:13, Mateo Obregón wrote:
| Hi all-
| 
| Looking through stackoverflow for R string combining examples, I found the 
| following from 3 years ago:
| 
| 

| 
| The top answer suggests to use eval(parse(sprintf())). I tried the suggestion 

Well:

   > fortunes::fortune(106)

   If the answer is parse() you should usually rethink the question.
  -- Thomas Lumley
 R-help (February 2005)
   
   > 

| and it did not return the expected combined strings. I thought that this
| might be an issue with some leftover values being reused, so I explicitly
| eval() with a new.env():
| 
| > library(dplyr)
| > df <- tibble(words=c("%s plus %s equals %s"), 
| args=c("1,1,2","2,2,4","3,3,6"))
| > df |> mutate(combined = eval(parse(text=sprintf("sprintf('%s', %s)", words, 
| args)), envir=new.env()))
| 
| # A tibble: 3 × 3
|   wordsargs  combined 
|
| 1 %s plus %s equals %s 1,1,2 3 plus 3 equals 6
| 2 %s plus %s equals %s 2,2,4 3 plus 3 equals 6
| 3 %s plus %s equals %s 3,3,6 3 plus 3 equals 6
| 
| The `combined`  is not what I was expecting, as the same last eval() is 
| returned for all three rows.
|  
| Am I missing something? What has changed in the past three years?

Nothing if you use the first answer which relies only on base R and still
works: 

   > words <- c('%s + %s equal %s', '%s + %s equal %s')
   > arguments <- c('1,1,2', '2,2,4')
   > df <- data.frame(words, arguments)
   > df
words arguments
   1 %s + %s equal %s 1,1,2
   2 %s + %s equal %s 2,2,4
   > df$combined <- apply(df, 1, function(x) do.call(sprintf, 
c(as.list(strsplit(x[2], ',')[[1]]), fmt = x[[1]])))
   > df
words arguments  combined
   1 %s + %s equal %s 1,1,2 1 + 1 equal 2
   2 %s + %s equal %s 2,2,4 2 + 2 equal 4
   > 

I am not the best person to answer what may have changed in `dplyr` in those
three years -- and neither is this list which is primarily concerned with
developing R itself.  

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] portability question

2023-12-20 Thread Dirk Eddelbuettel


The point of my email was that

   if [ `uname -s` = 'Darwin' ]; then ...

allows for a clean branch between the (new here) macOS behaviour and (old,
prior) behavior removing all concerns about 'portability' (per the Subject:).

You missed 100% of that.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] portability question

2023-12-20 Thread Dirk Eddelbuettel


On 20 December 2023 at 11:10, Steven Scott wrote:
| The Boom package builds a library against which other packages link.  The
| library is built using the Makevars mechanism using the line
| 
| ${AR} rc $@ $^
| 
| A user has asked me to change 'rc' to 'rcs' so that 'ranlib' will be run on
| the archive.  This is apparently needed for certain flavors of macs.  I'm
| hoping someone on this list can comment on the portability of that change
| and whether it would negatively affect other platforms.  Thank you.

Just branch for macOS.  Here is a line I 'borrowed' years ago from data.table
and still use for packages needed to call install_name_tool on macOS.  You
could have a simple 'true' branch of the test use 'rcs' and the 'false'
branch do what you have always done.  Without any portability concerns.

From https://github.com/Rdatatable/data.table/blob/master/src/Makevars.in#L14
and indented here for clarity

if [ "$(OS)" != "Windows_NT" ] && [ `uname -s` = 'Darwin' ]; then \
    install_name_tool -id data_table$(SHLIB_EXT) data_table$(SHLIB_EXT); \
fi

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] CRAN submission struggle

2023-12-16 Thread Dirk Eddelbuettel


Christiaan,

You say "errors" but you don't say which.

You say you have a package, but don't provide a source reference.

This makes it awfully hard to say or do anything. In case you are on github
or gitlab or ... it would simply be easiest to share a reference to the
repository.  Emailing 10mb blobs to every list subscriber is not ideal.

Dirk

PS Fortunes has that covered too

   > fortunes::fortune("mind read")

   There are actual error messages, and until you show them, we can not help as 
the mind
   reading machine is currently off for repairs.
      -- Dirk Eddelbuettel (after reports about errors with R CMD check)
 R-help (July 2010)

   > 
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] Wrong mailing list: Could the 100 byte path length limit be lifted?

2023-12-13 Thread Dirk Eddelbuettel


On 13 December 2023 at 16:02, Tomas Kalibera wrote:
| 
| On 12/13/23 15:59, Dirk Eddelbuettel wrote:
| > On 13 December 2023 at 15:32, Tomas Kalibera wrote:
| > | Please don't forget about what has been correctly mentioned on this
| > | thread already: there is essentially a 260 character limit on Windows
| > | (see
| > | 
https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html
| > | for more). Even if the relative path length limit for a CRAN package was
| > | no longer regarded important for tar compatibility, it would still make
| > | sense for compatibility with Windows. It may still be a good service to
| > | your users if you keep renaming the files to fit into that limit.
| >
| > So can lift the limit from 100 char to 260 char ?
| 
| The 260 char limit is for the full path. A package would be extracted in 
| some directory, possibly also with a rather long name.

Call a cutoff number. 

Any move from '100' to '100 + N' for any nonzero N is a win. Pick one, and
then commit the change.  N = 50 would be a great start, as arbitrary as it is.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Wrong mailing list: Could the 100 byte path length limit be lifted?

2023-12-13 Thread Dirk Eddelbuettel


On 13 December 2023 at 15:32, Tomas Kalibera wrote:
| Please don't forget about what has been correctly mentioned on this 
| thread already: there is essentially a 260 character limit on Windows 
| (see 
| https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html 
| for more). Even if the relative path length limit for a CRAN package was 
| no longer regarded important for tar compatibility, it would still make 
| sense for compatibility with Windows. It may still be a good service to 
| your users if you keep renaming the files to fit into that limit.

So can we lift the limit from 100 chars to 260 chars?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


PS One aspect I didn't mention clearly (my bad) is that this does not affect all
or even most packages: in most cases the src/Makevars should indeed be as
simple as possible. But in _some_ cases we need to cooperate with external
libraries and in some of these cases the switch has been seen to be necessary.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


On 10 December 2023 at 17:07, Simon Urbanek wrote:
| As discussed here before packages should *never* set -mmacosx-version-min
| or similar flags by hand.

a) That is in conflict with what was said in the past; we have used an
explicit min version of 10.14 for the C++17 we were using then (and we now
need a bit more so 11.0 is welcome).

b) That is in conflict with how I read the R manual I quoted: R Admin,
Section 'C.3.10 Building binary packages'. Recall that our package uses a
binary artifact (by permission) so we have to play along and this minimum
version used by both matters.

c) That also appears to be in conflict with empirics. A quick search [1] at
the searchable CRAN mirror finds around a dozen packages doing just that:
setting a minimum version.

Anyway -- I 'eventually' got the info I need. 

Best regards, Dirk


[1] https://github.com/search?q=org%3Acran%20mmacosx-version-min&type=code

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Status of -mmacosx-version-min

2023-12-09 Thread Dirk Eddelbuettel


Last month, I had asked about the setting '-mmacosx-version-min' here.  The
setting can be used to specify what macOS version one builds for. It is,
oddly enough, not mentioned in Writing R Extensions but for both r-release and
r-devel the R Administration manual states

   • Current CRAN macOS distributions are targeted at Big Sur so it is
 wise to ensure that the compilers generate code that will run on
 Big Sur or later.  With the recommended compilers we can use
  CC="clang -mmacosx-version-min=11.0"
  CXX="clang++ -mmacosx-version-min=11.0"
  FC="/opt/gfortran/bin/gfortran -mmacosx-version-min=11.0"
 or set the environment variable
  export MACOSX_DEPLOYMENT_TARGET=11.0

which is clear enough. (There is also an example in the R Internals manual
still showing the old (and deprecated ?) value of 10.13.)  It is also stated
at the top of mac.r-project.org.  But it is still in a somewhat confusing
contradiction to the matrix of tests machines, described e.g. at

   https://cran.r-project.org/web/checks/check_flavors.html

which still has r-oldrel-macos-x86_64 with 10.13.

I found this confusing, and pressed the CRAN macOS maintainer to clarify but
apparently did so in an insufficiently convincing manner. (There was a word
about it being emailed to r-sig-mac which is a list I am not on as I don't
have a macOS machine.) So in case anybody else wonders, my hope is that the
above is of help. At my day job, we will now switch to 11.0 to take advantage
of some more recent C++ features.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] problems with Maintainers in DESCRIPTION file

2023-12-07 Thread Dirk Eddelbuettel


On 7 December 2023 at 20:58, María Olga Viedma Sillero wrote:
| I receive the same note after fixing it, removing it, and checking Authors@R.
| I think the rejection is a false positive.
| 
| Flavor: r-devel-linux-x86_64-debian-gcc, r-devel-windows-x86_64
| Check: CRAN incoming feasibility, Result: NOTE
|   Maintainer: 'Olga Viedma <mailto:olga.vie...@uclm.es>>'
  ^^

Compare that with the other 20500 CRAN packages (you can look at all of them
conveniently via https://github.com/cran/) -- your format differs. Instead of

  Authors@R: c(person("Olga", "Viedma", email = "olga.vie...@uclm.es",
                      role = c("aut", "cph", "cre")),
               person("Carlos Alberto", "Silva", email = "c.si...@ufl.edu",
                      role = c("aut", "cph")),
               person("Jose Manuel", "Moreno", email = "josem.mor...@uclm.es",
                      role = c("aut", "cph")))

write

  Authors@R: c(person("Olga", "Viedma", email = "olga.vie...@uclm.es",
                      role = c("aut", "cph", "cre")),
               person("Carlos Alberto", "Silva", email = "c.si...@ufl.edu",
                      role = c("aut", "cph")),
               person("Jose Manuel", "Moreno", email = "josem.mor...@uclm.es",
                      role = c("aut", "cph")))

ie remove the '<mailto:...>' part.

Hth, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] macos x86 oldrel backups?

2023-12-05 Thread Dirk Eddelbuettel


Hi Simon,

On 5 December 2023 at 23:17, Simon Urbanek wrote:
| The high-sierra build packages are currently not built due to hardware
| issues. The macOS version is so long out of support by Apple (over 6 years)
| that it is hard to maintain it. Only big-sur builds are supported at this
| point. Although it is possible that we may be able to restore the old builds,
| it is not guaranteed. (BTW the right mailing list for this is R-SIG-Mac).

Interesting.  And I see at mac.r-project.org the statement

   [...] As of R 4.3.0 we maintain two binary builds:

   - big-sur-x86_64 build supports legacy Intel Macs from macOS 11 (Big Sur)
     and higher
   - big-sur-arm64 build supports arm64 Macs (M1+) from macOS 11 (Big Sur)
     and higher

   We are no longer building binaries for macOS versions before 11 (as they
   are no longer supported by Apple).

so it is official that r-oldrel on macOS is no more for now? So we do not
have to worry about compilation standards lower than 11.0?  Do I have that
correct? 

Dirk
not on r-sig-mac, appreciative of any hints here as that is what we created the 
list for
 
| Cheers,
| Simon
| 
| 
| 
| > On 5/12/2023, at 09:52, Jonathan Keane  wrote:
| > 
| > Thank you to the CRAN maintainers for maintenance and keeping the all
| > of the CRAN infrastructure running.
| > 
| > I'm seeing a long delay in builds on CRAN for r-oldrel-macos-x86_64.
| > I'm currently interested in Arrow [1], but I'm seeing many other
| > packages with similar missing r-oldrel-macos-x86_64 builds (possibly
| > all, I sampled a few packages from [2], but didn't do an exhaustive
| > search) for an extended period.
| > 
| > It appears that this started between 2023-10-21 and 2023-10-22. It
| > looks like AMR [3] has a successful build but xlcutter does not [4]
| > and all the packages I've checked after 2023-10-22 don't have an
| > updated build for r-oldrel-macos-x86_64
| > 
| > Sorry if this is scheduled maintenance, I tried to find an
| > announcement here and on r-project.org but haven't yet found anything
| > indicating this.
| > 
| > [1] - https://cran.r-project.org/web/checks/check_results_arrow.html
| > [2] - https://cran.r-project.org/web/packages/available_packages_by_date.html
| > [3] - https://cran.r-project.org/web/packages/AMR/index.html
| > [4] - https://cran.r-project.org/web/packages/xlcutter/index.html
| > 
| > -Jon
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] URL woes at CRAN: Anaconda edition

2023-11-30 Thread Dirk Eddelbuettel


Thank you all -- especially Aron and Ivan for the deep dive on the underlying
aspect of the hosting of that web property.  And of course to Uwe for
approving the package manually.

For my taste, life is too short for all this. So users be damned, and I have
now removed the badge.  At the end of the day it is better for Uwe et al (and
of course our side) to have these things autoprocess. If this kind of stuff
is in the way, I will just remove it as maintainer.

Thanks again to all.

Cheers, Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] URL woes at CRAN: Anaconda edition

2023-11-30 Thread Dirk Eddelbuettel


I added a badge to point to Conda builds for the work repo:

  
[![Anaconda](https://anaconda.org/conda-forge/r-tiledb/badges/version.svg)](https://anaconda.org/conda-forge/r-tiledb)


And as it goes with all good intentions I immediately got punished on the
next upload:

  Found the following (possibly) invalid URLs:
URL: https://anaconda.org/conda-forge/r-tiledb
  From: README.md
  Status: 400
  Message: Bad Request

And *of course* the same URL resolves fine for me in the browser without any
apparent redirect.  Bit of a web newb here but is there anything I can do
short of deleting the badge?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-17 Thread Dirk Eddelbuettel


Simon,

One more thing: An alert reader pointed out to me that macOS-oldrel has

 r-oldrel-macos-x86_64   r-oldrel   macOS   x86_64   macOS 10.13.6 (17G11023)

in the table at https://cran.r-project.org/web/checks/check_flavors.html. So
this seems to mesh with what the R-on-macOS FAQ says, and switching to 11.0
would appear to at least lose r-oldrel-macos-x86_64 until April, no?

Best,  Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Simon,

On 17 November 2023 at 10:43, Simon Urbanek wrote:
| > On 17/11/2023, at 10:28 AM, Dirk Eddelbuettel  wrote:
| > On 17 November 2023 at 09:35, Simon Urbanek wrote:
| > | can you clarify where the flags come from? The current CRAN builds
| > | (big-sur-x86_64 and big-sur-arm64) use
| > | 
| > | export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
| > | export MACOSX_DEPLOYMENT_TARGET=11.0
| > | 
| > | so the lowest target is 11.0 and it is no longer forced in the flags (so
| > | that users can more easily choose their desired targets).
| > 
| > Beautiful, solves our issue.  Was that announced at some point? If so,
| > where?
| > 
| 
| I don't see what is there to announce as the packages should simply be
| using flags passed from R and that process did not change.
| 
| That said, the binary target for CRAN has been announced on this list as
| part of the big-sur build announcement:
| https://stat.ethz.ch/pipermail/r-sig-mac/2023-April/014731.html

I don't own or (directly) use macOS hardware so I am not on that list. 
 
| > For reference the R-on-macOS FAQ I consulted still talks about 10.13 at
| > https://cran.r-project.org/bin/macosx/RMacOSX-FAQ.html#Installation-of-source-packages
| > 
| >  CC = clang -mmacosx-version-min=10.13
| >  CXX = clang++ -mmacosx-version-min=10.13 -std=gnu++14
| >  FC = gfortran -mmacosx-version-min=10.13
| >  OBJC = clang -mmacosx-version-min=10.13
| >  OBJCXX = clang++ -mmacosx-version-min=10.13
| > 
| > so someone may want to refresh this. It is what I consulted as relevant
| > info.
| > 
| 
| It says "Look at file /Library/Frameworks/R.framework/Resources/etc/Makeconf"
| so it is just an example that will vary by build. For example big-sur-arm64
| will give you
| 
| $ grep -E '^(CC|CXX|FC|OBJC|OBJCXX) ' /Library/Frameworks/R.framework/Resources/etc/Makeconf
| CC = clang -arch arm64
| CXX = clang++ -arch arm64 -std=gnu++14
| FC = /opt/R/arm64/bin/gfortran -mtune=native
| OBJC = clang -arch arm64
| OBJCXX = clang++ -arch arm64
| 
| Again, this is just an example, no one should be entering such flags by hand
| - that's why they are in Makeconf so packages can use them without worrying
| about the values (see R-exts 1.2:
| https://cran.r-project.org/doc/manuals/R-exts.html#Configure-and-cleanup
| for details).

I recommend you spend a moment with, for example, the (rather handy) search
capability of GitHub to search through the 'cran' organisation mirroring the
repo.  The package I maintain is far from being the only one setting the flag.

https://github.com/search?q=org%3Acran+mmacosx-version-min&type=code

Best, Dirk

| Cheers,
| Simon
| 
| 
| 
| > Thanks, Dirk
| > 
| > | 
| > | Cheers,
| > | Simon
| > | 
| > | 
| > | 
| > | > On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
| > | > 
| > | > 
| > | > Hi Simon,
| > | > 
| > | > We use C++20 'inside' our library and C++17 in the API. Part of our
| > | > C++17 use is now expanding to std::filesystem whose availability is
| > | > dependent on the implementation.
| > | > 
| > | > The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
| > | > that the features we want are only available with 10.15.
| > | > 
| > | > Would we be allowed to use this value of '10.15' on CRAN?
| > | > 
| > | > Thanks as always,  Dirk
| > | > 
| > | > 
| > | > [1] https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
| > | > 
| > | > -- 
| > | > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > | > 
| > | > __
| > | > R-package-devel@r-project.org mailing list
| > | > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > | > 
| > | 
| > 
| > -- 
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Simon,

On 17 November 2023 at 09:35, Simon Urbanek wrote:
| can you clarify where the flags come from? The current CRAN builds
| (big-sur-x86_64 and big-sur-arm64) use
| 
| export SDKROOT=/Library/Developer/CommandLineTools/SDKs/MacOSX11.sdk
| export MACOSX_DEPLOYMENT_TARGET=11.0
| 
| so the lowest target is 11.0 and it is no longer forced in the flags (so
| that users can more easily choose their desired targets).

Beautiful, solves our issue.  Was that announced at some point? If so, where?

For reference the R-on-macOS FAQ I consulted still talks about 10.13 at
https://cran.r-project.org/bin/macosx/RMacOSX-FAQ.html#Installation-of-source-packages

  CC = clang -mmacosx-version-min=10.13
  CXX = clang++ -mmacosx-version-min=10.13 -std=gnu++14
  FC = gfortran -mmacosx-version-min=10.13
  OBJC = clang -mmacosx-version-min=10.13
  OBJCXX = clang++ -mmacosx-version-min=10.13

so someone may want to refresh this. It is what I consulted as relevant info.

Thanks, Dirk

| 
| Cheers,
| Simon
| 
| 
| 
| > On 17/11/2023, at 2:57 AM, Dirk Eddelbuettel  wrote:
| > 
| > 
| > Hi Simon,
| > 
| > We use C++20 'inside' our library and C++17 in the API. Part of our C++17
| > use is now expanding to std::filesystem whose availability is dependent on
| > the implementation.
| > 
| > The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
| > that the features we want are only available with 10.15.
| > 
| > Would we be allowed to use this value of '10.15' on CRAN?
| > 
| > Thanks as always,  Dirk
| > 
| > 
| > [1] https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185
| > 
| > -- 
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Can -mmacosx-version-min be raised to 10.15 ?

2023-11-16 Thread Dirk Eddelbuettel


Hi Simon,

We use C++20 'inside' our library and C++17 in the API. Part of our C++17 use
is now expanding to std::filesystem whose availability is dependent on the
implementation. 

The compiler tells us (in a compilation using -mmacosx-version-min=10.14)
that the features we want are only available with 10.15.

Would we be allowed to use this value of '10.15' on CRAN?

Thanks as always,  Dirk


[1] https://github.com/TileDB-Inc/TileDB/actions/runs/6882271269/job/18720444943?pr=4518#step:7:185

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Package submission fail

2023-11-13 Thread Dirk Eddelbuettel


On 13 November 2023 at 16:46, Ivan Krylov wrote:
| Hello Christiaan and welcome to R-package-devel!

Seconded but PLEASE do not send large attachments to the list and all its
subscribers. Point us to your code repository; you will likely get a kind
response from a volunteer or two peeking at it.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Segmentation fault early in compilation of revision 85514

2023-11-12 Thread Dirk Eddelbuettel


Avi,

Might be toolchain-dependent, might be options-dependent--it built fine here.
Easier for you to vary option two so maybe try that?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [R] Why Rprofile.site is not built with manual installation of R devel in linux?

2023-11-10 Thread Dirk Eddelbuettel


On 10 November 2023 at 14:19, Martin Maechler wrote:
| >> 2.  In the installed R in /where/you/want/R/to/go, there is not even an
| >> etc folder, there are only the folders bin, lib and share.

That would appear to be an error in the locally installed R.

What the package does has been discussed before. Per the Linux Filesystem
Standard (or Filesystem Hierarchy or whatever it is called), Linux distros
such as Debian and Ubuntu use _both_ /usr/lib _and_ /usr/share so the
directories are split. Use the `R.home()` function to find them:

$ Rscript -e 'sapply(c("bin", "doc", "etc"), R.home)'
               bin                doc                etc 
  "/usr/lib/R/bin" "/usr/share/R/doc"   "/usr/lib/R/etc" 
$ 

That is from the packaged R via r-base-core.

The behaviour can easily be validated by invoking, say, a Docker container
based on these same (Debian) packages. One of the containers I look after in
the Rocker Project has also been aliased to just 'r-base' for years by Docker
itself so just prefix the above by 'docker run --rm -ti r-base' to run in a
clean container:

$ docker run --rm -ti r-base Rscript -e 'sapply(c("bin", "doc", "etc"), R.home)'
               bin                doc                etc 
  "/usr/lib/R/bin" "/usr/share/R/doc"   "/usr/lib/R/etc" 
$ 

Being able to run Docker for such tests is a good superpower and on most
systems these days only one package installation away.

For a locally built one (my r-devel build here) it works the same but points
of course to a different directory:

$ RDscript -e 'sapply(c("bin", "doc", "etc"), R.home)'
                               bin                                doc 
"/usr/local/lib/R-devel/lib/R/bin" "/usr/local/lib/R-devel/lib/R/doc" 
                               etc 
"/usr/local/lib/R-devel/lib/R/etc" 
$ 

As suggested, question for 'R on Debian, Ubuntu, ...' should probably go to
r-sig-debian which is now CCed.  As usual, you must be subscribed to post or
else you will get mistaken for a spammer and ignored.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] Fortune candidate Re: Issue with R Package on CRAN - OpenMP and clang17

2023-10-31 Thread Dirk Eddelbuettel


On 31 October 2023 at 19:58, Ivan Krylov wrote:
| [...] The computers that helped launch the first
| people into space had 2 kWords of memory, but nowadays you need more
| than 256 MBytes of RAM to launch a bird into a pig and 10 GBytes of
| storage in order to compile a compiler. This is what progress looks
| like.

Fortune!!

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-30 Thread Dirk Eddelbuettel


I have some better news.  While we established that 'in theory' setting the
environment variable OMP_NUM_THREADS would help (and I maintain that it is a
great PITA that CRAN does not do so as a general fix for this issue) it does
*not help* once R is started.  OpenMP only considers the variable once at
startup and does not re-read it.  So we cannot set from R once R has started.

But OpenMP offers a setter (and a getter) for the thread count value.

And using it addresses the issue.  I created a demo package [1] which, when
running on a system with both OpenMP and 'enough cores' (any modern machine
will do) exhibits the warning from R CMD check --as-cran with timing enabled
(i.e. env vars set).  When an additional environment variable 'SHOWME' is set
to 'yes', it successfully throttles via the exposed OpenMP setter.  In our
example, Armadillo uses it to calibrate its thread use, a lower setting is
followed, and the warning is gone.

I will add more convenient wrappers to RcppArmadillo itself. These are
currently in a branch [2] and their use is illustrated in the help page and
example of fastLm demo function [3].  I plan to make a new RcppArmadillo
release with this change in the coming days, the setter and re-setter will
work for any OpenMP threading changes. So if you use RcppArmadillo, this
should help. (And of course there always was RhpcBLASctl doing this too.)

Dirk

[1] https://github.com/eddelbuettel/rcpparmadilloopenmpex
[2] https://github.com/RcppCore/RcppArmadillo/tree/feature/thread_throttle
[3] https://github.com/RcppCore/RcppArmadillo/blob/a8db424bd6aaeda2ceb897142d3c366d9c6591c7/man/fastLm.Rd#L72-L98
[4] https://cran.r-project.org/package=RhpcBLASctl

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Wayland Display Support in R Plot

2023-10-30 Thread Dirk Eddelbuettel


On 30 October 2023 at 13:17, Willem Ligtenberg via R-devel wrote:
| I just tried it on Ubuntu 23.10. It seems to just work.
| See screenshot here: https://nextcloud.wligtenberg.nl/s/jnbDT4ZiHw2JQ8H
| I should be using wayland, and as far as I know I haven't done anything 
| special to make this work. But there might be some compatibility layer 
| that is active by default.

Same here under trusted 22.04 LTS on that older / not quite healthy laptop.
All it took was to set WaylandEnable=true in /etc/gdm3/custom.conf and a
systemctl call to restart gdm3.

R plots fine in the x11() default device via plot(cumsum(rnorm(100)), type="l").

Like Roger, I had issues with some third-party apps (obs to record lectures
comes to mind, maybe also zoom ?) so "we'll see".

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Wayland Display Support in R Plot

2023-10-29 Thread Dirk Eddelbuettel


On 30 October 2023 at 09:20, Simon Urbanek wrote:
| > On 30/10/2023, at 8:38 AM, Dirk Eddelbuettel  wrote:
| > On 30 October 2023 at 07:54, Paul Murrell wrote:
| > | I am unaware of any Wayland display support.
| > | 
| > | One useful way forward would be an R package that provides such a device 
| > | (along the lines of 'Cairo', 'tikzDevice', et al)
| > 
| > As I understand it, it is a protocol, and not a device.
| > 
| 
| Well, X11 is a protocol, not a device, either.

Point taken.

| > I think I needed to fall back to X11 for a particular application (likely
| > OBS) so my session tells me (under Settings -> About -> Windowing System) I
| > am still running X11. I'll check again once I upgrade from Ubuntu 23.04 to
| > Ubuntu 23.10

Booted an older laptop using 22.04, selected 'not X11' in the gdm dialog but
the same Gnome Menu still says Windowing System: X11.  So I am no longer sure
how I would convince myself if I am under Wayland there or not.  The answers
in 
https://unix.stackexchange.com/questions/202891/how-to-know-whether-wayland-or-x11-is-being-used
suggest I still run X11 too.  So I got nuttin' here.

In any event, I read OP as asking 'do we need a new device' and I still think
that the answer is 'likely not' as the X11 compatibility layer should cover
this.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Wayland Display Support in R Plot

2023-10-29 Thread Dirk Eddelbuettel


On 30 October 2023 at 07:54, Paul Murrell wrote:
| I am unaware of any Wayland display support.
| 
| One useful way forward would be an R package that provides such a device 
| (along the lines of 'Cairo', 'tikzDevice', et al)

As I understand it, it is a protocol, and not a device.

Several Linux distributions have long defaulted to it, so we already should
have thousands of users. While 'not X11' it provides a compatibility layer
and should be seamless.

I think I needed to fall back to X11 for a particular application (likely
OBS) so my session tells me (under Settings -> About -> Windowing System) I
am still running X11. I'll check again once I upgrade from Ubuntu 23.04 to
Ubuntu 23.10

See https://en.wikipedia.org/wiki/Wayland_(protocol) for more.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-27 Thread Dirk Eddelbuettel


Hi Jouni,

On 27 October 2023 at 13:02, Helske, Jouni wrote:
| Actually, the OMP_NUM_THREADS worked for vignettes and testthat tests, but
| didn't help with the examples. However, I just wrapped the problematic example

Now I am confused.

What is your understanding of why it helps in one place and not the other?

| with \donttest as for some reason this issue only happened with a single
| seemingly simple example (hence the "weird" in the earlier NEWS due to
| frustration, I changed this to the CRAN version).
| 
| Thanks for reminding me about the resetting the number of cores, will fix that
| to the next version.

I have an idea for a RcppArmadillo-based helper function. We can save the
initial values of the environment variable in .onLoad and cache it. A simple
helper function pair can then dial the environment variable down and reset it
to the cached value.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-27 Thread Dirk Eddelbuettel


Jouni,

My CRANberriesFeed reports a new bssm package at CRAN, congratulations for
sorting this out. [1,2] The OMP_NUM_THREADS setting is indeed all it takes,
and it _does_ seem to be read even from a running session: i.e. you can set
this inside an R session and the OpenMP code considers it in the same
process. Good!

As some of us mentioned, your usage pattern of setting
'Sys.setenv("OMP_NUM_THREADS" = 2)' everywhere _leaves_ that value so you
permanently ham-string the behaviour of a session which runs an example or
test of your package: the same session will never get back to 'all cores' by
itself so adding a resetter to the initial value (maybe via on.exit()) may be
a good idea for the next package revision if you have any energy left for
this question :)

Again, congrats for sorting it out, and sorry for the trouble. I long argued
CRAN should set the behaviour-defining environment variable, that
OMP_NUM_THREADS, for the tests and examples it wants to run under reduced
load.  Alas, that's not where we ended up.

Cheers,  Dirk

[1] http://dirk.eddelbuettel.com/cranberries/2023/10/27#bssm_2.0.2

[2] Your NEWS file calls this 'fix weird CRAN issues with parallelisation on
Debian.'. There is nothing 'weird' here (it behaves as designed, computers do
that to us), and it is not just on Debian but on any system where the build
has a) access to OpenMP so uses it and b) compares CPU time to elapsed time
with a cap of 2 as CRAN does.

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] API client package failing due to API authentication

2023-10-26 Thread Dirk Eddelbuettel


On 26 October 2023 at 11:14, Cole Johanson wrote:
| My package https://github.com/cole-johanson/smartsheetr requires an
| environment variable, the API access token, to run most of the functions.
| The steps for setting this are documented in the README, but my package is
| being auto-rejected by CRAN for failing the examples.
| 
| I have wrapped the examples with the roxygen2 tag *\dontrun*, but it is
| still attempting to run the examples.
| 
| Should I report this as a false positive, or is there a step I am missing?

You should not attempt to run the examples when they could fail, e.g. when no
API key is present as is the case for CRAN.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-25 Thread Dirk Eddelbuettel


On 24 October 2023 at 08:15, Dirk Eddelbuettel wrote:
| 
| On 24 October 2023 at 15:55, Ivan Krylov wrote:
| | On Tue, 24 Oct 2023 10:37:48 +
| | "Helske, Jouni"  writes:
| | 
| | > Examples with CPU time > 2.5 times elapsed time
| | >   user system elapsed ratio
| | > exchange 1.196   0.04   0.159 7.774
| | 
| | I've downloaded the archived copy of the package from the CRAN FTP
| | server, installed it and tried:
| | 
| | library(bssm)
| | Sys.setenv("OMP_THREAD_LIMIT" = 2)
| | data("exchange")
| | model <- svm(
| |  exchange, rho = uniform(0.97,-0.999,0.999),
| |  sd_ar = halfnormal(0.175, 2), mu = normal(-0.87, 0, 2)
| | )
| | system.time(particle_smoother(model, particles = 500))
| | #user  system elapsed
| | #   0.515   0.000   0.073
| | 
| | I set a breakpoint on clone() [*] and got quite a few calls creating
| | OpenMP threads with the following call stack:
| | 
| | #0  clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:52
| | <...>
| | #4  0x77314e0a in GOMP_parallel () from
| | /usr/lib/x86_64-linux-gnu/libgomp.so.1
| |  <-- RcppArmadillo code below
| | #5 0x738f5f00 in
| | arma::eglue_core::apply,
| | arma::eOp, arma::eop_exp>,
| | arma::eop_scalar_times>, arma::eOp,
| | arma::eop_scalar_div_post>, arma::eop_square> > (outP=..., x=...) at
| | .../library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:69
| | #6 0x73a31246 in
| | arma::Mat::operator=,
| | arma::eop_exp>, arma::eop_scalar_times>,
| | arma::eOp, arma::eop_scalar_div_post>,
| | arma::eop_square>, arma::eglue_div> (X=..., this=0x7fff36f0) at
| | .../library/RcppArmadillo/include/armadillo_bits/Proxy.hpp:226
| | #7
| | 
arma::Col::operator=,
| | arma::eop_exp>, arma::eop_scalar_times>,
| | arma::eOp, arma::eop_scalar_div_post>,
| | arma::eop_square>, arma::eglue_div> > ( X=..., this=0x7fff36f0) at
| | .../library/RcppArmadillo/include/armadillo_bits/Col_meat.hpp:535
| |  <-- bssm code below
| | #8  ssm_ung::laplace_iter (this=0x7fff15e0, signal=...) at
| | model_ssm_ung.cpp:310
| | #9  0x73a36e9e in ssm_ung::approximate (this=0x7fff15e0) at
| | .../library/RcppArmadillo/include/armadillo_bits/arrayops_meat.hpp:27
| | #10 0x73a3b3d3 in ssm_ung::psi_filter
| | (this=this@entry=0x7fff15e0, nsim=nsim@entry=500, alpha=...,
| | weights=..., indices=...) at model_ssm_ung.cpp:517
| | #11 0x73948cd7 in psi_smoother (model_=..., nsim=nsim@entry=500,
| | seed=seed@entry=1092825895, model_type=model_type@entry=3) at
| | R_psi.cpp:131
| | 
| | What does arma::eglue_core do?
| | 
| | (gdb) list
| | /* reformatted a bit */
| | library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:64
| |  int n_threads = (std::min)(
| |   int(arma_config::mp_threads),
| |   int((std::max)(int(1), int(omp_get_max_threads(
| |  );
| | (gdb) p arma_config::mp_threads
| | $3 = 8
| | (gdb) p (int)omp_get_max_threads()
| | $4 = 16
| | (gdb) p (char*)getenv("OMP_THREAD_LIMIT")
| | $7 = 0x56576b91 "2"
| | (gdb) p /x (int)omp_get_thread_limit()
| | $9 = 0x7fff
| | 
| | Sorry for misinforming you about the OMP_THREAD_LIMIT environment
| | variable: the OpenMP specification requires the program to ignore
| | modifications to the environment variables after the program has
| | started [**], so it only works if R is started with OMP_THREAD_LIMIT
| | set. Additionally, the OpenMP thread limit is not supposed to be
| | adjusted at runtime at all [***].
| | 
| | Unfortunately for our situation, Armadillo is very insistent in setting
| | its own number of threads from arma_config::mp_threads (which is
| | constexpr 8 unless you set preprocessor directives while compiling it)
| | and omp_get_max_threads (which is the upper bound on the number of
| | threads that cannot be adjusted at runtime).
| | 
| | What I'm about to suggest is a terrible hack, but since Armadillo seems
| | to lack the option to set the number of threads at runtime, there might
| | be no other option.
| | 
| | Before you #include an Armadillo header, every time:
| | 
| | 1. #include  so that the OpenMP functions are declared and the
| | #include guard is set
| | 
| | 2. Define a static inline function get_number_of_threads returning the
| | desired number of threads as an int (e.g. referencing an extern int
| | number_of_threads stored elsewhere)
| | 
| | 3. #define omp_get_max_threads get_number_of_threads
| | 
| | Now if you provide an API for the R code to get and set this number, it
| | should be possible to control the number of threads used by OpenMP code
| | in Armadillo. Basically, a data.table::setDTthreads() for the copy of
| | Armadillo inlined inside your package.
| | 
| | If you then compile your package with a large #define
| | ARMA_OPENMP_THREADS, it will both be able to use more than 8 threads
| | *and* limit itself when needed.
| | 
| | An alternative course of action is compiling your package with #define
| | ARMA_OPENMP_THREADS 2 and giving up on more OpenMP threads inside calls
| | to Armadillo.

Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-24 Thread Dirk Eddelbuettel


On 24 October 2023 at 15:55, Ivan Krylov wrote:
| В Tue, 24 Oct 2023 10:37:48 +
| "Helske, Jouni"  пишет:
| 
| > Examples with CPU time > 2.5 times elapsed time
| >   user system elapsed ratio
| > exchange 1.196   0.04   0.159 7.774
| 
| I've downloaded the archived copy of the package from the CRAN FTP
| server, installed it and tried:
| 
| library(bssm)
| Sys.setenv("OMP_THREAD_LIMIT" = 2)
| data("exchange")
| model <- svm(
|  exchange, rho = uniform(0.97,-0.999,0.999),
|  sd_ar = halfnormal(0.175, 2), mu = normal(-0.87, 0, 2)
| )
| system.time(particle_smoother(model, particles = 500))
| #user  system elapsed
| #   0.515   0.000   0.073
| 
| I set a breakpoint on clone() [*] and got quite a few calls creating
| OpenMP threads with the following call stack:
| 
| #0  clone () at ../sysdeps/unix/sysv/linux/x86_64/clone.S:52
| <...>
| #4  0x77314e0a in GOMP_parallel () from
| /usr/lib/x86_64-linux-gnu/libgomp.so.1
|  <-- RcppArmadillo code below
| #5 0x738f5f00 in
| arma::eglue_core::apply,
| arma::eOp, arma::eop_exp>,
| arma::eop_scalar_times>, arma::eOp,
| arma::eop_scalar_div_post>, arma::eop_square> > (outP=..., x=...) at
| .../library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:69
| #6 0x73a31246 in
| arma::Mat::operator=,
| arma::eop_exp>, arma::eop_scalar_times>,
| arma::eOp, arma::eop_scalar_div_post>,
| arma::eop_square>, arma::eglue_div> (X=..., this=0x7fff36f0) at
| .../library/RcppArmadillo/include/armadillo_bits/Proxy.hpp:226
| #7
| 
arma::Col::operator=,
| arma::eop_exp>, arma::eop_scalar_times>,
| arma::eOp, arma::eop_scalar_div_post>,
| arma::eop_square>, arma::eglue_div> > ( X=..., this=0x7fff36f0) at
| .../library/RcppArmadillo/include/armadillo_bits/Col_meat.hpp:535
|  <-- bssm code below
| #8  ssm_ung::laplace_iter (this=0x7fff15e0, signal=...) at
| model_ssm_ung.cpp:310
| #9  0x73a36e9e in ssm_ung::approximate (this=0x7fff15e0) at
| .../library/RcppArmadillo/include/armadillo_bits/arrayops_meat.hpp:27
| #10 0x73a3b3d3 in ssm_ung::psi_filter
| (this=this@entry=0x7fff15e0, nsim=nsim@entry=500, alpha=...,
| weights=..., indices=...) at model_ssm_ung.cpp:517
| #11 0x73948cd7 in psi_smoother (model_=..., nsim=nsim@entry=500,
| seed=seed@entry=1092825895, model_type=model_type@entry=3) at
| R_psi.cpp:131
| 
| What does arma::eglue_core do?
| 
| (gdb) list
| /* reformatted a bit */
| library/RcppArmadillo/include/armadillo_bits/mp_misc.hpp:64
|  int n_threads = (std::min)(
|   int(arma_config::mp_threads),
|   int((std::max)(int(1), int(omp_get_max_threads(
|  );
| (gdb) p arma_config::mp_threads
| $3 = 8
| (gdb) p (int)omp_get_max_threads()
| $4 = 16
| (gdb) p (char*)getenv("OMP_THREAD_LIMIT")
| $7 = 0x56576b91 "2"
| (gdb) p /x (int)omp_get_thread_limit()
| $9 = 0x7fff
| 
| Sorry for misinforming you about the OMP_THREAD_LIMIT environment
| variable: the OpenMP specification requires the program to ignore
| modifications to the environment variables after the program has
| started [**], so it only works if R is started with OMP_THREAD_LIMIT
| set. Additionally, the OpenMP thread limit is not supposed to be
| adjusted at runtime at all [***].
| 
| Unfortunately for our situation, Armadillo is very insistent in setting
| its own number of threads from arma_config::mp_threads (which is
| constexpr 8 unless you set preprocessor directives while compiling it)
| and omp_get_max_threads (which is the upper bound on the number of
| threads that cannot be adjusted at runtime).
| 
| What I'm about to suggest is a terrible hack, but since Armadillo seems
| to lack the option to set the number of threads at runtime, there might
| be no other option.
| 
| Before you #include an Armadillo header, every time:
| 
| 1. #include  so that the OpenMP functions are declared and the
| #include guard is set
| 
| 2. Define a static inline function get_number_of_threads returning the
| desired number of threads as an int (e.g. referencing an extern int
| number_of_threads stored elsewhere)
| 
| 3. #define omp_get_max_threads get_number_of_threads
| 
| Now if you provide an API for the R code to get and set this number, it
| should be possible to control the number of threads used by OpenMP code
| in Armadillo. Basically, a data.table::setDTthreads() for the copy of
| Armadillo inlined inside your package.
| 
| If you then compile your package with a large #define
| ARMA_OPENMP_THREADS, it will both be able to use more than 8 threads
| *and* limit itself when needed.
| 
| An alternative course of action is compiling your package with #define
| ARMA_OPENMP_THREADS 2 and giving up on more OpenMP threads inside calls
| to Armadillo.

We should work on adding such a run-time setter of the number of cores to
RcppArmadillo so that examples can dial down to 2 cores.  I have been doing
just that in package tiledb (via a setting internal to the TileDB Core
library) for 'ages' now and RcppArmadillo could and should offer the same.

Re: [R-pkg-devel] Too many cores used in examples (not caused by data.table)

2023-10-20 Thread Dirk Eddelbuettel


On 19 October 2023 at 05:57, Helske, Jouni wrote:
| I am having difficulties in getting the latest version of the bssm 
(https://github.com/helske/bssm) package to CRAN, as the pretest issues a NOTE 
that the package uses too many cores in some of the examples ("Examples with 
CPU time > 2.5 times elapsed time"). I've seen plenty of discussion about this 
issue in relation to the data.table package, but bssm does not use it. Also, 
while bssm uses OpenMP in some functions, these are not called in the example 
in question (?exchange), and by default the number of threads in the 
parallelisable functions is set to 1.
| 
| But I just realised that bssm uses Armadillo via RcppArmadillo, which uses 
OpenMP by default for some elementwise operations. So, I wonder if that could 
be the culprit? However, I would think that in such case there would be many 
other packages with RcppArmadillo encountering the same CRAN issues. Has anyone 
experienced this with their packages which use RcppArmadillo but not 
data.table, or can say whether my guess is correct? I haven't been able to 
reproduce the issue myself on r-hub or my own linux, so I can't really test 
whether setting #define ARMA_DONT_USE_OPENMP helps.

You have some options to control OpenMP.

There is an environment variable (OMP_THREAD_LIMIT), and there is an CRAN
add-on package (RhpcBLASctl) which, if memory serves, also sets this. Looking
at the Armadillo documentation we see another variable (ARMA_OPENMP_THREADS).
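The first option can be sketched from the R side as follows; this is a hedged sketch, assuming RhpcBLASctl is installed, and noting (per the later discussion in this thread) that the environment variable is only honoured if set before R starts:

```r
## Cap OpenMP threads at runtime via RhpcBLASctl (works after startup)
if (requireNamespace("RhpcBLASctl", quietly = TRUE)) {
    RhpcBLASctl::omp_set_num_threads(2L)
}
## By contrast, OMP_THREAD_LIMIT only takes effect if set before launch,
## e.g. from the shell:  OMP_THREAD_LIMIT=2 R CMD check ...
```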

I really think CRAN made a mistake here pushing this down on all package
maintainers.  It is too much work, some will get frustrated, some will get it
wrong and I fear in aggregate we end up with less performant software (as
some will 'cave in' and hard-wire single threaded computes). 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


On 18 October 2023 at 08:51, Simon Urbanek wrote:
| John,
| 
| the short answer is it won't work (it defeats the purpose of vignettes).

Not exactly. Everything is under our (i.e. package author) control, and when
we want to replace 'computed' values with cached values we can.

All this is somewhat of a charade. "Of course" we want vignettes to run
tests. But then we don't want to fall over random missing .sty files or fonts
(macOS machines have been less forgiving than others), not to mention compile
time.

So for simplicity I often pre-make pdf vignettes that get included in other
latex code as source. Works great, never fails, CRAN never complained --
which is somewhat contrary to your statement.

It is effectively the same with tests. We all want maximum test surfaces. But
when tests fail, or when they run too long, or [insert many other reasons
here] so many packages run tests conditionally.  Such is life.
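Conditional tests of this kind often reduce to an environment-variable guard; a minimal sketch (the variable name RUN_EXTENDED_TESTS and the file name are illustrative assumptions, not a CRAN convention):

```r
## In tests/run-all.R: execute the slow tests only when explicitly enabled
if (identical(Sys.getenv("RUN_EXTENDED_TESTS"), "yes")) {
    source("extended-tests.R")  # hypothetical file holding the long-running tests
}
```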

Dirk

 
| However, this sounds like a purely hypothetical question - CRAN policies 
allow long-running vignettes if they declared.
| 
| Cheers,
| Simon
| 
| 
| > On 18/10/2023, at 3:02 AM, John Fox  wrote:
| > 
| > Hello Dirk,
| > 
| > Thank you (and Kevin and John) for addressing my questions.
| > 
| > No one directly answered my first question, however, which was whether the 
approach that I suggested would work. I guess that the implication is that it 
won't, but it would be nice to confirm that before I try something else, 
specifically using R.rsp.
| > 
| > Best,
| > John
| > 
| > On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
| >> Caution: External email.
| >> On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| >> | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
| >> | package.
| >> For completeness, that approach also works directly with Sweave. Described 
in
| >> a blog post by Mark van der Loo in 2019, and used in a number of packages
| >> including a few of mine.
| >> That said, I also used the approach described by John Harrold and cached
| >> results myself.
| >> Dirk
| >> --
| >> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| >> __
| >> R-package-devel@r-project.org mailing list
| >> https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| > 
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


John,

On 17 October 2023 at 10:02, John Fox wrote:
| Hello Dirk,
| 
| Thank you (and Kevin and John) for addressing my questions.
| 
| No one directly answered my first question, however, which was whether 
| the approach that I suggested would work. I guess that the implication 
| is that it won't, but it would be nice to confirm that before I try 
| something else, specifically using R.rsp.

I am a little remote here, both mentally and physically. What I might do here
in the case of your long-running vignette, and have done in about half a
dozen packages where I wanted 'certainty' and no surprises, is to render the
pdf vignette I want as I want them locally, ship them in the package as an
included file (sometimes from a subdirectory) and have a five-or-so line
Sweave .Rnw file include it. That works without hassles. Here is the Rnw I
use for package anytime

-
\documentclass{article}
\usepackage{pdfpages}
%\VignetteIndexEntry{Introduction to anytime}
%\VignetteKeywords{anytime, date, datetime, conversion}
%\VignettePackage{anytime}
%\VignetteEncoding{UTF-8}

\begin{document}
\includepdf[pages=-, fitpaper=true]{anytime-intro.pdf}
\end{document}
-

That is five lines of LaTeX code slurping in the pdf (per the blog post by
Mark). As I understand it R.rsp does something similar at the marginal cost
of an added dependency.

Now, as mentioned, you can also 'conditionally' compute in a vignette and
choose if and when to use a data cache. I think that we show most of that in
the package described in the RJournal piece by Brooke and myself on drat for
data repositories. (We may be skipping the compute when the data is not
accessible. Loading a precomputed set is similar. I may be doing that in the
much older, never quite finished gcbd package and its vignette.)
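Such a compute-or-cache step in a vignette chunk might look like the following sketch; the gate variable and cache file name are illustrative assumptions, not taken from drat or gcbd:

```r
## Compute fresh results only when explicitly allowed (e.g. by the package
## author before release); otherwise load a cached copy shipped alongside
## the vignette so CRAN never runs the expensive step
cache <- "results-cache.rds"        # hypothetical cached result file
if (identical(Sys.getenv("REBUILD_VIGNETTE_CACHE"), "yes")) {
    res <- heavy_computation()      # hypothetical long-running step
    saveRDS(res, cache)
} else {
    res <- readRDS(cache)
}
```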

Hope this helps, maybe more once I am back home.

Cheers, Dirk
 
| Best,
|   John
| 
| On 2023-10-17 4:02 a.m., Dirk Eddelbuettel wrote:
| > Caution: External email.
| > 
| > 
| > On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| > | Produce a PDF file yourself, then use the "as.is" feature of the R.rsp
| > | package.
| > 
| > For completeness, that approach also works directly with Sweave. Described 
in
| > a blog post by Mark van der Loo in 2019, and used in a number of packages
| > including a few of mine.
| > 
| > That said, I also used the approach described by John Harrold and cached
| > results myself.
| > 
| > Dirk
| > 
| > --
| > dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
| > 
| > __
| > R-package-devel@r-project.org mailing list
| > https://stat.ethz.ch/mailman/listinfo/r-package-devel
| 

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Suppressing long-running vignette code in CRAN submission

2023-10-17 Thread Dirk Eddelbuettel


On 16 October 2023 at 10:42, Kevin R Coombes wrote:
| Produce a PDF file yourself, then use the "as.is" feature of the R.rsp 
| package.

For completeness, that approach also works directly with Sweave. Described in
a blog post by Mark van der Loo in 2019, and used in a number of packages
including a few of mine.

That said, I also used the approach described by John Harrold and cached
results myself.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Cadence of macOS builds at CRAN

2023-09-14 Thread Dirk Eddelbuettel


Simon,

A new package of mine [1] appeared on CRAN on Sep 5. Respecting the one week 
gap,
I made a small update on Sep 12.

Today is Sep 14. There are still no builds for
  macOS r-release (arm64)
  macOS r-oldrel (arm64)
  macOS r-release (x86_64)
but we do have two oldrel builds. Weirder still, we have results for
macOS r-release (x86_64) even though the binary is not listed.

There is nothing tricky in the package or its dependencies.  Could you provide
an update of what should and can be expected in the macOS provision? Is this
a matter of intra-CRAN syncing between your builder(s) and the Vienna site?

Thanks as always,  Dirk

[1] https://cran.r-project.org/package=RcppInt64

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-31 Thread Dirk Eddelbuettel


On 31 August 2023 at 07:32, SHIMA Tatsuya wrote:
| I submitted prqlr 0.5.1 yesterday, which is almost identical to prqlr 
| 0.5.0, and prqlr is now available again on CRAN.
| Thanks to the CRAN reviewers for their quick reaction.

And it is gone again (per CRANberries). Never a dull moment with CRAN. 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 18:48, Jeff Newmiller wrote:
| You have a really bizarre way of twisting what others are saying, Dirk. I 
have seen no-one here saying 'limit R to 2 threads' except for you, as a way to 
paint opposing views to be absurd.

That's too cute.

Nobody needs to repeat it, and some of us know that "it is law"
as the "CRAN Repository Policy" (which each package upload
promises to adhere to) says
 
   If running a package uses multiple threads/cores it must never
   use more than two simultaneously: the check farm is a shared
   resource and will typically be running many checks
   simultaneously.

You may find reading the document informative. The source reference
(mirrored for convenience at GH) of that line is

https://github.com/r-devel/r-dev-web/blob/master/CRAN/Policy/CRAN_policies.texi#L244-L246

and the rendered page is at 
https://cran.r-project.org/web/packages/policies.html

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 26 August 2023 at 12:05, Simon Urbanek wrote:
| In reality it's more people running R on their laptops vs the rest of the 
world.

My point was that we also have 'single user on really Yuge workstation'. 

Plus we all know that those users are often not sysadmins, and do not have
our levels of accumulated systems knowledge.

So we should give _more_ power by default, not less.

| [...] they will always be saying blatantly false things like "R is not for 
large data"

By limiting R (and/or packages) to two threads we will only get more of
these.  Our collective call.

This whole thread is pretty sad, actually.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 18:45, Duncan Murdoch wrote:
| The real problem is that there are two stubborn groups opposing each 
| other:  the data.table developers and the CRAN maintainers.  The former 
| think users should by default dedicate their whole machine to 
| data.table.  The latter think users should opt in to do that.

No, it feels more like it is CRAN versus the rest of the world.

Take but one example, and as I may have mentioned elsewhere, my day job
consists in providing software so that (to take one recent example)
bioinformatics specialist can slice huge amounts of genomics data.  When that
happens on a dedicated (expensive) hardware with dozens of cores, it would be
wasteful to have an unconditional default of two threads. It would be the end
of R among serious people, no more, no less. Can you imagine how the internet
headlines would go: "R defaults to two threads". 

And it is not just data.table as even in the long thread over in its repo we
have people chiming in using OpenMP in their code (as data.table does but
which needs a different setter than the data.table thread count).

It is the CRAN servers which (rightly !!) want to impose constraints for when
packages are tested.  Nobody objects to that.

But some of us wonder if setting these defaults for all R users, all the
time, unconditionally is really the right thing to do.  Anyway, Uwe told me he
will take it to an internal discussion, so let's hope sanity prevails.

Dirk
-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trouble with long-running tests on CRAN debian server

2023-08-25 Thread Dirk Eddelbuettel


On 25 August 2023 at 15:37, Uwe Ligges wrote:
| 
| 
| On 23.08.2023 16:00, Scott Ritchie wrote:
| > Hi Uwe,
| > 
| > I agree and have also been burnt myself by programs occupying the 
| > maximum number of cores available.
| > 
| > My understanding is that in the absence of explicit parallelisation, use 
| > of data.table in a package should not lead to this type of behaviour?
| 
| Yes, that would be my hope, too.

No, everybody involved with data.table thinks using 50% is already a
compromise giving up performance, see eg Jan's comment from yesterday (and
everything leading up to it):

   https://github.com/Rdatatable/data.table/issues/5658#issuecomment-1691831704

*You* have a local constraint (that is perfectly reasonable) as *you* run
multiple package tests. So *you* should set a low value for OMP_THREAD_LIMIT.

Many users spend top dollars to have access to high-powered machines for
high-powered analyses. They do want all cores.

There simply cannot be one setting that addresses all situations. Please set
a low limit as your local deployment requires it.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Re-building vignettes had CPU time 9.2 times elapsed time

2023-08-25 Thread Dirk Eddelbuettel


On 24 August 2023 at 07:42, Fred Viole wrote:
| Hi, I am receiving a NOTE upon submission regarding the re-building of
| vignettes for CPU time for the Debian check.
| 
| I am unable to find any documented instances or solutions to this issue.
| The vignettes currently build in 1m 54.3s locally and in 56s on the Win
| check.
| 
| 
https://win-builder.r-project.org/incoming_pretest/NNS_10.1_20230824_132459/Debian/00check.log

Please see, inter alia, the long running thread

   "Trouble with long-running tests on CRAN debian server"

started earlier this week (!!) on this list covering exactly this issue.

We can only hope CRAN comes to understand our point that _it_ should set a
clearly-identifiable variable (the OpenMP thread count would do) so that
package data.table can honour this for its several hundred users.

As things currently stand, CRAN expects several hundred packages (such as
yours; guessing this comes from data.table, which I do not know for sure,
but you do import it) to make the change, which is pretty close to the
textbook definition of madness.

Also see https://github.com/Rdatatable/data.table/issues/5658 with by now 24
comments.  It is on the same issue.

Uwe, Kurt: Please please please set OMP_THREAD_LIMIT to 2 on the Windows and
Debian machines doing this test.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Setting valgrind options when running R CMD check --use-valgrind

2023-08-23 Thread Dirk Eddelbuettel


On 23 August 2023 at 16:49, Duncan Murdoch wrote:
| On 23/08/2023 2:54 p.m., Dirk Eddelbuettel wrote:
| > 
| > When I invoke valgrind via
| > R -d valgrind -e '...'
| > the options in the file ~/.valgrindrc are being picked up. Good.
| > 
| > When I invoke valgrind via
| > R CMD check --use-valgrind ...
| > the options in the file ~/.valgrindrc are NOT being picked up. Bad.
| > 
| > And valgrind complains.  How can I add the needed options?  Adding
| > --debugger-args=""
| > does not work.  Is there another trick?
| 
| I don't know the answer to your question, but here's something to try. 
| There's a way to run an "R CMD check" equivalent from a regular session, 
| so presumably it could be done from "R -d valgrind -e":
| 
|  tools:::.check_packages(c("pkg", "--option1", "--option2"))
| 
| A likely problem is that many of the check tests are run in separate 
| processes; I don't know if the valgrind setting would be inherited or not.

Thanks for the reminder, I also re-realized by re-reading WRE that setting
VALGRIND_OPTS="" works.  And with that I am no longer fully sure I can
claim that ~/.valgrindrc was ignored.  I may have misread an error.

Thanks for the prompt help, it is appreciated.

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Setting valgrind options when running R CMD check --use-valgrind

2023-08-23 Thread Dirk Eddelbuettel


When I invoke valgrind via
   R -d valgrind -e '...'
the options in the file ~/.valgrindrc are being picked up. Good.

When I invoke valgrind via
   R CMD check --use-valgrind ...
the options in the file ~/.valgrindrc are NOT being picked up. Bad.

And valgrind complains.  How can I add the needed options?  Adding
   --debugger-args=""
does not work.  Is there another trick?

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trouble with long-running tests on CRAN debian server

2023-08-21 Thread Dirk Eddelbuettel


On 21 August 2023 at 16:05, Ivan Krylov wrote:
| Dirk is probably right that it's a good idea to have OMP_THREAD_LIMIT=2
| set on the CRAN check machine. Either that, or place the responsibility
| on data.table for setting the right number of threads by default. But
| that's a policy question: should a CRAN package start no more than two
| threads/child processes even if it doesn't know it's running in an
| environment where the CPU time / elapsed time limit is two?

Methinks that given this language in the CRAN Repository Policy

  If running a package uses multiple threads/cores it must never use more
  than two simultaneously: the check farm is a shared resource and will
  typically be running many checks simultaneously.

it would indeed be nice if this variable, and/or equivalent ones, were set.

As I mentioned before, I had long added a similar throttle (not for
data.table) in a package I look after (for work, even). So a similar
throttler with optionality is below. I'll add this to my `dang` package
collecting various functions.

A usage example follows. It does nothing by default, ensuring 'full power'
but reflects the minimum of two possible options, or an explicit count:

> dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '12'.
> Sys.setenv("OMP_THREAD_LIMIT"=3); dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '3'.
> options(Ncpus=2); dang::limitDataTableCores(verbose=TRUE)
Limiting data.table to '2'.
> dang::limitDataTableCores(1, verbose=TRUE)
Limiting data.table to '1'.
>

That makes it, in my eyes, preferable to any unconditional 'always pick 1 
thread'.

Dirk


##' Set threads for data.table respecting possible local settings
##'
##' This function set the number of threads \pkg{data.table} will use
##' while reflecting two possible machine-specific settings from the
##' environment variable \sQuote{OMP_THREAD_LIMIT} as well as the R
##' option \sQuote{Ncpus} (used e.g. for parallel builds).
##' @title Set data.table threads respecting default settings
##' @param ncores A numeric or character variable with the desired
##' count of threads to use
##' @param verbose A logical value with a default of \sQuote{FALSE} to
##' operate more verbosely
##' @return The return value of the \pkg{data.table} function
##' \code{setDTthreads} which is called as a side-effect.
##' @author Dirk Eddelbuettel
##' @export
limitDataTableCores <- function(ncores, verbose = FALSE) {
    if (missing(ncores)) {
        ## start with a simple fallback: 'Ncpus' (if set) or else 2
        ncores <- getOption("Ncpus", 2L)
        ## also consider OMP_THREAD_LIMIT (cf Writing R Extensions), gets NA if envvar unset
        ompcores <- as.integer(Sys.getenv("OMP_THREAD_LIMIT"))
        ## and then keep the smaller
        ncores <- min(na.omit(c(ncores, ompcores)))
    }
    stopifnot("Package 'data.table' must be installed." = requireNamespace("data.table", quietly = TRUE))
    stopifnot("Argument 'ncores' must be numeric or character" = is.numeric(ncores) || is.character(ncores))
    if (verbose) message("Limiting data.table to '", ncores, "'.")
    data.table::setDTthreads(ncores)
}

| 
| -- 
| Best regards,
| Ivan
| 
| __
| R-package-devel@r-project.org mailing list
| https://stat.ethz.ch/mailman/listinfo/r-package-devel

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trouble with long-running tests on CRAN debian server

2023-08-21 Thread Dirk Eddelbuettel


On 21 August 2023 at 15:16, Ivan Krylov wrote:
| On Mon, 21 Aug 2023 12:02:55 +0100
| Scott Ritchie  wrote:
| 
| > remotes::install_github("sritchie73/ukbnmr")
| > library(ukbnmr)
| > system.time({ remove_technical_variation(test_data) })
| 
| data.tables, you say? Can you show us the NOTE message you're getting?
| It could be that your example takes too much CPU time (as opposed to
| "real", "wallclock" time) due to running too many threads started by
| data.table.

Yep, and that is a new test AFAIK.
 
| It's not obvious why data.table would start too many threads (it's
| supposed to honour the limits that CRAN expresses in environment
| variables), but at least it should be easy to check and discount.

It grabs all it can get, which is what you want for performance (I am on a
six-core machine here):

  $ R -q
  > library(data.table)
  data.table 1.14.8 using 6 threads (see ?getDTthreads).  Latest news: r-datatable.com
  > 

and it honors variables if set

  $ OMP_THREAD_LIMIT=2 R -q
  > library(data.table)
  data.table 1.14.8 using 2 threads (see ?getDTthreads).  Latest news: r-datatable.com
  > 

so I presume that variable is NOT set by CRAN.  It might help if it were.
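
For maintainers who would rather not rely on CRAN setting the variable, a defensive cap can be applied in examples or tests; a minimal sketch (assumes \pkg{data.table} is installed; the fallback of two threads is an illustrative choice, not a CRAN requirement):

```r
## honour OMP_THREAD_LIMIT if set, otherwise cap at two threads
if (requireNamespace("data.table", quietly = TRUE)) {
    limit <- as.integer(Sys.getenv("OMP_THREAD_LIMIT", unset = "2"))
    data.table::setDTthreads(limit)
}
```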

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] possible solution to package-documentation-alias problem

2023-08-20 Thread Dirk Eddelbuettel


On 20 August 2023 at 09:22, Duncan Murdoch wrote:
| That seems like a bug that should be reported to the Roxygen authors.

Seb did so in June:

   https://github.com/r-lib/roxygen2/issues/1491

There has not been a response (but given the CRAN email to many maintainers
it has now been referenced multiple times).

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org



Re: [R-pkg-devel] Suppressing compiler warnings?

2023-08-14 Thread Dirk Eddelbuettel


On 14 August 2023 at 11:51, Mark Padgham wrote:
| An update of a package of mine got immediately kicked off CRAN because
| an externally-bundled file, which had not been changed for years,
| included "pragma clang system_header" to suppress warnings. The
| concern is fair enough. Removing that nevertheless now generates a heap
| of warnings which CRAN will have to wade through / grep to assess
| whether any are of concern. These can be easily suppressed by replacing
| the cheap "system_header" with targeted suppression of just two classes
| of warnings for that bundled header file only. But I guess this is also
| forbidden? Should I now just leave those warnings, and rely on CRAN's
| grep policies to assess them? Is there any public list of what such grep
| policies might be? (And no, I can't adapt the code to remove the
| warnings, because most are "zero-as-null-pointer-constant", but the "0"
| is actually better than nullptr the way the code is constructed. The
| rest are - currently - unimportant deprecation warnings, all of one
| specific class.)

It is in the CRAN Repository Policy:

   - Packages should not attempt to disable compiler diagnostics, nor to
 remove other diagnostic information such as symbols in shared objects. 

Per 'svn blame' it goes back to commit 4049 from Apr 2019, but a commit from
Nov 2017 has 'memtion (sic!) not disabling compiler diagnostics'.

FWIW I have had to do it for BH and RcppEigen for some time leading to both
of them 'spamming' package users with a lot of noise. I don't particularly
like that, but I also do not have too many choices here.

Dirk



Re: [R-pkg-devel] gdb availability on r-devel-linux-x86_64-fedora-gcc

2023-08-12 Thread Dirk Eddelbuettel


On 12 August 2023 at 18:12, Uwe Ligges wrote:
| On 12.08.2023 15:10, Jamie Lentin wrote:
| > The system call in question is done by the TMB package[2], and not ours 
| > to tinker with:
| > 
| >    cmd <- paste("R --vanilla < ",file," -d gdb --debugger-args=\"-x",
| >     gdbscript,"\"")
| >    txt <- system(cmd,intern=TRUE,ignore.stdout=FALSE,ignore.stderr=TRUE)
| > 
| > My only vaguely reasonable guess is that gdb isn't available on the host 
| > in question (certainly R will be!). How likely is this? Is it worth 
| > trying to resubmit with the call wrapped with an "if (gdb is on the path)"?
| 
| 
| I guess it is really not available as that system got an update.
| Note that your package does not declare any SystemRequirements. Please do 
| so and mention gdb.
| 
| Wrapping it in "if (gdb is on the path)" seems a good solution.

Seconded, especially as some systems may have lldb instead of gdb, or neither.
Adding a simple `if (nzchar(Sys.which("gdb")))` should get you there.
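
The suggested guard would look roughly like this (a sketch; the commented placeholder stands in for whatever gdb-based TMB invocation the package wraps):

```r
## only run the debugger-based check when gdb is actually on the PATH
if (nzchar(Sys.which("gdb"))) {
    ## ... invoke the gdb-based TMB call here ...
} else {
    message("gdb not found on PATH, skipping debugger-based check")
}
```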

Dirk


