Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Simon Urbanek
I think it would be quite useful to have some community repository of code 
snippets dealing with such situations. R-exts gives advice and pieces of code 
which are useful, but they are not complete solutions and situations like 
Dirk's example are not that uncommon. (E.g., I recall some of the spatial 
packages copy/pasting code from each other for quite some time - which works, 
but is error prone if changes need to be made).

If one has to rely on a 3rd-party library and wants to fall back to source 
compilation when it is not available, that is quite a complex task, because one 
has to match the library's build system to R's and to the package build rules as 
well. There are many ways in which this can go wrong - Dirk mentioned some of them 
- and ideally not every package developer in that situation should have to go 
through the pain of learning all the details the hard way.
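
For concreteness, a minimal sketch of the usual pattern - a configure script
that prefers a system copy of a (purely hypothetical) libfoo found via
pkg-config and otherwise falls back to a copy bundled under src/libfoo - could
look roughly like this:

#!/bin/sh
# Sketch only: "foo", the libfoo/ layout and the variable names are illustrative.
if pkg-config --exists foo; then
  PKG_CFLAGS=`pkg-config --cflags foo`
  PKG_LIBS=`pkg-config --libs foo`
  NEED_BUNDLED=""                   # system library found, nothing extra to build
else
  PKG_CFLAGS="-Ilibfoo/include"
  PKG_LIBS="libfoo/libfoo.a"
  NEED_BUNDLED="libfoo/libfoo.a"    # Makevars target that builds the bundled copy
fi
sed -e "s|@PKG_CFLAGS@|${PKG_CFLAGS}|" \
    -e "s|@PKG_LIBS@|${PKG_LIBS}|" \
    -e "s|@NEED_BUNDLED@|${NEED_BUNDLED}|" \
    src/Makevars.in > src/Makevars

src/Makevars.in then needs the matching rule that actually builds
libfoo/libfoo.a, and that is exactly the part where matching the library's
build system to R's gets fiddly.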

Of course there are other packages to look at as examples, but for someone not familiar 
with the details it's hard to see which ones do it right and which ones don't 
- we don't always catch all the bad cases on CRAN.

I don't have a specific proposal, but if there were a GitHub repo or wiki or 
something to try to distill the useful bits from existing packages, I'd be 
happy to review it and give advice based on my experience from the macOS 
binary maintenance, if that's useful.

Cheers,
Simon


> On May 12, 2023, at 8:36 AM, Dirk Eddelbuettel  wrote:
> 
> 
> Hi Reed,
> 
> On 11 May 2023 at 11:15, Reed A. Cartwright wrote:
> | I'm curious why you chose to call cmake from make instead of from configure.
> | I've always seen cmake as part of the configure step of package building.
> 
> Great question! Couple of small answers: i) This started as a 'proof of
> concept' that aimed to be small so getting by without requiring `configure`
> seemed worth a try, ii) I had seen another src/Makevars invoking compilation
> of a static library in a similar (albeit non-cmake) way, and iii) as we now
> know, section 1.2.6 (or soon 1.2.9) 'Using cmake' has it that way too.
> 
> Otherwise I quite like having `configure` and I frequently use it -- made
> from 'genuine' configure.in via `autoconf`, or as scripts in shell or other
> languages.
> 
> Cheers, Dirk
> 
> PS My repaired package is now on CRAN. I managed to bungle the static library
> build (by not telling `cmake` to use position independent code), bungled
> macOS by not telling myself where `cmake` could live, and in fixing that
> bungled Windows by forgetting to add `src/Makevars.win` fallback. Yay me.
> 
> -- 
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Bioc-devel] BiocManager::install

2023-05-11 Thread Kasper Daniel Hansen
It seems totally sensible to be able to use BiocManager to install either
bioc-release or bioc-devel at any time, provided you're running R-devel.
First, by definition, R-devel is always >= the R used for release / devel,
and second, it is reasonable to assume that users of R-devel know what they
are doing.

I am unsure if you're arguing for anything else.

Best,
Kasper

On Thu, May 11, 2023 at 10:25 AM Wolfgang Huber 
wrote:

> Hi Kasper
>
> My use case is simple: anyone who works with R-devel and wants to use a
> package on Bioconductor from April to October.
> Many of the 2230 packages in our repository are useful outside of the
> BiocGenerics, IRanges, SummarizedExperiment core world.
> E.g., to name a few, BiocParallel, illuminaio, rhdf5, EBImage, ggtree,
> edgeR, limma, qvalue, sparseMatrixStats, … and I do not think “we” should
> be telling people who want to use these which version of R they must use.
> Btw these examples are all “highly downloaded”.
>
> I fully understand the wish to make people use coherent versions of
> packages and R for situations where lots of interdependent packages,
> classes, methods etc. are imported.
> But sometimes, people just need one or two packages, and then R’s built-in
> dependency management works just fine and the current BiocManager approach
> is needlessly intrusive.
>
> It has even made me wonder whether to recommend that authors of
> packages that do not directly build upon BiocGenerics, IRanges etc.
> submit them to CRAN, to increase the potential user base (b/c installation from
> Bioconductor can be such a pain). And that’s really not the place I want to
> be.
>
> Thanks and best wishes
> Wolfgang
>
>
>
>
>
> > On 10.05.2023, at 17:12, Kasper Daniel Hansen <
> kasperdanielhan...@gmail.com> wrote:
> >
> > Could we get a list of use cases from Wolfgang? I am confused about what
> > the issue is. Is the issue that it is painful to work with R-devel in the
> > "off" 6-months? If so, I agree that it should be easier (even if we don't
> > recommend it). But I am having a hard time parsing the email.
> >
> > I can recognize Martin M's wish: a way to run Bioc-release on R-devel;
> that
> > seems sensible to me.
> >
> > Best,
> > Kasper
> >
> > On Tue, May 9, 2023 at 3:46 AM Martin Maechler <
> maech...@stat.math.ethz.ch>
> > wrote:
> >
> >>> Wolfgang Huber
> >>>on Sun, 7 May 2023 14:29:37 +0200 writes:
> >>
> >>> Hi Martin As you correctly point out, Bioconductor package
> >>> developers are probably not those with the most relevant
> >>> use cases. I think there are use cases for everyone
> >>> else—anyone who decides to write code on R-devel, for
> >>> whatever reason, and just wants to use a Bioconductor
> >>> package between mid-April to mid-October (they could
> >>> develop for CRAN, or just be a user and write scripts and
> >>> packages for a private project). There are many useful
> >>> packages on Bioconductor that are of general interest,
> >>> even for people whose work does not center around
> >>> Bioconductor or biology (say, ggtree, rhdf5,
> >>> sparseMatrixStats, EBImage, …)
> >>
> >>> I added these ponderings also to
> >>> https://github.com/Bioconductor/pkgrevdocs/issues/108
> >>
> >>> Thanks and best wishes Wolfgang
> >>
> >> As the older ones among you know, I've been a BioC developer
> >> only many years ago ('hexbin' e.g.), but as an R package
> >> maintainer and co-maintainer and R Core team member,
> >> I really like to chime in here, declaring that it *has* been
> >> quite painful for me over the years to test CRAN packages which
> >> depend on BioC packages - with R-devel -- which is my primary R
> >> version for testing, notably also for testing potential changes in R
> >> across many packages, etc.
> >> Notably during this half of the year where there is no
> >> "official" way how to correctly install current Bioconductor packages
> >> (in their own package library, as I always do) under R-devel.
> >>
> >> If I'd be able to sum up the time lost over this issue for the last say
> 10
> >> years, it would add to a full working day at least. ...
> >>
> >> (and I have added a comment also in the above issue #108)
> >>
> >>
> >>> (PS in my particular case yesterday, it was just that my
> >>> R-devel is better maintained (built from source etc) and
> >>> has in its library some (non-BioC) packages with complex
> >>> systems dependencies that I need for a workflow I am
> >>> working on, packages that currently elude me on my binary
> >>> installation of R4.3. And then in addition I just wanted
> >>> to *use* a package from Bioconductor and didn’t like how
> >>> clumsy that experience was.)
> >>
> >> My other experience is that I always have to help people in my
> >> group to install our pcalg CRAN package because it depends
> >> e.g. on Bioc packages 'graph' and 'Rgraphviz' .. and on their
> >> laptops they somehow don't have the correct  getOption("repos")
> >> or there are other reasons why 

Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Dirk Eddelbuettel


Hi Reed,

On 11 May 2023 at 11:15, Reed A. Cartwright wrote:
| I'm curious why you chose to call cmake from make instead of from configure.
| I've always seen cmake as part of the configure step of package building.

Great question! Couple of small answers: i) This started as a 'proof of
concept' that aimed to be small so getting by without requiring `configure`
seemed worth a try, ii) I had seen another src/Makevars invoking compilation
of a static library in a similar (albeit non-cmake) way, and iii) as we now
know, section 1.2.6 (or soon 1.2.9) 'Using cmake' has it that way too.

Otherwise I quite like having `configure` and I frequently use it -- made
from 'genuine' configure.in via `autoconf`, or as scripts in shell or other
languages.

Cheers, Dirk

PS My repaired package is now on CRAN. I managed to bungle the static library
build (by not telling `cmake` to use position independent code), bungled
macOS by not telling myself where `cmake` could live, and in fixing that
bungled Windows by forgetting to add `src/Makevars.win` fallback. Yay me.
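
For anyone heading down the same route, a minimal sketch of a src/Makevars
that builds a bundled library with cmake (including the position-independent
code flag I had forgotten) before the package's shared object; 'foolib' and
the paths are illustrative, CMAKE is assumed to resolve to the cmake binary
(e.g. substituted by a small configure script as in WRE), cmake >= 3.13 is
assumed, and the recipe lines must start with a tab:

PKG_CPPFLAGS = -Ifoolib/include
PKG_LIBS = foolib/build/libfoo.a

.PHONY: all sublibs

all: $(SHLIB)
$(SHLIB): sublibs

sublibs:
	"$(CMAKE)" -S foolib -B foolib/build \
	    -DCMAKE_POSITION_INDEPENDENT_CODE=ON -DCMAKE_BUILD_TYPE=Release
	"$(CMAKE)" --build foolib/build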

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Inconsistent functionality of c++ code in MatchIt

2023-05-11 Thread Noah Greifer
I want to thank Bill and everyone who reached out to me individually. It
looks like Bill's solution is the right one, as he was able to replicate
and fix the problem. I am still a bit confused about why this would occur on
some OSs and not others (probably due to different compilers), but I think
the solution is just to write explicit, robust code that should always work.
Thank you!

Noah

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Inconsistent functionality of c++ code in MatchIt

2023-05-11 Thread Bill Dunlap
I see the problem when I compile the C++ code on Ubuntu 20.04 and the
latest R-devel with
   C++ compiler: ‘g++ (Ubuntu 9.3.0-17ubuntu1~20.04) 9.3.0’
If I change all the unadorned 'abs' calls in src/nn_matchC_vec.cpp to use the
prefix 'std::', the problem goes away.
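
A tiny standalone illustration of that failure mode (not the package code
itself): if the unqualified call resolves to the C declaration int abs(int),
the double argument is silently truncated, whereas std::abs() from <cmath>
keeps the double overload.

#include <cstdlib>   // declares int abs(int)
#include <cmath>     // declares the std::abs(double) overload
#include <iostream>

int main() {
    double d = 0.007;
    // On some toolchains the unqualified call picks ::abs(int), so a
    // caliper comparison would effectively see 0 instead of 0.007.
    std::cout << "abs(d)      -> " << abs(d) << "\n";
    std::cout << "std::abs(d) -> " << std::abs(d) << "\n";
}

Which overload the unqualified call finds depends on what the implementation's
headers inject into the global namespace, which would explain the
platform-dependent behaviour.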

-Bill


On Thu, May 11, 2023 at 11:12 AM Noah Greifer 
wrote:

> Hello,
>
> I'm the maintainer of the package *MatchIt*, which uses *Rcpp* to implement
> nearest neighbor matching. One way to customize nearest neighbor
> matching is to add a caliper, which is the largest distance two units can
> be from each other before they are not allowed to be matched. I've had some
> users complain recently that the caliper is not working for them, i.e.,
> even after specifying a caliper, units are being matched that are
> farther apart than the caliper width. I have been unable to replicate this
> problem on my Mac; the caliper always works as intended.
>
> One user noted that even when using the same package version, the
> performance varied across two machines: one obeyed the caliper and one
> didn't. I thought this might be related to the version of R installed, as
> for some users updating R fixed the issue, but for others it didn't. I'm
> kind of at a loss.
>
> I have a suspicion that the problem is related to the function abs() in the
> C++ functions find_right() and find_left() that I wrote to perform the
> matching, which are in nn_match_vec.cpp
> .
> I have had problems with abs() before (seemingly related to a namespace
> conflict between std and Rcpp). In this case, abs() is used in the
> following way:
>
> if (abs(distance[ii] - distance[k]) > caliper_dist) {
>   // if closest is outside caliper, break; none can be found
>   break;
> }
>
> Here, distance is a NumericVector, and ii and k are ints. My expectation is
> that this would dispatch to std::abs() with a double as its input. It's
> possible something is going wrong there. I'm wondering if this has to do
> with recent changes to R's C++ engine or compilers.
>
> If you want to run code to test whether the caliper is working correctly on
> your machine, you can run the following code:
>
> install.packages("MatchIt")
> data("lalonde", package = "MatchIt")
> m <- MatchIt::matchit(treat ~ age + educ + race + re74,
>   data = lalonde, caliper = .01)
> summary(m)$nn
>
> If the caliper is working correctly, you should see a small matrix that has
> the row
>
> Matched     88   88
>
> If not, you would see the row
>
> Matched   185 185
>
> The GitHub issue of people complaining about this is here
>  along with their
> explanations about versions of *MatchIt* and R. The package code is also
> there.
>
> Any thoughts or insights about this would really help! Thank you so much!
>
> Noah
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Reed A. Cartwright
Dirk,

I'm curious why you chose to call cmake from make instead of from
configure. I've always seen cmake as part of the configure step of package
building.

Thanks,
Reed


On Thu, May 11, 2023, 04:17 Dirk Eddelbuettel  wrote:

>
> On 11 May 2023 at 09:02, Martin Maechler wrote:
> | I've been told in private that the above "be happy if"
> | may *not* be a good idea,
> | or rather may even be close to impossible, as cmake seems not to fit
> | well, at all, with the quite sophisticated
> | autoconf -> configure -> make
> | setup we have with building R + recommended packages
> | cross-platform compatibly as well as possible.
>
> Yes, my bad -- it _is_ indeed in Writing R Extensions (albeit with an
> actual error, see below), so I now use what it recommends (in the form
> pointed out by Reed, though), so for me it is
>
> sh -> make -> cmake
>
> but could be autoconf too.
>
> I presume the macOS system needs it for cross-compilation or other dances.
> Still annoying as hell.  A simple default may be better but I do not know
> any
> of the dragons running around inside macOS.
>
> Dirk
>
>
> Appendix: Actual Error in Writing R Extensions.
>
> Section 1.2.6 ends with these lines (r-release and r-devel), which I am
> quoting in full:
>
> One way to work around this is for the package’s ‘configure’ script to
> include
>  if test -z "$CMAKE"; then CMAKE="`which cmake`"; fi
>  if test -z "$CMAKE"; then
> CMAKE=/Applications/CMake.app/Contents/bin/cmake; fi
>  if test -f "$CMAKE"; then echo "no 'cmake' command found"; exit
> 1; fi
> and for the second approach to substitute ‘CMAKE’ into ‘src/Makevars’.
>
> The final 'test -f' has to be negated. Demo:
>
> edd@rob:/tmp$ cat foo.sh
> ##!/bin/sh
> if test -z "$CMAKE"; then CMAKE="`which cmake`"; fi
> if test -z "$CMAKE"; then
> CMAKE=/Applications/CMake.app/Contents/bin/cmake; fi
> if test -f "$CMAKE"; then echo "no 'cmake' command found"; exit 1; fi
> echo "** using $CMAKE"
> edd@rob:/tmp$ ./foo.sh
> no 'cmake' command found
> edd@rob:/tmp$ which cmake
> /usr/bin/cmake
> edd@rob:/tmp$
>
> If I change foo.sh to use '! test -f' all is well
>
> edd@rob:/tmp$ be foo.sh    # be is an alias for emacsclient in tty mode
> edd@rob:/tmp$ ./foo.sh
> ** using /usr/bin/cmake
> edd@rob:/tmp$
>
> --
> dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org
>
> __
> R-package-devel@r-project.org mailing list
>
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Inconsistent functionality of c++ code in MatchIt

2023-05-11 Thread Noah Greifer
Hello,

I'm the maintainer of the package *MatchIt*, which uses *Rcpp* to implement
nearest neighbor matching. One way to customize nearest neighbor
matching is to add a caliper, which is the largest distance two units can
be from each other before they are not allowed to be matched. I've had some
users complain recently that the caliper is not working for them, i.e.,
even after specifying a caliper, units are being matched that are
farther apart than the caliper width. I have been unable to replicate this
problem on my Mac; the caliper always works as intended.

One user noted that even when using the same package version, the
performance varied across two machines: one obeyed the caliper and one
didn't. I thought this might be related to the version of R installed, as
for some users updating R fixed the issue, but for others it didn't. I'm
kind of at a loss.

I have a suspicion that the problem is related to the function abs() in the
C++ functions find_right() and find_left() that I wrote to perform the
matching, which are in nn_match_vec.cpp
.
I have had problems with abs() before (seemingly related to a namespace
conflict between std and Rcpp). In this case, abs() is used in the
following way:

if (abs(distance[ii] - distance[k]) > caliper_dist) {
  // if closest is outside caliper, break; none can be found
  break;
}

Here, distance is a NumericVector, and ii and k are ints. My expectation is
that this would dispatch to std::abs() with a double as its input. It's
possible something is going wrong there. I'm wondering if this has to do
with recent changes to R's C++ engine or compilers.

If you want to run code to test whether the caliper is working correctly on
your machine, you can run the following code:

install.packages("MatchIt")
data("lalonde", package = "MatchIt")
m <- MatchIt::matchit(treat ~ age + educ + race + re74,
  data = lalonde, caliper = .01)
summary(m)$nn

If the caliper is working correctly, you should see a small matrix that has
the row

Matched     88   88

If not, you would see the row

Matched   185 185

The GitHub issue of people complaining about this is here
 along with their
explanations about versions of *MatchIt* and R. The package code is also
there.

Any thoughts or insights about this would really help! Thank you so much!

Noah

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] range() for Date and POSIXct could respect `finite = TRUE`

2023-05-11 Thread Bill Dunlap
>  What do others think?

I can imagine a class, "TemperatureKelvins", that wraps a double but would
have a range of 0 to Inf, or one called "GymnasticsScore" with a range of 0
to 10.  For those sorts of things it would be nice to have a generic that
gives the possible min and max for the class, instead of one that just says
whether they can be -Inf and Inf or not.
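
A rough sketch of what such a generic could look like (name and example
classes purely illustrative):

possibleRange <- function(x, ...) UseMethod("possibleRange")
## default: plain numeric values can be anything
possibleRange.default <- function(x, ...) c(-Inf, Inf)
possibleRange.TemperatureKelvins <- function(x, ...) c(0, Inf)
possibleRange.GymnasticsScore <- function(x, ...) c(0, 10)

Something like range(x, finite = TRUE) could then consult possibleRange(x)
instead of hard-coding -Inf and Inf.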

-Bill

On Thu, May 11, 2023 at 1:49 AM Martin Maechler 
wrote:

> > Davis Vaughan
> > on Tue, 9 May 2023 09:49:41 -0400 writes:
>
> > It seems like the main problem is that `is.numeric(x)`
> > isn't fully indicative of whether or not `is.finite(x)`
> > makes sense for `x` (i.e.  Date isn't numeric but does
> > allow infinite dates).
>
> > So I could also imagine a new `allows.infinite()` S3
> > generic that would return a single TRUE/FALSE for whether
> > or not the type allows infinite values, this would also be
> > indicative of whether or not `is.finite()` and
> > `is.infinite()` make sense on that type. I imagine it
> > being used like:
>
> > ```
> >   allows.infinite <- function(x) {
> > UseMethod("allows.infinite")
> >   }
> >   allows.infinite.default <- function(x) {
> > is.numeric(x) # For backwards compatibility, maybe? Not sure.
> >   }
> >   allows.infinite.Date <- function(x) {
> > TRUE
> >   }
> >   allows.infinite.POSIXct <- function(x) {
> > TRUE
> >   }
> >
> >   range.default <- function (..., na.rm = FALSE, finite = FALSE) {
> > x <- c(..., recursive = TRUE)
> > if (allows.infinite(x)) { # changed from `is.numeric()`
> >   if (finite)
> > x <- x[is.finite(x)]
> >   else if (na.rm)
> > x <- x[!is.na(x)]
> >   c(min(x), max(x))
> > }
> > else {
> >   if (finite)
> > na.rm <- TRUE
> >   c(min(x, na.rm = na.rm), max(x, na.rm = na.rm))
> > }
> >   }
> >   ```
>
> > It could allow other R developers to also use the pattern of:
>
> > ```
> > if (allows.infinite(x)) {
> ># conditionally do stuff with is.infinite(x)
> > }
> > ```
>
> > and that seems like it could be rather nice.
>
> > It would avoid the need for `range.Date()` and `range.POSIXct()`
> methods too.
>
> > -Davis
>
> That *is* an interesting alternative perspective ...
> sent just about before I was going to commit my proposal (incl
> new help page entries, regr.tests ..).
>
> So we would introduce a new generic  allows.infinite() {or
> better name?,  allowsInf, ..} with the defined semantic that
>
> allows.infinite(x) for a vector 'x' gives a logical "scalar",
> TRUE iff it is known that  is.finite(x) "makes sense" and
> returns a logical vector of length length(x) .. which is TRUE
> where x[i] is not NA/NaN/+Inf/-Inf .. *and*
> is.infinite := Negate(is.finite){or vice versa if you prefer}.
>
> I agree that this may be useful somewhat more generally than
> just for  range() methods.
>
> What do others think?
>
> Martin
>
>
> > On Thu, May 4, 2023 at 5:29 AM Martin Maechler
> >  wrote:
> [..]
>
> >> > Davis Vaughan
> >> > on Mon, 1 May 2023 08:46:33 -0400 writes:
> >>
> >> > Martin,
> >> > Yes, I missed that those have `Summary.*` methods, thanks!
> >>
> >> > Tweaking those to respect `finite = TRUE` sounds great. It seems
> like
> >> > it might be a little tricky since the Summary methods call
> >> > `NextMethod()`, and `range.default()` uses `is.numeric()` to
> determine
> >> > whether or not to apply `finite`. Because `is.numeric.Date()` is
> >> > defined, that always returns `FALSE` for Dates (and POSIXt).
> Because
> >> > of that, it may still be easier to just write a specific
> >> > `range.Date()` method, but I'm not sure.
> >>
> >> > -Davis
> >>
> >> I've looked more closely now, and indeed,
> >> range() is the only function in the  Summary  group
> >> where (only) the default method has a 'finite' argument.
> >> which strikes me as somewhat asymmetric / inconsequential, as
> >> after all,  range(.) := c(min(.), max(.)) ,
> >> but  min() and max() do not obey an finite=TRUE setting, note
> >>
> >> > min(c(-Inf,3:5), finite=TRUE)
> >> Error: attempt to use zero-length variable name
> >>
> >> where the error message also is not particularly friendly
> >> and of course has nothing to with 'finite' :
> >>
> >> > max(1:4, foo="bar")
> >> Error: attempt to use zero-length variable name
> >> >
> >>
> >> ... but that is diverting;  coming back to the topic:  Given
> >> that 'finite' only applies to range() {and there is just a
> convenience},
> >> I do agree that from my own work & support to make `Date` and
> >> `POSIX(c)t` behave more number-like, it would be "nice" to have
> >> range() obey a `finite=TRUE` also for these.
> >>
> >> OTOH, there are quite a few other 'number-like' thingies for
> >> which I would then like to 

Re: [Rd] R-4.2.3 build from source on Windows (w Rtools42) - lto1.exe error

2023-05-11 Thread W. D.
Thanks to everybody for the replies.

tl;dr: it was a config problem with my environment being polluted by
other apps (also via $USERPROFILE/.bashrc).
R-4-2-branch builds with LTO and "march=native" (Rtools42; also managed to
build with "mtune=native", which afaik is implied anyway when
doing march, but still).

Details: I will try to aggregate answers to the comments/hints here and
document the sequence of actions I took, including results, as some form
of documentation of the current status and the hiccups in my setup
described in this email chain.

@Avraham, et al
Since I wasn't sure whether and how library updates I had done before
starting the build had played a role, I downloaded a fresh copy of the
Rtools42 installer from here
[https://cran.r-project.org/bin/windows/Rtools/rtools42/files/].
The current version via `cat /x86_64-w64-mingw32.static.posix/.version`
gave "5355".
Did some more "tests":
`cd R-4-2-branch/src/gnuwin32`
Ran the build with "mtune=native" - gave the same error
Ran `make clean`
Then using no EOPTS options in MkRules.local with version "5355" -> same error
Ran `make clean`
---
Then ran an update via `pacman -Syuu`
Which updates the following:
Packages (55) brotli-1.0.9-8  bsdtar-3.6.2-3
ca-certificates-20230311-1  coreutils-8.32-5  curl-8.0.1-1
  dash-0.5.12-1  db-5.3.28-4  diffutils-3.9-1  file-5.44-5
 gawk-5.2.1-2  gcc-libs-11.3.0-3
  gnupg-2.2.41-1  grep-1~3.0-6  heimdal-libs-7.8.0-3
info-7.0.3-1  less-633-1  libcrypt-2.1-4
  libcurl-8.0.1-1  libdb-5.3.28-4  libedit-20221030_3.1-1
libexpat-2.5.0-1  libffi-3.4.4-1
  libgcrypt-1.10.2-1  libgnutls-3.8.0-1
libgpg-error-1.47-1  libidn2-2.3.4-2  libksba-1.6.3-1
  liblzma-5.4.3-1  libnghttp2-1.52.0-1  libopenssl-3.1.0-2
 libpcre-8.45-3  libpsl-0.21.2-1
  libreadline-8.2.001-3  libsqlite-3.41.2-3
libssh2-1.10.0-3  libunistring-1.1-2  libxml2-2.10.4-1
  libzstd-1.5.5-1  make-4.4.1-1  mpfr-4.2.0.p4-1
msys2-keyring-1~20230316-1  ncurses-6.4-1
  openssl-3.1.0-2  patch-2.7.6-2  perl-5.36.0-1
pinentry-1.2.1-1  rebase-4.5.0-4  rsync-3.2.7-2
  sed-4.9-1  tcl-8.6.12-3  texinfo-7.0.3-1
texinfo-tex-7.0.3-1  xz-5.4.3-1  zlib-1.2.13-1  zstd-1.5.5-1

But what confuses me a bit is that after that,
`cat /x86_64-w64-mingw32.static.posix/.version` still only gives me "5355"?
As will be shown later (below), this was not a breaking issue.

Important change to my environment for the next attempt!
I decided to check my environment variables (one more time) after thoroughly
reading Tomas's comment ("maybe there is some config problem on the
system") as well as Prof. Ripley's ("first build without LTO to
isolate the issue") and noticed that the old Rtools40/.../bin folder
was also in my Windows %PATH% environment variable.
So I started a cleanup initiative from there.
I also noticed that the .bashrc file in my %USERPROFILE% folder
cluttered up my PATH - in particular, there were `/mingw-64/bin` and
`.../Library/bin` and similar entries from a miniconda3 installation,
amongst others, ghosting around in there!
Even though `gcc --version` or `make` did not return anything other
than `command not found`, I cleaned that .bashrc file up quite a bit.
After the cleanup I restarted MSYS2 Bash and ran `make distclean` in the
src/gnuwin32 folder one more time, then added miktex,
/x86_64-w64-mingw32.static.posix/bin, tar, etc. to $PATH / the environment
again.
`echo $PATH` now (after the extension of the PATH variable as described in
https://cran.r-project.org/bin/windows/base/howto-R-4.2.html)
finally looked like this
```
/c/Users/gwd/AppData/Local/Programs/MiKTeX/miktex/bin/x64:/x86_64-w64-mingw32.static.posix/bin:/usr/local/bin:/usr/bin:/bin:/opt/bin:/c/Windows/System32:/c/Windows:/c/Windows/System32/Wbem:/c/Windows/System32/WindowsPowerShell/v1.0/:/c/progra~1/git/cmd:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl
```
with all other Library, bin or similar (mainly miniconda3) folders gone now.
and
`which make gcc pdflatex tar` looks like this now (not much different
from before, but ...)
```
make is /usr/bin/make
make is /bin/make
gcc is /x86_64-w64-mingw32.static.posix/bin/gcc
pdflatex is /c/Users/gwd/AppData/Local/Programs/MiKTeX/miktex/bin/x64/pdflatex
tar is /usr/bin/tar
tar is /bin/tar
tar is /c/Windows/System32/tar
```
seems to look "better"!
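
(The prepend that produced the PATH above amounts to roughly the following -
the MiKTeX path is of course machine-specific:)
```
export PATH="/c/Users/gwd/AppData/Local/Programs/MiKTeX/miktex/bin/x64:/x86_64-w64-mingw32.static.posix/bin:$PATH"
```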

Then I started to build again ... which took me some time - that's why
there's this delay in my answer here. ..
First w/o any EOPTS -> succeeded!
Then `make distclean`
And then with `march=native` [as well as mtune] -> succeeded!

I will give the R-devel version with Rtools43 a shot as well (next)
week - if you don't hear back from me -> you can assume that (also)
worked correctly with LTO.

So ... as is so often the case: "the problem was sitting in front of the PC".
Thanks for the assistance and informative hints and sorry for
bothering you all basically b/c of my (.bashrc) setup conundrum!

Greetings,
Walter


On Thu, 11 May 2023 at 09:06, Tomas 

Re: [Bioc-devel] BiocManager::install

2023-05-11 Thread Wolfgang Huber
Hi Kasper

My use case is simple: anyone who works with R-devel and wants to use a package 
on Bioconductor from April to October.
Many of the 2230 packages in our repository are useful outside of the 
BiocGenerics, IRanges, SummarizedExperiment core world. 
E.g., to name a few, BiocParallel, illuminaio, rhdf5, EBImage, ggtree, edgeR, 
limma, qvalue, sparseMatrixStats, … and I do not think “we” should be telling 
people who want to use these which version of R they must use. Btw these 
examples are all “highly downloaded”.

I fully understand the wish to make people use coherent versions of packages 
and R for situations where lots of interdependent packages, classes, methods 
etc. are imported. 
But sometimes, people just need one or two packages, and then R’s built-in 
dependency management works just fine and the current BiocManager approach is 
needlessly intrusive.

It has even made me wonder whether to recommend that authors of packages 
that do not directly build upon BiocGenerics, IRanges etc. submit them to 
CRAN, to increase the potential user base (b/c installation from Bioconductor can 
be such a pain). And that’s really not the place I want to be.

Thanks and best wishes
Wolfgang





> On 10.05.2023, at 17:12, Kasper Daniel Hansen 
>  wrote:
> 
> Could we get a list of use cases from Wolfgang? I am confused about what
> the issue is. Is the issue that it is painful to work with R-devel in the
> "off" 6-months? If so, I agree that it should be easier (even if we don't
> recommend it). But I am having a hard time parsing the email.
> 
> I can recognize Martin M's wish: a way to run Bioc-release on R-devel; that
> seems sensible to me.
> 
> Best,
> Kasper
> 
> On Tue, May 9, 2023 at 3:46 AM Martin Maechler 
> wrote:
> 
>>> Wolfgang Huber
>>>on Sun, 7 May 2023 14:29:37 +0200 writes:
>> 
>>> Hi Martin As you correctly point out, Bioconductor package
>>> developers are probably not those with the most relevant
>>> use cases. I think there are use cases for everyone
>>> else—anyone who decides to write code on R-devel, for
>>> whatever reason, and just wants to use a Bioconductor
>>> package between mid-April to mid-October (they could
>>> develop for CRAN, or just be a user and write scripts and
>>> packages for a private project). There are many useful
>>> packages on Bioconductor that are of general interest,
>>> even for people whose work does not center around
>>> Bioconductor or biology (say, ggtree, rhdf5,
>>> sparseMatrixStats, EBImage, …)
>> 
>>> I added these ponderings also to
>>> https://github.com/Bioconductor/pkgrevdocs/issues/108
>> 
>>> Thanks and best wishes Wolfgang
>> 
>> As the older ones among you know, I've been a BioC developer
>> only many years ago ('hexbin' e.g.), but as an R package
>> maintainer and co-maintainer and R Core team member,
>> I really like to chime in here, declaring that it *has* been
>> quite painful for me over the years to test CRAN packages which
>> depend on BioC packages - with R-devel -- which is my primary R
>> version for testing, notably also for testing potential changes in R
>> across many packages, etc.
>> Notably during this half of the year where there is no
>> "official" way how to correctly install current Bioconductor packages
>> (in their own package library, as I always do) under R-devel.
>> 
>> If I'd be able to sum up the time lost over this issue for the last say 10
>> years, it would add to a full working day at least. ...
>> 
>> (and I have added a comment also in the above issue #108)
>> 
>> 
>>> (PS in my particular case yesterday, it was just that my
>>> R-devel is better maintained (built from source etc) and
>>> has in its library some (non-BioC) packages with complex
>>> systems dependencies that I need for a workflow I am
>>> working on, packages that currently elude me on my binary
>>> installation of R4.3. And then in addition I just wanted
>>> to *use* a package from Bioconductor and didn’t like how
>>> clumsy that experience was.)
>> 
>> My other experience is that I always have to help people in my
>> group to install our pcalg CRAN package because it depends
>> e.g. on Bioc packages 'graph' and 'Rgraphviz' .. and on their
>> laptops they somehow don't have the correct  getOption("repos")
>> or there are other reasons why install.packages('pcalg')
>> does not find its Bioc dependencies.
>> On our Linux desktop+server environment, I do setup
>>options(repos = )
>> such that everything works .. but alas, also *not* when in
>> R-devel but when you develop a package for CRAN / or only just
>> follow the more wide recommendation to also check your package
>> with current R-devel, then non-expert package developers need a
>> lot of stamina if their package depends (directly or
>> recursively) on a Bioc package
>> which is really unfortunate and tends to put the Bioconductor
>> project in a shady light it really has not deserved at all!
>> 
>> Martin
>> 
>> --
>> Martin Maechler
>> 

Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Dirk Eddelbuettel


On 11 May 2023 at 09:02, Martin Maechler wrote:
| I've been told in private that the above "be happy if"
| may *not* be a good idea,
| or rather may even be close to impossible, as cmake seems not to fit
| well, at all, with the quite sophisticated
| autoconf -> configure -> make
| setup we have with building R + recommended packages
| cross-platform compatibly as well as possible.

Yes, my bad -- it _is_ indeed in Writing R Extensions (albeit with an
actual error, see below), so I now use what it recommends (in the form
pointed out by Reed, though), so for me it is

sh -> make -> cmake

but could be autoconf too.

I presume the macOS system needs it for cross-compilation or other dances.
Still annoying as hell.  A simple default may be better but I do not know any
of the dragons running around inside macOS.

Dirk


Appendix: Actual Error in Writing R Extensions.

Section 1.2.6 ends with these lines (r-release and r-devel), which I am quoting in full:

One way to work around this is for the package’s ‘configure’ script to
include
 if test -z "$CMAKE"; then CMAKE="`which cmake`"; fi
 if test -z "$CMAKE"; then 
CMAKE=/Applications/CMake.app/Contents/bin/cmake; fi
 if test -f "$CMAKE"; then echo "no 'cmake' command found"; exit 1; fi
and for the second approach to substitute ‘CMAKE’ into ‘src/Makevars’.

The final 'test -f' has to be negated. Demo:

edd@rob:/tmp$ cat foo.sh 
##!/bin/sh
if test -z "$CMAKE"; then CMAKE="`which cmake`"; fi
if test -z "$CMAKE"; then CMAKE=/Applications/CMake.app/Contents/bin/cmake; 
fi
if test -f "$CMAKE"; then echo "no 'cmake' command found"; exit 1; fi
echo "** using $CMAKE"
edd@rob:/tmp$ ./foo.sh 
no 'cmake' command found
edd@rob:/tmp$ which cmake
/usr/bin/cmake
edd@rob:/tmp$ 

If I change foo.sh to use '! test -f' all is well

edd@rob:/tmp$ be foo.sh    # be is an alias for emacsclient in tty mode
edd@rob:/tmp$ ./foo.sh 
** using /usr/bin/cmake
edd@rob:/tmp$ 
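
i.e. the corrected test would read

if ! test -f "$CMAKE"; then echo "no 'cmake' command found"; exit 1; fi

(or, equivalently, 'test ! -f "$CMAKE"').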

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unneeded S3 method registration

2023-05-11 Thread Duncan Murdoch
The problem is that there's no way to declare that an internal function 
should or shouldn't be treated as an S3 method, other than by declaring 
it as one in NAMESPACE and exporting it.  If you read the thread 
"Unfortunate function name generic.something​" that started last week, 
you'll see the opposing problem to yours:  a local function named 
levels.no that isn't intended to be an S3 method, but will be treated as 
one in some circumstances.


You do export emm_basis and recover_data, so those generics are 
available to users.  And you do declare some of the methods, e.g. in 
emmeans 1.8.5, users can see methods for some classes:


> methods("emm_basis")
[1] emm_basis.aovlist* emm_basis.lm*  emm_basis.lme* 
emm_basis.merMod*  emm_basis.mlm*


As the message below says, you don't declare emm_basis.Gam to be a 
method, which means that a user with a Gam object who calls emm_basis 
will get the default method instead, whereas when code in your package 
calls it, they'll get the Gam method.


You say users should never call emm_basis directly, but package 
developers should provide methods for it.  At a minimum that's going to 
make debugging those packages much more confusing.  And if they have a 
class which inherits from Gam and want to call the inherited method, 
they won't get it.


So I think in this case the NOTE is something you should fix.
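
For reference, the registrations are just S3method() directives in NAMESPACE 
alongside the existing export() lines, one per method listed in the NOTE, e.g.:

S3method(emm_basis, Gam)
S3method(emm_basis, gam)
S3method(recover_data, gam)
## ... and so on for each method in the NOTE

(or, if NAMESPACE is generated by roxygen2, the corresponding @exportS3Method 
tags).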

Duncan Murdoch

On 10/05/2023 10:49 p.m., Lenth, Russell V wrote:

Dear R package developers

My emmeans package failed preliminary checks when I submitted an update today, 
apparently due to a recent change in requirements on method registration. The 
message I got was:

* checking S3 generic/method consistency ... NOTE
Apparent methods for exported generics not registered:
   emm_basis.Gam emm_basis.MCMCglmm emm_basis.averaging
   emm_basis.betareg emm_basis.brmsfit emm_basis.carbayes emm_basis.clm
   emm_basis.clmm emm_basis.coxme emm_basis.coxph emm_basis.default
   emm_basis.gam emm_basis.gamlss emm_basis.gamm emm_basis.gee
   emm_basis.geeglm emm_basis.geese emm_basis.gls emm_basis.gnls
   emm_basis.hurdle emm_basis.lqm emm_basis.lqmm emm_basis.mblogit
   emm_basis.mcmc emm_basis.mcmc.list emm_basis.mira emm_basis.mmer
   emm_basis.multinom emm_basis.nlme emm_basis.nls emm_basis.polr
   emm_basis.qdrg emm_basis.rms emm_basis.rq emm_basis.rqs
   emm_basis.stanreg emm_basis.survreg emm_basis.svyolr
   emm_basis.zeroinfl recover_data.MCMCglmm recover_data.averaging
   recover_data.betareg recover_data.brmsfit recover_data.carbayes
   recover_data.clm recover_data.clmm recover_data.coxme
   recover_data.coxph recover_data.default recover_data.gam
   recover_data.gamlss recover_data.gamm recover_data.gee
   recover_data.geeglm recover_data.geese recover_data.gls
   recover_data.gnls recover_data.hurdle recover_data.lqm
   recover_data.lqmm recover_data.manova recover_data.mblogit
   recover_data.mcmc recover_data.mcmc.list recover_data.mira
   recover_data.mmer recover_data.multinom recover_data.nlme
   recover_data.nls recover_data.polr recover_data.qdrg recover_data.rms
   recover_data.rq recover_data.rqs recover_data.stanreg
   recover_data.survreg recover_data.svyglm recover_data.svyolr
   recover_data.zeroinfl
See section 'Registering S3 methods' in the 'Writing R Extensions'
manual.

I guess my question is "why does this matter?" There are many, many functions 
mentioned here, but they are all methods for emm_basis and recover_data. Both generics 
are in the emmeans namespace, as are all these functions.

The section on registering S3 methods explains:


The standard method for S3-style UseMethod dispatching might fail to locate 
methods defined in a package that is imported but not attached to the search 
path. To ensure that these methods are available the packages defining the 
methods should ensure that the generics are imported and register the methods 
using S3method directives...


But clearly all those methods flagged in the messages will be found in the same 
namespace as the generics -- emm_basis and recover_data -- so not being able to 
find them is not an issue. Moreover, emm_basis() and recover_data() are not 
meant to be called directly by a user, or even by code in another package. They 
are only meant to be called within the function emmeans::ref_grid(), and the 
existence of those generics and methods is simply a mechanism for being able to 
support a lot of different model classes.

Obviously, I could add a whole lot of S3method() directives to the NAMESPACE 
file, but it just seems wasteful to export all those methods when they are 
never needed outside the emmeans namespace.

Am I missing something?

Thanks

Russ Lenth



[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list

Re: [Rd] range() for Date and POSIXct could respect `finite = TRUE`

2023-05-11 Thread Martin Maechler
> Davis Vaughan 
> on Tue, 9 May 2023 09:49:41 -0400 writes:

> It seems like the main problem is that `is.numeric(x)`
> isn't fully indicative of whether or not `is.finite(x)`
> makes sense for `x` (i.e.  Date isn't numeric but does
> allow infinite dates).

> So I could also imagine a new `allows.infinite()` S3
> generic that would return a single TRUE/FALSE for whether
> or not the type allows infinite values, this would also be
> indicative of whether or not `is.finite()` and
> `is.infinite()` make sense on that type. I imagine it
> being used like:

> ```
>   allows.infinite <- function(x) {
> UseMethod("allows.infinite")
>   }
>   allows.infinite.default <- function(x) {
> is.numeric(x) # For backwards compatibility, maybe? Not sure.
>   }
>   allows.infinite.Date <- function(x) {
> TRUE
>   }
>   allows.infinite.POSIXct <- function(x) {
> TRUE
>   }
>
>   range.default <- function (..., na.rm = FALSE, finite = FALSE) {
> x <- c(..., recursive = TRUE)
> if (allows.infinite(x)) { # changed from `is.numeric()`
>   if (finite)
> x <- x[is.finite(x)]
>   else if (na.rm)
> x <- x[!is.na(x)]
>   c(min(x), max(x))
> }
> else {
>   if (finite)
> na.rm <- TRUE
>   c(min(x, na.rm = na.rm), max(x, na.rm = na.rm))
> }
>   }
>   ```

> It could allow other R developers to also use the pattern of:

> ```
> if (allows.infinite(x)) {
># conditionally do stuff with is.infinite(x)
> }
> ```

> and that seems like it could be rather nice.

> It would avoid the need for `range.Date()` and `range.POSIXct()` methods 
too.

> -Davis

That *is* an interesting alternative perspective ...
sent just about before I was going to commit my proposal (incl
new help page entries, regr.tests ..).

So we would introduce a new generic  allows.infinite() {or
better name?,  allowsInf, ..} with the defined semantic that

allows.infinite(x) for a vector 'x' gives a logical "scalar",
TRUE iff it is known that  is.finite(x) "makes sense" and
returns a logical vector of length length(x) .. which is TRUE
where x[i] is not NA/NaN/+Inf/-Inf .. *and*
is.infinite := Negate(is.finite){or vice versa if you prefer}.

I agree that this may be useful somewhat more generally than
just for  range() methods.

What do others think?

Martin


> On Thu, May 4, 2023 at 5:29 AM Martin Maechler
>  wrote:
[..]

>> > Davis Vaughan
>> > on Mon, 1 May 2023 08:46:33 -0400 writes:
>> 
>> > Martin,
>> > Yes, I missed that those have `Summary.*` methods, thanks!
>> 
>> > Tweaking those to respect `finite = TRUE` sounds great. It seems like
>> > it might be a little tricky since the Summary methods call
>> > `NextMethod()`, and `range.default()` uses `is.numeric()` to determine
>> > whether or not to apply `finite`. Because `is.numeric.Date()` is
>> > defined, that always returns `FALSE` for Dates (and POSIXt). Because
>> > of that, it may still be easier to just write a specific
>> > `range.Date()` method, but I'm not sure.
>> 
>> > -Davis
>> 
>> I've looked more closely now, and indeed,
>> range() is the only function in the  Summary  group
>> where (only) the default method has a 'finite' argument.
>> which strikes me as somewhat asymmetric / inconsequential, as
>> after all,  range(.) := c(min(.), max(.)) ,
>> but  min() and max() do not obey an finite=TRUE setting, note
>> 
>> > min(c(-Inf,3:5), finite=TRUE)
>> Error: attempt to use zero-length variable name
>> 
>> where the error message also is not particularly friendly
>> and of course has nothing to with 'finite' :
>> 
>> > max(1:4, foo="bar")
>> Error: attempt to use zero-length variable name
>> >
>> 
>> ... but that is diverting;  coming back to the topic:  Given
>> that 'finite' only applies to range() {and there is just a convenience},
>> I do agree that from my own work & support to make `Date` and
>> `POSIX(c)t` behave more number-like, it would be "nice" to have
>> range() obey a `finite=TRUE` also for these.
>> 
>> OTOH, there are quite a few other 'number-like' thingies for
>> which I would then like to have  range(*, finite=TRUE) work,
>> e.g.,  "mpfr" (package {Rmpfr}) or "bigz" {gmp} numbers, numeric
>> sparse matrices, ...
>> 
>> To keep such methods all internally consistent with
>> range.default(), I could envision something like this
>> 
>> 
>> .rangeNum <- function(..., na.rm = FALSE, finite = FALSE, isNumeric)
>> {
>> x <- c(..., recursive = TRUE)
>> if(isNumeric(x)) {
>> if(finite) x <- x[is.finite(x)]
>> else if(na.rm) x <- x[!is.na(x)]
>> c(min(x), max(x))
>> } else {
>> if(finite) na.rm <- TRUE
>> c(min(x, na.rm=na.rm), max(x, na.rm=na.rm))
>> }
>> }
>> 

Re: [Rd] R-4.2.3 build from source on Windows (w Rtools42) - lto1.exe error

2023-05-11 Thread Tomas Kalibera



On 5/11/23 03:07, Gmail wrote:

Windows 11 PRO Version  10.0.22621 Build 22621
Processor: Intel(R) Core(TM) i7-1065G7 "Icelake-client"
```
​svn info   
Path: .
Working Copy Root Path: /d/R_DEV/R-4/R42/R-4-2-branch
URL: https://svn.r-project.org/R/branches/R-4-2-branch
Relative URL: ^/branches/R-4-2-branch
Repository Root: https://svn.r-project.org/R
Repository UUID: 00db46b3-68df-0310-9c12-caf00c1e9a41
Revision: 84417
Node Kind: directory
Schedule: normal
Last Changed Author: kalibera
Last Changed Rev: 84249
Last Changed Date: 2023-04-13 07:12:24 + (Thu, 13 Apr 2023)
```

Only adaptation done in MkRules.local was adding: `EOPTS = -march=native`  - 
that's why I included the cpu-type info above;
running make all recommended fails at/with:
```
gcc -shared -s -static-libgcc -o utils.dll tmp.def init.o io.o size.o sock.o 
stubs.o utils.o hashtab.o windows/dataentry.o windows/dialogs.o 
windows/registry.o windows/util.o windows/widgets.o 
../../../gnuwin32/dllversion.o -lRgraphapp -lversion 
-L/x86_64-w64-mingw32.static.posix/lib/x64 -llzma 
-LC:/rtools42/x86_64-w64-mingw32.static.posix/lib/x64 
-LC:/rtools42/x86_64-w64-mingw32.static.posix/lib -L../../../../bin/x64 -lR

lto1.exe: fatal error: bytecode stream in file 'windows/dataentry.o' generated 
with LTO version 9.3 instead of the expected 9.4
compilation terminated.

lto-wrapper.exe: fatal error: 
C:\rtools42\x86_64-w64-mingw32.static.posix\bin\gcc.exe returned 1 exit status
compilation terminated.
C:\rtools42\x86_64-w64-mingw32.static.posix\bin/ld.exe: error: lto-wrapper 
failed
collect2.exe: error: ld returned 1 exit status
cp: cannot stat 'utils.dll': No such file or directory
make[4]: *** [Makefile.win:36: shlib] Error 1
make[3]: *** [../../../share/make/basepkg.mk:145: mksrc-win2] Error 1
make[2]: *** [Makefile.win:24: all] Error 2
make[1]: *** [Makefile.win:34: R] Error 1
make: *** [Makefile:18: all] Error 2
```
Any hints/ideas on how to fix this? I guess I could
gcc -c -flto ... windows/dataentry.c -o windows/dataentry.o
with the exact path of that file ...


All of the object files are generated on your system during the build. 
If dataentry.o is generated using an older version than other object 
files, maybe there is some configuration problem on the system. It might 
be worth checking that the compilers and linkers from Rtools42 are used, and 
then running "make distclean" and trying the build again.
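
For example (from the MSYS2 shell set up for Rtools42 as in the howto):

which gcc make ld     # gcc and ld should come from /x86_64-w64-mingw32.static.posix/bin
gcc --version         # Rtools42's toolchain is GCC 10-based
cd src/gnuwin32 && make distclean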


I never tried building R with LTO on Windows myself, I don't know if 
that works even on a system set up according to the documentation for R 
4.2 (https://cran.r-project.org/bin/windows/base/howto-R-4.2.html).



and it hopefully will fix that but I guess it would make sense to add a 
Revision to update that LTO version mismatch there, and I don't know yet if 
this is the only one?


Perhaps it is better to use Rtools43 and R-devel if you can, so that if 
you find some problem in either, it can still be fixed.



Greetings,
W


Best,
Tomas



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Please install cmake on macOS builders

2023-05-11 Thread Martin Maechler
> Martin Maechler 
> on Wed, 10 May 2023 21:31:29 +0200 writes:

  > Dirk Eddelbuettel 
  > on Wed, 10 May 2023 07:01:37 -0500 writes:

>> Simon,

>> Explicitly declaring

>> SystemRequirements: cmake

>> appears to be insufficient to get a build on the
>> (otherwise lovely to have) 'macOS builder', and leads to
>> failure on (at least) 'r-oldrel-macos-x86_64'.

>> Would it be possible to actually have cmake installed?

>> These days cmake is for better or worse becoming a
>> standard, and I rely on it for one (new) package to
>> correctly configure a library. It would be nice to be
>> able to rely on it on macOS too.

> Somewhat 'ditto' from here {about wanting 'cmake' to
> become +/- standard tool for R packages use, *not* at all
> related to macOS} :

> The SuiteSparse C library on parts of which our Matrix
> package builds extensively has also switched their setup
> to use cmake instead of make ... and this was actually one
> reason we have not yet updated to the latest versions of
> SuiteSparse for the Matrix package.

> As Matrix is formally recommended, I would even be happy
> if 'cmake' became a +/- required OS tool for R ...

I've been told in private that the above "be happy if"
may *not* be a good idea,
or rather may even be close to impossible, as cmake seems not to fit
well, at all, with the quite sophisticated
autoconf -> configure -> make
setup we have with building R + recommended packages
cross-platform compatibly as well as possible.

Martin


--
Martin Maechler ETH Zurich and R Core team

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] R-4.2.3 build from source on Windows (w Rtools42) - lto1.exe error

2023-05-11 Thread Prof Brian Ripley

Initial comments

- R 4.2.3 is not current

- -flto does not seem to be the default in src/gnuwin32/MkRules.

- LTO versions in GCC are tied to the compiler version, and in recent 
GCC are the same as the compiler version.  The recommended toolchain for 
R 4.2.x is Rtools42, which according to NEWS is based on GCC 10, not 9.


- LTO mismatches in my experience are most often seen in incremental 
builds, so first do an ab initio build.  (Not so long ago they were not 
detected during linking but gave segfaults.)


Assuming it is important to use LTO, I would first build without LTO to 
isolate the issue.  And be aware of the following in the NEWS for R 4.3.0:


• The Rcomplex definition (in header R_ext/Complex.h) has been
  extended to prevent possible mis-compilation when interfacing
  with Fortran (PR#18430).

AFAIR that "possible mis-compilation" is most likely to occur when using LTO.


On 11/05/2023 02:07, Gmail wrote:

Windows 11 PRO Version  10.0.22621 Build 22621
Processor: Intel(R) Core(TM) i7-1065G7 "Icelake-client"
```
​svn info   
Path: .
Working Copy Root Path: /d/R_DEV/R-4/R42/R-4-2-branch
URL: https://svn.r-project.org/R/branches/R-4-2-branch
Relative URL: ^/branches/R-4-2-branch
Repository Root: https://svn.r-project.org/R
Repository UUID: 00db46b3-68df-0310-9c12-caf00c1e9a41
Revision: 84417
Node Kind: directory
Schedule: normal
Last Changed Author: kalibera
Last Changed Rev: 84249
Last Changed Date: 2023-04-13 07:12:24 + (Thu, 13 Apr 2023)
```

Only adaptation done in MkRules.local was adding: `EOPTS = -march=native`  - 
that's why I included the cpu-type info above;
running make all recommended fails at/with:
```
gcc -shared -s -static-libgcc -o utils.dll tmp.def init.o io.o size.o sock.o 
stubs.o utils.o hashtab.o windows/dataentry.o windows/dialogs.o 
windows/registry.o windows/util.o windows/widgets.o 
../../../gnuwin32/dllversion.o -lRgraphapp -lversion 
-L/x86_64-w64-mingw32.static.posix/lib/x64 -llzma 
-LC:/rtools42/x86_64-w64-mingw32.static.posix/lib/x64 
-LC:/rtools42/x86_64-w64-mingw32.static.posix/lib -L../../../../bin/x64 -lR

lto1.exe: fatal error: bytecode stream in file 'windows/dataentry.o' generated 
with LTO version 9.3 instead of the expected 9.4
compilation terminated.

lto-wrapper.exe: fatal error: 
C:\rtools42\x86_64-w64-mingw32.static.posix\bin\gcc.exe returned 1 exit status
compilation terminated.
C:\rtools42\x86_64-w64-mingw32.static.posix\bin/ld.exe: error: lto-wrapper 
failed
collect2.exe: error: ld returned 1 exit status
cp: cannot stat 'utils.dll': No such file or directory
make[4]: *** [Makefile.win:36: shlib] Error 1
make[3]: *** [../../../share/make/basepkg.mk:145: mksrc-win2] Error 1
make[2]: *** [Makefile.win:24: all] Error 2
make[1]: *** [Makefile.win:34: R] Error 1
make: *** [Makefile:18: all] Error 2
```
Any hints/ideas on how to fix this? I guess I could
gcc -c -flto ... windows/dataentry.c -o windows/dataentry.o
with the exact path of that file ...
and it hopefully will fix that but I guess it would make sense to add a 
Revision to update that LTO version mismatch there, and I don't know yet if 
this is the only one?

Greetings,
W

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


--
Brian D. Ripley,  rip...@stats.ox.ac.uk
Emeritus Professor of Applied Statistics, University of Oxford

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel