Re: [Rd] Advice debugging M1Mac check errors

2024-02-06 Thread J C Nash

M1mac numerical issues should be rare, but when they do pop up they can be 
disconcerting.

The following little script reveals what happens with no extended precision.
A few months ago I built this into a "package" and used
https://mac.r-project.org/macbuilder/submit.html
to run it, getting the indicated result of 0 for (sum(vv1) - 1.0e0), with
non-zero on my Ryzen 7 laptop.

JN

# FPExtendedTest.R   J C Nash
loopsum <- function(vec) {
   n <- length(vec)
   vsum <- 0.0
   for (i in 1:n) { vsum <- vsum + vec[i] }
   vsum
}
small <- .Machine$double.eps / 4 # 1/4 of the machine precision
vsmall <- rep(small, 1e4) # a long vector of small numbers
vv1 <- c(1.0, vsmall) # 1 at the front of this vector
vv2 <- c(vsmall, 1.0) # 1 at the end
(sum(vv1) - 1.0e0) # Should be > 0 for extended precision, 0 otherwise
(sum(vv2) - 1.0e0) # Should be > 0
(loopsum(vv1) - 1.0e0) # should be zero
(loopsum(vv2) - 1.0e0) # should be greater than zero
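
If sums must agree across platforms regardless of whether extended-precision
registers are present, compensated (Kahan) summation is one portable option.
A sketch (not part of the original test script) in the same spirit:

```r
# Compensated (Kahan) summation: carries forward the low-order bits
# that plain accumulation rounds away, so the result is essentially
# independent of whether 80-bit registers are available.
kahansum <- function(vec) {
  s <- 0.0     # running sum
  comp <- 0.0  # running compensation for lost low-order bits
  for (x in vec) {
    y <- x - comp
    snew <- s + y
    comp <- (snew - s) - y  # the part of y that was rounded away
    s <- snew
  }
  s
}
small <- .Machine$double.eps / 4
(kahansum(c(1.0, rep(small, 1e4))) - 1.0) # should be > 0 on any IEEE double platform
```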



On 2024-02-06 08:06, Prof Brian Ripley via R-devel wrote:



We were left to guess, but I doubt this has to do with the lack of 'extended precision' nor long doubles longer than 
doubles on arm64 macOS.  And issues with that are rather rare (much rarer than numerical issues for non-reference x86_64 
BLAS/LAPACKs).  Of the 20,300 CRAN packages just 18 have M1mac-specific errors, none obviously from numerical 
inaccuracy.  A quick look back suggests we get about 20 a year with M1mac numerical issues, about half of which were 
mirrored on the x86_64 'noLD' checks.




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Advice debugging M1Mac check errors

2024-02-04 Thread J C Nash

Simon's comments add another viewpoint to mine. My own knowledge of the
impact of "disable-long-double" does not extend to exactly what effect it
has; to find out, one needs to spend a lot of time and effort on
excruciating details. Fortunately, we can usually get away with 64-bit FP
arithmetic for almost all applications. I suspect applications that truly
need longer precision are best handled with special hardware.

JN


On 2024-02-04 16:46, Simon Urbanek wrote:




On Feb 5, 2024, at 12:26 PM, Duncan Murdoch  wrote:

Hi John.

I don't think the 80 bit format was part of IEEE 754; I think it was an Intel 
invention for the 8087 chip (which I believe preceded that standard), and 
didn't make it into the standard.

The standard does talk about 64 bit and 128 bit floating point formats, but not 
80 bit.



Yes, the 80-bit format was Intel-specific (motivated by internal operations,
not as an external format), but as Intel used to be the most popular
architecture, people didn't quite realize that tests relying on Intel results
would be Intel-specific (PowerPC Macs had 128-bit floating point, but they
were not popular enough to cause trouble in the same way). The IEEE standard
allows "extended precision" formats, but doesn't prescribe their format or
precision - and they are optional. Arm64 CPUs only support 64-bit double
precision in hardware (true both on macOS and Windows), so only what is in
the basic standard. There are 128-bit floating-point solutions in software,
but, obviously, they are a lot slower (several orders of magnitude). Apple
has been asking the scientific community for priorities, and 128-bit
floating-point support was not high on people's priority list. It is far from
trivial, because there is a long list of operations (all variations of the
math functions), so I wouldn't expect this to change anytime soon - in fact,
once Microsoft's glacial move is done, we'll likely be seeing only 64-bit
everywhere.

That said, even if you don't have an arm64 CPU, you can build R with
--disable-long-double to get closer to the arm64 results if that is your worry.
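
A build configured that way can be recognized from within R itself. A quick
check (assuming a reasonably recent R, where .Machine reports the long-double
epsilon only when long doubles are in use):

```r
# FALSE on arm64 builds and on builds configured with --disable-long-double
capabilities("long.double")
# On builds that use long doubles, .Machine also reports their precision:
.Machine$longdouble.eps  # absent on builds without long doubles
```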

Cheers,
Simon




On 04/02/2024 4:47 p.m., J C Nash wrote:

Slightly tangential: I had some woes with some vignettes in my
optimx and nlsr packages (actually in examples comparing to OTHER
packages) because the M? processors don't have the 80-bit registers of
the old IEEE 754 arithmetic, so some existing "tolerances" are too
small when checking whether a quantity is small enough to "converge",
and one gets "did not converge" type errors. There are workarounds,
but the discussion is beyond this post. However, it is worth being
aware that the code may be mostly correct, apart from tests of
smallness appropriate to these processors.
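
As a hypothetical illustration (the numbers here are made up, not the actual
tests in optimx or nlsr), an absolute tolerance tuned on x86 extended
precision can reject a step that is as converged as 64-bit doubles allow,
while a tolerance scaled by .Machine$double.eps passes:

```r
# Hypothetical convergence test; values chosen for illustration only.
fval_old <- 1.0
fval_new <- 1.0 + 4 * .Machine$double.eps  # tiny remaining change
tol_fixed <- 1e-16                         # tighter than 64-bit doubles resolve
abs(fval_new - fval_old) < tol_fixed       # FALSE: "did not converge"
tol_rel <- 100 * .Machine$double.eps * max(1, abs(fval_old))
abs(fval_new - fval_old) < tol_rel         # TRUE with an eps-scaled tolerance
```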
JN
On 2024-02-04 11:51, Dirk Eddelbuettel wrote:


On 4 February 2024 at 20:41, Holger Hoefling wrote:
| I wanted to ask if people have good advice on how to debug M1Mac package
| check errors when you don't have a Mac? Is a cloud machine the best option
| or is there something else?

a) Use the 'mac builder' CRAN offers:
 https://mac.r-project.org/macbuilder/submit.html

b) Use the newly added M1 runners at GitHub Actions,
 
https://github.blog/changelog/2024-01-30-github-actions-introducing-the-new-m1-macos-runner-available-to-open-source/

Option a) is pretty good as the machine is set up for CRAN and builds
fast. Option b) gives you more control should you need it.

Dirk


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




Re: [Rd] Advice debugging M1Mac check errors

2024-02-04 Thread J C Nash

80 bit registers (I don't have my original docs with me here in Victoria)
seem to have been part of the 1985 standard to which I was one of the 31 named
contributors. See

https://stackoverflow.com/questions/612507/what-are-the-applications-benefits-of-an-80-bit-extended-precision-data-type

or the Wikipedia item on IEEE 754.

It appears to have been omitted from the 2008 and 2020 versions, but is
still (I believe) part of many processors. It's an internal precision for
handling multiplications and accumulation, and not one of the storage modes.

Most of the time this makes very little difference in results for R, since
extended precision is activated only in some operations. If we store
quantities, we get the regular precision. Thus very few situations using the
M? chips give differences, but when they do, it is a nuisance.

There is plenty of scope for debating the pros and cons of extended
precision internally. Not having it likely contributes to the speed / bang
for the buck of the M? chips. But we do now have occasional differences in
outcomes which will lead to confusion and extra work.

JN




On 2024-02-04 15:26, Duncan Murdoch wrote:

Hi John.

I don't think the 80 bit format was part of IEEE 754; I think it was an Intel invention for the 8087 chip (which I 
believe preceded that standard), and didn't make it into the standard.


The standard does talk about 64 bit and 128 bit floating point formats, but not 
80 bit.

Duncan Murdoch

On 04/02/2024 4:47 p.m., J C Nash wrote:

Slightly tangential: I had some woes with some vignettes in my
optimx and nlsr packages (actually in examples comparing to OTHER
packages) because the M? processors don't have the 80-bit registers of
the old IEEE 754 arithmetic, so some existing "tolerances" are too
small when checking whether a quantity is small enough to "converge",
and one gets "did not converge" type errors. There are workarounds,
but the discussion is beyond this post. However, it is worth being
aware that the code may be mostly correct, apart from tests of
smallness appropriate to these processors.

JN




On 2024-02-04 11:51, Dirk Eddelbuettel wrote:


On 4 February 2024 at 20:41, Holger Hoefling wrote:
| I wanted to ask if people have good advice on how to debug M1Mac package
| check errors when you don't have a Mac? Is a cloud machine the best option
| or is there something else?

a) Use the 'mac builder' CRAN offers:
 https://mac.r-project.org/macbuilder/submit.html

b) Use the newly added M1 runners at GitHub Actions,
 
https://github.blog/changelog/2024-01-30-github-actions-introducing-the-new-m1-macos-runner-available-to-open-source/


Option a) is pretty good as the machine is set up for CRAN and builds
fast. Option b) gives you more control should you need it.

Dirk



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel






Re: [Rd] Advice debugging M1Mac check errors

2024-02-04 Thread J C Nash

Slightly tangential: I had some woes with some vignettes in my
optimx and nlsr packages (actually in examples comparing to OTHER
packages) because the M? processors don't have the 80-bit registers of
the old IEEE 754 arithmetic, so some existing "tolerances" are too
small when checking whether a quantity is small enough to "converge",
and one gets "did not converge" type errors. There are workarounds,
but the discussion is beyond this post. However, it is worth being
aware that the code may be mostly correct, apart from tests of
smallness appropriate to these processors.

JN




On 2024-02-04 11:51, Dirk Eddelbuettel wrote:


On 4 February 2024 at 20:41, Holger Hoefling wrote:
| I wanted to ask if people have good advice on how to debug M1Mac package
| check errors when you don't have a Mac? Is a cloud machine the best option
| or is there something else?

a) Use the 'mac builder' CRAN offers:
https://mac.r-project.org/macbuilder/submit.html

b) Use the newly added M1 runners at GitHub Actions,

https://github.blog/changelog/2024-01-30-github-actions-introducing-the-new-m1-macos-runner-available-to-open-source/

Option a) is pretty good as the machine is set up for CRAN and builds
fast. Option b) gives you more control should you need it.

Dirk



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] ADA Compliance

2024-01-15 Thread J C Nash

Slightly tangential, but about two decades ago I was researching
how multimedia databases might be reasonably structured. To have a
concrete test case, I built a database of English Country (Playford)
dances, which I called Playford's Progeny. (Ben B. will be aware of
this, too.) This proved rather popular, but around 2010 the busybody
brigade at uOttawa sent me a demand to prove that the website satisfied
(name your jurisdiction, I think mine was Ontario provincial something)
accessibility requirements.

I figured my time to do this was worth $2-3K and simply went out and
bought service for about $100. It's now hosted on ottawaenglishdance.org.
Interestingly the main contributor to my site at the time was blind.
Go figure.

The point I'm getting at is that it may make people feel good to
legislate about accessibility, but my guess is the old adage of
catching more flies with honey than vinegar is illustrated here to a
horrifying degree. I'm afraid I've no practical advice on how to
satisfy the "rules".

Best of luck getting things available for as many folk as possible,
no matter their particular disabilities. It's something I support,
just not a lot of rules.

JN


On 2024-01-15 07:10, peter dalgaard wrote:

Yes,

Jonathon Godfrey, who wrote the r-devel/2022-December mail (and is himself 
blind), would be my standard go-to guy in matters relating to visual 
impairment, screen readers and all that.

Peter D.


On 13 Jan 2024, at 00:14 , Ben Bolker  wrote:

I would be very surprised if anyone had written up a VPAT 
 for R.

  It won't help you with the bureaucratic requirements, but R is in fact very 
accessible to visually impaired users: e.g. see

https://community.rstudio.com/t/accessibility-of-r-rstudio-compared-to-excel-for-student-that-is-legally-blind/103849/3

 From https://github.com/ajrgodfrey/BrailleR


R is perhaps the most blind-friendly statistical software option because all 
scripts can be written in plain text, using the text editor a user prefers, and 
all output can be saved in a wide range of file formats. The advent of R 
markdown and other reproducible research techniques can offer the blind user a 
degree of efficiency that is not offered in many other statistical software 
options. In addition, the processed Rmd files are usually HTML which are the 
best supported files in terms of screen reader development.


  (And there is continued attention to making sure R stays accessible in this 
way: https://stat.ethz.ch/pipermail/r-devel/2022-December/082180.html; 
https://stat.ethz.ch/pipermail/r-devel/2023-February/082313.html)

  R is also easy to use without a mouse, which should improve accessibility for 
users with neuromuscular conditions.

   cheers
Ben Bolker




On 2024-01-12 2:50 p.m., Hunter, Zayne via R-devel wrote:

Hello,
I am working with Ball State University to obtain a license of R. As part of 
our requirements for obtaining new software, we must review the VPAT for ADA 
compliance. Can you provide this information for me?
Thanks,
Zayne Hunter
Technology Advisor & Vendor Relations Manager
Ball State University
zayne.hun...@bsu.edu
(765)285-7853
__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel




Re: [R-pkg-devel] Rmarkdown fails if (quote) r (space) is used

2023-11-03 Thread J C Nash

Yes. An initial space does the trick. Thanks. J

On 2023-11-03 11:48, Serguei Sokol wrote:

On 03/11/2023 at 15:54, J C Nash wrote:

I've spent a couple of hours with an Rmarkdown document where I
was describing some spherical coordinates made up of a radius r and
some angles. I wanted to fix the radius at 1.

In my Rmarkdown text I wrote

    Thus we have `r = 1` ...

To avoid a confusion between inline code and fixed font typesetting, could it be

    Thus we have ` r = 1` ...

(with a space after an opening quote)?

Best,
Serguei.



This caused failure to render with "unexpected =". I was using RStudio
at first and didn't see the error message.

If I use "radius R" and `R = 1`, things are fine, or `r=1` with no space,
but the particular "(quote) r (space)" seems to trigger code block processing.

Perhaps this note can save others some wasted time.

I had thought (obviously incorrectly) that one needed ```{r something}
to start the code chunk.

JN

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel






[R-pkg-devel] Rmarkdown fails if (quote) r (space) is used

2023-11-03 Thread J C Nash

I've spent a couple of hours with an Rmarkdown document where I
was describing some spherical coordinates made up of a radius r and
some angles. I wanted to fix the radius at 1.

In my Rmarkdown text I wrote

   Thus we have `r = 1` ...

This caused failure to render with "unexpected =". I was using RStudio
at first and didn't see the error message.

If I use "radius R" and `R = 1`, things are fine, or `r=1` with no space,
but the particular "(quote) r (space)" seems to trigger code block processing.

Perhaps this note can save others some wasted time.

I had thought (obviously incorrectly) that one needed ```{r something}
to start the code chunk.
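
For reference, the trigger is knitr's inline-code rule: any span beginning
with a backtick, the letter r, and a space is evaluated as inline R code.
Some ways to typeset a literal r = 1 (the first is the leading-space trick):

```
Thus we have ` r = 1` ...   (space after the opening backtick)
Thus we have `r=1` ...      (no space after the r)
Thus we have $r = 1$ ...    (math mode, if the output format supports it)
```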

JN

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Issue of itemize in man file

2023-10-22 Thread J C Nash

Thanks. That seems to be the issue. Vincent's suggestion of checkRd is also useful.

JN

On 2023-10-22 10:52, Ivan Krylov wrote:

On Sun, 22 Oct 2023 10:43:08 -0400
J C Nash  wrote:


\itemize{
  \item{fnchk OK;}{ \code{excode} = 0;
 \code{infeasible} = FALSE}


The \item command inside \itemize{} lists doesn't take arguments.
Did you mean \describe{} instead of \itemize{}?
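
For reference, the \describe{} form suggested here would look something like
this, using the labels from the fnchk example:

```
\describe{
  \item{fnchk OK;}{\code{excode} = 0; \code{infeasible} = FALSE}
  \item{Function returns INADMISSIBLE;}{\code{excode} = -1;
    \code{infeasible} = TRUE}
}
```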



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Issue of itemize in man file

2023-10-22 Thread J C Nash

I'm doing a major update of the optimx package and things were going relatively
smoothly until this weekend when files that have passed win-builder gave NOTEs
on r-devel for several manual (.Rd) files.
The NOTE is of the form


* checking Rd files ... NOTE
checkRd: (-1) fnchk.Rd:40-41: Lost braces in \itemize; \value handles \item{}{} 
directly
checkRd: (-1) fnchk.Rd:43: Lost braces in \itemize; \value handles \item{}{} 
directly
checkRd: (-1) fnchk.Rd:45: Lost braces in \itemize; \value handles \item{}{} 
directly



The source of this looks like

  \item{msg}{A text string giving information about the result of the function 
check: Messages and
the corresponding values of \code{excode} are:
  \itemize{
\item{fnchk OK;}{ \code{excode} = 0;
   \code{infeasible} = FALSE}
\item{Function returns INADMISSIBLE;}
{ \code{excode} = -1; \code{infeasible} = TRUE}
 ...
}

I've not seen this before, nor does a search give any hits.

Does anyone have any ideas? Or is this a glitch in r-devel as things are
tried out?

I don't get the NOTE on win-builder R-release nor on local R CMD check. Note 
that the
\itemize is to give a second-level list i.e., for expanding output of one of the
\value objects returned, namely the return codes.

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] URL syntax causes R CMD build failure - a fix

2023-09-03 Thread J C Nash

Thanks Uwe. I think you may have the reason, esp. if the URL is output as
LaTeX-formatted text to the intermediate files.

> Where is it in your package and what is the R CMD check output?

The issue was a failure to build the nlsr-devdoc.Rmd vignette. Unfortunately,
the messages were as below (indented). The package passed in May and earlier,
and failure occurred when pandoc updated recently on some platforms, so some
change in the toolchain triggered this. While pandoc is a wonderful tool, its
message output can be very unhelpful. I've had difficulties outside of R
converting LaTeX to epub for some of my historical novels. In those cases I've
also seen apparent success with "bits left out", which does upset the readers
when the story has gaps. I've never resolved the "why?", but have managed to
work around it, sometimes by simply adding a vertical space (i.e., a line
ending), or otherwise rearranging text. It would be nice to know the reason,
but as with the present issue, the reward is not worth the effort.

I've also seen awkwardness with currency symbols, though that may be my own
lack of detailed knowledge of LaTeX. I need multiple currencies in some
stories, and end up editing with Sigil. I anticipate that some R users whose
vignettes use several currencies might want to check their output.

Whether or not the URL syntax that caused the present trouble is valid, the
percentage signs are likely worth avoiding if possible.
Thanks,

JN

On 2023-09-03 10:29, Uwe Ligges wrote:

John can you point us to an example?
Where is it in your package and what is the R CMD check output?

Guess: Within an Rd file you have to escape the %  characters otherwise they 
start a comment.

Best,
Uwe Ligges



On 03.09.2023 00:30, Spencer Graves wrote:

I've encountered similar issues. However, it has been long enough ago that I
don't remember enough details to say more without trying to update my CRAN
packages to see what messages I get and maybe researching my notes from
previous problems of this nature.

Spencer Graves



On 9/2/23 4:23 PM, Greg Hunt wrote:

The percent encoded characters appear to be valid in that URL, suggesting
that rejecting them is an error. That kind of error could occur when the
software processing them converts them back to a non-unicode character set.

On Sun, 3 Sep 2023 at 4:34 am, J C Nash  wrote:


I'm posting this in case it helps some other developers getting build
failure.

Recently package nlsr that I maintain got a message that it failed to build
on some platforms. The exact source of the problem is still to be
illuminated, but seems to be in knitr::render and/or pandoc or an unfortunate
interaction. An update to pandoc triggered a failure to process a vignette
that had been happily processed for several years. The error messages are
unhelpful, at least to me,

 Error at "nlsr-devdoc.knit.md" (line 5419, column 1):
 unexpected end of input
 Error: pandoc document conversion failed with error 64
 Execution halted

Unfortunately, adding "keep_md: TRUE" (you need upper case TRUE to save it
when there is no error of this type) did not save the intermediate file in
this case. However, searching for "pandoc error 64" presented one web page
where the author used a brute-force search of his document, removing /
replacing sections to find the line(s) that caused trouble. This is a little
tedious, but effective. In my case, the offending line turned out to be a
copied and pasted URL

https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm

The coded characters can be replaced by a hyphen, to give

https://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm

and this, when pasted in Mozilla Firefox at least, will go to the appropriate
wikipedia page.

I'd be interested in hearing from others who have had similar difficulties. I
suspect this is relatively rare, causing some sort of infelicity in the
output of knitr::render that then trips up some versions of pandoc, which
may, for instance, now be applying stricter rules to URL syntax.

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel





[R-pkg-devel] URL syntax causes R CMD build failure - a fix

2023-09-02 Thread J C Nash

I'm posting this in case it helps some other developers getting build failure.

Recently package nlsr that I maintain got a message that it failed to build on
some platforms. The exact source of the problem is still to be illuminated,
but seems to be in knitr::render and/or pandoc or an unfortunate interaction.
An update to pandoc triggered a failure to process a vignette that had been
happily processed for several years. The error messages are unhelpful, at least
to me,

   Error at "nlsr-devdoc.knit.md" (line 5419, column 1):
   unexpected end of input
   Error: pandoc document conversion failed with error 64
   Execution halted

Unfortunately, adding "keep_md: TRUE" (you need upper case TRUE to save it
when there is no error of this type) did not save the intermediate file in
this case. However, searching for "pandoc error 64" presented one web page
where the author used a brute-force search of his document, removing /
replacing sections to find the line(s) that caused trouble. This is a little
tedious, but effective. In my case, the offending line turned out to be a
copied and pasted URL

https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm

The coded characters can be replaced by a hyphen, to give,

https://en.wikipedia.org/wiki/Levenberg-Marquardt_algorithm

and this, when pasted in Mozilla Firefox at least, will go to the appropriate
wikipedia page.
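
The percent-encoded bytes in the failing URL are the UTF-8 encoding of an en
dash (U+2013), which is easy to confirm from R:

```r
u <- "https://en.wikipedia.org/wiki/Levenberg%E2%80%93Marquardt_algorithm"
# URLdecode() turns the %-escapes back into characters: the %E2%80%93
# sequence decodes to an en dash.
utils::URLdecode(u)
# The plain-hyphen form avoids the encoded bytes entirely:
sub("%E2%80%93", "-", u, fixed = TRUE)
```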

I'd be interested in hearing from others who have had similar difficulties. I
suspect this is relatively rare, causing some sort of infelicity in the
output of knitr::render that then trips up some versions of pandoc, which
may, for instance, now be applying stricter rules to URL syntax.

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] CRAN test complaints about package that passes most platforms

2023-08-11 Thread J C Nash

My nlsr package was revised mid-February. After CRAN approved it, I got a
message that it was "failing" M1Mac tests. The issue turned out to be ANOTHER
package that was being used in an example in a vignette. Because M1 does not
provide the IEEE 754 80 bit registers, a method in package minqa did not
"converge", that is, it did not pass a termination test. Relaxing a tolerance
got a "pass" on the test service for M1 Mac then available. This issue can
be found by searching the web, though it probably deserves some clarity in
R documentation somewhere. The presentation of such problems can, of course,
take many forms.

There was a minor revision to nlsr in May to rationalize the names of some
functions to produce summary information about solutions. This seemed to give
no issues until now.

Two days ago, however, I received a message that the (unchanged!) package is
failing tests on M1 and on Fedora clang r-devel in building some vignettes.
The messages are about pandoc and a missing file "framed.sty". All other
tests showing on CRAN are OK. When I try with R-hub I seem to get even more
complaints than the messages from CRAN, but about the same issues, and about
vignette building.

2 queries:

- Is anyone else getting similar messages? If so, it may be useful to share
notes to try to get this resolved. It seems within reason that the issue is
some unfortunate detail in Fedora and M1 that interacts with particular
syntax in the vignette, or that the setup of those machines is inadequate.
Comparing notes may reveal what is causing complaints and help to fix either
in the .Rmd vignettes or in the pandoc structure.

- Is there an M1Mac test platform to which packages can be submitted? Brian
Ripley did have one, but trying the link I used before seems not to present
a submission dialog.

I'd like to be helpful, but have a suspicion that a humble package developer
is being used as a stooge to find and fix software glitches outside of R.
However, if it's a matter of an unfortunate mismatch of document and
processor, I'll be happy to help document and fix it.

It would be a pity if vignettes cause enough trouble that developers simply
don't include them.

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] feature request: optim() iteration of functions that return multiple values

2023-08-08 Thread J C Nash

But why time methods whose updates the author (me!) has been recommending to
the community for years? Especially as optimx::optimr() uses the same syntax
as optim() and gives access to a number of solvers, both production and
didactic. This set of solvers is being improved or added to regularly, with a
major renewal almost complete (for the adventurous, the code is on
https://github.com/nashjc/optimx).

Note also that the default Nelder-Mead is good for exploring a function
surface and is quite robust at getting quickly into the region of a minimum,
but can be quite poor at "finishing" the process. Tools have different
strengths and weaknesses. optim() was more or less state of the art a couple
of decades ago, but there are other choices now.
JN

On 2023-08-08 05:14, Sami Tuomivaara wrote:

Thank you all very much for the suggestions, after testing, each of them would 
be a viable solution in certain contexts.  Code for benchmarking:

# preliminaries
install.packages("microbenchmark")
library(microbenchmark)


data <- new.env()
data$ans2 <- 0
data$ans3 <- 0
data$i <- 0
data$fun.value <- numeric(1000)

# define functions

rosenbrock_env <- function(x, data)
{
   x1 <- x[1]
   x2 <- x[2]
   ans <- 100 * (x2 - x1 * x1)^2 + (1 - x1)^2
   ans2 <- ans^2
   ans3 <- sqrt(abs(ans))
   data$i <- data$i + 1
   data$fun.value[data$i] <- ans
   ans
}


rosenbrock_env2 <- function(x, data)
{
   x1 <- x[1]
   x2 <- x[2]
   ans <- 100 * (x2 - x1 * x1)^2 + (1 - x1)^2
   ans2 <- ans^2
   ans3 <- sqrt(abs(ans))
   data$ans2 <- ans2
   data$ans3 <- ans3
   ans
}

rosenbrock_attr <- function(x)
{
   x1 <- x[1]
   x2 <- x[2]
   ans <- 100 * (x2 - x1 * x1)^2 + (1 - x1)^2
   ans2 <- ans^2
   ans3 <- sqrt(abs(ans))
   attr(ans, "ans2") <- ans2
   attr(ans, "ans3") <- ans3
   ans
}


rosenbrock_extra <- function(x, extraInfo = FALSE)
{
   x1 <- x[1]
   x2 <- x[2]
   ans <- 100 * (x2 - x1 * x1)^2 + (1 - x1)^2
   ans2 <- ans^2
   ans3 <- sqrt(abs(ans))
   if (extraInfo) list(ans = ans, ans2 = ans2, ans3 = ans3)
   else ans
}


rosenbrock_all <- function(x)
{
   x1 <- x[1]
   x2 <- x[2]
   ans <- 100 * (x2 - x1 * x1)^2 + (1 - x1)^2
   ans2 <- ans^2
   ans3 <- sqrt(abs(ans))
   list(ans = ans, ans2 = ans2, ans3 = ans3)
}

returnFirst <- function(fun) function(...) do.call(fun,list(...))[[1]]
rosenbrock_all2 <- returnFirst(rosenbrock_all)


# benchmark all functions
set.seed(100)

microbenchmark(env = optim(c(-1,2), rosenbrock_env, data = data),
env2 = optim(c(-1,2), rosenbrock_env2, data = data),
attr = optim(c(-1,2), rosenbrock_attr),
extra = optim(c(-1,2), rosenbrock_extra, extraInfo = FALSE),
all2 = optim(c(-1,2), rosenbrock_all2),
times = 100)


# correct parameters and return values?
env <- optim(c(-1,2), rosenbrock_env, data = data)
env2 <- optim(c(-1,2), rosenbrock_env2, data = data)
attr <- optim(c(-1,2), rosenbrock_attr)
extra <- optim(c(-1,2), rosenbrock_extra, extraInfo = FALSE)
all2 <- optim(c(-1,2), rosenbrock_all2)

# correct return values with optimized parameters?
env. <- rosenbrock_env(env$par, data)
env2. <- rosenbrock_env(env2$par, data)
attr. <- rosenbrock_attr(attr$par)
extra. <- rosenbrock_extra(extra$par, extraInfo = FALSE)
all2. <- rosenbrock_all2(all2$par)

# functions that return more than one value
all. <- rosenbrock_all(all2$par)
extra2. <- rosenbrock_extra(extra$par, extraInfo = TRUE)

# environment values correct?
data$ans2
data$ans3
data$i
data$fun.value


microbenchmarking results:

Unit: microseconds
  expr     min        lq      mean    median         uq       max neval
   env 644.102 3919.6010 9598.3971 7950.0005 15582.8515 42210.900   100
  env2 337.001  351.5510  479.2900  391.7505   460.3520  6900.800   100
  attr 350.201  367.3010  502.0319  409.7510   483.6505  6772.800   100
 extra 276.800  287.2010  402.4231  302.6510   371.5015  6457.201   100
  all2 630.801  646.9015  785.9880  678.0010   808.9510  6411.102   100

The rosenbrock_env and _env2 functions differ in that _env accesses vectors
in the supplied environment by indexing, whereas _env2 doesn't (I hope I
interpreted this right?). Indexed assignment appears to be an expensive
operation, but it allows saving values at each step of the optim iteration,
rather than just at convergence. Overall, _extra has consistently the lowest
median execution time!

My earlier workaround was to write two separate functions, one of which
returns the extra values; all of the suggested approaches simplify that
considerably. I am also now more educated about attributes and environments,
which I did not know how to utilize before and which proved to be very
useful concepts. Again, thank you everyone for your input!



__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash

The important information is in the body of the man page for news(),
i.e., found by
   ?utils::news

and this explains why putting an "o" in front of a line clears the
NOTE. Once I realized that CRAN is running this, I could see the
"why". Thanks.

JN

On 2023-07-26 10:25, Duncan Murdoch wrote:


NEWS has been used for a long time by the utils::news() function, which in turn 
is used by the HTML help system.

Duncan Murdoch



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash

Thanks for the info, though it seems odd that CRAN wants to
parse a plain text file that is purely for information, since it
should have no impact on the current package or any other. I suppose
there might be character set issues to check. The motive for parsing
it eludes me.

Does anyone know if there are plans to use NEWS for some purpose in
the future, i.e., to actually track changes beyond package maintainers'
comments?

Cheers, and thanks again.

JN


On 2023-07-26 10:03, Ivan Krylov wrote:

On Wed, 26 Jul 2023 09:37:38 -0400
J C Nash  wrote:


I'd like to avoid NOTEs if possible, and since I'm using a plain-text
NEWS, don't believe this should trigger one.


Plain-text NEWS files are parsed according to the rules specified in
help(news), which is admittedly laconic in its description. If you run
tools:::.news_reader_default('https://cran.r-project.org/web/packages/optimx/NEWS')
(or news(package = 'optimx')), you can see that R's news() already
misunderstands some of the contents of your NEWS file.

A relatively recent change (r82543, July 2022) set
_R_CHECK_NEWS_IN_PLAIN_TEXT_=TRUE for R CMD check --as-cran and started
verifying that R's plain text "news reader" function could actually
parse plain-text NEWS files without warnings or errors.

I think that if you rename NEWS to ChangeLog, R will leave the file
alone, but CRAN will offer it to users as plain text.



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Changes to checks on NEWS?

2023-07-26 Thread J C Nash



In work on an upgrade to my optimx package, I added to my (plain text) NEWS
file.

The lines

VERSION 2023-06-25

  o This is a MAJOR revision and overhaul of the optimx package and its 
components.
  o Fixed CITATION file based on R CMD check --as-cran complaints
regarding requirement for person() and bibentry() changes.

pass R CMD check --as-cran

but

VERSION 2023-06-25

This is a MAJOR revision and overhaul of the optimx package and its 
components.
  o Fixed CITATION file based on R CMD check --as-cran complaints
regarding requirement for person() and bibentry() changes.

give a NOTE that news cannot process the chunk/lines in NEWS.

Plain R CMD check passes (i.e., it is the CRAN checks that are tripping the NOTE).

I don't see anything about this in Writing R Extensions at the moment.

Does anyone have information on what may have changed? I'd like to avoid NOTEs 
if possible, and since I'm using a plain-text NEWS, I don't believe this should 
trigger one.

The version that passes was the result of some almost random tries to see what 
would
trigger a note.

Cheers,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Concerns with SVD

2023-07-16 Thread J C Nash

Better check your definitions of SVD -- there are several forms, but all I
am aware of (and I wrote a couple of the codes in the early 1970s for the SVD)
have positive singular values.
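A quick check of the definition (my example, not from the original message): R's svd() returns singular values that are non-negative and sorted in decreasing order.

```r
# Singular values from svd() are always >= 0 and non-increasing.
A <- matrix(c(2, -1, 0, 1, 3, 1), nrow = 2)
d <- svd(A)$d
d
```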

JN


On 2023-07-16 02:01, Durga Prasad G me14d059 wrote:

Respected Development Team,

This is Durga Prasad reaching out to you regarding an issue/concern related
to Singular Value Decomposition SVD in R software package. I am attaching a
detailed attachment with this letter which depicts real issues with SVD in
R.

To explain the concern: expressions for the exponential of a matrix using SVD
and projection tensors are obtained from a series expansion. However, a
numerical inconsistency is observed in the matrix exponential obtained using
the svd() function in R.

However, it is observed that most of the researchers fraternity is engaged
in utilising R software for their research purposes and to the extent of my
understanding such an error in SVD in R software might raise the concern
about authenticity of the simulation results produced and published by
researchers across the globe.

Further, I am very sure that the R software development team is well versed
with the happening and they have any specific and resilient reasons for
doing so. I would request you kindly, to guide me through the concern.

Thank you very much.


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] files katex.js and katex-config.js not found in R CMD check --as-cran

2023-05-09 Thread J C Nash

In updating my nlsr package, I ran R CMD check --as-cran and got an error
that /usr/lib/R/doc/html/katex/katex.js was not found.

I installed the (large!) r-cran-katex. No joy.

katex.js was in /usr/share/R/doc/html/katex/  so I created a symlink. Then
I got katex-config.js not found (but in 1 directory up).
So

sudo ln -s /usr/share/R/doc/html/katex-config.js katex-config.js

Then I get the check to run OK. So I'm now up and running, but others might not
be so persistent.

Is there a glitch in my system? Or is this a bug in the latest R CMD check 
--as-cran?
(Or at least a configuration issue?)

Best,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Query: Could documentation include modernized references?

2023-03-31 Thread J C Nash

Thanks Martin.

Following Duncan's advice as well as some textual input, I have put a proposed 
Rd file for
optim on a fork of the R code at
https://github.com/nashjc/r/blob/master/src/library/stats/man/optim.Rd

This has the diffs given below from the github master. The suggested changes
primarily point to a Task View, which I believe is a sensible approach.

I'll admit to being rather clumsy with git, and will be happy to receive advice
on how to proceed if more work needed on my part.

Cheers,

John Nash




--- optim.Rd2022-03-24 19:02:04.0 -0400
+++ optim.Rd.20230324.txt   2023-03-29 09:23:28.373457291 -0400
@@ -15,6 +15,9 @@
   General-purpose optimization based on Nelder--Mead, quasi-Newton and
   conjugate-gradient algorithms. It includes an option for
   box-constrained optimization and simulated annealing.
+  These methods are quite old and better ones are known for many
+  problems.  See the Optimization and Mathematical Programming task
+  view (Schwendinger and Borchers, 2023) for a survey.
 }
 \usage{
 optim(par, fn, gr = NULL, \dots,
@@ -67,6 +70,8 @@
   Beale--Sorenson updates).  Conjugate gradient methods will generally
   be more fragile than the BFGS method, but as they do not store a
   matrix they may be successful in much larger optimization problems.
+  The \code{"CG"} method has known improvements that are discussed in
+  Schwendinger and Borchers (2023)."

   Method \code{"L-BFGS-B"} is that of Byrd \emph{et. al.} (1995) which
   allows \emph{box constraints}, that is each variable can be given a lower
@@ -230,8 +235,10 @@
 \source{
   The code for methods \code{"Nelder-Mead"}, \code{"BFGS"} and
   \code{"CG"} was based originally on Pascal code in Nash (1990) that was
-  translated by \code{p2c} and then hand-optimized.  Dr Nash has agreed
-  that the code can be made freely available.
+  translated by \code{p2c} and then hand-optimized.  Dr Nash has agreed
+  that the code can be made freely available, but recommends that the more
+  reliable \code{optimx::Rcgmin()} function should be used instead of
+  method \code{"CG"}.

   The code for method \code{"L-BFGS-B"} is based on Fortran code by Zhu,
   Byrd, Lu-Chen and Nocedal obtained from Netlib (file
@@ -269,6 +276,10 @@
   Nocedal, J. and Wright, S. J. (1999).
   \emph{Numerical Optimization}.
   Springer.
+   
+  Florian Schwendinger, Hans W. Borchers (2023). \emph{CRAN Task View:
+  Optimization and Mathematical Programming.} Version 2023-02-16.
+  URL https://CRAN.R-project.org/view=Optimization.
 }

 \seealso{




On 2023-03-31 09:31, Martin Maechler wrote:


Thanks a lot, Duncan, for this (as usual from you) very precise
and helpful information / explanations.

I am "happy"/willing to get involved a bit here, as I do want to
spend some time re-reading about current state of (some, notably
optim-related) optimizers.

(But I will be mostly offline for the next 60 hours or so.)


Martin

--
Martin Maechler
ETH Zurich  and  R Core team


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Query: Could documentation include modernized references?

2023-03-26 Thread J C Nash

A tangential email discussion with Simon U. has highlighted a long-standing
matter that some tools in the base R distribution are outdated, but that
so many examples and other tools may use them that they cannot be deprecated.

The examples that I am most familiar with concern optimization and nonlinear
least squares, but other workers will surely be able to suggest cases elsewhere.
I was the source (in Pascal) of Nelder-Mead, BFGS and CG algorithms in optim().
BFGS is still mostly competitive, and Nelder-Mead is useful for initial 
exploration
of an optimization problem, but CG was never very good, right from the mid-1970s
well before it was interfaced to R. By contrast Rcgmin works rather well
considering how similar it is in nature to CG. Yet I continue to see use and
even recommendations of these tools in inappropriate circumstances.
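The point about "CG" can be seen on the classic Rosenbrock banana function (my illustration, not from the original message): with default settings, optim()'s "CG" typically stalls well short of the minimum at (1, 1), while "BFGS" gets much closer.

```r
# Rosenbrock test function; minimum value 0 at (1, 1).
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
cg   <- optim(c(-1.2, 1), fr, method = "CG")
bfgs <- optim(c(-1.2, 1), fr, method = "BFGS")
c(CG = cg$value, BFGS = bfgs$value)  # CG's value is typically much larger
```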

Given that it would break too many other packages and examples to drop the
existing tools, should we at least add short notes in the man (.Rd) pages?
I'm thinking of something like

   optim() has methods that are dated. Users are urged to consider suggestions
   from ...

and point to references and/or an appropriate Task View, which could, of course,
be in the references.

I have no idea what steps are needed to make such edits to the man pages. Would
R-core need to be directly involved, or could one or two trusted R developers
be given privileges to seek advice on and implement such modest documentation
additions?  FWIW, I'm willing to participate in such an effort, which I believe
would help users to use appropriate and up-to-date tools.

John Nash

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] Discovering M1mac cowpads

2023-03-24 Thread J C Nash

Recently I updated my package nlsr and it passed all the usual checks and was
uploaded to CRAN. A few days later I got a message that I should "fix" my
package as it had failed in "M1max" tests.

The "error" was actually a failure in a DIFFERENT package that was used as
an example in a vignette. I fixed it in my vignette with try(). However, I
am interested in just where the M1 causes trouble.

As far as I can determine so far, for numerical computations, differences will
show up only when a package is able to take advantage of extended precision
registers in the IEEE arithmetic. I think this means that in pure R, it won't
be seen. Packages that call C or Fortran could do so. However, I've not yet
got a good handle on this.

Does anyone have some small, reproducible examples? (For me, reproducing so
far means making a small package and submitting to macbuilder, as I don't
have an M1 Mac.)
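For reference, the kind of probe involved can be sketched in pure R (my sketch; note that sum() accumulates at C level, so it can exploit extended-precision registers where the hardware has them, while an R-level loop adds in plain doubles):

```r
small <- .Machine$double.eps / 4   # too small to register against 1.0
vv <- c(1.0, rep(small, 1e4))      # 1 followed by many tiny values
sum(vv) - 1.0    # > 0 if sum() accumulates in extended precision, 0 otherwise
loopsum <- function(v) { s <- 0; for (e in v) s <- s + e; s }
loopsum(vv) - 1.0  # 0 everywhere: each tiny addend is rounded away
```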

Cheers,

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Potential bug in fitted.nls

2023-01-26 Thread J C Nash

nls() actually uses different modeling formulas depending on the 'algorithm', 
and
there is, in my view as a long time nonlinear modeling person, an unfortunate
structural issue that likely cannot be resolved simply. This is because for 
nonlinear
modeling programs we really should be using explicit model statements
e.g., a linear model should be y ~ a * x + b where x is the (independent)
variable and a and b are parameters. But we put in y ~ x as per lm().
Partially linear approaches and indexed parameter models add complexity and
inconsistency. We're pushing structures beyond their design.

This is one of the topics in a paper currently undergoing
final edit on "Improving nls()" that came out of a Google Summer of Code 
project.
The desired improvements in nls() were mostly frustrated by entanglements in 
the code,
but they have led to a lot of tweaks to package nlsr. Perhaps someone more 
facile
with the intricacies of R internals can succeed. For the paper and nlsr,
all the bits should get sent to CRAN and elsewhere in the next month or so,
(co-author is newly started on a Ph D) but if anyone is anxious to try, they 
can email
me. The nlsr code has been stable for several months, but some documentation 
still
being considered.

Sorting out how to deal with the model expression for nls() and related tools
is a worthwhile goal, but not one that can be settled here. It could make a good
review project for a senior undergrad or master's level, and I'd be happy to
join the discussion.

Cheers, JN


On 2023-01-26 12:55, Bill Dunlap wrote:

Doesn't nls() expect that the lengths of vectors on both sides of the
formula match (if both are supplied)?  Perhaps it should check for that.

-Bill

On Thu, Jan 26, 2023 at 12:17 AM Dave Armstrong  wrote:


Dear Colleagues,

I recently answered [this question]() on StackOverflow that identified
what seems to be unusual behaviour with `stats:::nls.fitted()`. In
particular, a null model returns a single fitted value rather than a
vector of the same fitted value of `length(y)`.  The documentation
doesn’t make it seem like this is the intended behaviour, so I’m not
sure if it’s a bug, a “Wishlist” item or something that is working
as intended even though it seems unusual to me.  I looked through the
bug reporting page on the R project website and it suggested contacting
the R-devel list in cases where the behaviour is not obviously a bug to
see whether others find the behaviour equally unusual and I should
submit a Wishlist item through Bugzilla.

Below is a reprex that shows how the fitted values of a model with just
a single parameter is length 1, but if I multiply that constant by a
vector of ones, then the fitted values are of `length(y)`.  Is this
something that should be reported?

``` r
dat <-
data.frame(y=c(80,251,304,482,401,141,242,221,304,243,544,669,638),
ones = rep(1, 13))
mNull1 <- nls(y ~ a, data=dat, start=c(a=mean(dat$y)))
fitted(mNull1)
#> [1] 347.6923
#> attr(,"label")
#> [1] "Fitted values"

mNull2 <- nls(y ~ a*ones, data=dat, start=c(a=mean(dat$y)))
fitted(mNull2)
#>  [1] 347.6923 347.6923 347.6923 347.6923 347.6923 347.6923 347.6923
347.6923
#>  [9] 347.6923 347.6923 347.6923 347.6923 347.6923
#> attr(,"label")
#> [1] "Fitted values"
```

Created on 2023-01-25 by the [reprex
package](https://reprex.tidyverse.org) (v2.0.1)
 [[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel



[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Potential bug in fitted.nls

2023-01-26 Thread J C Nash

FWIW, nlsr::nlxb() gives same answers.

JN

On 2023-01-25 09:59, Dave Armstrong wrote:

Dear Colleagues,

I recently answered [this question]() on StackOverflow that identified
what seems to be unusual behaviour with `stats:::nls.fitted()`. In
particular, a null model returns a single fitted value rather than a
vector of the same fitted value of `length(y)`.  The documentation
doesn’t make it seem like this is the intended behaviour, so I’m not
sure if it’s a bug, a “Wishlist” item or something that is working
as intended even though it seems unusual to me.  I looked through the
bug reporting page on the R project website and it suggested contacting
the R-devel list in cases where the behaviour is not obviously a bug to
see whether others find the behaviour equally unusual and I should
submit a Wishlist item through Bugzilla.

Below is a reprex that shows how the fitted values of a model with just
a single parameter is length 1, but if I multiply that constant by a
vector of ones, then the fitted values are of `length(y)`.  Is this
something that should be reported?

``` r
dat <-
data.frame(y=c(80,251,304,482,401,141,242,221,304,243,544,669,638),
ones = rep(1, 13))
mNull1 <- nls(y ~ a, data=dat, start=c(a=mean(dat$y)))
fitted(mNull1)
#> [1] 347.6923
#> attr(,"label")
#> [1] "Fitted values"

mNull2 <- nls(y ~ a*ones, data=dat, start=c(a=mean(dat$y)))
fitted(mNull2)
#>  [1] 347.6923 347.6923 347.6923 347.6923 347.6923 347.6923 347.6923
347.6923
#>  [9] 347.6923 347.6923 347.6923 347.6923 347.6923
#> attr(,"label")
#> [1] "Fitted values"
```

Created on 2023-01-25 by the [reprex
package](https://reprex.tidyverse.org) (v2.0.1)
[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Bug in optim for specific orders of magnitude

2022-12-23 Thread J C Nash

Extreme scaling quite often ruins optimization calculations. If you think 
available methods
are capable of doing this, there's a bridge I can sell you in NYC.

I've been trying for some years to develop a good check on scaling so I can 
tell users
who provide functions like this to send (lots of) money and I'll give them the 
best answer
there is (generally no answer at all). Or, more seriously, to inform them that 
they should
not expect results unless they scale. Richard Varga once said some decades ago 
that any
problem was trivially solvable in the right scale, and he was mostly right. 
Scaling is
important.

To see the range of answers from a number of methods, the script below is 
helpful. I had
to remove lbfgsb3c from the mix as it stopped mid-calculation in unrecoverable 
way. Note
that I use my development version of optimx, so some methods might not be 
included in
CRAN offering. Just remove the methods from the ameth and bmeth lists if 
necessary.

Cheers, John Nash

# CErickson221223.R
# optim(c(0,0), function(x) {x[1]*1e-317}, lower=c(-1,-1), upper=c(1,1),
#  method='L-BFGS-B')

tfun <- function(x, xpnt=317){
  if ((length(x)) != 2) {stop("Must have length 2")}
  scl <- 10^(-xpnt)
  val <- x[1]*scl # note that x[2] unused. May be an issue!
  val
}
gtfun <- function(x, xpnt=317){ # gradient
  scl <- 10^(-xpnt)
  gg <- c(scl, 0.0)
  gg
}


xx <- c(0,0)
lo <- c(-1,-1)
up <- c(1,1)
print(tfun(xx))
library(optimx)
ameth <- c("BFGS", "CG", "Nelder-Mead", "L-BFGS-B", "nlm", "nlminb",
 "Rcgmin", "Rtnmin", "Rvmmin", "spg", "ucminf", "newuoa", "bobyqa",
 "nmkb", "hjkb",  "hjn", "lbfgs", "subplex", "ncg", "nvm", "mla",
 "slsqp", "anms")

bmeth <- c("L-BFGS-B", "nlminb", "Rcgmin", "Rtnmin", "nvm",
"bobyqa", "nmkb", "hjkb", "hjn", "ncg", "slsqp")

tstu <- opm(x<-c(0,0), fn=tfun, gr=gtfun, method=ameth, control=list(trace=0))
summary(tstu, order=value)

tstb <- opm(x<-c(0,0), fn=tfun, gr=gtfun, method=bmeth, lower=lo, upper=up,
control=list(trace=0))
summary(tstb, order=value)
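To illustrate the scaling point directly (my sketch): removing the absurd 1e-317 factor from Collin's objective leaves the same minimization problem, which L-BFGS-B then handles without complaint.

```r
# Collin's objective with the 1e-317 scale factor removed: minimize x[1].
f_rescaled <- function(x) x[1]
fit <- optim(c(0, 0), f_rescaled, lower = c(-1, -1), upper = c(1, 1),
             method = "L-BFGS-B")
fit$par   # x[1] is driven to its lower bound -1; x[2] is unused and stays 0
```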


On 2022-12-23 13:41, Rui Barradas wrote:

At 17:30 on 23/12/2022, Collin Erickson wrote:

Hello,

I've come across what seems to be a bug in optim that has become a nuisance
for me.

To recreate the bug, run:

optim(c(0,0), function(x) {x[1]*1e-317}, lower=c(-1,-1), upper=c(1,1),
method='L-BFGS-B')

The error message says:

Error in optim(c(0, 0), function(x) { :
   non-finite value supplied by optim

What makes this particularly treacherous is that this error only occurs for
specific powers. By running the following code you will find that the error
only occurs when the power is between -309 and -320; above and below that
work fine.

p <- 1:1000
giveserror <- rep(NA, length(p))
for (i in seq_along(p)) {
   tryout <- try({
 optim(c(0,0), function(x) {x[1]*10^-p[i]}, lower=c(-1,-1),
upper=c(1,1), method='L-BFGS-B')
   })
   giveserror[i] <- inherits(tryout, "try-error")
}
p[giveserror]

Obviously my function is much more complex than this and usually doesn't
fail, but this reprex demonstrates that this is a problem. To avoid the
error I may multiply by a factor or take the log, but it seems like a
legitimate bug that should be fixed.

I tried to look inside of optim to track down the error, but the error lies
within the external C code:

.External2(C_optim, par, fn1, gr1, method, con, lower,
 upper)

For reference, I am running R 4.2.2, but was also able to recreate this bug
on another device running R 4.1.2 and another running 4.0.3.

Thanks,
Collin Erickson

[[alternative HTML version deleted]]

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Hello,

See if this R-Help thread [1] earlier this year is relevant.
In particular, the post by R Core team member Luke Tierney [2], which answers 
it much better than I could.

The very small numbers in your question seem to have hit a limit, and this limit 
is not R-related.


[1] https://stat.ethz.ch/pipermail/r-help/2022-February/473840.html
[2] https://stat.ethz.ch/pipermail/r-help/2022-February/473844.html


Hope this helps,

Rui Barradas

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Note on submission Found the following files/directories: 'NUL'

2022-11-25 Thread J C Nash

FWIW: optimx::optimx is outdated and only there for legacy use.
Better to use the optimx::optimr() function for single solvers.

JN


On 2022-11-25 05:10, Ivan Krylov wrote:

On Fri, 25 Nov 2022 09:59:10 +
"ROTOLO, Federico /FR"  wrote:


When submitting my package parfm, I get the following note
Flavor: r-devel-linux-x86_64-debian-gcc
Check: for non-standard things in the check directory, Result: NOTE
   Found the following files/directories:
 'NUL'
so that my submission is rejected.

I cannot find any file or directory called NUL in my package.
Do you have any suggestion?


The file gets created during the check when you call sink('NUL'):
https://github.com/cran/parfm/blob/8c3f45291514aedde67cecf0b090ddd3487f3ada/R/parfm.R#L260-L299

It mostly works on Windows, where "nul" with any extension in any
directory is the null file, but it creates a file named 'NUL' on other
operating systems. It also breaks the non-default sink, if any was set
up by the user.

Does optimx::optimx produce output that can't be turned off otherwise?
(Does it help to set control$trace = 0?) Have you tried
suppressMessages() or capture.output() with nullfile()?
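For what it's worth, a portable way to discard output (my sketch, using only documented base-R facilities): nullfile() names the platform's null device, so no file called 'NUL' is ever created.

```r
# Redirect output to the OS null device instead of sink('NUL').
con <- file(nullfile(), open = "wt")
sink(con)
cat("noisy solver output\n")   # discarded on every platform
sink()
close(con)
# Or swallow the output of a single call:
invisible(capture.output(cat("noisy solver output\n"), file = nullfile()))
```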



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN no longer checking for solaris?

2021-12-06 Thread J C Nash

I'd second Uwe's point. I was one of 31 signatories to the first IEEE 754 
standard (I didn't participate in the two more recent releases, as I had already 
torn my hair out with the details of low-level bit manipulations). Before the 
standard, porting code was truly a nightmare. We did it because we had to, and 
were too ignorant to realize what a fool's task it was. Keep code portable 
please.

JN

On 2021-12-06 6:07 a.m., Uwe Ligges wrote:



On 06.12.2021 03:09, Avraham Adler wrote:

Would this mean we could start using little endian bit strings, as I think
only the Solaris platform was big endian (or was it the other way around)?


It depends on the hardware, not the OS.
CRAN checked on Intel CPUs, which are little endian while formerly Solaris was 
typically used on Sparc which is big endian.

In any case, please try to write cross platform code further on. ARM and x86-64 may agree, but we do not know what comes 
next. And old ideas may be revived more quickly than expected:
Not too many people expected 20 years ago that the future of scientific computing in 2021 would still/again happen on 
platforms without support for long doubles / extended precision.
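A quick way to see what the current platform does (my example): R reports the byte order directly, and it becomes visible when serializing binary data.

```r
.Platform$endian              # "little" on x86-64 and Apple arm64, "big" on Sparc
writeBin(1L, raw(), size = 4) # bytes 01 00 00 00 on a little-endian machine
```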


Best,
Uwe Ligges




Avi

On Sun, Dec 5, 2021 at 8:56 PM Dirk Eddelbuettel  wrote:



On 5 December 2021 at 17:23, Travers Ching wrote:
| I see that there doesn't exist a Solaris flavor on any CRAN check page.
| However, I'm certain that Solaris was being checked up until very
recently.
|
| Is this just temporary?
|
| Is there any information for the future of Solaris on CRAN?

No "official" word yet on this list, r-devel or elsewhere, or via commits
to
the CRAN Policy (which a cron job of mine monitors).

But Henrik was eagle-eyed and spotted a number of changes to the svn (or
git
mirror thereof) writing Solaris out of the official documentation:

   https://twitter.com/henrikbengtsson/status/1466877096471379970

So yes it seems like an era is coming to a close.

Dirk

--
https://dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R feature suggestion: Duplicated function arguments check

2021-11-08 Thread J C Nash

I think this is similar in nature (though not detail) to an issue raised
on StackOverflow where the OP used "x" in dot args and it clashed with the
"x" in a numDeriv call in my optimx package. I've got a very early fix (I
think), though moderators on StackOverflow were unpleasant enough to
delete my request for the OP to contact me so I could get more
information to make improvements. Sigh. Developers need conversations
with users to improve their code.

Re: argument duplication -- In my view, the first goal should be to inform
the user of the clash. Doing anything further without providing information
is likely a very bad idea, though discussion of possibilities of action after
notification is certainly worthwhile.

Best, JN


On 2021-11-08 11:53 a.m., Duncan Murdoch wrote:

On 08/11/2021 11:48 a.m., Avi Gross via R-package-devel wrote:

Vincent,

But is the second being ignored the right result?

In many programming situations, subsequent assignments replace earlier ones.
And consider the way R allows something like this:

func(a=2, b=3, a=4, c=a*b)

Is it clear how to initialize the default for c as it depends on one value
of "a" or the other?


That c=a*b only works with non-standard tidyverse evaluation.  It causes other problems, e.g. the inability to pass ... 
properly (see https://github.com/tidyverse/glue/issues/231 for an example).


Duncan Murdoch



Of course, you could just make multiple settings an error rather than
choosing an arbitrary fix.

R lists are more like a BAG data structure than a SET.

-Original Message-
From: R-package-devel  On Behalf Of
Vincent van Hees
Sent: Monday, November 8, 2021 11:25 AM
To: Duncan Murdoch 
Cc: r-package-devel@r-project.org
Subject: Re: [R-pkg-devel] R feature suggestion: Duplicated function
arguments check

Thanks Duncan, I have tried to make a minimalistic example:

myfun = function(...) {
  input = list(...)
  mysum = function(A = c(), B = c()) {
    return(A + B)
  }
  if ("A" %in% names(input) & "B" %in% names(input)) {
    print(mysum(A = input$A, B = input$B))
  }
}

# test:

myfun(A = 1, B = 2, B = 4)

[1] 3

# So, the second B is ignored.



On Mon, 8 Nov 2021 at 17:03, Duncan Murdoch 
wrote:


On 08/11/2021 10:29 a.m., Vincent van Hees wrote:

Not sure if this is the best place to post this message, as it is more of a
suggestion than a question.

When an R function accepts more than a handful of arguments there is the risk
that users accidentally provide arguments twice, e.g. myfun(A=1, B=2, C=4,
D=5, A=7), and if those two values are not the same it can have frustrating
side-effects. To catch this I am planning to add a check for duplicated
arguments, as shown below, in one of my own functions. I am now wondering
whether this would be a useful feature for R itself to operate in the
background when running any R function that has more than a certain number of
input arguments.

Cheers, Vincent

myfun = function(...) {
  # check input arguments for duplicate assignments
  input = list(...)
  if (length(input) > 0) {
    argNames = names(input)
    dupArgNames = duplicated(argNames)
    if (any(dupArgNames)) {
      for (dupi in unique(argNames[dupArgNames])) {
        dupArgValues = input[which(argNames %in% dupi)]
        if (all(dupArgValues == dupArgValues[[1]])) {
          # duplicated arguments, but no confusion about what the value should be
          warning(paste0("\nArgument ", dupi, " has been provided more than",
                         " once in the same call, which is ambiguous. Please fix."))
        } else {
          # duplicated arguments, and confusion about what the value should be
          stop(paste0("\nArgument ", dupi, " has been provided more than",
                      " once in the same call, which is ambiguous. Please fix."))
        }
      }
    }
  }
  # rest of code...
}



Could you give an example where this is needed?  If a named argument
is duplicated, R will catch that and give an error message:

    > f(a=1, b=2, a=3)
    Error in f(a = 1, b = 2, a = 3) :
  formal argument "a" matched by multiple actual arguments

So this can only happen when it is an argument in the ... list that is
duplicated.  But usually those are passed to some other function, so
something like

    g <- function(...) f(...)

would also catch the duplication in g(a=1, b=2, a=3):

    > g(a=1, b=2, a=3)
    Error in f(...) :
  formal argument "a" matched by multiple actual arguments

The only case where I can see this getting by is where you are never
using those arguments to match any formal argument, e.g.

    list(a=1, b=2, a=3)

Maybe this should have been made illegal when R was created, but I
think it's too late to outlaw now:  I'm sure there are lots of people
making use of this.

Or am I missing something?

Duncan Murdoch



[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Seeking opinions on possible change to nls() code

2021-08-20 Thread J C Nash
Thanks Martin. I'd missed the intention of that option, but re-reading
it now it is obvious.

FWIW, this problem is quite nasty, and so far I've found no method
that reveals the underlying dangers well. And one of the issues with
nonlinear models is that they reveal how slippery the concept of
inference can be when applied to parameters in such models.

JN


On 2021-08-20 11:35 a.m., Martin Maechler wrote:
>>>>>> J C Nash 
>>>>>> on Fri, 20 Aug 2021 11:06:25 -0400 writes:
> 
> > In our work on a Google Summer of Code project
> > "Improvements to nls()", the code has proved sufficiently
> > entangled that we have found (so far!)  few
> > straightforward changes that would not break legacy
> > behaviour. One issue that might be fixable is that nls()
> > returns no result if it encounters some computational
> > blockage AFTER it has already found a much better "fit"
> > i.e. set of parameters with smaller sum of squares.  Here
> > is a version of the Tetra example:
> 
> time=c( 1,  2,  3,  4,  6 , 8, 10, 12, 16)
> conc = c( 0.7, 1.2, 1.4, 1.4, 1.1, 0.8, 0.6, 0.5, 0.3)
> NLSdata <- data.frame(time,conc)
> NLSstart <-c(lrc1=-2,lrc2=0.25,A1=150,A2=50) # a starting vector (named!)
> NLSformula <-conc ~ A1*exp(-exp(lrc1)*time)+A2*exp(-exp(lrc2)*time)
> tryit <- try(nls(NLSformula, data=NLSdata, start=NLSstart, trace=TRUE))
> print(tryit)
> 
> > If you run this, tryit does not give information that the
> > sum of squares has been reduced from > 6 to < 2, as
> > the trace shows.
> 
> > Should we propose that this be changed so the returned
> > object gives the best fit so far, albeit with some form of
> > message or return code to indicate that this is not
> > necessarily a conventional solution? Our concern is that
> > some examples might need to be adjusted slightly, or we
> > might simply add the "try-error" class to the output
> > information in such cases.
> 
> > Comments are welcome, as this is as much an infrastructure
> > matter as a computational one.
> 
> Hmm...  many years ago, we had introduced the  'warnOnly=TRUE'
> option to nls()  i.e., nls.control()  exactly for such cases,
> where people would still like to see the solution:
> 
> So,
> 
> --
>> try2 <- nls(NLSformula, data=NLSdata, start=NLSstart, trace=TRUE, 
>   control = nls.control(warnOnly=TRUE))
> 61215.76(3.56e+03): par = (-2 0.25 150 50)
> 2.175672(2.23e+01): par = (-1.9991 0.3171134 2.618224 -1.366768)
> 1.621050(7.14e+00): par = (-1.960475 -2.620293 2.575261 -0.5559918)
> Warning message:
> In nls(NLSformula, data = NLSdata, start = NLSstart, trace = TRUE,  :
>   singular gradient
> 
>> try2
> Nonlinear regression model
>   model: conc ~ A1 * exp(-exp(lrc1) * time) + A2 * exp(-exp(lrc2) * time)
>data: NLSdata
>      lrc1     lrc2       A1       A2 
>  -22.89   96.43  156.70 -156.68 
>  residual sum-of-squares: 218483
> 
> Number of iterations till stop: 2 
> Achieved convergence tolerance: 7.138
> Reason stopped: singular gradient
> 
>> coef(try2)
>        lrc1       lrc2         A1         A2 
>  -22.88540   96.42686  156.69547 -156.68461 
> 
> 
>> summary(try2)
> Error in chol2inv(object$m$Rmat()) : 
>   element (3, 3) is zero, so the inverse cannot be computed
>>
> --
> 
> and similar for  vcov(), of course, where the above error
> originates.
> 
> { I think  GSoC (and other)  students should start by studying and
>   exploring relevant help pages before drawing conclusions
>   ..
>   but yes, I've been born in the last millennium ...
> }
> 
> ;-)
> 
> Have a nice weekend!
> Martin
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Seeking opinions on possible change to nls() code

2021-08-20 Thread J C Nash
In our work on a Google Summer of Code project "Improvements to nls()",
the code has proved sufficiently entangled that we have found (so far!)
few straightforward changes that would not break legacy behaviour. One
issue that might be fixable is that nls() returns no result if it
encounters some computational blockage AFTER it has already found a
much better "fit" i.e. set of parameters with smaller sum of squares.
Here is a version of the Tetra example:

time=c( 1,  2,  3,  4,  6 , 8, 10, 12, 16)
conc = c( 0.7, 1.2, 1.4, 1.4, 1.1, 0.8, 0.6, 0.5, 0.3)
NLSdata <- data.frame(time,conc)
NLSstart <-c(lrc1=-2,lrc2=0.25,A1=150,A2=50) # a starting vector (named!)
NLSformula <-conc ~ A1*exp(-exp(lrc1)*time)+A2*exp(-exp(lrc2)*time)
tryit <- try(nls(NLSformula, data=NLSdata, start=NLSstart, trace=TRUE))
print(tryit)

If you run this, tryit does not give information that the sum of squares
has been reduced from over 61000 to under 2, as the trace shows.

Should we propose that this be changed so the returned object gives the
best fit so far, albeit with some form of message or return code to indicate
that this is not necessarily a conventional solution? Our concern is that
some examples might need to be adjusted slightly, or we might simply add
the "try-error" class to the output information in such cases.

Comments are welcome, as this is as much an infrastructure matter as a
computational one.
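For completeness: the existing warnOnly control already exposes the
best-so-far iterate, so a stop-gap is available today. A sketch using
the Tetra example above (convInfo is part of the object nls() returns):

```r
# Sketch: keep the last (best-so-far) iterate instead of losing it
# to a try-error, using the existing warnOnly mechanism.
fit <- nls(NLSformula, data = NLSdata, start = NLSstart,
           control = nls.control(warnOnly = TRUE))
fit$convInfo$isConv       # FALSE -- the iteration stopped early
fit$convInfo$stopMessage  # the reason, e.g. "singular gradient"
coef(fit)                 # parameters at the point of failure
```

Note that summary() and vcov() may still fail on such an object, so this
is only a partial answer to the question posed here.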

Best,

John Nash



[Rd] subset argument in nls() and possibly other functions

2021-07-13 Thread J C Nash
In mentoring and participating in a Google Summer of Code project
"Improvements to nls()", I've not found examples of use of the "subset"
argument in the call to nls(). Moreover, in searching through the source
code for the various functions related to nls(), I can't seem to find
where subset is used, but a simple example, included below, indicates
it works. Three approaches all seem to give the same results.

Can someone point to documentation or code so we can make sure we get
our revised programs to work properly? The aim is to make them more
maintainable and provide maintainer documentation, along with some
improved functionality. We seem, for example, to already be able to
offer analytic derivatives where they are feasible, and should be able
to add Marquardt-Levenberg stabilization as an option.

Note that this "subset" does not seem to be the "subset()" function of R.

John Nash

# CroucherSubset.R -- https://walkingrandomly.com/?p=5254

xdata = c(-2,-1.64,-1.33,-0.7,0,0.45,1.2,1.64,2.32,2.9)
ydata = c(0.699369,0.700462,0.695354,1.03905,1.97389,2.41143,1.91091,0.919576,-0.730975,-1.42001)
Cform <- ydata ~ p1*cos(p2*xdata) + p2*sin(p1*xdata)
Cstart<-list(p1=1,p2=0.2)
Cdata<-data.frame(xdata, ydata)
Csubset<-1:8 # just first 8 points

# Original problem - no subset
fit0 = nls(ydata ~ p1*cos(p2*xdata) + p2*sin(p1*xdata), data=Cdata, 
start=list(p1=1,p2=.2))
summary(fit0)

# via subset argument
fit1 = nls(ydata ~ p1*cos(p2*xdata) + p2*sin(p1*xdata), data=Cdata, 
start=list(p1=1,p2=.2), subset=Csubset)
summary(fit1)

# via explicit subsetting
Csdata <- Cdata[Csubset, ]
Csdata
fit2 = nls(ydata ~ p1*cos(p2*xdata) + p2*sin(p1*xdata), data=Csdata, 
start=list(p1=1,p2=.2))
summary(fit2)

# via weights -- seems to give correct observation count if zeros not recognized
wts <- c(rep(1,8), rep(0,2))
fit3 = nls(ydata ~ p1*cos(p2*xdata) + p2*sin(p1*xdata), data=Cdata, 
weights=wts, start=list(p1=1,p2=.2))
summary(fit3)
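An aside for anyone else hunting for the mechanism: my best guess is
that "subset" is absorbed by model.frame(), which nls() uses to realize
its data -- the same non-standard-evaluation route taken by lm() and
glm(). A minimal check with the Croucher data above:

```r
# Sketch: subset handling via model.frame(), assumed (not verified in
# the nls() sources) to be the route the fitting code takes.
mf <- model.frame(ydata ~ xdata, data = Cdata, subset = Csubset)
nrow(mf)  # 8 -- only the subsetted rows reach the fitting code
```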



Re: [R-pkg-devel] Windows load error installing package SOLVED - additional note

2021-06-13 Thread J C Nash
Thanks Uwe.

My misunderstanding. I thought the reference was to one of the
"src/" directories in the base R tree. However, I found examples in
some packages for the location.

Windows is alien territory for me, unfortunately.

JN


On 2021-06-12 7:27 p.m., Uwe Ligges wrote:
> 
> 
> On 12.06.2021 16:39, J C Nash wrote:
>> Two minor notes:
>>
>> 1) The Writing R Extensions manual, as far as I can determine, does not 
>> inform package
>> developers that Makevars.win needs to be in the src/ subdirectory. I 
>> followed the example
>> of some other packages to choose where to put it.
> 
> I just searched for Makevars.win in Writing R Extensions and the first 
> occurence is:
> "There are platform-specific file names on Windows: src/Makevars.win"
> 
> So it tells you both that it should be in src and how to capitalize it.
> 
> Best,
> Uwe
> 
> 
>>
>> 2) Also, while I managed to get my package to install with "makevars.win", I 
>> got a
>> WARNING on running a CHECK until I replaced it with "Makevars.win", i.e., 
>> Camel-case
>> name.
>>
>> Do these observations merit edits in the manual?
>>
>> JN
>>
>>
>> On 2021-06-11 11:16 a.m., J C Nash wrote:
>>> After some flailing around, discovered a posting
>>>
>>> https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code
>>>
>>> which showed a makevars.win file containing
>>>
>>> PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)
>>>
>>> I had tried several similar such makevars.win files, but trying PKG_LIBS+= 
>>> and
>>> no spaces. There is mention of the libraries in Writing R Extensions, but 
>>> given
>>> the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be 
>>> there
>>> in the documentation. I've separately noted that Linux sessionInfo() shows
>>> BLAS and LAPACK but Windows does not.
>>>
>>> Cheers, JN
>>>
>>
>>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Windows load error installing package SOLVED - additional note

2021-06-12 Thread J C Nash
Two minor notes:

1) The Writing R Extensions manual, as far as I can determine, does not
inform package developers that Makevars.win needs to be in the src/
subdirectory. I followed the example of some other packages to choose
where to put it.

2) Also, while I managed to get my package to install with
"makevars.win", I got a WARNING on running a CHECK until I replaced it
with "Makevars.win", i.e., Camel-case name.

Do these observations merit edits in the manual?

JN


On 2021-06-11 11:16 a.m., J C Nash wrote:
> After some flailing around, discovered a posting
> 
> https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code
> 
> which showed a makevars.win file containing
> 
> PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)
> 
> I had tried several similar such makevars.win files, but trying PKG_LIBS+= and
> no spaces. There is mention of the libraries in Writing R Extensions, but 
> given
> the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be there
> in the documentation. I've separately noted that Linux sessionInfo() shows
> BLAS and LAPACK but Windows does not.
> 
> Cheers, JN
>



Re: [R-pkg-devel] Windows load error installing package SOLVED

2021-06-11 Thread J C Nash
After some flailing around, discovered a posting

https://stackoverflow.com/questions/42118561/error-in-r-cmd-shlib-compiling-c-code

which showed a makevars.win file containing

PKG_LIBS = $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)

I had tried several similar such makevars.win files, but trying PKG_LIBS+= and
no spaces. There is mention of the libraries in Writing R Extensions, but given
the heavy use of LAPACK, BLAS and FLIBS, perhaps this example should be there
in the documentation. I've separately noted that Linux sessionInfo() shows
BLAS and LAPACK but Windows does not.

Cheers, JN

On 2021-06-10 9:37 a.m., Dirk Eddelbuettel wrote:
> 
> On 10 June 2021 at 09:22, J C Nash wrote:
> | Thanks to help from Duncan Murdoch, we have extracted the nls() 
> functionality to a package nlspkg and are building
> | an nlsalt package. We can then run nlspkg::AFunction() and 
> nlsalt::AFunction() in a single script to compare.
> | This works great in Linux, with the packages building and installing under 
> the command line or in Rstudio.
> | But in Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a 
> number of errors of the type
> | 
> | *** arch - i386
> | 
> C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
> | loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'
> | 
> | The reference is to a BLAS function, so I am fairly certain there is some 
> failed pointer, possibly a
> | makevars.win entry, that we need. So far my searches and (possibly silly) 
> attempts to provide links
> | have failed.
> | 
> | Can anyone provide suggestions?
> 
> Guess: On Linux you use a complete (external) BLAS, on Windows you use the
> (subset) BLAS provided by R which may not have the desired function, rendering
> your approach less portable.  See what sessionInfo() has to say on both.
> 
> Dirk
>



Re: [R-pkg-devel] Windows load error installing package

2021-06-10 Thread J C Nash
Thanks Dirk:

It looks like R on Windows isn't setting any BLAS or LAPACK. Here is
the output from sessionInfo() on my Win10 (VirtualBox VM) and Linux
Mint 20.1 systems. However, I've not got any idea how to fix this.
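Beyond sessionInfo(), base R has helpers that report the linked
numerical libraries; availability varies with R version (the BLAS entry
in extSoftVersion() appeared around R 3.5), so treat this as a sketch:

```r
# Query the numerical libraries R is actually using. On Windows these
# typically report R's bundled reference BLAS/LAPACK.
extSoftVersion()["BLAS"]  # name/path of the BLAS in use (may be empty)
La_version()              # LAPACK version string
```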

JN


>> sessionInfo()
> R version 4.1.0 (2021-05-18)
> Platform: x86_64-w64-mingw32/x64 (64-bit)
> Running under: Windows 10 x64 (build 19042)
> 
> Matrix products: default
> 
> locale:
> [1] LC_COLLATE=English_United States.1252
> [2] LC_CTYPE=English_United States.1252
> [3] LC_MONETARY=English_United States.1252
> [4] LC_NUMERIC=C
> [5] LC_TIME=English_United States.1252
> 
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base
> 
> loaded via a namespace (and not attached):
> [1] compiler_4.1.0
>>
> 
> 
>> sessionInfo()
> R version 4.1.0 (2021-05-18)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Linux Mint 20.1
> 
> Matrix products: default
> BLAS:   /usr/lib/x86_64-linux-gnu/openblas-pthread/libblas.so.3
> LAPACK: /usr/lib/x86_64-linux-gnu/openblas-pthread/liblapack.so.3
> 
> locale:
>  [1] LC_CTYPE=en_CA.UTF-8   LC_NUMERIC=C  
>  [3] LC_TIME=en_CA.UTF-8LC_COLLATE=en_CA.UTF-8
>  [5] LC_MONETARY=en_CA.UTF-8LC_MESSAGES=en_CA.UTF-8   
>  [7] LC_PAPER=en_CA.UTF-8   LC_NAME=C 
>  [9] LC_ADDRESS=C   LC_TELEPHONE=C
> [11] LC_MEASUREMENT=en_CA.UTF-8 LC_IDENTIFICATION=C   
> 
> attached base packages:
> [1] stats graphics  grDevices utils datasets  methods   base 
> 
> loaded via a namespace (and not attached):
> [1] compiler_4.1.0
>> 


On 2021-06-10 9:37 a.m., Dirk Eddelbuettel wrote:
> 
> On 10 June 2021 at 09:22, J C Nash wrote:
> | Thanks to help from Duncan Murdoch, we have extracted the nls() 
> functionality to a package nlspkg and are building
> | an nlsalt package. We can then run nlspkg::AFunction() and 
> nlsalt::AFunction() in a single script to compare.
> | This works great in Linux, with the packages building and installing under 
> the command line or in Rstudio.
> | But in Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a 
> number of errors of the type
> | 
> | *** arch - i386
> | 
> C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
> | loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'
> | 
> | The reference is to a BLAS function, so I am fairly certain there is some 
> failed pointer, possibly a
> | makevars.win entry, that we need. So far my searches and (possibly silly) 
> attempts to provide links
> | have failed.
> | 
> | Can anyone provide suggestions?
> 
> Guess: On Linux you use a complete (external) BLAS, on Windows you use the
> (subset) BLAS provided by R which may not have the desired function, rendering
> your approach less portable.  See what sessionInfo() has to say on both.
> 
> Dirk
>



[R-pkg-devel] Windows load error installing package

2021-06-10 Thread J C Nash
Hi,

I'm mentoring Arkajyoti Bhattacharjee for the Google Summer of Code
project "Improvements to nls()".

Thanks to help from Duncan Murdoch, we have extracted the nls()
functionality to a package nlspkg and are building an nlsalt package.
We can then run nlspkg::AFunction() and nlsalt::AFunction() in a single
script to compare. This works great in Linux, with the packages
building and installing under the command line or in Rstudio. But in
Windows CMD the "R CMD build" works, but "R CMD INSTALL" gives a number
of errors of the type

*** arch - i386
C:/RBuildTools/4.0/mingw32/bin/../lib/gcc/i686-w64-mingw32/8.3.0/../../../../i686-w64-mingw32/bin/ld.exe:
loessf.o:loessf.f:(.text+0x650): undefined reference to `idamax_'

The reference is to a BLAS function, so I am fairly certain there is
some failed pointer, possibly a makevars.win entry, that we need. So
far my searches and (possibly silly) attempts to provide links have
failed.

Can anyone provide suggestions?

John Nash



Re: [R-pkg-devel] Question about preventing CRAN package archival

2021-06-02 Thread J C Nash
As noted by John Harrold and my previous posting, these are not monster
codes. I'd check what I needed and simply work out enough R to make my
package work. Most of these matrix functions are pretty much
old-fashioned math translated into R. I can't see that R will engage
lawyers if the OP translates the variable names to the ones he is using
and more or less mimics the bits of code needed.
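To make that concrete, here are rough stand-ins of my own for the three
checks named in this thread -- minimal versions written from the math,
not copies of matrixcalc code, with illustrative tolerance choices:

```r
# Minimal re-implementations (mine, not matrixcalc's) of the three
# checks needed; tolerances are illustrative only.
is.square.matrix <- function(x) is.matrix(x) && nrow(x) == ncol(x)
is.symmetric.matrix <- function(x, tol = 100 * .Machine$double.eps)
  is.square.matrix(x) && all(abs(x - t(x)) < tol * max(1, abs(x)))
is.positive.definite <- function(x, tol = 1e-8)
  is.symmetric.matrix(x) &&
    all(eigen(x, symmetric = TRUE, only.values = TRUE)$values > tol)
is.positive.definite(diag(2))  # TRUE
```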

Cheers, JN


On 2021-06-02 3:15 p.m., John Harrold wrote:
> To add another option. In the past when this has happened to me I've found
> other packages that provide similar functionality.
> 
> I'm assuming that is.square just checks the number of columns == number of
> rows? And the others can probably be implemented pretty easily.
> 
> On Wed, Jun 2, 2021 at 10:41 AM Ben Staton  wrote:
> 
>> My package uses the MIT license, so would that not meet the compatibility
>> requirements?
>>
>> I will attempt to reach out to the package author - thanks for your help!
>>
>> On Wed, Jun 2, 2021 at 10:31 AM Ben Bolker  wrote:
>>
>>> That all sounds exactly right.
>>>GPL >= 2 allows you to use the material without asking permission as
>>> long as your package is compatibly licensed (e.g. also GPL).
>>>Under normal circumstances it would be polite to ask permission, but
>>> if the reason for doing this is that the maintainer is unreachable in
>>> the first place ...
>>>
>>>   If you want to try a little harder, it seems quite possible that you
>>> can reach the matrixcalc maintainer at the (personal) e-mail address
>>> shown in this page:
>>>
>>>
>> https://www.facebook.com/photo/?fbid=10208324530363130=ecnf.1000413042
>>>
>>>(Possibly an identity confusion, but I rate that as unlikely based on
>>> other facebook snooping)
>>>
>>>I don't think a short, polite e-mail request would be out of bounds,
>>> they can always ignore it or tell you to go away.
>>>
>>>cheers
>>> Ben Bolker
>>>
>>> On 6/2/21 1:15 PM, Ben Staton wrote:
>>>> Hello,
>>>>
>>>> Thank you for your detailed list of solutions.
>>>>
>>>> I was initially tempted to go with option 1 (move matrixcalc to
>> suggests
>>>> and check for its existence before using functions that rely on it),
>> but
>>> as
>>>> mentioned, this is not a long term fix.
>>>>
>>>> I unfortunately can't take on the responsibilities of option 2
>> (becoming
>>>> the package maintainer) -- there is much that this package does that I
>> do
>>>> not understand, and do not wish to feign authority!
>>>>
>>>> I plan to take option 3 (copy the needed functions into my package).
>>> There
>>>> are only three functions I need from matrixcalc, and all three are
>> fairly
>>>> simple (is.square.matrix
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.square.matrix.R>,
>>>> is.symmetric.matrix
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.symmetric.matrix.R>, and
>>>> is.positive.definite
>>>> <https://rdrr.io/cran/matrixcalc/src/R/is.positive.definite.R>) and
>>> there
>>>> is only one function in postpack that needs them. I plan to define them
>>>> within the postpack function. matrixcalc is licensed under GPL >= 2 and
>>>> based on my scan of the license text, this is allowed. Is that correct?
>>>>
>>>> Regarding option 4 (contacting the matrixcalc maintainer), the original
>>>> email from CRAN mentioned that they have attempted to contact the
>> package
>>>> author with no response.
>>>>
>>>> Thank you!
>>>>
>>>> On Wed, Jun 2, 2021 at 9:52 AM J C Nash  wrote:
>>>>
>>>>> I just downloaded the source matrixcalc package to see what it
>>> contained.
>>>>> The functions
>>>>> I looked at seem fairly straightforward and the OP could likely
>> develop
>>>>> equivalent features
>>>>> in his own code, possibly avoiding a function call. Avoiding the
>>> function
>>>>> call means NAMESPACE etc. are not involved, so fewer places for
>> getting
>>>>> into
>>>>> trouble, assuming the inline code works properly.
>>>>>
>>>>> JN
>>>>>
>>>>>
>>>>> On 2021-06-02 12:37 p.m., Duncan Murdoch wrote:
>>>>>> On 02/06/

Re: [R-pkg-devel] Question about preventing CRAN package archival

2021-06-02 Thread J C Nash
I just downloaded the source matrixcalc package to see what it
contained. The functions I looked at seem fairly straightforward and
the OP could likely develop equivalent features in his own code,
possibly avoiding a function call. Avoiding the function call means
NAMESPACE etc. are not involved, so fewer places for getting into
trouble, assuming the inline code works properly.

JN


On 2021-06-02 12:37 p.m., Duncan Murdoch wrote:
> On 02/06/2021 12:13 p.m., Ben Staton wrote:
>> Hello,
>>
>> I received an email notice from CRAN indicating that my R package
>> ('postpack') will be archived soon if I do not take any action and I want
>> to avoid that outcome. The issue is not caused by my package, but instead a
>> package that my package depends on:
>>
>> "... package 'matrixcalc' is now scheduled for archival on 2021-06-09,
>> and archiving this will necessitate also archiving its strong reverse
>> dependencies."
>>
>> Evidently, xyz has been returning errors on new R builds prompting CRAN to
>> list it as a package to be archived. My package, 'postpack' has
>> 'matrixcalc' listed in the Imports field, which I assume is why I received
>> this email.
>>
>> I want to keep 'postpack' active and don't want it to be archived. I still
>> need package 'matrixcalc' for my package, but not for most functions. Could
>> I simply move package 'matrixcalc' to the Suggests list and submit the new
>> version to CRAN to remove the "Strong Reverse Dependency" issue that
>> triggered this email to avoid CRAN from archiving my package?
> 
> That's part of one solution, but not the best solution.
> 
> If you move it to Suggests, you should make sure that your package checks for 
> it before every use, and falls back to
> some other calculation if it is not present.  Be aware that once it is 
> archived, almost none of your users will have it
> available, so this is kind of like dropping the functions that it supports.
> 
> Another solution which would be great for the community might be for you to 
> offer to take over as maintainer of
> matrixcalc.  Then you'd fix whatever problems it has, and you wouldn't need 
> to worry about it.  I haven't looked at the
> issues so I don't know if this is feasible.
> 
> A third choice would be for you to copy the functions you need from 
> matrixcalc into your own package so you can drop the
> dependency.  This is generally legal under the licenses that CRAN accepts, 
> but you should check anyway.
> 
> A fourth choice would be for you to contact the matrixcalc maintainer, and 
> help them to fix the issues so that
> matrixcalc doesn't get archived.  They may or may not be willing to work with 
> you.
> 
> I'd say my third choice is the best choice in the short term, and 2nd or 4th 
> would be good long term solutions.
> 
> Duncan Murdoch
> 



[Rd] GSoC project "Improvement to nls()"

2021-05-26 Thread J C Nash
This message is to let R developers know that the project in the Subject is now
a Google Summer of Code project.

Our aim in this project is to find simplifications and corrections to
the nls() code, which has become heavily patched. Moreover, it has some
deficiencies, in that there is no Marquardt stabilization and it is
likely the jacobian (called gradient in R) computations are less than
ideal. On the other hand, it has a lot of features and capabilities.

A correction I proposed to avoid the "small residual" issue (when
models are nearly perfectly fittable) is now in R-devel. Using a new
nls.control parameter one can avoid failure, but the default value of 0
leaves legacy behaviour. We hope to be able to use similar approaches
so existing nls() example output is unaltered.

It is likely we will only partially meet our goals:
- to document and possibly simplify the existing code
- to correct some minor issues in documentation or function
- to find a graceful way to incorporate a Marquardt stabilization into the
  Gauss-Newton iteration
- to document, evaluate, and possibly improve the jacobian computation

all within the context that any changes impose minimal nuisance for R workers.
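The Marquardt stabilization referred to above amounts to damping the
Gauss-Newton normal equations: solve (J'J + lambda*D) delta = -J'r
rather than J'J delta = -J'r, with D often taken as diag(J'J). A toy
sketch (my own illustration, not project code; marq_step is a made-up
name):

```r
# One damped (Levenberg-Marquardt) step; J is the Jacobian of the
# residuals r at the current parameters, lambda the damping factor.
marq_step <- function(J, r, lambda = 1e-2) {
  A <- crossprod(J)                         # J'J
  A <- A + lambda * diag(diag(A), nrow(A))  # add lambda * diag(J'J)
  solve(A, -crossprod(J, r))                # the step delta
}
```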

Some of these efforts overlap the nlsr, minpack.lm and likely other packages,
and may suggest improvements there also.

There is a gitlab repository established at https://gitlab.com/nashjc/improvenls.
We welcome interest and participation off list except for bugs in the current R
function(s).

John Nash
University of Ottawa



Re: [R-pkg-devel] Problems with too much testing

2021-04-16 Thread J C Nash
I'm generally in accord with Duncan on this. There are inevitably
situations where general rules don't apply. Our challenge is to find
practical ways to keep the overall workload of all participants in the
process to a minimum.

JN

On 2021-04-16 10:18 a.m., Duncan Murdoch wrote:
> On 16/04/2021 9:49 a.m., J C Nash wrote:
>> Another approach is to change the responsibility.
>>
>> My feeling is that tests in the TESTING package should be modifiable
>> by the maintainer of the TESTED package, with both packages suspended
>> if the two maintainers cannot agree. We need to be able to move
>> forward when legacy behaviour is outdated or just plain wrong. Or, in
>> the case that I find affects me, when improvements in iterative
>> schemes change iterates slightly. My guess is that Duncan's example
>> is a case in point.
>>
>> I doubt this will ever occur, as it doesn't seem to be the R way.
>> However, I do know that improvements in methods are not going to CRAN
>> in some cases.
> 
> In the cases I've been involved with the authors of the testing package have 
> accepted suggested changes when I've made
> them:  I think that's also part of "the R way".  However, this takes time for 
> both of us:  I need to understand what
> they are intending to test before I can suggest a change to it, and they need 
> to understand my change before they can
> decide if it is acceptable, or whether further changes would also be 
> necessary.
> 
> Github helps a lot with this:  if the testing package is there, I can quickly 
> reproduce the issue, produce a fix, and
> send it to the author, who can tweak it if I've set things up properly.
> 
> For the kinds of changes you're making, I suspect relaxing a tolerance would 
> often be enough, though if you switch
> algorithms and record that in your results, the testing package may need to 
> replace reference values.  I think I'd be
> uncomfortable doing that.
> 
> Duncan Murdoch
>



Re: [R-pkg-devel] Problems with too much testing

2021-04-16 Thread J C Nash
Another approach is to change the responsibility.

My feeling is that tests in the TESTING package should be modifiable by
the maintainer of the TESTED package, with both packages suspended if
the two maintainers cannot agree. We need to be able to move forward
when legacy behaviour is outdated or just plain wrong. Or, in the case
that I find affects me, when improvements in iterative schemes change
iterates slightly. My guess is that Duncan's example is a case in point.

I doubt this will ever occur, as it doesn't seem to be the R way.
However, I do know that improvements in methods are not going to CRAN
in some cases.

JN



[Rd] Google Summer of Code for R projects / mentors needed

2021-01-16 Thread J C Nash
One of the mechanisms by which R has been extended and improved has been
through the efforts of students and mentors in the Google Summer of
Code initiatives. This year Toby Hocking (along with others) has
continued to lead this effort.

This year, Google has changed the format somewhat so that the projects
are shorter. There will likely be more of them.

For success, we have learned that projects need at least 2 mentors -- illness,
life events, world events and holidays can get in the way of monitoring student
work and handling the minor but critical short reports to Google to ensure money
gets to deserving students (and does not get sent otherwise!).

Please consider mentoring and/or proposing a project. See
https://github.com/rstats-gsoc/gsoc2021/wiki/

As an example, my own proposal concerns improving the behaviour and features
of the nls() function. 
https://github.com/rstats-gsoc/gsoc2021/wiki/Improvements-to-nls()

John Nash



Re: [Rd] URL checks

2021-01-12 Thread J C Nash


Sorry, Martin, but I've NOT commented on this matter, unless someone has been 
impersonating me.
Someone else?

JN


On 2021-01-11 4:51 a.m., Martin Maechler wrote:
>> Viechtbauer, Wolfgang (SP) 
>> on Fri, 8 Jan 2021 13:50:14 + writes:
> 
> > Instead of a separate file to store such a list, would it be an idea
> > to add versions of the \href{}{} and \url{} markup commands that are
> > skipped by the URL checks?
> > Best,
> > Wolfgang
> 
> I think John Nash and you misunderstood -- or then I
> misunderstood -- the original proposal:
> 
> I've been understanding that there should be a  "central repository" of URL
> exceptions that is maintained by volunteers.
> 
> And rather *not* that package authors should get ways to skip
> URL checking..
> 
> Martin
> 
> 
> >> -Original Message-
> >> From: R-devel [mailto:r-devel-boun...@r-project.org] On Behalf Of
> >> Spencer Graves
> >> Sent: Friday, 08 January, 2021 13:04
> >> To: r-devel@r-project.org
> >> Subject: Re: [Rd] URL checks
> >> 
> >> I also would be pleased to be allowed to provide "a list of known
> >> false-positive/exceptions" to the URL tests.  I've been challenged
> >> multiple times regarding URLs that worked fine when I checked them.  We
> >> should not be required to do a partial lobotomy to pass R CMD check ;-)
> >> 
> >> Spencer Graves
> >> 
> >> On 2021-01-07 09:53, Hugo Gruson wrote:
> >>> 
> >>> I encountered the same issue today with
> >>> https://astrostatistics.psu.edu/.
> >>> 
> >>> This is a trust chain issue, as explained here:
> >>> https://whatsmychaincert.com/?astrostatistics.psu.edu.
> >>> 
> >>> I've worked for a couple of years on a project to increase HTTPS
> >>> adoption on the web and we noticed that this type of error is very
> >>> common, and that website maintainers are often unresponsive to
> >>> requests to fix this issue.
> >>> 
> >>> Therefore, I totally agree with Kirill that a list of known
> >>> false-positive/exceptions would be a great addition to save time
> >>> to both the CRAN team and package developers.
> >>> 
> >>> Hugo
> >>> 
> >>> On 07/01/2021 15:45, Kirill Müller via R-devel wrote:
>  One other failure mode: SSL certificates trusted by browsers that are
>  not installed on the check machine, e.g. the "GEANT Vereniging"
>  certificate from https://relational.fit.cvut.cz/ .
>  
>  K
>  
>  On 07.01.21 12:14, Kirill Müller via R-devel wrote:
> > Hi
> > 
> > The URL checks in R CMD check test all links in the README and
> > vignettes for broken or redirected links. In many cases this 
> improves
> > documentation, I see problems with this approach which I have
> > detailed below.
> > 
> > I'm writing to this mailing list because I think the change needs to
> > happen in R's check routines. I propose to introduce an "allow-list"
> > for URLs, to reduce the burden on both CRAN and package maintainers.
> > 
> > Comments are greatly appreciated.
> > 
> > Best regards
> > 
> > Kirill
> > 
> > # Problems with the detection of broken/redirected URLs
> > 
> > ## 301 should often be 307, how to change?
> > 
> > Many web sites use a 301 redirection code that probably should be a
> > 307. For example, https://www.oracle.com and https://www.oracle.com/
> > both redirect to https://www.oracle.com/index.html with a 301. I
> > suspect the company still wants oracle.com to be recognized as the
> > primary entry point of their web presence (to reserve the right to
> > move the redirection to a different location later), I haven't
> > checked with their PR department though. If that's true, the redirect
> > probably should be a 307, which should be fixed by their IT
> > department which I haven't contacted yet either.
> > 
> > $ curl -i https://www.oracle.com
> > HTTP/2 301
> > server: AkamaiGHost
> > content-length: 0
> > location: https://www.oracle.com/index.html
> > ...
> > 
> > ## User agent detection
> > 
> > twitter.com responds with a 400 error for requests without a user
> > agent string hinting at an accepted browser.
> > 
> > $ curl -i https://twitter.com/
> > HTTP/2 400
> > ...
> > ...Please switch to a supported browser..
> > 
> > $ curl -s -i https://twitter.com/ -A "Mozilla/5.0 (X11; Ubuntu; Linux
> > x86_64; rv:84.0) Gecko/20100101 Firefox/84.0" | head -n 1
> > HTTP/2 200
> > 
> > # Impact
> > 
> > While the latter problem *could* be fixed 

Re: [Rd] URL checks

2021-01-09 Thread J C Nash
Is this a topic for Google Summer of Code? See
https://github.com/rstats-gsoc/gsoc2021/wiki


On 2021-01-09 12:34 p.m., Dirk Eddelbuettel wrote:
> 
> The idea of 'white lists' to prevent known (and 'tolerated') issues, note,
> warnings, ... from needlessly reappearing is very powerful and general, and
> can go much further than just URL checks.
> 
> I suggested several times in the past that we can look at the format Debian
> uses in its 'lintian' package checker and its override files -- which are
> used across thousands of packages there.  But that went nowhere so I stopped.
> 
> This issue needs a champion or two to implement a prototype as well as a
> potential R Core / CRAN sponsor to adopt it.  But in all those years no smoke
> has come out of any chimneys so ...  ¯\_(ツ)_/¯ is all we get.
> 
> Dirk
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] New URL redirect checks

2020-09-23 Thread J C Nash
Does this issue fit in the more general one of centralized vs
partitioned checks? I've suggested before that the CRAN team
seems (and I'll be honest and admit I don't have a good knowledge
of how they work) to favour an all-in-one checking, whereas it
might be helpful to developers and also widen the "CRAN checking"
team to partition checks. Partitioned checks would allow the
particular problems that are raised to be dealt with in a more
focussed action. URLs seem an obvious candidate, since
link checking is used outside of R packages.

I'm sure there are others besides myself who would contribute
to such activities. After all, the partitioned checks could be
contributed packages themselves.

JN
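
A partitioned URL check along these lines can be prototyped directly on R's own
helpers. A minimal sketch, assuming the unexported internals tools:::url_db()
and tools:::check_url_db() (their interfaces are not guaranteed stable across
R versions):

```r
## Sketch only: build a small URL database and check it, outside of a
## full R CMD check run. The URLs here are illustrative.
db <- tools:::url_db(c("https://cran.r-project.org/", "https://nyti.ms"),
                     parents = c("README", "README"))
tools:::check_url_db(db)  # reports broken or permanently redirected URLs
```

A contributed package wrapping such a check could run independently of the
centralized CRAN machinery, as suggested above.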



On 2020-09-22 9:50 p.m., Yihui Xie wrote:
> Me too. I have changed some valid URLs in \url{} to \verb{} just to
> avoid these check NOTEs. I do appreciate the check for the validity of
> URLs in packages, especially those dead links (404), but discouraging
> URLs with status code other than 200 (such as 301) feels like
> overdoing the job. After I "hide" links from R CMD check with \verb{}
> , it will be hard to know if these links are still valid in the
> future.
> 
> Regards,
> Yihui
> 
> On Tue, Sep 22, 2020 at 1:17 PM Kevin Wright  wrote:
>>
>> Isn't the whole concept of DOI basically link-shortening/redirecting?
>>
>> For example, this link
>> https://doi.org/10.2134/agronj2016.07.0395
>> redirects to
>> https://acsess.onlinelibrary.wiley.com/doi/abs/10.2134/agronj2016.07.0395
>>
>> As a side note, I got so fed up with CRAN check complaints about (perfectly 
>> valid) re-directs that I refuse to use the \url{} tag anymore.
>>
>> Kevin
>>
>>
>> On Thu, Sep 17, 2020 at 8:32 AM Yihui Xie  wrote:
>>>
>>> I don't have an opinion on the URL shorteners, but how about the
>>> original question? Redirection can be extremely useful in general.
>>> Shortening URLs is only one of its possible applications. FWIW, CRAN
>>> uses (303) redirect itself, e.g.,
>>> https://cran.r-project.org/package=MASS is redirected to
>>> https://cran.r-project.org/web/packages/MASS/index.html Should these
>>> "canonical" CRAN links be disallowed in packages, too? Just as another
>>> example, https://cran.r-project.org/bin/windows/base/release.html is
>>> redirected to the latest Windows installer of R (through the 
>>> tag).
>>>
>>> If the intent of the new URL redirect check is to disallow using URL
>>> shorteners like bit.ly or nyti.ms, that may be fair, but it it is to
>>> disallow using any URLs that are redirected, I think this CRAN policy
>>> may be worth a reconsideration.
>>>
>>> Regards,
>>> Yihui
>>> --
>>> https://yihui.org
>>>
>>>
>>> On Thu, Sep 17, 2020 at 3:26 AM Gábor Csárdi  wrote:

 Right, I am sorry, I did not realize the security aspect here. I guess
 I unconsciously treated CRAN package authors as a trusted source.

 Thanks for the correction and clarification, and to CRAN for
 implementing these checks. :)

 G.

 On Wed, Sep 16, 2020 at 10:50 PM Duncan Murdoch
  wrote:
>
> On 16/09/2020 4:51 p.m., Simon Urbanek wrote:
>> I can't comment for CRAN, but generally, shorteners are considered 
>> security risk so regardless of the 301 handling I think flagging those 
>> is a good idea. Also I think it is particularly bad to use them in 
>> manuals because it hides the target so the user has no idea what they 
>> will get.
>
> I agree, and we do have \href{}{} in Rd files and similar in other
> formats for giving text of a link different than the URL if the URL is
> inconveniently long.  There's still a bit of a security issue though:
> the built in help browser (at least in MacOS) doesn't show the full URL
> when you hover over the link, as most browsers do.  So one could have
>
> \href{https://disney.org}{https://horrible.web.site}
>
> Duncan Murdoch
>
>
>>
>> Cheers,
>> Simon
>>
>>
>>> On Sep 17, 2020, at 5:35 AM, Gábor Csárdi  
>>> wrote:
>>>
>>> Dear all,
>>>
>>> the new CRAN URL checks flag HTTP 301 redirects. While I understand
>>> the intent, I think this is unfortunate, because several URL shortener
>>> services use 301 redirects, and often a shorter URL is actually better
>>> in a manual page than a longer one that can be several lines long in
>>> the console and also potentially truncated in the PDF manual.
>>>
>>> Some example shorteners that are flagged:
>>>
 db <- tools:::url_db(c("https://nyti.ms", "https://t.co/mtXLLfYOYE"), "README")
 tools:::check_url_db(db)
>>> URL: https://nyti.ms (moved to https://www.nytimes.com/)
>>> From: README
>>> Status: 200
>>> Message: OK
>>>
>>> URL: https://t.co/mtXLLfYOYE (moved to
>>> https://www.bbc.co.uk/news/blogs-trending-47975564)
>>> From: README
>>> Status: 200
>>> Message: OK
>>>
>>> 

Re: [R-pkg-devel] install.packages() seems not to select the latest suitable version

2020-07-28 Thread J C Nash
Possibly the "old" site-library is not getting over-written. I had to
manually delete.

See https://www.mail-archive.com/r-help@r-project.org/msg259132.html

JN

On 2020-07-28 7:21 a.m., Dirk Eddelbuettel wrote:
> 
> Hi Adelchi,
> 
> On 28 July 2020 at 11:46, Adelchi Azzalini wrote:
> | When I updated package mnormt to version 2.0.0 in June (now at 2.0.1), 
> | at the stage of --as-cran checking, there was a compilation error,  
> | which was overcome by setting the 
> | 
> | Depends:R (≥ 4.0.0)
> | 
> | With this option, all worked fine.
> | 
> | However, shortly afterwards, complaints started coming, 
> | either from users or from maintainers of packages making use of mnormt,
> | because this high version dependence causes troubles to some people,
> | such as those using Debian installations, currently at a much lower 
> | R version.
> 
> You can point those users to a) the r-sig-debian list and b) the Debian
> directory at CRAN as we have always had "backports" of the current R to older
> Debian releases---thanks to the work by Johannes Ranke "backporting" whatever
> my current Debian packages of R are.
> 
> Moreover, you can also point them at `apt install r-cran-mnormt` -- I have
> maintained your package within Debian since 2007 (!!) and continue to do so
> giving Debian (and Ubuntu) users the choice between a distro binary and
> installation from CRAN source. 
>  
> | At the time I select that dependence value, I relied on the fact that
> | install.packages() selected the most recent suitable version of a package,
> | given the existing R installation. I expected that people without
> | R 4.0.0 would have the older version of mnormt, 1.5-7, installed.
> | As my memory goes (and the memory of other people too), this was
> | how things worked in the past, but apparently not any more. 
> 
> I don't think that is quite correct. The CRAN repo is always set up for the
> currently released version, and may allow constraints such 'R (>= 4.0.0)'
> imposing the current (major) release.
> 
> There is no recent change in this behavior.
> 
> | For instance, this is a passage from a specific user:
> |  
> | "install.packages() used to just install the most recent available 
> | for your current version of R.  In the past it might have done just that, 
> | but that's clearly not the case currently."
> 
> Yes and no. I don't think this correctly stated. `install.packages()` always
> picks the most recent version, but this may also require running _the
> current_ R release.  I disagree about "not the case currently" -- no change
> as stated above.
> 
> | Can anyone clarify the reason of this (apparent? real?) change?
> | ...and possibly indicate a way who people with lower R version (and perhaps
> | limited R expertise) can install the older version of mnormt, 1.5-7, 
> | without much hassle?
> 
> "Versioned" installs were never supported by `install.packages()`.
> 
> But one could always download an older version to a local file, and point
> install.packages() at that file (and setting 'repos=NULL'), or use `R CMD
> INSTALL` directly. No change there either.
> 
> Dirk
>
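
A minimal sketch of the manual route Dirk describes (the Archive URL layout is
assumed from CRAN's usual conventions):

```r
## Install an older source version by downloading its tarball and pointing
## install.packages() at the local file with repos = NULL.
url <- "https://cran.r-project.org/src/contrib/Archive/mnormt/mnormt_1.5-7.tar.gz"
download.file(url, destfile = basename(url))
install.packages(basename(url), repos = NULL, type = "source")
```

Equivalently, `R CMD INSTALL mnormt_1.5-7.tar.gz` from a shell does the same job.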

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Anyone Know How To Setup Wine for Windows Testing?

2020-07-15 Thread J C Nash
Are you sure you want to try to run R etc. under Wine?

- If you have Windows running, either directly or in a VM, you can run R there.
- If you have Windows and want to run R under some other OS, then set up a VM
e.g., Linux Mint, for that. I sometimes test R for Windows in a VirtualBox VM
for Win10, but generally run in Linux Mint. I've also run R in some Linux VMs
to test for specific dependencies in some distros.

I rather doubt R will run very well in Linux under Wine. My experience with Wine
is that a few apps (e.g. Irfanview) run well, but many give lots of trouble.

JN


On 2020-07-15 1:17 p.m., Steve Bronder wrote:
> Does anyone know of a setup guide for getting R and Rtools 4.0 up and
> running on Wine with the Windows Server 2008 R2 VM? Do other maintainers
> with more knowhow think that would be useful for debugging purposes?
> 
> I've been trying to test out some flto gcc things for windows by setting up
> a local wine VM on my ubuntu box. Wine has an option for Windows Server
> 2008 R2 (which I believe is the windows session CRAN uses?) If anyone has
> done this before and knows of a guide somewhere that would be very helpful!
> 
> Regards,
> 
> Steve Bronder
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Help useRs to use R's own Time/Date objects more efficiently

2020-04-04 Thread J C Nash
As with many areas of R usage, my view is that the concern is one
of making it easier to find appropriate information quickly. The
difficulty is that different users have different needs. So if
one wants to know (most of) what is available, the Time Series
Task View is helpful. If one is a novice, it may now be rather
daunting, while I've found, as a long time user of different software,
that I have to dig to find what I need.

In optimization I have tried -- and have had several false starts --
to unify several packages. That could be helpful for time and date
functions.

Another possibility could be to put the "see" and "see also" information
at the TOP of the documentation rather than lower down, and also to
refer to Task Views and possibly other -- eventually R-project --
documentation objects. I happen to favour wiki-like approaches, but
there has not been much movement towards that yet. We R people are
quite strong individualists, but perhaps more team minded thinking
would help. Some of us are getting beyond our best-before date.

However, I support Martin's intent, and hope there will be attempts
in these directions.

Best,

John Nash


On 2020-04-04 5:49 a.m., Martin Maechler wrote:
> This is mostly a RFC  [but *not* about the many extra packages, please..]:
> 
> Noticing to my chagrin  how my students work in a project,
> googling for R code and cut'n'pasting stuff together, accumulating
> this and that package on the way  all just for simple daily time series
> (though with partly missing parts),
> using chron, zoo, lubridate, ...  all for things that are very
> easy in base R *IF* you read help pages and start thinking on
> your own (...), I've noted once more that the above "if" is a
> very strong one, and seems to happen rarely nowadays by typical R users...
> (yes, I stop whining for now).
> 
> In this case, I propose to slightly improve the situation ...
> by adding a few more lines to one help page [[how could that
> help in the age where "google"+"cut'n'paste" has replaced thinking ? .. ]] :
> 
> On R's own ?Dates  help page (and also on ?DateTimeClasses )
> we have pointers, notably
> 
> See Also:
> 
>  ...
>  ...
>  
>  'weekdays' for convenience extraction functions.
> 
> So people must find that and follow the pointer
> (instead of installing one of the dozen helper packages).
> 
> Then on that page, one sees  weekdays(), months() .. julian()
> in the usage ... which don't seem directly helpful for a person
> who needs more.  If that person is diligent and patient (as good useRs are 
> ;-),
> she finds
> 
>Note:
> 
>   Other components such as the day of the month or the year are very
>   easy to compute: just use 'as.POSIXlt' and extract the relevant
>   component.  Alternatively (especially if the components are
>   desired as character strings), use 'strftime'.
> 
> 
> But then, nowadays, the POSIXlt class is not so transparent to the
> non-expert anymore (as it behaves very much like POSIXct, and
> not like a list for good reasons) .. and so 97%  of R users will
> not find this "very easy".
> 
> For this reason, I propose to at add the following to the
> 'Examples:' section of the help file ...
> and I hope that also readers of  R-devel  who have not been
> aware of how to do this nicely,  will now remember (or remember
> where to look?).
> 
> I at least will tell my students in the future to use these or
> write versions of these simple utility functions.
> 
> 
> 
> 
> ## Show how easily you get month, day, year, day (of {month, week, yr}), ... :
> ## (remember to count from 0 (!): mon = 0..11, wday = 0..6,  etc !!)
> 
> ##' Transform (Time-)Date vector  to  convenient data frame :
> dt2df <- function(dt, dName = deparse(substitute(dt)), stringsAsFactors = FALSE) {
> DF <- as.data.frame(unclass(as.POSIXlt( dt )), stringsAsFactors=stringsAsFactors)
> `names<-`(cbind(dt, DF, deparse.level=0L), c(dName, names(DF)))
> }
> dt2df(.leap.seconds)# date+time
> dt2df(Sys.Date() + 0:9) # date
> 
> ##' Even simpler:  Date -> Matrix:
> d2mat <- function(x) simplify2array(unclass(as.POSIXlt(x)))
> d2mat(seq(as.Date("2000-02-02"), by=1, length.out=30)) # has R 1.0.0's release date
> 
> 
> 
> In the distant past / one of the last times I touched on people
> using (base) R's  Date / Time-Date  objects, I had started
> thinking if we should not provide some simple utilities to "base R"
> (not in the 'base' pkg, but rather 'utils') for "extracting" from
> {POSIX(ct), Date} objects ... and we may have discussed that
> within R Core 20 years ago,  and had always thought that this
> shouldn't be hard for useRs themselves to see how to do...
> 
> But then I see that "everybody" uses extension packages instead,
> even in the many situations where there's no gain doing so, 
> but rather increases the 

Re: [Rd] unstable corner of parameter space for qbeta?

2020-03-26 Thread J C Nash
Despite the need to focus on pbeta, I'm still willing to put in some effort.
But I find it really helps to have 2-3 others involved, since the questions back
and forth keep matters moving forward. Volunteers?

Thanks to Martin for detailed comments.

JN


On 2020-03-26 10:34 a.m., Martin Maechler wrote:
>>>>>> J C Nash 
>>>>>> on Thu, 26 Mar 2020 09:29:53 -0400 writes:
> 
> > Given that a number of us are housebound, it might be a good time to try to
> > improve the approximation. It's not an area where I have much expertise, but in
> > looking at the qbeta.c code I see a lot of root-finding, where I do have some
> > background. However, I'm very reluctant to work alone on this, and will ask
> > interested others to email off-list. If there are others, I'll report back.
> 
> Hi John.
> Yes, qbeta() {in its "main branches"}  does zero finding, but
> zero finding of   pbeta(...) - p*   and I tried to explain in my
> last e-mail that the real problem is that already pbeta() is not
> accurate enough in some unstable corners ...
> The order fixing should typically be
> 1) fix pbeta()
> 2) look at qbeta() which now may not even need a fix because its
>problems may have been entirely a consequence of pbeta()'s inaccuracies.
>And if there are cases where the qbeta() problems are not
>only pbeta's "fault", it is still true that the fixes that
>would still be needed crucially depend on the detailed
>working of the function whose zero(s) are sought, i.e.,  pbeta()
> 
> > Ben: Do you have an idea of parameter region where approximation is poor?
> > I think that it would be smart to focus on that to start with.
> 
> 
> 
> Rmpfr  matrix-/vector - products:
> 
> > Martin: On a separate precision matter, did you get my query early in the
> > year about double length accumulation of inner products of vectors in Rmpfr?
> > R-help more or less implied that Rmpfr does NOT use extra length. I've been
> > using David Smith's FM Fortran where the DOT_PRODUCT does use double length,
> > but it would be nice to have that in R. My attempts to find "easy" workarounds
> > have not been successful, but I'll admit that other things took precedence.
> 
> Well, the current development version of 'Rmpfr' on R-forge now
> contains facilities to enlarge the precision of the computations
> by a factor 'fPrec' with default 'fPrec = 1';
> notably, instead of  x %*% y   (where the `%*%` cannot have more
> than two arguments) does have a counterpart  matmult(x,y, )
> which allows more arguments, namely 'fPrec', or directly 'precBits';
> and of course there are  crossprod() and tcrossprod() one should
> use when applicable and they also got the  'fPrec' and
> 'precBits' arguments.
> 
> {The %*% etc precision increase still does not work optimally
>  efficiency wise, as it simply increases the precision of all
>  computations by just increasing the precision of x and y (the inputs)}.
> 
> The whole  Matrix and Matrix-vector arithmetic is still
> comparibly slow in Rmpfr .. mostly because I valued human time
> (mine!) much higher than computer time in its implementation.
> That's one reason I would never want to double the precision
> everywhere as it decreases speed even more, and often times
> unnecessarily: doubling the accuracy is basically "worst-case
> scenario" precaution
> 
> Martin
>
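
A minimal sketch of the extended-precision facilities Martin describes
(crossprod() for mpfr objects is part of Rmpfr; the matmult()/fPrec additions
he mentions were, at that point, only in the development version on R-forge):

```r
## Inner product computed at a user-chosen precision with Rmpfr (sketch).
library(Rmpfr)
x <- mpfr(1:3, precBits = 120)
y <- mpfr(c(1e-20, 1, 1e20), precBits = 120)
crossprod(x, y)  # 1 x 1 mpfr result, computed at 120-bit precision
```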

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] unstable corner of parameter space for qbeta?

2020-03-26 Thread J C Nash
Given that a number of us are housebound, it might be a good time to try to
improve the approximation. It's not an area where I have much expertise, but in
looking at the qbeta.c code I see a lot of root-finding, where I do have some
background. However, I'm very reluctant to work alone on this, and will ask
interested others to email off-list. If there are others, I'll report back.

Ben: Do you have an idea of parameter region where approximation is poor?
I think that it would be smart to focus on that to start with.

Martin: On a separate precision matter, did you get my query early in the year
about double
length accumulation of inner products of vectors in Rmpfr? R-help more or
less implied that Rmpfr does NOT use extra length. I've been using David
Smith's FM Fortran where the DOT_PRODUCT does use double length, but it
would be nice to have that in R. My attempts to find "easy" workarounds have
not been successful, but I'll admit that other things took precedence.

Best,

John Nash



On 2020-03-26 4:02 a.m., Martin Maechler wrote:
>> Ben Bolker 
>> on Wed, 25 Mar 2020 21:09:16 -0400 writes:
> 
> > I've discovered an infelicity (I guess) in qbeta(): it's not a bug,
> > since there's a clear warning about lack of convergence of the numerical
> > algorithm ("full precision may not have been achieved").  I can work
> > around this, but I'm curious why it happens and whether there's a better
> > workaround -- it doesn't seem to be in a particularly extreme corner of
> > parameter space. It happens, e.g., for  these parameters:
> 
> > phi <- 1.1
> > i <- 0.01
> > t <- 0.001
> > shape1 = i/phi  ##  0.009090909
> > shape2 = (1-i)/phi  ## 0.9
> > qbeta(t,shape1,shape2)  ##  5.562685e-309
> > ##  brute-force uniroot() version, see below
> > Qbeta0(t,shape1,shape2)  ## 0.9262824
> 
> > The qbeta code is pretty scary to read: the warning "full precision
> > may not have been achieved" is triggered here:
> 
> > 
> https://github.com/wch/r-source/blob/f8d4d7d48051860cc695b99db9be9cf439aee743/src/nmath/qbeta.c#L530
> 
> > Any thoughts?
> 
> Well,  qbeta() is mostly based on inverting pbeta()  and pbeta()
> has *several* "dangerous" corners in its parameter spaces
> {in some cases, it makes sense to look at the 4 different cases
>  log.p = TRUE/FALSE  //  lower.tail = TRUE/FALSE  separately ..}
> 
> pbeta() itself is based on the most complex numerical code in
> all of base R, i.e., src/nmath/toms708.c  and that algorithm
> (TOMS 708) had been sophisticated already when it was published,
> and it has been improved and tweaked several times since being
> part of R, notably for the log.p=TRUE case which had not been in
> the focus of the publication and its algorithm.
> [[ NB: part of this you can read when reading  help(pbeta)  to the end ! ]]
> 
> I've spent many "man weeks", or even "man months" on pbeta() and
> qbeta(), already and have dreamed to get a good student do a
> master's thesis about the problem and potential solutions I've
> looked into in the mean time.
> 
> My current gut feeling is that in some cases, new approximations
> are necessary (i.e. tweaking of current approximations is not
> going to help sufficiently).
> 
> Also note (in the R sources)  tests/p-qbeta-strict-tst.R
> a whole file of "regression tests" about  pbeta() and qbeta()
> {where part of the true values have been computed with my CRAN
> package Rmpfr (for high precision computation) with the
> Rmpfr::pbetaI() function which gives arbitrarily precise pbeta()
> values but only when  (a,b) are integers -- that's the "I" in pbetaI().
> 
> Yes, it's intriguing ... and I'll look into your special
> findings a bit later today.
> 
> 
>   > Should I report this on the bug list?
> 
> Yes, please.  Not all problem of pbeta() / qbeta() are part yet,
> of R's bugzilla data base,  and maybe this will help to draw
> more good applied mathematicians look into it.
> 
> 
> 
> Martin Maechler
> ETH Zurich and R Core team
> (I'd call myself the "dpq-hacker" within R core -- related to
>  my CRAN package 'DPQ')
> 
> 
> > A more general illustration:
> > http://www.math.mcmaster.ca/bolker/misc/qbeta.png
> 
> > ===
> > fun <- function(phi,i=0.01,t=0.001, f=qbeta) {
> > f(t,shape1=i/phi,shape2=(1-i)/phi, lower.tail=FALSE)
> > }
> > ## brute-force beta quantile function
> > Qbeta0 <- function(t,shape1,shape2,lower.tail=FALSE) {
> > fn <- function(x) {pbeta(x,shape1,shape2,lower.tail=lower.tail)-t}
> > uniroot(fn,interval=c(0,1))$root
> > }
> > Qbeta <- Vectorize(Qbeta0,c("t","shape1","shape2"))
> > curve(fun,from=1,to=4)
> > curve(fun(x,f=Qbeta),add=TRUE,col=2)
> 
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> 
> __
> R-devel@r-project.org mailing list
> 

[R-pkg-devel] General considerations about vignettes

2019-08-30 Thread J C Nash
I'm seeking some general advice about including vignettes in my packages,
which are largely for nonlinear estimation and function minimization (optimization).
This means that my packages offer alternatives to many other tools, and the user
then has the chore of deciding which is appropriate. Bad choices can be very
costly in inappropriate results or computational inefficiencies. Hence, I include
vignettes to offer comparisons and examples of use.

Unfortunately, as in a case this week, changes in the comparison packages break
my package(s), and I get an email from CRAN telling me to fix it before some
date not far in the future. This means a) work for me, possibly at an inopportune
time; b) risk of loss of capability, in the present case in the nlsr package which
offers some unique capabilities, and c) extra work for CRAN for what is, arguably,
updating of peripheral documentation. Updating optimization packages on CRAN can be,
I have discovered, a very time-consuming task. Package optimx took over 3 months
to get updated.

It should be noted in the present situation that just before I got the msg from
CRAN I got a msg from the maintainer of the package that has changed and breaks
the vignette with some suggestions on a fix. The issue is that his package has
changed function syntax -- a situation all of us know is fraught with troubles,
since improvements may cause breakage.

I am NOT saying that my vignettes should not be updated. However, I'm wondering
if I should set up a repository for my vignettes on Github/Gitlab or similar, and
simply link to them. This would separate the updating of vignettes from the central
packages. Their updating could be less strictly tied to CRAN activities, and could
also be a task undertaken by others who are not listed as maintainer.

I'd welcome some (hopefully constructive) comments. Would CRAN maintainers feel
this to be helpful, or does it lower the value of official R packages? Do
other maintainers experience the same requests, or do they just not include
vignettes (and many do not)?


John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
I've put my tiny package back up, but it likely is doing the same
thing as Bill Dunlap's. I took it off as a cleanup when I thought
the issue resolved.

http://web.ncf.ca/fh448/jfiles/fchk_2019-6.5.tar.gz

Seems I've disturbed the ant's nest.

JN



On 2019-06-07 1:53 p.m., William Dunlap wrote:
> I've attached a package, ppp_0.1.tar.gz (which probably will not get through 
> to R-help), that illustrates this.
> It contains one function which, by default, triggers a condition-length>1 
> issue:
>    f <- function(x = 1:3)
>    {
>        if (x > 1) {
>            x <- -x
>        }
>        stop("this function always gives an error")
>    }
> and the help file example is
>    try(f())
> 
> Then 
>    env _R_CHECK_LENGTH_1_CONDITION_=abort,verbose R-3.6.0 CMD check --as-cran 
> ppp_0.1.tar.gz
> results in
> * checking examples ... ERROR
> Running examples in ‘ppp-Ex.R’ failed
> The error most likely occurred in:
> 
>> base::assign(".ptime", proc.time(), pos = "CheckExEnv")
>> ### Name: f
>> ### Title: Cause an error
>> ### Aliases: f
>> ### Keywords: error
>>
>> ### ** Examples
>>
>> try(f())
>  --- FAILURE REPORT --
>  --- failure: the condition has length > 1 ---
>  --- srcref ---
> :
>  --- package (from environment) ---
> ppp
>  --- call from context ---
> f()
>  --- call from argument ---
> if (x > 1) {
>     x <- -x
> }
>  --- R stacktrace ---
> where 1: f()
> where 2: doTryCatch(return(expr), name, parentenv, handler)
> where 3: tryCatchOne(expr, names, parentenv, handlers[[1L]])
> where 4: tryCatchList(expr, classes, parentenv, handlers)
> where 5: tryCatch(expr, error = function(e) {
>     call <- conditionCall(e)
>     if (!is.null(call)) {
>         if (identical(call[[1L]], quote(doTryCatch)))
>             call <- sys.call(-4L)
>         dcall <- deparse(call)[1L]
>         prefix <- paste("Error in", dcall, ": ")
>         LONG <- 75L
>         sm <- strsplit(conditionMessage(e), "\n")[[1L]]
>         w <- 14L + nchar(dcall, type = "w") + nchar(sm[1L], type = "w")
>         if (is.na(w))
>             w <- 14L + nchar(dcall, type = "b") + nchar(sm[1L],
>                 type = "b")
>         if (w > LONG)
>             prefix <- paste0(prefix, "\n  ")
>     }
>     else prefix <- "Error : "
>     msg <- paste0(prefix, conditionMessage(e), "\n")
>     .Internal(seterrmessage(msg[1L]))
>     if (!silent && isTRUE(getOption("show.error.messages"))) {
>         cat(msg, file = outFile)
>         .Internal(printDeferredWarnings())
>     }
>     invisible(structure(msg, class = "try-error", condition = e))
> })
> where 6: try(f())
> 
>  --- value of length: 3 type: logical ---
> [1] FALSE  TRUE  TRUE
>  --- function from context ---
> function (x = 1:3)
> {
>     if (x > 1) {
>         x <- -x
>     }
>     stop("this function always gives an error")
> }
> 
> 
>  --- function search by body ---
> Function f in namespace ppp has this body.
>  --- END OF FAILURE REPORT --
> Fatal error: the condition has length > 1
> * checking PDF version of manual ... OK
> * DONE
> 
> Status: 1 ERROR, 1 NOTE
> See
>   ‘/tmp/bill/ppp.Rcheck/00check.log’
> for details.
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> 
> On Fri, Jun 7, 2019 at 10:21 AM Duncan Murdoch wrote:
> 
> On 07/06/2019 12:32 p.m., William Dunlap wrote:
> > The length-condition-not-equal-to-one checks will cause R to shutdown
> > even if the code in a tryCatch().
> 
> That's strange.  I'm unable to reproduce it with my tries, and John's
> package is no longer online.  Do you have an example I could look at?
> 
> Duncan Murdoch
> 
> >
> > Bill Dunlap
> > TIBCO Software
> > wdunlap tibco.com
> >
> >
> > On Fri, Jun 7, 2019 at 7:47 AM Duncan Murdoch wrote:
> >
> >     On 07/06/2019 9:46 a.m., J C Nash wrote:
> >      > Should try() not stop those checks from forcing an error?
> >
> >     try(stop("msg"))  will print the error message, but won't stop
> >     exec

Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Serguei picked up the glitch and Jeff sorted out the || vs | once any()
was used.

The test that caused the issue was not the one I was looking for,
but another case. However, I'd overlooked the possibility that
there could be different lengths, so || complained (as it should,
but didn't in regular R CMD check).

As an aside, the --as-cran check complains as follows:

The Title field should be in title case. Current version is:
‘A test of R CMD check --as-cran’
In title case that is:
‘A Test of R CMD Check --as-Cran’

Thanks to all.

JN
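
For reference, the fix discussed in this thread amounts to collapsing each
logical vector before the Boolean operator; a minimal sketch (fval here is
illustrative, not the actual fchk() code):

```r
fval <- c(0, Inf, NA)  # a vector returned by a user-supplied function
## Wrong: `||` expects length-1 operands, so this is flagged under
## _R_CHECK_LENGTH_1_LOGIC2_ (and is an error unconditionally from R 4.3.0):
##   if (is.infinite(fval) || is.na(fval)) ...
## Right: collapse each test with any() first.
if (any(is.infinite(fval)) || any(is.na(fval))) message("bad function value")
```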


On 2019-06-07 10:05 a.m., Jeff Newmiller wrote:
>>> any(is.infinite(fval)) || any(is.na(fval))
>>
>> a little typo here: it should be '|', not '||', right ?
> 
> Since `any` collapses the vectors to length 1 either will work, but I would 
> prefer `||`.
> 
> On June 7, 2019 6:51:29 AM PDT, Serguei Sokol  wrote:
>> On 07/06/2019 15:31, Sebastian Meyer wrote:
>>> The failure stated in the R CMD check failure report is:
>>>
>>>>   --- failure: length > 1 in coercion to logical ---
>>>
>>> This comes from --as-cran performing useful extra checks via setting the
>>> environment variable _R_CHECK_LENGTH_1_LOGIC2_, which means:
>>>
>>>> check if either argument of the binary operators && and || has
>>>> length greater than one.
>>>
>>> (see https://cran.r-project.org/doc/manuals/r-release/R-ints.html#Tools)
>>>
>>> The failure report also states the source of the failure:
>>>
>>>>   --- call from context ---
>>>> fchk(x, benbad, trace = 3, y)
>>>>   --- call from argument ---
>>>> is.infinite(fval) || is.na(fval)
>>>
>>> The problem is that both is.infinite(fval) and is.na(fval) return
>>> vectors of length 10 in your test case:
>>>
>>>>   --- value of length: 10 type: logical ---
>>>>   [1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
>>>
>>> The || operator works on length 1 Booleans. Since fval can be of
>> length
>>> greater than 1 at that point, the proper condition seems to be:
>>>
>>> any(is.infinite(fval)) || any(is.na(fval))
>> a little typo here: it should be '|', not '||', right ?
>>
>> Best,
>> Serguei.
>>
>>> Am 07.06.19 um 14:53 schrieb J C Nash:
>>>> Sorry reply not quicker. For some reason I'm not getting anything in
>> the thread I started!
>>>> I found the responses in the archives. Perhaps cc: nas...@uottawa.ca
>> please.
>>>>
>>>> I have prepared a tiny (2.8K) package at
>>>> http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz
>>>>
>>>> R CMD check --> OK
>>>>
>>>> R CMD check --as-cran --> 1 ERROR, 1 NOTE
>>>>
>>>> The error is in an example:
>>>>
>>>>> benbad<-function(x, y){
>>>>> # y may be provided with different structures
>>>>> f<-(x-y)^2
>>>>> } # very simple, but ...
>>>>>
>>>>> y<-1:10
>>>>> x<-c(1)
>>>>> cat("test benbad() with y=1:10, x=c(1)\n")
>>>>> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
>>>>> print(tryfc01)
>>>>> print(fc01)
>>>>
>>>> There's quite a lot of output, but it doesn't make much sense to me,
>> as
>>>> it refers to code that I didn't write.
>>>>
>>>> The function fchk is attempting to check if functions provided for
>>>> optimization do not violate some conditions e.g., character rather
>> than
>>>> numeric etc.
>>>>
>>>> JN
>>>>
>>>>
>>>> On 2019-06-07 8:44 a.m., J C Nash wrote:
>>>>> Uwe Ligges ||gge@ @end|ng |rom @t@t|@t|k@tu-dortmund@de
>>>>> Fri Jun 7 11:44:37 CEST 2019
>>>>>
>>>>>  Previous message (by thread): [R-pkg-devel] try() in R CMD
>> check --as-cran
>>>>>  Next message (by thread): [R-pkg-devel] using package data in
>> package code
>>>>>  Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
>>>>>
>>>>> Right, what problem are you talking about? Can you tell us which
>> check
>>>>> it is and what it actually complained about.
>>>>> There is no check that looks at the sizes of x and y in
>> expressions
>>>>> such as
>>>>> (x - y)^2.
>>>>> as far as I kn
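The `any()` fix settled in this thread (collapse each logical vector to length 1 before `||`, or use elementwise `|` and then `any()`) can be demonstrated with a short, self-contained sketch; `fval` here stands in for the length-10 vector from the fchk() example:

```r
# fval mimics the length > 1 result returned by benbad() in fchk()
fval <- c(0, Inf, NA, 3)
# Under --as-cran (_R_CHECK_LENGTH_1_LOGIC2_), the original condition
#   is.infinite(fval) || is.na(fval)
# fails because both operands of || have length greater than one.
bad1 <- any(is.infinite(fval)) || any(is.na(fval))  # collapse, then ||
bad2 <- any(is.infinite(fval) | is.na(fval))        # elementwise |, then any()
stopifnot(bad1, bad2, identical(bad1, bad2))
```

Both forms are equivalent here; the first short-circuits and so can skip the second test, which matters only if that test is expensive or can itself fail.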

Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Should try() not stop those checks from forcing an error?

I recognize that this is the failure -- it is indeed the check I'm trying to
catch -- but I don't want tests of such checks to fail my package.

JN

On 2019-06-07 9:31 a.m., Sebastian Meyer wrote:
> The failure stated in the R CMD check failure report is:
> 
>>  --- failure: length > 1 in coercion to logical ---
> 
> This comes from --as-cran performing useful extra checks via setting the
> environment variable _R_CHECK_LENGTH_1_LOGIC2_, which means:
> 
>> check if either argument of the binary operators && and || has length 
>> greater than one. 
> 
> (see https://cran.r-project.org/doc/manuals/r-release/R-ints.html#Tools)
> 
> The failure report also states the source of the failure:
> 
>>  --- call from context --- 
>> fchk(x, benbad, trace = 3, y)
>>  --- call from argument --- 
>> is.infinite(fval) || is.na(fval)
> 
> The problem is that both is.infinite(fval) and is.na(fval) return
> vectors of length 10 in your test case:
> 
>>  --- value of length: 10 type: logical ---
>>  [1] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
> 
> The || operator works on length 1 Booleans. Since fval can be of length
> greater than 1 at that point, the proper condition seems to be:
> 
> any(is.infinite(fval)) || any(is.na(fval))
> 
> Best regards,
> 
>   Sebastian
> 
> 
> Am 07.06.19 um 14:53 schrieb J C Nash:
>> Sorry reply not quicker. For some reason I'm not getting anything in the 
>> thread I started!
>> I found the responses in the archives. Perhaps cc: nas...@uottawa.ca please.
>>
>> I have prepared a tiny (2.8K) package at
>> http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz
>>
>> R CMD check --> OK
>>
>> R CMD check --as-cran --> 1 ERROR, 1 NOTE
>>
>> The error is in an example:
>>
>>> benbad<-function(x, y){
>>># y may be provided with different structures
>>>f<-(x-y)^2
>>> } # very simple, but ...
>>>
>>> y<-1:10
>>> x<-c(1)
>>> cat("test benbad() with y=1:10, x=c(1)\n")
>>> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
>>> print(tryfc01)
>>> print(fc01)
>>
>> There's quite a lot of output, but it doesn't make much sense to me, as
>> it refers to code that I didn't write.
>>
>> The function fchk is attempting to check if functions provided for
>> optimization do not violate some conditions e.g., character rather than
>> numeric etc.
>>
>> JN
>>
>>
>> On 2019-06-07 8:44 a.m., J C Nash wrote:
>>> Uwe Ligges ||gge@ @end|ng |rom @t@t|@t|k@tu-dortmund@de
>>> Fri Jun 7 11:44:37 CEST 2019
>>>
>>> Previous message (by thread): [R-pkg-devel] try() in R CMD check 
>>> --as-cran
>>> Next message (by thread): [R-pkg-devel] using package data in package 
>>> code
>>> Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
>>>
>>> Right, what problem are you talking about? Can you tell us which check
>>> it is and what it actually complained about.
>>> There is no check that looks at the sizes of x and y in expressions
>>> such as
>>> (x - y)^2.
>>> as far as I know.
>>>
>>> Best,
>>> Uwe
>>>
>>> On 07.06.2019 10:33, Berry Boessenkool wrote:
>>>>
>>>> Not entirely sure if this is what you're looking for:
>>>> https://github.com/wch/r-source/blob/trunk/src/library/tools/R/check.R
>>>> It does contain --as-cran a few times and there's the change-history:
>>>> https://github.com/wch/r-source/commits/trunk/src/library/tools/R/check.R
>>>>
>>>> Regards,
>>>> Berry
>>>>
>>>>
>>>> 
>>>> From: R-package-devel  on 
>>>> behalf of J C Nash 
>>>> Sent: Thursday, June 6, 2019 15:03
>>>> To: List r-package-devel
>>>> Subject: [R-pkg-devel] try() in R CMD check --as-cran
>>>>
>>>> After making a small fix to my optimx package, I ran my usual R CMD check 
>>>> --as-cran.
>>>>
>>>> To my surprise, I got two ERRORs unrelated to the change. The errors 
>>>> popped up in
>>>> a routine designed to check the call to the user objective function. In 
>>>> particular,
>>>> one check is that the size of vectors is the same in expressions like (x - 
>>>> y)^2.
>>>> This works fine with 
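On the question of why try() does not stop these checks from forcing an error: try() catches ordinary R errors, but (as I read the R Internals notes on these check variables) a value containing "abort" makes the length check terminate the R process rather than signal a catchable condition, so no condition handler ever sees it. A minimal sketch of what try() does and does not intercept:

```r
# try() converts an ordinary signalled error into a "try-error" object,
# so execution continues afterwards.
r <- try(stop("deliberate failure"), silent = TRUE)
stopifnot(inherits(r, "try-error"))
# It cannot intercept a check configured to abort the session outright,
# which is why the fchk() example still fails under --as-cran.
```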

Re: [R-pkg-devel] try() in R CMD check --as-cran

2019-06-07 Thread J C Nash
Sorry reply not quicker. For some reason I'm not getting anything in the thread 
I started!
I found the responses in the archives. Perhaps cc: nas...@uottawa.ca please.

I have prepared a tiny (2.8K) package at
http://web.ncf.ca/nashjc/jfiles/fchk_2019-6.5.tar.gz

R CMD check --> OK

R CMD check --as-cran --> 1 ERROR, 1 NOTE

The error is in an example:

> benbad<-function(x, y){
># y may be provided with different structures
>f<-(x-y)^2
> } # very simple, but ...
> 
> y<-1:10
> x<-c(1)
> cat("test benbad() with y=1:10, x=c(1)\n")
> tryfc01 <- try(fc01<-fchk(x, benbad, trace=3, y))
> print(tryfc01)
> print(fc01)

There's quite a lot of output, but it doesn't make much sense to me, as
it refers to code that I didn't write.

The function fchk is attempting to check if functions provided for
optimization do not violate some conditions e.g., character rather than
numeric etc.

JN


On 2019-06-07 8:44 a.m., J C Nash wrote:
> Uwe Ligges ||gge@ @end|ng |rom @t@t|@t|k@tu-dortmund@de
> Fri Jun 7 11:44:37 CEST 2019
> 
> Previous message (by thread): [R-pkg-devel] try() in R CMD check --as-cran
> Next message (by thread): [R-pkg-devel] using package data in package code
> Messages sorted by: [ date ] [ thread ] [ subject ] [ author ]
> 
> Right, what problem are you talking about? Can you tell us which check
> it is and what it actually complained about.
> There is no check that looks at the sizes of x and y in expressions
> such as
> (x - y)^2.
> as far as I know.
> 
> Best,
> Uwe
> 
> On 07.06.2019 10:33, Berry Boessenkool wrote:
>>
>> Not entirely sure if this is what you're looking for:
>> https://github.com/wch/r-source/blob/trunk/src/library/tools/R/check.R
>> It does contain --as-cran a few times and there's the change-history:
>> https://github.com/wch/r-source/commits/trunk/src/library/tools/R/check.R
>>
>> Regards,
>> Berry
>>
>>
>> 
>> From: R-package-devel  on 
>> behalf of J C Nash 
>> Sent: Thursday, June 6, 2019 15:03
>> To: List r-package-devel
>> Subject: [R-pkg-devel] try() in R CMD check --as-cran
>>
>> After making a small fix to my optimx package, I ran my usual R CMD check 
>> --as-cran.
>>
>> To my surprise, I got two ERRORs unrelated to the change. The errors popped 
>> up in
>> a routine designed to check the call to the user objective function. In 
>> particular,
>> one check is that the size of vectors is the same in expressions like (x - 
>> y)^2.
>> This works fine with R CMD check, but the --as-cran seems to have changed 
>> and it
>> pops an error, even when the call is inside try(). The irony that the 
>> routine in
>> question is intended to avoid problems like this is not lost on me.
>>
>> I'm working on a small reproducible example, but it's not small enough yet.
>> In the meantime, I'm looking for the source codes of the scripts for "R CMD 
>> check" and
>> "R CMD check --as-cran" so I can work out why there is this difference, 
>> which seems
>> to be recent.
>>
>> Can someone send/post a link? I plan to figure this out and provide feedback,
>> as I suspect it is going to affect others. However, it may be a few days or 
>> even
>> weeks if past experience is a guide.
>>
>> JN
>>
>> __
>> R-package-devel using r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>
>>  [[alternative HTML version deleted]]
>>
>> __
>> R-package-devel using r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>>
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] try() in R CMD check --as-cran

2019-06-06 Thread J C Nash
After making a small fix to my optimx package, I ran my usual R CMD check 
--as-cran.

To my surprise, I got two ERRORs unrelated to the change. The errors popped up 
in
a routine designed to check the call to the user objective function. In 
particular,
one check is that the size of vectors is the same in expressions like (x - y)^2.
This works fine with R CMD check, but the --as-cran seems to have changed and it
pops an error, even when the call is inside try(). The irony that the routine in
question is intended to avoid problems like this is not lost on me.

I'm working on a small reproducible example, but it's not small enough yet.
In the meantime, I'm looking for the source codes of the scripts for "R CMD 
check" and
"R CMD check --as-cran" so I can work out why there is this difference, which 
seems
to be recent.

Can someone send/post a link? I plan to figure this out and provide feedback,
as I suspect it is going to affect others. However, it may be a few days or even
weeks if past experience is a guide.

JN

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Bug in the "reformulate" function in stats package

2019-03-29 Thread J C Nash
The main thing is to post the "small reproducible example".

My (rather long term experience) can be written

  if (exists("reproducible example") ) {
 DeveloperFixHappens()
  } else {
 NULL
  }

JN

On 2019-03-29 11:38 a.m., Saren Tasciyan wrote:
> Well, first I can't sign in bugzilla myself, that is why I wrote here first. 
> Also, I don't know if I have the time at
> the moment to provide tests, multiple examples or more. If that is not ok or 
> welcomed, that is fine, I can come back,
> whenever I have more time to properly report the bug.
> 
> I didn't find the existing bug report, sorry for that.
> 
> Yes, it is related. My problem was that I have column names with spaces and 
> current solution doesn't solve it. I have a
> solution, which works for me and maybe also for others.
> 
> Either, someone can register me to bugzilla or I can post it here, which 
> could give some direction to developers. I
> don't mind whichever is preferred here.
> 
> Best,
> 
> Saren
> 
> 
> On 29.03.19 09:29, Martin Maechler wrote:
>>> Saren Tasciyan
>>>  on Thu, 28 Mar 2019 17:02:10 +0100 writes:
>>  > Hi,
>>  > I have found a bug in reformulate function and have a solution for 
>> it. I
>>  > was wondering, where I can submit it?
>>
>>  > Best,
>>  > Saren
>>
>>
>> Well, you could have given a small reproducible example
>> depicting the bug, notably when posting here:
>> Just a prose text with no R code or other technical content is
>> almost always not really appropriate for the R-devel mailing list.
>>
>> Further, in such a case you should google a bit and hopefully
>> have found
>>     https://www.r-project.org/bugs.html
>>
>> which also mention reproducibility (and many more useful things).
>>
>> Then it also tells you about R's bug repository, also called
>> "R's bugzilla" at https://bugs.r-project.org/
>>
>> and if you are diligent (but here, I'd say bugzilla is
>> (configured?) far from ideal), you'd also find bug PR#17359
>>
>>     https://bugs.r-project.org/bugzilla/show_bug.cgi?id=17359
>>
>> which was reported already on Nov 2017 .. and only fixed
>> yesterday (in the "cleanup old bugs" process that happens
>> often before the big new spring release of R).
>>
>> So is your bug the same as that one?
>>
>> Martin
>>
>>  > --
>>  > Saren Tasciyan
>>  > /PhD Student / Sixt Group/
>>  > Institute of Science and Technology Austria
>>  > Am Campus 1
>>  > 3400 Klosterneuburg, Austria
>>
>>  > __
>>  > R-devel@r-project.org mailing list
>>  > https://stat.ethz.ch/mailman/listinfo/r-devel
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel
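The thread above does not show the reporter's code, but the reformulate() problem it alludes to (bug PR#17359, column names containing spaces) can be worked around by backtick-quoting the names before building the formula. This is a sketch of one such workaround, not the fix that went into R; the variable names are invented for illustration:

```r
# Non-syntactic column names (e.g. containing spaces) must be
# backtick-quoted before they can appear in a parsed formula.
vars <- c("body mass", "height")
f <- stats::reformulate(sprintf("`%s`", vars), response = "y")
stopifnot(inherits(f, "formula"))
# all.vars() recovers the unquoted names: "y", "body mass", "height"
stopifnot(length(all.vars(f)) == 3L)
```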


[Rd] Fwd: Package inclusion in R core implementation

2019-03-04 Thread J C Nash
Rereading my post below, I realize scope for misinterpretation. As I have said 
earlier,
I recognize the workload in doing any streamlining, and also the immense 
service to us
all by r-core. The issue is how to manage the workload efficiently while 
maintaining
and modernizing the capability. That is likely as challenging as doing the work 
itself.

JN


> I concur with Avraham that capabilities need to be ensured e.g., in 
> recommended
> packages. I should have mentioned that. My concern is that the core should be
> focused on the programming language aspects. The computational math and some 
> of the more
> intricate data management could better be handled by folk outside the core.
> 
> JN
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Package inclusion in R core implementation

2019-03-04 Thread J C Nash
I concur with Avraham that capabilities need to be ensured e.g., in recommended
packages. I should have mentioned that. My concern is that the core should be
focused on the programming language aspects. The computational math and some of 
the more
intricate data management could better be handled by folk outside the core.

JN

On 2019-03-04 9:12 a.m., Avraham Adler wrote:
> On Mon, Mar 4, 2019 at 5:01 PM J C Nash  <mailto:profjcn...@gmail.com>> wrote:
> 
> As the original coder (in mid 1970s) of BFGS, CG and Nelder-Mead in 
> optim(), I've
> been pushing for some time for their deprecation. They aren't "bad", but 
> we have
> better tools, and they are in CRAN packages. Similarly, I believe other 
> optimization
> tools in the core (optim::L-BFGS-B, nlm, nlminb) can and should be moved 
> to
> packages (there are already 2 versions at least of LBFGS that I and Matt 
> Fidler
> are merging). And optim::SANN does not match any usual expectations of 
> users.
> 
> I'm sure there are other tools for other tasks that can and should move 
> to packages
> to streamline the work of our core team. However, I can understand that 
> there is this
> awkward issue of actually doing this. I know I'm willing to help with 
> preparing
> "Transition Guide" documentation and scripts, and would be surprised if 
> there are
> not others. R already has a policy of full support only for current 
> version, so
> hanging on to antique tools (the three codes at the top are based on 
> papers all
> of which now qualify for >50 years old) seems inconsistent with other 
> messages.
> 
> For information: I'm coordinating a project to build understanding of what
> older algorithms are in R as the histoRicalg project. See
> https://gitlab.com/nashjc/histoRicalg. We welcome participation.
> 
> Best, JN
> 
> On 2019-03-04 7:59 a.m., Jim Hester wrote:
> > Conversely, what is the process to remove a package from core R? It 
> seems
> > to me some (many?) of the packages included are there more out of
> > historical accident rather than any technical need to be in the core
> > distribution. Having them as a core (or recommended) package makes them
> > harder update independently to R and makes testing, development and
> > contribution more cumbersome.
> >
> > On Fri, Mar 1, 2019 at 4:35 AM Morgan Morgan  <mailto:morgan.email...@gmail.com>>
> > wrote:
> >
> >> Hi,
> >>
> >> It sometimes happens that some packages get included in R, like for example
> example
> >> the parallel package.
> >>
> >> I was wondering if there is a process to decide whether or not to 
> include a
> >> package in the core implementation of R?
> >>
> >> For example, why not include the Rcpp package, which became for a lot 
> of
> >> user the main tool to extend R?
> >>
> >> What is our view on the (not so well known) dotCall64 package which is 
> an
> >> interesting alternative for extending R?
> >>
> >> Thank you
> >> Best regards,
> >> Morgan
> >>
> 
> 
> I have No arguments with updating code to more correct or modern versions, 
> but I think that as a design decision, base R
> should have optimization routines as opposed to it being an external package 
> which conceptually could be orphaned. Or at
> least some package gets made recommended and adopted by R core.
> 
> Thank you,
> 
> Avi
> -- 
> Sent from Gmail Mobile

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Package inclusion in R core implementation

2019-03-04 Thread J C Nash
As the original coder (in mid 1970s) of BFGS, CG and Nelder-Mead in optim(), 
I've
been pushing for some time for their deprecation. They aren't "bad", but we have
better tools, and they are in CRAN packages. Similarly, I believe other 
optimization
tools in the core (optim::L-BFGS-B, nlm, nlminb) can and should be moved to
packages (there are already 2 versions at least of LBFGS that I and Matt Fidler
are merging). And optim::SANN does not match any usual expectations of users.

I'm sure there are other tools for other tasks that can and should move to 
packages
to streamline the work of our core team. However, I can understand that there 
is this
awkward issue of actually doing this. I know I'm willing to help with preparing
"Transition Guide" documentation and scripts, and would be surprised if there 
are
not others. R already has a policy of full support only for current version, so
hanging on to antique tools (the three codes at the top are based on papers all
of which now qualify for >50 years old) seems inconsistent with other messages.

For information: I'm coordinating a project to build understanding of what
older algorithms are in R as the histoRicalg project. See
https://gitlab.com/nashjc/histoRicalg. We welcome participation.

Best, JN

On 2019-03-04 7:59 a.m., Jim Hester wrote:
> Conversely, what is the process to remove a package from core R? It seems
> to me some (many?) of the packages included are there more out of
> historical accident rather than any technical need to be in the core
> distribution. Having them as a core (or recommended) package makes them
> harder to update independently of R and makes testing, development and
> contribution more cumbersome.
> 
> On Fri, Mar 1, 2019 at 4:35 AM Morgan Morgan 
> wrote:
> 
>> Hi,
>>
>> It sometimes happens that some packages get included in R, like for example
>> the parallel package.
>>
>> I was wondering if there is a process to decide whether or not to include a
>> package in the core implementation of R?
>>
>> For example, why not include the Rcpp package, which became for a lot of
>> user the main tool to extend R?
>>
>> What is our view on the (not so well known) dotCall64 package which is an
>> interesting alternative for extending R?
>>
>> Thank you
>> Best regards,
>> Morgan
>>
>> [[alternative HTML version deleted]]
>>
>> __
>> R-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-devel
>>
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] Collaboration request to build R wrapper for C program(s)

2018-12-12 Thread J C Nash
There is a quite well-developed but not terribly large C program
for conjugate gradients and similar approaches to optimization I would
like to wrap in a package for use in R. I would then build this into the
optimx package I maintain. I suspect that the approach may turn out to be
one of the most efficient for large-n problems.

However, my skills with C and C++ are essentially knowing how to mimic
existing code, and I would welcome collaboration or help to build the
package, likely using Rcpp. Possibly this would be a suitable term
project in a stat. computing course. I'm more than happy to share my
expertise on the optimization side, or with other computing languages,
particularly Fortran.

To reduce noise on the list, I'll suggest off-line contact to the
address above (profjcnash _at_ gmail.com).

Best, JN

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-28 Thread J C Nash
Thanks for this. Also Duncan's description of how code in the R directory is 
executed.

I've more or less figured out a workaround. Unfortunately Georgi's solution 
doesn't quite
do the trick. Here is my current understanding and solution.

Issue: I want to get the root of a function of one parameter x, but there may be 
exogenous data Xdata.
   I also want to count the evaluations.
   Some rootfinders don't have a counter and some don't allow "..."

Initial solution: Create global envroot with counters and such. Put name of 
function and
gradient in there, then use a dummy FnTrace (and possibly grTrace). This gave 
various
check complaints about globals etc. However, it does appear to work.

Present approach: Slightly less flexible, but no complaints.
   Within rootwrap() which calls different rootfinders according to 
method="name", define
   FnTrace and grTrace, set up a list glist for the items I want to share, then
   envroot <- list2env(glist)

The FnTrace and grTrace are defined before the calls to rootfinders, so envroot 
can be found.
No globals. R CMD check is happy. However, I must call rootfinders via the 
wrapper, which is
actually simpler from point of view of syntax.

I've still some testing and tweaking, but I think main issues resolved by this.

Thanks to all who responded.

JN







On 2018-08-28 12:27 PM, Georgi Boshnakov wrote:
> If you don't insist on putting the variable in the global environment, 
> variations of the following give a cleaner solution:
> 
> TraceSetup_1 <- local({
> ifn = 0
> igr = 0
> ftrace = FALSE
> fn = NA
> gr = NA
> 
> function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> ifn<<- ifn
> igr<<- igr
> ftrace <<- ftrace
> fn <<- fn
> gr <<- gr
> parent.env(environment())
> }
> })
> 
> For example,
> 
> TraceSetup_1 <- local({
> + ifn = 0
> + igr = 0
> + ftrace = FALSE
> + fn = NA
> + gr = NA
> + function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> + ifn<<- ifn
> + igr<<- igr
> + ftrace <<- ftrace
> + fn <<- fn
> + gr <<- gr
> + parent.env(environment())
> + }
> + })
>>
>> e <- TraceSetup_1(fn = function(x) x^2)
>> ls(e)
> [1] "fn" "ftrace" "gr" "ifn""igr"   
>> e$fn
> function(x) x^2
> 
> ## let's change 'fn':
>> e$fn <- function(x) x^4
>> e$fn
> function(x) x^4
> 
> 
> Note that the environment is always the same, so can be accessed from 
> anywhere in your code:
> 
>> e2 <- environment(TraceSetup_1)
>> e2
> 
>> identical(e2, e)
> [1] TRUE
>>
> 
> If you need a new environment every time, a basic setup might be:
> 
> TraceSetup_2 <- local({
> staticVar1 <- NULL
> ## other variables here
> 
> function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
> ## force evaluation of the arguments
> ifn
> igr
> ftrace 
> fn 
> gr 
> environment()
> }
> })
> 
> There is no need for local() here but usually one needs also some static 
> variables.
> Now every call gives a different environment  (but all have the same parent):
> 
> ea <- TraceSetup_2(fn = function(x) x^2 - 2*x + 1)
>> ls(ea)
> [1] "fn" "ftrace" "gr" "ifn""igr"   
>> ea$fn
> function(x) x^2 - 2*x + 1
>>
>> eb <- TraceSetup_2(fn = function(x) x^2 + 1)
>> eb$fn
> function(x) x^2 + 1
>>
>> ## ea$fn is still the same:
>> ea$fn
> function(x) x^2 - 2*x + 1
>>
> 
> Obviously, in this case some further arrangements are  needed for the 
> environments to be made available to the external world.
> 
> Hope this helps,
> Georgi Boshnakov
> 
> 
> -Original Message-
> From: R-package-devel [mailto:r-package-devel-boun...@r-project.org] On 
> Behalf Of J C Nash
> Sent: 28 August 2018 14:18
> To: Fox, John; Richard M. Heiberger
> Cc: List r-package-devel
> Subject: Re: [R-pkg-devel] Trying to work around missing functionality
> 
> Indeed, it appears that globalVariables must be outside the function. 
> However, I had quite a bit of
> fiddle to get things to work without errors or warnings or notes. While I now 
> have a package that
> does not complain with R CMD check, I am far from satisfied that I can give a 
> prescription. I had
> to remove lines in the rootfinder like
>envroot$fn <- fn
> that wer
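The list2env() pattern described in this message can be reduced to a few lines. This is a stripped-down sketch of the idea, not the rootwrap() code itself; because environments have reference semantics, a helper defined after `envroot` exists can update its fields with plain `$<-`, no `<<-` and no global-variable NOTE:

```r
# A named list becomes an environment whose fields helper functions
# can update by reference.
groot <- list(ifn = 0L, ftrace = FALSE)
envroot <- list2env(groot)            # counters shared via this environment
bump <- function() envroot$ifn <- envroot$ifn + 1L
bump(); bump()
stopifnot(envroot$ifn == 2L, !envroot$ftrace)
```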

Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-28 Thread J C Nash
Indeed, it appears that globalVariables must be outside the function. However, 
I had quite a bit of
fiddle to get things to work without errors or warnings or notes. While I now 
have a package that
does not complain with R CMD check, I am far from satisfied that I can give a 
prescription. I had
to remove lines in the rootfinder like
   envroot$fn <- fn
that were used to set the function to be used inside my instrumented function, 
and instead
call TraceSetup(fn=fn, ...) where a similar statement was given. Why that 
worked while the direct
assignment (note, not a <<- one) did not, I do not understand. However, I will 
work with this for
a while and try to get a better handle on it.

Thanks for the pointer. As an old-time programmer from days when you even set 
the add table, I'm
still uncomfortable just putting code in a directory and assuming it will be 
executed, i.e., the
globals.R file. However, I now have this set to establish the global structure 
as follows

> ## Put in R directory. 
> if(getRversion() >= "2.15.1") { utils::globalVariables(c('envroot')) } # Try 
> declaring here 
> groot<-list(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA, label="none")
> envroot <- list2env(groot) # Note globals in FnTrace

Then TraceSetup() is

> TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
>envroot$ifn <- ifn
>envroot$igr <- igr
>envroot$ftrace <- ftrace
>envroot$fn <- fn
>envroot$gr <- gr
>return()
> }

and it is called at the start of the rootfinder routine.

Thus I am establishing a global, then (re-)setting values in TraceSetup(), then
incrementing counters etc. in the instrumented FnTrace() that is the function 
for which I find
the root, which calls fn() given by the "user". Messy, but I can now track 
progress and measure
effort.

I'm sure there are cleaner solutions. I suggest offline discussion would be 
better until such
options are clearer.

Thanks again.

JN



On 2018-08-28 12:01 AM, Fox, John wrote:
> Hi John,
> 
> It's possible that I didn’t follow what you did, but it appears as if you 
> call globalVariables() *inside* the function. Instead try to do as Richard 
> Heiberger suggested and place the call outside of the function, e.g., in a 
> source file in the package R directory named globals.R. (Of course, the name 
> of the source file containing the command isn’t important.)
> 
> I hope this helps,
>  John
> 
> -
> John Fox
> Professor Emeritus
> McMaster University
> Hamilton, Ontario, Canada
> Web: https://socialsciences.mcmaster.ca/jfox/
> 
> 
> 
>> -Original Message-
>> From: R-package-devel [mailto:r-package-devel-boun...@r-project.org] On
>> Behalf Of J C Nash
>> Sent: Monday, August 27, 2018 8:44 PM
>> To: Richard M. Heiberger 
>> Cc: List r-package-devel 
>> Subject: Re: [R-pkg-devel] Trying to work around missing functionality
>>
>> Unfortunately, makes things much worse. I'd tried something like this 
>> already.
>>
>>> * checking examples ... ERROR
>>> Running examples in ‘rootoned-Ex.R’ failed The error most likely
>>> occurred in:
>>>
>>>> ### Name: rootwrap
>>>> ### Title: zeroin: Find a single root of a function of one variable within
>>>> ###   a specified interval.
>>>> ### Aliases: rootwrap
>>>> ### Keywords: root-finding
>>>>
>>>> ### ** Examples
>>>>
>>>> # Dekker example
>>>> # require(rootoned)
>>>> dek <- function(x){ 1/(x-3) - 6 }
>>>> r1 <- rootwrap(dek, ri=c(3.001, 6), ftrace=TRUE,
>>>> method="uniroot")
>>> Error in registerNames(names, package, ".__global__", add) :
>>>   The namespace for package "rootoned" is locked; no changes in the global
>> variables list may be made.
>>> Calls: rootwrap -> TraceSetup ->  -> registerNames
>>> Execution halted
>>
>> Also had to use utils::globalVariables( ...
>>
>> JN
>>
>>
>> On 2018-08-27 08:40 PM, Richard M. Heiberger wrote:
>>> Does this solve the problem?
>>>
>>> if (getRversion() >= '2.15.1')
>>>   globalVariables(c('envroot'))
>>>
>>> I keep this in file R/globals.R
>>>
>>> I learned of this from John Fox's use in Rcmdr.
>>>
>>> On Mon, Aug 27, 2018 at 8:28 PM, J C Nash 
>> wrote:
>>>> In order to track progress of a variety of rootfinding or
>>>> optimization routines that don't report some information I want, I'm
>>>>

Re: [R-pkg-devel] Trying to work around missing functionality

2018-08-27 Thread J C Nash
Unfortunately, makes things much worse. I'd tried something like this already.

> * checking examples ... ERROR
> Running examples in ‘rootoned-Ex.R’ failed
> The error most likely occurred in:
> 
>> ### Name: rootwrap
>> ### Title: zeroin: Find a single root of a function of one variable within
>> ###   a specified interval.
>> ### Aliases: rootwrap
>> ### Keywords: root-finding
>> 
>> ### ** Examples
>> 
>> # Dekker example
>> # require(rootoned)
>> dek <- function(x){ 1/(x-3) - 6 }
>> r1 <- rootwrap(dek, ri=c(3.001, 6), ftrace=TRUE, method="uniroot")
> Error in registerNames(names, package, ".__global__", add) : 
>   The namespace for package "rootoned" is locked; no changes in the global 
> variables list may be made.
> Calls: rootwrap -> TraceSetup ->  -> registerNames
> Execution halted

Also had to use utils::globalVariables( ...

JN


On 2018-08-27 08:40 PM, Richard M. Heiberger wrote:
> Does this solve the problem?
> 
> if (getRversion() >= '2.15.1')
>   globalVariables(c('envroot'))
> 
> I keep this in file R/globals.R
> 
> I learned of this from John Fox's use in Rcmdr.
> 
> On Mon, Aug 27, 2018 at 8:28 PM, J C Nash  wrote:
>> In order to track progress of a variety of rootfinding or optimization
>> routines that don't report some information I want, I'm using the
>> following setup (this one for rootfinding).
>>
>> TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
>> # JN: Define globals here
>>groot<-list(ifn=ifn, igr=igr, ftrace=ftrace, fn=fn, gr=gr, label="none")
>>envroot <<- list2env(groot) # Note globals in FnTrace
>>## This generates a NOTE that
>>## TraceSetup: no visible binding for '<<-' assignment to ‘envroot’
>> ##   envroot<-list2env(groot, parent=.GlobalEnv) # Note globals in FnTrace -- this does NOT work
>>## utils::globalVariables("envroot") # Try declaring here -- causes errors
>> # end globals
>>envroot
>> }
>>
>> FnTrace <- function(x,...) {
>>   # Substitute function to call when rootfinding
>>   # Evaluate fn(x, ...)
>> val <- envroot$fn(x, ...)
>> envroot$ifn <- envroot$ifn + 1 # probably more efficient ways
>> if (envroot$ftrace) {
>>cat("f(",x,")=",val," after ",envroot$ifn," ",envroot$label,"\n")
>> }
>> val
>> }
>>
>>
>> Perhaps there are better ways to do this, but this does seem to work quite 
>> well.
>> It lets me call a rootfinder with FnTrace and get information on evaluations 
>> of fn().
>> (There's another gr() routine, suppressed here.)
>>
>> However, R CMD check gives a NOTE for
>>
>>   TraceSetup: no visible binding for global variable ‘envroot’
>>   Undefined global functions or variables:
>> envroot
>>
>> The commented lines in TraceSetup suggest some of the things I've tried. 
>> Clearly I don't
>> fully comprehend how R is grinding up the code, but searches on the net seem 
>> to indicate
>> I am far from alone. Does anyone have any suggestion of a clean way to avoid 
>> the NOTE?
>>
>> JN
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Trying to work around missing functionality

2018-08-27 Thread J C Nash
In order to track progress of a variety of rootfinding or optimization
routines that don't report some information I want, I'm using the
following setup (this one for rootfinding).

TraceSetup <- function(ifn=0, igr=0, ftrace=FALSE, fn=NA, gr=NA){
# JN: Define globals here
   groot <- list(ifn=ifn, igr=igr, ftrace=ftrace, fn=fn, gr=gr, label="none")
   envroot <<- list2env(groot) # Note globals in FnTrace
   ## This generates a NOTE that
   ## TraceSetup: no visible binding for '<<-' assignment to ‘envroot’
   ## envroot <- list2env(groot, parent=.GlobalEnv) # Note globals in FnTrace -- this does NOT work
   ## utils::globalVariables("envroot") # Try declaring here -- causes errors
# end globals
   envroot
}

FnTrace <- function(x,...) {
  # Substitute function to call when rootfinding
  # Evaluate fn(x, ...)
val <- envroot$fn(x, ...)
envroot$ifn <- envroot$ifn + 1 # probably more efficient ways
if (envroot$ftrace) {
   cat("f(",x,")=",val," after ",envroot$ifn," ",envroot$label,"\n")
}
val
}


Perhaps there are better ways to do this, but this does seem to work quite well.
It lets me call a rootfinder with FnTrace and get information on evaluations of 
fn().
(There's another gr() routine, suppressed here.)

However, R CMD check gives a NOTE for

  TraceSetup: no visible binding for global variable ‘envroot’
  Undefined global functions or variables:
envroot

The commented lines in TraceSetup suggest some of the things I've tried. 
Clearly I don't
fully comprehend how R is grinding up the code, but searches on the net seem to 
indicate
I am far from alone. Does anyone have any suggestion of a clean way to avoid 
the NOTE?
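One pattern that usually silences this NOTE (a hedged sketch of my own, assuming the only goal is a shared mutable counter; the name `tracenv` and the trimmed argument list are illustrative) is to create the environment at the top level of a file in R/, so the package namespace owns the binding and no `<<-` assignment is needed:

```r
## Top-level binding: R CMD check sees an ordinary object, not a global
tracenv <- new.env(parent = emptyenv())

TraceSetup <- function(ifn = 0, ftrace = FALSE, fn = NA, label = "none") {
  tracenv$ifn    <- ifn      # reset the evaluation counter
  tracenv$ftrace <- ftrace
  tracenv$fn     <- fn       # store the objective to be traced
  tracenv$label  <- label
  invisible(tracenv)
}

FnTrace <- function(x, ...) {
  val <- tracenv$fn(x, ...)       # evaluate the stored objective
  tracenv$ifn <- tracenv$ifn + 1  # count the evaluation
  if (tracenv$ftrace)
    cat("f(", x, ")=", val, " after ", tracenv$ifn, " ", tracenv$label, "\n")
  val
}

TraceSetup(fn = function(x) exp(0.5 * x) - 2)
FnTrace(2)       # about 0.7182818; tracenv$ifn is now 1
```

Because environments are modified by reference, both functions can update `tracenv` without any global assignment.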

JN

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] CRAN incoming queue closed from Sep 1 to Sep 9

2018-08-14 Thread J C Nash
Will pending queries to CRAN-submissions about false positives in the check 
process be cleared
first, or left pending? I've been waiting quite a while re: new optimx package,
which has 1 so-called "ERROR" (non-convergence, please use a different method 
msg)
and 1 WARNING because new optimx subsumes optextras.

An alternative I can accept is an invite to resubmit after the reset, but those
of us in limbo do need to know to avoid time wasting on all sides.

Best, JN


On 2018-08-14 10:36 AM, Hadley Wickham wrote:
> Does this include automatically (bot) accepted submissions?
> Hadley
> On Tue, Aug 14, 2018 at 8:07 AM Uwe Ligges
>  wrote:
>>
>> Dear package developers,
>>
>> the CRAN incoming queue will be closed from Sep 1 to Sep 9. Hence
>> package submissions are only possible before and after that period.
>>
>> Best,
>> Uwe Ligges
>> (for the CRAN team)
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> 
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] trace in uniroot() ?

2018-08-13 Thread J C Nash
Despite my years with R, I didn't know about trace(). Thanks.

However, my decades in the minimization and root finding game make me like 
having
a trace that gives some info on the operation, the argument and the current 
function value.
I've usually found glitches are a result of things like >= rather than > in 
tests etc., and
knowing what was done is the quickest way to get there.

This is, of course, the numerical software developer view. I know "users" (a 
far too vague
term) don't like such output. I've sometimes been tempted with my svd or 
optimization codes to
have a return message in bold-caps "YOUR ANSWER IS WRONG AND THERE'S A LAWYER 
WAITING TO
MAKE YOU PAY", but I usually just satisfy myself with "Not at a minimum/root".

Best, JN

On 2018-08-13 06:00 PM, William Dunlap wrote:
> I tend to avoid the the trace/verbose arguments for the various root finders 
> and optimizers and instead use the trace
> function or otherwise modify the function handed to the operator.  You can 
> print or plot the arguments or save them.  E.g.,
> 
>> trace(ff, print=FALSE, quote(cat("x=", deparse(x), "\n", sep="")))
> [1] "ff"
>> ff0 <- uniroot(ff, c(0, 10))
> x=0
> x=10
> x=0.0678365490630423
> x=5.03391827453152
> x=0.490045026724842
> x=2.76198165062818
> x=1.09760394309444
> x=1.92979279686131
> x=1.34802524899502
> x=1.38677998493585
> x=1.3862897003949
> x=1.38635073555115
> x=1.3862897003949
> 
> or
> 
>> X <- numeric()
>> trace(ff, print=FALSE, quote(X[[length(X)+1]] <<- x))
> [1] "ff"
>> ff0 <- uniroot(ff, c(0, 10))
>> X
>  [1]  0. 10.  0.06783655
>  [4]  5.03391827  0.49004503  2.76198165
>  [7]  1.09760394  1.92979280  1.34802525
> [10]  1.38677998  1.38628970  1.38635074
> [13]  1.38628970
> 
> This will not tell you why the objective function is being called (e.g. in a 
> line search
> or in derivative estimation), but some plotting or other postprocessing can 
> usually figure that out.
> 
> 
> Bill Dunlap
> TIBCO Software
> wdunlap tibco.com
> 
> On Mon, Jul 30, 2018 at 11:35 AM, J C Nash wrote:
> 
> In looking at rootfinding for the histoRicalg project (see 
> gitlab.com/nashjc/histoRicalg),
> I thought I would check how uniroot() solves some problems. The following 
> short example
> 
> ff <- function(x){ exp(0.5*x) - 2 }
> ff(2)
> ff(1)
> uniroot(ff, 0, 10)
> uniroot(ff, c(0, 10), trace=1)
> uniroot(ff, c(0, 10), trace=TRUE)
> 
> 
> shows that the trace parameter, as described in the Rd file, does not 
> seem to
> be functional except in limited situations (and it suggests an
> integer, then uses a logical for the example, e.g.,
>  ## numerically, f(-|M|) becomes zero :
>      u3 <- uniroot(exp, c(0,2), extendInt="yes", trace=TRUE)
> )
> 
> When extendInt is set, then there is some information output, but trace 
> alone
> produces nothing.
> 
> I looked at the source code -- it is in R-3.5.1/src/library/stats/R/nlm.R 
> and
> calls zeroin2 code from R-3.5.1/src/library/stats/src/optimize.c as far 
> as I
> can determine. My code inspection suggests trace does not show the 
> iterations
> of the rootfinding, and only has effect when the search interval is 
> allowed
> to be extended. It does not appear that there is any mechanism to ask
> the zeroin2 C code to display intermediate work.
> 
> This isn't desperately important for me as I wrote an R version of the 
> code in
> package rootoned on R-forge (which Martin Maechler adapted as unirootR.R 
> in
> Rmpfr so multi-precision roots can be found). My zeroin.R has 'trace' to 
> get
> the pattern of different steps. In fact it is a bit excessive. Note
> unirootR.R uses 'verbose' rather than 'trace'. However, it would be nice 
> to be
> able to see what is going on with uniroot() to verify equivalent 
> operation at
> the same precision level. It is very easy for codes to be very slightly
> different and give quite widely different output.
> 
> Indeed, even without the trace, we see (zeroin from rootoned here)
> 
> > zeroin(ff, c(0, 10), trace=FALSE)
> $root
> [1] 1.386294
> 
> $froot
> [1] -5.658169e-10
> 
> $rtol
> [1] 7.450581e-09
> 
> $maxit
> [1] 9
> 
> > uniroot(ff, c(0, 10), trace=FALSE)
> $root
> [1] 1.38629
> 
> $f.root
> [1] -4.66072e-06

[Rd] typo in Ubuntu download page

2018-08-04 Thread J C Nash
In https://cran.r-project.org/bin/linux/ubuntu/

Administration and Maintances of R Packages
   ^^

Minor stuff, but if someone who can edit is on the page,
perhaps it can be changed to "Maintenance"

Best, JN

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] trace in uniroot() ?

2018-07-30 Thread J C Nash
In looking at rootfinding for the histoRicalg project (see 
gitlab.com/nashjc/histoRicalg),
I thought I would check how uniroot() solves some problems. The following short 
example

ff <- function(x){ exp(0.5*x) - 2 }
ff(2)
ff(1)
uniroot(ff, 0, 10)
uniroot(ff, c(0, 10), trace=1)
uniroot(ff, c(0, 10), trace=TRUE)


shows that the trace parameter, as described in the Rd file, does not seem to
be functional except in limited situations (and it suggests an
integer, then uses a logical for the example, e.g.,
 ## numerically, f(-|M|) becomes zero :
 u3 <- uniroot(exp, c(0,2), extendInt="yes", trace=TRUE)
)

When extendInt is set, then there is some information output, but trace alone
produces nothing.

I looked at the source code -- it is in R-3.5.1/src/library/stats/R/nlm.R and
calls zeroin2 code from R-3.5.1/src/library/stats/src/optimize.c as far as I
can determine. My code inspection suggests trace does not show the iterations
of the rootfinding, and only has effect when the search interval is allowed
to be extended. It does not appear that there is any mechanism to ask
the zeroin2 C code to display intermediate work.
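One workaround, sketched here under the assumption that counting and printing evaluations is enough (`traced` is an illustrative helper of my own, not part of stats), is to wrap the objective in a closure before handing it to uniroot():

```r
ff <- function(x) exp(0.5 * x) - 2

traced <- function(f) {
  calls <- 0
  function(x) {
    calls <<- calls + 1   # counter lives in the closure, not in a global
    val <- f(x)
    cat("eval ", calls, ": f(", x, ") = ", val, "\n", sep = "")
    val
  }
}

res <- uniroot(traced(ff), c(0, 10))
res$root   # about 1.3863, i.e. 2*log(2)
```

This shows the sequence of arguments tried, though not why each evaluation was requested.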

This isn't desperately important for me as I wrote an R version of the code in
package rootoned on R-forge (which Martin Maechler adapted as unirootR.R in
Rmpfr so multi-precision roots can be found). My zeroin.R has 'trace' to get
the pattern of different steps. In fact it is a bit excessive. Note
unirootR.R uses 'verbose' rather than 'trace'. However, it would be nice to be
able to see what is going on with uniroot() to verify equivalent operation at
the same precision level. It is very easy for codes to be very slightly
different and give quite widely different output.

Indeed, even without the trace, we see (zeroin from rootoned here)

> zeroin(ff, c(0, 10), trace=FALSE)
$root
[1] 1.386294

$froot
[1] -5.658169e-10

$rtol
[1] 7.450581e-09

$maxit
[1] 9

> uniroot(ff, c(0, 10), trace=FALSE)
$root
[1] 1.38629

$f.root
[1] -4.66072e-06

$iter
[1] 10

$init.it
[1] NA

$estim.prec
[1] 6.103516e-05

>

Is the lack of trace a bug, or at least an oversight? Being able to follow 
iterations is a
classic approach to checking that computations are proceeding as they should.

Best, JN

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] can't reproduce cran-pretest error

2018-07-26 Thread J C Nash
I think several of us have had similar issues lately. You might have seen my 
posts on reverse dependencies.
It seems there are some sensitivities in the CRAN test setup, though I think 
things are improving.

Last week I submitted optimx again. I don't think I changed anything but the 
date and some commentary
in documentation files. The pre-test was much better than before, but still had 
two "complaints". One of
these was an "ERROR" of the type "non-convergence -- please choose a different 
minimizer" (in mvord)
and the other was a "WARNING" since my new package (optimx) subsumes the 
functionality of several other
packages, including optextras, so the functions from that were now masked, as 
might be expected. This
was in surrosurv.

However, I did follow protocol and report these false positives, but have had 
no response from CRAN
team. Nor any response to previous msgs about 2 months ago. I suspect some 
volunteer overload, but
if that is the case, I would hope the CRAN team would ask for help. I know 
there are several of us
willing to help if we can. And the new functionality does fix some actual bugs, 
as well as providing
improvements. Without renewal, particularly for infrastructure packages, R will 
decay.

Cheers, JN



On 2018-07-26 03:11 PM, Brad Eck wrote:
> Dear list,
> 
> I'm having trouble reproducing errors from CRAN's pretests.
> 
> I have a package on CRAN called epanet2toolkit that provides R bindings
> to a legacy simulation engine written in C.  So far I've released two
> versions
> to CRAN without trouble.  Now I'm making a third release, principally to
> include
> a citation for the package, but also to clean up warnings raised by new
> compilers.
> 
> My latest submission fails the CRAN pretests for Debian with errors in the
> examples and tests:
> https://win-builder.r-project.org/incoming_pretest/
> epanet2toolkit_0.3.0_20180726_102947/Debian/00check.log
> 
> For what it's worth, the package checks fine under R-3.4.4, R-3.5.0 and
> R-devel
> r74997 (2018-07-21) and r74923 (2018-06-20).
> 
> However, when I run the debian-r-devel checks locally (albeit in Docker) I get
> a couple of warnings, but no errors. Since I can't reproduce the error, it's
> difficult to fix. See below the relevant lines of 00check.log:
> 
> * using log directory ‘/pkg/epanet2toolkit.Rcheck’
> * using R Under development (unstable) (2018-07-25 r75006)
> * using platform: x86_64-pc-linux-gnu (64-bit)
> * using session charset: UTF-8
> * using option ‘--as-cran’
> * checking for file ‘epanet2toolkit/DESCRIPTION’ ... OK
> * checking extension type ... Package
> ...
> * checking whether package ‘epanet2toolkit’ can be installed ... WARNING
> Found the following significant warnings:
>   text.h:421:9: warning: ‘KwKw  /d’ directive writing
> 30 bytes into a region of size between 23 and 278 [-Wformat-overflow=]
> See ‘/pkg/epanet2toolkit.Rcheck/00install.out’ for details.
> ...
> * checking compilation flags used ... WARNING
> Compilation used the following non-portable flag(s):
>   ‘-Wdate-time’ ‘-Werror=format-security’ ‘-Wformat’
> * checking compiled code ... OK
> * checking examples ... OK
> * checking for unstated dependencies in ‘tests’ ... OK
> * checking tests ... OK
>   Running ‘testthat.r’
> * checking PDF version of manual ... OK
> * DONE
> Status: 2 WARNINGs, 1 NOTE
> 
> 
> Thanks in advance for any insights.
> 
>   [[alternative HTML version deleted]]
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Reverse dependencies - again

2018-07-11 Thread J C Nash
Excuses for the length of this, but optimx has a lot of packages
using it.

Over the past couple of months, I have been struggling with checking
a new, and much augmented, optimx package. It offers some serious
improvements:
  - parameter scaling on all methods
  - two safeguarded Newton methods
  - improved gradient approximations
  - more maintainable structure for adding new solvers in future
  - a single method "solver" (wrapper) that uses optim() syntax

However, I have been going through a lot of bother with reverse
dependency checking. In summary, tools::check_packages and
devtools::revdep_check both give lots of complaints, in particular
about packages that cannot be installed. When I run my own checks
(see below), I get no issues that reflect on my own package. Similarly,
Duncan Murdoch was very helpful and ran a check on an earlier version
with no major issues using his own check program. I have noticed,
or been told, that several other workers have their own procedures
and that they have experienced somewhat similar difficulties.

Unfortunately, my last effort to submit to CRAN got blocked at the
pre-scan stage because of reverse dependencies. I have not been able
to fix what I cannot find, however, as it appears I am getting caught
on an obstacle outside my control. I did not get a response from the
CRAN team, but the timing was in the middle of the R3.5 launch, so
understandable. However, I am very reluctant to use submission to
CRAN as a checking tool.

My questions:

1) Am I missing something in my method of calling tools or devtools?
The code is included below. Or not reading the output carefully?
I will be happy to put detail on a website -- too large for here.

2) Is it time to consider an effort to provide online revdep checking
that would avoid pressure on CRAN team directly and would provide
clearer indicators of the issues raised by a particular package? I'd
be happy to assist in such an effort, as it appears to be needed, and
with appropriate output and links could aid developers to improve
their packages.

Cheers,

John Nash

tools::check_packages_in_dir finds install fail for

BioGeoBEARS
CensSpatial
Countr
ecd
ldhmm
LifeHist
macc
marked
midasr
QuantumClone
spfrontier
surrosurv

Code:

# cranrevdep

require(tools)
pkgdir <- "~/temp/wrkopt/srcpkg"
jcheck<-check_packages_in_dir(pkgdir,
  check_args = c("--as-cran", ""),
  reverse = list(repos = getOption("repos")["CRAN"]))
summary(jcheck)

-

devtools::revdep_check finds install fail for:

afex
IRTpp
lme4

as well as

BioGeoBEARS
CensSpatial
Countr
ecd
ldhmm
LifeHist
macc
marked
midasr
QuantumClone
spfrontier
surrosurv

Summary:
Saving check results to `revdep/check.rds` 
---
Cleaning up 
--
* Failed to install: afex, BioGeoBEARS, CensSpatial, Countr, ecd, IRTpp, ldhmm, 
LifeHist, lme4, macc, marked, midasr,
QuantumClone, spfrontier, surrosurv
* ACDm: checking compilation flags used ... WARNING
* languageR: checking Rd cross-references ... WARNING
* mvord: checking compilation flags used ... WARNING
* RandomFields: checking compilation flags used ... WARNING
* rankdist: checking compilation flags used ... WARNING
* regsem: checking compilation flags used ... WARNING


21 packages with problems

|package  |version | errors| warnings| notes|
|:|:---|--:|:|-:|
|ACDm |1.0.4   |  0|1| 1|
|afex |0.21-2  |  1|0| 0|
|BioGeoBEARS  |0.2.1   |  1|0| 0|
|CensSpatial  |1.3 |  1|0| 0|
|Countr   |3.4.1   |  1|0| 0|
|ecd  |0.9.1   |  1|0| 0|
|IRTpp|0.2.6.1 |  1|0| 0|
|languageR|1.4.1   |  0|1| 4|
|ldhmm|0.4.5   |  1|0| 0|
|LifeHist |1.0-1   |  1|0| 0|
|lme4 |1.1-17  |  1|0| 0|
|macc |1.0.1   |  1|0| 0|
|marked   |1.2.1   |  1|0| 0|
|midasr   |0.6 |  1|0| 0|
|mvord|0.3.1   |  0|1| 0|
|QuantumClone |1.0.0.6 |  1|0| 0|
|RandomFields |3.1.50  |  0|1| 2|
|rankdist |1.1.3   |  0|1| 0|
|regsem   |1.1.2   |  0|1| 0|
|spfrontier   |0.2.3   |  1|0| 0|
|surrosurv|1.1.24  |  1|0| 0|


But there are 43 dependencies, so must we assume rest are OK?
> require(devtools)
Loading required package: devtools
> rdlist <- revdep()
> rdlist
 [1] "ACDm"  "afex"  "bbmle"
 [4] "BioGeoBEARS"   "calibrar"  "CatDyn"
 [7] "CensSpatial"   "CJAMP" "Countr"
[10] "dimRed""ecd"   

[R-pkg-devel] Concern that reverse dependency checking currently unreliable

2018-06-20 Thread J C Nash
For the past few weeks I've been struggling to check a new version of optimx 
that gives
a major upgrade to the 2013 version currently on CRAN and subsumes several 
other packages.

It seems to work fine, and pass Win-builder, R CMD check etc.

However, both devtools and cran reverse dependency checks give multiple errors 
when
run through their respective special commands. Mostly the packages using optimx 
are
failing to install or otherwise stop (wrong testthat version etc.). When I check
manually (if package installs) that package XXX passes R CMD check with new
optimx installed, it invariably does. The automated checks -- given that optimx
is widely used -- take several hours per try.

Since it "fails" the cran reverse check, I'm not getting by the pre-test for 
cran
submission. CRAN maintainers have not responded, likely because they are 
swamped.

So I'm wondering if there is something amiss with either the checking tools or,
more likely, that the upgrade to R 3.5 has put some wrinkles in the package 
collection,
and, in particular, what I should do to provide the new package. I am rather
reluctant to join the growing crowd on github, but may have to if CRAN cannot
be updated.

Ideas welcome.

John Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] histoRicalg -- project to document older methods used by R and transfer knowledge

2018-06-05 Thread J C Nash
After some thought, I decided r-devel was probably the best of the R lists
for this item. Do feel free to share, as the purpose is to improve documentation
and identify potential issues.

John Nash



The R Consortium has awarded some modest funding for "histoRicalg",
a project to document and transfer knowledge of some older algorithms
used by R and by other computational systems. These older codes
are mainly in Fortran, but some are in C, with the original implementations
possibly in other programming languages. My efforts
were prompted by finding some apparent bugs in codes, which could be either
from the original programs or else the implementations. Two examples
in particular -- in nlm() and in optim::L-BFGS-B -- gave impetus
to the project.

As a first task, I am hoping to establish a "Working Group on
Algorithms Used in R" to identify and prioritize issues and to
develop procedures for linking older and younger workers to enable
the transfer of knowledge. Expressions of interest are welcome,
either to me (nashjc _at_ uottawa.ca) or to the mailing list
(https://lists.r-consortium.org/g/rconsortium-project-histoRicalg).
A preliminary web-site is at https://gitlab.com/nashjc/histoRicalg.

While active membership of the Working Group is desirable, given
the nature of this project, I anticipate that most members will
contribute mainly by providing timely and pertinent ideas. Some
may not even be R users, since the underlying algorithms are used
by other computing systems and the documentation effort has many
common features. We will also need participation of younger
workers willing to learn about the methods that underly the
computations in R.

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[R-pkg-devel] SIAM Wilkinson prize

2018-05-18 Thread J C Nash
It occurs to me that there could be packages developed by early career R 
developers that might fit
this prize which is considered quite prestigious (not to mention the cash) in 
the numerical methods community.
It is also likely that people may not be aware of the award in the R community.

Cheers, JN



 Forwarded Message 
Subject:[SIAM-OPT] June 1 Entry Deadline - James H. Wilkinson Prize for 
Numerical Software
Date:   Thu, 17 May 2018 14:22:41 +
From:   SIAM Prize Program 
CC: Optimization SIAG mailing list 



James H. Wilkinson Prize for Numerical Software



The deadline is June 1 for entries for the James H. Wilkinson Prize for 
Numerical Software. We are looking 
for submissions of high-quality numerical
software from early career teams. If you or your team are developing numerical 
software for scientific computing, act as
a nominator and enter your software for the prize. To submit an entry, you 
first need to create an account at the SIAM
Prize Portal. Click on the “Submit” button above to start the process.

The James H. Wilkinson Prize for Numerical Software is awarded every four years 
to the authors of an outstanding piece
of numerical software. The prize is awarded for an entry that best addresses 
all phases of the preparation of
high-quality numerical software. It is intended to recognize innovative 
software in scientific computing and to
encourage researchers in the earlier stages of their career.

SIAM will award the Wilkinson Prize for Numerical Software at the SIAM 
Conference on Computational Science and
Engineering (CSE19). The award will consist of $3,000 and a plaque. As part of 
the award, the recipient(s) will be
expected to present a lecture at the conference.





*Eligibility Criteria:*

Selection will be based on: clarity of the software implementation and 
documentation, importance of the application(s)
addressed by the software; portability, reliability, efficiency, and usability 
of the software implementation; clarity
and depth of analysis of the algorithms and the software in the accompanying 
paper; and quality of the test software.

Candidates must have worked in mathematics or science for at most 12 years 
(full time equivalent) after receiving their
PhD as of January 1 of the award year, allowing for breaks in continuity. The 
prize committee can make exceptions, if in
their opinion the candidate is at an equivalent stage in their career.

For the 2019 award, a candidate must have received their PhD no earlier than 
January 1, 2007.



*Entry Deadline:*

*June 1, 2018*



*Required Materials:*

· CVs of the authors of the software, at most two pages per author (PDF)

· A two-page summary of the main features of the algorithm and software 
implementation (PDF)

· A paper describing the algorithm and the software implementation (PDF)

· Open source software written in a widely available high-level 
programming language. The software should be
submitted in a gzipped .tar archive with a README file describing the contents 
of the archive. Each submission should
include documentation, examples of the use of the software, a test program, and 
scripts for executing the test programs.





*Previous recipients:*



Previous recipients of the James H. Wilkinson Prize for Numerical Software are:



*2015*Patrick Farrell, Simon Funke, David Ham, and Marie Rognes for 
dolfin-adjoint
*2011 *Andreas Waechter and Carl Laird for IPOPT

*2007 *Wolfgang Bangerth, Guido Kanschat, and Ralf Hartmann for deal.II

*2003 *Jonathan Shewchuk for Triangle

*1999 *Matteo Frigo and Steven Johnson for FFTW
*1995 *Chris Bischof and Alan Carle for ADIFOR 2.0
*1991 *Linda Petzold for DASSL





*Selection Committee:*

Jorge Moré (Chair), Argonne National Laboratory
Sven Hammarling, Numerical Algorithms Group Ltd and University of Manchester
Michael Heroux, Sandia National Laboratories
Randall J. LeVeque, University of Washington
Katherine Yelick, Lawrence Berkeley National Laboratory




Re: [R-pkg-devel] CRAN pretest archived because of 2 NOTEs

2018-04-18 Thread J C Nash
If NOTEs are going to be treated as errors, then a lot of infrastructure (all my
packages for optimization and nonlinear modelling, which are dependencies of
a few dozen other packages etc.) will disappear. This is because they have 
version
numbering I've been using in some form that pre-dates R and uses -M(M).D(D).
e.g., NOTE "Version contains large components (2018-3.28)"

I believe changing it to a "smaller" value will mean the submission is refused
on an ERROR since the numbering will be out of order.

So perhaps it is time either to revisit NOTEs to drop some unnecessary ones,
and also to make some careful decisions and change critical ones to WARNINGs or
ERRORs.

One of the major concerns I have is that it is desirable that CRAN be the
true repository for R packages, and that increased restrictions -- especially
if unnecessary -- will surely increase the movement to informal distribution
on other platforms like Github. Such fragmentation of the package universe
weakens R as a resource, and we see quite a lot of this recently.

I'm strongly in favour of having fairly strict standards, but also of ensuring
that only necessary restrictions are enforced. Even more, I believe we must
keep working to make satisfying the standards as easy as possible. R has done
a good job of this, but there is always room to improve.

JN




On 2018-04-18 01:40 PM, Hadley Wickham wrote:
> For the purposes of CRAN submission, you should basically treat every
> NOTE as an ERROR.
> 
> Hadley
> 
> On Wed, Apr 18, 2018 at 3:36 AM, Gertjan van den Burg
>  wrote:
>> While waiting to get this message posted to the list, I've solved the
>> problem by copying the stdlib rand() and srand() functions into my package
>> under a different name. This makes the check pass and ensures my RNG does
>> not interfere with R's RNG.
>>
>> I do think that if this NOTE causes immediate dismissal of a package, it
>> shouldn't be a NOTE but an ERROR. Otherwise it just leads to a lot of wasted
>> time waiting for a reply from the maintainers to respond to the note.
>>
>>> Dear all,
>>>
>>> My CRAN submission doesn't pass the pre-tests and gets archived. I've
>>> emailed cran-submissi...@r-project.org explaining that these are false
>>> positives, but since I haven't heard back in 10 days I don't think anyone
>>> read that. Same thing for the submission comments (which also explained
>>> it).
>>>
>>> The first note is:
>>>
>>> * checking CRAN incoming feasibility ... NOTE
>>> Maintainer: ‘Gertjan van den Burg ’
>>>
>>> New submission
>>>
>>> Possibly mis-spelled words in DESCRIPTION:
>>>GenSVM (8:18, 10:61, 15:2, 16:26, 19:11)
>>>Multiclass (4:22)
>>>SVMs (14:25, 15:42)
>>>misclassifications (11:49)
>>>multiclass (8:53, 14:14, 15:31)
>>>
>>>
>>> These words are not mis-spelled, so this is a false positive.
>>>
>>> The second note is:
>>>
>>> * checking compiled code ... NOTE
>>> File ‘gensvm/libs/gensvm_wrapper.so’:
>>>Found ‘rand’, possibly from ‘rand’ (C)
>>>  Objects: ‘gensvm/src/gensvm_cv_util.o’, ‘gensvm/src/gensvm_init.o’,
>>>‘gensvm/lib/libgensvm.a’
>>>Found ‘srand’, possibly from ‘srand’ (C)
>>>  Objects: ‘gensvm/src/gensvm_train.o’, ‘gensvm/lib/libgensvm.a’
>>>
>>> Compiled code should not call entry points which might terminate R nor
>>> write to stdout/stderr instead of to the console, nor use Fortran I/O
>>> nor system RNGs.
>>>
>>> See ‘Writing portable packages’ in the ‘Writing R Extensions’ manual.
>>>
>>>
>>> This is probably why the package is rejected. I have a valid use case for
>>> using rand() and srand(): I'm trying to maintain compatibility of this
>>> package with the corresponding Python package. By using rand en srand
>>> users
>>> can reproduce models in both languages.
>>>
>>> Does anyone have any ideas on how I can get the package accepted to CRAN?
>>>
>>> Thanks,
>>>
>>> Gertjan van den Burg
>>>
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 
> 
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] Minor glitch in optim()

2018-04-17 Thread J C Nash
Having worked with optim() and related programs for years, it surprised me
that I haven't noticed this before, but optim() is inconsistent in how it
deals with bounds constraints specified at infinity. Here's an example:

# optim-glitch-Ex.R
x0<-c(1,2,3,4)
fnt <- function(x, fscale=10){
  yy <- length(x):1
  val <- sum((yy*x)^2)*fscale
}
grt <- function(x, fscale=10){
  nn <- length(x)
  yy <- nn:1
  #gg <- rep(NA,nn)
  gg <- 2*(yy^2)*x*fscale
  gg
}

npar <- 4
lower <- -Inf
l2 <- rep(-Inf,npar)

a1 <- optim(x0, fnt, grt, lower=lower, method="BFGS") # works
a1
a1a<- optim(x0, fnt, grt, lower=l2, method="BFGS") # changes method!
a1a

The first call uses BFGS method without warning. The second gives
a warning that L-BFGS-B should be used, and from the output uses
this.

This is a bit of an edge case. My own preference would be for optim()
to simply fail if bounds of any type are specified without L-BFGS-B
as the method. I believe that gives clarity, even though infinite
bounds imply an unconstrained problem.

The behaviour where a scalar infinite bound is treated as unconstrained
but a vector is not is inconsistent, however, and I think that at some
point should be fixed. Possibly the easiest way is to treat infinite
bounds specified as a vector the same as those specified as a scalar.
That is to adjust the code in File src/library/stats/R/optim.R
in the block

if((length(lower) > 1L || length(upper) > 1L ||
   lower[1L] != -Inf || upper[1L] != Inf)
   && !any(method == c("L-BFGS-B","Brent"))) {
warning("bounds can only be used with method L-BFGS-B (or Brent)")
method <- "L-BFGS-B"
}

Possibly

if((any(is.finite(lower)) || any(is.finite(upper)))
   && !any(method == c("L-BFGS-B","Brent"))) {
warning("bounds can only be used with method L-BFGS-B (or Brent)")
method <- "L-BFGS-B"
}
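A runnable sketch of the finite-bounds test that suggestion relies on (hedged: `has_bounds` is a hypothetical helper for illustration, not part of optim()):

```r
# Treat bounds as real constraints only when some element is finite, so
# rep(-Inf, n) behaves the same as scalar -Inf (hypothetical helper).
has_bounds <- function(lower, upper) {
  any(is.finite(lower)) || any(is.finite(upper))
}
hb1 <- has_bounds(-Inf, Inf)          # FALSE: unconstrained
hb2 <- has_bounds(rep(-Inf, 4), Inf)  # FALSE: still unconstrained
hb3 <- has_bounds(c(0, -Inf), Inf)    # TRUE: a genuine bound is present
```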

Best, JN

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Indeed these are useful for one of my present tasks. Thanks. JN

On 2018-04-11 03:10 PM, Georgi Boshnakov wrote:
> 
> Hi, 
> 
> Not really an answer but I only recently discovered  devtools::revdep(), 
> which automates checking reverse dependencies. 
> 
> Georgi Boshnakov
> 
> 
> 
> 
> 
> From: R-package-devel [r-package-devel-boun...@r-project.org] on behalf of J 
> C Nash [profjcn...@gmail.com]
> Sent: 11 April 2018 19:05
> To: List r-package-devel
> Subject: [R-pkg-devel] Saving output of check()
> 
> Hi,
> 
> In trying to test that an upgrade to my optimx package does not break other
> packages, I wanted to loop over a list of all such packages in alldep, with
> nall the length of this list.
> 
> cat("Check the dependent packages\n")
> for (ii in 1:nall){
>   cpkg <- alldep[ii]
>   dd <- "/home/john/temp/wrkopt/dlpkg"
>   dlname <- download.packages(cpkg, destdir=dd )[[2]]
>   cat("Downloaded ", dlname,"\n")
>   cpkg.chk <- devtools::check_built(dlname)
>   cat("Results package:",cpkg,"\n")
>   print(cpkg.chk)
> }
> 
> Before running this, I did
> 
> sink("dpkgcheck.txt", split=TRUE)
> 
> and afterwards, I did sink().
> 
> But ... none of the check output, nor the result of the final print, show
> up in the output file dpkgcheck.txt.
> 
> Have I totally misunderstood sink(), or is there a nasty bug?
> 
> I've tried running in Rstudio and in the terminal. I'm running Linux Mint
> 18.3 Sylvia.
> 
> Linux john-j6-18 4.10.0-38-generic #42~16.04.1-Ubuntu SMP Tue Oct 10 16:32:20 
> UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
> john@john-j6-18 ~ $ R
> 
> R version 3.4.4 (2018-03-15) -- "Someone to Lean On"
> 
> 
> J C Nash
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Another workaround is to use

tlogl <- readLines(attr(cpkg.chk, "path"))

Possibly this may suggest a way to improve functionality.
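A self-contained sketch of that pattern (hedged: a stand-in object is built here so it runs without a real check; objects returned by check_built() may differ across devtools versions):

```r
# Build a stand-in for the check result: an object whose "path" attribute
# names the log file, mirroring what readLines(attr(cpkg.chk, "path")) expects.
logfile <- tempfile(fileext = ".log")
writeLines(c("* checking examples ... OK", "Status: OK"), logfile)
chk <- structure(list(), path = logfile)  # stand-in for cpkg.chk
tlogl <- readLines(attr(chk, "path"))     # recover the full log text
writeLines(tlogl, file.path(tempdir(), "dpkgcheck.txt"))  # persist it despite sink()'s limits
```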

JN

On 2018-04-11 03:24 PM, Henrik Bengtsson wrote:
> R CMD check, which is used internally, runs checks in standalone
> background R processes.  Output from these is not capturable/sinkable
> by the master R process.  The gist of what's happening is:
> 
>> sink("output.log")
>> system("echo hello")  ## not sinked/captured
> hello
>> sink()
>> readLines("output.log")
> character(0)
> 
> /Henrik
> 
> On Wed, Apr 11, 2018 at 11:05 AM, J C Nash <profjcn...@gmail.com> wrote:
>> Hi,
>>
>> In trying to test that an upgrade to my optimx package does not break other
>> packages, I wanted to loop over a list of all such packages in alldep, with
>> nall the length of this list.
>>
>> cat("Check the dependent packages\n")
>> for (ii in 1:nall){
>>   cpkg <- alldep[ii]
>>   dd <- "/home/john/temp/wrkopt/dlpkg"
>>   dlname <- download.packages(cpkg, destdir=dd )[[2]]
>>   cat("Downloaded ", dlname,"\n")
>>   cpkg.chk <- devtools::check_built(dlname)
>>   cat("Results package:",cpkg,"\n")
>>   print(cpkg.chk)
>> }
>>
>> Before running this, I did
>>
>> sink("dpkgcheck.txt", split=TRUE)
>>
>> and afterwards, I did sink().
>>
>> But ... none of the check output, nor the result of the final print, show
>> up in the output file dpkgcheck.txt.
>>
>> Have I totally misunderstood sink(), or is there a nasty bug?
>>
>> I've tried running in Rstudio and in the terminal. I'm running Linux Mint
>> 18.3 Sylvia.
>>
>> Linux john-j6-18 4.10.0-38-generic #42~16.04.1-Ubuntu SMP Tue Oct 10 
>> 16:32:20 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
>> john@john-j6-18 ~ $ R
>>
>> R version 3.4.4 (2018-03-15) -- "Someone to Lean On"
>>
>>
>> J C Nash
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
I got several responses to my query. Henrik's does suggest "why", but I
am rather unhappy that R has this weakness. (See below for a sort of
workaround for Linux users.)

In particular, note that the check_built() function DOES return an object,
but it does NOT print().

In fact, putting alldep <- "embryogrowth" gives a result file

> Check the dependent packages
> Downloaded  /home/john/temp/wrkopt/dlpkg/embryogrowth_7.4.tar.gz 
> Results package: embryogrowth 
> 

while the bottom of the terminal file gives

> * checking data for non-ASCII characters ... OK
> * checking data for ASCII and uncompressed saves ... OK
> * checking examples ... OK
> * DONE
> 
> Status: OK
> 
> Results package: embryogrowth 
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> 
>> sink()
>> 

Now the object cpkg.chk is still present, so I continued the exercise (terminal
copy here)

> 
>> ls()
> [1] "alldep"   "cpkg" "cpkg.chk" "dd"   "dlname"   "ii"   "nall"  
>   
>> sink("sinktest2.txt", split=TRUE)
>> cpkg.chk
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> print(cpkg.chk)
> R CMD check results
> 0 errors | 0 warnings | 0 notes
> 
>> cat("note the above use just the object name as well as print()\n")
> note the above use just the object name as well as print()
>> sink()
>> 

but the file sinktest2.txt is just

> 
> note the above use just the object name as well as print()

Perhaps this isn't a bug, but it rather smells like one, especially the
failure to show the cpkg.chk object.

Workaround for Linux: Run things via

R |& tee -a myteeoutput.txt

This will keep all the output (sink not needed). But it isn't quite as nice
for keeping the data.
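Another option (a hedged sketch): capture the child-process text inside R itself with intern = TRUE, which returns it as a character vector the master process can write anywhere, sidestepping sink() entirely:

```r
# sink() cannot see a child process's stdout, but intern = TRUE makes
# system() hand that output back to the master R process as data.
out <- system("echo hello", intern = TRUE)
writeLines(out, file.path(tempdir(), "captured.txt"))  # now saveable anywhere
```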

I've also not managed to find a way to get the information out of the cpkg.chk
object. If someone knows how to do that, it would help.

Best, JN


On 2018-04-11 03:24 PM, Henrik Bengtsson wrote:
> R CMD check, which is used internally, runs checks in standalone
> background R processes.  Output from these is not capturable/sinkable
> by the master R process.  The gist of what's happening is:
> 
>> sink("output.log")
>> system("echo hello")  ## not sinked/captured
> hello
>> sink()
>> readLines("output.log")
> character(0)
> 
> /Henrik
> 
> On Wed, Apr 11, 2018 at 11:05 AM, J C Nash <profjcn...@gmail.com> wrote:
>> Hi,
>>
>> In trying to test that an upgrade to my optimx package does not break other
>> packages, I wanted to loop over a list of all such packages in alldep, with
>> nall the length of this list.
>>
>> cat("Check the dependent packages\n")
>> for (ii in 1:nall){
>>   cpkg <- alldep[ii]
>>   dd <- "/home/john/temp/wrkopt/dlpkg"
>>   dlname <- download.packages(cpkg, destdir=dd )[[2]]
>>   cat("Downloaded ", dlname,"\n")
>>   cpkg.chk <- devtools::check_built(dlname)
>>   cat("Results package:",cpkg,"\n")
>>   print(cpkg.chk)
>> }
>>
>> Before running this, I did
>>
>> sink("dpkgcheck.txt", split=TRUE)
>>
>> and afterwards, I did sink().
>>
>> But ... none of the check output, nor the result of the final print, show
>> up in the output file dpkgcheck.txt.
>>
>> Have I totally misunderstood sink(), or is there a nasty bug?
>>
>> I've tried running in Rstudio and in the terminal. I'm running Linux Mint
>> 18.3 Sylvia.
>>
>> Linux john-j6-18 4.10.0-38-generic #42~16.04.1-Ubuntu SMP Tue Oct 10 
>> 16:32:20 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
>> john@john-j6-18 ~ $ R
>>
>> R version 3.4.4 (2018-03-15) -- "Someone to Lean On"
>>
>>
>> J C Nash
>>
>> __
>> R-package-devel@r-project.org mailing list
>> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Saving output of check()

2018-04-11 Thread J C Nash
Hi,

In trying to test that an upgrade to my optimx package does not break other
packages, I wanted to loop over a list of all such packages in alldep, with
nall the length of this list.

cat("Check the dependent packages\n")
for (ii in 1:nall){
  cpkg <- alldep[ii]
  dd <- "/home/john/temp/wrkopt/dlpkg"
  dlname <- download.packages(cpkg, destdir=dd )[[2]]
  cat("Downloaded ", dlname,"\n")
  cpkg.chk <- devtools::check_built(dlname)
  cat("Results package:",cpkg,"\n")
  print(cpkg.chk)
}

Before running this, I did

sink("dpkgcheck.txt", split=TRUE)

and afterwards, I did sink().

But ... none of the check output, nor the result of the final print, show
up in the output file dpkgcheck.txt.

Have I totally misunderstood sink(), or is there a nasty bug?

I've tried running in Rstudio and in the terminal. I'm running Linux Mint
18.3 Sylvia.

Linux john-j6-18 4.10.0-38-generic #42~16.04.1-Ubuntu SMP Tue Oct 10 16:32:20 
UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
john@john-j6-18 ~ $ R

R version 3.4.4 (2018-03-15) -- "Someone to Lean On"


J C Nash

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] tibbles are not data frames

2017-09-26 Thread J C Nash
Duncan's observation is correct. The background work to the standards
I worked on was a big effort, and the content was a lot smaller than R,
though possibly similar in scope to dealing with the current question.
The "voting" was also very late in the process, after the proposals
were developed, discussed and written, so more a confirmation of a
decision than a vote to do some work.

On the other hand, I do think such effort has to be made from time to
time. On this particular matter I don't feel well-suited. However, the
collective body of material that is R is mostly a result of those of us
who are willing to put out the effort, particularly R-core members.

JN

On 2017-09-26 07:00 PM, Duncan Murdoch wrote:
> On 26/09/2017 4:52 PM, Jens Oehlschlägel wrote:
>>
>> On 26.09.2017 15:37, Hadley Wickham wrote:
>>> I decided to make [.tibble type-stable (i.e. always return a data
>>> frame) because this behaviour causes substantial problems in real data
>>> analysis code. I did it understanding that it would cause some package
>>> developers frustration, but I think it's better for a handful of
>>> package maintainers to be frustrated than hundreds of users creating
>>> dangerous code.
>>>
>>> Hadley
>>>
>>
>> If that is right -- and I tend to believe it is right -- this change had
>> better been done in R core and not on package level. I think the root of
>> this evil is design inconsistencies of the language together with the
>> lack of removing these inconsistencies. The longer we hesitated, the
>> more packages such a change could break. The lack of addressing issues
>> in R core drives people to try to solve issues on package level. But now
>> we have two conflicting standards, i.e. a fork-within-the-language: Am I
>> a member of the tidyverse or not? Am I writing a package for the
>> tidyverse or for standard-R or for both. With a fork-of-the-language we
>> would at least have a majority vote for one of the two and only the
>> fitter would survive. But with a fork-within-the-language 'R' gets more
>> and more complex, and working with it more and more difficult. There is
>> not only the tidyverse, also the Rcppverse and I don't know how many
>> other verses. If there is no extinction of inconsistencies in R, not
>> sufficient evolution in R, but lots of evolution in Julia, evolution
>> will extinct R together with all its foobarverses in favor of Julia (or
>> Python). May be that's a good thing.
>>
>> I think tibble should respect drop=TRUE and respect the work of all
>> package authors who wrote defensive code and explicitly passed drop=
>> instead of relying on the (wrong) default. Again: better would be a
>> long-term clean-up roadmap of R itself and one simple standard called
>> 'data.frame'. Instead of forking or betting on any particular
>> foobarverse: why not have direct democratic votes about certain critical
>> features of such a long-term roadmap in such a big community?
> 
> 
> I think R Core would not be interested in a vote, because you'd be voting to 
> give them work to do, and that's really rude.
> 
> What would have a better chance of success would be for someone to write a 
> short article describing the proposal in
> detail, and listing all changes to CRAN and Bioconductor packages that would 
> be necessary to implement it.  That's a lot
> of work!  Do you have time to do it?
> 
> Duncan Murdoch
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel

Re: [R-pkg-devel] tibbles are not data frames

2017-09-26 Thread J C Nash
Having been around a while and part of several programming language and
other standards (see ISO 6373:1984 and IEEE 754-1985), I prefer some democracy 
at the
level of getting a standard. Though perhaps at the design level I can agree
with Hadley. However, we're now at the stage of needing to clean up R
and actually get rid of some serious annoyances, in which I would include
my own contributions that appear in optim(), namely the Nelder-Mead,
BFGS and CG options for which there are replacements.

In the tibble/data-frame issue, it would appear there could be a resolution
with some decision making at the R-core level, and whether that is democratic
or ad-hoc, it needs to happen.
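For concreteness, the base-R behaviour at issue takes two lines to see (a hedged sketch; tibble's [ reportedly keeps the frame where base drops it):

```r
# A base data frame drops a single selected column to a bare vector by
# default; callers wanting a frame back must say drop = FALSE explicitly.
df <- data.frame(a = 1:3, b = 4:6)
c1 <- class(df[, "a"])                # "integer": dropped to a vector
c2 <- class(df[, "a", drop = FALSE])  # "data.frame": frame preserved
```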

JN


On 2017-09-26 05:08 PM, Hadley Wickham wrote:

> 
> I'm not sure that democracy works for programming language design.
> 
> Hadley
>

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Solaris SPARC, Fortran, and logical errors?

2017-03-16 Thread J C Nash

FWIW it appears that QEMU has an admittedly slow implementation that supports
some architectures beyond x86/amd64 and that there is recent activity. See

http://wiki.qemu-project.org/Documentation/Platforms/SPARC

An alternative might be to persuade Oracle to provide a Sparc-builder, since 
they
advertise Oracle R Technologies at
http://www.oracle.com/technetwork/database/database-technologies/r/r-technologies/r-offerings-1566363.html

but dates on that page are from 2014. Perhaps someone has contacts at Oracle 
and could at least raise
the possibility.

JN



On 2017-03-16 08:20 AM, Ben Bolker wrote:

I completely agree that testing on SPARC Solaris is valuable, however
much of a nuisance it is.  But I also agree that it would be great if
we could find a way to provide a publicly accessible SPARC Solaris
testing framework.

On Thu, Mar 16, 2017 at 6:49 AM, Uwe Ligges
<lig...@statistik.tu-dortmund.de> wrote:



On 15.03.2017 18:30, Ben Bolker wrote:




On 17-03-15 11:09 AM, J C Nash wrote:


Possibly tangential, but has there been any effort to set up a Sparc
testbed? It
seems we could use a network-available (virtual?) machine, since this
platform is
often the unfortunate one. Unless, of course, there's a sunset date.

For information, I mentioned SPARC at our local linux group, and
apparently there
are a couple of folk who have them running, but I didn't find out the
state of the
OS etc.

JN



  The virtual machine platforms I know of (admittedly not a complete
list!) only support Solaris on x86, e.g.



Yes, you cannot emulate a Sparc in an efficient way on an amd64 platform.

I take the opportunity to repeat why testing on *Sparc Solaris* gives many
benefits:

- this way we cover big- and little-endian platforms (i.e. for future
stability, so that it works on architectures that still appear esoteric,
such as ARM-based ones)
- we cover one of the commercial unixes, i.e. we see
  + how stuff works on the typically rather old toolchains
  + and what happens on gnu/gcc-setups and how many GNUisms are used

Best,
Uwe Ligges





https://community.oracle.com/thread/2569292






On 2017-03-15 10:40 AM, Avraham Adler wrote:


Hello.

The Delaporte package works properly on all R-core platforms except
Solaris SPARC, where it  compiles properly but fails a number of its
tests [1]. Not having access to a SPARC testbed, I'm limited in what
kind of diagnostics I can do. One thing I have noticed is that a lot
of the failures occur when I am passing non-default logicals (like
lower tail or log). For example, the first failure at that link is
when "log = true" is supposed to be passed, but the SPARC answers are
the unlogged values. Of the 22 failed tests, 12 of them pass logicals.

I'll bring an example of how it is coded below, and if anyone
recognizes where SPARC specifically goes wrong, I'd appreciate it. I
guess, if I absolutely had to, I could convert the logical to an
integer in C and pass the integer to Fortran which should work even
for SPARC, but I'd prefer not to if I could help it.

Thank you,

Avi

[1] https://cran.r-project.org/web/checks/check_results_Delaporte.html

*Example Code*

R code:

ddelap <- function(x, alpha, beta, lambda, log = FALSE){
  if(!is.double(x)) {storage.mode(x) <- 'double'}
  if(!is.double(alpha)) {storage.mode(alpha) <- 'double'}
  if(!is.double(beta)) {storage.mode(beta) <- 'double'}
  if(!is.double(lambda)) {storage.mode(lambda) <- 'double'}
  if(any(x > floor(x))) {
warning("Non-integers passed to ddelap. These will have 0
probability.")
  }
  .Call(ddelap_C, x, alpha, beta, lambda, log)
}

C code:

void ddelap_f(double *x, int nx, double *a, int na, double *b, int nb,
double *l, int nl,
  int *lg, double *ret);

extern SEXP ddelap_C(SEXP x, SEXP alpha, SEXP beta, SEXP lambda, SEXP
lg){
  const int nx = LENGTH(x);
  const int na = LENGTH(alpha);
  const int nb = LENGTH(beta);
  const int nl = LENGTH(lambda);
  SEXP ret;
  PROTECT(ret = allocVector(REALSXP, nx));
  ddelap_f(REAL(x), nx, REAL(alpha), na, REAL(beta), nb, REAL(lambda),
nl, LOGICAL(lg), REAL(ret));
  UNPROTECT(1);
  return(ret);
}

Fortran: (not posting ddelap_f_s as that doesn't handle the logging)

subroutine ddelap_f(x, nx, a, na, b, nb, l, nl, lg, pmfv) bind(C,
name="ddelap_f")

integer(kind = c_int), intent(in), value :: nx, na, nb, nl
! Sizes
real(kind = c_double), intent(in), dimension(nx) :: x
! Observations
real(kind = c_double), intent(out), dimension(nx):: pmfv
! Result
real(kind = c_double), intent(in):: a(na), b(nb),
l(nl)! Parameters
logical(kind = c_bool), intent(in)   :: lg
! Log flag
integer  :: i
! Integer

!$omp parallel do default(shared) private(i)
do i = 1, nx
if (x(i) > floor(x(i))) then
pmfv(i) = ZERO
else
  

Re: [R-pkg-devel] Solaris SPARC, Fortran, and logical errors?

2017-03-15 Thread J C Nash

Possibly tangential, but has there been any effort to set up a Sparc testbed? It
seems we could use a network-available (virtual?) machine, since this platform 
is
often the unfortunate one. Unless, of course, there's a sunset date.

For information, I mentioned SPARC at our local linux group, and apparently 
there
are a couple of folk who have them running, but I didn't find out the state of 
the
OS etc.

JN


On 2017-03-15 10:40 AM, Avraham Adler wrote:

Hello.

The Delaporte package works properly on all R-core platforms except
Solaris SPARC, where it  compiles properly but fails a number of its
tests [1]. Not having access to a SPARC testbed, I'm limited in what
kind of diagnostics I can do. One thing I have noticed is that a lot
of the failures occur when I am passing non-default logicals (like
lower tail or log). For example, the first failure at that link is
when "log = true" is supposed to be passed, but the SPARC answers are
the unlogged values. Of the 22 failed tests, 12 of them pass logicals.

I'll bring an example of how it is coded below, and if anyone
recognizes where SPARC specifically goes wrong, I'd appreciate it. I
guess, if I absolutely had to, I could convert the logical to an
integer in C and pass the integer to Fortran which should work even
for SPARC, but I'd prefer not to if I could help it.

Thank you,

Avi

[1] https://cran.r-project.org/web/checks/check_results_Delaporte.html

*Example Code*

R code:

ddelap <- function(x, alpha, beta, lambda, log = FALSE){
  if(!is.double(x)) {storage.mode(x) <- 'double'}
  if(!is.double(alpha)) {storage.mode(alpha) <- 'double'}
  if(!is.double(beta)) {storage.mode(beta) <- 'double'}
  if(!is.double(lambda)) {storage.mode(lambda) <- 'double'}
  if(any(x > floor(x))) {
warning("Non-integers passed to ddelap. These will have 0 probability.")
  }
  .Call(ddelap_C, x, alpha, beta, lambda, log)
}

C code:

void ddelap_f(double *x, int nx, double *a, int na, double *b, int nb,
double *l, int nl,
  int *lg, double *ret);

extern SEXP ddelap_C(SEXP x, SEXP alpha, SEXP beta, SEXP lambda, SEXP lg){
  const int nx = LENGTH(x);
  const int na = LENGTH(alpha);
  const int nb = LENGTH(beta);
  const int nl = LENGTH(lambda);
  SEXP ret;
  PROTECT(ret = allocVector(REALSXP, nx));
  ddelap_f(REAL(x), nx, REAL(alpha), na, REAL(beta), nb, REAL(lambda),
nl, LOGICAL(lg), REAL(ret));
  UNPROTECT(1);
  return(ret);
}

Fortran: (not posting ddelap_f_s as that doesn't handle the logging)

subroutine ddelap_f(x, nx, a, na, b, nb, l, nl, lg, pmfv) bind(C,
name="ddelap_f")

integer(kind = c_int), intent(in), value :: nx, na, nb, nl
! Sizes
real(kind = c_double), intent(in), dimension(nx) :: x
! Observations
real(kind = c_double), intent(out), dimension(nx):: pmfv
! Result
real(kind = c_double), intent(in):: a(na), b(nb),
l(nl)! Parameters
logical(kind = c_bool), intent(in)   :: lg
! Log flag
integer  :: i
! Integer

!$omp parallel do default(shared) private(i)
do i = 1, nx
if (x(i) > floor(x(i))) then
pmfv(i) = ZERO
else
pmfv(i) = ddelap_f_s(x(i), a(mod(i - 1, na) + 1), &
 b(mod(i - 1, nb) + 1), l(mod(i -
1, nl) + 1))
end if
end do
!$omp end parallel do

if (lg) then
pmfv = log(pmfv)
end if

end subroutine ddelap_f
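An editorial aside, hedged: one plausible culprit in the code above is a width mismatch — R's LOGICAL() yields a pointer to a 4-byte C int, while the Fortran interface declares logical(kind = c_bool), which is typically 1 byte. A one-byte read of the int value 1 sees 1 on little-endian x86 but 0 on big-endian SPARC, which would make lg test false and leave the values unlogged, matching the reported failures. The byte layouts can be inspected from R:

```r
# The int value 1 stores its nonzero byte at opposite ends on the two
# endiannesses, so a one-byte peek at a four-byte TRUE can disagree.
little <- writeBin(1L, raw(), endian = "little")  # 01 00 00 00
big    <- writeBin(1L, raw(), endian = "big")     # 00 00 00 01
little[1]  # nonzero: a 1-byte read sees TRUE
big[1]     # zero: a 1-byte read sees FALSE
```

Passing the flag as an int on both sides, as the poster suggests, avoids the mismatch.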

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages.

2017-02-11 Thread J C Nash
Certainly Google can be useful, but it can also be infuriatingly time-wasting when one needs to sort out related tools 
that do slightly different things. Then good, up-to-date task views are important, and wrappers such as those I and some
others are trying to develop can be a way to ease the chore of applying the tools or changing between related ones where
there isn't enough information on which is best.


Perhaps Jim, Spencer, and I (others welcome!) can come up with some small examples to show how Google / sos / other
search tools and the task views (Julia?) can provide guidance. After all, the purpose of the UseR!
session is to try to develop improved ways to access R's packages.


Cheers, John Nash

On 2017-02-10 05:26 PM, Jim Lemon wrote:

This discussion started me thinking about searching for a function or
package, as many questions on the R help list indicate the that poster
couldn't find (or hasn't searched for) what they want. I don't think I
have ever used task views. If I haven't got a clue where to look for
something, I use Google. I can't recall an occasion when I didn't get
an answer, even if it was that what I wanted didn't exist. Perhaps we
should ask why Google is so good at answering uninformed questions, in
particular about R. I'm not the only person on the help list who
advises the clueless to try Google.

Jim


On Sat, Feb 11, 2017 at 3:51 AM, Ben Bolker <bbol...@gmail.com> wrote:

  I definitely read the task views and advise others to do so.  I
don't know how representative my little corner of the world is,
though.

  I have an embryonic task view on mixed models at
https://github.com/bbolker/mixedmodels-misc/blob/master/MixedModels.ctv
but the perfect is the enemy of the good ...


On Fri, Feb 10, 2017 at 9:56 AM, J C Nash <profjcn...@gmail.com> wrote:

We'd be more than happy to have you contribute directly. The goal is not
just an
information session, but to get some movement toward ways to make the package
collection(s)
easier to use effectively. Note to selves: "effectively" is important -- we
could make
things easy by only recommending a few packages.

Best, JN


On 2017-02-10 09:29 AM, Michael Dewey wrote:


Dear all

That seems an interesting session. I am the maintainer of one of the CRAN
Task Views (MetaAnalysis) and will attend
unless I am successful in the draw for Wimbledon tickets.

Just in case I strike lucky one question I would have raised from the
floor if I were there would have been "Does anyone
read the Task Views?". Since I started mine I have received only a couple
of suggestions for additions including a very
abrupt one about a package which had been included for months but whose
author clearly did not read before writing. So I
would ask whether we need to focus much energy on the Task Views.

So, maybe see you there, maybe not.


On 16/01/2017 14:57, ProfJCNash wrote:


Navigating the Jungle of R Packages

The R ecosystem has many packages in various collections,
especially CRAN, Bioconductor, and GitHub. While this
richness of choice speaks to the popularity and
importance of R, the large number of contributed packages
makes it difficult for users to find appropriate tools for
their work.

A session on this subject has been approved for UseR! in
Brussels. The tentative structure is three short
introductory presentations, followed by discussion or
planning work to improve the tools available to help
users find the best R package and function for their needs.

The currently proposed topics are

- wrapper packages that allow diverse tools that perform
  similar functions to be accessed by unified calls

- collaborative mechanisms to create and update Task Views

- search and sort tools to find packages.

At the time of writing we have tentative presenters for
the topics, but welcome others. We hope these presentations
at useR! 2017 will be part of a larger discussion that will
contribute to an increased team effort after the conference
to improve the support for R users in these areas.


John Nash, Julia Silge, Spencer Graves

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel





__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel



__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] UseR! Session: Navigating the jungle of R packages.

2017-02-10 Thread J C Nash

We'd be more than happy to have you contribute directly. The goal is not just an
information session, but to get some movement toward ways to make the package
collection(s)
easier to use effectively. Note to selves: "effectively" is important -- we 
could make
things easy by only recommending a few packages.

Best, JN


On 2017-02-10 09:29 AM, Michael Dewey wrote:

Dear all

That seems an interesting session. I am the maintainer of one of the CRAN Task 
Views (MetaAnalysis) and will attend
unless I am successful in the draw for Wimbledon tickets.

Just in case I strike lucky one question I would have raised from the floor if I 
were there would have been "Does anyone
read the Task Views?". Since I started mine I have received only a couple of 
suggestions for additions including a very
abrupt one about a package which had been included for months but whose author 
clearly did not read before writing. So I
would ask whether we need to focus much energy on the Task Views.

So, maybe see you there, maybe not.

On 16/01/2017 14:57, ProfJCNash wrote:

Navigating the Jungle of R Packages

The R ecosystem has many packages in various collections,
especially CRAN, Bioconductor, and GitHub. While this
richness of choice speaks to the popularity and
importance of R, the large number of contributed packages
makes it difficult for users to find appropriate tools for
their work.

A session on this subject has been approved for UseR! in
Brussels. The tentative structure is three short
introductory presentations, followed by discussion or
planning work to improve the tools available to help
users find the best R package and function for their needs.

The currently proposed topics are

- wrapper packages that allow diverse tools that perform
  similar functions to be accessed by unified calls

- collaborative mechanisms to create and update Task Views

- search and sort tools to find packages.

At the time of writing we have tentative presenters for
the topics, but welcome others. We hope these presentations
at useR! 2017 will be part of a larger discussion that will
contribute to an increased team effort after the conference
to improve the support for R users in these areas.


John Nash, Julia Silge, Spencer Graves

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel





__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Title case in DESCRIPTION for package where a word is a function name

2015-04-25 Thread Prof J C Nash (U30A)
Hendrik pointed out it was the parentheses that gave the complaint.
Single quotes and no parentheses seem to satisfy R CMD check. Perhaps
that needs to be in the WRE.

However, I have for some time used the parentheses to distinguish
functions from packages. optim() is a function, optimx a package.
Is this something CRAN should be thinking about? I would argue greater
benefit to users than title case.

JN


On 15-04-24 06:17 PM, Uwe Ligges wrote:
 
 
 On 24.04.2015 22:44, Ben Bolker wrote:
 Prof J C Nash (U30A) <nashjc at uottawa.ca> writes:


 I was preparing a fix for a minor glitch in my optimx package and R CMD
 check gave an error that the title was not in title case.

[snip] to make Gmane happy ...

 I have found

 A Replacement and Extension of the _optim()_ Function

 does not get the complaint, but I'm not sure the underscore is allowed.

 Given that I've obeyed the RTFM rule, I'm wondering what to do now.

Presumably you should ask the CRAN maintainers?  That seems to
 be the only possible answer -- I don't think anyone else can guess
 very accurately ...
 
 From WRE:
 
 Refer to other packages and external software in single quotes, and to
 book titles (and similar) in double quotes.
 
 Other non-English usage (as documented for the Description field; this
 includes function names) can also be used in single quotes.
 
 Best,
 Uwe Ligges
 
 

Ben Bolker

 __
 R-devel@r-project.org mailing list
 https://stat.ethz.ch/mailman/listinfo/r-devel




Re: [Rd] Title case in DESCRIPTION for package where a word is a function name

2015-04-25 Thread Prof J C Nash (U30A)
How about allowing underscore? (I believe WRE is silent on this, and I
have not tried submitting a package with underscore in the title.) As I
pointed out in my OP, _optim()_ works. And we have the advantage that we
can distinguish package from function.

The purpose of consistent editing is surely to provide the affordances
that save us from needing extra documentation, as per Donald Norman's
excellent discussions on Design of Everyday Things, or Turn Signals are
the Facial Expressions of Automobiles. Changing the name of a function
in a case-sensitive computing language may not be a bug, but it is
asking for trouble.

JN

On 15-04-25 07:57 AM, peter dalgaard wrote:
 
 On 25 Apr 2015, at 13:11 , Prof J C Nash (U30A) nas...@uottawa.ca wrote:

 Hendrik pointed out it was the parentheses that gave the complaint.
 Single quotes and no parentheses seem to satisfy R CMD check. Perhaps
 that needs to be in the WRE.
 
 Well, it is in ?toTitleCase:
 
  ...However, unknown
  technical terms will be capitalized unless they are single words
  enclosed in single quotes: names of packages and libraries should
  be quoted in titles.
 
 ..and it is the single word bit that gets you. AFAICT, the issue is that it 
 splits the text into words and then looks for words that begin and end with a 
 single quote. And parentheses count as word separators, so the quotes of 
 'optim()' end up in two different words. 
 
 It's one of those things that aren't easy to fix: Presumably you do want 
 capitalization within parentheses so we can't just not let them be 
 separators, and we can't just look for sets of single quotes with arbitrary 
 content because they get used inside ordinary text (e.g. the beginning of 
 this paragraph contains 's one of those things that aren'). So either we need 
 more heuristics, like only counting () as separators when preceded by or 
 preceding a space, or some sort of explicit escape mechanism, like BibTeX's 
 {foo}.
 

 However, I have for some time used the parentheses to distinguish
 functions from packages. optim() is a function, optimx a package.
 Is this something CRAN should be thinking about? I would argue greater
 benefit to users than title case.

 JN


 On 15-04-24 06:17 PM, Uwe Ligges wrote:


 On 24.04.2015 22:44, Ben Bolker wrote:
 Prof J C Nash (U30A) nashjc at uottawa.ca writes:


 I was preparing a fix for a minor glitch in my optimx package and R CMD
 check gave an error that the title was not in title case.

   [snip] to make Gmane happy ...

 I have found

 A Replacement and Extension of the _optim()_ Function

 does not get the complaint, but I'm not sure the underscore is allowed.

 Given that I've obeyed the RTFM rule, I'm wondering what to do now.

   Presumably you should ask the CRAN maintainers?  That seems to
 be the only possible answer -- I don't think anyone else can guess
 very accurately ...

 From WRE:

 Refer to other packages and external software in single quotes, and to
 book titles (and similar) in double quotes.

 Other non-English usage (as documented for the Description field; this
 includes function names) can also be used in single quotes.

 Best,
 Uwe Ligges



   Ben Bolker







[Rd] Title case in DESCRIPTION for package where a word is a function name

2015-04-24 Thread Prof J C Nash (U30A)
I was preparing a fix for a minor glitch in my optimx package and R CMD
check gave an error that the title was not in title case. It is

A Replacement and Extension of the optim() Function

R CMD check suggests the incorrect form

A Replacement and Extension of the Optim() Function

'Writing R Extensions' suggests single quotes, i.e.,

A Replacement and Extension of the 'optim()' Function

which R CMD check still complains about.

I have found

A Replacement and Extension of the _optim()_ Function

does not get the complaint, but I'm not sure the underscore is allowed.

Given that I've obeyed the RTFM rule, I'm wondering what to do now.

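As a side note for anyone reproducing the complaint: the title-case logic lives in tools::toTitleCase (available from R 3.2.0). A quick sketch of the behaviour, not a fix:

```r
# tools::toTitleCase implements the title-case rule R CMD check applies.
library(tools)
# A single word in single quotes is protected from capitalization:
toTitleCase("a replacement and extension of the 'optim' function")
# With "()" attached, the quotes get split across two tokens, the
# protection is lost, and optim can come back capitalized:
toTitleCase("a replacement and extension of the 'optim()' function")
```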
On a related matter, I'm finding the reverse dependency check for optimx
takes a very long time and sometimes stalls for reasons I have not yet
sorted out. I run it in virtual machines for R3.2 and R-devel. Possibly
optimx needs so many packages I'm hitting memory or disk limits. Perhaps
off-list discussion could suggest a way to set up a reverse check
server. I'd be willing to help on such a project, which might be helpful
for many developers.

JN



Re: [Rd] nls

2015-03-19 Thread Prof J C Nash (U30A)
nls() is using
1) only a Gauss-Newton code which is prone to some glitches
2) approximate derivatives

Package nlmrt uses symbolic derivatives for expressions (you have to
provide Jacobian code for R functions) and an aggressive Marquardt
method to try to reduce the sum of squares. It does return more
information about the problem (singular values of the final Jacobian
and gradient at the proposed solution) but does NOT return the nls
structured object. And it will usually take more time and computing
effort because it tries hard to reduce the SS.

A reproducible example would get you a more informed response.

John Nash


On 15-03-19 07:00 AM, r-devel-requ...@r-project.org wrote:
 Date: Wed, 18 Mar 2015 14:14:12 +0200
 From: Evans Otieno Ochiaga evansochi...@aims.ac.za
 To: r-devel@r-project.org
 Subject: [Rd] Help
 Message-ID:
   CAObCh3XfvtCz+qWtSS+pSPrhWtUKtdZoYANN=_4ajndziii...@mail.gmail.com
 Content-Type: text/plain; charset=UTF-8
 
 Hi to All,
 
 I am fitting some models to a data using non linear least square, and
 whenever i run the command, parameters value have good convergence but I
 get the  error in red as shown below. Kindly how can I fix this problem.
 
 
 Convergence of parameter values
 
 0.2390121 :  0.1952981 0.975 1.000
 0.03716107 :  0.1553976 0.910 1.000
 0.009478433 :  0.2011017 0.798 1.000
 0.004108196 :  0.2640111 0.693 1.000
 0.003705189 :  0.2938360 0.652 1.000
 0.003702546 :  0.2965745 0.650 1.000
 0.003702546 :  0.2965898 0.650 1.000
 0.003702546 :  0.2965898 0.650 1.000
 0.003702546 :  0.2965898 0.650 1.000
 
 Error in nls(Occupancy ~ 1 - (theta * beta^(2 * Resolution^(1/2)) *
 delta^Resolution),  :
   step factor 0.000488281 reduced below 'minFactor' of 0.000976562
 
 Regards,
 
 
 
 
 *Evans Ochiaga*

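The specific failure above can often be worked around by loosening the nls() controls, though (as noted) a Marquardt-based solver such as nlmrt::nlxb or minpack.lm::nlsLM is usually more robust. A minimal sketch on made-up data, since the original data were not posted; the start values and noise level here are assumptions:

```r
# Simulated data loosely shaped like the model in the error message
set.seed(42)
Resolution <- seq(1, 20, length.out = 40)
Occupancy  <- 1 - 0.3 * 0.9^(2 * sqrt(Resolution)) * 0.95^Resolution +
              rnorm(40, sd = 0.005)
dat <- data.frame(Resolution, Occupancy)
# Loosen minFactor and allow more iterations than the defaults
fit <- nls(Occupancy ~ 1 - theta * beta^(2 * Resolution^(1/2)) * delta^Resolution,
           data = dat,
           start = list(theta = 0.3, beta = 0.9, delta = 0.95),
           control = nls.control(minFactor = 1e-10, maxiter = 500))
coef(fit)
```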


[Rd] Requirement for pandoc 1.12.3 in R 3.1.3

2015-03-12 Thread Prof J C Nash (U30A)
Are other developers finding R 3.1.3 problematic because vignette
building requires pandoc 1.12.3, while Linux Mint 17 / Ubuntu 14.04 have
1.12.2.1? R 3.1.2 seems to work fine.

I'd very much like to avoid having to build as large a Linux package as
pandoc, which has given me issues outside of R (it leaves out words,
sentences or paragraphs when converting LaTeX to epub in a novel I'm
working on, and does so without warning). Possibly concerns like this
are why R has moved to a later pandoc.

Suggestions welcome.

John Nash



Re: [Rd] Requirement for pandoc 1.12.3 in R 3.1.3

2015-03-12 Thread Prof J C Nash (U30A)
/rel_rebecca_mate_whatsnew.php
 RELEASE_NOTES_URL=http://www.linuxmint.com/rel_rebecca_mate.php
 USER_GUIDE_URL=help:linuxmint
 GRUB_TITLE=Linux Mint 17.1 MATE 64-bit



 john@john-J6-2015 ~/current/nls14work $ dpkg -l | grep -i pandoc
 ii  libghc-pandoc-citeproc-data 0.2-3build1
 all  Pandoc support for Citation
 Style Language - data files

 ii  pandoc  1.12.2.1-1build2
 amd64general markup converter

 ii  pandoc-citeproc 0.2-3build1
 amd64Pandoc support for Citation
 Style Language - tools

 ii  pandoc-data 1.12.2.1-1build2
 all  general markup converter -
 data
 files

 JN


 On 15-03-12 10:21 AM, Prof Brian Ripley wrote:
 On 12/03/2015 13:51, Prof J C Nash (U30A) wrote:
 Are other developers finding R 3.1.3 problematic because vignette
 building requires pandoc 1.12.3, while Linux Mint 17 / Ubuntu
 14.04 have
 1.12.2.1? R 3.1.2 seems to work fine.

 R has no built-in support for non-Sweave vignettes, and there is no
 mention of pandoc in the R 3.1.3 sources except for the manual:

 'Complete checking of a package which contains a file
 @file{README.md}
 needs @command{pandoc} installed: see
 @uref{http://johnmacfarlane.net/@/pandoc/@/installing.html}.'

 which is true (but is not done with R 3.1.3).

 I suspect you are confusing an R update with an update of whatever
 packages you use to process your vignettes: package rmarkdown has a
 pandoc version requirement of

 SystemRequirements: pandoc (>= 1.12.3)






[Rd] Help finding source of warnings

2015-01-18 Thread Prof J C Nash (U30A)
I've been implementing a wrapper to the 2011 Fortran version of 
L-BFGS-B. In optim(), R uses a C translation of a Fortran version (the 
version number does not appear to be documented by the original 
authors). The authors of the original Fortran code have updated it and 
published the reasons in ACM TOMS due to inefficiencies and a bug.


In running the checks on the resulting package (which is on R-forge 
under the optimizer project), I'm getting a number of warning messages 
of the type


Warning in file.copy(file.path(.Library, pkg, DESCRIPTION), pd) :
  problem copying /usr/lib/R/library/mgcv/DESCRIPTION to 
/tmp/Rtmp0kkeHo/RLIBS_1214765d1c5f/mgcv/DESCRIPTION: No such file or 
directory


which reference DESCRIPTIONs for a number of packages other than the one 
being checked -- here mgcv -- and which are not referenced in my package 
as far as I can determine.


Possibly unrelated, when I run the code on a problem, it works for one 
run, then gives a NAMESPACE error and failure on the second try. Apart 
from this, checks and unit tests appear to work correctly.


Does anyone have pointers where I might find some ideas on the origin of 
the issue(s)? I suspect the warning messages are not particularly 
indicative of the source of the warnings, but that I have some subtle 
glitch in the setup and call to the Fortran.


I suspect this is not platform dependent, but I'm running Linux Mint 
17.1 (ubuntu derivative), and  R 3.1.2.


Cheers, JN



Re: [Rd] Help finding source of warnings

2015-01-18 Thread Prof J C Nash (U30A)
Kurt pointed to the issue. Thanks. I did install r-recommended, but it 
seems something went wrong at some point. A reinstall got rid of the 
warnings.

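For anyone hitting the same symptom, a quick sanity check that the recommended set is intact (a generic check, not from the thread; a missing DESCRIPTION here reproduces the warning):

```r
# List the recommended packages this R installation reports, and confirm
# each one's DESCRIPTION file actually resolves under .Library
ip  <- installed.packages(.Library)
rec <- unname(ip[ip[, "Priority"] %in% "recommended", "Package"])
rec
file.exists(file.path(.Library, rec, "DESCRIPTION"))
```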

Thanks to Dirk for his offer of help.

Now I'm still getting a namespace issue on the second run of an 
optimization problem. However, I think I need to do some more digging to 
narrow down where this issue is lurking. It may be some local matter, as 
with the r-recommended links failing.


Best, JN

On 15-01-18 09:27 AM, Kurt Hornik wrote:

Prof J C Nash (U30A) writes:



I've been implementing a wrapper to the 2011 Fortran version of
L-BFGS-B. In optim(), R uses a C translation of a Fortran version (the
version number does not appear to be documented by the original
authors). The authors of the original Fortran code have updated it and
published the reasons in ACM TOMS due to inefficiencies and a bug.



In running the checks on the resulting package (which is on R-forge
under the optimizer project), I'm getting a number of warning messages
of the type



Warning in file.copy(file.path(.Library, pkg, DESCRIPTION), pd) :
problem copying /usr/lib/R/library/mgcv/DESCRIPTION to
/tmp/Rtmp0kkeHo/RLIBS_1214765d1c5f/mgcv/DESCRIPTION: No such file or
directory



which reference DESCRIPTIONs for a number of packages other than the one
being checked -- here mgcv -- and which are not referenced in my package
as far as I can determine.



Possibly unrelated, when I run the code on a problem, it works for one
run, then gives a NAMESPACE error and failure on the second try. Apart
from this, checks and unit tests appear to work correctly.



Does anyone have pointers where I might find some ideas on the origin of
the issue(s)? I suspect the warning messages are not particularly
indicative of the source of the warnings, but that I have some subtle
glitch in the setup and call to the Fortran.



I suspect this is not platform dependent, but I'm running Linux Mint
17.1 (ubuntu derivative), and  R 3.1.2.


John: maybe you did not install the recommended packages?
(On Debian, the corresponding package would be r-recommended.)

Best
-k


Cheers, JN







[Rd] How can I use R-function in My C++ project ? (using optim)

2014-10-22 Thread Prof J C Nash (U30A)
As the author of 3 of the 5 methods in optim, I think you may be wasting
your time if this is for performance. My reasons are given in

http://www.jstatsoft.org/v60/i02

Note that most of the speed benefits of compilation are found in the
objective and gradient function, with generally more minor improvements
in the method.

JN


On 14-10-22 06:00 AM, r-devel-requ...@r-project.org wrote:
 Date: Tue, 21 Oct 2014 20:14:03 +0800 (CST)
 From: ?? 2012111...@njau.edu.cn
 To: R-devel@r-project.org
 Subject: [Rd] How can I use R-function in My C++ project ?
 Message-ID: 1f74c14.d1e7.14932a0ddbf.coremail.2012111...@njau.edu.cn
 Content-Type: text/plain; charset=UTF-8
 
 Dear seniors:
I am a student in Nanjing Agricultural University of China.
 
 
I want to use the  function optim  of package stats in my C++ project. I 
 have got the R.dll , R.def and R.lib,
 
 
 but I can't find the function prototypes of optim in R.def. 
   
How  can  I  do ?  Is the Method I call R function in C++ with R.dll 
 feasible ?  I hope to get your help ! Thanks 
 
 
 for reading my question.
 
 
 Yours sincerely,
 
 
 Bo Huang
   [[alternative HTML version deleted]]



Re: [Rd] lbfgsb from C/C++

2014-09-08 Thread Prof J C Nash (U30A)
I won't comment on the C/C++ option, as I'm not expert in that. However,
R users and developers should know that Nocedal et al. who developed
L-BFGS-B released an update to correct a fault in 2011. It was important
enough that an ACM TOMS article was used for the announcement.

I recently implemented a wrapper to the new Fortran code. As it is a
reverse communication code, the main task was getting rid of all the
Fortran WRITE() statements (not a happy experience!).

Maybe Dirk has ideas on how to make the wrapper more efficient and the
call from C or C++ direct for those who need that.

The code will be incorporated eventually in the optimx package, but for
now an experimental (NOTE - EXPERIMENTAL) version lbfgsb3 is up on
r-forge under the optimizer project. I'd be happy to exchange ideas
off-list on this.

Here's the reference to the TOMS announcement:

@Article{Morales:2011:FSL,
  author =   {Jos\'e Luis Morales and Jorge Nocedal},
  title =    {Remark on ``{Algorithm} 778: {L-BFGS-B}: {Fortran} Subroutines
              for Large-Scale Bound Constrained Optimization''},
  journal =  {{ACM} Transactions on Mathematical Software},
  accepted = {15 April 2011},
  volume =   {38},
  number =   {1},
  month =    nov,
  URL =      {http://doi.acm.org/10.1145/2049662.2049669},
  pages =    {7:1--7:4},
  year =     {2011},
  abstract = {This remark describes an improvement and a correction
              to Algorithm 778. It is shown that the performance of
              the algorithm can be improved significantly by making
              a relatively simple modification to the subspace
              minimization phase. The correction concerns an error
              caused by the use of routine dpmeps to estimate machine
              precision.},
}

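For context, the (older) L-BFGS-B translation shipped with base R is reached through optim(); a minimal bound-constrained example using only the standard stats API, not the new wrapper:

```r
# Minimize a simple quadratic with box constraints via optim's L-BFGS-B
fr <- function(x) sum((x - c(1, 2))^2)   # minimum at (1, 2)
gr <- function(x) 2 * (x - c(1, 2))      # analytic gradient
res <- optim(par = c(0, 0), fn = fr, gr = gr, method = "L-BFGS-B",
             lower = c(-5, -5), upper = c(5, 5))
res$par   # approximately c(1, 2)
```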
John Nash



[Rd] a question about optim.R and optim.c in R

2014-07-08 Thread Prof J C Nash (U30A)
As you dig deeper you will find vmmin.c, cgmin.c and (I think) nmmin.c
etc. Those were, as I understand, converted by p2c from my Pascal codes
that you can find in the pascal library on netlib.org. These can be run
with the Free Pascal compiler.

Given how long ago these were developed (30 years in all cases), they
are due for review. The packages Rvmmin and Rcgmin are all-R
replacements for the 1st two, and nmkb from dfoptim generally offers a
better version of the Nelder-Mead approach. All have bounds-constrained
variants, but for efficiency, there are direct calls of the
unconstrained methods, though I need to provide some nice examples of
when and how to call each. There is likely a place for some compiling of
sections to speed things up, but the R codes are not particularly sluggish.

Side comment: At UseR last week, Yihui Xie sat with me and we
implemented a Fortran language engine for knitr. It looks like a Pascal
one may also be possible, and maybe even a BASIC (though that may need
variants for different platforms). This would allow vignettes to
document some of the legacy code to be written, and that may be an
important matter as the expertise for such older tools moves into
retirement. Off-list communication about such ideas welcome.

John Nash


On 14-07-08 06:00 AM, r-devel-requ...@r-project.org wrote:
 Message: 2
 Date: Mon, 7 Jul 2014 16:34:59 -0400
 From: Zhiyuan Dong zhiyuan.d...@gmail.com
 To: r-devel@r-project.org
 Subject: [Rd] a question about optim.R and optim.c in R
 Message-ID:
   can8pbzvw1sd_rq_qqz3dwbs8r5rwinnnykm2ian4o8w4fpg...@mail.gmail.com
 Content-Type: text/plain
 
 Hi, I am learning R by reading R source code. Here is one question I have
 about the optim function in R.
 
 The context : In the optim.R, after all the prep steps, the main function
 call call is made via :
 
 .External2(C_optim, par, fn1, gr1, method, con, lower, upper).
 
 So, it seems to me, to follow what is going on from here, that I should
 read the optim function in \src\library\stats\src\optim.c
 
 where it has this signature :
 
 SEXP optim(SEXP call, SEXP op, SEXP args, SEXP rho)
 
 I am not sure I follow here : In the .External2 call, we have 7 parameters
 :  par, fn1, gr1, method, con, lower, upper; This does not seem to match
 the signature of
 
 SEXP optim(SEXP call, SEXP op, SEXP args, SEXP rho)
 
 However, it seems (from the source code) that the 7 parameters are somehow
 embedded in the 'args' parameter. I am not sure what is going on...Am I
 missing something?
 
 Thanks much!!!
 
 Best,
 
 Zhiyuan
 
   [[alternative HTML version deleted]]



Re: [Rd] R CMD check for the R code from vignettes -- thread fraying?

2014-06-02 Thread Prof J C Nash (U30A)
I noted Duncan's comment that an answer had been provided, and went to
the archives to find his earlier comment, which I am fairly sure I saw a
day or two ago. However, neither May nor June archives show Duncan in
the thread except for the msg below (edited for space). Possibly tech
failures are causing misunderstandings.

JN

On 14-06-02 06:00 AM, r-devel-requ...@r-project.org wrote:
 Message: 4 Date: Mon, 02 Jun 2014 14:06:28 +0900 From: Duncan Murdoch
 murdoch.dun...@gmail.com To: Carl Boettiger cboet...@gmail.com,
 Gabriel Becker gmbec...@ucdavis.edu Cc: Henrik Bengtsson
 h...@biostat.ucsf.edu, R-devel r-devel@r-project.org Subject: Re: [Rd]
 R CMD check for the R code from vignettes Message-ID:
 538c0654.7050...@gmail.com Content-Type: text/plain;
 charset=ISO-8859-1; format=flowed On 02/06/2014, 1:41 PM, Carl Boettiger
 wrote:
  Thanks both for the replies.
 


 You haven't been reading very carefully.  I saw several:  mine, Martin 
 Morgan's, Kasper Daniel Hansen's.
 
 Duncan Murdoch





Re: [Rd] cat with backspace and newline characters

2013-11-07 Thread Prof J C Nash (U30A)
Over the years, this has been useful to me (not just in R) for many
nonlinear optimization tasks. The alternatives often clutter the screen.

 On 13-11-06 06:00 AM, r-devel-requ...@r-project.org wrote:

 People do sometimes use this pattern for displaying progress (e.g. iteration 
 counts). 
 
 


As this looks like a terminal matter, the first level solution may be
to take a couple of examples and prepare a table of what happens on the
most used platforms, including in things like RStudio and ESS. Might be
a good issue for Rwiki, as I'm sure few of us have all the choices.

I can generally live with warning users when the iteration display may
display oddly. Perhaps others have more stringent requirements, but if
documentation is enough, we can avoid fixing something that is more or
less outside R and may be unnecessary work. And documented examples can
also show us if things are changing.

JN
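
The progress-display pattern under discussion, sketched below. How it renders depends on the terminal or front end, which is exactly the portability point above:

```r
# Overwrite one status line per iteration instead of scrolling the console
for (i in 1:5) {
  cat(sprintf("\riteration %2d of 5", i))
  Sys.sleep(0.1)   # stand-in for real work
  flush.console()  # needed on some GUIs (e.g. Rgui on Windows)
}
cat("\n")
```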



[Rd] Byte code compile not helpful in R3.0.2

2013-11-03 Thread Prof J C Nash (U30A)
I had a bunch of examples of byte code compiles in something I was
writing. Changed to 3.0.2 and the advantage of compiler disappears. I've
looked in the NEWS file but do not see anything that suggests that the
compile is now built-in. Possibly I've just happened on a bunch of
examples where it does not help, but experiences of a year ago do not
seem to remain valid now. Just wondering if my experience was consistent
with what is expected now in 3.0.2.

John Nash



Re: [Rd] Byte code compile not helpful in R3.0.2

2013-11-03 Thread Prof J C Nash (U30A)
My bad to not give details. I'm comparing (though not quite directly) to
results in the posting
http://rwiki.sciviews.org/doku.php?id=tips:rqcasestudy.

What prompted the query was a write-up of 'for' versus 'while' loops,
where there was a speedup using the compiler for one of these. I had the
example in a knitr file, and when I was reviewing the text before
sending it to an editor, I realized the timings no longer supported the
text. They were, as I recall, developed in R 2.15.2, and I just looked
through my VMs with different OS's to see if there is one with that
still extant, but except for a real Win7 case I have been too good and
updated to at least 3.0.1, where I'm getting no advantage. The Win7 case
is R 2.15.1, and there the compiler actually went slower on one run of
the code below. That may be due to antivirus running -- had not booted
that partition for quite a while.

Here is the for-while test code:

#  forwhiletime.R
library(microbenchmark)
require(compiler)

tfor <- function(n){
for (i in 1:n) {
   xx <- exp(sin(cos(as.double(i))))
}
xx
}

twhile <- function(n){
i <- 0
while (i < n) {
   i <- i + 1
   xx <- exp(sin(cos(as.double(i))))
}
xx
}
n <- 1

timfor <- microbenchmark(tfor(n))
timwhile <- microbenchmark(twhile(n))
timfor
timwhile
cmpfun(tfor)
cmpfun(twhile)
timforc <- microbenchmark(tfor(n))
timwhilec <- microbenchmark(twhile(n))
timforc
timwhilec
looptimes <- data.frame(timfor$time, timforc$time, timwhile$time,
timwhilec$time)
colMeans(looptimes)

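One detail worth flagging in the listing above: cmpfun() *returns* the compiled closure, so unless its value is assigned, the later microbenchmark calls still time the uncompiled functions. A sketch of the assignment (note also that R >= 3.4.0 byte-compiles closures just-in-time by default, which on modern R erases the measured difference anyway):

```r
library(compiler)
tfor  <- function(n) { for (i in 1:n) xx <- exp(sin(cos(as.double(i)))); xx }
tforc <- cmpfun(tfor)   # the assignment is the point: keep the compiled copy
# Compare elapsed times of the plain and byte-compiled versions
c(plain = system.time(tfor(1e5))["elapsed"],
  byte  = system.time(tforc(1e5))["elapsed"])
```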

Actually, I'm not greatly anxious about all this. Mainly I want to make
sure that I get whatever advice is to be rendered so it is correct.

Best,

JN


On 13-11-03 02:22 PM, Duncan Murdoch wrote:
 On 13-11-03 2:15 PM, Prof J C Nash (U30A) wrote:
 I had a bunch of examples of byte code compiles in something I was
 writing. Changed to 3.0.2 and the advantage of compiler disappears. I've
 looked in the NEWS file but do not see anything that suggests that the
 compile is now built-in. Possibly I've just happened on a bunch of
 examples where it does not help, but experiences of a year ago do not
 seem to remain valid now. Just wondering if my experience was consistent
 with what is expected now in 3.0.2.
 
 Post some details, please.  Are the times in 3.0.2 like the times in
 3.0.1 with or without compiling?  Or were you comparing to some other
 version?
 
 Duncan Murdoch
 
 



