Re: [Rd] [External] Re: Workaround very slow NAN/Infinities arithmetic?

2021-10-01 Thread GILLIBERT, Andre

> Mildly related (?) to this discussion, if you happen to be in a situation
> where you know something is a C NAN, but need to check if its a proper R
> NA, the R_IsNA function is surprisingly (to me, at least) expensive to do
> in a tight loop because it calls the (again, surprisingly expensive to me)
> isnan function.  

What is your platform? CPU, OS, compiler?
How much more expensive? 5-10 times slower than the improved code you wrote, or
100-200 times slower?

I analyzed the C and assembly source code of R_IsNA on an x86_64 GNU/Linux 
computer (Celeron J1900) with GCC 5.4 and found that it was somewhat expensive, 
but the main problems did not seem to come from isnan.

isnan was only responsible for a ucomisd xmm0, xmm0 instruction followed by a 
conditional jump on x86_64. This instruction is slower on NaN than on normal 
FP values, but it seems to have an acceptable speed.
On x86_32, isnan is responsible for a fld mem64, fst mem64, fucomip and a 
conditional jump: it is suboptimal, but things could be worse.

On x86_64, the first problem I noticed is that R_IsNA is not inlined, and the 
register-based x86_64 Linux calling convention is not necessarily good for that 
problem, adding loads/stores between memory and registers.
The second problem (the worst part): the write of a 64-bit double followed by the 
read of a 32-bit integer through the ieee_double union confuses the compiler, 
which generates very poor code, with unnecessary loads/stores.

The second problem can be solved by using a union with uint64_t and double 
fields, and using & 0xFFFFFFFF to extract the low 32 bits of the uint64_t. This 
works well for x86_64, but also for x86_32, where GCC avoids useless emulation 
of 64-bit integers, directly reading the low 32-bit word.
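For illustration, a minimal sketch of that union-based check (hypothetical code, not R's actual source; R's real R_IsNA goes through an ieee_double union of two 32-bit words, and the names below are illustrative):

```c
#include <stdint.h>

/* NA_real_ is a NaN whose low 32 bits hold 1954 (0x7A2). */
typedef union {
    double d;
    uint64_t u;
} double_bits;

/* Hypothetical faster ISNA: type-pun through a single uint64_t so the
 * compiler can mask out the low word without extra loads/stores. */
static inline int my_isna(double x)
{
    double_bits b;
    b.d = x;
    /* NaN test: all exponent bits set, non-zero significand;
     * then compare the low 32 bits against 1954. */
    return ((b.u & 0x7FF0000000000000ULL) == 0x7FF0000000000000ULL)
        && ((b.u & 0x000FFFFFFFFFFFFFULL) != 0)
        && ((uint32_t)(b.u & 0xFFFFFFFFULL) == 1954);
}
```

The NaN test guards against an ordinary double whose low word happens to be 1954.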

-- 
Sincerely
André GILLIBERT
__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] [External] Re: Workaround very slow NAN/Infinities arithmetic?

2021-10-01 Thread Brodie Gaslam via R-devel
> On Thursday, September 30, 2021, 01:25:02 PM EDT,  
> wrote:
>
>> On Thu, 30 Sep 2021, brodie gaslam via R-devel wrote:
>>
>> André,
>>
>> I'm not an R core member, but happen to have looked a little bit at this
>> issue myself.  I've seen similar things on Skylake and Coffee Lake 2
>> (9700, one generation past your latest) too.  I think it would make sense
>> to have some handling of this, although I would want to show the trade-off
>> with performance impacts on CPUs that are not affected by this, and on
>> vectors that don't actually have NAs and similar.  I think the performance
>> impact is likely to be small so long as branch prediction is active, but
>> since branch prediction is involved you might need to check with different
>> ratios of NAs (not for your NA bailout branch, but for e.g. interaction
>> of what you add and the existing `na.rm=TRUE` logic).
>
> I would want to see realistic examples where this matters, not
> microbenchmarks, before thinking about complicating the code. Not all
> but most cases where sum(x) returns NaN/NA would eventually result in
> an error; getting to the error faster is not likely to be useful.

That's a very good point, and I'll admit I did not consider it
sufficiently.  There are examples such as `rowSums`/`colSums` where only some
rows/columns evaluate to NA, so the result still contains meaningful
data.  By extension, any loop that applies `sum` to list elements where
some might contain NAs, and others not.  `tapply` or any other group-based
aggregation comes to mind.

> My understanding is that arm64 does not support proper long doubles
> (they are the same as regular doubles).

Mine is the same.

> So code using long doubles isn't getting the hoped-for improved
> precision. Since that architecture is becoming more common we should
> probably be looking at replacing uses of long doubles with better
> algorithms that can work with regular doubles, e.g Kahan summation or
> variants for sum.

This is probably the bigger issue.  If the future is ARM/AMD, the value of
Intel x87-only optimizations becomes questionable.

More general is the question of whether to completely replace long
double with algorithmic precision methods, at the cost of performance on
systems that do support hardware long doubles (Intel or otherwise), or
whether both code pathways are kept and selected at compile time.  Or
maybe the aggregation functions gain a low-precision flag for simple
double-precision accumulation.
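The compile-time selection could be sketched roughly like this (illustrative code; `HAVE_EXTENDED_LONG_DOUBLE`, `ACCUM` and `my_sum` are made-up names, though R's own sources use a similar LDOUBLE typedef for accumulators):

```c
#include <stddef.h>

/* Accumulate in long double where the hardware provides real extended
 * precision, otherwise fall back to plain double (e.g. on arm64,
 * where long double == double anyway). */
#ifdef HAVE_EXTENDED_LONG_DOUBLE   /* hypothetical configure-time flag */
typedef long double ACCUM;
#else
typedef double ACCUM;
#endif

double my_sum(const double *x, size_t n)
{
    ACCUM s = 0.0;
    for (size_t i = 0; i < n; i++)
        s += x[i];
    return (double) s;
}
```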

I'm curious to look at the performance and precision implications of e.g.
Kahan summation if no one has done that yet.  Some quick poking around
shows people using processor-specific intrinsics to take advantage of
advanced multi-element instructions, but I imagine R would not want to do
that.  Assuming others have not done this already, I will have a look and
report back.
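For reference, textbook Kahan summation with plain doubles looks like this (a generic sketch of the algorithm, not R code):

```c
#include <stddef.h>

/* Kahan (compensated) summation: carry a running compensation term
 * that captures the low-order bits lost when each addend is folded
 * into the sum. */
double kahan_sum(const double *x, size_t n)
{
    double sum = 0.0;
    double c = 0.0;              /* accumulated rounding error */
    for (size_t i = 0; i < n; i++) {
        double y = x[i] - c;     /* re-inject previously lost bits */
        double t = sum + y;      /* low-order bits of y may be lost here */
        c = (t - sum) - y;       /* recover exactly what was lost */
        sum = t;
    }
    return sum;
}
```

Note it must be compiled without value-unsafe optimizations (no -ffast-math), or the compensation term gets optimized away.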

>> Since we're on the topic I want to point out that the default NA in R
>> starts off as a signaling NA:
>>
>> example(numToBits)   # for `bitC`
>> bitC(NA_real_)
>> ## [1] 0 111 | 
>>00100010
>> bitC(NA_real_ + 0)
>> ## [1] 0 111 | 
>>10100010
>>
>> Notice the leading bit of the significand starts off as zero, which marks
>> it as a signaling NA, but becomes 1, i.e. non-signaling, after any
>> operation[2].
>>
>> This is meaningful because the mere act of loading a signaling NA into the
>> x87 FPU is sufficient to trigger the slowdowns, even if the NA is not
>> actually used in arithmetic operations.  This happens sometimes under some
>> optimization levels.  I don't know of any benefit of starting off with a
>> signaling NA, especially since the encoding is lost pretty much as soon as
>> it is used.  If folks are interested I can provide a patch to turn the NA
>> quiet by default.
>
> In principle this might be a good idea, but the current bit pattern is
> unfortunately baked into a number of packages and documents on
> internals, as well as serialized objects. The work needed to sort that
> out is probably not worth the effort.

One reason why we might not need to sort this out is precisely the
instability shown above.  Anything that relies on the signaling bit set to
a particular value will behave differently with `NA_real_` vs
`NA_real_ + x`.  `R_IsNA` only checks the lower word, so it doesn't care
about the signaling bit or the 19 subsequent ones.  Anything that does
likely has unpredictable behavior **currently**.

Similarly, the documentation[1] only specifies the low word:

> On such platforms NA is represented by the NaN value with low-word 0x7a2
> (1954 in decimal).

This is consistent with the semantics of `R_IsNA`.
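That low-word convention can be checked with a small C snippet (illustrative code, not R's source; the bit patterns follow the IEEE-754 double layout and the documented payload 1954 = 0x7A2):

```c
#include <stdint.h>
#include <string.h>

/* Return the low 32-bit word of a double's IEEE-754 representation. */
static uint32_t low_word(double x)
{
    uint64_t u;
    memcpy(&u, &x, sizeof u);   /* well-defined type pun */
    return (uint32_t)(u & 0xFFFFFFFFu);
}

/* Build R's NA_real_ from its documented bit pattern: a NaN whose low
 * word is 1954.  High word 0x7FF00000 gives the signaling variant;
 * 0x7FF80000 (quiet bit set) gives the quiet variant. */
static double make_na(int quiet)
{
    uint64_t u = ((uint64_t)(quiet ? 0x7FF80000u : 0x7FF00000u) << 32) | 1954u;
    double d;
    memcpy(&d, &u, sizeof d);
    return d;
}
```

Both variants share the same low word, so a low-word test like R_IsNA's treats them identically.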

> It also doesn't seem to affect the performance issue here since
> setting b[1] <- NA_real_ + 0 produces the same slowdown (at least on
> my current Intel machine).

The subtlety here is that the slowdown happens by merely loading the
signaling NaN onto the x87 FPU.

Re: [Rd] R 4.1.x make check fails, stats-Ex.R, step factor reduced below minFactor

2021-10-01 Thread Andrew Piskorski
On Fri, Oct 01, 2021 at 03:45:48PM +0200, Martin Maechler wrote:

> Is there anything special (system libraries, compilers, ..)
> on your platform?

No.  As far as I know this is an ordinary SuperMicro x86-64 server,
nothing strange or unusual.  /proc/cpuinfo says "Intel(R) Xeon(R) CPU
E5-2670 0 @ 2.60GHz".

> o2 <- options(digits = 10) # more accuracy for 'trace'
> ## central differencing works here typically (PR#18165: not converging on 
> *some*):
> ctr2 <- nls.control(nDcentral=TRUE, tol = 8e-8, # <- even smaller than above
>         warnOnly = (grepl("^aarch64.*linux", R.version$platform) &&
>                     grepl("^NixOS", osVersion)))
> (nlm2 <- update(nlmod, control = ctr2, trace = TRUE)); options(o2)
> 
> ... now would that run w/o error on your Ubuntu-installed R ?

Interactively, the code above runs fine.  In fact, the original code
ALSO seems to run fine, no errors at all!  See output below.  I get
the error when running the tests via either "make check" or
tools::testInstalledPackages(scope="base"), but outside of that
testing framework it runs fine.

Ah, interactively, if I ALSO run the code for the immediately prior
test in stats-Ex.R, THEN the nlm2 code fails the same way as with
"make check".  That prior test does set.seed(27), which seems to
trigger the downstream failures.  Simply skipping the set.seed(27)
(interactively) makes the failure go away for me.  But if the
set.seed(27) is necessary, maybe the second test should be doing its
own set.seed() of some sort?

I don't know how/where to comment out that set.seed(27) to try running
tests without it.  Editing "src/library/stats/man/nls.Rd" and re-running
"make check" still does the set.seed(27).


Just run code from the single failing test, it works fine:

## R --vanilla
R version 4.1.1 Patched (2021-09-21 r80946) -- "Kick Things"
Copyright (C) 2021 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu/x86_64 (64-bit)

> Sys.setenv(LC_COLLATE = "C", LC_TIME = "C", LANGUAGE = "en")
> options(digits = 10)
> x <- -(1:100)/10
> y <- 100 + 10 * exp(x / 2) + rnorm(x)/10
> nlmod <- nls(y ~  Const + A * exp(B * x))
Warning message:
In nls(y ~ Const + A * exp(B * x)) :
  No starting values specified for some parameters.
Initializing 'Const', 'A', 'B' to '1.'.
Consider specifying 'start' or using a selfStart model

> nlm1 <- update(nlmod, control = list(tol = 1e-7))
Warning message:
In nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing 'Const', 'A', 'B' to '1.'.
Consider specifying 'start' or using a selfStart model

> nlm2 <- update(nlmod, control = list(tol = 8e-8, nDcentral=TRUE), trace=TRUE)
1017400.445(4.11e+02): par = (1 1 1)
752239.9094(1.96e+02): par = (13.41553998 1.959746504 0.01471383253)
668978.9926(1.65e+02): par = (189.3774772 -162.3882591 1.397507535)
375910.4745(1.20e+02): par = (167.1787529 -119.9960435 1.42386803)
93230.26788(5.49e+01): par = (133.8879258 -56.45697809 1.498055399)
382.9221937(2.42e+00): par = (100.6364489 6.806405333 1.84811172)
138.7915397(9.68e+00): par = (100.6763251 6.489793899 0.7564107501)
24.47843640(5.42e+00): par = (100.4024547 8.003646622 0.4918079622)
0.8056918383   (4.49e-03): par = (99.9629562 10.01549373 0.4913706525)
0.8056755692   (4.09e-06): par = (99.96295295 10.01549135 0.4914577719)
0.8056755692   (7.83e-09): par = (99.96295344 10.01549217 0.4914579487)
Warning message:
In nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing 'Const', 'A', 'B' to '1.'.
Consider specifying 'start' or using a selfStart model

> nlm1
Nonlinear regression model
  model: y ~ Const + A * exp(B * x)
   data: parent.frame()
 Const  A  B 
99.9629534 10.0154922  0.4914579 
 residual sum-of-squares: 0.8056756

Number of iterations to convergence: 10 
Achieved convergence tolerance: 1.586349e-08

> nlm2
Nonlinear regression model
  model: y ~ Const + A * exp(B * x)
   data: parent.frame()
 Const  A  B 
99.9629534 10.0154922  0.4914579 
 residual sum-of-squares: 0.8056756

Number of iterations to convergence: 10 
Achieved convergence tolerance: 7.832984e-09


Instead run BOTH these tests, now the last one fails:

## R --vanilla
R version 4.1.1 Patched (2021-09-21 r80946) -- "Kick Things"
Copyright (C) 2021 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu/x86_64 (64-bit)

## The two examples below show that you can fit a model to
## artificial data with noise but not to artificial data
## without noise.
> x <- 1:10
> y <- 2*x + 3# perfect fit
## terminates in an error, because convergence cannot be confirmed:
> try(nls(y ~ a + b*x, start = list(a = 0.12345, b = 0.54321)))
> Error in nls(y ~ a + b * x, start =

Re: [Rd] R 4.1.x make check fails, stats-Ex.R, step factor reduced below minFactor

2021-10-01 Thread Martin Maechler
> Andrew Piskorski 
> on Fri, 1 Oct 2021 05:01:39 -0400 writes:

> I recently built R 4.1.1 (Patched) from source, as I have many older
> versions over the years.  This version, on Ubuntu 18.04.4 LTS:

> R 4.1.1 (Patched), 2021-09-21, svn.rev 80946, x86_64-pc-linux-gnu

> Surprisingly, "make check" fails, which I don't recall seeing before.
> The error comes from stats-Ex.R, which unfortunately terminates all
> further testing!  This particular error, "step factor ... reduced
> below 'minFactor'" does not seem very serious, but I can't figure out
> why it's happening.

> I installed with "make install install-tests" as usual, which seemed
> to work fine.  Running the same tests after install, I'm able to get
> more coverage by using errorsAreFatal=FALSE.  However, it seems the
> rest of the 'stats' tests after the bad one still do not run.

> I'm confused about the intent of this particular test.  The comment
> above it seems to say that it's SUPPOSED to throw this error, yet
> getting the error still terminates further testing, which seems
> strange.  What's supposed to happen here?

> Any ideas on why this error might be occurring, and how I should debug
> it?  What's the right way for me to disable this one failing test, so
> the ones after it can run?

> Thanks for your help!


> ## "make check" output:
> make[1]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
> make[2]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
> make[3]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
> Testing examples for package 'base'
> Testing examples for package 'tools'
> comparing 'tools-Ex.Rout' to 'tools-Ex.Rout.save' ... OK
> Testing examples for package 'utils'
> Testing examples for package 'grDevices'
> comparing 'grDevices-Ex.Rout' to 'grDevices-Ex.Rout.save' ... OK
> Testing examples for package 'graphics'
> comparing 'graphics-Ex.Rout' to 'graphics-Ex.Rout.save' ... OK
> Testing examples for package 'stats'
> Error: testing 'stats' failed
> Execution halted
> Makefile:37: recipe for target 'test-Examples-Base' failed
> make[3]: *** [test-Examples-Base] Error 1
> make[3]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
> ../../tests/Makefile.common:198: recipe for target 'test-Examples' failed
> make[2]: *** [test-Examples] Error 2
> make[2]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
> ../../tests/Makefile.common:184: recipe for target 'test-all-basics' failed
> make[1]: *** [test-all-basics] Error 1
> make[1]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
> Makefile:305: recipe for target 'check-all' failed
> make: *** [check-all] Error 2


> ## From file:  tests/Examples/stats-Ex.Rout.fail

>> ## Here, requiring close convergence, you need to use more accurate numerical
>> ##  differentiation; this gives Error: "step factor .. reduced below 'minFactor' .."
>> options(digits = 10) # more accuracy for 'trace'
>> ## IGNORE_RDIFF_BEGIN
>> try(nlm1 <- update(nlmod, control = list(tol = 1e-7))) # where central diff. work here:
> Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
> No starting values specified for some parameters.
> Initializing 'Const', 'A', 'B' to '1.'.
> Consider specifying 'start' or using a selfStart model

So this did give an error we expected (on some platforms only),
hence used try().

However, the next one "should work" (*)
and failing there, *does* fail the tests :

>> (nlm2 <- update(nlmod, control = list(tol = 8e-8, nDcentral=TRUE), trace=TRUE))
> Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
> No starting values specified for some parameters.
> Initializing 'Const', 'A', 'B' to '1.'.
> Consider specifying 'start' or using a selfStart model
> 1017460.306(4.15e+02): par = (1 1 1)
> 758164.7503(2.34e+02): par = (13.42031396 1.961485 0.05947543745)
> 269506.3538(3.23e+02): par = (51.75719816 -13.09155957 0.8428607709)
> 68969.21893(1.03e+02): par = (76.0006985 -1.935226745 1.0190858)
> 633.3672230(1.29e+00): par = (100.3761515 8.624648402 5.104490259)
> 151.4400218(9.39e+00): par = (100.6344391 4.913490985 0.2849209569)
> 53.08739850(7.24e+00): par = (100.6830407 6.899303317 0.4637755074)
> 1.344478640(5.97e-01): par = (100.0368306 9.897714142 0.5169294939)
> 0.9908415909   (1.55e-02): par = (100.0300625 9.9144191 0.5023516843)
> 0.9906046057   (1.84e-05): par = (100.0288724 9.916224018 0.5025207336)
> 0.9906046054   (9.95e-08): par = (100.028875 9.916228366 0.50252165)
> 0.9906046054   (9.93e-08): par = (100.028875 9

Re: [Rd] R 4.1.x make check fails, stats-Ex.R, step factor reduced below minFactor

2021-10-01 Thread peter dalgaard
On Mac, I also don't get the error, but I think this is a different issue.

If I remember correctly, the error is known to occur on some platforms, but not 
all, which is the reason for the ## IGNORE_RDIFF_BEGIN ... END.

However _when_ it occurs, R should just print the error and continue. If it 
doesn't, something is up. One possible reason is that something has been 
playing with options(error=), e.g. in a start-up file. 

-pd


> On 1 Oct 2021, at 12:03 , Sebastian Meyer  wrote:
> 
> For what it's worth, make check runs OK for me with sessionInfo()
> 
> R version 4.1.1 Patched (2021-09-30 r80997)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 18.04.6 LTS
> 
> Matrix products: default
> BLAS:   /home/smeyer/R/base/release/build/lib/libRblas.so
> LAPACK: /home/smeyer/R/base/release/build/lib/libRlapack.so
> 
> The output of these examples is:
> 
>>> try(nlm1 <- update(nlmod, control = list(tol = 1e-7))) # where central diff. work here:
>> Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
>>  No starting values specified for some parameters.
>> Initializing ‘Const’, ‘A’, ‘B’ to '1.'.
>> Consider specifying 'start' or using a selfStart model
>> Error in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  : 
>>   step factor 0.000488281 reduced below 'minFactor' of 0.000976562
>>>   (nlm2 <- update(nlmod, control = list(tol = 8e-8, nDcentral=TRUE), trace=TRUE))
>> Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
>>  No starting values specified for some parameters.
>> Initializing ‘Const’, ‘A’, ‘B’ to '1.'.
>> Consider specifying 'start' or using a selfStart model
>> 1017460.306(4.15e+02): par = (1 1 1)
>> 758164.7503(2.34e+02): par = (13.42031396 1.961485 0.05947543745)
>> 269506.3537(3.23e+02): par = (51.75719817 -13.09155958 0.8428607712)
>> 68969.21891(1.03e+02): par = (76.0006985 -1.93522675 1.0190858)
>> 633.3672224(1.29e+00): par = (100.3761515 8.624648408 5.104490252)
>> 151.4400170(9.39e+00): par = (100.6344391 4.913490999 0.2849209664)
>> 53.08739445(7.24e+00): par = (100.6830407 6.899303393 0.4637755095)
>> 1.344478582(5.97e-01): par = (100.0368306 9.897714144 0.5169294926)
>> 0.9908415908   (1.55e-02): par = (100.0300625 9.9144191 0.5023516843)
>> 0.9906046057   (1.84e-05): par = (100.0288724 9.916224018 0.5025207336)
>> 0.9906046054   (9.94e-08): par = (100.028875 9.916228366 0.50252165)
>> 0.9906046054   (5.00e-08): par = (100.028875 9.916228377 0.5025216525)
>> Nonlinear regression model
>>  model: y ~ Const + A * exp(B * x)
>>   data: parent.frame()
>>       Const           A           B 
>> 100.0288750   9.9162284   0.5025217 
>> residual sum-of-squares: 0.9906046
> 
> Running with example(nls) in an interactive session gives the extra output
> 
>> Number of iterations to convergence: 11
>> Achieved convergence tolerance: 4.996813e-08
> 
> (when the "show.nls.convergence" option is not set to FALSE. It is set to 
> FALSE in SSasymp.Rd but not reset at the end.)
> 
> Best regards,
> 
>   Sebastian
> 
> 
> Am 01.10.21 um 11:01 schrieb Andrew Piskorski:
>> I recently built R 4.1.1 (Patched) from source, as I have many older
>> versions over the years.  This version, on Ubuntu 18.04.4 LTS:
>>   R 4.1.1 (Patched), 2021-09-21, svn.rev 80946, x86_64-pc-linux-gnu
>> Surprisingly, "make check" fails, which I don't recall seeing before.
>> The error comes from stats-Ex.R, which unfortunately terminates all
>> further testing!  This particular error, "step factor ... reduced
>> below 'minFactor'" does not seem very serious, but I can't figure out
>> why it's happening.
>> I installed with "make install install-tests" as usual, which seemed
>> to work fine.  Running the same tests after install, I'm able to get
>> more coverage by using errorsAreFatal=FALSE.  However, it seems the
>> rest of the 'stats' tests after the bad one still do not run.
>> I'm confused about the intent of this particular test.  The comment
>> above it seems to say that it's SUPPOSED to throw this error, yet
>> getting the error still terminates further testing, which seems
>> strange.  What's supposed to happen here?
>> Any ideas on why this error might be occurring, and how I should debug
>> it?  What's the right way for me to disable this one failing test, so
>> the ones after it can run?
>> Thanks for your help!
>> ## "make check" output:
>> make[1]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
>> make[2]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
>> make[3]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
>> Testing examples for package 'base'
>> Testing examples for package 'tools'
>>   comparing 'tools-Ex.Rout' to 'tools-Ex.Rout.save' ... OK
>> Testing examples for package 'utils'
>> Testing examples for package 'grDevices'
>>   comparing 'grDevices-Ex.Rout' to 'grDevices-Ex.Rou

Re: [Rd] translation domain is not inferred correctly from a package's print methods -- intended behavior?

2021-10-01 Thread Martin Maechler
> Michael Chirico 
> on Mon, 12 Jul 2021 14:21:14 -0700 writes:

> Here is a reprex:


> # initialize reprex package
> cd /tmp
> mkdir myPkg && cd myPkg
> echo "Package: myPkg" > DESCRIPTION
> echo "Version: 0.0.1" >> DESCRIPTION
> mkdir R
> echo "print.my_class = function(x, ...) { cat(gettext(\"'%s' is deprecated.\"), '\n', gettext(\"'%s' is deprecated.\", domain='R-myPkg'), '\n') }" > R/foo.R
> echo "S3method(print, my_class)" > NAMESPACE
> # extract string for translation
> Rscript -e "tools::update_pkg_po('.')"
> # add dummy translation
> msginit -i po/R-myPkg.pot -o po/R-ja.po -l ja --no-translator
> head -n -1 po/R-ja.po > tmp && mv tmp po/R-ja.po
> echo 'msgstr "%s successfully translated"' >> po/R-ja.po
> # install .mo translations
> Rscript -e "tools::update_pkg_po('.')"
> # install package & test
> R CMD INSTALL .
> LANGUAGE=ja Rscript -e "library(myPkg); print(structure(1, class = 'my_class'))"
> #  '%s' は廃止予定です   (Japanese: "'%s' is deprecated.")
> #  %s successfully translated

Trying to see if the current "R-devel trunk" would still suffer
from this, and prompted by Suharto Anggono's suggestion on R's
bugzilla,   https://bugs.r-project.org/show_bug.cgi?id=17998#c24


I've finally started looking at this ..
(Not having a Japanese locale installed though).

> Note that the first gettext() call, which doesn't supply domain=,
> returns the corresponding translation from base R (i.e., the output is
> the same as gettext("'%s' is deprecated.", domain="R-base")).

I don't see this (not having a Japanese locale?  Should I try
with a locale I have installed?)

> The second gettext() call, where domain= is supplied, returns our
> dummy translation, which is what I would have expected from the first
> execution.

I can get the following which seems to say that everything is
fine and fixed now, right?

MM@lynne:myPkg$ LANGUAGE=ja R-devel -s --vanilla -e 
'library(myPkg,lib.loc="~/R/library/64-linux-MM-only");structure(1,class="my_class");R.version.string'
%s successfully translated 
 %s successfully translated 
[1] "R Under development (unstable) (2021-09-30 r80997)"


MM@lynne:myPkg$ LANGUAGE=ja `R-devel RHOME`/bin/Rscript --vanilla -e 
'library(myPkg,lib.loc="~/R/library/64-linux-MM-only");structure(1,class="my_class");R.version.string'
%s successfully translated 
 %s successfully translated 
[1] "R Under development (unstable) (2021-09-30 r80997)"


Note: During my experiments, I also observe some things confusing to me when
using Rscript and R from the command line... in some cases
getting errors (in Japanese) ... but that may just be in those
cases where I have left a space in the string
((in the case of 'R', which in my case suffers from quoting hell
  because I use wrapper sh-scripts to call my versions of R ... ))


> Here is what's in ?gettext:

>> If domain is NULL or "", and gettext or ngettext is called from a
>> function in the namespace of package pkg the domain is set to "R-pkg".
>> Otherwise there is no default domain.


> Does that mean the S3 print method is not "in the namespace of myPkg"?

no.

> Or is there a bug here?

Yes, rather;  or there *was* one.

Thanks a lot, Michael!

Best,
Martin



Re: [Rd] R 4.1.x make check fails, stats-Ex.R, step factor reduced below minFactor

2021-10-01 Thread Sebastian Meyer

For what it's worth, make check runs OK for me with sessionInfo()

R version 4.1.1 Patched (2021-09-30 r80997)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 18.04.6 LTS

Matrix products: default
BLAS:   /home/smeyer/R/base/release/build/lib/libRblas.so
LAPACK: /home/smeyer/R/base/release/build/lib/libRlapack.so

The output of these examples is:


try(nlm1 <- update(nlmod, control = list(tol = 1e-7))) # where central diff. work here:

Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing ‘Const’, ‘A’, ‘B’ to '1.'.
Consider specifying 'start' or using a selfStart model
Error in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  : 
  step factor 0.000488281 reduced below 'minFactor' of 0.000976562

   (nlm2 <- update(nlmod, control = list(tol = 8e-8, nDcentral=TRUE), trace=TRUE))

Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing ‘Const’, ‘A’, ‘B’ to '1.'.
Consider specifying 'start' or using a selfStart model
1017460.306(4.15e+02): par = (1 1 1)
758164.7503(2.34e+02): par = (13.42031396 1.961485 0.05947543745)
269506.3537(3.23e+02): par = (51.75719817 -13.09155958 0.8428607712)
68969.21891(1.03e+02): par = (76.0006985 -1.93522675 1.0190858)
633.3672224(1.29e+00): par = (100.3761515 8.624648408 5.104490252)
151.4400170(9.39e+00): par = (100.6344391 4.913490999 0.2849209664)
53.08739445(7.24e+00): par = (100.6830407 6.899303393 0.4637755095)
1.344478582(5.97e-01): par = (100.0368306 9.897714144 0.5169294926)
0.9908415908   (1.55e-02): par = (100.0300625 9.9144191 0.5023516843)
0.9906046057   (1.84e-05): par = (100.0288724 9.916224018 0.5025207336)
0.9906046054   (9.94e-08): par = (100.028875 9.916228366 0.50252165)
0.9906046054   (5.00e-08): par = (100.028875 9.916228377 0.5025216525)
Nonlinear regression model
  model: y ~ Const + A * exp(B * x)
   data: parent.frame()
  Const   A   B 
100.0288750   9.9162284   0.5025217 
 residual sum-of-squares: 0.9906046


Running with example(nls) in an interactive session gives the extra output

Number of iterations to convergence: 11 
Achieved convergence tolerance: 4.996813e-08


(when the "show.nls.convergence" option is not set to FALSE. It is set 
to FALSE in SSasymp.Rd but not reset at the end.)


Best regards,

Sebastian


Am 01.10.21 um 11:01 schrieb Andrew Piskorski:

I recently built R 4.1.1 (Patched) from source, as I have many older
versions over the years.  This version, on Ubuntu 18.04.4 LTS:

   R 4.1.1 (Patched), 2021-09-21, svn.rev 80946, x86_64-pc-linux-gnu

Surprisingly, "make check" fails, which I don't recall seeing before.
The error comes from stats-Ex.R, which unfortunately terminates all
further testing!  This particular error, "step factor ... reduced
below 'minFactor'" does not seem very serious, but I can't figure out
why it's happening.

I installed with "make install install-tests" as usual, which seemed
to work fine.  Running the same tests after install, I'm able to get
more coverage by using errorsAreFatal=FALSE.  However, it seems the
rest of the 'stats' tests after the bad one still do not run.

I'm confused about the intent of this particular test.  The comment
above it seems to say that it's SUPPOSED to throw this error, yet
getting the error still terminates further testing, which seems
strange.  What's supposed to happen here?

Any ideas on why this error might be occurring, and how I should debug
it?  What's the right way for me to disable this one failing test, so
the ones after it can run?

Thanks for your help!


## "make check" output:
make[1]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
make[2]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
make[3]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
Testing examples for package 'base'
Testing examples for package 'tools'
   comparing 'tools-Ex.Rout' to 'tools-Ex.Rout.save' ... OK
Testing examples for package 'utils'
Testing examples for package 'grDevices'
   comparing 'grDevices-Ex.Rout' to 'grDevices-Ex.Rout.save' ... OK
Testing examples for package 'graphics'
   comparing 'graphics-Ex.Rout' to 'graphics-Ex.Rout.save' ... OK
Testing examples for package 'stats'
Error: testing 'stats' failed
Execution halted
Makefile:37: recipe for target 'test-Examples-Base' failed
make[3]: *** [test-Examples-Base] Error 1
make[3]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
../../tests/Makefile.common:198: recipe for target 'test-Examples' failed
make[2]: *** [test-Examples] Error 2
make[2]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
../../tests/Makefile.common:184: recipe for target 'test-all-basics' failed
make[1]: *** [test-all-basics] Error 1
make[1]: Leaving d

[Rd] R 4.1.x make check fails, stats-Ex.R, step factor reduced below minFactor

2021-10-01 Thread Andrew Piskorski
I recently built R 4.1.1 (Patched) from source, as I have many older
versions over the years.  This version, on Ubuntu 18.04.4 LTS:

  R 4.1.1 (Patched), 2021-09-21, svn.rev 80946, x86_64-pc-linux-gnu

Surprisingly, "make check" fails, which I don't recall seeing before.
The error comes from stats-Ex.R, which unfortunately terminates all
further testing!  This particular error, "step factor ... reduced
below 'minFactor'" does not seem very serious, but I can't figure out
why it's happening.

I installed with "make install install-tests" as usual, which seemed
to work fine.  Running the same tests after install, I'm able to get
more coverage by using errorsAreFatal=FALSE.  However, it seems the
rest of the 'stats' tests after the bad one still do not run.

I'm confused about the intent of this particular test.  The comment
above it seems to say that it's SUPPOSED to throw this error, yet
getting the error still terminates further testing, which seems
strange.  What's supposed to happen here?

Any ideas on why this error might be occurring, and how I should debug
it?  What's the right way for me to disable this one failing test, so
the ones after it can run?

Thanks for your help!


## "make check" output:
make[1]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
make[2]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
make[3]: Entering directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
Testing examples for package 'base'
Testing examples for package 'tools'
  comparing 'tools-Ex.Rout' to 'tools-Ex.Rout.save' ... OK
Testing examples for package 'utils'
Testing examples for package 'grDevices'
  comparing 'grDevices-Ex.Rout' to 'grDevices-Ex.Rout.save' ... OK
Testing examples for package 'graphics'
  comparing 'graphics-Ex.Rout' to 'graphics-Ex.Rout.save' ... OK
Testing examples for package 'stats'
Error: testing 'stats' failed
Execution halted
Makefile:37: recipe for target 'test-Examples-Base' failed
make[3]: *** [test-Examples-Base] Error 1
make[3]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests/Examples'
../../tests/Makefile.common:198: recipe for target 'test-Examples' failed
make[2]: *** [test-Examples] Error 2
make[2]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
../../tests/Makefile.common:184: recipe for target 'test-all-basics' failed
make[1]: *** [test-all-basics] Error 1
make[1]: Leaving directory '/home/nobackup/co/R/R-4-1-branch/Build-x86_64/tests'
Makefile:305: recipe for target 'check-all' failed
make: *** [check-all] Error 2


## From file:  tests/Examples/stats-Ex.Rout.fail

> ## Here, requiring close convergence, you need to use more accurate numerical
> ##  differentiation; this gives Error: "step factor .. reduced below 'minFactor' .."
> options(digits = 10) # more accuracy for 'trace'
> ## IGNORE_RDIFF_BEGIN
> try(nlm1 <- update(nlmod, control = list(tol = 1e-7))) # where central diff. work here:
Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing 'Const', 'A', 'B' to '1.'.
Consider specifying 'start' or using a selfStart model
> (nlm2 <- update(nlmod, control = list(tol = 8e-8, nDcentral=TRUE), trace=TRUE))
Warning in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  :
  No starting values specified for some parameters.
Initializing 'Const', 'A', 'B' to '1.'.
Consider specifying 'start' or using a selfStart model
1017460.306(4.15e+02): par = (1 1 1)
758164.7503(2.34e+02): par = (13.42031396 1.961485 0.05947543745)
269506.3538(3.23e+02): par = (51.75719816 -13.09155957 0.8428607709)
68969.21893(1.03e+02): par = (76.0006985 -1.935226745 1.0190858)
633.3672230(1.29e+00): par = (100.3761515 8.624648402 5.104490259)
151.4400218(9.39e+00): par = (100.6344391 4.913490985 0.2849209569)
53.08739850(7.24e+00): par = (100.6830407 6.899303317 0.4637755074)
1.344478640(5.97e-01): par = (100.0368306 9.897714142 0.5169294939)
0.9908415909   (1.55e-02): par = (100.0300625 9.9144191 0.5023516843)
0.9906046057   (1.84e-05): par = (100.0288724 9.916224018 0.5025207336)
0.9906046054   (9.95e-08): par = (100.028875 9.916228366 0.50252165)
0.9906046054   (9.93e-08): par = (100.028875 9.916228366 0.50252165)
Error in nls(formula = y ~ Const + A * exp(B * x), algorithm = "default",  : 
  step factor 0.000488281 reduced below 'minFactor' of 0.000976562
Calls: update -> update.default -> eval -> eval -> nls
Execution halted


## After install, start R with --vanilla and run tests like this:
## 
https://cran.r-project.org/doc/manuals/r-patched/R-admin.html#Testing-a-Unix_002dalike-Installation
Sys.setenv(LC_COLLATE = "C", LC_TIME = "C", LANGUAGE = "en")
pdf("tests.pdf")
tools::testInstalledPackages(scope="base", errorsAreFatal=FALSE)

-- 
Andrew Piskorski 
