Re: [R-pkg-devel] Rdmacros as Suggests rather than Imports

2021-01-23 Thread Ivan Krylov
On Sat, 23 Jan 2021 04:54:51 -0500
Duncan Murdoch  wrote:

> I wonder if there's a way to define stubs for the macros in the
> package, and use mathjaxr versions only conditional on having it
> available?

I had an idea along the lines of

\Sexpr[results=rd]{if (...) '\\newcommand{\\mjeqn}{...}'}

...but \Sexpr seems to be expanded too late in the Rd processing
pipeline for the \newcommand to have any effect. This also seems to
mean that it's impossible to dynamically load macro definitions
belonging to a different package from a \Sexpr:

% doesn't affect the rest of the document
 if (nzchar(f <- system.file('help/macros/mathjax.Rd', package = 'mathjaxr')))

An "optional mathjaxr" implementation will probably end up with its own
copy of man/macros/mathjax.Rd (but not the rest of MathJax), with \mjeqn
and friends defined to check whether the mathjaxr package is available
before outputting MathJax markup or falling back to \eqn.

Loading the rest of the .Rd in the same \Sexpr where \newcommand{} is
evaluated (1) doesn't work because only one Rd section per \Sexpr is
supported and (2) would be a truly horrible hack if it worked.
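
A concrete stub (the file name is made up, and the real mathjaxr macros
take a LaTeX argument plus an ASCII fallback argument) could be as
simple as:

```
% man/macros/mathjax-fallback.Rd (hypothetical file name)
% Degenerate stubs: always fall back to plain \eqn. The conditional
% version would wrap a MathJax branch in an availability check.
\newcommand{\mjeqn}{\eqn{#1}{#2}}
\newcommand{\mjseqn}{\eqn{#1}}
```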

Best regards,

__ mailing list

Re: [R-pkg-devel] Sudden error on r-devel-windows-ix86+x86_64

2021-01-14 Thread Ivan Krylov
On Thu, 14 Jan 2021 12:41:02 +0100
Helmut Schütz  wrote:

> Any ideas what might be the reason?

This happened yesterday to one of my packages, with a similarly cryptic
error page (just an "r" instead of a log), then the problem resolved
itself by today. Most likely, there were some transient problems on the
test machine.

Best regards,


Re: [R-pkg-devel] [Re] warning: type of ‘zhpevx_’ does not match original declaration [-Wlto-type-mismatch]

2020-12-17 Thread Ivan Krylov
Dear Pierre L.,

I think that the zhpevxC wrapper, as written, may result in undefined
behaviour:

>const char *JOBZ = jobz[0];

>delete[] JOBZ;

>delete[] Cap;

This could work okay, depending on how the rest of the package is
written, but in general, it is considered a bad idea for linear algebra
routines to deallocate memory they didn't allocate. ("Pointer
ownership is usually retained by the calling code.")

May I suggest once again the idea of writing a Fortran 2003 wrapper
zhpevxC instead of C++? Subroutines defined using iso_c_binding are
guaranteed to follow the C calling convention, and, this being Fortran,
call zhpevx(...) is guaranteed to match the Fortran calling convention,
bringing you the best of both worlds:

No need to allocate or deallocate memory or to provide different
definitions depending on the availability of FC_LEN_T; just make sure
that both prototypes mean the same thing. By the way,
std::complex<double> is guaranteed to match the memory layout of the C
type double _Complex and the Fortran type
complex(kind = c_double_complex) by the respective standards.

Best regards,


Re: [R-pkg-devel] FW: [CRAN-pretest-archived] CRAN submission valmetrics 1.0.0

2020-12-16 Thread Ivan Krylov
On Wed, 16 Dec 2020 07:12:04 +
Kristin Piikki  wrote:

> Check: PDF version of manual, Result: WARNING
>   LaTeX errors when creating PDF version.
>   This typically indicates Rd problems.
>   LaTeX errors found:
>   ! Package inputenc Error: Unicode char ‐ (U+2010)
>   (inputenc)not set up for use with LaTeX.

This means that some of your man/*.Rd files contain UTF-8 characters
that LaTeX isn't ready to accept. Use tools::showNonASCIIfile on those
files to find out where the non-ASCII hyphen is hiding.
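
For example, a quick loop over the manual (assuming the standard man/
layout; including sub-directories such as man/macros is harmless):

```r
# list every Rd file under man/ and report non-ASCII bytes with context
rd_files <- list.files("man", pattern = "\\.Rd$",
                       recursive = TRUE, full.names = TRUE)
for (f in rd_files) tools::showNonASCIIfile(f)
```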

Does catch this problem when you
submit your package there, too?

Best regards,


Re: [R-pkg-devel] Formatting .Rbuildignore

2020-08-17 Thread Ivan Krylov
On Mon, 17 Aug 2020 15:56:07 +0200
Thierry Onkelinx  wrote:

> Can we add blank lines in .Rbuildignore? Or lines with only comments
> (line starting with #)?

.Rbuildignore is not documented [1] to allow comments, but the
current implementation does skip empty lines [2] since 2010 [3]
(otherwise empty regular expressions would match all file names).

It could be possible to get away with comment lines, since the
resulting regular expressions are unlikely to ever match anything, but
it would cost CPU cycles to match every file name against every
regular expression, and some regular expressions may turn out to be
really expensive on some inputs [4].

I think that it's best not to try to add comments to .Rbuildignore, and
to avoid blank lines unless not having them becomes really inconvenient.
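
In other words, a safe .Rbuildignore is one Perl-compatible regular
expression per line and nothing else (the entries below are
illustrative):

```
^\.travis\.yml$
^.*\.Rproj$
^\.Rproj\.user$
^revdep$
```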

Best regards,






Re: [R-pkg-devel] How to retrieve a flag set in (filled in during package installation in an R or C++ script ?

2020-08-17 Thread Ivan Krylov
On Sat, 15 Aug 2020 19:50:41 +0530
Akshit Achara  wrote:

> To access these files, I need to use the path of libminizinc (which
> can change per installation). I want to extract this path from either
> Makevars or configure to use it in my package. 

Just as Makevars is generated during the ./configure run,
you could generate a config.h and substitute all the
necessary #defines in it. This is how GNU autoconf is typically used in
stand-alone programs [*].

A simpler option would be to add an equivalent of
-DMZN_PATH='"@MZN_PATH@"' to PKG_CPPFLAGS and make sure
that AC_SUBST is called for that variable [**]. Then
the C or C++ code would be able to use MZN_PATH as if it was #defined
in a header file.
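
A sketch of that second option, with the file names and the MZN_PATH
variable assumed rather than taken from the actual package:

```
# configure.ac (excerpt): detect the path, then export it
MZN_PATH=${MZN_PATH:-/usr/local/lib/libminizinc}
AC_SUBST(MZN_PATH)
AC_CONFIG_FILES([src/Makevars])
AC_OUTPUT

# src/Makevars.in (excerpt): ./configure rewrites @MZN_PATH@
PKG_CPPFLAGS = -DMZN_PATH='"@MZN_PATH@"'
```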

Best regards,




Re: [R-pkg-devel] Etiquette for package submissions that do not automatically pass checks?

2020-08-17 Thread Ivan Krylov
Dear Cesko,

On Fri, 14 Aug 2020 21:08:55 +0200
Cesko Voeten  wrote:

> The package contains functionality to run on cluster nodes that were
> set up by the user and needs to access its own internal functions
> from there.

Apologies for derailing the thread, but I had a similar problem a few
months ago [*], found what looks like a different solution but did not
have time to investigate it further.

Given that serialize() does not send package namespaces over the wire
[**], why would it be a bad idea to pass actual functions (instead of
character strings naming functions) to parallel::parLapply and friends?
This seems to avoid the need to export the worker functions or use :::
in calls to parallel functions from package functions. Unless I am
missing something, which I probably am.
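
A stand-alone sketch of what I mean (in a package, `worker` would be an
unexported function living in the package namespace rather than in the
global environment):

```r
library(parallel)

cl <- makeCluster(2)

# not exported anywhere; the closure itself is shipped to the workers,
# so no clusterExport() or ::: tricks are needed
worker <- function(x) x^2

res <- parLapply(cl, 1:4, worker)
stopCluster(cl)

unlist(res)  # 1 4 9 16
```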

Best regards,



"Package and namespace environments are written with pseudo-SEXPTYPEs
followed by the name."


Re: [R-pkg-devel] os/x compiled w/ openmp?

2020-07-21 Thread Ivan Krylov
On Mon, 13 Jul 2020 10:14:14 -0400
Joshua N Pritikin  wrote:

> Is this the best place to ask?

R-SIG-Mac [*] is probably a better place for this. The short answer is
that OpenMP support has been dropped from the compiler supplied with
macOS, but there are workarounds [**].

Best regards,




Re: [R-pkg-devel] "non-ASCII input" and "--data-compress" ignored

2020-07-20 Thread Ivan Krylov
On Fri, 17 Jul 2020 18:08:24 -0500
Spencer Graves  wrote:

>    I tried escaping "%" every time it occurred without success,
> but adding "\encoding{UTF-8}" as the 4th line of
> nuclearWeaponStates.Rd eliminated that problem.

Glad it works for you, but you might want to check that the link still
leads to the correct URL in the PDF output. In particular, the
following .Rd file:

\href{ Reports/Ref 0072 -
North Korea’s Nuclear Program .pdf}{Derek Bolton (2012) North Korea's
Nuclear Program}

produces two working links when processed with R CMD Rdconv -t html,
but when I process it using R CMD Rd2pdf, the first link gets a
\T1\textquoteright instead of ’ in its URL, which makes it invalid. This
could be a LaTeX problem on my part, of course.

> I also tried loading and resaving all the files in the data
> directory.

You can also try using tools::resaveRdaFiles with various parameters if
you are interested.
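
For example, to force the strongest (and slowest) compression method:

```r
# recompress every .rda under data/, using xz at a high level;
# the serialization version is left at its default
tools::resaveRdaFiles("data", compress = "xz", compression_level = 9)
```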

> at least the first of the resulting *.rda files was corrupted

This sounds like trying to load a version-3 *.rda file (implemented in
3.5.0, default since 3.6.0) using an older version of R. Or a possible
indicator of a hardware problem.

Best regards,


Re: [R-pkg-devel] Note: information on .o files is not available / Found '_exit', possibly from '_exit' (C)

2020-07-17 Thread Ivan Krylov
On Fri, 17 Jul 2020 11:25:40 +0200
Fabio Sigrist  wrote:

> Found '_exit', possibly from '_exit' (C)
> Found 'abort', possibly from 'abort' (C), 'runtime' (Fortran)
> Found 'exit', possibly from 'exit' (C), 'stop' (Fortran)
> Found 'printf', possibly from 'printf' (C)

A curious thing is that it seems to only happen on Windows. I tried
searching objdump -D output for the offending functions by means of a
Perl5 one-liner [1] and found that the functions are all called by
various bits of libgcc, OpenMP, C and C++ runtimes. This is probably
caused by linking your R package with -static-libstdc++ (which is set
in CMakeLists.txt if(WIN32 AND MINGW)).

> Note that the shared library is compiled using install.libs.R (this
> is a deliberate choice)

Some of the things it does aren't very portable (e.g. manually setting
most -W flags is discouraged). In particular, I had to manually remove
-Wno-error=cast-function-type from CMakeLists.txt to build the package
with GCC 6.3.0-18+deb9u1, despite it passing the version check.

Best regards,

[1] objdump -D lib_gpboost.dll | \
 perl -lnE'
  $fn = $1 if /^[0-9a-f]+ +<([^>]+)>:/;
  say $fn if /call.*<\Q$call\E/
 ' -s -- -call=abort | c++filt


Re: [R-pkg-devel] "non-ASCII input" and "--data-compress" ignored

2020-07-17 Thread Ivan Krylov
On Fri, 17 Jul 2020 02:02:36 -0500
Spencer Graves  wrote:

> If I copy this URL into a browser and back out again, I get 
> the following:
>    However, if I use this inside "\href", "R CMD check" doesn't 
> recognize the close curly bracket because of the presence of the 
> non-ASCII characters.

WRE section 2.3 [*] provides an example of \href with RFC3986
percent-encoding. Since % is a comment character in Rd, the percent
signs have to be escaped with backslashes:

Bolton (2012) North Korea's Nuclear Program}

This only works correctly in R >= 3.1.3, but results in correct output
in both HTML and PDF formats.
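
An illustrative version of such an escaped link (the example.org URL is
invented; only the escaping pattern matters, with each % written as \%
and the U+2019 apostrophe percent-encoded as %E2%80%99):

```
\href{https://example.org/Ref\%200072\%20-\%20North\%20Korea\%E2\%80\%99s\%20Nuclear\%20Program.pdf}{Derek
Bolton (2012) North Korea's Nuclear Program}
```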

Alternatively, it should be possible to declare the encoding of the Rd
file using \encoding{UTF-8} (WRE 2.14 [**]), but in my tests (R 3.6.3,
could have been fixed in later versions) it results in a broken link in
Rd2pdf output.

>    I'm getting, " Note: significantly better compression could be 
> obtained by using R CMD build --resave-data".  I get this message
> even though I use "R CMD build --data-compress Ecdat".  I also tried
> "R CMD build Ecdat --data-compress" and got the same result.

The note suggests adding --resave-data to R CMD build, not
--data-compress. What happens if you use --resave-data=best?
--data-compress doesn't seem to be an R CMD build option; at least
it's not mentioned in R CMD build --help.

WRE 1.1.6 [***] provides an example of --data-compress as an option of
R CMD INSTALL (not build).
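
That is, the two options belong to different commands; a sketch of both
invocations:

```
R CMD build --resave-data=best Ecdat              # build-time recompression
R CMD INSTALL --data-compress=xz Ecdat_*.tar.gz   # INSTALL-time option
```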

Best regards,


>   [[alternative HTML version deleted]]

Please don't post in HTML.





Re: [R-pkg-devel] Valgrind warning on saveRDS, about object in external pointer

2020-07-15 Thread Ivan Krylov
On Wed, 08 Jul 2020 22:43:13 +0300
David Cortes  wrote:

> About the source code: it actually complains about line
> fit_model.cpp:751 :
> hplane_root->reserve(exp_nodes);

My fault. I was reading the GitHub source code instead of CRAN package
source code.

> I’m not able to reproduce the warning when trying R CMD check with
> valgrind on my computer (tried compiling with gcc9 and clang9), nor
> with the r-debug docker images from github (

My current system is amd64 Debian 9.12 (gcc 6.3.0-18+deb9u1) with R
3.6.3-1~stretchcran.0. I have built a package from the current GitHub
source of the package and ran R -d 'valgrind --track-origins=yes'
--vanilla < isotree-Ex.R. This resulted in multiple Valgrind warnings
about accessing uninitialised values created by heap allocations at
fit_model.cpp:783 and fit_model.cpp:788.

The first such error can be traced to the following pointer:

(gdb) frame 2
#2  cereal::BinaryOutputArchive::saveBinary (this=<optimised out>, 
data=data@entry=0x14a3ad40, size=size@entry=8)
67  auto const writtenSize = static_cast<std::streamsize>( 
itsStream.rdbuf()->sputn( reinterpret_cast<const char*>( data ), size ) );
(gdb) p data
$19 = (const void *) 0x14a3ad40
(gdb) frame 41
#41 0x19d8ce38 in fit_model (X_num=..., X_cat=..., ncat=..., Xc=..., 
Xc_ind=..., Xc_indptr=..., sample_weights=..., col_weights=..., nrows=100,
ncols_numeric=2, ncols_categ=0, ndim=2, ntry=3, coef_type=..., 
coef_by_prop=false, with_replacement=false, weight_as_sample=true, 
sample_size=100, ntrees=3,
max_depth=7, limit_depth=false, penalize_range=true, calc_dist=false, 
standardize_dist=true, sq_dist=false, calc_depth=false, standardize_depth=true,
weigh_by_kurt=false, prob_pick_by_gain_avg=prob_pick_by_gain_avg@entry=0, 
prob_split_by_gain_pl=prob_split_by_gain_pl@entry=0, min_gain=min_gain@entry=0, 
new_cat_action=..., missing_action=..., all_perm=false, 
build_imputer=false, output_imputations=false, min_imp_obs=3, depth_imp=..., 
random_seed=1, nthreads=1) at Rwrapper.cpp:320
320 serialized_obj  =  serialize_cpp_obj(ext_model_ptr.get());
(gdb) p ext_model_ptr->hplanes[0][2].remainder
$20 = (double *) 0x14a3ad40

(In some stack frames gdb helpfully says that the pointer is optimised
out, but in others it can be accessed.)

I grepped the source code for 'remainder =' and found one assignment to
hplanes.back().remainder in extended.cpp:507. Could
hplanes.emplace_back() happen without a corresponding assignment to
remainder?

Best regards,


Re: [R-pkg-devel] Valgrind warning on saveRDS, about object in external pointer

2020-07-08 Thread Ivan Krylov
On Wed, 08 Jul 2020 19:23:41 +0300
David Cortes  wrote:

>- The warning is about an un-initialized value allocated in a call to
>C++ std::vector::reserve, which is called on a C++ vector member of the
>struct in the external pointer.

I'm ready to admit that I didn't read the code well enough, but it
seems to me that the vector resized on line 752 resides in
std::vector worker_memory allocated on lines 411-415 in
int fit_iforest(...) and then disposed of by the end of the function. I
don't see workspace.ix_arr being saved anywhere in model_outputs(_ext). I
think that workspace.ix_arr shouldn't even exist by the time the lines
following isolation.forest(...) are executed.

Can you reproduce the warning on your own computer? It might be helpful
to run R -d 'valgrind --vgdb-error=1 --vgdb-stop-at=startup', follow
Valgrind's instructions to attach the debugger to it, type "continue"
into gdb, then trigger the warning in R and use gdb to gather more
information when Valgrind stops the process around the memory access
it considers uninitialised.

Best regards,


Re: [R-pkg-devel] package installation and linking with JAGS

2020-07-08 Thread Ivan Krylov
On Wed, 8 Jul 2020 11:06:31 +0200
Frantisek Bartos  wrote:

>Check: for GNU extensions in Makefiles, Result: WARNING

This warning is easy to deal with:

>JAGS_ROOT ?= c:/progra~1/JAGS/JAGS-4.3.0

Use plain "=" macro definitions, since others aren't considered portable.

>SOURCES= $(wildcard *.cc) $(wildcard */*.cc)

Replace $(wildcard ...) with hard-coded lists of files.

See the POSIX standard [*], or, indeed, section 'Writing portable
packages' in WRE [**] for a list of Make features considered portable.

Alternatively, add "GNU make" to SystemRequirements: in your
DESCRIPTION. This will silence the warnings, but require the GNU
flavour of Make to install your package.

>2) the package installation works only from the source. For example,
>devtools::install_github() returns an error since .o files are
>generated inside of the package folder. A similar problem occurs when
>generating the source .tar.gz, however, manually deleting the .o files
>from it fixes the problem and it can be used for installing the

How do you build the source package before installing it? I tried to
git clone your package, then R CMD build . it, and got a perfectly valid
RoBMA_0.0.0.9000.tar.gz without any *.o files inside. I *think* that R
CMD INSTALL  may not be a good idea, but you can add .*\.o$
to .Rbuildignore to prevent the object files from getting inside your
source package this way.

>   [[alternative HTML version deleted]]

Also, please don't post in HTML.

Best regards,




Re: [R-pkg-devel] warning: type of ‘zhpevx_’ does not match original declaration [-Wlto-type-mismatch]

2020-07-07 Thread Ivan Krylov
On Tue, 7 Jul 2020 03:00:23 +
Pierre Lafaye de Micheaux  wrote:

>Should I just write something like (adding the middle instruction
>below to my existing code above)?:
>#ifdef FC_LEN_T
>typedef long long int FC_LEN_T;

No, I don't think that would help.

What _might_ help is adapting the incantation from [*] to redefine
FC_LEN_T to int on older GCC:

#if defined(__GNUC__) && __GNUC__ < 7
 /* Rconfig.h's #define doesn't match the actual type
  * of the hidden length argument in old gfortran */
 #define FC_LEN_T int
#else
 /* Otherwise we use the #define from Rconfig.h */
 #define USE_FC_LEN_T
#endif
/* Your code starts here */
/* ... */

Another option that _should_ help is rewriting zhpevxC in Fortran
2003 using its bind(c) feature. The C interoperability would ensure that
the resulting function is callable from C, while the fact that it's
written in Fortran should make it safe to call other Fortran functions:

subroutine zhpevxC(jobz, range, uplo, n, ap, vl, vu, il, iu, &
   abstol, m, w, z, ldz, work, rwork, iwork, &
   ifail, info) bind(c, name='zhpevxC')
 use, intrinsic :: iso_c_binding, only: c_char, c_int, c_double, &
    c_double_complex
 character(kind = c_char) :: jobz, range, uplo
 integer(kind = c_int) :: il, info, iu, ldz, m, n
 real(kind = c_double) :: abstol, vl, vu
 integer(kind = c_int) :: ifail( * ), iwork( * )
 real(kind = c_double) :: rwork( * ), w( * )
 complex(kind = c_double_complex) :: ap( * ), work( * ), z( ldz, * )

 call zhpevx(JOBZ, RANGE, UPLO, N, AP, VL, VU, IL, IU, &
 abstol, m, w, z, ldz, work, rwork, iwork, &
 ifail, info)

end subroutine

A subroutine defined like this can be represented by the following C++
prototype:

extern "C" void zhpevxC(
char * JOBZ, char * RANGE, char * UPLO, int * N,
std::complex<double> * AP, double * VL, double * VU, int * IL,
int * IU, double * abstol, int * m, double * w,
std::complex<double> * z, int * ldz, std::complex<double> *
work, double * rwork, int * iwork, int * ifail, int * info
);
This is the approach described in WRE 6.6.1 Fortran character strings
near the code block with the definition of subroutine c_dgemm.

Best regards,



Re: [R-pkg-devel] warning: type of ‘zhpevx_’ does not match original declaration [-Wlto-type-mismatch]

2020-07-06 Thread Ivan Krylov
On Fri, 3 Jul 2020 00:15:27 +
Pierre Lafaye de Micheaux  wrote:

>Found the following significant warnings:
>myzhpevx.cpp:13:23: warning: type of ‘zhpevx_’ does not match
> original declaration [-Wlto-type-mismatch]

I managed to reproduce the warning on R-devel r78607 built with
--enable-lto using gcc version 6.3.0 20170516 (Debian 6.3.0-18+deb9u1):

myzhpevx.cpp:13:16: warning: type of ‘zhpevx_’ does not match original 
declaration [-Wlto-type-mismatch]
   void F77_NAME(zhpevx)(char *jobz, char *range, char *uplo,
zhpevx.f:232:7: note: type mismatch in parameter 20
zhpevx.f:232:7: note: type ‘int’ should match type ‘size_t’
/usr/lib/gcc/x86_64-linux-gnu/6/include/stddef.h:216:23: note: the incompatible 
type is defined here
 typedef __SIZE_TYPE__ size_t;
zhpevx.f:232:7: note: ‘zhpevx’ was previously declared here

Do you have access to the notes produced by the compiler in addition
to the warnings? Do they spell the same difference?

If yes, the warning is likely to be safe to ignore. m4/R.m4 notes that,
while gfortran < 8 uses int instead of size_t for hidden size arguments,
it doesn't make a practical difference.

Best regards,


Re: [R-pkg-devel] data and load version 3

2020-06-30 Thread Ivan Krylov
On Tue, 30 Jun 2020 11:48:29 +0200
Göran Broström  wrote:

> No point at all with version 3 in packages?

Format version 3 [1] introduces support for ALTREP objects [2].
Examples of where ALTREP might be useful include really long integer
vectors, like 1:1e10.

Best regards,




Re: [R-pkg-devel] data and load version 3

2020-06-30 Thread Ivan Krylov
On Mon, 29 Jun 2020 22:55:02 +0200
Göran Broström  wrote:

> After googling for a while (found nothing relevant in 'WRE'), I 
> understand that I have two options: (i) Change 'Depends' in
> DESCRIPTION as suggested, and (ii) using save with 'version = 2' for
> the new files.

One of the steps performed by default during R CMD build is
tools::resaveRdaFiles(), which passes NULL as the version= argument to
save(). Around R 3.6.0 (I am not sure about the exact version),
version=NULL was changed to mean version=3, so if you want to preserve
compatibility with R < 3.5.0, you may need to run
tools::resaveRdaFiles(..., version = 2) before running R CMD build
--no-resave-data.
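
A sketch of that workflow from the package's parent directory (the
package name is a placeholder):

```r
# resave data in the version-2 format readable by R < 3.5.0 ...
tools::resaveRdaFiles("mypackage/data", version = 2)
# ... then build with:  R CMD build --no-resave-data mypackage
```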

Best regards,


Re: [R-pkg-devel] R package does not find DLL routine

2020-06-28 Thread Ivan Krylov
On Sun, 28 Jun 2020 12:43:53 +0200
Lisa GM  wrote:

>  "sum_c" not resolved from current namespace (sum)

As mentioned by Dirk Eddelbuettel, this is not the way R packages are
supposed to be built [*], but if you are absolutely positive you cannot
build the DLL from source together with the package or link your
package to externally installed DLL (as done by packages curl, rgdal,
and many others), it still seems to be possible. I have been able to
get a dummy package to work like this:

.onLoad <- function(libname, pkgname)
 library.dynam('dynl', pkgname, libname, TRUE)

.onUnload <- function(libpath)
 library.dynam.unload('dynl', libpath)

do <- function()
 .C('do_hello', x = as.integer(1L), y = as.numeric(2), PACKAGE = 'dynl')

(Note the extra PACKAGE argument, required in the absence of native
routine registration or useDynLib(...) declarations in NAMESPACE.)

I placed the DLL itself in file.path('inst', 'libs',
paste0('dynl', .Platform$dynlib.ext)) under the package source
directory and made sure that it exports a C function
void do_hello(int * x, double * y).

Needless to say, this goes against CRAN and Bioconductor policies. The
preferred approach is described in Writing R Extensions, section 1.2.

Best regards,

[*] See  sections
1.5.4 and 5.4 for preparing C functions to be called from an R package.
The Rcpp package does wrap all this in a very convenient way, see the
Rcpp.package.skeleton function.


Re: [R-pkg-devel] package CatDataAnalysis

2020-06-28 Thread Ivan Krylov
On Sun, 28 Jun 2020 11:07:46 -0500
Charles Geyer  wrote:

>Please note that I made Alan Agresti (with his acquiescence) the
>author of the package

Sorry to derail this, but is it possible for Alan Agresti to add a line
to the page [*] allowing redistribution of the data, preferably under
the terms of a well-known license such as CC BY-NC [**] or ODbL [***]?
As it stands, CRAN only has your word (and the fact that this whole
thread is Cc: to Prof. Agresti) that Alan Agresti agreed to have the
data published as an R package. It might be needed to allow creating
derivative works to make creating such a package feasible, though
(otherwise I would assume that only literal redistribution is allowed).

With that done, you could be much more comfortable providing the
requested description for the package, no?

Best regards,



Re: [R-pkg-devel] how to prevent a small package from yielding a large installed size?

2020-06-15 Thread Ivan Krylov
On Mon, 15 Jun 2020 11:13:21 +
Daniel Kelley  wrote:

> A possible clue is that I get a large-file note on macOS, but not
> when I use rhub for test linux builds, or winbuilder for a windows
> build.  I do not have ready access to either linux or windows
> machines, to examine those builds in detail.

For what it's worth, if I build your package on Linux with R 3.6.3 and
--no-build-vignettes, it results in R/argoFloats.rdb being ~2.4M when
installed on either same Linux or R-hub's macOS "10.13.6 High Sierra,
R-release, CRAN's setup" [*]. Perhaps you would be able to find a
difference between artifacts from the R-hub installation and your own.

Best regards,



Re: [R-pkg-devel] [External] Guidelines on use of snow-style clusters in R packages?

2020-06-13 Thread Ivan Krylov
On Wed, 3 Jun 2020 08:54:56 -0500 (CDT) wrote:

> If you use [a cluster] passed to you it would be best to leave
> it in the state you found it at least as far as the search path and
> global environment are concerned.

Thanks for this advice! I guess that clusterExport() is also out of
question in package code, then.

> So use foo::bar instead of library().

I have found out that R Internals documents serialize() to use a very
efficient representation for package environments (a pseudo-SEXPTYPE
followed by the name). Does it mean that it is a good idea to pass
unexported private worker functions to parLapply(cl, X, fun), since the
package environment is not sent over network?

Best regards,


Re: [R-pkg-devel] [R] a question of etiquette

2020-06-03 Thread Ivan Krylov
On Tue, 2 Jun 2020 20:33:56 -0400
Avraham Adler  wrote:

> If there is a term which reflects that mechanism from a discipline
> other than biology, please let me know.

I think that "copyleft" is the term you are looking for. The Wikipedia
page [*] defines it as

>> the practice of offering people the right to freely distribute copies
>> and modified versions of a work with the stipulation that the same
>> rights be preserved in derivative works created later

Best regards,



[R-pkg-devel] Guidelines on use of snow-style clusters in R packages?

2020-05-24 Thread Ivan Krylov
Some of the packages I use make it possible to run some of the
computations in parallel. For example, sNPLS::cv_snpls calls
makeCluster() itself, makes sure that the package is loaded by workers,
exports the necessary variables and stops the cluster after it is
finished. On the other hand, multiway::parafac accepts arbitrary
cluster objects supplied by user, but requires the user to manually
preload the package on the workers. Both packages export and document
the internal functions intended to run on the workers.

Are there any guidelines for use of snow-style clusters in R packages? I
remember reading somewhere that accepting arbitrary cluster objects from
the user instead of makeCluster(detectCores()) is generally considered
a good idea (for multiple reasons ranging from giving the user more
control of CPU load to making it possible to run the code on a number
of networked machines that the package code knows nothing about), but I
couldn't find a reference for that in Writing R Extensions or parallel
package documentation.

What about preloading the package on the workers? Are there any
downsides to the package code unconditionally running clusterEvalQ(cl,
library(myself)) to avoid disappointing errors like "10 nodes produced
errors; first error: could not find function"?

Speaking of private functions intended to run by the package itself on
the worker nodes, should they be exported? I have prepared a test
package doing little more than the following:

private <- function(x) paste(x, Sys.getpid())
public <- function(cl, x) parallel::parLapply(cl, x, private)


The package passes R CMD check --as-cran without warnings or errors,
which seems to suggest that exporting worker functions is not required.

Best regards,


Re: [R-pkg-devel] Help Debugging Debian Error

2020-05-15 Thread Ivan Krylov
On Fri, 15 May 2020 03:50:12 -0400
Paul Hibbing  wrote:

> Complete output:
>   > library(testthat)
>   > library(PAutilities)
>   >
>   > test_check("PAutilities")  

It seems to me that the R process crashes while running your tests, but
since testthat::test_check captures everything to summarise it later,
you don't get to know where the crash occurs.

Perhaps if you (temporarily) move your tests out of testthat/
subdirectory and remove the context() and testthat::test_that("...",
{ ... }) calls, you would be able to see the exact location of the
crash in the *.Rout files?

That said, it's probably one of your dependencies that's responsible
for the crash, not your package, since there is no compiled code.

Best regards,


Re: [R-pkg-devel] /usr/bin/ld: cannot find -ludev

2020-04-26 Thread Ivan Krylov
On Sun, 26 Apr 2020 15:39:50 +
"Sameh M. Abdulah"  wrote:

> Yes, I think I will need it

Could you tell us why your package needs it? Perhaps there is a way
to make it an optional dependency.

> since I know that this is an OS libs

It is true that libudev is required for successful bootup of most
desktop and server GNU/Linux installations... not so for *BSDs, macOS,
Solaris, and Windows. I think that libudev (now part of systemd) will
not even build on these platforms.

> does adding this will help?

It will help in the sense of indicating a system requirement that might
not be present on the target system, but won't get you much further.

Is libudev a hard dependency of your package? If yes, CRAN may reject
the package as not portable enough ("Packages will not normally be
accepted that do not run on at least two of the major R platforms.").
If no, consider including a configure script [*] and some #ifdefs in
your code to make the package only depend on libudev when it is
available.
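
A sketch of the optional-dependency route (AC_CHECK_LIB and AC_DEFINE
are standard autoconf; HAVE_LIBUDEV is a conventional name, not
something R mandates):

```
# configure.ac (excerpt): link and use libudev only where it exists
AC_CHECK_LIB([udev], [udev_new],
  [PKG_LIBS="$PKG_LIBS -ludev"
   AC_DEFINE([HAVE_LIBUDEV], [1], [Define if libudev is usable.])])
AC_SUBST(PKG_LIBS)

/* in the C sources */
#ifdef HAVE_LIBUDEV
  /* talk to libudev */
#else
  /* fall back or report the feature as unavailable */
#endif
```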

Best regards,



Re: [R-pkg-devel] More GitHub problems

2020-04-21 Thread Ivan Krylov
On Mon, 20 Apr 2020 23:44:43 -0500
Spencer Graves  wrote:

> Is there a way to restore the functionality of a local clone of a 
> GitHub repository after the SSH key it used was replaced?

Does `git remote -v` in the repo directory show
or or ssh://... URLs?

Can you add the new SSH key on  and
check ~/.ssh/config to make sure that it's used when connecting to
GitHub? You can verify that the SSH part of the stack works by
running `ssh`:

PTY allocation request failed on channel 0
Hi ! You've successfully authenticated, but GitHub does not
provide shell access.
Connection to closed.

>    I have a local clone that previously worked fine but now asks
> for a password that I don't think I have when I try "git push".

This sounds like it's using an https:// remote URL, not ssh. But I
might be mistaken.

Best regards,


Re: [R-pkg-devel] OpenMP variable not specified in enclosing 'parallel'

2020-03-23 Thread Ivan Krylov
On Mon, 23 Mar 2020 15:29:20 +0100
Emil Sjørup  wrote:

> const int iMaxLag = 20;

> error: ‘iMaxLag’ not specified in enclosing ‘parallel’

> error: ‘iMaxLag’ is predetermined ‘shared’ for ‘shared’

This looks like a compiler bug to me. g++ seems to forget the rule that
"const" variables are supposed to be shared despite the default(none)
clause.

In a similar situation (g++ being confused about the sharing
status of a hidden temporary variable it had internally created) I had
to remove default(none) and leave a comment explaining that the code
would not compile otherwise.

Best regards,


Re: [R-pkg-devel] An invalid URLs

2020-03-13 Thread Ivan Krylov
On Fri, 13 Mar 2020 11:02:06 +0300
Ivan Krylov  wrote:

> the remote server could deny requests from such automated user
> agents, only allowing clients that look like browsers

Here is what I have been able to observe:

If I wait for some time and then try to access the site
using cURL, I start getting 403 errors in both cURL and the browser.

If I wait for some time, then go to the site using a
browser and click on a few links, subsequent access using cURL from
the same IP address also starts working (for a while).

Given the paranoid nature of the website's security system, it's hard
to offer a good solution to your problem: linking to it may place
people running R CMD check into temporary ban, while not linking to it
does not seem polite.

Best regards,


Re: [R-pkg-devel] An invalid URLs

2020-03-13 Thread Ivan Krylov
On Fri, 13 Mar 2020 09:49:03 +0800 (GMT+08:00)
jared_wood  wrote:

>   Status: 403
>   Message: Forbidden
> I don’t know why. The URL was picked up in the article of this
> database and I can open it.

To be fair, my requests to this service are also blocked unless I use
Tor Browser. This could be an aggressive case of GeoIP banning.
(Blocking a well-known university but not blocking Tor exit nodes? Talk
about security theatre.)

Another reason could be that R CMD check uses libcurl with its default
User-Agent: libcurl/A.BB.C to check URLs, and the remote server could
deny requests from such automated user agents, only allowing clients
that look like browsers. I have no data to confirm this hypothesis.
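One way to test this hypothesis with the curl package (the URL below is a placeholder for the blocked link, not the actual one from the package):

```r
library(curl)

url <- "https://example.com/article"  # placeholder for the blocked URL

# default libcurl User-Agent, as used by R CMD check's URL checks
h_default <- new_handle()
# a browser-like User-Agent
h_browser <- new_handle(useragent = "Mozilla/5.0 (X11; Linux x86_64)")

curl_fetch_memory(url, handle = h_default)$status_code
curl_fetch_memory(url, handle = h_browser)$status_code
# if only the second request returns 200, the server is
# filtering on the User-Agent header
```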

Best regards,


Re: [R-pkg-devel] Installed package size

2020-03-12 Thread Ivan Krylov
On Thu, 12 Mar 2020 15:16:13 +
Carsten Croonenbroeck  wrote:

> I would like to know what's the maximum size and if there's a way
> around that limit.

Here's what CRAN policy [*] says about that:

>> As a general rule, neither data nor documentation should exceed 5MB
>> (which covers several books). A CRAN package is not an appropriate
>> way to distribute course notes, and authors will be asked to trim
>> their documentation to a maximum of 5MB.

(According to src/library/tools/R/check.R, 5MB seems to be the limit on
installed size, not compressed tarball size.)
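To see roughly the installed size that R CMD check measures, one could install the built tarball into a throwaway library and sum the file sizes (the package file name below is made up):

```r
lib <- tempfile("lib")
dir.create(lib)
install.packages("mypackage_1.0.tar.gz", lib = lib,
                 repos = NULL, type = "source")
files <- list.files(file.path(lib, "mypackage"),
                    recursive = TRUE, full.names = TRUE)
sum(file.size(files)) / 2^20  # installed size in megabytes
```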

>> Where a large amount of data is required (even after compression),
>> consideration should be given to a separate data-only package which
>> can be updated only rarely (since older versions of packages are
>> archived in perpetuity).

If publishing the data separately from the code is acceptable, you
could use drat [**] to set up a repository for the data package
somewhere else, then list the data package in Suggests: and the repo in
Additional_repositories: in the DESCRIPTION of the code package, which
you could submit to CRAN.
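A rough sketch of the drat side of this workflow (repository and package names are made up; see drat's own documentation for the details):

```r
# one-off: create the repository skeleton inside a directory that is
# served over HTTPS (e.g. a GitHub Pages checkout)
drat::initRepo("datarepo", basepath = "~/git")

# for every (rare) release of the data-only package:
drat::insertPackage("mydata_1.0.tar.gz",
                    repodir = "~/git/datarepo")
```

The code package's DESCRIPTION would then list the data package in Suggests: and the repository URL in Additional_repositories:.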

Best regards,




Re: [R-pkg-devel] Error in curl: Failed FTP upload: 550

2020-02-21 Thread Ivan Krylov
On Fri, 21 Feb 2020 14:04:24 +0100
Gianmarco Alberti  wrote:

> I have also used:
> check_win_devel() and check_win_release() out of devtools, but I keep
> getting the following message:
> Error in curl::curl_fetch_memory(url, handle = h) :
>  Failed FTP upload: 550

Does it work if you build the package manually (i.e. issue the command
R CMD build . in package directory), then upload the resulting file
using cURL or any other FTP client?

curl -T yourfile.tar.gz

You can also try the alternative upload page at

If manual FTP uploading works while devtools::check_win_devel()
doesn't, some debugging may be required. For example, try

trace(devtools:::upload_ftp, quote({str(file); str(url)}))

before running check_win_devel() or check_win_release() to see which
arguments devtools:::check_win passes to upload_ftp.

Best regards,


Re: [R-pkg-devel] CRAN Package Check Results - No protocol specified (OS X only)

2020-02-18 Thread Ivan Krylov
On Mon, 17 Feb 2020 19:28:43 -0500
Dominic Comtois  wrote:

> a bunch of warnings with "No protocol specified" messages

"No protocol specified" is printed by the X11 client library when it
fails to connect to the X server. I think that these indicate a problem
with setup of the macOS testing servers.

Best regards,


Re: [R-pkg-devel] Catching console messages from libGL

2020-02-17 Thread Ivan Krylov
On Mon, 17 Feb 2020 14:56:31 -0500
Duncan Murdoch  wrote:

> So how do I capture stderr (or, off topic here, how do I get libGL to
> be quiet)?

libGL seems to only offer bad news in this regard: it writes directly
to stderr [1] and does not seem to offer a way to silence the
_LOADER_FATAL messages [2], which "failed to load driver: %s" are.

(You have found that out while I was preparing the message.)

As far as I understand do_sink(), it does not touch the actual stdout
or stderr, only R's wrappers around them. You probably know better than
me how portable it would be to try to reassign stderr (which the
standard says is a macro) to open_memstream() (which is POSIX.1-2008
only).

Best regards,




Re: [R-pkg-devel] finding "logo.jpg" [was: "try" malfunctions on Ubuntu Linux 16.04 LTS, R-release, GCC]

2020-02-03 Thread Ivan Krylov
On Mon, 3 Feb 2020 13:30:11 -0600
Spencer Graves  wrote:

> logo.jpg <- paste(R.home(), "doc", "html", "logo.jpg", sep = .Platform$file.sep)

I wonder whether file.path(R.home('doc'), 'html', 'logo.jpg') would be
more portable. Are there R installations built without the HTML docs?

Best regards,


Re: [R-pkg-devel] Adding .exes to R package?

2020-01-23 Thread Ivan Krylov
On Wed, 22 Jan 2020 20:54:40 +
Jonathan Greenberg  wrote:

> Are there reasonable tutorials on how to do this?

If you absolutely have to do this, take a look at the tinytex package.
It is basically an installer for a preselected set of packages from TeX
Live inside a platform-specific directory. However, if you go this way,
I would like to ask you to preserve the use of manually-installed GDAL
as an option, because I consider manually installed (or distro-provided
in case of GNU/Linux) binaries much easier to trust than automatically
downloaded ones.

"Writing R extensions" mentions _building_ executables as part of the
package a few paragraphs down 1.1.5 Package subdirectories, but
describes such packages as "very special cases" and requiring a lot of
additional hassle: you'd have to provide src/Makefile{,.win} that
should manually build the shared object and the executables and
src/install.libs.R that would install both the shared object and the
executables. I don't think this approach is viable, since latest GDAL
sources are about 13M in size (compressed), and maintaining both the
package and the GDAL binaries would be a serious duplication of effort.

Best regards,


Re: [R-pkg-devel] Need help to resolve NOTEs in auto check

2020-01-22 Thread Ivan Krylov
Hello Ian,

On Wed, 22 Jan 2020 10:29:51 +
Ian Walker  wrote:

> * checking CRAN incoming feasibility ... NOTE
> Maintainer: ‘Ian Walker <...>’

This NOTE is there for CRAN staff to have another look at the
Maintainer: field in case this is someone's first submission or there
is a change in package ownership.

> Possibly mis-spelled words in DESCRIPTION:
>   DEMs (7:503)
>   STL (7:47)
>   stereolithography (7:52)
>   stl (7:24)

These should be explained in the "Optional comment" field in the CRAN
submission form.

> The build time stamp is missing.

> Checking should be performed on sources prepared by ‘R CMD build’.

How exactly do you build and check your package? It is recommended to
run the command "R CMD build ." in the source directory of the package,
then "R CMD check mypackage_version.tar.gz" (substituting the name of
the newly generated file for mypackage_version.tar.gz).

> These two .stl files are simply mentioned in the documentation's
> example code - they're files that would be created if somebody ran
> the example code; they're not in the R package I've created.

It might be a better idea to use tempfile() for these file paths. CRAN
policy says that packages should not write files anywhere besides R's
session temporary directory unless the user explicitly passes the path.
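A minimal sketch of what the example code could look like (the actual writing function is whatever the package exports; only the path handling matters here):

```r
# a unique path under the per-session temporary directory
out <- tempfile(fileext = ".stl")

# ... call the package function that writes its STL output to 'out' ...

unlink(out)  # clean up after the example
```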

Best regards,


Re: [R-pkg-devel] For reproducibility issue

2020-01-17 Thread Ivan Krylov
On Fri, 17 Jan 2020 13:55:39 +
وليد خلف معوض المطيرى  wrote:

> So, does anyone have an idea of how to solve this issue.

"Writing R Extensions", 1.6. Writing portable packages:

>> Compiled code should not call the system random number generators
>> such as rand, drand48 and random, but rather use the interfaces to
>> R’s RNGs described in Random numbers. In particular, if more than
>> one package initializes the system RNG (e.g. via srand), they will
>> interfere with each other.

>> Nor should the C++11 random number library be used, nor any other
>> third-party random number generators such as those in GSL.

It is somewhat less convenient to call the R random number generator from
Fortran than it would be from C or C++, but still possible. There is a
F77-style example of such use [1], but since you are already using
iso_c_binding, you should be able to declare the C API [2] right in the
Fortran source:

interface
 subroutine GetRNGstate() bind(c)
 end subroutine

 subroutine PutRNGstate() bind(c)
 end subroutine
end interface

As a bonus, you get to use the R distribution functions [3], without
the need to implement them yourself from uniformly distributed samples:

interface
 function rnorm(mu, sigma) bind(c) result(ret)
  use, intrinsic :: iso_c_binding, only: c_double
  real(c_double), value :: mu, sigma
  real(c_double) ret
 end function

 function rgamma(shape, scale) bind(c) result(ret)
  use, intrinsic :: iso_c_binding, only: c_double
  real(c_double), value :: shape, scale
  real(c_double) ret
 end function
end interface

(The prototypes above are unchecked; I haven't written any Fortran 2003
in more than a year.)

Best regards,





Re: [R-pkg-devel] Checkpoint package failing CRAN checks

2020-01-10 Thread Ivan Krylov
I wonder why vignettes/checkpoint.Rmd runs the following:

> example_project <- tempdir()

Now example_project contains the path of the per-session temporary
directory...

> dir.create(example_project, recursive = TRUE, showWarnings = FALSE)

...so there should be no need to create it...

> unlink(example_project, recursive = TRUE)

And deleting it might be the cause of the problems: rmarkdown
probably uses the same temporary directory to store its own files.

Perhaps example_project should be something like tempfile("project")
instead of just tempdir()? Then the dir.create() and unlink() calls
start making sense.
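The suggested change, sketched out (the pattern string is arbitrary):

```r
# a unique, not-yet-existing path *under* the session temp directory
example_project <- tempfile("checkpoint_example_project")
dir.create(example_project, recursive = TRUE, showWarnings = FALSE)

# ... run the vignette's example code inside example_project ...

# removing it no longer deletes tempdir() itself
unlink(example_project, recursive = TRUE)
```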

Best regards,


Re: [R-pkg-devel] suggestion: conda for third-party software

2020-01-07 Thread Ivan Krylov
On Tue, 7 Jan 2020 15:49:45 +0100
Serguei Sokol  wrote:

> Currently, many R packages include TPS as part of them thus bloating
> their sizes and often duplicating files on a given system.  And even
> when TPS is not included in an R package but is just installed on a
> system, it is not so obvious to get the right path to it. Sometimes
> pkg-config helps but it is not always present.

I agree that making a package depend on a third-party library means
finding oneself in a bit of a pickle. A really popular library like
cURL could be "just" depended upon (for the price of some problems when
building on Windows). A really small (e.g. 3 source files) and rarely
updated (just once last year) library like liborigin could "just" be
bundled (but the package maintainer would have to constantly watch out
for new versions of the library). Finding that the bundled version of a
network-facing library in an R package (e.g. libuv in httpuv) is several
minor versions out of date is always a bit scary, even if it turns out
that no major security flaws have been found in that version (just a few
low-probability resource leaks, one unlikely NULL pointer dereference
and some portability problems). The road to dependency hell is paved
with intentions of code reuse.

> So, the new feature would be to let R package developers to write in 
> DESCRIPTION/SystemRequirements field something like 
> 'conda:boost-cpp>=1.71' where 'boost-cpp' is an example of a conda 
> package and '>=1.71' is an optional version requirement.

While I appreciate the effort behind Anaconda, I would hate to see it
being *required* to depend on third-party binaries compiled by a
fourth-party (am I counting my parties right?) when there's already a
copy installed and available via means the user trusts more (e.g. via
GNU/Linux distro package, or Homebrew on macOS, or just a copy sitting
in /usr/local installed manually from source). In this regard, a
separate field like "Config/conda" suggested by Kevin Ushey sounds like
a good idea: if one wants to use Anaconda, the field is there. If one
doesn't, one can just ignore it and provide the necessary dependencies
in a different way.

Best regards,


Re: [R-pkg-devel] rhub, docker and Bioconductor

2020-01-07 Thread Ivan Krylov
On Tue, 7 Jan 2020 13:54:38 +
Christian Martin Hennig  wrote:

> Are there any other options apart from rhub/docker to get wiser about
> my errors and issues (error on debian-clang in a routine that shows
> nothing suspicious on my machine

I cannot answer your original questions, but I can offer some advice on
the problems uncovered by package checks:

> covinv <- try(solve(m))
> if (class(covinv) != "try-error")

In R-devel, matrices are now also arrays [*]:

>> matrix objects now also inherit from class "array", namely, e.g.,
>> class(diag(1)) is c("matrix", "array") which invalidates code
>> assuming that length(class(obj)) == 1, an incorrect assumption that
>> is less frequently fulfilled now.

Use !inherits(covinv, 'try-error') instead.
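A minimal illustration of the difference, using a toy matrix (it mirrors the check quoted above):

```r
m <- matrix(c(2, 0, 0, 2), 2, 2)
covinv <- try(solve(m))

# class(covinv) is c("matrix", "array") in R >= 4.0.0, so
# class(covinv) != "try-error" yields a length-2 logical, which
# if() rejects (a warning or an error, depending on R version)

if (!inherits(covinv, "try-error")) {
  # safe to use covinv here on any R version
}
```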

> additional issues (noLD, OpenBLAS)?

These mostly seem to be concerned with the package startup message from
the mclust package. Try wrapping the calls to library(mclust) into
suppressPackageStartupMessages().
As for the numbers differing between the saved reference output and the
actual test results in the ATLAS/MKL/noLD/OpenBLAS checks despite
set.seed() being used,
I am not sure what could be done. RNGversion() might help, but this
could also be caused by differences in mclust version, in which case
the result might just be impossible to reproduce.
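Something along these lines at the top of the test files might address both issues at once (version strings here are examples, not prescriptions):

```r
# silence the startup banner that differs between environments
suppressPackageStartupMessages(library(mclust))

# pin the random number generator behaviour to a known R version
RNGversion("3.5.0")
set.seed(42)
```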

Best regards,

Link should be valid until R-4.0.0 is released.


Re: [R-pkg-devel] Fix non-ASCII characters in R packages

2019-12-02 Thread Ivan Krylov
On Mon, 2 Dec 2019 10:57:51 -0300
Rafael Pereira  wrote:

> checking data for non-ASCII characters ... NOTE Note: found 58 marked
> Latin-1 strings
> I have used to code below to identify my scripts that have strings
> using non-ASCII characters. 

I don't think it's about non-ASCII in source code; it's about Latin-1
strings in the package data:

git clone
cd geobr/data
# then, in R, after loading the data file:
sum(unlist(
  lapply(grid_state_correspondence_table, Encoding)
) == 'latin1')
# [1] 58

What I'm not sure of is *how* this NOTE should be fixed. "Writing R
extensions" §1.6.3 provides advice on UTF-8 strings in the R code, not
data; §5.15 only says that strings *could* be marked as Latin-1 or
UTF-8 (but doesn't say what *should* be done); finally,
tools:::.check_package_datasets seems to produce NOTEs about Latin-1,
UTF-8 and strings marked as bytes.
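If the maintainers decide the answer is "convert the data to UTF-8", something like this could work, assuming grid_state_correspondence_table is a data frame (the object would then need to be re-saved with the usual tools):

```r
# convert every character column from its declared encoding (latin1)
# to UTF-8; enc2utf8() also marks the encoding of the result
grid_state_correspondence_table[] <- lapply(
  grid_state_correspondence_table,
  function(col) if (is.character(col)) enc2utf8(col) else col
)
```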

Best regards,


Re: [R-pkg-devel] potential memory leak using openMP

2019-11-15 Thread Ivan Krylov
On Fri, 15 Nov 2019 10:49:25 -0600
Marcin Jurek  wrote:

> 1) what should I do to reproduce the error?

The following is a quick, unchecked suggestion, but:

`R CMD check  --use-valgrind` just adds `-d valgrind` to
the command line, while the report you are seeing can only be obtained
with `--leak-check=full --show-reachable=yes` added to valgrind options.

What happens if you run `VALGRIND_OPTS='--leak-check=full
--show-reachable=yes' R CMD check  --use-valgrind` ?

Best regards,


Re: [R-pkg-devel] debugging memory errors

2019-11-06 Thread Ivan Krylov
On Wed, 6 Nov 2019 08:43:50 -0600
Marcin Jurek  wrote:

> I have very little clue what to do, above all because I don't know
> how to reproduce the error.

Using AddressSanitizer to find memory errors in R packages requires the
R installation to be built with AddressSanitizer, too.

I found it useful to build the package with -g (and maybe -Og), then
run the checks using `R -d valgrind CMD check ...`. Sometimes the
errors found by Valgrind are different from errors found by
AddressSanitizer, though (not to mention the differences in hardware).

Best regards,


Re: [R-pkg-devel] checking CRAN incoming feasibility NOTE

2019-10-06 Thread Ivan Krylov
On Sat, 5 Oct 2019 16:52:16 -0500
"R. Mark Sharp"  wrote:

> MIT + file LICENSE
>   File 'LICENSE':
> Copyright 2017-2019 R. Mark Sharp
> Permission is hereby granted, <...>

Note that for packages licensed under the MIT license, the LICENSE file
should only contain two lines of the form:

>> YEAR: 2019
>> COPYRIGHT HOLDER: R. Mark Sharp

and not the actual text of the MIT license. See [*] for more info.

Best regards,



Re: [R-pkg-devel] please help understand an error in openMP statements

2019-09-13 Thread Ivan Krylov
On Thu, 12 Sep 2019 16:12:17 -0500
Marcin Jurek  wrote:

> U_NZentries.cpp:258:19: error: ‘covparms’ not specified in enclosing 
> ‘parallel’
>   258 |  covmat= MaternFun(dist,covparms) + diagmat(nug) ; // summation from 
> arma
>   |  ~^~~

This might be a compiler bug: const arma::vec covparms from the
argument list should have been automatically predetermined "shared". If
you try to add covparms to the shared(...) clause manually, you might
hit another error: "covparms" is predetermined "shared" for "shared".
In a similar case, I couldn't find a good workaround and had to switch
to default(shared).

> U_NZentries.cpp:271:12: error: ‘none’ not specified in enclosing ‘parallel’
>   271 | M=solve(chol(covmat,"upper"),onevec);
>   |   ~^

This is probably about the temporary returned from
chol(covmat,"upper"). Declare a variable for it inside the loop, and
the compiler will understand that it's private. Again, this shouldn't
be happening, so what I'm offering is a workaround.

What I find really strange is that gcc-4.9.3/mingw_32/bin/g++.exe
doesn't have any problems with this code, while the supposedly newer
g++-9 does.
Best regards,


Re: [R-pkg-devel] No protocol specified

2019-08-25 Thread Ivan Krylov
On Fri, 23 Aug 2019 09:18:07 +0200
"Dr. rer. nat. Michael Thrun"  wrote:

>> No protocol specified
> Can anyone translate this note to me in a way that I know what the
> issue is?

I might be mistaken, but "No protocol specified" is an error frequently
returned by an X server [*] when a connection to the display can be
established, but the client fails to authenticate and is denied access.

If my diagnosis is true, the error message indicates a setup problem on
the machine running R CMD check, not a package problem. At the
very least, this error message does not seem to appear anywhere in the R
source code.

Best regards,



Re: [R-pkg-devel] "Additional issues" show WRITE outside an array

2019-07-26 Thread Ivan Krylov
On Fri, 26 Jul 2019 01:50:36 -0500
Jiahuan ye  wrote:

> I am very confused what causes the ERROR.

Your code on line 197 of src/min_wgss.cpp causes memory access outside
the block that had been allocated for the best_change_point vector.

I have not read the code in depth, but it looks like the length of
best_change_point should be M at this point. Array indices on the C++
side start with 0 and end at (length-1), so erasing the element at
position M in a NumericVector of length M in Rcpp code results in
undefined behaviour.

Best regards,


Re: [R-pkg-devel] "Progress reports" for examples in packages.

2019-07-02 Thread Ivan Krylov
Could R CMD check be using valgrind to run the examples? Valgrind has
to interpret CPU instructions manually to be able to warn about
results of code execution depending on memory values it considers
undefined, so it is much slower than execution on a real CPU.

One way to verify that on a GNU/Linux system would be to temporarily
insert system('pstree -aps $$') into one of the examples and look at
the output produced.

Best regards,


Re: [R-pkg-devel] How to obtain intercept of intercept-only glm in Fortran?

2019-05-11 Thread Ivan Krylov
On Fri, 10 May 2019 16:17:42 +
"Wang, Zhu"  wrote:

> Are there any examples or links for me to follow through more closely?

Calling R functions from C++ is described at
elsewhere in Rcpp documentation. An example follows:

#include <Rcpp.h>

using namespace Rcpp;

extern "C" double intercept_glm(size_t n, const double * response) {
	// access functions from the default environment
	Function glm_fit("glm.fit"), coef("coef");

	// intercept-only model: response ~ 1, i.e. a column of ones
	NumericVector x(n, 1.0);

	// I couldn't find a way to wrap a double* into a NumericVector
	// without copying anything, sorry; perhaps someone else
	// can offer a solution
	NumericVector y(n);
	std::copy_n(response, n, y.begin());

	// call the R function, convert the result back to double
	return as<double>(coef(glm_fit(x, y)));
}

Since this function is extern "C" and uses only primitive C types, it
should be fairly easy to call from Fortran. (C is the lingua franca of
programming languages). Fortran-C interoperability is well described in
"Modern Fortran Explained" by Metcalf et al. Here is the Fortran side
of the code:

subroutine callglm(ret)
use, intrinsic :: iso_c_binding, only: c_size_t, c_double
! using iso_c_binding here
! - to get correct type of ret when R calls the function
! - to convert variables before calling C function
implicit none
! using F77-style arguments to match expectations of .Fortran()
real(c_double), intent(out) :: ret
! toy data to compare against R code later
real :: y(10) = [10, 11, 20, 9, 10, 8, 11, 45, 2, 3]
 interface
  ! the interface block declares an extern "C" function:
  ! double intercept_glm(size_t n, const double * response)
  function intercept_glm(n, response) bind(c)
   use, intrinsic :: iso_c_binding
   real(c_double) :: intercept_glm
   integer(c_size_t), value :: n
   real(c_double) :: response(*)
  end function
 end interface

 ! call the function as you would call any other function
 ret = intercept_glm(int(size(y), c_size_t), real(y, c_double))
end subroutine

For a quick test, make sure that you have Rcpp installed and run:

# adjust R version and path if your library is elsewhere
PKG_CPPFLAGS='-g -I ~/R/x86_64-pc-linux-gnu-library/3.3/Rcpp/include' \
R CMD SHLIB callglm.f90 glmfit.cpp
dyn.load('') # change extension if needed
.Fortran('callglm', ret=numeric(1))
# $ret
# [1] 12.9
coef(glm.fit(rep(1, 10), c(10, 11, 20, 9, 10, 8, 11, 45, 2, 3)))
# [1] 12.9

To use this in a package, place both files in the src/ subdirectory of
your package and add LinkingTo: Rcpp in the DESCRIPTION.

Best regards,


Re: [R-pkg-devel] R CMD check ERROR (strange to me)

2019-05-10 Thread Ivan Krylov
On Thu, 9 May 2019 23:19:46 +
"Wang, Zhu"  wrote:

> I have encountered some strange error (see 00install.out).

It seems to have been stripped by the attachment filter, and there are
no compilation errors on my system. Can you include the relevant lines
from 00install.out inline?

Best regards,


Re: [R-pkg-devel] How to obtain intercept of intercept-only glm in Fortran?

2019-05-06 Thread Ivan Krylov
On Sat, 4 May 2019 22:41:16 +
"Wang, Zhu"  wrote:

> In an R package I would like to compute intercept for an
> intercept-only GLM in a Fortran subroutine.

If all else fails, you could use the R API [*] to call coef() on the
fitted model, though it might require writing a C or C++ wrapper to
avoid translating all function prototypes from Rinternals.h into
Fortran 2003 C interoperability syntax.

Best regards,



Re: [R-pkg-devel] Need help for Debian check error

2019-04-30 Thread Ivan Krylov
On Tue, 30 Apr 2019 19:53:49 +
"Zhang, Lixiang"  wrote:

> I don't know why is that happening and how should I fix it. Thanks a
> lot for your help.

The e-mail from CRAN also said,

>> More details are given in the directory:

followed by a link scrambled by your e-mail software. When you follow
the link, you can find this file:

which lists the messages from the compiler produced when trying to
build your package on a Debian machine. "HUGE" is not an identifier
defined anywhere in the C++ standard (AFAIK), but in the standard
<cmath> header you can find HUGE_VAL, which seems to do what you mean.

Best regards,


Re: [R-pkg-devel] parallel computing slower than sequential computing

2019-04-30 Thread Ivan Krylov
On Mon, 29 Apr 2019 23:44:42 +
"Wang, Zhu"  wrote:

> sessionInfo()  
> R version 3.5.2 (2018-12-20)
> Platform: x86_64-pc-linux-gnu (64-bit)
> Running under: Ubuntu 18.04.2 LTS

Which BLAS implementation do you use? One popular implementation,
OpenBLAS, spawns multiple threads to do some operations faster; the
threads can compete against each other for CPU resources if the
resulting number of processes * threads per process is more than what
the CPU can handle.
How many CPU cores does your system have? Does this include SMT (also
known as hyper-threading on Intel processors)? While some problems
benefit from the processor pipeline being able to fetch instructions
from multiple threads at the same time, for others it's more of a
bottleneck.

It may help to decrease the n.cores parameter.
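To test whether BLAS thread over-subscription is the culprit, one could pin OpenBLAS to a single thread before starting the workers. RhpcBLASctl is a third-party package; setting the environment variable before R starts works too:

```r
# requires the RhpcBLASctl package
RhpcBLASctl::blas_set_num_threads(1)

# or, in the shell before launching R / the worker processes:
# export OPENBLAS_NUM_THREADS=1
```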

Best regards,


[R-pkg-devel] Win-builder: Author field differs from that derived from Authors@R

2019-02-06 Thread Ivan Krylov

I'm relying on Authors@R to generate the Author: and Maintainer:
headers. When checking my package at win-builder using R-unstable, I
got a NOTE that Author field differs from that derived from Authors@R:

>> Author: 'Miquel Garriga [aut, cph], Stefan Gerlach [aut, cph],
>> Ion Vasilief [aut, cph], Alex Kargovsky [aut, cph], Knut Franke
>> [cph], Alexander Semke [cph], Tilman Benkert [cph], Kasper Peeters
>> [cph], Russell Standish [cph], Ivan Krylov [cre, cph]'

>> Authors@R: 'Miquel Garriga [aut, cph], Stefan Gerlach [aut, cph],
>> Ion Vasilief [aut, cph], Alex Kargovsky [aut, cph], Knut Franke
>> [con, cph], Alexander Semke [con, cph], Tilman Benkert [con, cph],
>> Kasper Peeters [con, cph], Russell Standish [con, cph], Ivan Krylov
>> [cre, cph]'

Link to full check output:

The actual field from DESCRIPTION now looks like this:

>> Authors@R: c(person('Miquel', 'Garriga', email =
>> '', role = c('aut', 'cph')), person('Stefan',
>> 'Gerlach', email = '', role = c('aut',
>> 'cph')), person('Ion', 'Vasilief', email = '',
>> role = c('aut', 'cph')), person('Alex', 'Kargovsky', email =
>> '', role = c('aut', 'cph')),
>> person('Knut', 'Franke', email = '', role =
>> c('con', 'cph')), person('Alexander', 'Semke', email =
>> '', role = c('con', 'cph')), person('Tilman',
>> 'Benkert', email = '', role = c('con', 'cph')),
>> person('Kasper', 'Peeters', email = '',
>> role = c('con', 'cph')), person('Russell', 'Standish', role =
>> c('con', 'cph')), person('Ivan', 'Krylov', email =
>> '', role = c('cre', 'cph')))

The version of R I'm currently using is 3.3.3 (2017-03-06) from Debian
stable, which might explain the differences in utils:::format.person.

What should I do to avoid the NOTE?

Best regards,


[R-pkg-devel] Robj: reading proprietary format - first time packaging

2019-01-29 Thread Ivan Krylov

Not being able to find an R package that would let me import an
Origin(R)[0] OPJ file, I found liborigin[1] and used Rcpp (thanks,
Dirk Eddelbuettel!) to create the package which I titled Ropj[2]. Right
now it only understands the absolute minimum of what I needed to
import, but I intend to translate more object types in the future.

I have bundled liborigin as a subdirectory under src, which seems to be
an acceptable practice for small (3 *.cpp files) libraries whose
license (GPL >= 3) matches the package. I'm passing R CMD check
--as-cran, both on my PC and at win-builder (thanks, Uwe Ligges!) [3],
except with a small NOTE that the word OPJ might be misspelled.

Is there anything I might have overlooked that would prevent this
package from being accepted to CRAN? For example, could the presence of
the words Origin(R) or OPJ or OPJ-decoding code itself be a problem
from trademark or licensing point of view?

Should I submit the first version right away or implement all the
features I can in advance?

Best regards,




