Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Gabriel Becker
Ah, that's embarrassing. That's a bug in how/where I handle lack of
connectivity, rather than me not doing it. I've just pushed a fix to the
GitHub repo that now cleanly passes check with no internet connectivity
(much more stringent).

Using a canned file is a bit odd, because in the case where there's no
connectivity, the package won't work (the canned file would just set the
repositories to URLs that R still won't be able to reach).
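
Even so, the recovery Simon suggests below would look roughly like this
sketch (the function and file names are illustrative only, not switchr's
actual internals): try the remote file first and fall back to a snapshot
shipped with the package.

getBiocConfig <- function(url = "http://bioconductor.org/config.yaml") {
  tryCatch(
    readLines(url, warn = FALSE),
    error = function(e) {
      # fall back to a copy shipped in inst/extdata of the package
      readLines(system.file("extdata", "config.yaml", package = "switchr"),
                warn = FALSE)
    }
  )
}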

Anyway,
Thanks
~G

On Mon, Sep 26, 2022 at 3:11 PM Simon Urbanek 
wrote:

>
>
> > On 27/09/2022, at 11:02 AM, Gabriel Becker 
> wrote:
> >
> > For the record, the only things switchr (my package) is doing internet
> wise should be hitting the bioconductor config file (
> http://bioconductor.org/config.yaml) so that it knows the things it need
> to know about Bioc repos/versions/etc (at load time, actually, not install
> time, but since install does a test load, those are essentially the same).
> >
> > I have fallback behavior for when the file can't be read, so there
> shouldn't be any actual build breakages/install breakages I don't think,
> but the check does happen.
> >
>
> $ sandbox-exec -n no-network R CMD INSTALL switchr_0.14.5.tar.gz
> [...]
> ** testing if installed package can be loaded from final location
> Error in readLines(con) :
>   cannot open the connection to 'http://bioconductor.org/config.yaml'
> Calls:  ... getBiocDevelVr -> getBiocYaml -> inet_handlers ->
> readLines
> Execution halted
> ERROR: loading failed
>
> So, yes, it does break. You should recover from the error and use a
> fall-back file that you ship.
>
> Cheers,
> Simon
>
>
> > Advice on what to do for the above use case that is better practice is
> welcome.
> >
> > ~G
> >
> > On Mon, Sep 26, 2022 at 2:40 PM Simon Urbanek <
> simon.urba...@r-project.org> wrote:
> >
> >
> > > On 27/09/2022, at 10:21 AM, Iñaki Ucar 
> wrote:
> > >
> > > On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
> > >  wrote:
> > >>
> > >> Iñaki,
> > >>
> > >> I'm not sure I understand - system dependencies are an entirely
> different topic and I would argue a far more important one (very happy to
> start a discussion about that), but that has nothing to do with declaring
> downloads. I assumed your question was about large files in packages which
> packages avoid to ship and download instead so declaring them would be
> useful.
> > >
> > > Exactly. Maybe there's a misunderstanding, because I didn't talk about
> system dependencies (alas there are packages that try to download things
> that are declared as system dependencies, as Gabe noted). :)
> > >
> >
> >
> > Ok, understood. I would like to tackle those as well, but let's start
> that conversation in a few weeks when I have a lot more time.
> >
> >
> > >> And for that, the obvious answer is they shouldn't do that - if a
> package needs a file to run, it should include it. So an easy solution is
> to disallow it.
> > >
> > > Then we completely agree. My proposal about declaring additional
> sources was because, given that so many packages do this, I thought that I
> would find a strong opposition to this. But if R Core / CRAN is ok with
> just limiting net access at install time, then that's perfect to me. :)
> > >
> >
> > Yes we do agree :). I started looking at your list, and so far those
> seem simply bugs or design deficiencies in the packages (and outright
> policy violations). I think the only reason they exist is that it doesn't
> get detected in CRAN incoming, it's certainly not intentional.
> >
> > Cheers,
> > Simon
> >
> >
> > > Iñaki
> > >
> > >> But so far all examples where just (ab)use of downloads for binary
> dependencies which is an entirely different issue that needs a different
> solution (in a naive way declaring such dependencies, but we know it's not
> that simple - and download URLs don't help there).
> > >>
> > >> Cheers,
> > >> Simon
> > >>
> > >>
> > >>> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> > >>>
> > >>> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> > >>>  wrote:
> > 
> >  Iñaki,
> > 
> >  I fully agree, this a very common issue since vast majority of
> server deployments I have encountered don't allow internet access. In
> practice this means that such packages are effectively banned.
> > 
> >  I would argue that not even (1) or (2) are really an issue, because
> in fact the CRAN policy doesn't impose any absolute limits on size, it only
> states that the package should be "of minimum necessary size" which means
> it shouldn't waste space. If there is no way to reduce the size without
> impacting functionality, it's perfectly fine.
> > >>>
> > >>> "Packages should be of the minimum necessary size" is subject to
> > >>> interpretation. And in practice, there is an issue with e.g. packages
> > >>> that "bundle" big third-party libraries. There are also packages that
> > >>> require downloading precompiled code, JARs... at installation time.
> > >>>
> >  That said, there are exceptions such as very large datasets 

Re: [Rd] Question about grid.group compositing operators in cairo

2022-09-26 Thread Paul Murrell



Could you also please send me the SVG code that your device is 
generating for your example.  Thanks!


Paul

On 27/09/22 08:50, Paul Murrell wrote:

Hi

Thanks for the report.  It certainly sounds like I have done something 
stupid :)  For my debugging and testing, could you please share the R 
code from your tests?  Thanks!


Paul

On 26/09/22 10:27, Panagiotis Skintzos wrote:

Hello,

I'm trying to update the ggiraph package to graphics engine v15 (currently 
we support up to v14).


I've implemented the group operators, and when I compare the output of 
ggiraph::dsvg with the output of svg/png, I noticed some weird results.


Specifically, some operators in cairo (in, out, dest.in, dest.atop) 
give strange output when any source element in the group has a stroke 
color defined.


I attach three example images, where two stroked rectangles are used 
as source (right) and destination (left).


cairo.over.png shows the result of the over operator in cairo

cairo.in.png shows the result of the in operator in cairo

dsvg.in.png shows the result of the in operator in dsvg


You can see the difference between cairo.in.png and dsvg.in.png. I 
found out why I get different results:


In the dsvg implementation there is one drawing operation: draw the source 
element as a whole (fill and stroke) over the destination element 
(using an feComposite filter).


In the cairo implementation, though, there are two operations: apply the 
fill to the source and draw it over the destination, then apply the stroke 
and draw it over the result of the previous operation.


I'm not sure if this is intentional or not. Shouldn't the source 
element be drawn first as a whole (fill and stroke, with the over 
operator), and only then should the group operator be applied to draw it 
over the destination? That would seem more logical.
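
For reference, a minimal sketch of the kind of comparison described above
(coordinates, colors and the output file name are made up; it assumes R >=
4.2, graphics engine v15 and a cairo-based device):

library(grid)
# destination (left) and source (right): two filled rectangles with strokes
dst <- rectGrob(x = 0.4, y = 0.55, width = 0.35, height = 0.35,
                gp = gpar(fill = "steelblue", col = "black", lwd = 4))
src <- rectGrob(x = 0.6, y = 0.45, width = 0.35, height = 0.35,
                gp = gpar(fill = "orange", col = "black", lwd = 4))
svg("cairo-in.svg")          # cairo-based device; compare with dsvg output
grid.newpage()
grid.group(src, "in", dst)   # composite source onto destination with "in"
dev.off()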



Thanks,

Panagiotis


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel 





--
Dr Paul Murrell
Te Kura Tatauranga | Department of Statistics
Waipapa Taumata Rau | The University of Auckland
Private Bag 92019, Auckland 1142, New Zealand
64 9 3737599 x85392
p...@stat.auckland.ac.nz
www.stat.auckland.ac.nz/~paul/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Dirk Eddelbuettel


Regarding 'system' libraries: packages like stringi and nloptr download the
sources of libicu and libnlopt, respectively, and build the library _if_ it
is not found locally.  If we outlaw this, more users may hit a brick
wall because they cannot install system libraries (for lack of permissions),
or don't know how to, or ...  These facilities were not added to run afoul of
best practices -- they were added to help actual users. Something to keep in
mind. 

Dirk

-- 
dirk.eddelbuettel.com | @eddelbuettel | e...@debian.org

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Bob Rudis
I would personally like something like an Android/iOS permissions
required/requested manifest document describing what the pkg needs,
with R doing what it can to enforce said permissions. R would be
breaking some ground in this space, but it does that regularly in many
respects. Yes, I know I just 10x++ the scope.

I'd support just this flag, tho. Anything to increase transparency and safety.

On Mon, Sep 26, 2022 at 6:22 PM Simon Urbanek
 wrote:
>
> BTW: It is a good question whether packages that require internet access in 
> order to function at all should be flagged as such so they can be removed 
> from server installations. Let's say if a package provides an API for 
> retrieving stock quotes online and it's all it does then perhaps it does make 
> sense to exclude it. It would be pointless to appease the load check just to 
> not be able to perform the function it was designed for...
>
> Cheers,
> Simon
>
>
> > On 27/09/2022, at 11:11 AM, Simon Urbanek  
> > wrote:
> >
> >
> >
> >> On 27/09/2022, at 11:02 AM, Gabriel Becker  wrote:
> >>
> >> For the record, the only things switchr (my package) is doing internet 
> >> wise should be hitting the bioconductor config file 
> >> (http://bioconductor.org/config.yaml) so that it knows the things it need 
> >> to know about Bioc repos/versions/etc (at load time, actually, not install 
> >> time, but since install does a test load, those are essentially the same).
> >>
> >> I have fallback behavior for when the file can't be read, so there 
> >> shouldn't be any actual build breakages/install breakages I don't think, 
> >> but the check does happen.
> >>
> >
> > $ sandbox-exec -n no-network R CMD INSTALL switchr_0.14.5.tar.gz
> > [...]
> > ** testing if installed package can be loaded from final location
> > Error in readLines(con) :
> >  cannot open the connection to 'http://bioconductor.org/config.yaml'
> > Calls:  ... getBiocDevelVr -> getBiocYaml -> inet_handlers -> 
> > readLines
> > Execution halted
> > ERROR: loading failed
> >
> > So, yes, it does break. You should recover from the error and use a 
> > fall-back file that you ship.
> >
> > Cheers,
> > Simon
> >
> >
> >> Advice on what to do for the above use case that is better practice is 
> >> welcome.
> >>
> >> ~G
> >>
> >> On Mon, Sep 26, 2022 at 2:40 PM Simon Urbanek 
> >>  wrote:
> >>
> >>
> >>> On 27/09/2022, at 10:21 AM, Iñaki Ucar  wrote:
> >>>
> >>> On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
> >>>  wrote:
> 
>  Iñaki,
> 
>  I'm not sure I understand - system dependencies are an entirely 
>  different topic and I would argue a far more important one (very happy 
>  to start a discussion about that), but that has nothing to do with 
>  declaring downloads. I assumed your question was about large files in 
>  packages which packages avoid to ship and download instead so declaring 
>  them would be useful.
> >>>
> >>> Exactly. Maybe there's a misunderstanding, because I didn't talk about 
> >>> system dependencies (alas there are packages that try to download things 
> >>> that are declared as system dependencies, as Gabe noted). :)
> >>>
> >>
> >>
> >> Ok, understood. I would like to tackle those as well, but let's start that 
> >> conversation in a few weeks when I have a lot more time.
> >>
> >>
>  And for that, the obvious answer is they shouldn't do that - if a 
>  package needs a file to run, it should include it. So an easy solution 
>  is to disallow it.
> >>>
> >>> Then we completely agree. My proposal about declaring additional sources 
> >>> was because, given that so many packages do this, I thought that I would 
> >>> find a strong opposition to this. But if R Core / CRAN is ok with just 
> >>> limiting net access at install time, then that's perfect to me. :)
> >>>
> >>
> >> Yes we do agree :). I started looking at your list, and so far those seem 
> >> simply bugs or design deficiencies in the packages (and outright policy 
> >> violations). I think the only reason they exist is that it doesn't get 
> >> detected in CRAN incoming, it's certainly not intentional.
> >>
> >> Cheers,
> >> Simon
> >>
> >>
> >>> Iñaki
> >>>
>  But so far all examples where just (ab)use of downloads for binary 
>  dependencies which is an entirely different issue that needs a different 
>  solution (in a naive way declaring such dependencies, but we know it's 
>  not that simple - and download URLs don't help there).
> 
>  Cheers,
>  Simon
> 
> 
> > On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> >
> > On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >  wrote:
> >>
> >> Iñaki,
> >>
> >> I fully agree, this a very common issue since vast majority of server 
> >> deployments I have encountered don't allow internet access. In 
> >> practice this means that such packages are effectively banned.
> >>
> >> I would argue that not even (1) or (2) are really an issue, because in 
> 

Re: [R-pkg-devel] Inquiry

2022-09-26 Thread Rolf Turner


On Mon, 26 Sep 2022 20:07:28 -0400
Andrew Simmons  wrote:

> This issue isn't related to RStudio.
> 
> The issue is that you're exporting an object without providing any
> documentation for it. It sounds like you don't want to export it, so
> you need to go to your NAMESPACE file and remove the part was
> export(r2). If you do want to export it, then you need to document
> it, there's no way around that. The documentation doesn't need to be
> very detailed.
> 
> 
> On Mon., Sep. 26, 2022, 20:01 ,  wrote:
> 
> > ​Hi there,
> > I am writing to aks your help for an issuue arising when I am
> > writing my R package using R studio 1.2.1335 as follows.
> > ``checking for missing documentation entries ... WARNING
> > Undocumented code objects:
> >   'r2'
> > All user-level objects in a package should have documentation
> > entries." The function r2.r is among  .r   files within R folder of
> > my package. I am not interested to present "r2" in the R topics
> > documented: as a part of first page of built pdf help file of my
> > package. I appreciate highly if you could help me to solve this
> > issue.

My understanding is that functions that are not meant to be called
directly by users should be documented in a file named
pkgnm-internal.Rd (substituting your package's name for "pkgnm").  It
should have a structure something like:

> \name{pkgnm-internal}
> \alias{pkgnm-internal}
> \alias{foo}
> \alias{bar}
> \alias{clyde}
> \title{Internal pkgnm functions}
> \description{
>   Internal pkgnm functions.
> }
> \usage{
> foo(x,y,z)
> bar(melvin,irving,...)
> clyde(arrgh)
> }
> \details{
>   These functions are not intended to be called directly by the
>   user.  
> }
> \author{Hoo Hee
>   \email{hoo@somewhere.otr}
> }
> \keyword{internal}

Then if someone types, e.g., "help(clyde)", they get the processed
form of the foregoing *.Rd file displayed, and are thereby told that
they probably should not mess with clyde().

cheers,

Rolf Turner

P.S. I always export everything.

R. T.

-- 
Honorary Research Fellow
Department of Statistics
University of Auckland
Phone: +64-9-373-7599 ext. 88276

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] R build fail without a message

2022-09-26 Thread Simon Urbanek
Paul,

I wouldn't worry about oldrel - that's likely an incomplete run (I don't see 
that error anymore), but I would worry about the failure on R-release:
https://www.r-project.org/nosvn/R.check/r-release-macos-arm64/EdSurvey-00check.html

You can always check with the Mac Builder before you submit it to CRAN:
https://mac.r-project.org/macbuilder/submit.html

Cheers,
Simon



> On 27/09/2022, at 10:03 AM, Bailey, Paul  wrote:
> 
> Hi,
> 
> One of my CRAN packages gets an ARM-64 build fail, visible at 
> https://www.r-project.org/nosvn/R.check/r-oldrel-macos-arm64/EdSurvey-00check.html
>  
> 
> It ends:
> 
>checking replacement functions ... OK
>checking foreign function calls ...
> 
> It looks like someone tripped over the power chord, but I have no way of 
> knowing what actually happened.
> 
> I cannot reproduce this on Rhub for R-release on ARM-64 nor on my coworker's 
> m1 on R-release. Any ideas on what I can do to diagnose this before 
> resubmitting?
> 
> Best,
> Paul Bailey, Ph.D.
> Senior Economist, AIR
> 202-403-5694
> 
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> 

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Inquiry

2022-09-26 Thread Andrew Simmons
This issue isn't related to RStudio.

The issue is that you're exporting an object without providing any
documentation for it. It sounds like you don't want to export it, so you
need to go to your NAMESPACE file and remove the line that says export(r2).
If you do want to export it, then you need to document it; there's no way
around that. The documentation doesn't need to be very detailed.


On Mon., Sep. 26, 2022, 20:01 ,  wrote:

> ​Hi there,
> I am writing to aks your help for an issuue arising when I am writing my R
> package using R studio 1.2.1335 as follows.
> ``checking for missing documentation entries ... WARNING
> Undocumented code objects:
>   'r2'
> All user-level objects in a package should have documentation entries."
> The function r2.r is among  .r   files within R folder of my package.
> I am not interested to present "r2" in the R topics documented: as a part
> of first page of built pdf help file of my package.
> I appreciate highly if you could help me to solve this issue.
>
>
> [[alternative HTML version deleted]]
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel
>

[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Inquiry

2022-09-26 Thread teimouri
​Hi there,
I am writing to ask for your help with an issue that arises when I am writing
my R package using RStudio 1.2.1335, as follows.
"checking for missing documentation entries ... WARNING
Undocumented code objects:
  'r2'
All user-level objects in a package should have documentation entries."
The file r2.r is among the .R files in the R folder of my package.
I do not want "r2" to appear under "R topics documented:" on the first page
of the built PDF help file of my package.
I would highly appreciate it if you could help me solve this issue.


[[alternative HTML version deleted]]

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek
BTW: it is a good question whether packages that require internet access in 
order to function at all should be flagged as such, so they can be removed from 
server installations. Say a package provides an API for retrieving 
stock quotes online and that's all it does; then perhaps it does make sense to 
exclude it. It would be pointless to appease the load check only to be unable 
to perform the function the package was designed for...

Cheers,
Simon


> On 27/09/2022, at 11:11 AM, Simon Urbanek  wrote:
> 
> 
> 
>> On 27/09/2022, at 11:02 AM, Gabriel Becker  wrote:
>> 
>> For the record, the only things switchr (my package) is doing internet wise 
>> should be hitting the bioconductor config file 
>> (http://bioconductor.org/config.yaml) so that it knows the things it need to 
>> know about Bioc repos/versions/etc (at load time, actually, not install 
>> time, but since install does a test load, those are essentially the same).
>> 
>> I have fallback behavior for when the file can't be read, so there shouldn't 
>> be any actual build breakages/install breakages I don't think, but the check 
>> does happen.
>> 
> 
> $ sandbox-exec -n no-network R CMD INSTALL switchr_0.14.5.tar.gz 
> [...]
> ** testing if installed package can be loaded from final location
> Error in readLines(con) : 
>  cannot open the connection to 'http://bioconductor.org/config.yaml'
> Calls:  ... getBiocDevelVr -> getBiocYaml -> inet_handlers -> 
> readLines
> Execution halted
> ERROR: loading failed
> 
> So, yes, it does break. You should recover from the error and use a fall-back 
> file that you ship.
> 
> Cheers,
> Simon
> 
> 
>> Advice on what to do for the above use case that is better practice is 
>> welcome.
>> 
>> ~G
>> 
>> On Mon, Sep 26, 2022 at 2:40 PM Simon Urbanek  
>> wrote:
>> 
>> 
>>> On 27/09/2022, at 10:21 AM, Iñaki Ucar  wrote:
>>> 
>>> On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
>>>  wrote:
 
 Iñaki,
 
 I'm not sure I understand - system dependencies are an entirely different 
 topic and I would argue a far more important one (very happy to start a 
 discussion about that), but that has nothing to do with declaring 
 downloads. I assumed your question was about large files in packages which 
 packages avoid to ship and download instead so declaring them would be 
 useful.
>>> 
>>> Exactly. Maybe there's a misunderstanding, because I didn't talk about 
>>> system dependencies (alas there are packages that try to download things 
>>> that are declared as system dependencies, as Gabe noted). :)
>>> 
>> 
>> 
>> Ok, understood. I would like to tackle those as well, but let's start that 
>> conversation in a few weeks when I have a lot more time.
>> 
>> 
 And for that, the obvious answer is they shouldn't do that - if a package 
 needs a file to run, it should include it. So an easy solution is to 
 disallow it.
>>> 
>>> Then we completely agree. My proposal about declaring additional sources 
>>> was because, given that so many packages do this, I thought that I would 
>>> find a strong opposition to this. But if R Core / CRAN is ok with just 
>>> limiting net access at install time, then that's perfect to me. :)
>>> 
>> 
>> Yes we do agree :). I started looking at your list, and so far those seem 
>> simply bugs or design deficiencies in the packages (and outright policy 
>> violations). I think the only reason they exist is that it doesn't get 
>> detected in CRAN incoming, it's certainly not intentional.
>> 
>> Cheers,
>> Simon
>> 
>> 
>>> Iñaki
>>> 
 But so far all examples where just (ab)use of downloads for binary 
 dependencies which is an entirely different issue that needs a different 
 solution (in a naive way declaring such dependencies, but we know it's not 
 that simple - and download URLs don't help there).
 
 Cheers,
 Simon
 
 
> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> 
> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
>  wrote:
>> 
>> Iñaki,
>> 
>> I fully agree, this a very common issue since vast majority of server 
>> deployments I have encountered don't allow internet access. In practice 
>> this means that such packages are effectively banned.
>> 
>> I would argue that not even (1) or (2) are really an issue, because in 
>> fact the CRAN policy doesn't impose any absolute limits on size, it only 
>> states that the package should be "of minimum necessary size" which 
>> means it shouldn't waste space. If there is no way to reduce the size 
>> without impacting functionality, it's perfectly fine.
> 
> "Packages should be of the minimum necessary size" is subject to
> interpretation. And in practice, there is an issue with e.g. packages
> that "bundle" big third-party libraries. There are also packages that
> require downloading precompiled code, JARs... at installation time.
> 
>> That said, there are exceptions such as very 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek



> On 27/09/2022, at 11:02 AM, Gabriel Becker  wrote:
> 
> For the record, the only things switchr (my package) is doing internet wise 
> should be hitting the bioconductor config file 
> (http://bioconductor.org/config.yaml) so that it knows the things it need to 
> know about Bioc repos/versions/etc (at load time, actually, not install time, 
> but since install does a test load, those are essentially the same).
> 
> I have fallback behavior for when the file can't be read, so there shouldn't 
> be any actual build breakages/install breakages I don't think, but the check 
> does happen.
> 

$ sandbox-exec -n no-network R CMD INSTALL switchr_0.14.5.tar.gz 
[...]
** testing if installed package can be loaded from final location
Error in readLines(con) : 
  cannot open the connection to 'http://bioconductor.org/config.yaml'
Calls:  ... getBiocDevelVr -> getBiocYaml -> inet_handlers -> 
readLines
Execution halted
ERROR: loading failed

So, yes, it does break. You should recover from the error and use a fall-back 
file that you ship.

Cheers,
Simon
 

> Advice on what to do for the above use case that is better practice is 
> welcome.
> 
> ~G
> 
> On Mon, Sep 26, 2022 at 2:40 PM Simon Urbanek  
> wrote:
> 
> 
> > On 27/09/2022, at 10:21 AM, Iñaki Ucar  wrote:
> > 
> > On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
> >  wrote:
> >> 
> >> Iñaki,
> >> 
> >> I'm not sure I understand - system dependencies are an entirely different 
> >> topic and I would argue a far more important one (very happy to start a 
> >> discussion about that), but that has nothing to do with declaring 
> >> downloads. I assumed your question was about large files in packages which 
> >> packages avoid to ship and download instead so declaring them would be 
> >> useful.
> > 
> > Exactly. Maybe there's a misunderstanding, because I didn't talk about 
> > system dependencies (alas there are packages that try to download things 
> > that are declared as system dependencies, as Gabe noted). :)
> > 
> 
> 
> Ok, understood. I would like to tackle those as well, but let's start that 
> conversation in a few weeks when I have a lot more time.
> 
> 
> >> And for that, the obvious answer is they shouldn't do that - if a package 
> >> needs a file to run, it should include it. So an easy solution is to 
> >> disallow it.
> > 
> > Then we completely agree. My proposal about declaring additional sources 
> > was because, given that so many packages do this, I thought that I would 
> > find a strong opposition to this. But if R Core / CRAN is ok with just 
> > limiting net access at install time, then that's perfect to me. :)
> > 
> 
> Yes we do agree :). I started looking at your list, and so far those seem 
> simply bugs or design deficiencies in the packages (and outright policy 
> violations). I think the only reason they exist is that it doesn't get 
> detected in CRAN incoming, it's certainly not intentional.
> 
> Cheers,
> Simon
> 
> 
> > Iñaki
> > 
> >> But so far all examples where just (ab)use of downloads for binary 
> >> dependencies which is an entirely different issue that needs a different 
> >> solution (in a naive way declaring such dependencies, but we know it's not 
> >> that simple - and download URLs don't help there).
> >> 
> >> Cheers,
> >> Simon
> >> 
> >> 
> >>> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> >>> 
> >>> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >>>  wrote:
>  
>  Iñaki,
>  
>  I fully agree, this a very common issue since vast majority of server 
>  deployments I have encountered don't allow internet access. In practice 
>  this means that such packages are effectively banned.
>  
>  I would argue that not even (1) or (2) are really an issue, because in 
>  fact the CRAN policy doesn't impose any absolute limits on size, it only 
>  states that the package should be "of minimum necessary size" which 
>  means it shouldn't waste space. If there is no way to reduce the size 
>  without impacting functionality, it's perfectly fine.
> >>> 
> >>> "Packages should be of the minimum necessary size" is subject to
> >>> interpretation. And in practice, there is an issue with e.g. packages
> >>> that "bundle" big third-party libraries. There are also packages that
> >>> require downloading precompiled code, JARs... at installation time.
> >>> 
>  That said, there are exceptions such as very large datasets (e.g., as 
>  distributed by Bioconductor) which are orders of magnitude larger than 
>  what is sustainable. I agree that it would be nice to have a mechanism 
>  for specifying such sources. So yes, I like the idea, but I'd like to 
>  see more real use cases to justify the effort.
> >>> 
> >>> "More real use cases" like in "more use cases" or like in "the
> >>> previous ones are not real ones"? :)
> >>> 
>  The issue with any online downloads, though, is that there is no 
>  guarantee of availability - which is real issue for 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Gabriel Becker
For the record, the only thing switchr (my package) is doing internet-wise
should be hitting the Bioconductor config file (
http://bioconductor.org/config.yaml) so that it knows the things it needs to
know about Bioc repos/versions/etc. (at load time, actually, not install
time, but since install does a test load, those are essentially the same).

I have fallback behavior for when the file can't be read, so I don't think
there should be any actual build/install breakages, but the check does
happen.

Advice on better practice for the above use case is welcome.

~G

On Mon, Sep 26, 2022 at 2:40 PM Simon Urbanek 
wrote:

>
>
> > On 27/09/2022, at 10:21 AM, Iñaki Ucar  wrote:
> >
> > On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
> >  wrote:
> >>
> >> Iñaki,
> >>
> >> I'm not sure I understand - system dependencies are an entirely
> different topic and I would argue a far more important one (very happy to
> start a discussion about that), but that has nothing to do with declaring
> downloads. I assumed your question was about large files in packages which
> packages avoid to ship and download instead so declaring them would be
> useful.
> >
> > Exactly. Maybe there's a misunderstanding, because I didn't talk about
> system dependencies (alas there are packages that try to download things
> that are declared as system dependencies, as Gabe noted). :)
> >
>
>
> Ok, understood. I would like to tackle those as well, but let's start that
> conversation in a few weeks when I have a lot more time.
>
>
> >> And for that, the obvious answer is they shouldn't do that - if a
> package needs a file to run, it should include it. So an easy solution is
> to disallow it.
> >
> > Then we completely agree. My proposal about declaring additional sources
> was because, given that so many packages do this, I thought that I would
> find a strong opposition to this. But if R Core / CRAN is ok with just
> limiting net access at install time, then that's perfect to me. :)
> >
>
> Yes we do agree :). I started looking at your list, and so far those seem
> simply bugs or design deficiencies in the packages (and outright policy
> violations). I think the only reason they exist is that it doesn't get
> detected in CRAN incoming, it's certainly not intentional.
>
> Cheers,
> Simon
>
>
> > Iñaki
> >
> >> But so far all examples where just (ab)use of downloads for binary
> dependencies which is an entirely different issue that needs a different
> solution (in a naive way declaring such dependencies, but we know it's not
> that simple - and download URLs don't help there).
> >>
> >> Cheers,
> >> Simon
> >>
> >>
> >>> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> >>>
> >>> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >>>  wrote:
> 
>  Iñaki,
> 
>  I fully agree, this a very common issue since vast majority of server
> deployments I have encountered don't allow internet access. In practice
> this means that such packages are effectively banned.
> 
>  I would argue that not even (1) or (2) are really an issue, because
> in fact the CRAN policy doesn't impose any absolute limits on size, it only
> states that the package should be "of minimum necessary size" which means
> it shouldn't waste space. If there is no way to reduce the size without
> impacting functionality, it's perfectly fine.
> >>>
> >>> "Packages should be of the minimum necessary size" is subject to
> >>> interpretation. And in practice, there is an issue with e.g. packages
> >>> that "bundle" big third-party libraries. There are also packages that
> >>> require downloading precompiled code, JARs... at installation time.
> >>>
>  That said, there are exceptions such as very large datasets (e.g., as
> distributed by Bioconductor) which are orders of magnitude larger than what
> is sustainable. I agree that it would be nice to have a mechanism for
> specifying such sources. So yes, I like the idea, but I'd like to see more
> real use cases to justify the effort.
> >>>
> >>> "More real use cases" like in "more use cases" or like in "the
> >>> previous ones are not real ones"? :)
> >>>
>  The issue with any online downloads, though, is that there is no
> guarantee of availability - which is real issue for reproducibility. So one
> could argue that if such external sources are required then they should be
> on a well-defined, independent, permanent storage such as Zenodo. This
> could be a matter of policy as opposed to the technical side above which
> would be adding such support to R CMD INSTALL.
> >>>
> >>> Not necessarily. If the package declares the additional sources in the
> >>> DESCRIPTION (probably with hashes), that's a big improvement over the
> >>> current state of things, in which basically we don't know what the
> >>> package tries download, then it may fail, and finally there's no
> >>> guarantee that it's what the author intended in the first place.
> >>>
> >>> But on top of this, R could add a CMD to 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek



> On 27/09/2022, at 10:21 AM, Iñaki Ucar  wrote:
> 
> On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
>  wrote:
>> 
>> Iñaki,
>> 
>> I'm not sure I understand - system dependencies are an entirely different 
>> topic and I would argue a far more important one (very happy to start a 
>> discussion about that), but that has nothing to do with declaring downloads. 
>> I assumed your question was about large files in packages which packages 
>> avoid to ship and download instead so declaring them would be useful.
> 
> Exactly. Maybe there's a misunderstanding, because I didn't talk about system 
> dependencies (alas there are packages that try to download things that are 
> declared as system dependencies, as Gabe noted). :)
> 


Ok, understood. I would like to tackle those as well, but let's start that 
conversation in a few weeks when I have a lot more time.


>> And for that, the obvious answer is they shouldn't do that - if a package 
>> needs a file to run, it should include it. So an easy solution is to 
>> disallow it.
> 
> Then we completely agree. My proposal about declaring additional sources was 
> because, given that so many packages do this, I thought that I would find a 
> strong opposition to this. But if R Core / CRAN is ok with just limiting net 
> access at install time, then that's perfect to me. :)
> 

Yes, we do agree :). I started looking at your list, and so far those seem 
to be simply bugs or design deficiencies in the packages (and outright policy 
violations). I think the only reason they exist is that this doesn't get detected 
in CRAN incoming; it's certainly not intentional.

Cheers,
Simon


> Iñaki
> 
>> But so far all examples where just (ab)use of downloads for binary 
>> dependencies which is an entirely different issue that needs a different 
>> solution (in a naive way declaring such dependencies, but we know it's not 
>> that simple - and download URLs don't help there).
>> 
>> Cheers,
>> Simon
>> 
>> 
>>> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
>>> 
>>> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
>>>  wrote:
 
 Iñaki,
 
 I fully agree, this a very common issue since vast majority of server 
 deployments I have encountered don't allow internet access. In practice 
 this means that such packages are effectively banned.
 
 I would argue that not even (1) or (2) are really an issue, because in 
 fact the CRAN policy doesn't impose any absolute limits on size, it only 
 states that the package should be "of minimum necessary size" which means 
 it shouldn't waste space. If there is no way to reduce the size without 
 impacting functionality, it's perfectly fine.
>>> 
>>> "Packages should be of the minimum necessary size" is subject to
>>> interpretation. And in practice, there is an issue with e.g. packages
>>> that "bundle" big third-party libraries. There are also packages that
>>> require downloading precompiled code, JARs... at installation time.
>>> 
 That said, there are exceptions such as very large datasets (e.g., as 
 distributed by Bioconductor) which are orders of magnitude larger than 
 what is sustainable. I agree that it would be nice to have a mechanism for 
 specifying such sources. So yes, I like the idea, but I'd like to see more 
 real use cases to justify the effort.
>>> 
>>> "More real use cases" like in "more use cases" or like in "the
>>> previous ones are not real ones"? :)
>>> 
 The issue with any online downloads, though, is that there is no guarantee 
 of availability - which is real issue for reproducibility. So one could 
 argue that if such external sources are required then they should be on a 
 well-defined, independent, permanent storage such as Zenodo. This could be 
 a matter of policy as opposed to the technical side above which would be 
 adding such support to R CMD INSTALL.
>>> 
>>> Not necessarily. If the package declares the additional sources in the
>>> DESCRIPTION (probably with hashes), that's a big improvement over the
>>> current state of things, in which basically we don't know what the
>>> package tries download, then it may fail, and finally there's no
>>> guarantee that it's what the author intended in the first place.
>>> 
>>> But on top of this, R could add a CMD to download those, and then some
>>> lookaside storage could be used on CRAN. This is e.g. how RPM
>>> packaging works: the spec declares all the sources, they are
>>> downloaded once, hashed and stored in a lookaside cache. Then package
>>> building doesn't need general Internet connectivity, just access to
>>> the cache.
>>> 
>>> Iñaki
>>> 
 
 Cheers,
 Simon
 
 
> On Sep 24, 2022, at 3:22 AM, Iñaki Ucar  wrote:
> 
> Hi all,
> 
> I'd like to open this debate here, because IMO this is a big issue.
> Many packages do this for various reasons, some more legitimate than
> others, but I think that this shouldn't be allowed, because it
> 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Iñaki Ucar
On Mon, 26 Sept 2022 at 23:07, Simon Urbanek
 wrote:
>
> Iñaki,
>
> I'm not sure I understand - system dependencies are an entirely different 
> topic and I would argue a far more important one (very happy to start a 
> discussion about that), but that has nothing to do with declaring downloads. 
> I assumed your question was about large files in packages which packages 
> avoid to ship and download instead so declaring them would be useful.

Exactly. Maybe there's a misunderstanding, because I didn't talk about
system dependencies (alas there are packages that try to download
things that are declared as system dependencies, as Gabe noted). :)

> And for that, the obvious answer is they shouldn't do that - if a package 
> needs a file to run, it should include it. So an easy solution is to disallow 
> it.

Then we completely agree. My proposal about declaring additional
sources was made because, given that so many packages do this, I thought
that I would find strong opposition to this. But if R Core / CRAN is
ok with just limiting net access at install time, then that's perfect
for me. :)
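
(For what it's worth, the mechanics of that proposal would amount to
something like the sketch below: each declared extra source is downloaded
once and verified against a recorded checksum before anything uses it. The
helper is hypothetical; no such mechanism exists in R today.)

fetch_declared_source <- function(url, md5, destfile = tempfile()) {
  utils::download.file(url, destfile, mode = "wb")
  if (!identical(unname(tools::md5sum(destfile)), md5))
    stop("checksum mismatch for ", url)  # refuse files that don't match the declared hash
  destfile
}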

Iñaki

> But so far all examples where just (ab)use of downloads for binary 
> dependencies which is an entirely different issue that needs a different 
> solution (in a naive way declaring such dependencies, but we know it's not 
> that simple - and download URLs don't help there).
>
> Cheers,
> Simon
>
>
> > On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> >
> > On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >  wrote:
> >>
> >> Iñaki,
> >>
> >> I fully agree, this a very common issue since vast majority of server 
> >> deployments I have encountered don't allow internet access. In practice 
> >> this means that such packages are effectively banned.
> >>
> >> I would argue that not even (1) or (2) are really an issue, because in 
> >> fact the CRAN policy doesn't impose any absolute limits on size, it only 
> >> states that the package should be "of minimum necessary size" which means 
> >> it shouldn't waste space. If there is no way to reduce the size without 
> >> impacting functionality, it's perfectly fine.
> >
> > "Packages should be of the minimum necessary size" is subject to
> > interpretation. And in practice, there is an issue with e.g. packages
> > that "bundle" big third-party libraries. There are also packages that
> > require downloading precompiled code, JARs... at installation time.
> >
> >> That said, there are exceptions such as very large datasets (e.g., as 
> >> distributed by Bioconductor) which are orders of magnitude larger than 
> >> what is sustainable. I agree that it would be nice to have a mechanism for 
> >> specifying such sources. So yes, I like the idea, but I'd like to see more 
> >> real use cases to justify the effort.
> >
> > "More real use cases" like in "more use cases" or like in "the
> > previous ones are not real ones"? :)
> >
> >> The issue with any online downloads, though, is that there is no guarantee 
> >> of availability - which is real issue for reproducibility. So one could 
> >> argue that if such external sources are required then they should be on a 
> >> well-defined, independent, permanent storage such as Zenodo. This could be 
> >> a matter of policy as opposed to the technical side above which would be 
> >> adding such support to R CMD INSTALL.
> >
> > Not necessarily. If the package declares the additional sources in the
> > DESCRIPTION (probably with hashes), that's a big improvement over the
> > current state of things, in which basically we don't know what the
> > package tries download, then it may fail, and finally there's no
> > guarantee that it's what the author intended in the first place.
> >
> > But on top of this, R could add a CMD to download those, and then some
> > lookaside storage could be used on CRAN. This is e.g. how RPM
> > packaging works: the spec declares all the sources, they are
> > downloaded once, hashed and stored in a lookaside cache. Then package
> > building doesn't need general Internet connectivity, just access to
> > the cache.
> >
> > Iñaki
> >
> >>
> >> Cheers,
> >> Simon
> >>
> >>
> >>> On Sep 24, 2022, at 3:22 AM, Iñaki Ucar  wrote:
> >>>
> >>> Hi all,
> >>>
> >>> I'd like to open this debate here, because IMO this is a big issue.
> >>> Many packages do this for various reasons, some more legitimate than
> >>> others, but I think that this shouldn't be allowed, because it
> >>> basically means that installation fails in a machine without Internet
> >>> access (which happens e.g. in Linux distro builders for security
> >>> reasons).
> >>>
> >>> Now, what if connection is suppressed during package load? There are
> >>> basically three use cases out there:
> >>>
> >>> (1) The package requires additional files for the installation (e.g.
> >>> the source code of an external library) that cannot be bundled into
> >>> the package due to CRAN restrictions (size).
> >>> (2) The package requires additional files for using it (e.g.,
> >>> 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek
Iñaki,

I'm not sure I understand - system dependencies are an entirely different topic, 
and I would argue a far more important one (very happy to start a discussion 
about that), but that has nothing to do with declaring downloads. I assumed 
your question was about large files which packages avoid shipping and instead 
download, so declaring them would be useful. And for that, the obvious answer 
is they shouldn't do that - if a package needs a file to run, it should include 
it. So an easy solution is to disallow it.

But so far all examples were just (ab)use of downloads for binary dependencies, 
which is an entirely different issue that needs a different solution (naively, 
declaring such dependencies, but we know it's not that simple - and download 
URLs don't help there).

Cheers,
Simon


> On 27/09/2022, at 8:25 AM,  Ucar  wrote:
> 
> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
>  wrote:
>> 
>> Iñaki,
>> 
>> I fully agree, this a very common issue since vast majority of server 
>> deployments I have encountered don't allow internet access. In practice this 
>> means that such packages are effectively banned.
>> 
>> I would argue that not even (1) or (2) are really an issue, because in fact 
>> the CRAN policy doesn't impose any absolute limits on size, it only states 
>> that the package should be "of minimum necessary size" which means it 
>> shouldn't waste space. If there is no way to reduce the size without 
>> impacting functionality, it's perfectly fine.
> 
> "Packages should be of the minimum necessary size" is subject to
> interpretation. And in practice, there is an issue with e.g. packages
> that "bundle" big third-party libraries. There are also packages that
> require downloading precompiled code, JARs... at installation time.
> 
>> That said, there are exceptions such as very large datasets (e.g., as 
>> distributed by Bioconductor) which are orders of magnitude larger than what 
>> is sustainable. I agree that it would be nice to have a mechanism for 
>> specifying such sources. So yes, I like the idea, but I'd like to see more 
>> real use cases to justify the effort.
> 
> "More real use cases" like in "more use cases" or like in "the
> previous ones are not real ones"? :)
> 
>> The issue with any online downloads, though, is that there is no guarantee 
>> of availability - which is real issue for reproducibility. So one could 
>> argue that if such external sources are required then they should be on a 
>> well-defined, independent, permanent storage such as Zenodo. This could be a 
>> matter of policy as opposed to the technical side above which would be 
>> adding such support to R CMD INSTALL.
> 
> Not necessarily. If the package declares the additional sources in the
> DESCRIPTION (probably with hashes), that's a big improvement over the
> current state of things, in which basically we don't know what the
> package tries download, then it may fail, and finally there's no
> guarantee that it's what the author intended in the first place.
> 
> But on top of this, R could add a CMD to download those, and then some
> lookaside storage could be used on CRAN. This is e.g. how RPM
> packaging works: the spec declares all the sources, they are
> downloaded once, hashed and stored in a lookaside cache. Then package
> building doesn't need general Internet connectivity, just access to
> the cache.
> 
> Iñaki
> 
>> 
>> Cheers,
>> Simon
>> 
>> 
>>> On Sep 24, 2022, at 3:22 AM, Iñaki Ucar  wrote:
>>> 
>>> Hi all,
>>> 
>>> I'd like to open this debate here, because IMO this is a big issue.
>>> Many packages do this for various reasons, some more legitimate than
>>> others, but I think that this shouldn't be allowed, because it
>>> basically means that installation fails in a machine without Internet
>>> access (which happens e.g. in Linux distro builders for security
>>> reasons).
>>> 
>>> Now, what if connection is suppressed during package load? There are
>>> basically three use cases out there:
>>> 
>>> (1) The package requires additional files for the installation (e.g.
>>> the source code of an external library) that cannot be bundled into
>>> the package due to CRAN restrictions (size).
>>> (2) The package requires additional files for using it (e.g.,
>>> datasets, a JAR...) that cannot be bundled into the package due to
>>> CRAN restrictions (size).
>>> (3) Other spurious reasons (e.g. the maintainer decided that package
>>> load was a good place to check an online service availability, etc.).
>>> 
>>> Again IMO, (3) shouldn't be allowed in any case; (2) should be a
>>> separate function that the user actively calls to download the files,
>>> and those files should be placed into the user dir, and (3) is the
>>> only legitimate use, but then other mechanism should be provided to
>>> avoid connections during package load.
>>> 
>>> My proposal to support (3) would be to add a new field in the
>>> DESCRIPTION, "Additional_sources", which would be a comma separated
>>> 

[R-pkg-devel] R build fail without a message

2022-09-26 Thread Bailey, Paul
Hi,

One of my CRAN packages gets an ARM-64 build fail, visible at 
https://www.r-project.org/nosvn/R.check/r-oldrel-macos-arm64/EdSurvey-00check.html
 

It ends:

checking replacement functions ... OK
checking foreign function calls ...

It looks like someone tripped over the power cord, but I have no way of 
knowing what actually happened.

I cannot reproduce this on R-hub for R-release on ARM-64, nor on my coworker's 
M1 on R-release. Any ideas on what I can do to diagnose this before resubmitting?

Best,
Paul Bailey, Ph.D.
Senior Economist, AIR
202-403-5694

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek
Gabe,

that's a great example of how **not** to do it and why it is such a bad idea. 
icu4c is a system library, so it is generally available and already ships the 
data, so embedding data from an outdated version is generally bad. I'm not sure 
why it should be needed in the first place, since ICU actually tries to avoid 
the need for external files, so I'd say this would ideally be fixed in stringi.

That said, if you want to cache static data from the system library, that is an 
option, but it should be done at build time from the system (no internet needed) 
- it is common practice - have a look at sf (and other packages that copy 
projection data from PROJ). So, yes, that's a good argument for disallowing 
downloads, to detect such issues in packages.
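
A rough sketch of that build-time caching idea in R (the paths and file name
here are hypothetical, not what sf or stringi actually do): copy a data file
from the system installation into the package sources before building, so no
download is ever needed.

# e.g. run from a tools/ script before building the package
src <- file.path(Sys.getenv("ICU_DATA", "/usr/share/icu"), "icudt.dat")
if (file.exists(src)) {
  dir.create(file.path("inst", "extdata"), recursive = TRUE, showWarnings = FALSE)
  file.copy(src, file.path("inst", "extdata", "icudt.dat"), overwrite = TRUE)
}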

Cheers,
Simon



> On 27/09/2022, at 9:02 AM, Gabriel Becker  wrote:
> 
> Hi Simon,
> 
> The example of this I'm aware of that is most popular and widely used "in the 
> wild" is the stringi package (which is a dep of the widely used stringr pkg) 
> whose configure file downloads the ICU Data Library (icudt).
> 
> See https://github.com/gagolews/stringi/blob/master/configure#L5412
> 
> Note it does have some sort of workaround in place for non-internet-capable 
> build machines, but it is external (the build in question fails without the 
> workaround already explicitly performed).
> 
> Best,
> ~G
> 
> 
> 
> On Mon, Sep 26, 2022 at 12:50 PM Simon Urbanek  
> wrote:
> 
> 
> > On Sep 27, 2022, at 8:25 AM, Iñaki Ucar  wrote:
> > 
> > On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >  wrote:
> >> 
> >> Iñaki,
> >> 
> >> I fully agree, this a very common issue since vast majority of server 
> >> deployments I have encountered don't allow internet access. In practice 
> >> this means that such packages are effectively banned.
> >> 
> >> I would argue that not even (1) or (2) are really an issue, because in 
> >> fact the CRAN policy doesn't impose any absolute limits on size, it only 
> >> states that the package should be "of minimum necessary size" which means 
> >> it shouldn't waste space. If there is no way to reduce the size without 
> >> impacting functionality, it's perfectly fine.
> > 
> > "Packages should be of the minimum necessary size" is subject to
> > interpretation. And in practice, there is an issue with e.g. packages
> > that "bundle" big third-party libraries. There are also packages that
> > require downloading precompiled code, JARs... at installation time.
> > 
> 
> JARs are part of the package, so that's a valid use, no question there, 
> that's how Java packages do this already.
> 
> Downloading pre-compiled binaries is something that shouldn't be done and a 
> whole can of worms (since those are not sources and it *is* specific to the 
> platform, os etc.) that is entirely separate, but worth a separate 
> discussion. So I still don't see any use cases for actual sources. I do see a 
> need for better specification of external dependencies which are not part of 
> the package such that those can be satisfied automatically - but that's not 
> the problem you asked about.
> 
> 
> >> That said, there are exceptions such as very large datasets (e.g., as 
> >> distributed by Bioconductor) which are orders of magnitude larger than 
> >> what is sustainable. I agree that it would be nice to have a mechanism for 
> >> specifying such sources. So yes, I like the idea, but I'd like to see more 
> >> real use cases to justify the effort.
> > 
> > "More real use cases" like in "more use cases" or like in "the
> > previous ones are not real ones"? :)
> > 
> >> The issue with any online downloads, though, is that there is no guarantee 
> >> of availability - which is real issue for reproducibility. So one could 
> >> argue that if such external sources are required then they should be on a 
> >> well-defined, independent, permanent storage such as Zenodo. This could be 
> >> a matter of policy as opposed to the technical side above which would be 
> >> adding such support to R CMD INSTALL.
> > 
> > Not necessarily. If the package declares the additional sources in the
> > DESCRIPTION (probably with hashes), that's a big improvement over the
> > current state of things, in which basically we don't know what the
> > package tries download, then it may fail, and finally there's no
> > guarantee that it's what the author intended in the first place.
> > 
> > But on top of this, R could add a CMD to download those, and then some
> > lookaside storage could be used on CRAN. This is e.g. how RPM
> > packaging works: the spec declares all the sources, they are
> > downloaded once, hashed and stored in a lookaside cache. Then package
> > building doesn't need general Internet connectivity, just access to
> > the cache.
> > 
> 
> Sure, I fully agree that it would be a good first step, but I'm still waiting 
> for examples ;).
> 
> Cheers,
> Simon
> 
> 
> > Iñaki
> > 
> >> 
> >> Cheers,
> >> Simon
> >> 
> >> 
> >>> On Sep 24, 2022, at 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Iñaki Ucar
On Mon, 26 Sept 2022 at 21:50, Simon Urbanek
 wrote:
>
> [snip]
> Sure, I fully agree that it would be a good first step, but I'm still waiting 
> for examples ;).

Oh, you want me to actually name specific packages? I thought that
this was a well-established fact from your initial statement "I fully
agree, this a very common issue [...]", so I preferred to avoid
pointing fingers.

But of course you can start by taking a look at [1], where all
packages marked as "internet" or "cargo" are downloading stuff at
install time. There are some others that are too important to get rid
of, so I just build them with an Internet connection from time to
time. Or have them patched to avoid such downloads.

And others have been fixed after me opening an issue when a package
blows up when I try to build an RPM with it. But this is like playing
cat and mouse if this is not enforced somehow.

[1] https://github.com/Enchufa2/cran2copr/blob/master/excl-no-sysreqs.txt

-- 
Iñaki Úcar

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Gabriel Becker
Hi Simon,

The most popular and widely used example of this "in the wild" that I'm aware
of is the stringi package (a dependency of the widely used stringr package),
whose configure file downloads the ICU data library (icudt).

See https://github.com/gagolews/stringi/blob/master/configure#L5412

Note it does have a workaround for non-internet-capable build machines, but it
is external (the build in question fails unless the workaround has already been
performed explicitly).

Best,
~G



On Mon, Sep 26, 2022 at 12:50 PM Simon Urbanek 
wrote:

>
>
> > On Sep 27, 2022, at 8:25 AM, Iñaki Ucar  wrote:
> >
> > On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
> >  wrote:
> >>
> >> Iñaki,
> >>
> >> I fully agree, this a very common issue since vast majority of server
> deployments I have encountered don't allow internet access. In practice
> this means that such packages are effectively banned.
> >>
> >> I would argue that not even (1) or (2) are really an issue, because in
> fact the CRAN policy doesn't impose any absolute limits on size, it only
> states that the package should be "of minimum necessary size" which means
> it shouldn't waste space. If there is no way to reduce the size without
> impacting functionality, it's perfectly fine.
> >
> > "Packages should be of the minimum necessary size" is subject to
> > interpretation. And in practice, there is an issue with e.g. packages
> > that "bundle" big third-party libraries. There are also packages that
> > require downloading precompiled code, JARs... at installation time.
> >
>
> JARs are part of the package, so that's a valid use, no question there,
> that's how Java packages do this already.
>
> Downloading pre-compiled binaries is something that shouldn't be done and
> a whole can of worms (since those are not sources and it *is* specific to
> the platform, OS, etc.) that is entirely separate, but worth a separate
> discussion. So I still don't see any use cases for actual sources. I do see
> a need for better specification of external dependencies which are not part
> of the package such that those can be satisfied automatically - but that's
> not the problem you asked about.
>
>
> >> That said, there are exceptions such as very large datasets (e.g., as
> distributed by Bioconductor) which are orders of magnitude larger than what
> is sustainable. I agree that it would be nice to have a mechanism for
> specifying such sources. So yes, I like the idea, but I'd like to see more
> real use cases to justify the effort.
> >
> > "More real use cases" like in "more use cases" or like in "the
> > previous ones are not real ones"? :)
> >
> >> The issue with any online downloads, though, is that there is no
> guarantee of availability - which is a real issue for reproducibility. So one
> could argue that if such external sources are required then they should be
> on a well-defined, independent, permanent storage such as Zenodo. This
> could be a matter of policy as opposed to the technical side above which
> would be adding such support to R CMD INSTALL.
> >
> > Not necessarily. If the package declares the additional sources in the
> > DESCRIPTION (probably with hashes), that's a big improvement over the
> > current state of things, in which basically we don't know what the
> > package tries to download, then it may fail, and finally there's no
> > guarantee that it's what the author intended in the first place.
> >
> > But on top of this, R could add a CMD to download those, and then some
> > lookaside storage could be used on CRAN. This is e.g. how RPM
> > packaging works: the spec declares all the sources, they are
> > downloaded once, hashed and stored in a lookaside cache. Then package
> > building doesn't need general Internet connectivity, just access to
> > the cache.
> >
>
> Sure, I fully agree that it would be a good first step, but I'm still
> waiting for examples ;).
>
> Cheers,
> Simon
>
>
> > Iñaki
> >
> >>
> >> Cheers,
> >> Simon
> >>
> >>
> >>> On Sep 24, 2022, at 3:22 AM, Iñaki Ucar 
> wrote:
> >>>
> >>> Hi all,
> >>>
> >>> I'd like to open this debate here, because IMO this is a big issue.
> >>> Many packages do this for various reasons, some more legitimate than
> >>> others, but I think that this shouldn't be allowed, because it
> >>> basically means that installation fails in a machine without Internet
> >>> access (which happens e.g. in Linux distro builders for security
> >>> reasons).
> >>>
> >>> Now, what if connection is suppressed during package load? There are
> >>> basically three use cases out there:
> >>>
> >>> (1) The package requires additional files for the installation (e.g.
> >>> the source code of an external library) that cannot be bundled into
> >>> the package due to CRAN restrictions (size).
> >>> (2) The package requires additional files for using it (e.g.,
> >>> datasets, a JAR...) that cannot be bundled into the package due to
> >>> CRAN restrictions (size).
> >>> (3) Other spurious reasons (e.g. the maintainer 

Re: [Rd] Question about grid.group compositing operators in cairo

2022-09-26 Thread Paul Murrell

Hi

Thanks for the report.  It certainly sounds like I have done something 
stupid :)  For my debugging and testing could you please share the R 
code from your tests?  Thanks!


Paul

On 26/09/22 10:27, Panagiotis Skintzos wrote:

Hello,

I'm trying to update the ggiraph package to graphics engine v15 (currently we 
support up to v14).


I've implemented the group operators and when I compare the outputs of 
ggiraph::dsvg with the outputs of svg/png, I noticed some weird results.


Specifically, some operators in cairo (in, out, dest.in, dest.atop) give 
strange output, when any source element in the group has a stroke color 
defined.


I attach three example images, where two stroked rectangles are used as 
source (right) and destination (left).


cairo.over.png shows the result of the over operator in cairo

cairo.in.png shows the result of the in operator in cairo

dsvg.in.png shows the result of the in operator in dsvg


You can see the difference between cairo.in.png and dsvg.in.png. I found 
out why I get different results:


In the dsvg implementation there is one drawing operation: draw the source 
element as a whole (fill and stroke) over the destination element (using an 
feComposite filter).


In the cairo implementation, though, there are two operations: apply the fill 
on the source and draw it over the destination, then apply the stroke and 
draw it over the result of the previous operation.


I'm not sure whether this is intentional. Shouldn't the source element 
be drawn first as a whole (fill and stroke with the over operator), and then 
the group operator applied to draw it over the destination? That would seem 
more logical.



Thanks,

Panagiotis


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


--
Dr Paul Murrell
Te Kura Tatauranga | Department of Statistics
Waipapa Taumata Rau | The University of Auckland
Private Bag 92019, Auckland 1142, New Zealand
64 9 3737599 x85392
p...@stat.auckland.ac.nz
www.stat.auckland.ac.nz/~paul/

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Simon Urbanek



> On Sep 27, 2022, at 8:25 AM, Iñaki Ucar  wrote:
> 
> On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
>  wrote:
>> 
>> Iñaki,
>> 
>> I fully agree, this is a very common issue since the vast majority of server 
>> deployments I have encountered don't allow internet access. In practice this 
>> means that such packages are effectively banned.
>> 
>> I would argue that not even (1) or (2) are really an issue, because in fact 
>> the CRAN policy doesn't impose any absolute limits on size, it only states 
>> that the package should be "of minimum necessary size" which means it 
>> shouldn't waste space. If there is no way to reduce the size without 
>> impacting functionality, it's perfectly fine.
> 
> "Packages should be of the minimum necessary size" is subject to
> interpretation. And in practice, there is an issue with e.g. packages
> that "bundle" big third-party libraries. There are also packages that
> require downloading precompiled code, JARs... at installation time.
> 

JARs are part of the package, so that's a valid use, no question there, that's 
how Java packages do this already.
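
For illustration only (a rough sketch of the usual convention, not something
taken from this thread): the JARs ship under inst/java/ of the package and
rJava puts them on the class path when the package is loaded, e.g.

.onLoad <- function(libname, pkgname) {
    # picks up the JARs shipped in inst/java/ of this package
    rJava::.jpackage(pkgname, lib.loc = libname)
}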

Downloading pre-compiled binaries is something that shouldn't be done and a 
whole can of worms (since those are not sources and it *is* specific to the 
platform, OS, etc.) that is entirely separate, but worth a separate discussion. 
So I still don't see any use cases for actual sources. I do see a need for 
better specification of external dependencies which are not part of the package 
such that those can be satisfied automatically - but that's not the problem you 
asked about.
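
(Today the closest thing is the free-text SystemRequirements field, e.g.
something along the lines of

SystemRequirements: libcurl: libcurl-devel (rpm) or libcurl4-openssl-dev (deb)

which distro packagers can only parse heuristically, not automatically.)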


>> That said, there are exceptions such as very large datasets (e.g., as 
>> distributed by Bioconductor) which are orders of magnitude larger than what 
>> is sustainable. I agree that it would be nice to have a mechanism for 
>> specifying such sources. So yes, I like the idea, but I'd like to see more 
>> real use cases to justify the effort.
> 
> "More real use cases" like in "more use cases" or like in "the
> previous ones are not real ones"? :)
> 
>> The issue with any online downloads, though, is that there is no guarantee 
> >> of availability - which is a real issue for reproducibility. So one could 
>> argue that if such external sources are required then they should be on a 
>> well-defined, independent, permanent storage such as Zenodo. This could be a 
>> matter of policy as opposed to the technical side above which would be 
>> adding such support to R CMD INSTALL.
> 
> Not necessarily. If the package declares the additional sources in the
> DESCRIPTION (probably with hashes), that's a big improvement over the
> current state of things, in which basically we don't know what the
> package tries to download, then it may fail, and finally there's no
> guarantee that it's what the author intended in the first place.
> 
> But on top of this, R could add a CMD to download those, and then some
> lookaside storage could be used on CRAN. This is e.g. how RPM
> packaging works: the spec declares all the sources, they are
> downloaded once, hashed and stored in a lookaside cache. Then package
> building doesn't need general Internet connectivity, just access to
> the cache.
> 

Sure, I fully agree that it would be a good first step, but I'm still waiting 
for examples ;).

Cheers,
Simon


> Iñaki
> 
>> 
>> Cheers,
>> Simon
>> 
>> 
>>> On Sep 24, 2022, at 3:22 AM, Iñaki Ucar  wrote:
>>> 
>>> Hi all,
>>> 
>>> I'd like to open this debate here, because IMO this is a big issue.
>>> Many packages do this for various reasons, some more legitimate than
>>> others, but I think that this shouldn't be allowed, because it
>>> basically means that installation fails in a machine without Internet
>>> access (which happens e.g. in Linux distro builders for security
>>> reasons).
>>> 
>>> Now, what if connection is suppressed during package load? There are
>>> basically three use cases out there:
>>> 
>>> (1) The package requires additional files for the installation (e.g.
>>> the source code of an external library) that cannot be bundled into
>>> the package due to CRAN restrictions (size).
>>> (2) The package requires additional files for using it (e.g.,
>>> datasets, a JAR...) that cannot be bundled into the package due to
>>> CRAN restrictions (size).
>>> (3) Other spurious reasons (e.g. the maintainer decided that package
>>> load was a good place to check an online service availability, etc.).
>>> 
>>> Again IMO, (3) shouldn't be allowed in any case; (2) should be a
>>> separate function that the user actively calls to download the files,
> >>> and those files should be placed into the user dir, and (1) is the
> >>> only legitimate use, but then another mechanism should be provided to
>>> avoid connections during package load.
>>> 
> >>> My proposal to support (1) would be to add a new field in the
>>> DESCRIPTION, "Additional_sources", which would be a comma separated
>>> list of additional resources to download during R CMD INSTALL. Those
>>> sources would be 

Re: [Rd] Proposal to limit Internet access during package load

2022-09-26 Thread Iñaki Ucar
On Sat, 24 Sept 2022 at 01:55, Simon Urbanek
 wrote:
>
> Iñaki,
>
> I fully agree, this is a very common issue since the vast majority of server 
> deployments I have encountered don't allow internet access. In practice this 
> means that such packages are effectively banned.
>
> I would argue that not even (1) or (2) are really an issue, because in fact 
> the CRAN policy doesn't impose any absolute limits on size, it only states 
> that the package should be "of minimum necessary size" which means it 
> shouldn't waste space. If there is no way to reduce the size without 
> impacting functionality, it's perfectly fine.

"Packages should be of the minimum necessary size" is subject to
interpretation. And in practice, there is an issue with e.g. packages
that "bundle" big third-party libraries. There are also packages that
require downloading precompiled code, JARs... at installation time.

> That said, there are exceptions such as very large datasets (e.g., as 
> distributed by Bioconductor) which are orders of magnitude larger than what 
> is sustainable. I agree that it would be nice to have a mechanism for 
> specifying such sources. So yes, I like the idea, but I'd like to see more 
> real use cases to justify the effort.

"More real use cases" like in "more use cases" or like in "the
previous ones are not real ones"? :)

> The issue with any online downloads, though, is that there is no guarantee of 
> availability - which is a real issue for reproducibility. So one could argue 
> that if such external sources are required then they should be on a 
> well-defined, independent, permanent storage such as Zenodo. This could be a 
> matter of policy as opposed to the technical side above which would be adding 
> such support to R CMD INSTALL.

Not necessarily. If the package declares the additional sources in the
DESCRIPTION (probably with hashes), that's a big improvement over the
current state of things, in which basically we don't know what the
package tries to download, then it may fail, and finally there's no
guarantee that it's what the author intended in the first place.

But on top of this, R could add a CMD to download those, and then some
lookaside storage could be used on CRAN. This is e.g. how RPM
packaging works: the spec declares all the sources, they are
downloaded once, hashed and stored in a lookaside cache. Then package
building doesn't need general Internet connectivity, just access to
the cache.
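
Purely as an illustration of the idea (the field is the one proposed below
and does not exist in R today; the URLs and hashes are placeholders), a
package could declare something like

Additional_sources:
    https://example.org/icudt.zip (sha256 <hash>),
    https://example.org/bigdata.tar.gz (sha256 <hash>)

and R CMD INSTALL (or a dedicated prefetch step on the CRAN side) would fetch
the files, verify the hashes and place them where the package expects them.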

Iñaki

>
> Cheers,
> Simon
>
>
> > On Sep 24, 2022, at 3:22 AM, Iñaki Ucar  wrote:
> >
> > Hi all,
> >
> > I'd like to open this debate here, because IMO this is a big issue.
> > Many packages do this for various reasons, some more legitimate than
> > others, but I think that this shouldn't be allowed, because it
> > basically means that installation fails in a machine without Internet
> > access (which happens e.g. in Linux distro builders for security
> > reasons).
> >
> > Now, what if connection is suppressed during package load? There are
> > basically three use cases out there:
> >
> > (1) The package requires additional files for the installation (e.g.
> > the source code of an external library) that cannot be bundled into
> > the package due to CRAN restrictions (size).
> > (2) The package requires additional files for using it (e.g.,
> > datasets, a JAR...) that cannot be bundled into the package due to
> > CRAN restrictions (size).
> > (3) Other spurious reasons (e.g. the maintainer decided that package
> > load was a good place to check an online service availability, etc.).
> >
> > Again IMO, (3) shouldn't be allowed in any case; (2) should be a
> > separate function that the user actively calls to download the files,
> > and those files should be placed into the user dir, and (1) is the
> > only legitimate use, but then another mechanism should be provided to
> > avoid connections during package load.
> >
> > My proposal to support (1) would be to add a new field in the
> > DESCRIPTION, "Additional_sources", which would be a comma separated
> > list of additional resources to download during R CMD INSTALL. Those
> > sources would be downloaded by R CMD INSTALL if not provided via an
> > option (to support offline installations), and would be placed in a
> > predefined place for the package to find and configure them (via an
> > environment variable or in a predefined subdirectory).
> >
> > This proposal has several advantages. Apart from the obvious one
> > (Internet access during package load can be limited without losing
> > current functionalities), it gives more visibility to the resources
> > that packages are using during the installation phase, and thus makes
> > those installations more reproducible and more secure.
> >
> > Best,
> > --
> > Iñaki Úcar
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
>


-- 
Iñaki Úcar

__
R-devel@r-project.org 

Re: [R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Ivan Krylov
It might be easier to help you if you show us your package by
publishing the source code somewhere.

On Mon, 26 Sep 2022 22:22:48 +0400
"Jahajeeah, Havisha"  wrote:

> CIvalue2: no visible global function definition for 'qt'
> andgm11: no visible binding for global variable 'ParticleSwarm'
> andgm11: no visible global function definition for 'tail'
> app: no visible global function definition for 'shinyApp'
> dbgm12: no visible binding for global variable 'ParticleSwarm'

It sounds like your NAMESPACE file isn't properly set up to import the
functions you're using. For the package to work correctly, it should
contain lines

importFrom(stats, qt)
importFrom(utils, tail)
importFrom(shiny, shinyApp)

and so on for every function you use that's not in base.

See Writing R Extensions section 1.5:
https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Specifying-imports-and-exports

> Objects in \usage without \alias in documentation object 'Plots':
>   'plots'

A single Rd file can describe multiple functions, but each of them should
be mentioned both in the \usage{} section and as an \alias{}. Do you
export two different objects (functions?) named "plots" and "Plots", or
is one of those an error?
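
For example, if both are real exported functions, Plots.Rd would need
something along these lines (a minimal sketch; the title and argument name
are made up):

\name{Plots}
\alias{Plots}
\alias{plots}
\title{Plotting helpers}
\usage{
Plots(x)
plots(x)
}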

> Bad \usage lines found in documentation object 'BackgroundValues':
>   gm11(x0), epgm11(x0), tbgm11(x0), igm11(x0), gm114(x0)

The \usage{} section must exactly match the definition of the function
(but you can omit default values of the arguments if they're too large
and not very informative), without any other words or punctuation.
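
Concretely, each function gets its own plain call on its own line, e.g. (a
sketch, assuming x0 really is the argument name):

\usage{
gm11(x0)
epgm11(x0)
tbgm11(x0)
}

rather than a comma-separated list of calls on a single line.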

Once your package passes the automated tests, a human volunteer will go
over your package to make sure that it fits the CRAN policy (not
providing a link because you've already read it when you submitted the
package), which includes having good documentation for every function
you export.

See Writing R Extensions section 2 for more information on this:
https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Writing-R-documentation-files

I've also noticed that you're showing us an error message from the CRAN
pre-test infrastructure. You can get these errors (and start fixing
them) faster without spending time waiting for the test result by
running R CMD check --as-cran your_package.tar.gz on your own machine.

-- 
Best regards,
Ivan

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Henrik Bengtsson
Hello,

are you aware of https://win-builder.r-project.org/? It'll allow you
to validate that your package passes all the requirements before
submitting it to CRAN.

My $.02

/Henrik

On Mon, Sep 26, 2022 at 11:23 AM Jahajeeah, Havisha
 wrote:
>
> Dear team,
>
> My second attempt at submitting the package GreymodelsPackage_1.0.tar.gz to
> CRAN.
>
> Grateful if you could please assist me with the following issues:
>
> CIvalue2: no visible global function definition for 'qt'
> andgm11: no visible binding for global variable 'ParticleSwarm'
> andgm11: no visible global function definition for 'tail'
> app: no visible global function definition for 'shinyApp'
> dbgm12: no visible binding for global variable 'ParticleSwarm'
> dbgm12: no visible global function definition fo
>
> lotegm: no visible global function definition for 'geom_point'
> plotegm: no visible global function definition for 'aes'
> plotegm: no visible binding for global variable 'x'
> plotegm: no visible binding for global variable 'y'
> plotegm: no visible global function definition for 'geom_line'
> plotegm: no visible global function definition for 'scale_color_manual'
> plotegm: no visible global function definition for 'ggplot
>
> Also,
>
> Objects in \usage without \alias in documentation object 'Multivariable':
>   'dbgm12'
>
> Objects in \usage without \alias in documentation object 'Plots':
>   'plots'
>
> Bad \usage lines found in documentation object 'BackgroundValues':
>   gm11(x0), epgm11(x0), tbgm11(x0), igm11(x0), gm114(x0)
> Bad \usage lines found in documentation object 'CombinedModels':
>   ngbm11(x0), ggvm11(x0), tfdgm11(x0)
>
>
>
> Thanks and Regards,
>
> Havisha Jahajeeah
>
>
> On Mon, Sep 26, 2022 at 7:14 PM Jahajeeah, Havisha 
> wrote:
>
> > Dear team,
> >
> > I have submitted the package GreymodelsPackage_1.0.tar.gz to CRAN and it's
> > a shiny app.
> >
> > However, I received  the following
> >
> > * using log directory 
> > 'd:/RCompile/CRANincoming/R-devel/GreymodelsPackage.Rcheck'
> > * using R Under development (unstable) (2022-09-25 r82916 ucrt)
> > * using platform: x86_64-w64-mingw32 (64-bit)
> > * using session charset: UTF-8
> > * checking for file 'GreymodelsPackage/DESCRIPTION' ... OK
> > * checking extension type ... ERROR
> > Extensions with Type 'Shiny application' cannot be checked.
> > * DONE
> > Status: 1 ERROR
> >
> > I am not sure how to fix the problems and I would appreciate your help on 
> > how to resolve this issue.
> >
> > Thanks and regards,
> >
> > Havisha Jahajeeah
> >
> >
>
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unable to create manual

2022-09-26 Thread Ivan Krylov
On Mon, 26 Sep 2022 10:50:01 -0700
Edward Wei  wrote:

> 1. Where do I run "make check"?

In the directory where R is built from source. If you're using a binary
build of R, this isn't applicable.
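
For example (assuming a source build of R in ~/R-devel/build):

cd ~/R-devel/build
make check

The check target belongs to R's own build tree, which is why running
"R CMD make check" inside a package directory fails with "No rule to make
target 'check'".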

> 3. I get this back when I run the "tools::testInstalledPackages(scope
> = "base")" on my RGUI.
> 
> Error: testing 'utils' failed
> 
> Where may I find the error log for this?

testInstalledPackages() creates output files in the current directory.
Does list.files(pattern = 'utils|Rout\\.fail') give you anything useful?

-- 
Best regards,
Ivan

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Jahajeeah, Havisha
Dear team,

My second attempt at submitting the package GreymodelsPackage_1.0.tar.gz to
CRAN.

Grateful if you could please assist me with the following issues:

CIvalue2: no visible global function definition for 'qt'
andgm11: no visible binding for global variable 'ParticleSwarm'
andgm11: no visible global function definition for 'tail'
app: no visible global function definition for 'shinyApp'
dbgm12: no visible binding for global variable 'ParticleSwarm'
dbgm12: no visible global function definition fo

lotegm: no visible global function definition for 'geom_point'
plotegm: no visible global function definition for 'aes'
plotegm: no visible binding for global variable 'x'
plotegm: no visible binding for global variable 'y'
plotegm: no visible global function definition for 'geom_line'
plotegm: no visible global function definition for 'scale_color_manual'
plotegm: no visible global function definition for 'ggplot

Also,

Objects in \usage without \alias in documentation object 'Multivariable':
  'dbgm12'

Objects in \usage without \alias in documentation object 'Plots':
  'plots'

Bad \usage lines found in documentation object 'BackgroundValues':
  gm11(x0), epgm11(x0), tbgm11(x0), igm11(x0), gm114(x0)
Bad \usage lines found in documentation object 'CombinedModels':
  ngbm11(x0), ggvm11(x0), tfdgm11(x0)



Thanks and Regards,

Havisha Jahajeeah


On Mon, Sep 26, 2022 at 7:14 PM Jahajeeah, Havisha 
wrote:

> Dear team,
>
> I have submitted the package GreymodelsPackage_1.0.tar.gz to CRAN and it's
> a shiny app.
>
> However, I received  the following
>
> * using log directory 
> 'd:/RCompile/CRANincoming/R-devel/GreymodelsPackage.Rcheck'
> * using R Under development (unstable) (2022-09-25 r82916 ucrt)
> * using platform: x86_64-w64-mingw32 (64-bit)
> * using session charset: UTF-8
> * checking for file 'GreymodelsPackage/DESCRIPTION' ... OK
> * checking extension type ... ERROR
> Extensions with Type 'Shiny application' cannot be checked.
> * DONE
> Status: 1 ERROR
>
> I am not sure how to fix the problems and I would appreciate your help on 
> how to resolve this issue.
>
> Thanks and regards,
>
> Havisha Jahajeeah
>
>


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Unable to create manual

2022-09-26 Thread Edward Wei
So I am looking at the R Installation and Administration manual,
specifically section 3 for Windows. I have some questions.

1. Where do I run "make check"?
2. I got this when I was running the make check:
C:\Users\edmon\Documents\R\RFIN> R CMD make check
make: *** No rule to make target 'check'.  Stop.

I'm guessing I'm running it incorrectly.

3. I get this back when I run tools::testInstalledPackages(scope =
"base") in RGui.

Error: testing 'utils' failed

Where may I find the error log for this?


Best regards,


On Sat, Sep 17, 2022 at 3:27 AM Tomas Kalibera 
wrote:

>
> On 9/16/22 20:17, Edward Wei wrote:
> > That is an interesting thought. I don't think I have downloaded LaTeX
> > explicitly; however, I do have MiKTeX and I added that file to PATH
> > manually.
> >
> > This is what my current PATH calls:
> >
> "c:/rtools42/x86_64-w64-mingw32.static.posix/bin;c:/rtools42/usr/bin;C:\\Program
> > Files\\R\\R-4.2.0\\bin\\x64;C:\\Program Files (x86)\\Common
> >
> Files\\Oracle\\Java\\javapath;C:\\WINDOWS\\system32;C:\\WINDOWS;C:\\WINDOWS\\System32\\Wbem;C:\\WINDOWS\\System32\\WindowsPowerShell\\v1.0\\;C:\\WINDOWS\\System32\\OpenSSH\\;C:\\Program
> > Files\\Wolfram Research\\WolframScript\\;C:\\Program
> >
> Files\\Git\\cmd;C:\\Users\\edmon\\AppData\\Local\\Microsoft\\WindowsApps;C:\\Users\\edmon\\AppData\\Local\\GitHubDesktop\\bin;C:\\Users\\edmon\\AppData\\Local\\Programs\\MiKTeX\\miktex\\bin\\x64\\;C:/Program
> > Files/RStudio/bin/quarto/bin"
> >
> > I have been able to generate a PDF of my package by entering :
> > tools::texi2pdf(file = "avfintools.Rcheck/avfintools-manual.tex")
> > on the console.
> >
> > Is there a way to submit the manual manually rather than have it
> compiled?
> > Or is it a requirement for the manual to be able to be compiled from the
> > package?
> >
> > Thanks for your help,
>
> MiKTeX is fine, a common choice, and works with R.
>
> The R Admin manual has "Installing R under Windows" with more details there
> and in the linked documents, including
> Howto: Building R 4.2 and packages on Windows.
>
> There must be something special set up on your system that probably only
> debugging could reveal. Possibly some customization in any of the R
> startup files. Or some conflicting tool on the PATH (e.g. in Git?).
>
> Tomas
>
> >
> > On Wed, Sep 14, 2022 at 3:14 PM Uwe Ligges <
> lig...@statistik.tu-dortmund.de>
> > wrote:
> >
> >>
> >> On 14.09.2022 23:54, Duncan Murdoch wrote:
> >>> On 12/09/2022 9:09 a.m., Edward Wei wrote:
>  This is the following error message I get from R CMD check:
> 
>  LaTeX errors when creating PDF version.
>  This typically indicates Rd problems.
>  * checking PDF version of manual without index ... ERROR
>  Re-running with no redirection of stdout/stderr.
>  Hmm ... looks like a package
>  Converting parsed Rd's to LaTeX ...
>  Creating pdf output from LaTeX ...
>  Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet =
>  quiet,  :
>  pdflatex is not available
> >>> This looks like a problem in the way tools::texi2dvi detects pdflatex.
> >>> If you have a valid LaTeX file, try running
> >>>
> >>> tools::texi2dvi(filename, pdf = TRUE)
> >>>
> >>> If that gives the same message, then debug(tools::texi2dvi), and
> repeat,
> >>> single stepping through the function to see what test it uses, and
> >>> figure out why it fails on your system.
> >> Or the env var PATH is overwritten once R gets started; perhaps you have
> >> it in one of the files R processes at startup?
> >> Does Sys.getenv("PATH") show the path to the LaTeX binary?
> >>
> >> Best,
> >> Uwe Ligges
> >>
> >>
> >>> Duncan Murdoch
> >>>
> >>>
>  Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet =
>  quiet,  :
>  pdflatex is not available
>  Error in running tools::texi2pdf()
>  You may want to clean up by 'rm -Rf
>  C:/Users/edmon/AppData/Local/Temp/RtmpWkD7Iy/Rd2pdf35d81e3e188c'
>  * DONE
> 
>  Status: 1 ERROR, 1 WARNING, 2 NOTEs
> 
> 
>  Then, I tried debugging by isolating errors from the Rd to PDF
>  conversion.
> 
> 
>  C:\Users\edmon\Documents\R\RFIN> R CMD Rd2pdf avfintools --no-clean
>  Hmm ... looks like a package
>  Converting Rd files to LaTeX ...
>  Creating pdf output from LaTeX ...
>  Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet =
>  quiet,  :
>  pdflatex is not available
>  Error in texi2dvi(file = file, pdf = TRUE, clean = clean, quiet =
>  quiet,  :
>  pdflatex is not available
>  Error in running tools::texi2pdf()
>  You may want to clean up by 'rm -Rf .Rd2pdf27964'
> 
> 
>  In this folder is the file Rd2.tex which I tried to compile into a
>  PDF. But
>  this error returns and the PDF cannot be compiled:
> 
> 
>  LaTeX Error: File `Rd.sty' not found.
> 
>  Type X to quit or <return> to proceed,
>  or enter new name. (Default 

Re: [R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Uwe Ligges




On 26.09.2022 17:14, Jahajeeah, Havisha wrote:

Dear team,

I have submitted the package GreymodelsPackage_1.0.tar.gz to CRAN and it's
a shiny app.

However, I received  the following

* using log directory
'd:/RCompile/CRANincoming/R-devel/GreymodelsPackage.Rcheck'
* using R Under development (unstable) (2022-09-25 r82916 ucrt)
* using platform: x86_64-w64-mingw32 (64-bit)
* using session charset: UTF-8
* checking for file 'GreymodelsPackage/DESCRIPTION' ... OK
* checking extension type ... ERROR
Extensions with Type 'Shiny application' cannot be checked.


You apparently declared it as Type "Shiny application" rather than 
"Package" in the DESCRIPTION file?


Simply omit the Type field or declare it as Package.
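
For example, the start of DESCRIPTION could simply read (version taken from
your tarball name):

Package: GreymodelsPackage
Type: Package
Version: 1.0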

Best,
Uwe Ligges




* DONE
Status: 1 ERROR

I am not sure how to fix the problems and I would appreciate your
help on how to resolve this issue.

Thanks and regards,

Havisha Jahajeeah


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Ivan Krylov
On Mon, 26 Sep 2022 19:14:04 +0400
"Jahajeeah, Havisha"  wrote:

> * checking extension type ... ERROR
> Extensions with Type 'Shiny application' cannot be checked.

Since you're writing a package, you can either specify Type: Package in
the DESCRIPTION file, or omit the field entirely:
https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Package-types

-- 
Best regards,
Ivan

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] Help - Shiny app on CRAN

2022-09-26 Thread Jahajeeah, Havisha
Dear team,

I have submitted the package GreymodelsPackage_1.0.tar.gz to CRAN and it's
a shiny app.

However, I received  the following

* using log directory
'd:/RCompile/CRANincoming/R-devel/GreymodelsPackage.Rcheck'
* using R Under development (unstable) (2022-09-25 r82916 ucrt)
* using platform: x86_64-w64-mingw32 (64-bit)
* using session charset: UTF-8
* checking for file 'GreymodelsPackage/DESCRIPTION' ... OK
* checking extension type ... ERROR
Extensions with Type 'Shiny application' cannot be checked.
* DONE
Status: 1 ERROR

I am not sure how to fix the problems and I would appreciate your
help on how to resolve this issue.

Thanks and regards,

Havisha Jahajeeah


__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[Rd] Question about grid.group compositing operators in cairo

2022-09-26 Thread Panagiotis Skintzos

Hello,

I'm trying to update the ggiraph package to graphics engine v15 (currently we 
support up to v14).


I've implemented the group operators and when I compare the outputs of 
ggiraph::dsvg with the outputs of svg/png, I noticed some weird results.


Specifically, some operators in cairo (in, out, dest.in, dest.atop) give 
strange output, when any source element in the group has a stroke color 
defined.


I attach three example images, where two stroked rectangles are used as 
source (right) and destination (left).


cairo.over.png shows the result of the over operator in cairo

cairo.in.png shows the result of the in operator in cairo

dsvg.in.png shows the result of the in operator in dsvg
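
For reference, the kind of test code involved is roughly this (a minimal
sketch assuming the grid group API from R >= 4.2; the device being compared
is opened separately, e.g. svg()/png() with cairo versus ggiraph::dsvg()):

library(grid)
dst <- rectGrob(x = 0.4, y = 0.5, width = 0.4, height = 0.4,
                gp = gpar(fill = "skyblue", col = "black", lwd = 8))
src <- rectGrob(x = 0.6, y = 0.6, width = 0.4, height = 0.4,
                gp = gpar(fill = "orange", col = "black", lwd = 8))
grid.newpage()
grid.group(src, "in", dst)  # also "out", "dest.in", "dest.atop", "over"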


You can see the difference between cairo.in.png and dsvg.in.png. I found 
out why I get different results:


In the dsvg implementation there is one drawing operation: draw the source 
element as a whole (fill and stroke) over the destination element (using an 
feComposite filter).


In the cairo implementation, though, there are two operations: apply the fill 
on the source and draw it over the destination, then apply the stroke and 
draw it over the result of the previous operation.


I'm not sure whether this is intentional. Shouldn't the source element 
be drawn first as a whole (fill and stroke with the over operator), and then 
the group operator applied to draw it over the destination? That would seem 
more logical.



Thanks,

Panagiotis

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] Better 'undefined columns' error for data.frame

2022-09-26 Thread GILLIBERT, Andre



Duncan Murdoch  wrote:
>
>
> On 25/09/2022 2:48 p.m., Dirk Eddelbuettel wrote:
> >
> > Andre,
> >
> > On 25 September 2022 at 18:09, GILLIBERT, Andre wrote:
> > | Please, find the patch attached, based on the latest R SVN trunk code.
> >
> > Well the mailing list software tends to drop attachments.  There is a reason
> > all these emails suggest to use bugs.r-project.org.
> >
>
> I was named in the posting, so I saw the patch file, my copy didn't come
> via the list.  It's a good patch, I hope Andre does post there.
>
> Duncan Murdoch

Thank you.
I reported an "enhancement" #18409 at 
https://bugs.r-project.org/show_bug.cgi?id=18409
I took into account your suggestions to improve the error messages.

I chose to provide different messages for character and logical/numeric 
indices, but this increases the code size, the number of code paths and the 
number of translations to perform.
If you have suggestions to improve the patch, I am open to comments and ideas.
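
For context, the message being improved is the current generic one, e.g.
(column names made up):

df <- data.frame(a = 1:3)
df[, c("a", "b")]
#> Error in `[.data.frame`(df, , c("a", "b")) : undefined columns selected

which does not say which of the requested columns are missing.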

--
Sincerely
Andre GILLIBERT


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel