Re: [Rd] Is ALTREP "non-API"?

2024-04-24 Thread Hiroaki Yutani
> And in general, I'd urge R Core to make an explicit list of functions that
you consider to be part of the exported API

While I believe R Core is already in the process of such clarification, I'd
also vote for this. Now that WRE has an "experimental" category, if we take
the current definition of "documented in the manual" literally, an
"experimental" entry point cannot be documented at all, because documenting it
would promote it to an "API" by that very definition. It would be odd that you
cannot write precautionary statements about experimental entry points simply
because of how "experimental" is defined. So, I agree R should have the
explicit list.

I'd add that R should also define a process for stabilizing an "experimental"
or "public" entry point into an "API". For example, the Rust language has such
a process [1]. After a feature is introduced as unstable, a "tracking issue"
is filed and the related problems are reported or linked there. Users can see
which problems remain before stabilization and, if they are sufficiently
motivated, contribute to resolving those blockers. Similarly, if we could
track the unresolved problems of each non-API entry point, we might be able to
help the R core team more smoothly.

Best,
Yutani

[1]: https://rustc-dev-guide.rust-lang.org/stabilization_guide.html


On Wed, Apr 24, 2024 at 21:55 Hadley Wickham wrote:

> >
> >
> >
> > >>> That is not true at all - the presence of header does not constitute
> > >> declaration of something as the R API. There are cases where internal
> > >> functions are in the headers for historical or other reasons since the
> > >> headers are used both for the internal implementation and packages.
> > That's
> > >> why this is in R-exts under "The R API: entry points for C code":
> > >>>
> > >>> If I understand your point correctly, does this mean that
> > >> Rf_allocVector() is not part of the "official" R API? It does not
> > appear to
> > >> be documented in the "The R API: entry points for C code" section.
> > >>>
> > >>
> > >> It does, obviously:
> > >>
> https://cran.r-project.org/doc/manuals/R-exts.html#Allocating-storage-1
> > >
> > >
> > > I'm just trying to understand the precise definition of the official
> API
> > > here. So it's any function mentioned in R-exts, regardless of which
> > section
> > > it appears in?
> > >
> > > Does this sentence imply that all functions starting with alloc* are
> part
> > > of the official API?
> > >
> >
> > Again, I can only quote the R-exts (few lines below the previous "The R
> > API" quote):
> >
> >
> > We can classify the entry points as
> > API
> > Entry points which are documented in this manual and declared in an
> > installed header file. These can be used in distributed packages and will
> > only be changed after deprecation.
> >
> >
> > It says "in this manual" - I don't see a restriction to any particular
> > section of the manual anywhere, so I really don't see why you would think
> > that allocation is not part of the API.
> >
>
> Because you mentioned that section explicitly earlier in the thread. This
> obviously seems clear to you, but it's not at all clear to me and I suspect
> many of the wider community. It's frustrating because we are trying
> our best to do what y'all want us to do, but it feels like we keep getting
> the rug pulled out from under us with very little notice, and then have to
> spend a large amount of time figuring out workarounds. That is at least
> feasible for my team since we have multiple talented folks who are paid
> full-time to work on R, but it's a huge struggle for most people who are
> generally maintaining packages in their spare time.
>
> For the purposes of this discussion, could you please clarify what "documented
> in the manual" means? For example, this line mentions allocXxx functions: "There
> are quite a few allocXxx functions defined in Rinternals.h—you may want to
> explore them.". Does that imply that they are documented and free to use?
>
> And in general, I'd urge R Core to make an explicit list of functions that
> you consider to be part of the exported API, and then grandfather in
> packages that used those functions prior to learning that we weren't
> supposed to.
>
> Hadley
>
>
> --
> http://hadley.nz


Re: [Rd] Is ALTREP "non-API"?

2024-04-22 Thread Hiroaki Yutani
I just saw the recent commits about the "experimental" entry points. So, my
original question about the current status of ALTREP is now resolved. I'm
glad that ALTREP is confirmed usable on CRAN (with care). Thank you for all
your help!

I think other "non-API" entry points still need clarification. For example,
here is one I picked out in my previous email.

> For example, src/include/R_ext/Parse.h got a comment "So not API," but
the entry point R_ParseVector is explained in Writing R Extensions [1]. So, I
believe it's clearly an "API" both in the sense of WRE's dialect and in an
ordinary sense. Which should I believe? WRE? The source code?

But, in my understanding, R is now in the process of clearing up such
ambiguities, so I can just wait.

Lastly, I'd like the R core team to consider marking ALTREP as stable, i.e.,
as "API". I haven't actively followed the development of ALTREP, but the
ALTREP entry points have been there for half a decade without any major
breaking changes. So, in my opinion, it's safe to declare them stable.
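
For concreteness, here is a minimal sketch of the kind of entry points I mean,
following the registration pattern declared in R_ext/Altrep.h. The class name,
package name, and the C_make_oneton helper below are made up for illustration,
and a real class would also define Dataptr and the other methods:

    #include <R.h>
    #include <Rinternals.h>
    #include <R_ext/Altrep.h>
    #include <R_ext/Rdynload.h>

    static R_altrep_class_t oneton_class;

    /* Length method: data1 stores the (scalar integer) length n. */
    static R_xlen_t oneton_Length(SEXP x) {
        return (R_xlen_t) INTEGER(R_altrep_data1(x))[0];
    }

    /* Elt method: the i-th element of the lazy sequence 1, 2, ..., n. */
    static int oneton_Elt(SEXP x, R_xlen_t i) {
        return (int) (i + 1);
    }

    /* Hypothetical .Call() entry point that creates an instance. */
    SEXP C_make_oneton(SEXP n) {
        SEXP len = PROTECT(Rf_ScalarInteger(Rf_asInteger(n)));
        SEXP ans = R_new_altrep(oneton_class, len, R_NilValue);
        UNPROTECT(1);
        return ans;
    }

    void R_init_mypkg(DllInfo *dll) {
        oneton_class = R_make_altinteger_class("oneton", "mypkg", dll);
        R_set_altrep_Length_method(oneton_class, oneton_Length);
        R_set_altinteger_Elt_method(oneton_class, oneton_Elt);
    }

Such a class could then be instantiated from R with something like
.Call("C_make_oneton", 10L). This is only a sketch, not a recommendation.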

Best,
Yutani


On Tue, Apr 23, 2024 at 7:14 Simon Urbanek wrote:

>
>
> > On Apr 22, 2024, at 7:37 PM, Gabriel Becker 
> wrote:
> >
> > Hi Yutani,
> >
> > ALTREP is part of the official R api, as illustrated by the presence of
> > src/include/R_ext/Altrep.h. Everything declared in the header files in
> that
> > directory is official API AFAIK (and I believe that is more definitive
> than
> > the manuals).
> >
>
> That is not true at all - the presence of header does not constitute
> declaration of something as the R API. There are cases where internal
> functions are in the headers for historical or other reasons since the
> headers are used both for the internal implementation and packages. That's
> why this is in R-exts under "The R API: entry points for C code":
>
> > There are a large number of entry points in the R executable/DLL that
> can be called from C code (and some that can be called from Fortran code).
> Only those documented here are stable enough that they will only be changed
> with considerable notice.
>
> And that's why CRAN does not allow unstable ones = those not documented in
> R-exts as part of the API.
>
> Therefore Hiroaki's question is a very good one. ALTREP is declared as
> experimental and is not part of the API, but the development and stability
> of the API in some sense should get better as more packages are using it.
> Therefore it is currently allowed on CRAN in the hope that it will
> transition to stable at some point, but package authors using it must be
> willing to adapt to changes to the API as necessary.
>
> Cheers,
> Simon
>
>
>
> > The documentation of ALTREP has lagged behind its implementation
> > unfortunately, which may be partially my fault for not submitting doc
> > patches for it against the manuals. Sorry for my contribution to that,
> I'll
> > see if I can loop back around to contributing documentation for ALTREP.
> >
> > Best,
> > ~G
> >
> > On Sun, Apr 21, 2024 at 6:36 PM Hiroaki Yutani 
> wrote:
> >
> >> Thanks, Hernando,
> >>
> >> Sorry, "API" is a bit confusing term in this context, but what I want to
> >> discuss is the "API" that Writing R Extension defines as quoted in my
> >> previous email. It's probably different from an ordinary sense when we
> >> casually say "R C API".
> >>
> >> You might wonder why I care about such a difference. This is because
> >> calling a "non-API" is considered a violation of CRAN repository policy,
> >> which means CRAN will kick out the R package. I know many CRAN packages
> use
> >> ALTREP, but just being accepted by CRAN at the moment doesn't mean CRAN
> >> will keep accepting it. So, I want to clarify the current status of
> ALTREP.
> >>
> >> Best,
> >> Yutani
> >>
> >> On Mon, Apr 22, 2024 at 10:17 :
> >>
> >>> Hello, I don't believe it is illegal, as ALTREP "implements an
> >> abstraction
> >>> underneath the C API". And is "compatible with all code which uses the
> >>> API".
> >>>
> >>> Please see slide deck by Gabriel Becker,  with L Tierney, M Lawrence
> and
> >> T
> >>> Kalibera.
> >>>
> >>>
> >>>
> >>
> https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP
> >>> .pdf
> >>> <
> >>
> https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP.pdf
> >>>
> >>>
> >>> ALTREP framewo

Re: [Rd] Is ALTREP "non-API"?

2024-04-22 Thread Hiroaki Yutani
Thanks for your convincing comment, but it seems the R core team has a
different opinion...
A few hours ago, src/include/R_ext/Altrep.h got this comment:

/*
   Not part of the API, subject to change at any time.
*/

commit:
https://github.com/r-devel/r-svn/commit/2059bffde642f8426d1f39ab5dd995d19a575d4d

While I'm glad to see this attempt to make things clear, I'm confused. That
commit marks many other files as "not API," which I think is a bit
inconsistent with what Writing R Extensions says.

For example, src/include/R_ext/Parse.h got a comment "So not API," but the
entry point R_ParseVector is explained in Writing R Extensions [1]. So, I
believe it's clearly an "API" both in the sense of WRE's dialect and in an
ordinary sense. Which should I believe? WRE? The source code?
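
For reference, here is a minimal sketch of how R_ParseVector is typically
called from package C code, along the lines of the example in WRE (the
C_parse_eval wrapper name is made up, and error handling is reduced to the
bare minimum):

    #include <R.h>
    #include <Rinternals.h>
    #include <R_ext/Parse.h>

    /* Parse a character vector of R code and evaluate it in the global env. */
    SEXP C_parse_eval(SEXP text) {
        ParseStatus status;
        SEXP exprs = PROTECT(R_ParseVector(text, -1, &status, R_NilValue));
        if (status != PARSE_OK) {
            UNPROTECT(1);
            Rf_error("invalid R code");
        }
        SEXP ans = R_NilValue;
        for (R_xlen_t i = 0; i < XLENGTH(exprs); i++)
            ans = Rf_eval(VECTOR_ELT(exprs, i), R_GlobalEnv);
        UNPROTECT(1);
        return ans;
    }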

It might be just a coincidence, but I'm sorry if my question pushed the R
core team into such a hasty clarification. I just wanted to discuss how to
fix the current inconsistencies.

I think the R core team needs a proper definition of "API" first. In my
opinion, it makes little sense to call something "non-API" just to signal the
possibility of future breaking changes. Whether you call it API or non-API,
sensible users will still accept breaking changes if they are reasonable. For
example, how about "experimental API" or "unstable API"? Those sound better
to me.

Best,
Yutani

[1]:
https://cran.r-project.org/doc/manuals/r-release/R-exts.html#Parsing-R-code-from-C


On Mon, Apr 22, 2024 at 16:37 Gabriel Becker wrote:

> Hi Yutani,
>
> ALTREP is part of the official R api, as illustrated by the presence of
> src/include/R_ext/Altrep.h. Everything declared in the header files in that
> directory is official API AFAIK (and I believe that is more definitive than
> the manuals).
>
> The documentation of ALTREP has lagged behind its implementation
> unfortunately, which may be partially my fault for not submitting doc
> patches for it against the manuals. Sorry for my contribution to that, I'll
> see if I can loop back around to contributing documentation for ALTREP.
>
> Best,
> ~G
>
> On Sun, Apr 21, 2024 at 6:36 PM Hiroaki Yutani 
> wrote:
>
>> Thanks, Hernando,
>>
>> Sorry, "API" is a bit confusing term in this context, but what I want to
>> discuss is the "API" that Writing R Extension defines as quoted in my
>> previous email. It's probably different from an ordinary sense when we
>> casually say "R C API".
>>
>> You might wonder why I care about such a difference. This is because
>> calling a "non-API" is considered a violation of CRAN repository policy,
>> which means CRAN will kick out the R package. I know many CRAN packages
>> use
>> ALTREP, but just being accepted by CRAN at the moment doesn't mean CRAN
>> will keep accepting it. So, I want to clarify the current status of
>> ALTREP.
>>
>> Best,
>> Yutani
>>
>> On Mon, Apr 22, 2024 at 10:17 :
>>
>> > Hello, I don't believe it is illegal, as ALTREP "implements an
>> abstraction
>> > underneath the C API". And is "compatible with all code which uses the
>> > API".
>> >
>> > Please see slide deck by Gabriel Becker,  with L Tierney, M Lawrence
>> and T
>> > Kalibera.
>> >
>> >
>> >
>> https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP
>> > .pdf
>> > <
>> https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP.pdf
>> >
>> >
>> > ALTREP framework implements an abstraction underneath traditional R C
>> API
>> > - Generalizes whats underneath the API
>> > - Without changing how data are accessed
>> > - Compatible with all C code which uses the API
>> > - Compatible with R internals
>> >
>> >
>> > I hope this helps,
>> > Hernando
>> >
>> >
>> > -Original Message-
>> > From: R-devel  On Behalf Of Hiroaki
>> Yutani
>> > Sent: Sunday, April 21, 2024 8:48 PM
>> > To: r-devel 
>> > Subject: [Rd] Is ALTREP "non-API"?
>> >
>> > Writing R Extension[1] defines "API" as:
>> >
>> > Entry points which are documented in this manual and declared in an
>> > installed header file. These can be used in distributed packages and
>> will
>> > only be changed after deprecation.
>> >
>> > But, the document (WRE) doesn't have even a single mention of ALTREP,
>> the
>> > term "ALTREP" itself or any entry points related to ALTREP. Does this
>> mean,
>> &

Re: [Rd] Is ALTREP "non-API"?

2024-04-21 Thread Hiroaki Yutani
Thanks, Hernando,

Sorry, "API" is a bit confusing term in this context, but what I want to
discuss is the "API" that Writing R Extension defines as quoted in my
previous email. It's probably different from an ordinary sense when we
casually say "R C API".

You might wonder why I care about such a difference. This is because
calling a "non-API" is considered a violation of CRAN repository policy,
which means CRAN will kick out the R package. I know many CRAN packages use
ALTREP, but just being accepted by CRAN at the moment doesn't mean CRAN
will keep accepting it. So, I want to clarify the current status of ALTREP.

Best,
Yutani

On Mon, Apr 22, 2024 at 10:17 :

> Hello, I don't believe it is illegal, as ALTREP "implements an abstraction
> underneath the C API". And is "compatible with all code which uses the
> API".
>
> Please see slide deck by Gabriel Becker,  with L Tierney, M Lawrence and T
> Kalibera.
>
>
> https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP
> .pdf
> <https://bioconductor.org/help/course-materials/2020/BiocDevelForum/16-ALTREP.pdf>
>
> ALTREP framework implements an abstraction underneath traditional R C API
> - Generalizes whats underneath the API
> - Without changing how data are accessed
> - Compatible with all C code which uses the API
> - Compatible with R internals
>
>
> I hope this helps,
> Hernando
>
>
> -Original Message-
> From: R-devel  On Behalf Of Hiroaki Yutani
> Sent: Sunday, April 21, 2024 8:48 PM
> To: r-devel 
> Subject: [Rd] Is ALTREP "non-API"?
>
> Writing R Extension[1] defines "API" as:
>
> Entry points which are documented in this manual and declared in an
> installed header file. These can be used in distributed packages and will
> only be changed after deprecation.
>
> But, the document (WRE) doesn't have even a single mention of ALTREP, the
> term "ALTREP" itself or any entry points related to ALTREP. Does this mean,
> despite the widespread use of it on R packages including CRAN ones, ALTREP
> is not the API and accordingly using it in distributed packages is
> considered illegal?
>
> Best,
> Yutani
>
> [1]:
> https://cran.r-project.org/doc/manuals/r-release/R-exts.html#The-R-API
>


[Rd] Is ALTREP "non-API"?

2024-04-21 Thread Hiroaki Yutani
Writing R Extensions [1] defines "API" as:

Entry points which are documented in this manual and declared in an
installed header file. These can be used in distributed packages and will
only be changed after deprecation.

But the document (WRE) doesn't contain a single mention of ALTREP, neither
the term "ALTREP" itself nor any entry points related to it. Does this mean
that, despite its widespread use in R packages, including CRAN ones, ALTREP
is not part of the API, and accordingly using it in distributed packages is
considered illegal?

Best,
Yutani

[1]: https://cran.r-project.org/doc/manuals/r-release/R-exts.html#The-R-API



Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-26 Thread Hiroaki Yutani
Simon,

Ok, let's take a look at a real example. The first item of inst/AUTHORS of
prqlr (GitHub version) is this:

addr2line (version 0.20.0):
  addr2line authors

You can find addr2line's owners on crates.io [1], while its manifest file
(Cargo.toml) [2] doesn't contain the names of its owners or authors. In a Rust
manifest, the "authors" field is optional [3], unlike in R. You might argue
that "owners" is not the same as "authors," but at least crates.io provides
the names of those who are responsible for the crate.

Let's go back to your question.

> So are you saying you have to use crates.io and do some extra step during
the (misnamed) "vendor" step?

"cargo vendor" doesn't take care of generating the list of authors, so it's
not "during the vender step." It has to be done separately anyway. I was
just saying you **can** use crates.io in that step instead of searching for
the authors manually one by one (or filling it with "foo authors" when the
manifest file doesn't contain any names).

That said, I agree with you in general that the Rust community is
relatively loose about authorship and licensing when compared with R. I
don't think it's necessarily a problem, but the impedance mismatch is a
headache. I was just trying to point out this part of your opinion

> the Rust community as there doesn't seem to be any accountability with
respect to ownership and attribution.

was not quite true. I hope the R community and the Rust community have
respect for each other.

Best,
Yutani

[1]: https://crates.io/crates/addr2line
[2]: https://github.com/gimli-rs/addr2line/blob/0.20.0/Cargo.toml
[3]:
https://doc.rust-lang.org/cargo/reference/manifest.html#the-authors-field


On Sun, Aug 27, 2023 at 12:07 Simon Urbanek wrote:

> Yutani,
>
>
> On Aug 27, 2023, at 2:19 PM, Hiroaki Yutani  wrote:
>
> Simon,
>
> > it's assumed that GitHub history is the canonical source with the
> provenance, but that gets lost when pulled into the package.
>
> No, not GitHub. You can usually find the ownership on crates.io. So, if
> you want a target to blame, it's probably just a problem of the script to
> auto-generate inst/AUTHORS in this specific case. But, clearly, Rust's
> ecosystem works soundly under the existence of crates.io, so I think this
> is the same kind of pain which you would feel if you use R without CRAN.
>
>
> Can you elaborate? I have not found anything that would have a list of
> authors in the sources. I fully agree that I know nothing about it, but
> even if you use R without CRAN, each package contains that information in
> the DESCRIPTION file since it's so crucial. So are you saying you have to
> use crates.io and do some extra step during the (misnamed) "vendor" step?
> (I didn't see the submitted tar ball of plqrl and its release on GitHub is
> not the actual package so can't check - thus just trying reverse-engineer
> what happens by looking at the dependencies which leads to GitHub).
>
>
> Sorry for nitpicking.
>
>
> Sure, good to get the fact straight.
>
> Cheers,
> Simon
>
>
>
> Best,
> Yutani
>
> On Sun, Aug 27, 2023 at 6:57 Simon Urbanek wrote:
>
>> Tatsuya,
>>
>> What you do is contact CRAN. I don't think anyone here can answer your
>> question, only CRAN can, so ask there.
>>
>> Generally, packages with sufficiently many Rust dependencies have to be
>> handled manually as they break the size limit, so auto-rejections are
>> normal. Archival is unusual, but it may have fallen through the cracks -
>> but the way to find out is to ask.
>>
>> One related issue with respect to CRAN policies that I don't see a good
>> solution for is that inst/AUTHORS is patently unhelpful, because most of
>> them say "foo (version ..): foo authors" with no contact, or real names or
>> any links. That seems to be a problem stemming from the Rust community as
>> there doesn't seem to be any accountability with respect to ownership and
>> attribution. I don't know if it's because it's assumed that GitHub history
>> is the canonical source with the provenance, but that gets lost when pulled
>> into the package.
>>
>> Cheers,
>> Simon
>>
>> PS: Your README says "(Rust 1.65 or later)", but the version condition is
>> missing from SystemRequirements.
>>
>>
>> > On Aug 26, 2023, at 2:46 PM, SHIMA Tatsuya  wrote:
>> >
>> > Hi,
>> >
>> > I noticed that my submitted package `prqlr` 0.5.0 was archived from
>> CRAN on 2023-08-19.
>> > <https://CRAN.R-project.org/package=prqlr
>> <https://cran.r-project.org/package=prqlr>>
>> >
>> > I submitted prqlr 0.5.0 on 2023-

Re: [R-pkg-devel] What to do when a package is archived from CRAN

2023-08-26 Thread Hiroaki Yutani
Simon,

> it's assumed that GitHub history is the canonical source with the
provenance, but that gets lost when pulled into the package.

No, not GitHub. You can usually find the ownership on crates.io. So, if you
want a target to blame, in this specific case it's probably just a problem
with the script that auto-generates inst/AUTHORS. But clearly, Rust's
ecosystem works soundly because crates.io exists, so I think this is the same
kind of pain you would feel if you used R without CRAN.

Sorry for nitpicking.

Best,
Yutani

On Sun, Aug 27, 2023 at 6:57 Simon Urbanek wrote:

> Tatsuya,
>
> What you do is contact CRAN. I don't think anyone here can answer your
> question, only CRAN can, so ask there.
>
> Generally, packages with sufficiently many Rust dependencies have to be
> handled manually as they break the size limit, so auto-rejections are
> normal. Archival is unusual, but it may have fallen through the cracks -
> but the way to find out is to ask.
>
> One related issue with respect to CRAN policies that I don't see a good
> solution for is that inst/AUTHORS is patently unhelpful, because most of
> them say "foo (version ..): foo authors" with no contact, or real names or
> any links. That seems to be a problem stemming from the Rust community as
> there doesn't seem to be any accountability with respect to ownership and
> attribution. I don't know if it's because it's assumed that GitHub history
> is the canonical source with the provenance, but that gets lost when pulled
> into the package.
>
> Cheers,
> Simon
>
> PS: Your README says "(Rust 1.65 or later)", but the version condition is
> missing from SystemRequirements.
>
>
> > On Aug 26, 2023, at 2:46 PM, SHIMA Tatsuya  wrote:
> >
> > Hi,
> >
> > I noticed that my submitted package `prqlr` 0.5.0 was archived from CRAN
> on 2023-08-19.
> > 
> >
> > I submitted prqlr 0.5.0 on 2023-08-13. I believe I have since only
> received word from CRAN that it passed the automated release process. <
> https://github.com/eitsupi/prqlr/pull/161>
> > So I was very surprised to find out after I returned from my trip that
> this was archived.
> >
> > The CRAN page says "Archived on 2023-08-19 for policy violation. " but I
> don't know what exactly was the problem.
> > I have no idea what more to fix as I believe I have solved all the
> problems when I submitted 0.5.0.
> >
> > Is there any way to know what exactly was the problem?
> > (I thought I sent an e-mail to CRAN 5 days ago but have not yet received
> an answer, so I decided to ask my question on this mailing list, thinking
> that there is a possibility that there will be no answer to my e-mail,
> although I may have to wait a few weeks for an answer. My apologies if this
> idea is incorrect.)
> >
> > Best,
> > Tatsuya
> >


Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-14 Thread Hiroaki Yutani
I just found that the policy has been updated, and I now understand why
GitHub matters in your opinion. Thanks for the clarification; I had forgotten
this fact.

>  CRAN does not regard github.com (which hosts the index of crates.io) as
sufficiently reliable.

The good news is that, as of Rust 1.68, Cargo supports the "sparse" index
protocol [1][2]. In this case, the index is hosted at
https://index.crates.io/, crates.io's own infrastructure. So, if I understand
correctly, once all the CRAN servers have Cargo >= 1.68 installed, will CRAN
then consider crates.io reliable?
Note that, at the time of writing this, the version on Debian testing is
still 1.66 [3] and it's not updated very frequently (about once a year?),
so it probably takes a while before the day.

Best,
Yutani


[1]:
https://blog.rust-lang.org/2023/03/09/Rust-1.68.0.html#cargos-sparse-protocol
[2]:
https://blog.rust-lang.org/inside-rust/2023/01/30/cargo-sparse-protocol.html
[3]: https://packages.debian.org/testing/cargo (it seems 0.66 means 1.66)

On Fri, Jul 14, 2023 at 9:58 Hiroaki Yutani wrote:

> Simon,
>
> Sorry that my question was not clear. Let me clarify.
>
> I think we all agree that "cargo vendor" is the primary option. Since
> downloading without explicit permission is not allowed on CRAN in general,
> it's reasonable. I'm happy that the instructions will describe it clearly.
>
> But, some R packages have too large dependencies to bundle. In this case,
> downloading can be allowed with "the explicit permission of the CRAN team,"
> if I understand correctly. For this, I think Cargo's downloading mechanism
> satisfy this requirement if (1) all the dependencies are from crates.io
> and (2) Cargo.lock exists:
>
> > download a specific version from a secure site and check that the
> download is the expected code by some sort of checksum
>
> Because Cargo downloads specific versions recorded in Cargo.lock, verifies
> the checksums, and crates.io is the "secure site" that we can rely on as
> Hadley wrote.
>
> My question is, does CRAN allow Cargo to download the dependency sources
> on CRAN? The policy says:
>
> > So downloading of Rust ‘crates’ will in future require the explicit
> permission of the CRAN team
>
> To my eyes, this implies
>
> - CRAN currently allows Cargo's downloading of dependency Rust crates even
> without the permission
> - CRAN will keep allowing Cargo's downloading if the package author asks
> the permission
>
> And, if CRAN doesn't allow it, I (and probably many Rust users) would like
> to know why. As I described above, it should satisfy the requirement.
>
> >  please don't cross-post
>
> Sorry.
>
> > I thought cargo build --offline is not needed if the dependencies are
> already vendored?
>
> Yes, you are right. --offline is not needed if vendering is properly
> configured. But, this probably means you have to review the build
> configurations in .cargo/config.toml or so, so I just thought it would be
> easier for you to check if --offline is specified to the command. This
> seems a bit off-topic, so please ignore.
>
> Best,
> Yutani
>
>
> On Fri, Jul 14, 2023 at 9:06 Simon Urbanek wrote:
>
>>
>>
>> > On Jul 14, 2023, at 11:19 AM, Hadley Wickham 
>> wrote:
>> >
>> >>> If CRAN cannot trust even the official one of Rust, why does CRAN
>> have Rust at all?
>> >>>
>> >>
>> >> I don't see the connection - if you downloaded something in the past
>> it doesn't mean you will be able to do so in the future. And CRAN has Rust
>> because it sounded like a good idea to allow packages to use it, but I can
>> see that it opened a can of worms that we trying to tame here.
>> >
>> > Can you give a bit more detail about your concerns here? Obviously
>> > crates.io isn't some random site on the internet, it's the official
>> > repository of the Rust language, supported by the corresponding
>> > foundation for the language. To me that makes it feel very much like
>> > CRAN, where we can assume if you downloaded something in the past, you
>> > can download something in the future.
>> >
>>
>> I was just responding to Yutani's question why we downloaded the Rust
>> compilers on CRAN at all. This has really nothing to do with the previous
>> discussion which is why I did say "I don't see the connection". Also I
>> wasn't talking about crates.io anywhere in my responses in this thread.
>> The only thing I wanted to discuss here was that I think the existing Rust
>> model  ("vendor" into the package sources) seems like a good one to apply
>> to Go, but that got somehow hijacked...
>>
>> Cheers,
>> Simon
>>
>>



Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-13 Thread Hiroaki Yutani
Simon,

Sorry that my question was not clear. Let me clarify.

I think we all agree that "cargo vendor" is the primary option. Since
downloading without explicit permission is not allowed on CRAN in general,
it's reasonable. I'm happy that the instructions will describe it clearly.

But some R packages have dependencies too large to bundle. In this case,
downloading can be allowed with "the explicit permission of the CRAN team,"
if I understand correctly. For this, I think Cargo's downloading mechanism
satisfies the requirement if (1) all the dependencies are from crates.io and
(2) Cargo.lock exists:

> download a specific version from a secure site and check that the
download is the expected code by some sort of checksum

Because Cargo downloads the specific versions recorded in Cargo.lock and
verifies their checksums, and crates.io is the "secure site" that we can rely
on, as Hadley wrote.

My question is, does CRAN allow Cargo to download the dependency sources on
CRAN? The policy says:

> So downloading of Rust ‘crates’ will in future require the explicit
permission of the CRAN team

To my eyes, this implies

- CRAN currently allows Cargo to download dependency Rust crates even
without permission
- CRAN will keep allowing Cargo's downloads if the package author asks for
permission

And, if CRAN doesn't allow it, I (and probably many Rust users) would like
to know why. As I described above, it should satisfy the requirement.

>  please don't cross-post

Sorry.

> I thought cargo build --offline is not needed if the dependencies are
already vendored?

Yes, you are right. --offline is not needed if vendoring is properly
configured. But this probably means you have to review the build
configuration in .cargo/config.toml or similar, so I just thought it would be
easier for you to check whether --offline is specified on the command line.
This seems a bit off-topic, so please ignore it.

Best,
Yutani


On Fri, Jul 14, 2023 at 9:06 Simon Urbanek wrote:

>
>
> > On Jul 14, 2023, at 11:19 AM, Hadley Wickham 
> wrote:
> >
> >>> If CRAN cannot trust even the official one of Rust, why does CRAN have
> Rust at all?
> >>>
> >>
> >> I don't see the connection - if you downloaded something in the past it
> doesn't mean you will be able to do so in the future. And CRAN has Rust
> because it sounded like a good idea to allow packages to use it, but I can
> see that it opened a can of worms that we trying to tame here.
> >
> > Can you give a bit more detail about your concerns here? Obviously
> > crates.io isn't some random site on the internet, it's the official
> > repository of the Rust language, supported by the corresponding
> > foundation for the language. To me that makes it feel very much like
> > CRAN, where we can assume if you downloaded something in the past, you
> > can download something in the future.
> >
>
> I was just responding to Yutani's question why we downloaded the Rust
> compilers on CRAN at all. This has really nothing to do with the previous
> discussion which is why I did say "I don't see the connection". Also I
> wasn't talking about crates.io anywhere in my responses in this thread.
> The only thing I wanted to discuss here was that I think the existing Rust
> model  ("vendor" into the package sources) seems like a good one to apply
> to Go, but that got somehow hijacked...
>
> Cheers,
> Simon
>
>



Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-13 Thread Hiroaki Yutani
Thank you for the correction. I see.

Best,
Yutani

On Thu, Jul 13, 2023 at 16:08 Tomas Kalibera wrote:

>
> On 7/13/23 05:08, Hiroaki Yutani wrote:
> > I actually use cargo vendor.
> >
> >
> https://github.com/yutannihilation/string2path/blob/main/src/rust/vendor.sh
> >
> > One thing to note is that, prior to R 4.3.0, the vendored directories hit
> > the Windows' path limit so I had to put them into a TAR file. I haven't
> > tested on R 4.3.0, but probably this problem is solved by this
> improvement.
> > So, if you target only R >= 4.3, you can just cargo vendor.
> >
> >
> https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html
>
> I wouldn't rely on that long paths on Windows are supported even in R >=
> 4.3, because it requires at least Windows 10 1607, and it needs to be
> enabled system-wide in Windows - so, users/admins have to do that, and
> it impacts also other applications. The blog post has more details and
> recommendations.
>
> Best
> Tomas
>
> >
> > Best,
> > Yutani
> >
> > On Thu, Jul 13, 2023 at 11:50 Kevin Ushey wrote:
> >
> >> Package authors could use 'cargo vendor' to include Rust crate sources
> >> directly in their source R packages. Would that be acceptable?
> >>
> >> Presumedly, the vendored sources would be built using the versions
> >> specified in an accompanying Cargo.lock as well.
> >>
> >> https://doc.rust-lang.org/cargo/commands/cargo-vendor.html
> >>
> >>
> >> On Wed, Jul 12, 2023, 7:35 PM Simon Urbanek <
> simon.urba...@r-project.org>
> >> wrote:
> >>
> >>> Yutani,
> >>>
> >>> I'm not quite sure your reading fully matches the intent of the policy.
> >>> Cargo.lock is not sufficient, it is expected that the package will
> provide
> >>> *all* the sources, it is not expected to use cargo to resolve them from
> >>> random (possibly inaccessible) places. So the package author is
> expected to
> >>> either include the sources in the package *or* (if prohibitive due to
> >>> extreme size) have a release tar ball available at a fixed, secure,
> >>> reliable location (I was recommending Zenodo.org for that reason -
> GitHub
> >>> is neither fixed nor reliable by definition).
> >>>
> >>> Based on that, I'm not sure I fully understand the scope of your
> proposal
> >>> for improvement. Cargo.lock is certainly the first step that the
> package
> >>> author should take in creating the distribution tar ball so you can
> fix the
> >>> versions, but it is not sufficient as the next step involves
> collecting the
> >>> related sources. We don't want R users to be involved in that can of
> worms
> >>> (especially since the lock file itself provides no guarantees of
> >>> accessibility of the components and we don't want to have to manually
> >>> inspect it), the package should be ready to be used which is why it
> has to
> >>> do that step first. Does that explain the intent better? (In general,
> the
> >>> downloading at install time is actually a problem, because it's not
> >>> uncommon to use R in environments that have no Internet access, but the
> >>> download is a concession for extreme cases where the tar balls may be
> too
> >>> big to make it part of the package, but it's yet another can of
> worms...).
> >>>
> >>> Cheers,
> >>> Simon
> >>>
> >>>
> >>>
> >>>> On 13/07/2023, at 12:37 PM, Hiroaki Yutani 
> >>> wrote:
> >>>> Hi,
> >>>>
> >>>> I'm glad to see CRAN now has its official policy about Rust [1]!
> >>>> It seems it probably needs some feedback from those who are familiar
> >>> with
> >>>> the Rust workflow. I'm not an expert, but let me leave some quick
> >>> feedback.
> >>>> This email is sent to the R-package-devel mailing list as well as to
> >>> cran@~
> >>>> so that we can publicly discuss.
> >>>>
> >>>> It seems most of the concern is about how to make the build
> >>> deterministic.
> >>>> In this regard, the policy should encourage including "Cargo.lock"
> file
> >>>> [2]. Cargo.lock is created on the first compile, and the resolved
> >>> versions
> >>>> of dependencies are recorded. As long as this file exists, the
> >>> dependenc

Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-12 Thread Hiroaki Yutani
Hi Simon,

Thanks for the response. I thought

> download a specific version from a secure site and check that the
download is the expected code by some sort of checksum

referred to the usual process that Cargo performs automatically. If it
doesn't, I think the policy should explain that clearly. It seems I'm not the
only one who wondered why this policy doesn't mention Cargo.lock at all.

> it is not expected to use cargo to resolve them from random (possibly
inaccessible) places

Yes, I agree with you. So, I suggested the possibility of forbidding Git
dependencies. Or do you call crates.io, Rust's official repository, a "random
place"? If CRAN cannot trust even Rust's official repository, why does CRAN
support Rust at all?

That said, I agree with your concern about downloading via the Internet in
general. Downloading is one of the common sources of failure. If you want to
prevent Cargo from downloading any source files, you can require adding the
--offline option to "cargo build". While package authors might feel unhappy
about it, I think this would make your intent a bit clearer.

Best,
Yutani


On Thu, Jul 13, 2023 at 11:34 Simon Urbanek wrote:

> Yutani,
>
> I'm not quite sure your reading fully matches the intent of the policy.
> Cargo.lock is not sufficient, it is expected that the package will provide
> *all* the sources, it is not expected to use cargo to resolve them from
> random (possibly inaccessible) places. So the package author is expected to
> either include the sources in the package *or* (if prohibitive due to
> extreme size) have a release tar ball available at a fixed, secure,
> reliable location (I was recommending Zenodo.org for that reason - GitHub
> is neither fixed nor reliable by definition).
>
> Based on that, I'm not sure I fully understand the scope of your proposal
> for improvement. Cargo.lock is certainly the first step that the package
> author should take in creating the distribution tar ball so you can fix the
> versions, but it is not sufficient as the next step involves collecting the
> related sources. We don't want R users to be involved in that can of worms
> (especially since the lock file itself provides no guarantees of
> accessibility of the components and we don't want to have to manually
> inspect it), the package should be ready to be used which is why it has to
> do that step first. Does that explain the intent better? (In general, the
> downloading at install time is actually a problem, because it's not
> uncommon to use R in environments that have no Internet access, but the
> download is a concession for extreme cases where the tar balls may be too
> big to make it part of the package, but it's yet another can of worms...).
>
> Cheers,
> Simon
>
>
>
> > On 13/07/2023, at 12:37 PM, Hiroaki Yutani  wrote:
> >
> > Hi,
> >
> > I'm glad to see CRAN now has its official policy about Rust [1]!
> > It seems it probably needs some feedback from those who are familiar with
> > the Rust workflow. I'm not an expert, but let me leave some quick
> feedback.
> > This email is sent to the R-package-devel mailing list as well as to
> cran@~
> > so that we can publicly discuss.
> >
> > It seems most of the concern is about how to make the build
> deterministic.
> > In this regard, the policy should encourage including "Cargo.lock" file
> > [2]. Cargo.lock is created on the first compile, and the resolved
> versions
> > of dependencies are recorded. As long as this file exists, the dependency
> > versions are locked to the ones in this file, except when the package
> > author explicitly updates the versions.
> >
> > Cargo.lock also records the SHA256 checksums of the crates if they are
> from
> > crates.io, Rust's official crate registry. If the checksums don't match,
> > the build will fail with the following message:
> >
> >error: checksum for `foo v0.1.2` changed between lock files
> >
> >this could be indicative of a few possible errors:
> >
> >* the lock file is corrupt
> >* a replacement source in use (e.g., a mirror) returned a
> different
> > checksum
> >* the source itself may be corrupt in one way or another
> >
> >unable to verify that `foo v0.1.2` is the same as when the lockfile
> was
> > generated
> >
> > For dependencies from Git repositories, Cargo.lock records the commit
> > hashes. So, the version of the source code (not the version of the crate)
> > is uniquely determined. That said, unlike cargo.io, it's possible that
> the
> > commit or the Git repository itself has disappeared at the time of
> > building, which makes the build fail. So, it might be reason

Re: [R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-12 Thread Hiroaki Yutani
I actually use cargo vendor.

https://github.com/yutannihilation/string2path/blob/main/src/rust/vendor.sh

One thing to note is that, prior to R 4.3.0, the vendored directories hit
Windows' path-length limit, so I had to put them into a TAR file. I haven't
tested on R 4.3.0, but this problem is probably solved by this improvement.
So, if you target only R >= 4.3, you can just use cargo vendor.

https://blog.r-project.org/2023/03/07/path-length-limit-on-windows/index.html

Best,
Yutani

On Thu, Jul 13, 2023 at 11:50 Kevin Ushey wrote:

> Package authors could use 'cargo vendor' to include Rust crate sources
> directly in their source R packages. Would that be acceptable?
>
> Presumedly, the vendored sources would be built using the versions
> specified in an accompanying Cargo.lock as well.
>
> https://doc.rust-lang.org/cargo/commands/cargo-vendor.html
>
>
> On Wed, Jul 12, 2023, 7:35 PM Simon Urbanek 
> wrote:
>
>> Yutani,
>>
>> I'm not quite sure your reading fully matches the intent of the policy.
>> Cargo.lock is not sufficient, it is expected that the package will provide
>> *all* the sources, it is not expected to use cargo to resolve them from
>> random (possibly inaccessible) places. So the package author is expected to
>> either include the sources in the package *or* (if prohibitive due to
>> extreme size) have a release tar ball available at a fixed, secure,
>> reliable location (I was recommending Zenodo.org for that reason - GitHub
>> is neither fixed nor reliable by definition).
>>
>> Based on that, I'm not sure I fully understand the scope of your proposal
>> for improvement. Carlo.lock is certainly the first step that the package
>> author should take in creating the distribution tar ball so you can fix the
>> versions, but it is not sufficient as the next step involves collecting the
>> related sources. We don't want R users to be involved in that can of worms
>> (especially since the lock file itself provides no guarantees of
>> accessibility of the components and we don't want to have to manually
>> inspect it), the package should be ready to be used which is why it has to
>> do that step first. Does that explain the intent better? (In general, the
>> downloading at install time is actually a problem, because it's not
>> uncommon to use R in environments that have no Internet access, but the
>> download is a concession for extreme cases where the tar balls may be too
>> big to make it part of the package, but it's yet another can of worms...).
>>
>> Cheers,
>> Simon
>>
>>
>>
>> > On 13/07/2023, at 12:37 PM, Hiroaki Yutani 
>> wrote:
>> >
>> > Hi,
>> >
>> > I'm glad to see CRAN now has its official policy about Rust [1]!
>> > It seems it probably needs some feedback from those who are familiar
>> with
>> > the Rust workflow. I'm not an expert, but let me leave some quick
>> feedback.
>> > This email is sent to the R-package-devel mailing list as well as to
>> cran@~
>> > so that we can publicly discuss.
>> >
>> > It seems most of the concern is about how to make the build
>> deterministic.
>> > In this regard, the policy should encourage including "Cargo.lock" file
>> > [2]. Cargo.lock is created on the first compile, and the resolved
>> versions
>> > of dependencies are recorded. As long as this file exists, the
>> dependency
>> > versions are locked to the ones in this file, except when the package
>> > author explicitly updates the versions.
>> >
>> > Cargo.lock also records the SHA256 checksums of the crates if they are
>> from
>> > crates.io, Rust's official crate registry. If the checksums don't
>> match,
>> > the build will fail with the following message:
>> >
>> >error: checksum for `foo v0.1.2` changed between lock files
>> >
>> >this could be indicative of a few possible errors:
>> >
>> >* the lock file is corrupt
>> >* a replacement source in use (e.g., a mirror) returned a
>> different
>> > checksum
>> >* the source itself may be corrupt in one way or another
>> >
>> >unable to verify that `foo v0.1.2` is the same as when the lockfile
>> was
>> > generated
>> >
>> > For dependencies from Git repositories, Cargo.lock records the commit
>> > hashes. So, the version of the source code (not the version of the
>> crate)
>> > is uniquely determined. That said, unlike cargo.io, it's possible that
>> the
>> > commit or the Git repository 

[R-pkg-devel] Feedback on "Using Rust in CRAN packages"

2023-07-12 Thread Hiroaki Yutani
Hi,

I'm glad to see CRAN now has an official policy about Rust [1]!
It seems it could use some feedback from those who are familiar with
the Rust workflow. I'm not an expert, but let me leave some quick feedback.
This email is sent to the R-package-devel mailing list as well as to cran@~
so that we can discuss it publicly.

It seems most of the concern is about how to make the build deterministic.
In this regard, the policy should encourage including the "Cargo.lock" file
[2]. Cargo.lock is created on the first compile, and the resolved versions
of all dependencies are recorded in it. As long as this file exists, the
dependency versions are locked to the ones in this file, except when the
package author explicitly updates them.

Cargo.lock also records the SHA256 checksums of the crates if they are from
crates.io, Rust's official crate registry. If the checksums don't match,
the build will fail with the following message:

error: checksum for `foo v0.1.2` changed between lock files

this could be indicative of a few possible errors:

* the lock file is corrupt
* a replacement source in use (e.g., a mirror) returned a different
checksum
* the source itself may be corrupt in one way or another

unable to verify that `foo v0.1.2` is the same as when the lockfile was
generated

For dependencies from Git repositories, Cargo.lock records the commit
hashes, so the version of the source code (not the version of the crate) is
uniquely determined. That said, unlike with crates.io, it's possible that the
commit or the Git repository itself has disappeared by the time of building,
which makes the build fail. So, it might be reasonable for the CRAN policy to
prohibit Git dependencies unless the source code is bundled. I have no strong
opinion here.

Accordingly, I believe this sentence

> In practice maintainers have found it nigh-impossible to meet these
conditions whilst downloading as they have too little control.

is not quite true. More specifically, these things

> The standard way to download a Rust ‘crate’ is by its version number, and
these have been changed without changing their number.
> Downloading a ‘crate’ normally entails downloading its dependencies, and
that is done without fixing their version numbers

won't happen if the R package does include Cargo.lock because

- if the crate is from crates.io, "the version can never be overwritten,
and the code cannot be deleted" there [3]
- if the crate is from a Git repository, the commit hash is unique by
nature. The version of the crate might be the same between commits, but a
Git dependency is specified by the commit hash, not by the version of the
crate.

I'm keen to know what problems the CRAN maintainers have experienced that
Cargo.lock cannot solve. I hope we can help somehow to improve the policy.

Best,
Yutani

[1]: https://cran.r-project.org/web/packages/using_rust.html
[2]: https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
[3]: https://doc.rust-lang.org/cargo/reference/publishing.html



Re: [R-pkg-devel] Is it a wrong assumption that ${R_HOME}/lib always exists?

2023-07-10 Thread Hiroaki Yutani
Thanks so much for the quick response. It answered everything!
It was my mistake that I didn't consider these types of installations.
I'll fix my package as soon as I can.

Best,
Yutani

On Mon, Jul 10, 2023 at 17:51 Martin Maechler wrote:

> >>>>> Hiroaki Yutani
> >>>>> on Mon, 10 Jul 2023 17:24:37 +0900 writes:
>
> [...]
>
> Short answer to your question (from the 'Subject') is: NO!
> For an example, see below:
>
> > libR-sys assumes the path to R's shared libraries is
> > `${R_HOME}/lib` on Unix-alike platforms.
>
> In some configurations, R does not need *any* shared libraries,
> and there,  $R_HOME/lib  does *not* exist.
>
> E.g., I have one of my R-devel versions installed (on Fedora
> Linux) with
>
> ../R/configure --with-blas=-lflexiblas
>
> using the nice, currently Fedora/Redhat-only "flexiblas"
> approach, with which I can nicely switch the versions of BLAS
> and Lapack libraries that R works with from within R.
>
> As that version of R is not "shared", i.e., no libR.so , *and*
> gets both its BLAS and Lapack libraries from "external" (not
> from R), there's no need for a ./lib/  and so none is created.
>
> Martin Maechler
> ETH Zurich  and  R Core Team
>
>
> > Is it possible
> > that this path doesn't exist on the MKL server?
>
> > Actually, it compiles fine on the other Linux platforms,
> > so I'm wondering what's different there from the other
> > servers.
>
>
> > Best, Yutani


[R-pkg-devel] Is it a wrong assumption that ${R_HOME}/lib always exists?

2023-07-10 Thread Hiroaki Yutani
Hi,

My package, string2path, which uses Rust, fails on CRAN's MKL check [1]
with an error that seems unrelated to MKL. The error says:

>   thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value:
Os { code: 2, kind: NotFound, message: "No such file or directory" }',
/data/gannet/ripley/R/packages/tests-MKL/string2path/src/rust/vendor/libR-sys/build.rs:637
:40

Line 637 of build.rs of the libR-sys crate is here [2]:

r_paths.library.canonicalize().unwrap().display()

Compared to the previous version, which didn't experience such a failure,
.canonicalize() was added, and it seems this is the cause. This method
normalizes the path expression, like R's normalizePath(), and it fails when
the specified path doesn't exist.

libR-sys assumes the path to R's shared libraries is `${R_HOME}/lib` on
Unix-alike platforms. Is it possible that this path doesn't exist on the MKL
server? It compiles fine on the other Linux platforms, so I'm wondering
what's different there compared with the other servers.

Best,
Yutani

[1]: https://cran.r-project.org/web/checks/check_results_string2path.html
[2]: https://github.com/extendr/libR-sys/blob/v0.5.0/build.rs#L637C1-L637C58



Re: [Rd] Undocumented change of dirname("C:/") on R-devel on Windows

2023-02-23 Thread Hiroaki Yutani
I confirmed the revert fixed my failing test. Thanks!

On Thu, Feb 23, 2023 at 20:12 Hiroaki Yutani wrote:

> Thanks for the prompt response, I'll confirm it after the new R-devel
> binary is available.
> Also, thanks for the detailed explanation. I agree with you in general.
>
> > "/" in "C:/" is a path separator or not, and whether it is trailing or
> not
>
> It seems a Windows' path basically consists of two components; a drive
> specification (e.g., C:) and the directory structure within the drive. What
> I learned today is that both "C:/" and "C:" are valid path specifications,
> but refer to different locations; "C:" is not the root directory of the
> drive, but just a relative path [1]. So, I agree with you that the basename
> of "C:/" should be "C:/". However, at the same time, I don't feel this is
> worth a breaking change, so I think we can preserve the current (R 4.2.2)
> behavior.
>
> [1]:
> https://learn.microsoft.com/en-us/dotnet/standard/io/file-path-formats#apply-the-current-directory
>
> Best,
> Yutani
>
> On Thu, Feb 23, 2023 at 17:15 Tomas Kalibera wrote:
>
>>
>> On 2/23/23 03:27, Hiroaki Yutani wrote:
>> > Hi,
>> >
>> > I found dirname() behaves differently on R-devel on Windows. Since I'm
>> not
>> > sure which behavior is right, let me ask here before filing this to R's
>> > Bigzilla.
>> >
>> > On R 4.2.2., we get
>> >
>> >  > dirname("C:/")
>> >  [1] "C:/"
>> >
>> > However, on R-devel (r83888), we get
>> >
>> >  > dirname("C:/")
>> >  [1] "."
>> >
>> > ?dirname says 'dirname returns the part of the path up to but excluding
>> the
>> > last path separator, or "." if there is no path separator,' but I don't
>> see
>> > how the root path is supposed to be treated based on this rule (,
>> whether
>> > it's WIndows or UNIX-alike).
>> Thanks for spotting the difference, I've reverted to the previous
>> behavior, the change was unintentional. If you spot any other suspicious
>> changes in behavior in file-system operations, please report.
>> > What should we expect as the return value of dirname("C:/")? I feel the
>> > current behavior on R 4.2.2 is right, but I'd like to confirm.
>>
>> I also think the old behavior is better, even though it could be argued
>> whether the "/" in "C:/" is a path separator or not, and whether it is
>> trailing or not. But the behavior is in line with Unix where dirname of
>> "/" is also "/". Msys2 would return "C:".
>>
>> If  "/" in "C:/" is a path separator but not a trailing path separator,
>> then basename("C:/") should probably be "" and not "C:", and this would
>> be in line with what R does on Unix. However, to be in line with Unix, I
>> think the basename of "C:/" should be "C:/". Yet, Msys2 returns "C:"
>> which is what R does now.
>>
>> So what these functions should do on Windows is definitely tricky. In
>> either case the behavior is now again as in R 4.2.2.
>>
>> Best
>> Tomas
>>
>> >
>> > Best,
>> > Yutani
>> >


Re: [Rd] Undocumented change of dirname("C:/") on R-devel on Windows

2023-02-23 Thread Hiroaki Yutani
Thanks for the prompt response, I'll confirm it after the new R-devel
binary is available.
Also, thanks for the detailed explanation. I agree with you in general.

> "/" in "C:/" is a path separator or not, and whether it is trailing or not

It seems a Windows path basically consists of two components: a drive
specification (e.g., C:) and the directory structure within the drive. What
I learned today is that both "C:/" and "C:" are valid path specifications
but refer to different locations; "C:" is not the root directory of the
drive, just a relative path [1]. So, I agree with you that the basename of
"C:/" should be "C:/". However, at the same time, I don't feel this is worth
a breaking change, so I think we can preserve the current (R 4.2.2)
behavior.

[1]:
https://learn.microsoft.com/en-us/dotnet/standard/io/file-path-formats#apply-the-current-directory

Best,
Yutani

On Thu, Feb 23, 2023 at 17:15 Tomas Kalibera wrote:

>
> On 2/23/23 03:27, Hiroaki Yutani wrote:
> > Hi,
> >
> > I found dirname() behaves differently on R-devel on Windows. Since I'm
> not
> > sure which behavior is right, let me ask here before filing this to R's
> > Bigzilla.
> >
> > On R 4.2.2., we get
> >
> >  > dirname("C:/")
> >  [1] "C:/"
> >
> > However, on R-devel (r83888), we get
> >
> >  > dirname("C:/")
> >  [1] "."
> >
> > ?dirname says 'dirname returns the part of the path up to but excluding
> the
> > last path separator, or "." if there is no path separator,' but I don't
> see
> > how the root path is supposed to be treated based on this rule (whether
> > it's Windows or UNIX-alike).
> Thanks for spotting the difference, I've reverted to the previous
> behavior, the change was unintentional. If you spot any other suspicious
> changes in behavior in file-system operations, please report.
> > What should we expect as the return value of dirname("C:/")? I feel the
> > current behavior on R 4.2.2 is right, but I'd like to confirm.
>
> I also think the old behavior is better, even though it could be argued
> whether the "/" in "C:/" is a path separator or not, and whether it is
> trailing or not. But the behavior is in line with Unix where dirname of
> "/" is also "/". Msys2 would return "C:".
>
> If  "/" in "C:/" is a path separator but not a trailing path separator,
> then basename("C:/") should probably be "" and not "C:", and this would
> be in line with what R does on Unix. However, to be in line with Unix, I
> think the basename of "C:/" should be "C:/". Yet, Msys2 returns "C:"
> which is what R does now.
>
> So what these functions should do on Windows is definitely tricky. In
> either case the behavior is now again as in R 4.2.2.
>
> Best
> Tomas
>
> >
> > Best,
> > Yutani
> >
> >   [[alternative HTML version deleted]]
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>



[Rd] Undocumented change of dirname("C:/") on R-devel on Windows

2023-02-22 Thread Hiroaki Yutani
Hi,

I found dirname() behaves differently on R-devel on Windows. Since I'm not
sure which behavior is right, let me ask here before filing this to R's
Bugzilla.

On R 4.2.2., we get

> dirname("C:/")
[1] "C:/"

However, on R-devel (r83888), we get

> dirname("C:/")
[1] "."

?dirname says 'dirname returns the part of the path up to but excluding the
last path separator, or "." if there is no path separator,' but I don't see
how the root path is supposed to be treated based on this rule (whether
it's Windows or UNIX-alike).

What should we expect as the return value of dirname("C:/")? I feel the
current behavior on R 4.2.2 is right, but I'd like to confirm.

Best,
Yutani



Re: [R-pkg-devel] About the CRAN policy on downloading pre-compiled binary

2022-08-04 Thread Hiroaki Yutani
Hi,

Thank you so much, Tomas, for helping me figure things out. Let me
share some updates (some of which you've already seen in my reply to the
CRAN maintainer) in case they are eventually useful to those
who are curious about using Rust on CRAN.

First of all, let me correct my misunderstanding. I thought what violated
the CRAN policy was that my package downloads a binary, but it actually
seems my faults were:

1. if the compiler exists, my package downloads the **sources** of the
dependencies, whose authorship is not described.
2. if the compiler doesn't exist, my package downloads the pre-compiled
binary **without the agreement of the CRAN team**.

For point 2, as suggested in the previous comment,

> at least check whether the compiler is present (during package
installation), and use it if it is.

my package first checks whether Cargo, Rust's compiler toolchain and package
manager, is installed (more specifically, whether the `cargo` command can
be found on PATH), and falls back to downloading the pre-compiled binary
only when Cargo is not available. So, I believe this can be considered
"a last resort". Again, it was clearly my fault that I didn't ask the CRAN
maintainers for permission in my first submission.
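
To be concrete, the install-time logic is roughly the following (a
simplified sketch, not the exact code in tools/configure.R):

    # Simplified sketch of the install-time fallback; the real script also
    # pins versions and verifies checksums (see below).
    if (nzchar(Sys.which("cargo"))) {
      message("cargo found; building the Rust sources")
      # ... invoke cargo to build the static library ...
    } else {
      message("cargo not found; falling back to the pre-compiled binary (last resort)")
      # ... download the pinned binary and verify its checksum ...
    }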

However, it seems Cargo is already available on the following CRAN machines:

- Linux machines
- Windows Server 2022 (at least on winbuilder. Thanks David B. Dahl for
the information!)
- M1 macOS machine (guessed from the comment from the CRAN maintainer)

If I remember correctly, the Windows machine didn't have Cargo installed
when I submitted my package to CRAN. I really appreciate the CRAN
maintainers' efforts to improve the infrastructure, and I hope Cargo will be
available on the Intel macOS machines in the not-so-distant future so that I
no longer need to provide the pre-compiled binary (I'm not sure I can have
the same hope for Windows Server 2008, which might be a bit too old to
install a reasonably recent version of Cargo, though [2]).


For point 1, it's Cargo that downloads the sources. It downloads the
fixed versions recorded in the bundled "Cargo.lock" file and verifies the
checksums. While Cargo provides a vendoring mechanism [1], I thought it
was not realistic to include the sources because my dependencies were so
huge (132MB). But I was wrong on two points.

First, no matter whether the sources are downloaded or bundled, the CRAN
Policy requires describing the copyright information of "all components of
the package," which in the context of my package means every individual
Rust crate.

>  The ownership of copyright and intellectual property rights of all
components of the package must be clear and unambiguous (including from the
authors specification in the DESCRIPTION file). Where code is copied (or
derived) from the work of others (including from R itself), care must be
taken that any copyright/license statements are preserved and authorship is
not misrepresented.

Second, it was possible to reduce the size to the extent that I can bundle
it: I could tweak some libraries to drop the heavy dependency and compress
the source code (again, thanks to David B. Dahl for the idea).

I have yet to figure out the proper way to list the authors (I'm
currently using inst/COPYRIGHTS, but I'm not fully sure it's right), but
I'm working on tweaking my package to bundle all the sources and to clarify
the authorship of these dependencies [3]. I hope this will keep my package
on CRAN.

Best,
Yutani

[1]: https://doc.rust-lang.org/cargo/commands/cargo-vendor.html
[2]: Actually, it keeps failing on Windows Server 2008 with the
pre-compiled binary probably due to lack of some system API:
https://www.r-project.org/nosvn/R.check/r-oldrel-windows-ix86+x86_64/string2path-00check.html
[3]: https://github.com/yutannihilation/string2path/pull/35 (work in
progress)

2022年7月27日(水) 17:12 Tomas Kalibera :

>
> On 7/27/22 08:08, Tomas Kalibera wrote:
> >
> > On 7/27/22 00:30, Hiroaki Yutani wrote:
> >> Hi,
> >>
> >> Recently I got the following email from the CRAN maintainer about my
> >> package, string2path[1].
> >>
> >> However, I do ensure the binary is the pinned version and verify that the
> >> hash matches the one embedded in the DESCRIPTION [2][3]. In case of a
> >> mismatch, the build fails. So, this mechanism should ensure that I (or
> >> anyone) cannot change the version of the binary without actually
> >> resubmitting to CRAN.
> >
> > Please see the policy cited. Ensuring that the download is of a fixed
> > version refers to the sources (which can be downloaded under the
> > conditions mentioned).
> >
> > Downloading binaries are only a last resort option and requires the
> > agreement of the CRAN team in the first place.
> >

[R-pkg-devel] About the CRAN policy on downloading pre-compiled binary

2022-07-26 Thread Hiroaki Yutani
Hi,

Recently I got the following email from the CRAN maintainer about my
package, string2path[1].

However, I do ensure the binary is the pinned version and verify that the
hash matches the one embedded in the DESCRIPTION [2][3]. In case of a
mismatch, the build fails. So, this mechanism should ensure that I (or
anyone) cannot change the version of the binary without actually
resubmitting to CRAN.
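
For reference, the verification works along these lines (a simplified
sketch, not the actual tools/configure.R; the URL, file name, and hash below
are placeholders, and the real package may use a different hash function):

    # Sketch of the download-and-verify step; everything below is illustrative
    url  <- "https://example.com/prebuilt/libstring2path.tar.gz"  # placeholder URL
    dest <- tempfile(fileext = ".tar.gz")
    download.file(url, dest, mode = "wb", quiet = TRUE)

    expected <- "0123456789abcdef..."         # hash pinned in DESCRIPTION at submission time
    actual   <- unname(tools::md5sum(dest))   # placeholder; the real hash function may differ
    if (!identical(actual, expected)) {
      stop("Checksum mismatch: refusing to use the downloaded binary")
    }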

I believe this complies with the CRAN policy (except for not clarifying the
authorship and copyright). Is there anything I have to address to prove that
I do "ensure that the download is of a fixed version"? Any suggestions are
welcome.

The CRAN policy stipulates
>
> "Where a package wishes to make use of a library not written solely for
> the package, the package installation should first look to see if it is
> already installed and if so is of a suitable version. In case not, it is
> desirable to include the library sources in the package and compile them
> as part of package installation. If the sources are too large, it is
> acceptable to download them as part of installation, but do ensure that
> the download is of a fixed version rather than the latest. Only as a
> last resort and with the agreement of the CRAN team should a package
> download pre-compiled software."
>
> and we have recently seen an instance of a rust-using package whose
> check output changed because what it downloaded had changed.  CRAN
> checking is not set up for that (for example, macOS checks are done once
> only for each version).
>
> Whilst investigating, the Windows' maintainers found that binary libs
> were being downloaded.  And subsequently I found that salso, string2path
> and ymd are downloading compiled code on Intel macOS.
>
> Also, make sure that the authorship and copyright of code you download
> (and hence include in the package) is clear from the DESCRIPTION file,
> as required by the CRAN policy.
>

Best,
Hiroaki Yutani

[1]: https://cran.r-project.org/package=string2path
[2]:
https://github.com/cran/string2path/blob/46020296410cd78e2021bff86cb6f17c681d13a6/DESCRIPTION#L29-L40
[3]:
https://github.com/cran/string2path/blob/46020296410cd78e2021bff86cb6f17c681d13a6/tools/configure.R#L177-L295



Re: [R-pkg-devel] R session crash on closing a graphic device on Windows

2022-01-26 Thread Hiroaki Yutani
I see, thanks. I filed here:

https://bugs.r-project.org/show_bug.cgi?id=18292

Best,
Yutani

2022年1月27日(木) 1:35 Tomas Kalibera :
>
> Hi Yutani,
>
> On 1/26/22 16:42, Hiroaki Yutani wrote:
> > Hi Tomas,
> >
> > Thanks, but, if I understand correctly, there's no room to call the
> > Rust allocator's "free" function in the case of `DevDesc`. A `DevDesc`
> > is supposed to be freed in `GEdestroyDevDesc()` when the device is
> > closed. If I free it by myself, it would cause double-free.
> >
> > So, now I'm wondering if it makes sense that R provides either an API
> > that creates a `DevDesc` instance, or one that accepts a custom
> > allocator for DevDesc. But, as I expect this is a minor use case, I'm
> > not confident enough this would be meaningful.
> > I might end up filing such a feature request on Bugzilla, but let me
> > hold off for a while.
>
> I see. If you are using the public interface and it allows you to
> (indirectly) free the devices, it should allow you also to allocate
> them. So yes, please file a bug report and please provide enough context
> there so that the report is self-sufficient.
>
> Beyond the general rule that any API allowing to allocate or free needs
> to allow both, I'd leave this to the experts on graphics in R.
>
> Thanks,
> Tomas
>
> >
> > Best,
> > Yutani
> >
> > 2022年1月27日(木) 0:21 Tomas Kalibera :
> >>
> >> On 1/26/22 15:44, Hiroaki Yutani wrote:
> >>> Hi Tomas,
> >>>
> >>> Thanks for your helpful advice. This time, it seems the cause of the
> >>> error was an allocator mismatch; I mistakenly allocated the struct on
> >>> Rust's side, which means it's allocated by Rust's allocator, but a
> >>> `DevDesc` is to be freed on R's side. The problem is solved by using
> >>> libc::calloc(), which allocates using the C runtime's allocator, and
> >>> compiling it with the same toolchain that compiles R.
> >> Hi Yutani,
> >>
> >> congratulations on tracing it down.
> >>
> >> Particularly on Windows, whenever a DLL (or any API) is providing a
> >> function to allocate anything, it should provide also a function to free
> >> it, and only that function should be used to do so, even if it is just a
> >> wrapper for malloc() etc. So I would recommend following that, there
> >> should be a Rust allocator's "free" function which you could then call
> >> from R.
> >>
> >> Best
> >> Tomas
> >>
> >>> I also saw some errors when it relates to GC, so it might be some
> >>> PROTECT issue. Thanks for the hint.
> >>>
> >>> During debugging, I learned a lot about how to build R with DEBUG=T
> >>> and use gdb, and it really helped me. I'm yet to unlock the power of
> >>> WinDBG, but I will try next time...
> >>>
> >>> Best,
> >>> Yutani
> >>>
> >>> 2022年1月26日(水) 23:20 Tomas Kalibera :
> >>>
> >>>> Hi Yutani,
> >>>>
> >>>> if you haven't done that already, I would recommend building R with
> >>>> debug symbols (DEBUG=T, so that make file don't strip them) and with -O0
> >>>> (no optimizations), so that the debug symbols are more accurate. Without
> >>>> that, the stack traces can be very misleading. You might try both with
> >>>> windbg and gdb, sometimes one of them provides an extra hint. Ideally
> >>>> you would also build the involved package(s) the same way.
> >>>>
> >>>> Then, it is important to check where the free() is called from, whether
> >>>> it is directly or from the GC. In both cases (but more likely in the
> >>>> latter), it could be caused by somewhat unrelated memory corruption,
> >>>> which may be hard to find - e.g. possibly a PROTECT error. Running with
> >>>> gctorture() might help, if gctorture() changes where the crash happens,
> >>>> it is more likely a somewhat unrelated memory corruption.
> >>>>
> >>>> If it were a double-free or similar allocation error inside R itself (or
> >>>> some of the involved packages), it would be easy to find with a debugger.
> >>>>
> >>>> If debugging this way does not help, you can try narrowing down the
> >>>> example, while preserving the crash. That may make debugging easier, and
> >>>> if you eventually get to a point that you have a reproducible example
> >> involving only base R and base packages, you know it is a bug in R, and
> >> can submit that in a bug report for others to debug.

Re: [R-pkg-devel] R session crash on closing a graphic device on Windows

2022-01-26 Thread Hiroaki Yutani
Hi Tomas,

Thanks, but, if I understand correctly, there's no room to call the
Rust allocator's "free" function in the case of `DevDesc`. A `DevDesc`
is supposed to be freed in `GEdestroyDevDesc()` when the device is
closed. If I free it by myself, it would cause double-free.

So, now I'm wondering if it makes sense that R provides either an API
that creates a `DevDesc` instance, or one that accepts a custom
allocator for DevDesc. But, as I expect this is a minor use case, I'm
not confident enough this would be meaningful.
I might end up filing such a feature request on Bugzilla, but let me
hold off for a while.

Best,
Yutani

2022年1月27日(木) 0:21 Tomas Kalibera :
>
>
> On 1/26/22 15:44, Hiroaki Yutani wrote:
> > Hi Tomas,
> >
> > Thanks for your helpful advice. This time, it seems the cause of the
> > error was an allocator mismatch; I mistakenly allocated the struct on
> > Rust's side, which means it's allocated by Rust's allocator, but a
> > `DevDesc` is to be freed on R's side. The problem is solved by using
> > libc::calloc(), which allocates using the C runtime's allocator, and
> > compiling it with the same toolchain that compiles R.
>
> Hi Yutani,
>
> congratulations on tracing it down.
>
> Particularly on Windows, whenever a DLL (or any API) is providing a
> function to allocate anything, it should provide also a function to free
> it, and only that function should be used to do so, even if it is just a
> wrapper for malloc() etc. So I would recommend following that, there
> should be a Rust allocator's "free" function which you could then call
> from R.
>
> Best
> Tomas
>
> >
> > I also saw some errors when it relates to GC, so it might be some
> > PROTECT issue. Thanks for the hint.
> >
> > During debugging, I learned a lot about how to build R with DEBUG=T
> > and use gdb, and it really helped me. I'm yet to unlock the power of
> > WinDBG, but I will try next time...
> >
> > Best,
> > Yutani
> >
> > 2022年1月26日(水) 23:20 Tomas Kalibera :
> >
> >> Hi Yutani,
> >>
> >> if you haven't done that already, I would recommend building R with
> >> debug symbols (DEBUG=T, so that make file don't strip them) and with -O0
> >> (no optimizations), so that the debug symbols are more accurate. Without
> >> that, the stack traces can be very misleading. You might try both with
> >> windbg and gdb, sometimes one of them provides an extra hint. Ideally
> >> you would also build the involved package(s) the same way.
> >>
> >> Then, it is important to check where the free() is called from, whether
> >> it is directly or from the GC. In both cases (but more likely in the
> >> latter), it could be caused by somewhat unrelated memory corruption,
> >> which may be hard to find - e.g. possibly a PROTECT error. Running with
> >> gctorture() might help, if gctorture() changes where the crash happens,
> >> it is more likely a somewhat unrelated memory corruption.
> >>
> >> If it were a double-free or similar allocation error inside R itself (or
> >> some of the involved packages), it would be easy to find with a debugger.
> >>
> >> If debugging this way does not help, you can try narrowing down the
> >> example, while preserving the crash. That may make debugging easier, and
> >> if you eventually get to a point that you have a reproducible example
> >> involving only base R and base packages, you know it is a bug in R, and
> >> can submit that in a bug report for others to debug.
> >>
> >> Best
> >> Tomas
> >>
> >> On 1/22/22 10:50, Hiroaki Yutani wrote:
> >>> Hi,
> >>>
> >>> I'm trying to create a Rust library that can implement an R graphic
> >>> device[1], but the R session crashes on `dev.off()` on Windows with R
> >>> 4.1.2. Strangely, it works without errors on Linux, on macOS, and even
> >>> on Windows with R-devel.
> >>>
> >>> Looking at the stack trace below by WinDbg, the problem is probably
> >>> that either of the two free()s in GEdestroyDevDesc() tries to free
> >>> some memory that was already freed (I'm a very beginner of this kind
> >>> of debugging, so I might be wrong).
> >>>
> >>>   # Child-SP  RetAddr   Call Site
> >>>   00 `0441cb70 7ffb`3df0be63 
> >>> ntdll!RtlReportFatalFailure+0x9
> >>>   01 `0441cbc0 7ffb`3df14c82
> >>> ntdll!RtlReportCriticalFailure+0x97
> >>>   ...snip...

[R-pkg-devel] R session crash on closing a graphic device on Windows

2022-01-22 Thread Hiroaki Yutani
Hi,

I'm trying to create a Rust library that can implement an R graphics
device [1], but the R session crashes on `dev.off()` on Windows with R
4.1.2. Strangely, it works without errors on Linux, on macOS, and even
on Windows with R-devel.

Looking at the stack trace below from WinDbg, the problem is probably
that one of the two free()s in GEdestroyDevDesc() tries to free
some memory that was already freed (I'm a complete beginner at this kind
of debugging, so I might be wrong).

# Child-SP  RetAddr   Call Site
00 `0441cb70 7ffb`3df0be63 ntdll!RtlReportFatalFailure+0x9
01 `0441cbc0 7ffb`3df14c82
ntdll!RtlReportCriticalFailure+0x97
...snip...
08 `0441cfc0 7ffb`3c30c6ac ntdll!RtlFreeHeap+0x51
09 `0441d000 `6c7bcf99 msvcrt!free+0x1c
0a `0441d030 `6c79e7de R!GEdestroyDevDesc+0x59
0b `0441d080 `6fc828e9 R!GEcurrentDevice+0x37e
0c `0441d0f0 `6c7a15fa grDevices!devoff+0x59
...snip...

But I found no difference in the related code (around devoff() and
GEdestroyDevDesc()) between R 4.1.2 and R-devel. I know there are a
lot of feature additions in R-devel, but I don't think they affect things
here. Has anyone suffered from similar crashes? Am I missing
something?

I would appreciate any advice.

Best,
Yutani

[1]: https://github.com/extendr/extendr/pull/360



Re: [Rd] R on Windows with UCRT and the system encoding

2021-12-21 Thread Hiroaki Yutani
Hi Tomas,

Thanks for your prompt reply and for spotting the right place. While I'm
not good at C/C++ things, I'll try investigating this and, if
possible, creating a patch to fix the issue. As the UTF-8 R on
Windows is really exciting news to those of us in CJK locales, I'd like to
do my best to help make the upcoming release a success.

I'll report on Bugzilla with more details first. Thanks for your support.

Best,
Yutani

2021年12月22日(水) 0:23 Tomas Kalibera :

>
> Hi Yutani,
>
> On 12/21/21 3:47 PM, Hiroaki Yutani wrote:
> > Hi Tomas,
> >
> > Thank you very much for the detailed explanation! I think now I have a
> > bit better understanding on how the things work; at least now I know I
> > didn't understand the concept of "active code page". I'll follow your
> > advice when I need to fix the packages that need some tweaks to handle
> > UTF-8 properly.
> >
> > Sorry, I'd like to ask one more question related to locale. If I copy
> > the following text and execute `read.csv("clipboard")`, it returns
> > "uao" instead of "úáö" (the characters are transliterated).
> >
> >  "col1","col2"
> >  "úáö","úáö"
> >
> >
> > While this is probably the status quo (the same behavior on R 4.1) on
> > Latin-1 encoding, things are worse on CJK locales. If I try,
> >
> >  "col1","col2"
> >  "あ","い"
> >
> > I get the following error:
> >
> >  > read.csv("clipboard")
> >  Error in type.convert.default(data[[i]], as.is = as.is[i], dec = dec,  
> > :
> >invalid multibyte string at '<82>'
> >
> > Is this supposed to work? It seems the characters are encoded as CP932
> > (my system locale) but marked as UTF-8.
> >
> >  > x <- utils:::readClipboard()
> >  > x
> >  [1] "\"col1\",\"col2\"" "\"\x82\xa0\",\"\x82\xa2\""
> >  > iconv(x, from = "CP932", to = "UTF-8")
> >  [1] "\"col1\",\"col2\"" "\"あ\",\"い\""
> >
> > I read the source code of readClipboard() in
> > src/library/utils/src/windows/util.c, but have no idea if there's
> > anything that needs to be fixed.
>
> Yes, this should work. I can reproduce the problem on my system, the
> clipboard apparently contains the Unicode characters, but R does not get
> them correctly, and from my quick read, it is a bug in R.
>
> My guess is this is in connections.c, where we call
> GetClipboardData(CF_TEXT). Perhaps if we used CF_UNICODETEXT, it would
> work (or alternatively CF_TEXT but also CF_LOCALE to find out what is
> the locale used, but CF_UNICODETEXT seems simpler). See
> https://docs.microsoft.com/en-us/windows/win32/dataxchg/standard-clipboard-formats
>
> As you started looking at the code, would you like to try
> debugging/fixing this?
>
> Best
> Tomas
>
> >
> > Best,
> > Yutani
> >
> > 2021年12月21日(火) 17:26 Tomas Kalibera :
> >
> >
> >
> >
> >
> >> Hi Yutani,
> >>
> >> On 12/21/21 6:34 AM, Hiroaki Yutani wrote:
> >>> Hi,
> >>>
> >>> I'm more than excited about the announcement about the upcoming UTF-8
> >>> R on Windows. Let me confirm my understanding. Is R 4.2 supposed to
> >>> work on Windows with non-UTF-8 encoding as the system locale? I think
> >>> this blog post indicates so (as this describes the older Windows than
> >>> the UTF-8 era), but I'm not fully confident if I understand the
> >>> details correctly.
> >> R 4.2 will automatically use UTF-8 as the active code page (system
> >> locale) and the C library encoding and the R current native encoding on
> >> systems which allow this (recent Windows 10 and newer, Windows Server
> >> 2022, etc). There is no way to opt-out from that, and of course no
> >> reason to, either. It does not matter what the system locale is set to
> >> in Windows for the whole system - these recent Windows allow individual
> >> applications to override the system-wide setting to UTF-8, which is what
> >> R does. Typically the system-wide setting will not be UTF-8, because
> >> many applications will not work with that.
> >>
> >> On older systems, R 4.2 will run in some other system locale and the
> >> same C library encoding and R current native encoding - the same system
> default as R 4.1 would run on that system.

Re: [Rd] R on Windows with UCRT and the system encoding

2021-12-21 Thread Hiroaki Yutani
Hi Tomas,

Thank you very much for the detailed explanation! I think I now have a
bit better understanding of how things work; at least now I know I
didn't understand the concept of "active code page". I'll follow your
advice when I need to fix the packages that need some tweaks to handle
UTF-8 properly.

Sorry, I'd like to ask one more question related to locale. If I copy
the following text and execute `read.csv("clipboard")`, it returns
"uao" instead of "úáö" (the characters are transliterated).

"col1","col2"
"úáö","úáö"


While this is probably the status quo (the same behavior on R 4.1) on
Latin-1 encoding, things are worse on CJK locales. If I try,

"col1","col2"
"あ","い"

I get the following error:

> read.csv("clipboard")
Error in type.convert.default(data[[i]], as.is = as.is[i], dec = dec,  :
  invalid multibyte string at '<82>'

Is this supposed to work? It seems the characters are encoded as CP932
(my system locale) but marked as UTF-8.

> x <- utils:::readClipboard()
> x
[1] "\"col1\",\"col2\"" "\"\x82\xa0\",\"\x82\xa2\""
> iconv(x, from = "CP932", to = "UTF-8")
[1] "\"col1\",\"col2\"" "\"あ\",\"い\""

I read the source code of readClipboard() in
src/library/utils/src/windows/util.c, but have no idea if there's
anything that needs to be fixed.
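
Until this is fixed, my workaround looks roughly like this (a sketch only;
CP932 because my system locale is Japanese, other locales would need their
own code page):

    # Workaround sketch: read the clipboard as raw lines and re-encode manually
    x  <- utils:::readClipboard()
    x  <- iconv(x, from = "CP932", to = "UTF-8")  # use the actual system code page here
    df <- read.csv(text = x)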

Best,
Yutani

2021年12月21日(火) 17:26 Tomas Kalibera :





>
> Hi Yutani,
>
> On 12/21/21 6:34 AM, Hiroaki Yutani wrote:
> > Hi,
> >
> > I'm more than excited about the announcement about the upcoming UTF-8
> > R on Windows. Let me confirm my understanding. Is R 4.2 supposed to
> > work on Windows with non-UTF-8 encoding as the system locale? I think
> > this blog post indicates so (as this describes the older Windows than
> > the UTF-8 era), but I'm not fully confident if I understand the
> > details correctly.
>
> R 4.2 will automatically use UTF-8 as the active code page (system
> locale) and the C library encoding and the R current native encoding on
> systems which allow this (recent Windows 10 and newer, Windows Server
> 2022, etc). There is no way to opt-out from that, and of course no
> reason to, either. It does not matter what the system locale is set to
> in Windows for the whole system - these recent Windows allow individual
> applications to override the system-wide setting to UTF-8, which is what
> R does. Typically the system-wide setting will not be UTF-8, because
> many applications will not work with that.
>
> On older systems, R 4.2 will run in some other system locale and the
> same C library encoding and R current native encoding - the same system
> default as R 4.1 would run on that system. So for some time, encoding
> support for this in R will have to stay, but eventually will be removed.
> But yes, R 4.2 is still supposed to work on such systems.
>
> > https://developer.r-project.org/Blog/public/2021/12/07/upcoming-changes-in-r-4.2-on-windows/index.html
> >
> > If so, I'm curious what the package authors should do when the locales
> > are different between OS and R. For example (disclaimer: I don't
> > intend to blame processx at all. Just for an example), the CRAN check
> > on the processx package currently fails with this warning on R-devel
> > Windows.
> >
> >>  1. UTF-8 in stdout (test-utf8.R:85:3) - Invalid multi-byte character 
> >> at end of stream ignored
> > https://cran.r-project.org/web/checks/check_results_processx.html
> >
> > As far as I know, processx launches an external process and captures
> > its output, and I suspect the problem is that the output of the
> > process is encoded in non-UTF-8 while R assumes it's UTF-8. I
> > experienced similar problems with other packages as well, which
> > disappear if I switch the locale to the same one as the OS by
> > Sys.setlocale(). So, I think it would be great if there's some
> > guidance for the package authors on how to handle these properly.
>
> Incidentally I've debugged this case and sent a detailed analysis to the
> maintainer, so he knows about the problem.
>
> In short, you cannot assume in Windows that different applications use
> the same system encoding. That is not true at least with the invention
> of the fusion manifests which allow an application to switch to UTF-8 as
> system encoding, which R does. So, when using an external application on
> Windows, you need to know and respect a specific encoding used by that
> application on input and output.
>

[Rd] R on Windows with UCRT and the system encoding

2021-12-20 Thread Hiroaki Yutani
Hi,

I'm more than excited about the announcement of the upcoming UTF-8
R on Windows. Let me confirm my understanding. Is R 4.2 supposed to
work on Windows with a non-UTF-8 encoding as the system locale? I think
this blog post indicates so (as it describes Windows versions older than
the UTF-8 era), but I'm not fully confident that I understand the
details correctly.

https://developer.r-project.org/Blog/public/2021/12/07/upcoming-changes-in-r-4.2-on-windows/index.html

If so, I'm curious what the package authors should do when the locales
are different between OS and R. For example (disclaimer: I don't
intend to blame processx at all. Just for an example), the CRAN check
on the processx package currently fails with this warning on R-devel
Windows.

> 1. UTF-8 in stdout (test-utf8.R:85:3) - Invalid multi-byte character at 
> end of stream ignored
https://cran.r-project.org/web/checks/check_results_processx.html

As far as I know, processx launches an external process and captures
its output, and I suspect the problem is that the output of the
process is encoded in non-UTF-8 while R assumes it's UTF-8. I
experienced similar problems with other packages as well, which
disappear if I switch the locale to the same one as the OS by
Sys.setlocale(). So, I think it would be great if there's some
guidance for the package authors on how to handle these properly.
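
To illustrate what I mean, the handling that seems to be needed is roughly
the following ("some_tool" is a placeholder, and CP932 is just the code page
of my Japanese system; the right encoding depends on the tool):

    # Illustration only: re-encode the captured output of an external program
    out <- system2("some_tool", stdout = TRUE)       # "some_tool" is a placeholder
    out <- iconv(out, from = "CP932", to = "UTF-8")  # whatever encoding the tool actually emits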

Any suggestions?

Best,
Yutani



Re: [R-pkg-devel] New CRAN checks on r-devel-windows-x86_64-new-UL and the installed fonts

2021-12-16 Thread Hiroaki Yutani
Yes, it's my fault that I didn't consider the case where no fonts are
available. I'll improve the code before the next submission to CRAN.
Thanks for your advice!
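
Concretely, I plan to make the example degrade gracefully along these lines
(a sketch of the plan, not the released code):

    # Sketch: skip the illustration when no usable font is found
    available_fonts <- systemfonts::system_fonts()$path
    ttf_or_otf <- available_fonts[grepl("\\.(ttf|otf)$", available_fonts, ignore.case = TRUE)]

    if (length(ttf_or_otf) == 0) {
      message("No TrueType/OpenType font found; skipping this example.")
    } else {
      font <- ttf_or_otf[1]
      # ... run the illustration using `font` ...
    }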

Best,
Hiroaki Yutani

2021年12月17日(金) 1:40 Tomas Kalibera :
>
>
> On 12/16/21 5:16 PM, Hiroaki Yutani wrote:
> > Thanks for the details and the suggestions. My package uses
> > systemfonts package for illustration purposes only in the examples, so
> > I'm not that desperate to find the root cause this time. I'll try
> > using winbuilder in case I need to.
>
> I see. Maybe best then making the example more robust when no fonts are
> found, on different platforms.
>
> Best
> Tomas
>
> >
> > Best,
> > Hiroaki Yutani
> >
> > 2021年12月17日(金) 0:52 Tomas Kalibera :
> >
> >>
> >> On 12/16/21 4:17 PM, Hiroaki Yutani wrote:
> >>>> This would be an empty character vector on my Alpine Linux server as
> >>>> well.
> >>> I see, thanks for the information. Sorry for my lack of consideration on 
> >>> this.
> >>>
> >>>> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> >>>> does not find any of these.
> >>> Thanks for investigating quickly! Then, it seems I should wait for the
> >>> problem to be solved on systemfonts' side. I'm curious what's the
> >>> difference between r-devel-windows-x86_64-new-TK, on which the check
> >>> doesn't fail, by the way.
> >> It is also Windows Server 2022 with GUI, a quite fresh installation. The
> >> checks run in a docker container (also WS2022,
> >> mcr.microsoft.com/windows/server:ltsc2022, without any manually
> >> installed fonts).
> >>
> >> Actually there is no manually installed software there, all that is
> >> installed is (in this order):
> >>
> >> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt/r/setup_miktex_standalone.ps1
> >> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt3/r/setup.ps1
> >> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt3/r_packages/setup_checks.ps1
> >>
> >> plus R and R packages.
> >>
> >> I assume you can reproduce on Winbuilder, and so perhaps you could
> >> create a version of your package with a lot of instrumentation/print
> >> messages and submit there to find the cause? Possibly also an
> >> instrumented variant of systemfonts.
> >>
> >> Best
> >> Tomas
> >>
> >>> Best,
> >>> Hiroaki Yutani
> >>>
> >>> 2021年12月16日(木) 23:57 Uwe Ligges :
> >>>>
> >>>> On 16.12.2021 15:34, Sebastian Meyer wrote:
> >>>>> Am 16.12.21 um 15:06 schrieb Hiroaki Yutani:
> >>>>>> Hi,
> >>>>>>
> >>>>>> My package is failing on CRAN check on r-devel-windows-x86_64-new-UL.
> >>>>>>
> >>>>>> https://cran.r-project.org/web/checks/check_results_string2path.html
> >>>>>>
> >>>>>> It seems the problem is that there is no available font that meets the
> >>>>>> condition in the following code. Is it irrational to assume at least
> >>>>>> one TrueType or OpenType font is installed in the system?
> >>>>>>
> >>>>>>available_fonts <- systemfonts::system_fonts()$path
> >>>>>>
> >>>>>># string2path supports only TrueType or OpenType formats
> >>>>>>ttf_or_otf <- available_fonts[grepl("\\.(ttf|otf)$",
> >>>>>> available_fonts)]
> >>>>>>
> >>>> The machine that is running " r-devel-windows-x86_64-new-UL" is a
> >>>> freshly installed Windows Server 2022 with GUI. The standard fonts are
> >> available, but no extra fonts installed.
> >>>>
> >>>> To confirm:
> >>>>
> >>>>> table(gsub(".*\\.(.{3})$", "\\1", dir("c:/WIndows/fonts")))
> >>>>
> >>>> dat fon ini ttc ttf xml
> >>>>  1 192   1  16 127   1
> >>>>
> >>>>
> >>>> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> >>>> does not find any of these.
> >>>>
> >>>> Best,
> >>>> Uwe Ligges
> >>>>
> >>>>
> >>>>

Re: [R-pkg-devel] New CRAN checks on r-devel-windows-x86_64-new-UL and the installed fonts

2021-12-16 Thread Hiroaki Yutani
Thanks for the details and the suggestions. My package uses the
systemfonts package only for illustration purposes in the examples, so
I'm not that desperate to find the root cause this time. I'll try
using winbuilder in case I need to.

Best,
Hiroaki Yutani

2021年12月17日(金) 0:52 Tomas Kalibera :

>
>
> On 12/16/21 4:17 PM, Hiroaki Yutani wrote:
> >> This would be an empty character vector on my Alpine Linux server as
> >> well.
> > I see, thanks for the information. Sorry for my lack of consideration on 
> > this.
> >
> >> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> >> does not find any of these.
> > Thanks for investigating quickly! Then, it seems I should wait for the
> > problem to be solved on systemfonts' side. I'm curious what's the
> > difference between r-devel-windows-x86_64-new-TK, on which the check
> > doesn't fail, by the way.
>
> It is also Windows Server 2022 with GUI, a quite fresh installation. The
> checks run in a docker container (also WS2022,
> mcr.microsoft.com/windows/server:ltsc2022, without any manually
> installed fonts).
>
> Actually there is no manually installed software there, all that is
> installed is (in this order):
>
> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt/r/setup_miktex_standalone.ps1
> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt3/r/setup.ps1
> https://svn.r-project.org/R-dev-web/trunk/WindowsBuilds/winutf8/ucrt3/r_packages/setup_checks.ps1
>
> plus R and R packages.
>
> I assume you can reproduce on Winbuilder, and so perhaps you could
> create a version of your package with a lot of instrumentation/print
> messages and submit there to find the cause? Possibly also an
> instrumented variant of systemfonts.
>
> Best
> Tomas
>
> >
> > Best,
> > Hiroaki Yutani
> >
> > 2021年12月16日(木) 23:57 Uwe Ligges :
> >>
> >>
> >> On 16.12.2021 15:34, Sebastian Meyer wrote:
> >>> Am 16.12.21 um 15:06 schrieb Hiroaki Yutani:
> >>>> Hi,
> >>>>
> >>>> My package is failing on CRAN check on r-devel-windows-x86_64-new-UL.
> >>>>
> >>>> https://cran.r-project.org/web/checks/check_results_string2path.html
> >>>>
> >>>> It seems the problem is that there is no available font that meets the
> >>>> condition in the following code. Is it irrational to assume at least
> >>>> one TrueType or OpenType font is installed in the system?
> >>>>
> >>>>   available_fonts <- systemfonts::system_fonts()$path
> >>>>
> >>>>   # string2path supports only TrueType or OpenType formats
> >>>>   ttf_or_otf <- available_fonts[grepl("\\.(ttf|otf)$",
> >>>> available_fonts)]
> >>>>
> >> The machine that is running " r-devel-windows-x86_64-new-UL" is a
> >> freshly installed Windows Server 2022 with GUI. The standard fonts are
> >> available, but no extra fonts installed.
> >>
> >> To confirm:
> >>
> >>   > table(gsub(".*\\.(.{3})$", "\\1", dir("c:/WIndows/fonts")))
> >>
> >> dat fon ini ttc ttf xml
> >> 1 192   1  16 127   1
> >>
> >>
> >> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> >> does not find any of these.
> >>
> >> Best,
> >> Uwe Ligges
> >>
> >>
> >>
> >>
> >>
> >>> This would be an empty character vector on my Alpine Linux server as
> >>> well. The system_fonts() there only contain ".pcf.gz" files from
> >>> "/usr/share/fonts/misc/".
> >>>
> >>> Note that the "systemfonts" package on which you rely currently also
> >>> fails on that CRAN check flavour for a similar reason
> >>> (https://cran.r-project.org/web/checks/check_results_systemfonts.html).
> >>> On my Alpine Linux system, from example("register_font",
> >>> package="systemfonts"):
> >>>
> >>> fonts <- system_fonts()
> >>> plain <- sample(which(!fonts$italic & fonts$weight <= 'normal'), 1)
> >>> bold <- sample(which(!fonts$italic & fonts$weight > 'normal'), 1)
> >>> italic <- sample(which(fonts$italic & fonts$weight <= 'normal'), 1)
> >>> ## Error in sample.int(length(x), size, replace, prob) :
> >>> ##   invalid first argument
> >>>

Re: [R-pkg-devel] New CRAN checks on r-devel-windows-x86_64-new-UL and the installed fonts

2021-12-16 Thread Hiroaki Yutani
> This would be an empty character vector on my Alpine Linux server as
> well.

I see, thanks for the information. Sorry for my lack of consideration on this.

> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> does not find any of these.

Thanks for investigating quickly! Then, it seems I should wait for the
problem to be solved on systemfonts' side. By the way, I'm curious what the
difference is from r-devel-windows-x86_64-new-TK, on which the check
doesn't fail.

Best,
Hiroaki Yutani

2021年12月16日(木) 23:57 Uwe Ligges :
>
>
>
> On 16.12.2021 15:34, Sebastian Meyer wrote:
> > Am 16.12.21 um 15:06 schrieb Hiroaki Yutani:
> >> Hi,
> >>
> >> My package is failing on CRAN check on r-devel-windows-x86_64-new-UL.
> >>
> >> https://cran.r-project.org/web/checks/check_results_string2path.html
> >>
> >> It seems the problem is that there is no available font that meets the
> >> condition in the following code. Is it irrational to assume at least
> >> one TrueType or OpenType font is installed in the system?
> >>
> >>  available_fonts <- systemfonts::system_fonts()$path
> >>
> >>  # string2path supports only TrueType or OpenType formats
> >>  ttf_or_otf <- available_fonts[grepl("\\.(ttf|otf)$",
> >> available_fonts)]
> >>
>
> The machine that is running " r-devel-windows-x86_64-new-UL" is a
> freshly installed Windows Server 2022 with GUI. The standard fonts are
> available, but no extra fonts installed.
>
> To confirm:
>
>  > table(gsub(".*\\.(.{3})$", "\\1", dir("c:/WIndows/fonts")))
>
> dat fon ini ttc ttf xml
>1 192   1  16 127   1
>
>
> So there are 127 *.ttf files installed, but systemfonts::system_fonts()
> does not find any of these.
>
> Best,
> Uwe Ligges
>
>
>
>
>
> > This would be an empty character vector on my Alpine Linux server as
> > well. The system_fonts() there only contain ".pcf.gz" files from
> > "/usr/share/fonts/misc/".
> >
> > Note that the "systemfonts" package on which you rely currently also
> > fails on that CRAN check flavour for a similar reason
> > (https://cran.r-project.org/web/checks/check_results_systemfonts.html).
> > On my Alpine Linux system, from example("register_font",
> > package="systemfonts"):
> >
> > fonts <- system_fonts()
> > plain <- sample(which(!fonts$italic & fonts$weight <= 'normal'), 1)
> > bold <- sample(which(!fonts$italic & fonts$weight > 'normal'), 1)
> > italic <- sample(which(fonts$italic & fonts$weight <= 'normal'), 1)
> > ## Error in sample.int(length(x), size, replace, prob) :
> > ##   invalid first argument
> >
> > (as there are no italic fonts).
> >
> > HTH,
> >
> >  Sebastian Meyer
> >
> >> I'm wondering if I need to release a new version to avoid this test
> >> failure. Note that, the other Windows r-devel machine
> >> (r-devel-windows-x86_64-new-TK) doesn't fail. So, it might be just
> >> that something is wrong with r-devel-windows-x86_64-new-UL.
> >>
> >> Any suggestions?
> >>
> >> Best,
> >> Hiroaki Yutani
> >>
> >> __
> >> R-package-devel@r-project.org mailing list
> >> https://stat.ethz.ch/mailman/listinfo/r-package-devel
> >>
> >
> > __
> > R-package-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-package-devel



[R-pkg-devel] New CRAN checks on r-devel-windows-x86_64-new-UL and the installed fonts

2021-12-16 Thread Hiroaki Yutani
Hi,

My package is failing the CRAN check on r-devel-windows-x86_64-new-UL.

https://cran.r-project.org/web/checks/check_results_string2path.html

It seems the problem is that there is no available font that meets the
condition in the following code. Is it unreasonable to assume at least
one TrueType or OpenType font is installed on the system?

available_fonts <- systemfonts::system_fonts()$path

# string2path supports only TrueType or OpenType formats
ttf_or_otf <- available_fonts[grepl("\\.(ttf|otf)$", available_fonts)]

I'm wondering if I need to release a new version to avoid this test
failure. Note that the other Windows r-devel machine
(r-devel-windows-x86_64-new-TK) doesn't fail, so it might just be
that something is wrong with r-devel-windows-x86_64-new-UL.

Any suggestions?

Best,
Hiroaki Yutani



Re: [R-pkg-devel] Internet resources and Errors

2021-09-24 Thread Hiroaki Yutani
Hi,

Regarding Solaris, while the criteria are unknown, it seems Solaris can
be excluded from the CRAN check targets when there's a good reason not
to support it. My package got a failure on the Solaris check and I
resubmitted with the following comment; Solaris then disappeared from
the check results [1].

> I would like to request to exclude Solaris from the build targets because 
> Solaris is not a supported platform by Rust. This should be in line with the 
> treatments of other CRAN packages that use Rust; gifski, baseflow, and salso 
> are not built on Solaris. I'm sorry that I didn't write this in the first 
> submission.

Again, I'm not sure on what criteria they decided to accept my
suggestion, and it's unclear whether your package falls into the same case.
Some months ago, I asked "How can I avoid Solaris build on CRAN?" on this
mailing list [2], but got no reply. So, probably no one knows the right
answer...

Best,
Hiroaki Yutani

[1]: https://cran.r-project.org/web/checks/check_results_string2path.html
[2]: https://stat.ethz.ch/pipermail/r-package-devel/2021q3/007262.html

2021年9月24日(金) 23:50 Roy Mendelssohn - NOAA Federal via R-package-devel
:


>
> Hi All:
>
> I am getting dinged again on CRAN  (just Solaris for some reason),  and the 
> problem is how to exit if there is a failure of accessing the resource,  I 
> know it has been discussed here before,  but I just am not understanding what 
> is desired to end properly. As I have been reminded:
>
> 'Packages which use Internet resources should fail gracefully with an 
> informative message
> if the resource is not available or has changed (and not give a check warning 
> nor error).'
>
> All internet calls are wrapped in 'try()'.  If that shows an error,  I  write 
> a message to the screen about the error,  and call stop(), perhaps with a 
> further message in that call.   Somehow this does not appear to meet the 
> standard.Can someone then please tell me what I should do instead.  The 
> point is I have checked that the access to the internet resources has worked, 
>  i have seen that it hasn't,  now what should be the steps to take to exit 
> gracefully.
>
> I also want to add to what others have said about the frustrations of dealing 
> with Solaris.  I have spent a fair amount of time getting things to  work 
> with Solaris which no one uses.  In this instance the package test is only 
> failing on Solaris.   Not a good use of limited time IMO.
>
> Thanks for any advice.
>
> -Roy
>
>
>
> **
> "The contents of this message do not reflect any position of the U.S. 
> Government or NOAA."
> **
> Roy Mendelssohn
> Supervisory Operations Research Analyst
> NOAA/NMFS
> Environmental Research Division
> Southwest Fisheries Science Center
> ***Note new street address***
> 110 McAllister Way
> Santa Cruz, CA 95060
> Phone: (831)-420-3666
> Fax: (831) 420-3980
> e-mail: roy.mendelss...@noaa.gov www: https://www.pfeg.noaa.gov/
>
> "Old age and treachery will overcome youth and skill."
> "From those who have been given much, much will be expected"
> "the arc of the moral universe is long, but it bends toward justice" -MLK Jr.
>
> __
> R-package-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-package-devel



Re: [Rd] Detect UCRT-built R from within R sessions (and in configure.win)

2021-09-23 Thread Hiroaki Yutani
> Thanks, that's right, so I've ported this part to R-devel and R-patched,

I noticed a while ago that R-devel no longer complains about this; thanks.

> With R.version$crt, you can already get a make (or even environment)
> variable. Writing R Extensions has examples how to invoke R in make
> files to get "R CMD config" values, so here you would invoke "Rscript"
> instead with one of the conditions above.

This slipped my mind, thanks for pointing it out! Yes, this works
perfectly without configure.ucrt. I will stick with this for a while, at
least until the next version of R is released.
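
For the record, the check itself is just the condition Tomas suggested; a
rough, untested sketch of how it can be called from a package's build files
(the library name in the example is made up):

    ## In R: TRUE only on a UCRT build (R.version$crt is absent on older builds)
    identical(R.version$crt, "ucrt")

    ## In Makevars.win, something along these lines (note the $$ escaping for make):
    ##   CRT = $(shell "${R_HOME}/bin${R_ARCH_BIN}/Rscript.exe" -e "cat(identical(R.version$$crt, 'ucrt'))")
    ##   ifeq ($(CRT),TRUE)
    ##     PKG_LIBS += -lfoo_ucrt
    ##   endif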

> ... The .ucrt
> files are easier to maintain in hot-patches, but that is not an
> advantage for package authors.

I see, I think I get your point now. So, even if all the package
authors chose to use the Rscript way, the .ucrt files would still be
needed to leave room for the (R? or CRAN?) maintainers to
hot-patch packages that don't work nicely on UCRT. Thanks for all
the efforts to make the UCRT R a reality.

Best,
Hiroaki Yutani

2021年9月24日(金) 2:16 Tomas Kalibera :


>
>
> On 9/20/21 11:03 AM, Hiroaki Yutani wrote:
> > I tried to use configure.ucrt, and found it results in the following
> > NOTE on the released version of R, unfortunately.
> >
> >  * checking top-level files ... NOTE
> >  Non-standard file/directory found at top level:
> >  'configure.ucrt'
> >
> > Will this be accepted by CRAN if I submit a package that contains
> > configure.ucrt? Or, is it too early to use it in a CRAN package?
>
> Thanks, that's right, so I've ported this part to R-devel and R-patched,
> configure.ucrt and cleanup.ucrt will be treated as "standard". There is
> nothing we can do about already released versions, the NOTE will appear.
>
> You can also use configure.win and branch on R.version$crt, e.g.
>
> !is.null(R.version$crt) && R.version$crt == "ucrt"
>
> or
>
> identical(R.version$crt, "ucrt")
>
> > In either case, while I don't have a strong opinion here, I'm starting
> > to feel that it might be preferable to provide an environmental
> > variable rather than creating ".ucrt" versions of files. In my
> > understanding, the plan is to switch all the Windows R to UCRT at some
> > point in future. But, it's not clear to me how to unify these ".win"
> > files and ".ucrt" files smoothly.
>
> With R.version$crt, you can already get a make (or even environment)
> variable. Writing R Extensions has examples how to invoke R in make
> files to get "R CMD config" values, so here you would invoke "Rscript"
> instead with one of the conditions above.
>
> Either is fine. With .ucrt files, you can avoid copy pasting of common
> code using "include" directives. With the variable, you can use make
> conditionals. As you found now, with the variable you have the advantage
> of not getting a NOTE with already released versions of R. The .ucrt
> files are easier to maintain in hot-patches, but that is not an
> advantage for package authors.
>
> Once a package depends on a version of R that will already use UCRT, one
> either would refactor/remove the conditionals, or integrate the ".ucrt"
> files back into the ".win". So, in the long term, there should be no
> conditionals on R.version$crt nor ".ucrt" files.
>
> Best
> Tomas
> > Best,
> > Hiroaki Yutani
> >
> > 2021年9月14日(火) 23:44 Hiroaki Yutani :
> >
> >> Thanks for both, I'll try these features.
> >>
> >> 2021年9月14日(火) 22:40 Tomas Kalibera :
> >>
> >>>
> >>> On 9/9/21 5:54 AM, Hiroaki Yutani wrote:
> >>>
> >>> Thank you for the prompt reply.
> >>>
> >>>> There is not such a mechanism yet, but one can be added, at least for
> >>>> diagnostics.
> >>> For example, can R.version somehow contain the information?
> >>>
> >>> Yes, now added to the experimental builds. R.version$crt contains "ucrt" 
> >>> (and would contain "msvcrt" if R was built against MSVCRT).
> >>>
> >>>
> >>>> We could add support for configure.ucrt, which would take precedence
> >>>> over configure.win on the UCRT builds (like Makevars.ucrt takes
> >>>> precedence over Makevars.win). Would that work for you?
> >>> Yes, configure.ucrt should work for me. There might be someone who 
> >>> prefers to switch by some envvar rather than creating another file, but I 
> >>> don't have a strong opinion here.
>

Re: [Rd] Detect UCRT-built R from within R sessions (and in configure.win)

2021-09-20 Thread Hiroaki Yutani
I tried to use configure.ucrt, and found it results in the following
NOTE on the released version of R, unfortunately.

* checking top-level files ... NOTE
Non-standard file/directory found at top level:
'configure.ucrt'

Will this be accepted by CRAN if I submit a package that contains
configure.ucrt? Or, is it too early to use it in a CRAN package?

In either case, while I don't have a strong opinion here, I'm starting
to feel that it might be preferable to provide an environment
variable rather than creating ".ucrt" versions of files. In my
understanding, the plan is to switch all Windows builds of R to UCRT at some
point in the future. But it's not clear to me how to unify these ".win"
files and ".ucrt" files smoothly.

Best,
Hiroaki Yutani

2021年9月14日(火) 23:44 Hiroaki Yutani :

>
> Thanks for both, I'll try these features.
>
> 2021年9月14日(火) 22:40 Tomas Kalibera :
>
> >
> >
> > On 9/9/21 5:54 AM, Hiroaki Yutani wrote:
> >
> > Thank you for the prompt reply.
> >
> > > There is not such a mechanism yet, but one can be added, at least for
> > > diagnostics.
> >
> > For example, can R.version somehow contain the information?
> >
> > Yes, now added to the experimental builds. R.version$crt contains "ucrt" 
> > (and would contain "msvcrt" if R was built against MSVCRT).
> >
> >
> > > We could add support for configure.ucrt, which would take precedence
> > > over configure.win on the UCRT builds (like Makevars.ucrt takes
> > > precedence over Makevars.win). Would that work for you?
> >
> > Yes, configure.ucrt should work for me. There might be someone who prefers 
> > to switch by some envvar rather than creating another file, but I don't 
> > have a strong opinion here.
> >
> > The experimental builds now support configure.ucrt and cleanup.ucrt files.
> >
> > Best
> > Tomas
> >
> >
> > Best,
> > Hiroaki Yutani
> >
> > 2021年9月9日(木) 0:48 Tomas Kalibera :
> >>
> >>
> >> On 9/8/21 2:08 PM, Hiroaki Yutani wrote:
> >> > Hi,
> >> >
> >> > Are there any proper ways to know whether the session is running on
> >> > the R that is built with the UCRT toolchain or not? Checking if the
> >> > encoding is UTF-8 might do the trick, but I'm not sure if it's always
> >> > reliable.
> >>
> >> There is not such a mechanism yet, but one can be added, at least for
> >> diagnostics.
> >>
> >> You are right that checking for UTF-8 encoding would not always be
> >> reliable. For example, the version of Windows may be too old to allow R
> >> use UTF-8 as native encoding (e.g. Windows server 2016), then R will use
> >> the native code page as it does today in the MSVCRT builds.
> >>
> >> > Also, I'd like to know if there's any mechanism to detect the UCRT in
> >> > configure.win. I know there are Makevars.ucrt and Makefile.ucrt, but
> >> > one might want to do some feature test that is specific to the UCRT
> >> > toolchain.
> >>
> >> We could add support for configure.ucrt, which would take precedence
> >> over configure.win on the UCRT builds (like Makevars.ucrt takes
> >> precedence over Makevars.win). Would that work for you?
> >>
> >> Best
> >> Tomas
> >>
> >> >
> >> > Best,
> >> > Hiroaki Yutani
> >> >
> >> > __
> >> > R-devel@r-project.org mailing list
> >> > https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] Detect UCRT-built R from within R sessions (and in configure.win)

2021-09-14 Thread Hiroaki Yutani
Thanks for both, I'll try these features.

2021年9月14日(火) 22:40 Tomas Kalibera :

>
>
> On 9/9/21 5:54 AM, Hiroaki Yutani wrote:
>
> Thank you for the prompt reply.
>
> > There is not such a mechanism yet, but one can be added, at least for
> > diagnostics.
>
> For example, can R.version somehow contain the information?
>
> Yes, now added to the experimental builds. R.version$crt contains "ucrt" (and 
> would contain "msvcrt" if R was built against MSVCRT).
>
>
> > We could add support for configure.ucrt, which would take precedence
> > over configure.win on the UCRT builds (like Makevars.ucrt takes
> > precedence over Makevars.win). Would that work for you?
>
> Yes, configure.ucrt should work for me. There might be someone who prefers to 
> switch by some envvar rather than creating another file, but I don't have a 
> strong opinion here.
>
> The experimental builds now support configure.ucrt and cleanup.ucrt files.
>
> Best
> Tomas
>
>
> Best,
> Hiroaki Yutani
>
> 2021年9月9日(木) 0:48 Tomas Kalibera :
>>
>>
>> On 9/8/21 2:08 PM, Hiroaki Yutani wrote:
>> > Hi,
>> >
>> > Are there any proper ways to know whether the session is running on
>> > the R that is built with the UCRT toolchain or not? Checking if the
>> > encoding is UTF-8 might do the trick, but I'm not sure if it's always
>> > reliable.
>>
>> There is not such a mechanism yet, but one can be added, at least for
>> diagnostics.
>>
>> You are right that checking for UTF-8 encoding would not always be
>> reliable. For example, the version of Windows may be too old to allow R
>> use UTF-8 as native encoding (e.g. Windows server 2016), then R will use
>> the native code page as it does today in the MSVCRT builds.
>>
>> > Also, I'd like to know if there's any mechanism to detect the UCRT in
>> > configure.win. I know there are Makevars.ucrt and Makefile.ucrt, but
>> > one might want to do some feature test that is specific to the UCRT
>> > toolchain.
>>
>> We could add support for configure.ucrt, which would take precedence
>> over configure.win on the UCRT builds (like Makevars.ucrt takes
>> precedence over Makevars.win). Would that work for you?
>>
>> Best
>> Tomas
>>
>> >
>> > Best,
>> > Hiroaki Yutani
>> >
>> > __
>> > R-devel@r-project.org mailing list
>> > https://stat.ethz.ch/mailman/listinfo/r-devel



Re: [Rd] Detect UCRT-built R from within R sessions (and in configure.win)

2021-09-08 Thread Hiroaki Yutani
Thank you for the prompt reply.

> There is not such a mechanism yet, but one can be added, at least for
> diagnostics.

For example, can R.version somehow contain the information?

> We could add support for configure.ucrt, which would take precedence
> over configure.win on the UCRT builds (like Makevars.ucrt takes
> precedence over Makevars.win). Would that work for you?

Yes, configure.ucrt should work for me. There might be someone who prefers
to switch by some envvar rather than creating another file, but I don't
have a strong opinion here.

Best,
Hiroaki Yutani

2021年9月9日(木) 0:48 Tomas Kalibera :

>
> On 9/8/21 2:08 PM, Hiroaki Yutani wrote:
> > Hi,
> >
> > Are there any proper ways to know whether the session is running on
> > the R that is built with the UCRT toolchain or not? Checking if the
> > encoding is UTF-8 might do the trick, but I'm not sure if it's always
> > reliable.
>
> There is not such a mechanism yet, but one can be added, at least for
> diagnostics.
>
> You are right that checking for UTF-8 encoding would not always be
> reliable. For example, the version of Windows may be too old to allow R
> to use UTF-8 as the native encoding (e.g. Windows Server 2016); in that
> case R will use the native code page as it does today in the MSVCRT builds.
>
> > Also, I'd like to know if there's any mechanism to detect the UCRT in
> > configure.win. I know there are Makevars.ucrt and Makefile.ucrt, but
> > one might want to do some feature test that is specific to the UCRT
> > toolchain.
>
> We could add support for configure.ucrt, which would take precedence
> over configure.win on the UCRT builds (like Makevars.ucrt takes
> precedence over Makevars.win). Would that work for you?
>
> Best
> Tomas
>
> >
> > Best,
> > Hiroaki Yutani
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
>


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] Detect UCRT-built R from within R sessions (and in configure.win)

2021-09-08 Thread Hiroaki Yutani
Hi,

Are there any proper ways to know whether the session is running on
the R that is built with the UCRT toolchain or not? Checking if the
encoding is UTF-8 might do the trick, but I'm not sure if it's always
reliable.
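
To be concrete, the fragile heuristic I have in mind is just checking the
native encoding, e.g.:

isTRUE(l10n_info()[["UTF-8"]])   # TRUE when the native encoding is UTF-8

which tests the locale rather than the CRT itself, so it is indirect at best.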

Also, I'd like to know if there's any mechanism to detect the UCRT in
configure.win. I know there are Makevars.ucrt and Makefile.ucrt, but
one might want to do some feature test that is specific to the UCRT
toolchain.

Best,
Hiroaki Yutani

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [R-pkg-devel] How can I avoid Solaris build on CRAN?

2021-08-04 Thread Hiroaki Yutani
Sorry, I pointed to the wrong package. I meant:

[2]: https://cran.r-project.org/package=gifski
[3]: https://cran.r-project.org/package=baseflow

Best,
Hiroaki Yutani

2021年8月5日(木) 8:17 Hiroaki Yutani :
>
> Hi,
>
> I recently submitted my package, which needs compilation of Rust code,
> to CRAN. Now I'm seeing a CRAN check error on Solaris.
>
> https://cran.r-project.org/web/checks/check_results_string2path.html
>
> Since Solaris is not a platform supported by Rust [1], I need to avoid
> building on Solaris. I assumed specifying
>
> SystemRequirements: Cargo (rustc package manager)
>
> would effectively achieve this, since it seems to be the only common thing
> among the other CRAN packages [2][3] that compile Rust code. But it turned
> out I was wrong.
>
> Are there any ways to ask CRAN to exclude Solaris?
>
> Best,
> Hiroaki Yutani
>
> [1]: https://doc.rust-lang.org/nightly/rustc/platform-support.html
> [2]: https://cran.r-project.org/package=cargo
> [3]: https://cran.r-project.org/package=baseflow

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


[R-pkg-devel] How can I avoid Solaris build on CRAN?

2021-08-04 Thread Hiroaki Yutani
Hi,

I recently submitted my package, which needs compilation of Rust code,
to CRAN. Now I'm seeing a CRAN check error on Solaris.

https://cran.r-project.org/web/checks/check_results_string2path.html

Since Solaris is not a platform supported by Rust [1], I need to avoid
building on Solaris. I assumed specifying

SystemRequirements: Cargo (rustc package manager)

would effectively achieve this, since it seems to be the only common thing
among the other CRAN packages [2][3] that compile Rust code. But it turned
out I was wrong.

Are there any ways to ask CRAN to exclude Solaris?

Best,
Hiroaki Yutani

[1]: https://doc.rust-lang.org/nightly/rustc/platform-support.html
[2]: https://cran.r-project.org/package=cargo
[3]: https://cran.r-project.org/package=baseflow

__
R-package-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-package-devel


Re: [Rd] issue with print()ing multibyte characters on R 4.0.4

2021-02-17 Thread Hiroaki Yutani
I filed an issue for this on R's Bugzilla as well, in case that makes
it easier to track. (This is my first time submitting a bug report, so
please excuse me if I'm not following the appropriate steps.)

https://bugs.r-project.org/bugzilla/show_bug.cgi?id=18059

2021年2月17日(水) 22:47 Hiroaki Yutani :
>
> Thanks for confirming and investigating.
>
> > but no one reported it in the run-up to 4.0.4.
>
> Yes, it was unfortunate that no one had reported it to the right place
> before the release...
>
> 2021年2月17日(水) 19:20 Prof Brian Ripley :
>
> >
> > On 17/02/2021 04:58, Hiroaki Yutani wrote:
> > > Hi all,
> > >
> > > I saw several people on Japanese locale claim that, on R 4.0.4,
> > > print() doesn't display
> > > Japanese characters correctly. This seems to happen only on Windows
> > > and on macOS (I
> > > usually use Linux and I don't see this problem).
> > >
> > > For example, in the result below, "鬼" and "外" are displayed in
> > > "\u" format. What's
> > > curious here is that "は" is displayed as it is, by the way.
> > >
> > >> "鬼は外"
> > > [1] "\u9b3cは\u5916"
> > >
> > > But, if I use such functions as message() or cat(), the string is
> > > displayed as it is.
> > >
> > >> message("鬼は外")
> > > 鬼は外
> >
> > that does not escape non-printable characters, so as expected.
> > >
> > > Considering the fact that it seems only Windows and macOS are
> > > affected, I suspect this
> > > is somehow related to this change described in the release note,
> > > (though I have no idea
> > > what change this is):
> > >
> > >  The internal table for iswprint (used on Windows, macOS and AIX) has 
> > > been
> > >  updated to include many recent Unicode characters.
> > >  (https://cran.r-project.org/doc/manuals/r-release/NEWS.html)
> > >
> > > Before I'm going to file this issue on Bugzilla, I'd like to confirm
> > > if this is not the intended
> > > change, and, if this is actually intended, I want to discuss how to
> > > improve this behaviour.
> >
> > I am sorry: this was not intended, but no one reported it in the run-up
> > to 4.0.4.  It seems to be working in R-devel, so I suggest you check
> > that or go back to 4.0.3.
> >
> > It looks like a line in the iswprint table got deleted in the merge from
> > R-devel.  I will try to set up some automated checks to see if I can
> > find any other problems, but that will take a few days.
> >
> > --
> > Brian D. Ripley,  rip...@stats.ox.ac.uk
> > Emeritus Professor of Applied Statistics, University of Oxford

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] issue with print()ing multibyte characters on R 4.0.4

2021-02-17 Thread Hiroaki Yutani
Thanks for confirming and investigating.

> but no one reported it in the run-up to 4.0.4.

Yes, it was unfortunate that no one had reported it to the right place
before the release...

2021年2月17日(水) 19:20 Prof Brian Ripley :

>
> On 17/02/2021 04:58, Hiroaki Yutani wrote:
> > Hi all,
> >
> > I saw several people on Japanese locale claim that, on R 4.0.4,
> > print() doesn't display
> > Japanese characters correctly. This seems to happen only on Windows
> > and on macOS (I
> > usually use Linux and I don't see this problem).
> >
> > For example, in the result below, "鬼" and "外" are displayed in
> > "\u" format. What's
> > curious here is that "は" is displayed as it is, by the way.
> >
> >> "鬼は外"
> > [1] "\u9b3cは\u5916"
> >
> > But, if I use such functions as message() or cat(), the string is
> > displayed as it is.
> >
> >> message("鬼は外")
> > 鬼は外
>
> that does not escape non-printable characters, so as expected.
> >
> > Considering the fact that it seems only Windows and macOS are
> > affected, I suspect this
> > is somehow related to this change described in the release note,
> > (though I have no idea
> > what change this is):
> >
> >  The internal table for iswprint (used on Windows, macOS and AIX) has 
> > been
> >  updated to include many recent Unicode characters.
> >  (https://cran.r-project.org/doc/manuals/r-release/NEWS.html)
> >
> > Before I'm going to file this issue on Bugzilla, I'd like to confirm
> > if this is not the intended
> > change, and, if this is actually intended, I want to discuss how to
> > improve this behaviour.
>
> I am sorry: this was not intended, but no one reported it in the run-up
> to 4.0.4.  It seems to be working in R-devel, so I suggest you check
> that or go back to 4.0.3.
>
> It looks like a line in the iswprint table got deleted in the merge from
> R-devel.  I will try to set up some automated checks to see if I can
> find any other problems, but that will take a few days.
>
> --
> Brian D. Ripley,  rip...@stats.ox.ac.uk
> Emeritus Professor of Applied Statistics, University of Oxford

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


[Rd] issue with print()ing multibyte characters on R 4.0.4

2021-02-16 Thread Hiroaki Yutani
Hi all,

I saw several people using a Japanese locale claim that, on R 4.0.4,
print() doesn't display Japanese characters correctly. This seems to
happen only on Windows and macOS (I usually use Linux and don't see
this problem).

For example, in the result below, "鬼" and "外" are displayed in
"\u" escape format. What's curious is that "は" is displayed as it is.

> "鬼は外"
[1] "\u9b3cは\u5916"

But, if I use such functions as message() or cat(), the string is
displayed as it is.

> message("鬼は外")
鬼は外
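
Putting the two together in one snippet, for anyone who wants to reproduce
this on an affected build:

x <- "\u9b3c\u306f\u5916"    # "鬼は外"
print(x)       # on an affected build, 鬼 and 外 come back as \u escapes
cat(x, "\n")   # prints the characters as-is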

Considering that only Windows and macOS seem to be affected, I suspect
this is somehow related to the following change described in the release
notes (though I have no idea what change this actually is):

The internal table for iswprint (used on Windows, macOS and AIX) has been
updated to include many recent Unicode characters.
(https://cran.r-project.org/doc/manuals/r-release/NEWS.html)

Before filing this issue on Bugzilla, I'd like to confirm whether this
change is intended, and, if it is, to discuss how to improve this
behaviour.

Best,
Hiroaki Yutani

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] New pipe operator

2020-12-05 Thread Hiroaki Yutani
It is common practice to call |> a pipe (or pipeline operator) in many
languages, including ones that introduced it only recently as an
experimental feature. A pipeline is a common feature of functional
programming, not just of "data pipelines."

F#: 
https://docs.microsoft.com/en-us/dotnet/fsharp/language-reference/symbol-and-operator-reference/
Elixir: https://hexdocs.pm/elixir/operators.html#general-operators
JavaScript (proposal):
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Pipeline_operator
Ruby: https://bugs.ruby-lang.org/issues/15799

(This blog post about the history of pipe operator might be
interesting: 
https://mamememo.blogspot.com/2019/06/a-brief-history-of-pipeline-operator.html
)

I agree this is a bit confusing for those who are familiar with other
"pipe" concepts, but there's no more appropriate term for |>.

2020年12月6日(日) 12:22 Gregory Warnes :
>
> If we’re being mathematically pedantic, the “pipe” operator is actually
> function composition.
>
> That being said, pipes are a simple and well-known idiom. While being less
> than mathematically exact, it seems a reasonable label for the (very
> useful) behavior.
>
> On Sat, Dec 5, 2020 at 9:43 PM Abby Spurdle  wrote:
>
> > > This is a good addition
> >
> > I can't understand why so many people are calling this a "pipe".
> > Pipes connect processes, via their I/O streams.
> > Arguably, a more general interpretation would include sockets and files.
> >
> > https://en.wikipedia.org/wiki/Pipeline_(Unix)
> > https://en.wikipedia.org/wiki/Named_pipe
> > https://en.wikipedia.org/wiki/Anonymous_pipe
> >
> > As far as I can tell, the magrittr-like operators are functions (not
> > pipes), with nonstandard syntax.
> > This is not consistent with R's original design philosophy, building
> > on C, Lisp and S, along with lots of *important* math and stats.
> >
> > It's possible that some parties are interested in creating a kind of
> > "data pipeline".
> > I'm interested in this myself, and I think we could discuss this more.
> > But I'm not convinced the magrittr-like operators help to achieve this
> > goal.
> > Which, in my opinion, would require one to model programs as directed
> > graphs, along with some degree of asynchronous input.
> >
> > Presumably, these operators will be added to R anyway, and (almost) no
> > one will listen to me.
> >
> > So, I would like to make one suggestion:
> > Is it possible for these operators to *not* be named:
> > The R Pipe
> > The S Pipe
> > Or anything with a similar meaning.
> >
> > Maybe tidy pipe, or something else that links it to its proponents?
> >
> > __
> > R-devel@r-project.org mailing list
> > https://stat.ethz.ch/mailman/listinfo/r-devel
> >
> --
> "Whereas true religion and good morals are the only solid foundations of
> public liberty and happiness . . . it is hereby earnestly recommended to
> the several States to take the most effectual measures for the
> encouragement thereof." Continental Congress, 1778
>
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] New pipe operator

2020-12-04 Thread Hiroaki Yutani
>  Error: function '::' not supported in RHS call of a pipe

To me, this error looks much friendlier than magrittr's error.
Some users have got too used to specifying functions without (). This
is fine until they need `::`, but when they do, it can take
hours to figure out why

mtcars %>% base::head
#> Error in .::base : unused argument (head)

won't work but

mtcars %>% head

works. I think this is too harsh a lesson for ordinary R users just to
learn that `::` is a function. I've been wanting magrittr to drop
support for bare function names without () to avoid this confusion,
so I very much welcome the new pipe operator's behavior.
Thank you all the developers who implemented this!
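
For anyone puzzled by that magrittr error: a bare RHS like `base::head` is
itself the call `::`(base, head), so magrittr inserts the left-hand side as
the first argument of `::`, roughly

`::`(., base, head)   # with . bound to mtcars; hence "Error in .::base : unused argument (head)"

which is why writing base::head() instead avoids the problem.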

Best,
Hiroaki Yutani

2020年12月4日(金) 20:51 Duncan Murdoch :
>
> Just saw this on the R-devel news:
>
>
> R now provides a simple native pipe syntax ‘|>’ as well as a shorthand
> notation for creating functions, e.g. ‘\(x) x + 1’ is parsed as
> ‘function(x) x + 1’. The pipe implementation as a syntax transformation
> was motivated by suggestions from Jim Hester and Lionel Henry. These
> features are experimental and may change prior to release.
>
>
> This is a good addition; by using "|>" instead of "%>%" there should be
> a chance to get operator precedence right.  That said, the ?Syntax help
> topic hasn't been updated, so I'm not sure where it fits in.
>
> There are some choices that take a little getting used to:
>
>  > mtcars |> head
> Error: The pipe operator requires a function call or an anonymous
> function expression as RHS
>
> (I need to say mtcars |> head() instead.)  This sometimes leads to error
> messages that are somewhat confusing:
>
>  > mtcars |> magrittr::debug_pipe |> head
> Error: function '::' not supported in RHS call of a pipe
>
> but
>
> mtcars |> magrittr::debug_pipe() |> head()
>
> works.
>
> Overall, I think this is a great addition, though it's going to be
> disruptive for a while.
>
> Duncan Murdoch
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel

__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] vector arithmetic

2018-08-14 Thread Hiroaki Yutani
I've been wondering about this, too! Following the code in arithmetic.c, I
finally reached the MOD_ITERATE2_CORE macro in src/include/R_ext/Itermacros.h.
Is this the place?
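
If it helps others searching: the MOD_ITERATE2 family appears to do the
modular indexing behind recycling that arithmetic.c relies on. In R terms
the effect is roughly:

# recycling sketched in R: each operand is indexed modulo its own length
recycle_add <- function(x, y) {
  n <- max(length(x), length(y))
  x[(seq_len(n) - 1) %% length(x) + 1] + y[(seq_len(n) - 1) %% length(y) + 1]
}
recycle_add(1:6, 1:2)   # same result as 1:6 + 1:2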

Best

2018年8月14日(火) 2:59 isomorphismes :

> I'm looking for where in the source recycling and vector
> multiplication+addition are defined. I see some stuff in
> ~/src/main/arithmetic.c.
>
> Is there anywhere else I should be looking as well?
>
> Cheers
>
> --
> isomorphi...@sdf.org
> SDF Public Access UNIX System - http://sdf.org
>
> __
> R-devel@r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-devel
>


__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel