Re: [Rd] Lazy-evaluate elements wrapped with invisible

2022-10-28 Thread Dipterix Wang


> This is not quite true. The value, even when invisible, is captured by 
> .Last.value, and 
> 
> > f <- function() invisible(5)
> > f()
> > .Last.value
> [1] 5


I understand .Last.value will capture the function's return value, but that only 
happens at the top level... I guess?

In the following code, I think .Last.value does not capture the results of f, 
h, k, or l:

g <- function() {
  f(); h(); k(); l()
  return()
}
g()


Maybe I caused confusion by mentioning the `invisible` function. I guess it should 
be a new function (let’s call it `delayed`). The function does not have to be 
limited to “printing”. For example, computing a digest key:


a <- function(key, value) {
  map$set(key, value)

  return(delayed({
    digest(value)
  }))
}

Or an asynchronous evaluation, where the saved result might not be needed if the 
return value is not assigned (detached), or where the result will be “joined” 
back into the main process:

a <- function(path) {
  # async 
  f <- future::future({
    # calculate, and then write to path
    saveRDS(…, path)
  })

  return(delayed({
    resolve(f) # wait until f finishes

    readRDS(path)
  }))
}

Although I could use wrappers such as formulas, quosures, or environments to 
achieve similar results, there are two major differences:

1. There is an extra call needed to get the lazily evaluated result (if I do 
want to resolve it).
2. The returned object has to contain some sort of “environment” component. 
It can’t just be a simple object like a vector, matrix, list, … (also, you 
can't immediately garbage collect the enclosing environment).

From the implementation perspective, the `delayed` object is ready to be 
garbage collected if not assigned immediately.
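For what it's worth, the digest example can be approximated today with `delayedAssign()` returning an environment. This is a user-level sketch only, not the proposed `delayed` primitive; `delayed_digest` is a hypothetical name and `sum()` stands in for a real `digest()` call:

```r
# Sketch: approximate `delayed` by returning an environment holding a promise.
# The expensive expression runs only when $result is first accessed; if the
# return value is discarded, the whole environment is eligible for GC.
delayed_digest <- function(value) {
  e <- new.env(parent = emptyenv())
  delayedAssign("result", {
    message("computing")  # runs only on first access
    sum(value)            # stand-in for, e.g., digest::digest(value)
  }, eval.env = environment(), assign.env = e)
  e
}

d <- delayed_digest(1:10)  # nothing computed yet
d$result                   # prints "computing", then returns 55 (cached afterwards)
```

The cost is exactly the second difference noted above: the caller gets an environment rather than a plain value.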

Best,
- D




__
R-devel@r-project.org mailing list
https://stat.ethz.ch/mailman/listinfo/r-devel


Re: [Rd] tools:: extracting pkg dependencies from DCF

2022-10-28 Thread Gabriel Becker
Hi Jan,


On Fri, Oct 28, 2022 at 1:57 PM Jan Gorecki  wrote:

> Gabriel,
>
> It is the most basic CI use case. One wants to install only the
> dependencies of the package, and run R CMD check on the package.


Really what you're looking for, though, is to install all the dependencies
which aren't already present, right? Excluding base packages is just a
particular way to do that under certain assumptions about the CI environment.

So


needed_pkgs <- setdiff(package_dependencies(...),
                       installed.packages()[, "Package"])
install.packages(needed_pkgs, repos = fancyrepos)


will do what you want without installing the package itself, if that is
important. This will filter out base and recommended packages (which will
already be installed in your CI container, since R is).


Now this does not take versioned dependencies into account, so it's not
actually fully correct (whereas installing the package is), but it gets you
where you're trying to go. And in a clean CI container without cached
package installations for the deps, it's equivalent.


Also, as an aside, if you need to get the base packages, you can do

installed.packages(priority = "base")[, "Package"]

       base    compiler    datasets    graphics   grDevices        grid 
     "base"  "compiler"  "datasets"  "graphics" "grDevices"      "grid" 
    methods    parallel     splines       stats      stats4       tcltk 
  "methods"  "parallel"   "splines"     "stats"    "stats4"     "tcltk" 
      tools       utils 
    "tools"     "utils" 

(to get base and recommended packages use 'high' instead of 'base')

No need to be reaching down into unexported functions. So if you *really*
only want to exclude base packages (which likely will give you some
protection from versioned dep issues), you can change the code above to

needed_pkgs <- setdiff(package_dependencies(...),
                       installed.packages(priority = "high")[, "Package"])
install.packages(needed_pkgs, repos = fancyrepos)
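Putting the pieces in this thread together, here is an end-to-end sketch of the flow under discussion. The DESCRIPTION content below is a toy stand-in for a real package's file (in CI you would `file.copy()` the real DESCRIPTION), and the final install call is left commented out:

```r
# Expose a package's DESCRIPTION as a one-package repository index, compute
# its dependencies, and keep only those not already installed.
tdir <- tempdir()
writeLines(c("Package: pkgname",        # toy stand-in for a real DESCRIPTION
             "Version: 1.0",
             "Imports: stats, utils"),
           file.path(tdir, "PACKAGES"))
db     <- available.packages(paste0("file://", tdir))
deps   <- tools::package_dependencies("pkgname", db, which = "most")[[1L]]
needed <- setdiff(deps, installed.packages()[, "Package"])
# install.packages(needed, repos = "https://cloud.r-project.org")
needed  # empty here, since stats and utils ship with R
```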

Best,
~G



Re: [Rd] Lazy-evaluate elements wrapped with invisible

2022-10-28 Thread Bill Dunlap
You can play with the idea by returning an environment that contains
delayed assignments.  E.g.,

> f <- function(x) {
+   delayedAssign("eval_date", { cat("Evaluating 'date'\n"); date() })
+   delayedAssign("sum_x", { cat("Evaluating 'sum_x'\n"); sum(x) })
+   environment()
+ }
> fx <- f(1:10)
> date()
[1] "Fri Oct 28 14:22:12 2022"
> Sys.sleep(2)
> fx$eval_date
Evaluating 'date'
[1] "Fri Oct 28 14:22:24 2022"
> Sys.sleep(2)
> fx$eval_date
[1] "Fri Oct 28 14:22:24 2022"
> fx$sum_x
Evaluating 'sum_x'
[1] 55
> fx$sum_x
[1] 55

-Bill




Re: [Rd] Lazy-evaluate elements wrapped with invisible

2022-10-28 Thread Gabriel Becker
Hi Dipterix,


On Fri, Oct 28, 2022 at 1:10 PM Dipterix Wang 
wrote:

> Hi,
>
> I was wondering if it is a good idea to delay the evaluation of expression
> within invisible(), just like data()/delayedAssign()?
>
> The idea is a function might return an invisible object. This object might
> not be used by the users if the function returns are not assigned nor
> passed to another function call. For example,
>
> f <- function() {
>   # do something eagerly
>
>   return(invisible({
> # calculate message that might take long/extra memory, but only useful
> if printed out
>   }))
> }
>
> If `f()` is not immediately assigned to a variable, then there is no
> reason to evaluate invisible(…).
>

This is not quite true. The value, even when invisible, is captured by
.Last.value, and

> f <- function() invisible(5)

> f()

> .Last.value

[1] 5


Now that doesn't actually preclude what you're suggesting (just have to
wait for .Last.value to be populated by something else), but it does
complicate it to the extent that I'm not sure the benefit we'd get would be
worth it.

Also, in the case you're describing, you'd be pushing the computational
cost into printing, which, imo, is not where it should live. Printing a
value, generally speaking, should just print things, imo.

That said, if you really wanted to do this, you could approach the behavior
you want, I believe (but again, I think this is a bad idea) by returning a
custom class that wraps a formula (or, I imagine, tidyverse-style quosures)
that reaches back into the call frame you return it from, evaluating it
only on demand.
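That approach can be sketched in a few lines, using substitute()/eval() rather than a formula; the names `lazy_value` and `value()` are hypothetical, not an existing API:

```r
# A value that captures its expression and calling frame, and is only
# evaluated when explicitly requested.
lazy_value <- function(expr) {
  structure(list(expr = substitute(expr), env = parent.frame()),
            class = "lazy_value")
}
value <- function(x) eval(x$expr, x$env)

f <- function() {
  n <- 10
  invisible(lazy_value(sum(seq_len(n))))  # nothing computed yet
}
value(f())  # computed on demand: 55
```

One caveat: the wrapper keeps the enclosing frame alive until the wrapper itself is collected.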

Best,
~G


> This idea is somewhere between `delayedAssign` and eager evaluation. Maybe
> we could call it delayedInvisible()?
>
> Best,
> - Zhengjia
>


Re: [Rd] tools:: extracting pkg dependencies from DCF

2022-10-28 Thread Jan Gorecki
Gabriel,

It is the most basic CI use case. One wants to install only the
dependencies of the package, and run R CMD check on the package.

Unless you say that installing the package and then running R CMD
check on that package is considered good practice. Then yes, the
functionality I am asking about is not needed. Somehow I never thought
that this could be considered good practice, given that installing
the package could already impact the environment in which the
check takes place.

Best,
Jan


[Rd] Lazy-evaluate elements wrapped with invisible

2022-10-28 Thread Dipterix Wang
Hi,

I was wondering if it is a good idea to delay the evaluation of the expression 
within invisible(), just like data()/delayedAssign()?

The idea is that a function might return an invisible object. This object might 
not be used by the user if the function's return value is neither assigned nor 
passed to another function call. For example,

f <- function() {
  # do something eagerly

  return(invisible({
    # calculate a message that might take long/extra memory, but is only
    # useful if printed out
  }))
}

If `f()` is not immediately assigned to a variable, then there is no reason to 
evaluate invisible(…).
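To make the motivation concrete: invisible() today is eager, so the expression inside it is evaluated whether or not the result is ever used. A minimal sketch (the message() call is just a visible marker for the evaluation):

```r
f <- function() {
  invisible({
    message("expensive part evaluated")  # side effect marks evaluation
    5
  })
}
f()  # the message appears even though the value is neither assigned nor printed
```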

This idea is somewhere between `delayedAssign` and eager evaluation. Maybe we 
could call it delayedInvisible()?

Best,
- Zhengjia



[Rd] pmin() and pmax() should process a single list of vectors, rather than returning it

2022-10-28 Thread Sebastian Martin Krantz
Dear R Core,

The {kit} package has a nice set of parallel statistical functions
complementing base R's pmin() and pmax(): psum(), pprod(), pmean(), etc.
These can be called on a set of vectors, like pmin() and pmax(), e.g.
with(mtcars, psum(mpg, carb, wt)), or on a single list of vectors, e.g.
psum(mtcars). In contrast, pmin() and pmax() only allow the former. Calling
pmax(mtcars) oddly returns mtcars as is, without giving any error or
warning. I think this behavior should be changed to come in line with the
kit versions.

kit::psum is defined as:

psum <- function(..., na.rm = FALSE)
  .Call(CpsumR, na.rm,
        if (...length() == 1L && is.list(..1)) ..1 else list(...))

The first line of pmin() and pmax() is elts <- list(...). I propose
changing that first line to:

elts <- if (...length() == 1L && is.list(..1)) unclass(..1) else list(...)

This would provide convenient functionality (do.call(pmax, mtcars) is
inconvenient) and guard against the (odd) behavior of simply returning a
list passed to these functions.
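The proposed behaviour can be prototyped as a user-level wrapper; `pmax2` is a hypothetical name used only for illustration, since the proposal is to change pmax()/pmin() themselves:

```r
pmax2 <- function(..., na.rm = FALSE) {
  # single list argument: treat its elements as the vectors to compare
  elts <- if (...length() == 1L && is.list(..1)) unclass(..1) else list(...)
  do.call(pmax, c(elts, list(na.rm = na.rm)))
}

pmax2(list(1:3, 3:1))  # same as pmax(1:3, 3:1): 3 2 3
pmax2(1:3, 3:1)        # existing calling convention still works
```

Because data frames are lists, this also makes pmax2(mtcars) work directly.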

Best regards,

Sebastian Krantz



Re: [Rd] tools:: extracting pkg dependencies from DCF

2022-10-28 Thread Gabriel Becker
Hi Jan,

The reason, I suspect (without speaking for R-core), is that by design you
should not be specifying package dependencies as additional packages to
install. install.packages already does this for you, as it did in the
repository-construction code that I provided previously in the thread.
You should be *only* doing

install.packages(, repos = *)

Then everything happens automatically via extremely well-tested, very mature
code.

I (still) don't understand why you'd need to pass install.packages the
vector of dependencies yourself, as that is counter to install.packages'
core design.

Does that make sense?

Best,
~G


Re: [Rd] tools:: extracting pkg dependencies from DCF

2022-10-28 Thread Jan Gorecki
Gabriel,

I am trying to design a generic solution that could be applied to an
arbitrary package. Therefore I went with the latter solution you
proposed. If we didn't have to exclude base packages, it would be a
3-liner:

file.copy("DESCRIPTION", file.path(tdir <- tempdir(), "PACKAGES"))
db <- available.packages(paste0("file://", tdir))
utils::install.packages(tools::package_dependencies("pkgname", db,
                                                    which = "most")[[1L]])

As you noticed, we still have to filter out base packages; otherwise
it won't be a robust utility that can be used in CI. Therefore we have
to add a call to tools:::.get_standard_package_names(), which is an
internal function (as of now). That not only complicates the call but also
puts the functionality outside of safe use.

Considering the above, don't you agree that the following one-liner could
nicely address the problem? The problem that hundreds or thousands of
packages are now addressing in their CI scripts by using third-party
packages.

utils::install.packages(packages.dcf("DESCRIPTION", which="most"))

It is hard for me to understand why R Core members don't consider this basic
functionality to be part of base R. Possibly they just don't need it
themselves. Yet isn't it sufficient that hundreds or thousands of
packages do need this functionality?

Best regards,
Jan

On Mon, Oct 17, 2022 at 8:39 AM Jan Gorecki  wrote:
>
> Gabriel and Simon
>
> I completely agree with what you are saying.
> The thing is that obtaining recursive deps, all/most whatever, is already 
> well supported in core R. What is missing is just this single functionality I 
> am requesting.
>
> If you look into the branch you can see there is a mirror.packages 
> function meant to mirror a slice of CRAN. It does exactly what you 
> described: package_dependencies to obtain recursive deps, then download all, 
> etc.
> I would love to have this function provided by core R as well, but we need to 
> start somewhere.
>
> There are other use cases as well.
> For example CI, where one wants to install all/most dependencies and then run 
> R CMD check. Then we don't worry about recursive deps, as they will be 
> resolved automatically.
> I don't think it's reasonable to force users to use 3rd-party packages to 
> handle such a common and simple use case. Otherwise one has to hard-code deps 
> in the CI script. Not robust at all.
>
> packages.dcf and repos.dcf make all that much easier, and are a solid base for 
> building customized orchestration like mirroring a slice of CRAN.
>
> Best regards
> Jan
>
> On Sun, Oct 16, 2022, 01:31 Simon Urbanek  wrote:
>>
>> Jan,
>>
>> I think using a single DCF as input is not very practical and would not be 
>> useful in the context you describe (creating self contained repos) since 
>> they typically concern a list of packages, but essentially splitting out the 
>> part of install.packages() which determines which files will be pulled from 
>> where would be very useful as it would be trivial to use it to create 
>> repository (what we always do in corporate environments) instead of 
>> installing the packages. I suspect that install.packages is already too 
>> complex, so instead of adding a flag to install.packages one could move that 
>> functionality into a separate function - we all do that constantly for the 
>> sites we manage, so it would be certainly something worthwhile.
>>
>> Cheers,
>> Simon
>>
>>
>> > On Oct 15, 2022, at 7:14 PM, Jan Gorecki  wrote:
>> >
>> > Hi Gabriel,
>> >
>> > It's very nice usage you provided here. Maybe instead of adding a new
>> > function we could extend package_dependencies then? To accept a file path to
>> > a dsc file.
>> >
>> > What about repos.dcf? Maybe additional repositories could be an attribute
>> > attached to the returned character vector.
>> >
>> > The use case is, for given package sources, to obtain their dependencies,
>> > so one can use that for installing them, mirroring a CRAN subset, or whatever.
>> > The latter is especially important for a production environment where one
>> > wants to have fixed versions of packages, and mirroring the relevant subset of
>> > CRAN is the simplest, and IMO most reliable, way to manage such an environment.
>> >
>> > Regards
>> > Jan
>> >
>> > On Fri, Oct 14, 2022, 23:34 Gabriel Becker  wrote:
>> >
>> >> Hi Jan and Jan,
>> >>
>> >> Can you explain a little more what exactly you want the non-recursive,
>> >> non-version aware dependencies from an individual package for?
>> >>
>> >> Either way package_dependencies will do this for you* with a little
>> >> "aggressive convincing". It wants output from available.packages, but who
>> >> really cares what it wants? It's a function and we are people :)
>> >>
>> >>> library(tools)
>> >>> db <- read.dcf("~/gabe/checkedout/rtables_clean/DESCRIPTION")
>> >>> package_dependencies("rtables", db, which = intersect(c("Depends",
>> >> "Suggests", "Imports", "LinkingTo"), colnames(db)))
>> >> $rtables
>> >> [1] "methods""magrittr"   "formatters" "dplyr"  "tibble"
>> >> [6] "tidyr"  "testt