Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-12-04 Thread Jonathan Dieter
On Thu, 2022-12-01 at 00:41 +, Daniel Alley wrote:
> 
> * zchunk and deltarpm both reimplement / "bundle" multiple different
> hashing algorithms

zchunk does have bundled versions of various hashing algorithms, but,
if it's compiled against OpenSSL (as it is in Fedora), it uses the
OpenSSL hashing algorithms rather than the bundled versions.

Jonathan


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-12-01 Thread Simo Sorce
On Thu, 2022-12-01 at 00:41 +, Daniel Alley wrote:
> I'm not quite sure how one would go about empirically measuring
> something like that - at least in the general case.  It might be an
> interesting research topic. So no, unfortunately I don't really have
> hard evidence for this.

We did run discovery for this in RHEL when we started the SHA-1
deprecation process.
Some embedded implementations were found but the vast majority of
programs used one of the available libraries for SHA-1. I do not expect
SHA-256 to be any different.

So empirically I can tell you there isn't anywhere near as much "vendoring"
in C as you claim. Using dynamically linked libraries is well established,
and that is why the dependency chain is sometimes ... monstrous.

> I just know that of all the C libraries I've looked at, in my
> personal experience it seems to be a very common phenomenon to copy
> or reimplement code that in Rust you would just import and re-use. 

Perhaps this is true in the niche you are interested in?

> It's just a pattern that one notices frequently when it comes to C
> libraries, especially cross-platform ones that can't rely exclusively
> on the existence of a Linux-like package manager.

Yes, those kinds of libraries tend to be quite bad in this regard; OTOH,
it can be done right.
For example, the NSS library generally carries copies of external
dependencies, but the configure script looks for a system version and
links to that one if it is available at build time.
So if you looked at NSS you might think it vendors, and it does, but
smartly, and in a way that is compatible with system integration.

> If you want specific examples, the ones that pop to mind are:
> 
> * zchunk and deltarpm both reimplement / "bundle" multiple different
> hashing algorithms
> * libcomps implements about 4 different relatively common data
> structures

I am not sure this qualifies as vendoring/bundling; it is kind of
borderline, but I see why you added it as a case.

> * GTK appears to contain a bundled, forked copy of the CRoaring
> library

-- 
Simo Sorce
RHEL Crypto Team
Red Hat, Inc




Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Daniel Alley
I'm not quite sure how one would go about empirically measuring something like 
that - at least in the general case.  It might be an interesting research 
topic. So no, unfortunately I don't really have hard evidence for this.

I just know that of all the C libraries I've looked at, in my personal 
experience it seems to be a very common phenomenon to copy or reimplement code 
that in Rust you would just import and re-use. 
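
As a concrete illustration of that "import and re-use" pattern, here is a
minimal sketch assuming the widely used sha2 crate from crates.io (the crate
choice and version are illustrative, not taken from any particular package):

    // Assumes Cargo.toml declares something like: sha2 = "0.10" (illustrative).
    use sha2::{Digest, Sha256};

    fn main() {
        // Stream data into the hasher; any number of update() calls is fine.
        let mut hasher = Sha256::new();
        hasher.update(b"hello world");
        let digest = hasher.finalize();

        // Render the 32-byte digest as lowercase hex.
        let hex: String = digest.iter().map(|b| format!("{:02x}", b)).collect();
        println!("{hex}");
    }

A C project needing the same thing will often just drop a sha256.c into its
source tree, where the distribution can't see or update it as a dependency.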

It's just a pattern that one notices frequently when it comes to C libraries, 
especially cross-platform ones that can't rely exclusively on the existence of a 
Linux-like package manager.

If you want specific examples, the ones that pop to mind are:

* zchunk and deltarpm both reimplement / "bundle" multiple different hashing 
algorithms
* libcomps implements about 4 different relatively common data structures
* GTK appears to contain a bundled, forked copy of the CRoaring library






Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Chris Adams
Once upon a time, Daniel Alley  said:
> 100 C packages with 100 separate copies of sha256.c sitting in their source 
> trees (which seems like an entirely realistic comparison)

You keep saying this - do you have any evidence that this is the case?
-- 
Chris Adams 


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Daniel Alley
> I think almost all of these qualify as "Core system libraries that
> pretty much everything depends on.".
> Building their C dependencies from vendored copies (if that is even
> supported) and statically linking them seems like a pretty bad idea in
> almost all cases here, especially for things where the version of a
> program on the "host" and the accompanying shared library should
> match.

Yes, we're in complete agreement.  I'm not suggesting anything like that.  
Vendoring libraries like openssl is a bad idea.

What I'm saying is that it's not logically justified to conclude that, just 
because core system libraries like openssl shouldn't be vendored, all vendoring 
must be disallowed regardless of how small and focused the libraries are or 
how few dependents they have.

Because most C libraries have "dark dependencies" that are effectively the 
same thing, but worse in some ways.  Given the choice between 100 Rust packages 
vendoring 10 different copies of the sha256 crate and 100 C packages with 100 
separate copies of sha256.c sitting in their source trees (which seems like an 
entirely realistic comparison), why would the latter be completely A-OK while 
the former is completely disallowed?

> But ... none of these "tiny" Rust crates are dynamically linked in
> Fedora anyway - because Rust doesn't really support that?
> So I fail to see your point there, unless you meant to say "C projects
> don't 'bundle', they just often 'copy' some code into their
> projects"?
> 
> Fabio

Yes, that's essentially what I'm saying.  I feel like the "no bundling" policy 
draws distinctions that don't entirely make sense, especially when it comes to 
the small, focused leaf-node dependencies that people often complain about.

Clearly the "left-pad" scenario is bad and should be avoided, but on the other 
hand is having 800 different linked list implementations, 500 different hash 
table implementations, 25 different half-baked XML parsers, really so much 
"better", or is it just what we're used to?


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Simo Sorce
On Wed, 2022-11-30 at 18:26 +, Simon Farnsworth via devel wrote:
> On Wednesday, 30 November 2022 17:47:16 GMT Fabio Valentini wrote:
> > On Wed, Nov 30, 2022 at 6:21 PM Daniel Alley  wrote:
> > > I feel like there is insufficient recognition of the extent to which C
> > > libraries do "bundling".  Not "bundling" in the sense of vendoring a
> > > whole library, but in the sense of including one-off implementations of
> > > basic data structures, configuration parsers, hashing algorithms, etc.  I
> > > would love to hear anyone argue that 100 different variations of
> > > "sha256.c" across 100 different packages better follows the spirit of the
> > > "no bundling" guidelines than a vendored crate named "sha256" with 100x
> > > as many eyes on it, and a higher likelihood to actually be updated if a
> > > problem is found.
> > 
> > > 
> > > 
> > > Many of the tiny, "sprawling" Rust dependencies are like this - not all of
> > > them of course, but many.
> > 
> > But ... none of these "tiny" Rust crates are dynamically linked in
> > Fedora anyway - because Rust doesn't really support that?
> > So I fail to see your point there, unless you meant to say "C projects
> > don't 'bundle', they just often 'copy' some code into their projects"?
> > 
> I think the point he's making is that developers don't write common 
> functionality from scratch, in general. We reuse code from elsewhere.
> 
> It's just that in C, I'll copy-and-paste code from the web into my library or 
> application, not necessarily even bothering with a full "vendoring", whereas 
> in Rust, I'll use the crc crate (say), or the base64 crate, or other simple 
> utility crate.
> 
> The result is that I have N implementations of common functionality, each 
> with 
> its own unique quirks and security risks, in my C binaries; but my C binaries 
> have only a small number of dependencies.
> 
> In Rust, however, I'm directly reusing the small utility crate, and while I 
> may use `cargo vendor` to import the crate's source into my tree, I'm 
> unlikely 
> to edit it. The result is that a Rust binary has a larger number of 
> dependencies than the equivalent C program, because I'm depending on a crate 
> instead of copy-and-pasting the code and "hiding" it from the packager.
> 
> This is a challenge for Fedora: how do we cope with a world where instead of 
> having a few tens of dependencies, and a lot of copy-and-paste code, we have 
> hundreds of dependencies, but no copy-and-paste code?
> 
> One answer is to say that Rust is bad for encouraging developers to depend on 
> small crates instead of copying-and-pasting  "small" utilities around. 
> Another 
> (which you're doing a great job of) is to package up all the dependencies,  
> so 
> that we represent the true dependency tree in RPM. Yet another would be to 
> manually decide which dependencies get bundled, and which don't - doing the 
> same thing as the C world does to keep its dependency count down.


This is a bit of a misleading characterization.

- the amount of copying in C programs is overblown, in my opinion (though I
have no data to back this up)
- dependency minimization for C programs/libraries is assumed, but no data is
provided to back it up
- the dilemma is how to manage Rust programs, not deciding what to bundle or
not; that is basically decided upstream

Although there are certainly instances of copy/paste in C code, the
vast majority of reuse comes through linking to utility libraries
dynamically, which makes distro-level maintenance much easier because
you have only one place to fix/rebuild/distribute when there are
serious issues.

The problem with Rust crates (or Go modules, or any other "vendoring")
is that not only do you have to go and find each place where a
problematic crate was vendored; you also have to figure out, often
under pressure, whether the crate can simply be updated or whether you
need a backport because the vendoring application can't cope with the
semantic changes that happened in the problematic crate's new version.

Multiply this by N packages using M different versions of the
problematic crate.

Although vendored crates can be tracked with additional tooling (which is
much better than copy/pasting), the distribution remains on the hook for
solving the same problem in N packages, without easy coordination. Some
upstreams may be quick and do the work for you; some may not care, may
disappear, etc., or may simply be too slow for the urgency the distribution
has, potentially leading to diverging solutions ...

Simo.

-- 
Simo Sorce
RHEL Crypto Team
Red Hat, Inc



Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Simon Farnsworth via devel
On Wednesday, 30 November 2022 17:47:16 GMT Fabio Valentini wrote:
> On Wed, Nov 30, 2022 at 6:21 PM Daniel Alley  wrote:
> > I feel like there is insufficient recognition of the extent to which C
> > libraries do "bundling".  Not "bundling" in the sense of vendoring a
> > whole library, but in the sense of including one-off implementations of
> > basic data structures, configuration parsers, hashing algorithms, etc.  I
> > would love to hear anyone argue that 100 different variations of
> > "sha256.c" across 100 different packages better follows the spirit of the
> > "no bundling" guidelines than a vendored crate named "sha256" with 100x
> > as many eyes on it, and a higher likelihood to actually be updated if a
> > problem is found.
>
> >
> >
> > Many of the tiny, "sprawling" Rust dependencies are like this - not all of
> > them of course, but many.
> 
> But ... none of these "tiny" Rust crates are dynamically linked in
> Fedora anyway - because Rust doesn't really support that?
> So I fail to see your point there, unless you meant to say "C projects
> don't 'bundle', they just often 'copy' some code into their projects"?
> 
I think the point he's making is that developers don't write common 
functionality from scratch, in general. We reuse code from elsewhere.

It's just that in C, I'll copy-and-paste code from the web into my library or 
application, not necessarily even bothering with a full "vendoring", whereas 
in Rust, I'll use the crc crate (say), or the base64 crate, or other simple 
utility crate.
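
To make that concrete, a minimal sketch of what pulling in the crc crate can
look like (API shown as in the crc 3.x series; the crate version and constant
choice are assumptions for illustration, not code from any real project):

    // Assumes Cargo.toml declares something like: crc = "3" (illustrative).
    use crc::{Crc, CRC_32_ISO_HDLC};

    // One reusable CRC-32 engine, built from a catalog algorithm definition.
    const CRC32: Crc<u32> = Crc::<u32>::new(&CRC_32_ISO_HDLC);

    fn main() {
        // The standard CRC-32 check value for "123456789" is 0xCBF43926.
        let checksum = CRC32.checksum(b"123456789");
        println!("{checksum:08x}");
    }

The copy-and-paste C equivalent is typically a table-driven crc32() pasted
into the source tree, invisible to the packager.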

The result is that I have N implementations of common functionality, each with 
its own unique quirks and security risks, in my C binaries; but my C binaries 
have only a small number of dependencies.

In Rust, however, I'm directly reusing the small utility crate, and while I 
may use `cargo vendor` to import the crate's source into my tree, I'm unlikely 
to edit it. The result is that a Rust binary has a larger number of 
dependencies than the equivalent C program, because I'm depending on a crate 
instead of copy-and-pasting the code and "hiding" it from the packager.

This is a challenge for Fedora: how do we cope with a world where instead of 
having a few tens of dependencies, and a lot of copy-and-paste code, we have 
hundreds of dependencies, but no copy-and-paste code?

One answer is to say that Rust is bad for encouraging developers to depend on 
small crates instead of copying-and-pasting "small" utilities around. Another 
(which you're doing a great job of) is to package up all the dependencies, so 
that we represent the true dependency tree in RPM. Yet another would be to 
manually decide which dependencies get bundled, and which don't - doing the 
same thing as the C world does to keep its dependency count down.

-- 
Simon Farnsworth



Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Fabio Valentini
On Wed, Nov 30, 2022 at 6:21 PM Daniel Alley  wrote:
>
> > Do I really need to explain this point? I think linking against system
> > OpenSSL is *way better* than statically linking to a random vendored
> > copy of it.
>
> There are maybe about 100-120 libraries for which this is obviously the case. 
>  openssl, glibc, glib2, zlib, libxml2, libcurl, kde libraries, etc.  Core 
> system libraries that pretty much everything depends on.  Dynamically linking 
> such libraries has real benefits.
>
> For everything else though?  No, not so much.

These are actually all good examples of stuff that we dynamically link to.
This is the (not 100% exhaustive) list of system libraries we always
dynamically link to:

- all GTK / GNOME libraries (dbus, freetype, fontconfig, atk,
gdk-pixbuf, gdk, gtk3, gtk4, gio, glib2, graphene, gsk4, pango, cairo,
gstreamer, etc.)
- multimedia codecs / libraries (aom, dav1d, pipewire, pulseaudio, etc.)
- crypto libraries (openssl, libsodium, nettle, curl)
- compression libraries (bzip2, flate2, libz, lzma, zstd)
- database connectors (libsqlite, libpq)
- low-level device / storage libraries (devicemapper, libblkid)

I think almost all of these qualify as "Core system libraries that
pretty much everything depends on.".
Building their C dependencies from vendored copies (if that is even
supported) and statically linking them seems like a pretty bad idea in
almost all cases here, especially for things where the version of a
program on the "host" and the accompanying shared library should
match.

The only exception (that I know of) for "crate dependency built from
vendored sources and statically linked" in Fedora right now is
libgit2, because the version in Fedora is chronically outdated, and
dealing with that has become very painful - and started to block some
necessary updates to support more recent versions of Rust / cargo.

> I feel like there is insufficient recognition of the extent to which C 
> libraries do "bundling".  Not "bundling" in the sense of vendoring a whole 
> library, but in the sense of including one-off implementations of basic data 
> structures, configuration parsers, hashing algorithms, etc.  I would love to 
> hear anyone argue that 100 different variations of "sha256.c" across 100 
> different packages better follows the spirit of the "no bundling" guidelines 
> than a vendored crate named "sha256" with 100x as many eyes on it, and a 
> higher likelihood to actually be updated if a problem is found.
>
> Many of the tiny, "sprawling" Rust dependencies are like this - not all of 
> them of course, but many.

But ... none of these "tiny" Rust crates are dynamically linked in
Fedora anyway - because Rust doesn't really support that?
So I fail to see your point there, unless you meant to say "C projects
don't 'bundle', they just often 'copy' some code into their projects"?

Fabio


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Daniel Alley
> Do I really need to explain this point? I think linking against system
> OpenSSL is *way better* than statically linking to a random vendored
> copy of it.

There are maybe about 100-120 libraries for which this is obviously the case.  
openssl, glibc, glib2, zlib, libxml2, libcurl, kde libraries, etc.  Core system 
libraries that pretty much everything depends on.  Dynamically linking such 
libraries has real benefits.

For everything else though?  No, not so much.

I feel like there is insufficient recognition of the extent to which C 
libraries do "bundling".  Not "bundling" in the sense of vendoring a whole 
library, but in the sense of including one-off implementations of basic data 
structures, configuration parsers, hashing algorithms, etc.  I would love to 
hear anyone argue that 100 different variations of "sha256.c" across 100 
different packages better follows the spirit of the "no bundling" guidelines 
than a vendored crate named "sha256" with 100x as many eyes on it, and a higher 
likelihood to actually be updated if a problem is found.

Many of the tiny, "sprawling" Rust dependencies are like this - not all of them 
of course, but many.

Torvalds has similar feelings: 
https://lore.kernel.org/lkml/CAHk-=whs8QZf3YnifdLv57+FhBi5_WeNTG1B-suOES=rcus...@mail.gmail.com/


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Stephen Smoogen
On Wed, 30 Nov 2022 at 07:54, Daniel P. Berrangé 
wrote:

> On Wed, Nov 30, 2022 at 11:54:10AM +, Peter Robinson wrote:
> > Hi Fabio,
> >
> > Been meaning to reply to this, but it got lost in the mail pile.
> >
>
> > > > But running `cargo fetch` with a clean cache pulls down *390*
> crates. Of
> > > > these, it looks like 199 (!) are already packaged as
> rust-[crate]-devel,
> > > > which is *amazing*. But... that still is hundreds that I'd have to
> add. And
> > > > mostly they are things I don't know _anything_ about.
> > >
> > > You must realize that this is an extreme case. For many Rust
> > > applications that people want to package for Fedora, the number of
> > > dependencies that are missing is rather small, *because* most popular
> > > libraries are already packaged.
> >
> > In my experience, and I'm not packaging any form of gaming engine,
> > it's not an extreme case, for a few small things and one slightly
> > larger thing I *have* packaged I now maintain over 60 more packages. I
> > have another one I want to package and AFAICT I get to package another
> > 50-100 but frankly it's hard to tell, and this is something I have
> > control over and have been actively trying to do upstream reduction of
> > deps.
>
> The magnitude of the number of deps is the real killer for not
> only Rust, but also Go, and arguably anything that isn't a C
> based application.
>
>
> Packaging up all of a project's deps individually is viable when
> there are a relatively small number of them, especially if most
> of the deps are shared by other apps, and the libraries have good
> API + ABI stability promises. This is common in the case of most
> C applications/libraries, since shared libraries are relatively
> speaking fairly broad in functional scope, and often long lived,
> because that is the mindset of the C ecosystem in general.
>
> Modern languages & their ecosystems though have brought about
> a world where the granularity of deps is way smaller, where
> the focus is on being able to evolve rapidly, with less focus
> on long term stable API/ABI, as apps are expected to just fix
> against specific versions they've tested on.
>
>
I wonder if we should look at the standard libraries of the late 1960s/1970s
languages as being the same as 'opinionated' operating systems. Most of
them started out with a period where every mainframe might have a
'language' and a set of 'libraries' of routines, but every site pretty much
mangled them to fit its own 'local' needs. The software vendors that existed
usually ended up having consultants go out to 'fix' their code to match
whatever routines and tooling existed on each mainframe. This got to be
unworkable, and various 'libraries' were put forward that were meant to
standardize things. However, going by the 'old war tales' of my mentors and
professors from that era, the same arguments that the current languages
(Rust, etc.) make against standards existed back then too. The difference was
that the scale of the problem sat much more with the hardware/OS vendors
putting out a chosen standard library (and then fighting over getting those
to match up for the software vendors, who still needed to spend a lot of
consulting time). Most of those arguments seem to have 'died out' as people
got tired of fighting the differences versus the delight of new challenges
that many programmers feel when dealing with a 'new' language. [The same
happened with C++ back in the late 1980s, where every vendor had a slightly
different set of libraries which, for its proponents, was the best thing over
all the others... and then by the late 1990s it was 'Oh no, not this again'.
Java went through this up until the late '00s.]

I don't think we are yet at the point where enough of the new people who are
getting into Rust have gotten tired of fighting 'oh, I have to f'ing change
my code again to make it work with these 4 things?'. Basically, when enough
people HAVE to code in the language, versus just WANT to but hate the
grind... that is when the ability to standardize is going to happen.


> So functionality that lives in single library in C, often ends
> up being spread across 20-200 modules in any of the modern
> non-C languages. We've been experiencing this problem in Fedora
> for a very long time and while we have spec auto-generators,
> that still leaves masses of busy work, and also still leaves
> the problem that what we ship doesn't match what upstream has
> tested, so the result is often less reliable.
>
> IMHO the only thing that can make packaging such fine grained
> modules sustainable long term, is if the process could be 100%
> automated.  I don't mean just having a tool to auto-generate
> a spec file. It needs to be fully automated to the extent that
> you just tell a robot "i need  deps for this package" and it
> does everything automatically, except for human approval of
> the results of a code license audit.
>
> That of course would be a non-trivial service to build, and
> it still begs the question of whether its really beneficial
> in 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Neal Gompa
On Wed, Nov 30, 2022 at 8:16 AM Fabio Valentini  wrote:
>
> On Wed, Nov 30, 2022 at 12:54 PM Peter Robinson  wrote:
> >
> > > This is true, and probably also not "fixable". We need to make some
> > > amount of non-upstreamable patches to some crates (most notably,
> > > removing Windows- or mac OS-specific dependencies, because we don't
> > > want to package those), but in some cases, these are "incompatible"
> > > changes, and Rust *developers* should not be targeting our downstream
> > > sources that have these differences with actual upstream sources.
> >
> > Yet you say above "We *do* provide value to both users *and*
> > developers" yet you say developers shouldn't be targeting that work?
>
> We provide value to developers by basically running a huge free CI
> across a wide range of architectures.
>
> That doesn't mean that they should develop against crate sources we
> have in Fedora, similarly to how upstream Python developers probably
> should use pip/venv instead of "whatver is in Fedora". The same
> applies to basically every language ecosystem except C/C++, where the
> package manager story is just so bad that system libraries are
> actually the only reliable development environment.
>
> > > This is due to a limitation of how cargo handles target-specific
> > > dependencies - all dependencies that are *mentioned in any way* need
> > > to be *present* for it to resolve dependencies / enabled optional
> > > features / update its lockfile etc. But since we don't want to package
> > > bindings for Windows and mac OS system APIs, we need to actually patch
> > > them out, otherwise builds will fail.
> >
> > And that ends up being quite a bit of work from my point of view. Also
> > the way the packaging works with options things like devel or optional
> > features ends up being very painful. I will often drop out optional
> > features just so I can do less packaging.
>
> Recent versions of rust2rpm automatically generate a patch to remove
> non-linux dependencies, so that work has effectively been reduced to
> zero.
> Disabling unused optional dependencies can also be done with either a
> rust2rpm config option or with a simple patch, so that should not be a
> problem, either. I also don't understand why disabling some optional
> features is such a bad thing? I don't think we have a policy in Fedora
> that says we *must* enable *all possible functionality and optional
> features*.
>
> > > > We're doing okay with #1, but... I think #3 _even_ with all of the work 
> > > > in
> > > > Rust-to-RPM packaging isn't sufficient. I've played with the Bevy game
> > > > engine and will probably have a few things it would be nice to package 
> > > > to
> > > > make available in Fedora Linux. I might not even mind maintaining Bevy
> > > > itself.
> > >
> > > Somebody actually already started packaging Bevy components - some
> > > packages are already approved and some are still pending review. Not
> > > sure what the progress has been there, but it's not *impossible*.
> >
> > Well nothing is *impossible* if you have enough stamina, resources or
> > whatever else. I don't find saying something isn't *impossible*
> > necessarily makes it compelling.
>
> I agree. And I think we can do better, as I said in my original post.
>
> > > > But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> > > > these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> > > > which is *amazing*. But... that still is hundreds that I'd have to add. 
> > > > And
> > > > mostly they are things I don't know _anything_ about.
> > >
> > > You must realize that this is an extreme case. For many Rust
> > > applications that people want to package for Fedora, the number of
> > > dependencies that are missing is rather small, *because* most popular
> > > libraries are already packaged.
> >
> > In my experience, and I'm not packaging any form of gaming engine,
> > it's not an extreme case, for a few small things and one slightly
> > larger thing I *have* packaged I now maintain over 60 more packages. I
> > have another one I want to package and AFAICT I get to package another
> > 50-100 but frankly it's hard to tell, and this is something I have
> > control over and have been actively trying to do upstream reduction of
> > deps.
>
> As far as I know, salimma's intern worked on a tool to determine
> missing dependencies for Rust projects in Fedora recursively, which
> should make it easier to determine the exact set of things that you
> would need to do here.
>
> > And requests to make things like this easier have been open for over a year:
> > https://pagure.io/fedora-rust/rust2rpm/issue/140
>
> Right. I'm sorry about the issue reports that have been open for a long time.
>
> rust2rpm / rust-packaging is an entirely community-run project. We
> don't get any support from Red Hat (or other places) for anything
> except the Rust compiler itself. And with the original developer of
> rust2rpm (Igor) being gone, nobody 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Fabio Valentini
On Wed, Nov 30, 2022 at 12:54 PM Peter Robinson  wrote:
>
> > This is true, and probably also not "fixable". We need to make some
> > amount of non-upstreamable patches to some crates (most notably,
> > removing Windows- or mac OS-specific dependencies, because we don't
> > want to package those), but in some cases, these are "incompatible"
> > changes, and Rust *developers* should not be targeting our downstream
> > sources that have these differences with actual upstream sources.
>
> Yet you say above "We *do* provide value to both users *and*
> developers" yet you say developers shouldn't be targeting that work?

We provide value to developers by basically running a huge free CI
across a wide range of architectures.

That doesn't mean that they should develop against crate sources we
have in Fedora, similarly to how upstream Python developers probably
should use pip/venv instead of "whatever is in Fedora". The same
applies to basically every language ecosystem except C/C++, where the
package manager story is just so bad that system libraries are
actually the only reliable development environment.

> > This is due to a limitation of how cargo handles target-specific
> > dependencies - all dependencies that are *mentioned in any way* need
> > to be *present* for it to resolve dependencies / enabled optional
> > features / update its lockfile etc. But since we don't want to package
> > bindings for Windows and mac OS system APIs, we need to actually patch
> > them out, otherwise builds will fail.
>
> And that ends up being quite a bit of work from my point of view. Also
> the way the packaging works with options things like devel or optional
> features ends up being very painful. I will often drop out optional
> features just so I can do less packaging.

Recent versions of rust2rpm automatically generate a patch to remove
non-linux dependencies, so that work has effectively been reduced to
zero.
Disabling unused optional dependencies can also be done with either a
rust2rpm config option or with a simple patch, so that should not be a
problem, either. I also don't understand why disabling some optional
features is such a bad thing? I don't think we have a policy in Fedora
that says we *must* enable *all possible functionality and optional
features*.

> > > We're doing okay with #1, but... I think #3 _even_ with all of the work in
> > > Rust-to-RPM packaging isn't sufficient. I've played with the Bevy game
> > > engine and will probably have a few things it would be nice to package to
> > > make available in Fedora Linux. I might not even mind maintaining Bevy
> > > itself.
> >
> > Somebody actually already started packaging Bevy components - some
> > packages are already approved and some are still pending review. Not
> > sure what the progress has been there, but it's not *impossible*.
>
> Well nothing is *impossible* if you have enough stamina, resources or
> whatever else. I don't find saying something isn't *impossible*
> necessarily makes it compelling.

I agree. And I think we can do better, as I said in my original post.

> > > But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> > > these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> > > which is *amazing*. But... that still is hundreds that I'd have to add. 
> > > And
> > > mostly they are things I don't know _anything_ about.
> >
> > You must realize that this is an extreme case. For many Rust
> > applications that people want to package for Fedora, the number of
> > dependencies that are missing is rather small, *because* most popular
> > libraries are already packaged.
>
> In my experience, and I'm not packaging any form of gaming engine,
> it's not an extreme case, for a few small things and one slightly
> larger thing I *have* packaged I now maintain over 60 more packages. I
> have another one I want to package and AFAICT I get to package another
> 50-100 but frankly it's hard to tell, and this is something I have
> control over and have been actively trying to do upstream reduction of
> deps.

As far as I know, salimma's intern worked on a tool to determine
missing dependencies for Rust projects in Fedora recursively, which
should make it easier to determine the exact set of things that you
would need to do here.

> And requests to make things like this easier have been open for over a year:
> https://pagure.io/fedora-rust/rust2rpm/issue/140

Right. I'm sorry about the issue reports that have been open for a long time.

rust2rpm / rust-packaging is an entirely community-run project. We
don't get any support from Red Hat (or other places) for anything
except the Rust compiler itself. And with the original developer of
rust2rpm (Igor) being gone, nobody who understands the low-level stuff
in there is still around.
zbyszek and I are trying our best to keep things running, but at this
point, we'll need basically a rewrite of both rust2rpm and the RPM
dependency / provides generators to support new 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Daniel P . Berrangé
On Wed, Nov 30, 2022 at 11:54:10AM +, Peter Robinson wrote:
> Hi Fabio,
> 
> Been meaning to reply to this, but it got lost in the mail pile.
> 

> > > But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> > > these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> > > which is *amazing*. But... that still is hundreds that I'd have to add. 
> > > And
> > > mostly they are things I don't know _anything_ about.
> >
> > You must realize that this is an extreme case. For many Rust
> > applications that people want to package for Fedora, the number of
> > dependencies that are missing is rather small, *because* most popular
> > libraries are already packaged.
> 
> In my experience, and I'm not packaging any form of gaming engine,
> it's not an extreme case, for a few small things and one slightly
> larger thing I *have* packaged I now maintain over 60 more packages. I
> have another one I want to package and AFAICT I get to package another
> 50-100 but frankly it's hard to tell, and this is something I have
> control over and have been actively trying to do upstream reduction of
> deps.

The magnitude of the number of deps is the real killer, not only
for Rust, but also for Go, and arguably for anything that isn't a
C-based application.

Packaging up all of a project's deps individually is viable when
there are a relatively small number of them, especially if most
of the deps are shared by other apps, and the libraries have good
API + ABI stability promises. This is common in the case of most
C applications/libraries, since shared libraries are relatively
speaking fairly broad in functional scope, and often long lived,
because that is the mindset of the C ecosystem in general.

Modern languages & their ecosystems though have brought about
a world where the granularity of deps is way smaller, where
the focus is on being able to evolve rapidly, with less focus
on long term stable API/ABI, as apps are expected to just fix
against specific versions they've tested on.

So functionality that lives in a single library in C often ends
up being spread across 20-200 modules in any of the modern
non-C languages. We've been experiencing this problem in Fedora
for a very long time, and while we have spec auto-generators,
that still leaves masses of busywork, and also still leaves
the problem that what we ship doesn't match what upstream has
tested, so the result is often less reliable.

IMHO the only thing that can make packaging such fine-grained
modules sustainable long term is if the process could be 100%
automated.  I don't mean just having a tool to auto-generate
a spec file. It needs to be fully automated to the extent that
you just tell a robot "i need  deps for this package" and it
does everything automatically, except for human approval of
the results of a code license audit.

That of course would be a non-trivial service to build, and
it still begs the question of whether it's really beneficial
in the long term.


> > Sure, but isn't that the case for most projects that a newcomer wants
> > to package, regardless of programming language? Say, somebody wants to
> > package some cool new Python project for machine learning, then
> > there's probably also some linear algebra package or SIMD math library
> > in the dependency tree that's missing from Fedora. How is that
> > different?
> 
> I think the big difference here from my experience, and I've packaged
> right across the Fedora spectrum, is that most of the python style
> projects are able to use a pretty comprehensive standard library and
> crypto library which helps minimise a lot of the extras I've seen in
> the rust ecosystem.

Python isn't that different from the Rust world to be honest. 

For a short while we had OpenStack in Fedora, but it wasn't
sustainable to maintain its large set of Python module deps,
fighting between the versions Fedora already had and the
versions that OpenStack actually wanted (and was tested
against by upstream). It was never-ending busywork for the
people trying to keep it working in Fedora, taking time
away from more beneficial work, and delivering a system that
was often broken because the module version combinations we used
were not tested.


> > - we port projects to new versions of dependencies
> 
> That's valuable for other projects but not Fedora, and it's not
> something I am going to do personally as a rust packager.

It isn't sustainable in the general case, as it requires a level
of expertise / knowledge that most packagers can't be assumed
to have. If they have that knowledge and the time to invest in
this, they can do it directly in upstream, regardless of what
Fedora packaging approach is used.


> I don't think it's as bad as other ecosystems but I don't agree with
> your assessment of "kind of a *good* situation", I feel there may be a
> little bit of Stockholm syndrome coming into play here to be honest.

I feel like Fedora is successful in packaging huge numbers of
deps 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-30 Thread Peter Robinson
Hi Fabio,

Been meaning to reply to this, but it got lost in the mail pile.

> > I _very much_ appreciate all the work you and the other Rust SIG folks
> > (Igor and Zbyszek in particular but I'm sure others as well!) have put into
> > packaging rust apps and crates and all of the systems around that.
>
> I'll respond inline.

Same.

> > I fundamentally disagree with Kevin on a deep level about "entirely
> > useless", but ... find myself kind of agreeing about the "unpackagable"
> > part. I mean: clearly we've found a way, but I'm really not sure we're
> > providing a lot of _value_ in this approach, and I'm also not sure it's as
> > successful as it could be.
>
> We *do* provide value to both users *and* developers by doing things
> the way we do, but the benefits might not be obvious to people who
> don't know how (Rust) packaging works, and what we as package
> maintainers do.

As a rust packager I initially agreed but I am now having doubts here.

> > There are three ways having things packaged in Fedora repos _can_ be
> > helpful:
> >
> > 1. End-user applications and tools
> > 2. Useful development environment
> > 3. As convenience for ourselves for building packages for #1 or #2
> >
> > I am not discounting the value of #3 -- making a shared thing that we all
> > work on together is kind of the whole point, and the nicer we can make that
> > the better we can bring in more people, and those of us already here have a
> > lighter load and can work on the things we're most interested in. But
> > ultimately, we're doing it so we make a useful system for users. That means
> > the first two.
>
> This I can agree with :)
>
> > I'll start with the second: our system for Rust doesn't really do that.
> > Developers are going to use cargo and crates.io and we're not going to
> > convince them that they should do otherwise. (I don't expect anyone
> > disagrees with this.)
>
> This is true, and probably also not "fixable". We need to make some
> amount of non-upstreamable patches to some crates (most notably,
> removing Windows- or mac OS-specific dependencies, because we don't
> want to package those), but in some cases, these are "incompatible"
> changes, and Rust *developers* should not be targeting our downstream
> sources that have these differences with actual upstream sources.

Yet you say above "We *do* provide value to both users *and*
developers", and then you say developers shouldn't be targeting that work?

> This is due to a limitation of how cargo handles target-specific
> dependencies - all dependencies that are *mentioned in any way* need
> to be *present* for it to resolve dependencies / enabled optional
> features / update its lockfile etc. But since we don't want to package
> bindings for Windows and mac OS system APIs, we need to actually patch
> them out, otherwise builds will fail.

And that ends up being quite a bit of work from my point of view. Also,
the way the packaging works with options (things like devel or optional
features) ends up being very painful. I will often drop optional
features just so I can do less packaging.

> > We're doing okay with #1, but... I think #3 _even_ with all of the work in
> > Rust-to-RPM packaging isn't sufficient. I've played with the Bevy game
> > engine and will probably have a few things it would be nice to package to
> > make available in Fedora Linux. I might not even mind maintaining Bevy
> > itself.
>
> Somebody actually already started packaging Bevy components - some
> packages are already approved and some are still pending review. Not
> sure what the progress has been there, but it's not *impossible*.

Well nothing is *impossible* if you have enough stamina, resources or
whatever else. I don't find saying something isn't *impossible*
necessarily makes it compelling.

> > But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> > these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> > which is *amazing*. But... that still is hundreds that I'd have to add. And
> > mostly they are things I don't know _anything_ about.
>
> You must realize that this is an extreme case. For many Rust
> applications that people want to package for Fedora, the number of
> dependencies that are missing is rather small, *because* most popular
> libraries are already packaged.

In my experience, and I'm not packaging any form of gaming engine,
it's not an extreme case: for a few small things and one slightly
larger thing I *have* packaged, I now maintain over 60 more packages. I
have another one I want to package, and AFAICT I'd get to package another
50-100, but frankly it's hard to tell, and this is for something I have
control over and where I have been actively trying to reduce deps
upstream.

And requests to make things like this easier have been open for over a year:
https://pagure.io/fedora-rust/rust2rpm/issue/140

> Bevy is a bit special, because it (presumably) pulls in lots of GPU /
> OpenGL / Vulkan related libraries, which we didn't 

Re: [Rust] Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-04 Thread Matthew Miller
On Tue, Nov 01, 2022 at 01:30:01PM -0500, Michel Alexandre Salim wrote:
> I've finally gotten round to doing some polishing and getting it
> packaged:
> - updates for Fedora 36, 37, and Rawhide: 
> https://bodhi.fedoraproject.org/updates/?search=rust-update-set-0.0.1=python-rust-update-set
> - Pagure repo: https://pagure.io/fedora-rust/rust-update-set
> 
> There are some fixes for corner cases I encountered while trying to update our
> `below` packages with this, and some changes needed to get this packageable,
> but those are minor. It mostly works really well, and is at a stage where
> hopefully people can take a look and make suggestions for improvements.


Cool -- glad to see this!

-- 
Matthew Miller

Fedora Project Leader


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Kevin Kofler via devel
Matthew Miller wrote:
> Rust tends to be more fine-grained. I don't think this is necessarily
> rust-specific _really_ — I think it's a trend as people get more used to
> this way of doing things.

And this is inherently a PITA to package, unfortunately.

It is indeed not Rust-specific; other new fancy languages are a similar 
mess, see Node.js (with NPM), Go, etc. Another reason why GNU/Linux 
developers should stick to C and/or C++.

That said, both KDE and GNOME have also gone through a phase of splitting 
craze whose consequences we are still suffering from: the software from the 
KDE project used to be a dozen packages that could be updated and built by 
one person by hand in a day. Now we have hundreds of packages released on 3 
different release cycles (4 if you include the new Plasma Mobile Gear) that 
need scripts to update, take days to build even with scripts, and lead to 
updates whose sheer number of packages frequently makes Bodhi and OpenQA choke. 
And this goes also, and especially, for the libraries: kdelibs used to be one 
package, now there are dozens of kf5-* packages, and likewise for Qt. And it 
is no better in GNOME land; they started splitting stuff even before KDE did.

But the situation is worse in all those "modern" languages like Rust or Go. 
(And also in that language that calls itself "modern" even though it was 
designed decades ago to allow for pointlessly jumping text on websites. NPM 
is infamous for containing "packages" with a single one-line function.)

> I have not tried this with any Rust package. My experience in the past is
> that many upstreams find this the kind of thing that makes them go on long
> blog rants about distro packaging -- they picked a version, it works fine,
> they don't need the distraction of being told they must update it.

That unhelpfulness (and complete lack of understanding of how 
distributions work) by upstreams is a big issue in those parallel 
ecosystems.

> But even when this doesn't happen, it gets into the matter of expertise.
> If I need to update a dependency for a newer-version of the
> sub-dependency, and I don't know enough about either code base to do
> anything other than file a "please update" bug, then everything is blocked
> on that.

Normally, it should just be a matter of changing the version number. If that 
fails to build, IMHO, the dependency (the library) is broken. Library 
upstream "maintainers" with a complete disregard for backwards compatibility 
are a PITA, no matter in what language.

>> We only maintain compat packages where porting to the new version (and
>> submitting the changes upstream) is not feasible. Again, isn't that
>> how Fedora is supposed to work?
> 
> I guess it depends on how broadly one reads "feasible". :)

We normally try pretty aggressively to port packages to new library versions 
where the incompatibilities are not too bad. I do not see why it should be 
any different when the library happens to be written in Rust.

>> Examples of that might be:
>> - wasmtime: I ultimately abandoned the attempt to package it "because
>> Fedora Legal", but the packages themselves worked fine
> 
> An aside, but: did I miss something with this on the Legal list? The only
> thing I'm finding is a question about how to phrase `Apache-2.0 WITH
> LLVM-exception`.

See: https://bugzilla.redhat.com/show_bug.cgi?id=2130953

>> We have talked about this multiple times, but it won't work.
>> I think this was tried with first-class maven artifact support in
>> koji, but we all know how the Java packaging fiasco ended.
> 
> I would rather see it as: we learned some lessons from that approach and
> can do it better.

Without a concrete proposal for how you want to "do it better", there is 
really nothing to discuss, because the only thing that we can talk about, as 
it stands, is the approach that we know did not work. So, suggest a new 
approach and we can analyze whether it has any chance of working any better 
or not.

My guess is that any working approach to allow foreign artifact types in 
Koji, and also reliably deliver them to users (including ones that want to 
build or rebuild software), would ultimately be more work than just using 
RPMs.

>> - we change build flags to default to dynamically linking to system
>> libraries instead of statically linking against vendored copies
> 
> This too.
> 
> Mostly, at least. Assuming this isn't _prebuilt binaries_ or similar,
> upstream may or may not have a good reason or strong opinion.

Why would we care about a "strong opinion"? Either there is a good reason or 
there is not. Irrational demands by unreasonable, uncooperative upstreams 
ought to be just ignored. Free Software means we can adapt the software to 
our needs. If upstream will not allow that, it is not Free Software.

> I really hope we can look at these and learn how to do it better, instead
> of deciding that better isn't possible. And — while I'm not really up on
> node — I have pretty good hindsight on what went wrong 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Fabio Valentini
On Tue, Nov 1, 2022 at 7:07 PM Demi Marie Obenour  wrote:
>
> On 11/1/22 10:40, Matthew Miller wrote:
> > On Wed, Oct 19, 2022 at 01:04:39PM +0200, Fabio Valentini wrote:
> >> For intra-project dependencies (i.e. bevy components depending on
> >> exact versions of bevy components), this is kind of expected, and we
> >> have tools to deal with this kind of situation (though bevy is on a
> >> different scale). For dependencies on third-party libraries, this is
> >> kind of unexpected, and I wonder why they do things like that? Locking
> >> some dependencies to exact versions is usually handled by relying on
> >> the lockfile, instead.
> >
> > I was wrong about this. I actually didn't realize that the ^ was optional. I
> > was, um, cargo-culting that around. Ah well. Anyway, that's less of a
> > problem than I worried.
>
> Will the bevy components ever be used outside of bevy?  If not,
> then they should be bundled.  if so, they should not be bundled.

Yeah, this is something that I hope to be able to improve soon.

> >>> The packaging guidelines say that I SHOULD create patches to update to
> >>> latest versions of dependencies, and that I should further convince the
> >>> upstream to take them. Candidly, that seems like a waste of everone's
> >>> time.
> >> This is *not* a waste of time. If we don't invest time to do that, many
> >> project's dependencies grow stale, and actually *increase* the need for us
> >> to maintain compat packages.
> >
> > I have not tried this with any Rust package. My experience in the past is
> > that many upstreams find this the kind of thing that makes them go on long
> > blog rants about distro packaging -- they picked a version, it works fine,
> > they don't need the distraction of being told they must update it.
>
> I heavily doubt this will be an issue with Rust.  There is a reason
> that dependabot is so popular.

Exactly. I forgot to reply to this in my previous post, but many Rust
projects on GitHub use dependabot (or similar bots) to automatically
bump their dependencies to the latest version. The projects that
*don't* do that are the odd ones out, but still, PRs submitted to
these projects are usually received positively.
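
(For reference, turning that on for a Cargo project is a tiny bit of
.github/dependabot.yml -- a minimal sketch:)

    version: 2
    updates:
      - package-ecosystem: "cargo"   # watch Cargo.toml / Cargo.lock
        directory: "/"               # manifest at the repository root
        schedule:
          interval: "weekly"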

(snip)

> > I don't dispute that helping projects keep up to the latest is valuable
> > work. It even seems like it might be in-scope work for Fedora. But couldn't
> > we do that as something _separate_ from blocking ourselves (either literally
> > or through the extra overhead of compat packages) from packaging the
> > dependent app?
>
> I don’t think it should be a blocker unless e.g. the out-of-date
> dependency has a serious security vulnerability.

Yeah, I tend to be rather pragmatic in this regard.
Patch and submit to upstream if it's easy, keep a compat for the older
version around if it's not.

The only exception to this rule I recently made was to retire some
outdated versions of HTTP libraries, which were only used by obsolete
packages anyway.

Fabio


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Michel Alexandre Salim
Hi all,

Just a note that over the summer, our intern did a project to try and
address some of these issues (namely, that while it's trivial to convert
a single crate to an RPM, trying to automate packaging all the
dependencies and making sure that you don't break anything else while
doing so is tedious and error-prone). Matt and Fabio might recall me
getting their inputs on this several months ago.

The tool handles:
- recursively checking out packages
- updating existing packages while carrying over manual changes
- creating compatibility packages when upgrading packages with dependents
- automating parallel COPR test builds of sets of packages in dependency order
- chain-building all packages in Koji with a requested side tag
- merging and chain-building all packages across release branches
- helping review a Rust update in Bodhi and verifying that no compatibility
  issues are introduced

It does not automatically commit anything, of course, and right now it is a
bit opinionated in favor of picking the lowest satisfactory version possible,
which is probably not what we want long term -- but it hopefully provides a
nice foundation to build on.

I've finally gotten round to doing some polishing and getting it
packaged:
- updates for Fedora 36, 37, and Rawhide: 
https://bodhi.fedoraproject.org/updates/?search=rust-update-set-0.0.1=python-rust-update-set
- Pagure repo: https://pagure.io/fedora-rust/rust-update-set

There are some fixes for corner cases I encountered while trying to update our
`below` packages with this, and some changes needed to get this packageable,
but those are minor. It mostly works really well, and is at a stage where
hopefully people can take a look and make suggestions for improvements.

Best regards,

-- 
Michel Alexandre Salim
identities: https://keyoxide.org/5dce2e7e9c3b1cffd335c1d78b229d2f7ccc04f2




Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Fabio Valentini
On Tue, Nov 1, 2022 at 3:40 PM Matthew Miller  wrote:
>
> On Wed, Oct 19, 2022 at 01:04:39PM +0200, Fabio Valentini wrote:
> > I'll respond inline.
>
> Me too -- and apologies for the delay.
>
>
> > > I fundamentally disagree with Kevin on a deep level about "entirely
> > > useless", but ... find myself kind of agreeing about the "unpackagable"
> > > part. I mean: clearly we've found a way, but I'm really not sure we're
> > > providing a lot of _value_ in this approach, and I'm also not sure it's
> > > as successful as it could be.
> > We *do* provide value to both users *and* developers by doing things
> > the way we do, but the benefits might not be obvious to people who
> > don't know how (Rust) packaging works, and what we as package
> > maintainers do.
>
> Let me rephrase: I absolutely think you've provided value and are providing
> value (and I appreciate it). I am not convinced that the value is in the
> RPM-izing part, though.
>
>
> [...]
> > This is due to a limitation of how cargo handles target-specific
> > dependencies - all dependencies that are *mentioned in any way* need
> > to be *present* for it to resolve dependencies / enabled optional
> > features / update its lockfile etc. But since we don't want to package
> > bindings for Windows and mac OS system APIs, we need to actually patch
> > them out, otherwise builds will fail.
>
> Theoretically, if we had our own crate repository, we could either make
> those changes there (possibly using something like packit to carry the
> patches) -- or, just, not make the changes and not worry because we know
> those won't end up used anyway?

That won't do. It would mean that we end up having to review and keep
track of *more* crates instead of fewer.

> > You must realize that this is an extreme case. For many Rust
> > applications that people want to package for Fedora, the number of
> > dependencies that are missing is rather small, *because* most popular
> > libraries are already packaged.
>
> It may be that I just hear about the difficult cases.

Of course. People who manage to package their favourite Rust app
without problems don't come to complain, do they?

> > We might need to reconsider how to package projects like this. I'm
> > pretty sure we could find a way to package them in a way that's
> > compatible with how we're currently doing things but would be much
> > less busywork.
>
> Okay, I'm open to that.

I'll have more time to spend on this soon, I hope, so I'll try to come
back with a prototype ASAP.

> > Sure, but isn't that the case for most projects that a newcomer wants
> > to package, regardless of programming language? Say, somebody wants to
> > package some cool new Python project for machine learning, then
> > there's probably also some linear algebra package or SIMD math library
> > in the dependency tree that's missing from Fedora. How is that
> > different?
>
> Rust tends to be more fine-grained. I don't think this is necessarily
> rust-specific _really_ — I think it's a trend as people get more used to
> this way of doing things. With Python, there are some big packages
> (including "batteries included" standard Python itself) which tend to group
> big related sets of functionality. (notably: numpy, scipy, pandas...)

It's probably a misconception that the Rust standard library is small.
It's not small, it's just not very *broad*.
And there are some libraries that are also quite "batteries included",
for example, hyper and tokio, which could be considered "standard" at
this point.

(snip)

> I have not tried this with any Rust package. My experience in the past is
> that many upstreams find this the kind of thing that makes them go on long
> blog rants about distro packaging -- they picked a version, it works fine,
> they don't need the distraction of being told they must update it.
>
> But even when this doesn't happen, it gets into the matter of expertise. If
> I need to update a dependency for a newer-version of the sub-dependency, and
> I don't know enough about either code base to do anything other than file a
> "please update" bug, then everything is blocked on that.

It's not, though. This is what SIGs are for.
I explicitly tell people who are new to Rust packaging that I'll help
them get their dependencies reviewed and keep them up-to-date so they
can focus on their own stuff.
That some people don't accept help when it is offered is a different problem ...

> I don't dispute that helping projects keep up to the latest is valuable
> work. It even seems like it might be in-scope work for Fedora. But couldn't
> we do that as something _separate_ from blocking ourselves (either literally
> or through the extra overhead of compat packages) from packaging the
> dependent app?
>
> > > The guidelines provide for creating compat packages, but that means 1) the
> > > existing shared work is less useful, 2) requires even more extra steps, 
> > > and
> > > 3) even without reviews for compat has extra administrative overhead.

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Demi Marie Obenour
On 11/1/22 10:40, Matthew Miller wrote:
> On Wed, Oct 19, 2022 at 01:04:39PM +0200, Fabio Valentini wrote:
>> I'll respond inline.
> 
> Me too -- and apologies for the delay.
> 
> 
>>> I fundamentally disagree with Kevin on a deep level about "entirely
>>> useless", but ... find myself kind of agreeing about the "unpackagable"
>>> part. I mean: clearly we've found a way, but I'm really not sure we're
>>> providing a lot of _value_ in this approach, and I'm also not sure it's
>>> as successful as it could be.
>> We *do* provide value to both users *and* developers by doing things
>> the way we do, but the benefits might not be obvious to people who
>> don't know how (Rust) packaging works, and what we as package
>> maintainers do.
> 
> Let me rephrase: I absolutely think you've provided value and are providing
> value (and I appreciate it). I am not convinced that the value is in the
> RPM-izing part, though.

Please do not stop packaging Rust crates.  Qubes OS at least will be relying
on these packages to build its own Rust code in the not-too-distant future.

> [...]
>> This is due to a limitation of how cargo handles target-specific
>> dependencies - all dependencies that are *mentioned in any way* need
>> to be *present* for it to resolve dependencies / enabled optional
>> features / update its lockfile etc. But since we don't want to package
>> bindings for Windows and mac OS system APIs, we need to actually patch
>> them out, otherwise builds will fail.
> 
> Theoretically, if we had our own crate repository, we could either make
> those changes there (possibly using something like packit to carry the
> patches) -- or, just, not make the changes and not worry because we know
> those won't end up used anyway?
> 
> 
>> You must realize that this is an extreme case. For many Rust
>> applications that people want to package for Fedora, the number of
>> dependencies that are missing is rather small, *because* most popular
>> libraries are already packaged.
> 
> It may be that I just hear about the difficult cases.

Probably so.

>> Sure, but isn't that the case for most projects that a newcomer wants
>> to package, regardless of programming language? Say, somebody wants to
>> package some cool new Python project for machine learning, then
>> there's probably also some linear algebra package or SIMD math library
>> in the dependency tree that's missing from Fedora. How is that
>> different?
> 
> Rust tends to be more fine-grained. I don't think this is necessarily
> rust-specific _really_ — I think it's a trend as people get more used to
> this way of doing things. With Python, there are some big packages
> (including "batteries included" standard Python itself) which tend to group
> big related sets of functionality. (notably: numpy, scipy, pandas...)

There is another factor that I think needs to be mentioned here:
In Rust, a crate is not only the unit of packaging, but also of
compilation.  A project may well have several crates purely for
internal code organization or to reduce build time.  Some of these
crates may never be intended for use outside of the larger project.
In this case, bundling them is absolutely the correct decision, as
nobody else should ever be depending on them.  However, bundling is
*not* the right decision if a crate *is* intended to have outside
users.
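
As a concrete sketch (crate names invented), the split typically looks like
this across the workspace manifests:

    # Cargo.toml (workspace root)
    [workspace]
    members = ["mygame", "mygame-internals"]

    # mygame-internals/Cargo.toml -- internal code organization only
    [package]
    name = "mygame-internals"
    version = "0.1.0"
    publish = false            # explicitly never meant for crates.io

    # mygame/Cargo.toml
    [dependencies]
    mygame-internals = { path = "../mygame-internals" }  # fine to bundle
    serde = "1"                # general-purpose crate: use the packaged one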

>> For intra-project dependencies (i.e. bevy components depending on
>> exact versions of bevy components), this is kind of expected, and we
>> have tools to deal with this kind of situation (though bevy is on a
>> different scale). For dependencies on third-party libraries, this is
>> kind of unexpected, and I wonder why they do things like that? Locking
>> some dependencies to exact versions is usually handled by relying on
>> the lockfile, instead.
> 
> I was wrong about this. I actually didn't realize that the ^ was optional. I
> was, um, cargo-culting that around. Ah well. Anyway, that's less of a
> problem than I worried.

Will the bevy components ever be used outside of bevy?  If not,
then they should be bundled.  If so, they should not be bundled.

>>> The packaging guidelines say that I SHOULD create patches to update to
>>> latest versions of dependencies, and that I should further convince the
>>> upstream to take them. Candidly, that seems like a waste of everyone's
>>> time.
>> This is *not* a waste of time. If we don't invest time to do that, many
>> project's dependencies grow stale, and actually *increase* the need for us
>> to maintain compat packages.
> 
> I have not tried this with any Rust package. My experience in the past is
> that many upstreams find this the kind of thing that makes them go on long
> blog rants about distro packaging -- they picked a version, it works fine,
> they don't need the distraction of being told they must update it.

I heavily doubt this will be an issue with Rust.  There is a reason
that dependabot is so popular.

> But even when this doesn't happen, it gets into the matter of 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-11-01 Thread Matthew Miller
On Wed, Oct 19, 2022 at 01:04:39PM +0200, Fabio Valentini wrote:
> I'll respond inline.

Me too -- and apologies for the delay.


> > I fundamentally disagree with Kevin on a deep level about "entirely
> > useless", but ... find myself kind of agreeing about the "unpackagable"
> > part. I mean: clearly we've found a way, but I'm really not sure we're
> > providing a lot of _value_ in this approach, and I'm also not sure it's
> > as successful as it could be.
> We *do* provide value to both users *and* developers by doing things
> the way we do, but the benefits might not be obvious to people who
> don't know how (Rust) packaging works, and what we as package
> maintainers do.

Let me rephrase: I absolutely think you've provided value and are providing
value (and I appreciate it). I am not convinced that the value is in the
RPM-izing part, though.


[...]
> This is due to a limitation of how cargo handles target-specific
> dependencies - all dependencies that are *mentioned in any way* need
> to be *present* for it to resolve dependencies / enabled optional
> features / update its lockfile etc. But since we don't want to package
> bindings for Windows and mac OS system APIs, we need to actually patch
> them out, otherwise builds will fail.

Theoretically, if we had our own crate repository, we could either make
those changes there (possibly using something like packit to carry the
patches) -- or, just, not make the changes and not worry because we know
those won't end up used anyway?


> You must realize that this is an extreme case. For many Rust
> applications that people want to package for Fedora, the number of
> dependencies that are missing is rather small, *because* most popular
> libraries are already packaged.

It may be that I just hear about the difficult cases.


> We might need to reconsider how to package projects like this. I'm
> pretty sure we could find a way to package them in a way that's
> compatible with how we're currently doing things but would be much
> less busywork.

Okay, I'm open to that.



> Sure, but isn't that the case for most projects that a newcomer wants
> to package, regardless of programming language? Say, somebody wants to
> package some cool new Python project for machine learning, then
> there's probably also some linear algebra package or SIMD math library
> in the dependency tree that's missing from Fedora. How is that
> different?

Rust tends to be more fine-grained. I don't think this is necessarily
rust-specific _really_ — I think it's a trend as people get more used to
this way of doing things. With Python, there are some big packages
(including "batteries included" standard Python itself) which tend to group
big related sets of functionality. (notably: numpy, scipy, pandas...)

> For intra-project dependencies (i.e. bevy components depending on
> exact versions of bevy components), this is kind of expected, and we
> have tools to deal with this kind of situation (though bevy is on a
> different scale). For dependencies on third-party libraries, this is
> kind of unexpected, and I wonder why they do things like that? Locking
> some dependencies to exact versions is usually handled by relying on
> the lockfile, instead.

I was wrong about this. I actually didn't realize that the ^ was optional. I
was, um, cargo-culting that around. Ah well. Anyway, that's less of a
problem than I worried.
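
For anyone else who was fuzzy on it, the requirement syntax in Cargo.toml
works roughly like this (versions are purely illustrative):

    [dependencies]
    anyhow = "1.0.5"      # same as "^1.0.5": any semver-compatible 1.x >= 1.0.5
    bytemuck = "=1.12.1"  # exact pin: only this one version satisfies it
    serde = "1"           # shorthand for "^1": any 1.x release

So a bare version already allows compatible upgrades; only the "=" form is a
hard pin.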

> > The packaging guidelines say that I SHOULD create patches to update to
> > latest versions of dependencies, and that I should further convince the
> > upstream to take them. Candidly, that seems like a waste of everyone's
> > time.
> This is *not* a waste of time. If we don't invest time to do that, many
> project's dependencies grow stale, and actually *increase* the need for us
> to maintain compat packages.

I have not tried this with any Rust package. My experience in the past is
that many upstreams find this the kind of thing that makes them go on long
blog rants about distro packaging -- they picked a version, it works fine,
they don't need the distraction of being told they must update it.

But even when this doesn't happen, it gets into the matter of expertise. If
I need to update a dependency for a newer-version of the sub-dependency, and
I don't know enough about either code base to do anything other than file a
"please update" bug, then everything is blocked on that.

I don't dispute that helping projects keep up to the latest is valuable
work. It even seems like it might be in-scope work for Fedora. But couldn't
we do that as something _separate_ from blocking ourselves (either literally
or through the extra overhead of compat packages) from packaging the
dependent app?

> > The guidelines provide for creating compat packages, but that means 1) the
> > existing shared work is less useful, 2) requires even more extra steps, and
> > 3) even without reviews for compat has extra administrative overhead.
> 
> We only maintain compat packages where porting to the new 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-24 Thread Zbigniew Jędrzejewski-Szmek
On Wed, Oct 19, 2022 at 01:04:39PM +0200, Fabio Valentini wrote:
> On Wed, Oct 19, 2022 at 11:25 AM Matthew Miller
>  wrote:
> >
> > I _very much_ appreciate all the work you and the other Rust SIG folks
> > (Igor and Zbyszek in particular but I'm sure others as well!) have put into

[… and Fabio in particular]

> > I'll start with the second: our system for Rust doesn't really do that.
> > Developers are going to use cargo and crates.io and we're not going to
> > convince them that they should do otherwise. (I don't expect anyone
> > disagrees with this.)
> 
> This is true, and probably also not "fixable".

Also, arguably it's not something we'd like to "fix" at all.

I think not-using-the-rpm-packaged-software-during-development is
something that is equally true for Rust, C, Python, Java, and pretty
much any language. As long as Rust is not exclusive to Fedora/CentOS/RHEL
family, rpms are just going to be one of the possible ways to get rust
code.

For me, the goal is to make the automatic conversion from rust crates
to rpms as smooth as possible. I want the rpm package database to be
the CDN we use to get verified Rust code.
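
For the common case that conversion is already essentially a one-liner
(a sketch from memory, so details may be slightly off):

    $ rust2rpm serde
    # writes a rust-serde.spec generated from the crate metadata, ready for
    # review and a mock or koji scratch build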

> > *This is what open source winning looks like.*

[Snip various details here. I think Fabio's points are very good.]

> > We could also attach other metadata to the packages in the cache. Maybe some
> > popularity, update frequency from Cargo.io, but also package review flags:
> > checked license against source, and whatever other auditing we think should
> > be done. This moves the focus from specfile-correctness to the package
> > itself, and the effort from packaging to reviewing. (I'd suggest that for
> the experiment, we not make any deep auditing mandatory, but instead
> > encouraged.) And these flags should be able to be added by anyone in the
> > Rust SIG, not necessarily just at import.
> 
> This is already the case, though?
> Writing a spec file for a new crate is already automated to the point
> where "standard" crates can be 100% automatically generated and need
> zero manual edits.
> If manual changes *are* required, then these changes would also be
> required in the "first-class crate artifact" scenario, so you don't
> gain anything.
> And if there's other problems that are caught during package review,
> the distribution mechanism doesn't matter, either.
> 
> In my experience, changing the distribution mechanism or packaging
> paradigm will often make things *worse* instead of better. For
> example, the implosion of the NodeJS package ecosystem in Fedora was
> not only caused by the horrid state NPM, but also because the new
> packaging guidelines which prefer bundling essentially made it
> impossible for packagers to verify that objectionable content is
> present in vendored dependencies. For Java, Modularity was seen as a
> "solution", but the result was that basically everybody - except for
> the Red Hat maintainers who maintained the modules - just stopped
> doing Java packaging because of the hostile environment.
> 
> > Fedora _needs_ to adapt to stay relevant in the world where every language
> > stack has developed a packaging ecosystem which effectively ignores us. Some
> > of them are missing lessons they could have learned, ah well — but they also
> > have a lot of nice new ideas we're missing. And, no matter what we think,
> > we're clearly not going to stop them.
> >
> > Rust packaging seems like a great place to lead the way — and then we can
> > maybe expand to Go, which has similar issues, and then Java (where, you
> > know, things have already collapsed despite heroic effort.)
> 
> Oh, actually, I don't think Rust packaging is a good place to start
> here at all. :)
> 
> The way cargo works already maps very neatly onto how RPM packages
> work, which is definitely *not* the case for other language
> ecosystems. I also think we could even massively improve handling of
> "large" projects with many sub-components (like bevy, zola, wasmtime,
> deno, etc.) - which are currently the only projects that are "painful"
> to package - *without* completely changing the underlying packaging
> paradigm or distribution mechanism. (I've been wanting to actually
> write better tooling for this use case, but alas, Bachelor thesis is
> more important for now.)
> 
> Given that, I think we're actually in kind of a *good* situation with
> Rust packaging, especially compared to other language ecosystems - not
> only right now, but also looking at the future. And looking at the
> alternatives, all attempts at trying different approaches (maven
> artifacts in koji, vendoring NodeJS dependencies, Java Modules, etc.)
> have *failed* and ultimately made things worse instead of improving
> the situation - the only thing that has proven to be sustainable (for
> now) is ... maybe surprisingly, plain RPM packages.

Yep. I expect some power-law distribution of package popularity. It
seems like there's an endless supply of crates, but most just don't
matter. If we get some 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-22 Thread Demi Marie Obenour
On 10/20/22 10:01, Neal Gompa wrote:
> On Thu, Oct 20, 2022 at 9:39 AM Kevin Kofler via devel 
>  wrote:
>> Rust needs to adapt to become relevant in GNU/Linux distributions.
>>
> 
> There is nobody pushing for Rust to improve anymore. When Igor and I
> were building out Fedora Rust, we did some engagement with Rust
> upstream about stabilizing Rust's ABI so we could ship dynamic
> libraries. While one or two members of the Rust core team were
> sympathetic, most of the Rust community attacked me for "trying to
> make Rust like C" and I got crap from people on the Rust community
> channels, Twitter, and other places. Eventually, I flamed out because
> there's only so much punishment someone can take over it.

Personal attacks are not okay.

> A couple of years ago, there was a revival of the topic[1], but it
> went nowhere again.
> 
> Until the situation changes, I'm very firmly anti-Rust. Unfortunately,
> sometimes I have no choice but to deal with it.
> 
> [1]: https://github.com/slightknack/rust-abi-wiki

There are a couple major constraints that apply to Rust:

1. Rust implements generics via monomorphisation.  This means that e.g. Vec<T>
   *has no ABI at all*: code will only be generated for Vec<T> when it is
   instantiated (see the sketch after this list).

2. Rust relies heavily on cross-crate inlining to get good performance.  If e.g.
   Option::map is not inlined into the caller, performance will be terrible.
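
To make (1) concrete, a minimal sketch (illustrative code only, not from any
particular crate):

    // A generic function has no machine code of its own; the compiler emits
    // a separate copy for every concrete T it is instantiated with.
    fn largest<T: PartialOrd>(items: &[T]) -> &T {
        let mut best = &items[0];
        for item in items {
            if item > best {
                best = item;
            }
        }
        best
    }

    fn main() {
        // Two instantiations -> two separately compiled functions, so there
        // is no single symbol a shared library could export for `largest`.
        println!("{}", largest(&[1, 5, 3]));       // largest::<i32>
        println!("{}", largest(&["a", "c", "b"])); // largest::<&str>
    }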

Neither of those constraints is unique to Rust.  C++ template libraries have
exactly the same problems.  It’s just that we think of C++ headers as being
the interface to a library, instead of the implementation.  Trying to ship
the implementation of types like Vec<T> and Option<T> in a shared library
makes no more sense than doing so with C++’s std::vector and std::optional,
which is to say, none at all.

Another factor is that maintaining a stable ABI severely limits library
evolution.  In C++, ABI stability has resulted in std::regex being *far*
slower than third-party regular expression libraries.  I am not surprised
that Rust wants to avoid similar problems.
-- 
Sincerely,
Demi Marie Obenour (she/her/hers)


Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-21 Thread Blaise Pabon
To Neal's point, I had the audacity to suggest some improvements along the
lines of docutils and the response was underwhelming.
https://users.rust-lang.org/t/rust-analog-to-the-python-compilers-docutils/82813?u=blaisep

On Thu, Oct 20, 2022 at 10:02 AM Neal Gompa  wrote:

> On Thu, Oct 20, 2022 at 9:39 AM Kevin Kofler via devel
>  wrote:
> >
> > Matthew Miller wrote:
> > > I fundamentally disagree with Kevin on a deep level about "entirely
> > > useless", but ... find myself kind of agreeing about the "unpackagable"
> > > part. I mean: clearly we've found a way, but I'm really not sure we're
> > > providing a lot of _value_ in this approach, and I'm also not sure
> it's as
> > > successful as it could be.
> >
> > We cannot ship what we cannot package. (Just repackaging upstream blobs
> is a
> > no-go, it is explicitly banned by Fedora Packaging Guidelines for good
> > reasons.)
> >
> > > I'll start with the second: our system for Rust doesn't really do that.
> > > Developers are going to use cargo and crates.io and we're not going to
> > > convince them that they should do otherwise. (I don't expect anyone
> > > disagrees with this.)
> >
> > That is one of the issues (along with lack of proper shared library
> support)
> > that makes Rust such a painful language: it makes it way too easy for
> > developers to just add a dependency on some random unpackaged (by
> > distributions) library with some random license and with random
> transitive
> > dependencies.
> >
> > In a reasonable programming language, you have to think twice before you
> add
> > yet another dependency to your project. Tools like cargo make it way too
> > easy, just add a line and rebuild. So then you end up with a mess like
> this:
> >
> > > But running `cargo fetch` with a clean cache pulls down *390* crates.
> >
> > 390 dependencies is absolutely insane! No project should ever depend on
> so
> > many libraries. This is completely unmaintainable.
> >
> > > *This is what open source winning looks like.*
> > >
> > > I remember a Byte magazine article from 1990 (I just checked!) with
> > > the title "There Is a Silver Bullet: The birth of interchangeable,
> > > reusable software components will bring software into the information
> > > age". [1] This was about the newly-hot idea of Object Oriented
> > > Programming. It was very exciting. But, of course, that vision of the
> > > world did not happen. It turns out proprietary software *can't* do
> this.
> > >
> > > But now we have it! I don't have to reinvent every basic wheel — but
> even
> > > more than that, I do not have to be an expert in the intricacies of
> safe
> > > concurrency to write an app that uses that under the hood. That's
> amazing!
> > > I can do such powerful things from high-level interfaces and trust the
> > > expertise of those who really understand the deep computer science
> some of
> > > this requires.
> >
> > Code reuse is great when it makes sense, but this "NPM culture" where
> > developers happily depend on tons of packages containing trivial one-line
> > functions is completely insane.
> >
> > If this is now the essence of "Open Source", then it will ultimately
> lose.
> > Proprietary software companies will just be waiting for the next paradigm
> > shift to lock us into their proprietary ecosystem again. For a large
> part,
> > this has already happened with smartphones: you get to choose only
> between
> > the Android walled garden (with an open core and lots of proprietary
> Google
> > services and (Google and third-party) apps on top) and the iOS walled
> garden
> > (which is completely proprietary and closed, even strictly enforcing an
> App
> > Store monopoly). We are struggling to escape from that with devices like
> the
> > PinePhone. But with development practices relying on an unmaintainable
> > dependency hell, it will never happen.
> >
> > > I am competent enough to write a silly toy game using Bevy. It might be
> > > good enough that others will enjoy it. *I am not competent to maintain
> > > many of these dependencies.* I don't even know what most of them DO.
> > > "anyhow"? "bytemuck"?
> >
> > Welcome to dependency hell!
> >
> > > Worse, many of the Bevy deps are specified with exact versions. Maybe I
> > > could make the package work with the packaged versions, but ... that
> > > requires deep expertise and even then might lead to unexpected behavior
> > > and has a high chance of putting me at odds with both the engine
> upstream
> > > and any other games which use it. The packaging guidelines say that I
> > > SHOULD create patches to update to latest versions of dependencies, and
> > > that I should further convince the upstream to take them. Candidly,
> that
> > > seems like a waste of everyone's time.
> >
> > Hardcoding exact versions of dependencies is one of the worst
> misfeatures of
> > those language-specific build tools. It effectively prevents you from
> fixing
> > security issues without patching every single application. And of 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-20 Thread Neal Gompa
On Thu, Oct 20, 2022 at 9:39 AM Kevin Kofler via devel
 wrote:
>
> Matthew Miller wrote:
> > I fundamentally disagree with Kevin on a deep level about "entirely
> > useless", but ... find myself kind of agreeing about the "unpackagable"
> > part. I mean: clearly we've found a way, but I'm really not sure we're
> > providing a lot of _value_ in this approach, and I'm also not sure it's as
> > successful as it could be.
>
> We cannot ship what we cannot package. (Just repackaging upstream blobs is a
> no-go, it is explicitly banned by Fedora Packaging Guidelines for good
> reasons.)
>
> > I'll start with the second: our system for Rust doesn't really do that.
> > Developers are going to use cargo and crates.io and we're not going to
> > convince them that they should do otherwise. (I don't expect anyone
> > disagrees with this.)
>
> That is one of the issues (along with lack of proper shared library support)
> that makes Rust such a painful language: it makes it way too easy for
> developers to just add a dependency on some random unpackaged (by
> distributions) library with some random license and with random transitive
> dependencies.
>
> In a reasonable programming language, you have to think twice before you add
> yet another dependency to your project. Tools like cargo make it way too
> easy, just add a line and rebuild. So then you end up with a mess like this:
>
> > But running `cargo fetch` with a clean cache pulls down *390* crates.
>
> 390 dependencies is absolutely insane! No project should ever depend on so
> many libraries. This is completely unmaintainable.
>
> > *This is what open source winning looks like.*
> >
> > I remember a Byte magazine article from 1990 (I just checked!) with
> > the title "There Is a Silver Bullet: The birth of interchangeable,
> > reusable software components will bring software into the information
> > age". [1] This was about the newly-hot idea of Object Oriented
> > Programming. It was very exciting. But, of course, that vision of the
> > world did not happen. It turns out proprietary software *can't* do this.
> >
> > But now we have it! I don't have to reinvent every basic wheel — but even
> > more than that, I do not have to be an expert in the intricacies of safe
> > concurrency to write an app that uses that under the hood. That's amazing!
> > I can do such powerful things from high-level interfaces and trust the
> > expertise of those who really understand the deep computer science some of
> > this requires.
>
> Code reuse is great when it makes sense, but this "NPM culture" where
> developers happily depend on tons of packages containing trivial one-line
> functions is completely insane.
>
> If this is now the essence of "Open Source", then it will ultimately lose.
> Proprietary software companies will just be waiting for the next paradigm
> shift to lock us into their proprietary ecosystem again. For a large part,
> this has already happened with smartphones: you get to choose only between
> the Android walled garden (with an open core and lots of proprietary Google
> services and (Google and third-party) apps on top) and the iOS walled garden
> (which is completely proprietary and closed, even strictly enforcing an App
> Store monopoly). We are struggling to escape from that with devices like the
> PinePhone. But with development practices relying on an unmaintainable
> dependency hell, it will never happen.
>
> > I am competent enough to write a silly toy game using Bevy. It might be
> > good enough that others will enjoy it. *I am not competent to maintain
> > many of these dependencies.* I don't even know what most of them DO.
> > "anyhow"? "bytemuck"?
>
> Welcome to dependency hell!
>
> > Worse, many of the Bevy deps are specified with exact versions. Maybe I
> > could make the package work with the packaged versions, but ... that
> > requires deep expertise and even then might lead to unexpected behavior
> > and has a high chance of putting me at odds with both the engine upstream
> > and any other games which use it. The packaging guidelines say that I
> > SHOULD create patches to update to latest versions of dependencies, and
> > that I should further convince the upstream to take them. Candidly, that
> > seems like a waste of everyone's time.
>
> Hardcoding exact versions of dependencies is one of the worst misfeatures of
> those language-specific build tools. It effectively prevents you from fixing
> security issues without patching every single application. And of course it
> can cause version conflicts between different applications or even between
> transitive dependencies within an application. (And, while the former ones
> can typically be resolved by shipping both versions of the library, the
> latter ones cannot, because you cannot usually link two versions of the same
> library into the same program.)
>
> > So, going back to Kevin's point: it _does_ feel like this is unpackagable.
> > But that's because the barrier to participation 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-20 Thread Kevin Kofler via devel
Matthew Miller wrote:
> I fundamentally disagree with Kevin on a deep level about "entirely
> useless", but ... find myself kind of agreeing about the "unpackagable"
> part. I mean: clearly we've found a way, but I'm really not sure we're
> providing a lot of _value_ in this approach, and I'm also not sure it's as
> successful as it could be.

We cannot ship what we cannot package. (Just repackaging upstream blobs is a 
no-go, it is explicitly banned by Fedora Packaging Guidelines for good 
reasons.)

> I'll start with the second: our system for Rust doesn't really do that.
> Developers are going to use cargo and crates.io and we're not going to
> convince them that they should do otherwise. (I don't expect anyone
> disagrees with this.)

That is one of the issues (along with lack of proper shared library support) 
that makes Rust such a painful language: it makes it way too easy for 
developers to just add a dependency on some random unpackaged (by 
distributions) library with some random license and with random transitive 
dependencies.

In a reasonable programming language, you have to think twice before you add 
yet another dependency to your project. Tools like cargo make it way too 
easy, just add a line and rebuild. So then you end up with a mess like this:

> But running `cargo fetch` with a clean cache pulls down *390* crates.

390 dependencies is absolutely insane! No project should ever depend on so 
many libraries. This is completely unmaintainable.

> *This is what open source winning looks like.*
> 
> I remember a Byte magazine article from 1990 (I just checked!) with
> the title "There Is a Silver Bullet: The birth of interchangeable,
> reusable software components will bring software into the information
> age". [1] This was about the newly-hot idea of Object Oriented
> Programming. It was very exciting. But, of course, that vision of the
> world did not happen. It turns out proprietary software *can't* do this.
> 
> But now we have it! I don't have to reinvent every basic wheel — but even
> more than that, I do not have to be an expert in the intricacies of safe
> concurrency to write an app that uses that under the hood. That's amazing!
> I can do such powerful things from high-level interfaces and trust the
> expertise of those who really understand the deep computer science some of
> this requires.

Code reuse is great when it makes sense, but this "NPM culture" where 
developers happily depend on tons of packages containing trivial one-line 
functions is completely insane.

If this is now the essence of "Open Source", then it will ultimately lose. 
Proprietary software companies will just be waiting for the next paradigm 
shift to lock us into their proprietary ecosystem again. For a large part, 
this has already happened with smartphones: you get to choose only between 
the Android walled garden (with an open core and lots of proprietary Google 
services and (Google and third-party) apps on top) and the iOS walled garden 
(which is completely proprietary and closed, even strictly enforcing an App 
Store monopoly). We are struggling to escape from that with devices like the 
PinePhone. But with development practices relying on an unmaintainable 
dependency hell, it will never happen.

> I am competent enough to write a silly toy game using Bevy. It might be
> good enough that others will enjoy it. *I am not competent to maintain
> many of these dependencies.* I don't even know what most of them DO.
> "anyhow"? "bytemuck"?

Welcome to dependency hell!

> Worse, many of the Bevy deps are specified with exact versions. Maybe I
> could make the package work with the packaged versions, but ... that
> requires deep expertise and even then might lead to unexpected behavior
> and has a high chance of putting me at odds with both the engine upstream
> and any other games which use it. The packaging guidelines say that I
> SHOULD create patches to update to latest versions of dependencies, and
> that I should further convince the upstream to take them. Candidly, that
> seems like a waste of everyone's time.

Hardcoding exact versions of dependencies is one of the worst misfeatures of 
those language-specific build tools. It effectively prevents you from fixing 
security issues without patching every single application. And of course it 
can cause version conflicts between different applications or even between 
transitive dependencies within an application. (And, while the former ones 
can typically be resolved by shipping both versions of the library, the 
latter ones cannot, because you cannot usually link two versions of the same 
library into the same program.)

> So, going back to Kevin's point: it _does_ feel like this is unpackagable.
> But that's because the barrier to participation seems too high.

No, the answer to upstream shipping an unpackageable mess cannot possibly be 
to make it easier to smuggle unpackageable messes into Fedora!

> It's not because it's statically-linked 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-20 Thread Blaise Pabon
Lots of good wisdom here, thank you. IMHO Rust will benefit from whatever
"adult supervision" Fedora can provide.
-Blaise (currently undergoing treatment for injuries sustained supporting
npm in production)

On Wed, Oct 19, 2022 at 7:05 AM Fabio Valentini 
wrote:

> On Wed, Oct 19, 2022 at 11:25 AM Matthew Miller
>  wrote:
> >
> > I _very much_ appreciate all the work you and the other Rust SIG folks
> > (Igor and Zbyszek in particular but I'm sure others as well!) have put
> into
> > packaging rust apps and crates and all of the systems around that.
>
> I'll respond inline.
>
> > I fundamentally disagree with Kevin on a deep level about "entirely
> > useless", but ... find myself kind of agreeing about the "unpackagable"
> > part. I mean: clearly we've found a way, but I'm really not sure we're
> > providing a lot of _value_ in this approach, and I'm also not sure it's
> as
> > successful as it could be.
>
> We *do* provide value to both users *and* developers by doing things
> the way we do, but the benefits might not be obvious to people who
> don't know how (Rust) packaging works, and what we as package
> maintainers do.
>
> > There are three ways having things packaged in Fedora repos _can_ be
> > helpful:
> >
> > 1. End-user applications and tools
> > 2. Useful development environment
> > 3. As convenience for ourselves for building packages for #1 or #2
> >
> > I am not discounting the value of #3 -- making a shared thing that we all
> > work on together is kind of the whole point, and the nicer we can make
> that
> > the better we can bring in more people, and those of us already here
> have a
> > lighter load and can work on the things we're most interested in. But
> > ultimately, we're doing it so we make a useful system for users. That
> means
> > the first two.
>
> This I can agree with :)
>
> > I'll start with the second: our system for Rust doesn't really do that.
> > Developers are going to use cargo and crates.io and we're not going to
> > convince them that they should do otherwise. (I don't expect anyone
> > disagrees with this.)
>
> This is true, and probably also not "fixable". We need to make some
> amount of non-upstreamable patches to some crates (most notably,
> removing Windows- or mac OS-specific dependencies, because we don't
> want to package those), but in some cases, these are "incompatible"
> changes, and Rust *developers* should not be targeting our downstream
> sources that have these differences with actual upstream sources.
>
> This is due to a limitation of how cargo handles target-specific
> dependencies - all dependencies that are *mentioned in any way* need
> to be *present* for it to resolve dependencies / enabled optional
> features / update its lockfile etc. But since we don't want to package
> bindings for Windows and mac OS system APIs, we need to actually patch
> them out, otherwise builds will fail.
>
> > We're doing okay with #1, but... I think #3 _even_ with all of the work
> in
> > Rust-to-RPM packaging isn't sufficient. I've played with the Bevy game
> > engine and will probably have a few things it would be nice to package to
> > make available in Fedora Linux. I might not even mind maintaining Bevy
> > itself.
>
> Somebody actually already started packaging Bevy components - some
> packages are already approved and some are still pending review. Not
> sure what the progress has been there, but it's not *impossible*.
>
> > But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> > these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> > which is *amazing*. But... that still is hundreds that I'd have to add.
> And
> > mostly they are things I don't know _anything_ about.
>
> You must realize that this is an extreme case. For many Rust
> applications that people want to package for Fedora, the number of
> dependencies that are missing is rather small, *because* most popular
> libraries are already packaged.
>
> Bevy is a bit special, because it (presumably) pulls in lots of GPU /
> OpenGL / Vulkan related libraries, which we didn't need to package for
> anything else yet, and it's also split into dozens of small libraries
> itself, which can be painful to package, that is true.
>
> We might need to reconsider how to package projects like this. I'm
> pretty sure we could find a way to package them in a way that's
> compatible with how we're currently doing things but would be much
> less busywork.
>
> > *This is what open source winning looks like.*
> >
> > I remember a Byte magazine article from 1990 (I just checked!) with
> the
> > title "There Is a Silver Bullet: The birth of interchangeable, reusable
> > software components will bring software into the information age". [1]
> > This was about the newly-hot idea of Object Oriented Programming. It was
> > very exciting. But, of course, that vision of the world did not happen.
> It
> > turns out proprietary software *can't* do this.
> >
> > But now we have it! I 

Re: musings on rust packaging [was Re: F38 proposal: RPM Sequoia (System-Wide Change proposal)]

2022-10-19 Thread Fabio Valentini
On Wed, Oct 19, 2022 at 11:25 AM Matthew Miller
 wrote:
>
> I _very much_ appreciate all the work you and the other Rust SIG folks
> (Igor and Zbyszek in particular but I'm sure others as well!) have put into
> packaging rust apps and crates and all of the systems around that.

I'll respond inline.

> I fundamentally disagree with Kevin on a deep level about "entirely
> useless", but ... find myself kind of agreeing about the "unpackagable"
> part. I mean: clearly we've found a way, but I'm really not sure we're
> providing a lot of _value_ in this approach, and I'm also not sure it's as
> successful as it could be.

We *do* provide value to both users *and* developers by doing things
the way we do, but the benefits might not be obvious to people who
don't know how (Rust) packaging works, and what we as package
maintainers do.

> There are three ways having things packaged in Fedora repos _can_ be
> helpful:
>
> 1. End-user applications and tools
> 2. Useful development environment
> 3. As convenience for ourselves for building packages for #1 or #2
>
> I am not discounting the value of #3 -- making a shared thing that we all
> work on together is kind of the whole point, and the nicer we can make that
> the better we can bring in more people, and those of us already here have a
> lighter load and can work on the things we're most interested in. But
> ultimately, we're doing it so we make a useful system for users. That means
> the first two.

This I can agree with :)

> I'll start with the second: our system for Rust doesn't really do that.
> Developers are going to use cargo and crates.io and we're not going to
> convince them that they should do otherwise. (I don't expect anyone
> disagrees with this.)

This is true, and probably also not "fixable". We need to make some
amount of non-upstreamable patches to some crates (most notably,
removing Windows- or mac OS-specific dependencies, because we don't
want to package those), but in some cases, these are "incompatible"
changes, and Rust *developers* should not be targeting our downstream
sources that have these differences with actual upstream sources.

This is due to a limitation of how cargo handles target-specific
dependencies - all dependencies that are *mentioned in any way* need
to be *present* for it to resolve dependencies / enabled optional
features / update its lockfile etc. But since we don't want to package
bindings for Windows and mac OS system APIs, we need to actually patch
them out, otherwise builds will fail.
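
As a sketch, the kind of section we end up patching out looks roughly like
this in an upstream Cargo.toml (versions are placeholders):

    # only used when building for Windows or macOS, but cargo still insists
    # on resolving these before it will do anything else
    [target.'cfg(windows)'.dependencies]
    winapi = { version = "0.3", features = ["winuser"] }

    [target.'cfg(target_os = "macos")'.dependencies]
    core-foundation = "0.9"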

> We're doing okay with #1, but... I think #3 _even_ with all of the work in
> Rust-to-RPM packaging isn't sufficient. I've played with the Bevy game
> engine and will probably have a few things it would be nice to package to
> make available in Fedora Linux. I might not even mind maintaining Bevy
> itself.

Somebody actually already started packaging Bevy components - some
packages are already approved and some are still pending review. Not
sure what the progress has been there, but it's not *impossible*.

> But running `cargo fetch` with a clean cache pulls down *390* crates. Of
> these, it looks like 199 (!) are already packaged as rust-[crate]-devel,
> which is *amazing*. But... that still is hundreds that I'd have to add. And
> mostly they are things I don't know _anything_ about.

You must realize that this is an extreme case. For many Rust
applications that people want to package for Fedora, the number of
dependencies that are missing is rather small, *because* most popular
libraries are already packaged.

Bevy is a bit special, because it (presumably) pulls in lots of GPU /
OpenGL / Vulkan related libraries, which we didn't need to package for
anything else yet, and it's also split into dozens of small libraries
itself, which can be painful to package, that is true.

We might need to reconsider how to package projects like this. I'm
pretty sure we could find a way to package them in a way that's
compatible with how we're currently doing things but would be much
less busywork.

> *This is what open source winning looks like.*
>
> I remember a Byte magazine article from 1990 (I just checked!) with the
> title "There Is a Silver Bullet: The birth of interchangeable, reusable
> software components will bring software into the information age". [1]
> This was about the newly-hot idea of Object Oriented Programming. It was
> very exciting. But, of course, that vision of the world did not happen. It
> turns out proprietary software *can't* do this.
>
> But now we have it! I don't have to reinvent every basic wheel — but even
> more than that, I do not have to be an expert in the intricacies of safe
> concurrency to write an app that uses that under the hood. That's amazing! I
> can do such powerful things from high-level interfaces and trust the
> expertise of those who really understand the deep computer science some of
> this requires.
>
> I am competent enough to write a silly toy game using Bevy. It might be good
> enough