This is intentional change and the explanation is here in the [commit
message](https://github.com/rpm-software-management/rpm/commit/9343ecd94cd873e6dc1c06428975163cbb9cf9af).
Since you are using a new RPM to build an SRPM that targets an older system,
it is probably time to find some
that's a good start at least. There's also --importdb. So with slightly more
code that could sync back and forth with a directory
--
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
@lnussel If you want that, just write a post-transaction trigger to run `rpm
--exportdb` somewhere...
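The trigger idea above might look roughly like the following spec fragment. This is a sketch only: the export path is made up for illustration, and it assumes an rpm new enough to have file triggers and `--exportdb` (4.16+).

```spec
# Sketch only, not a shipped recipe: a transaction file trigger that dumps
# the rpmdb as a header stream whenever something under /var/lib/rpm
# changes. The path /var/lib/rpmdb-export is hypothetical.
%transfiletriggerin -- /var/lib/rpm
rpm --exportdb > /var/lib/rpmdb-export/headers || :
```

Pairing this with `rpm --importdb` would, in principle, let you rebuild a database from such a directory dump, which is the "sync back and forth" idea mentioned above.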
--
Note that installing the headers on disk as part of package installation does
not exclude actually using one of the existing database formats.
--
Closed #1151.
--
Would it be possible to implement a "filesystem" database backend? Most likely.
We've even tossed it around a few times between ourselves, but there's just
very little benefit and all manner of downsides, such as those @Conan-Kudo
already mentioned. I wouldn't call locking, atomic operations
> I have no insight to the other database formats so can't comment on how that
> is handled. For setups that never modify the running system but rather either
> prepare images or modify snapshots the transactional capabilities of rpm do
> not matter anyways. If anything goes wrong, no new
> * People can and will randomly manipulate files to force the package
> manager to do weird things (it's even documented in various troubleshooting
> guides)
Well, obfuscating the database just to keep people from messing with
it doesn't sound like a particularly good motivation. RPM
Thanks for merging that part. Anyway, those generators are very simple and I
think having them as parametric ones makes sense. Related to this, I have not
seen your opinion on moving these things to an rpm-extras repository, actually
starting to release them, and informing distributions. WDYT?
--
Oh and FWIW, the double-buildroot bugfix was merged separately via #1165.
--
This part of the logic may be what causes the error:
https://github.com/rpm-software-management/rpm/blob/10127cdb2364de2c1408950a25b730920e665689/rpmio/macro.c#L638-L641
Here is some test code exercising that logic.
debuginfo deals with any number of files *per package*, whereas there generally
is just one metainfo/desktop file in a single package. A single fork+exec in
the context of a build is lost in the noise, but when you deal in dozens it
starts to add up.
--
Thanks @ignatenkobrain for spotting this and for the patch.
I must've just seen runCmd() being passed buildRoot and thought it was
there to be prepended to the path, "obviously", when it's actually being passed
down a dozen layers or so just to set the RPM_BUILD_ROOT environment variable.
Doh.
:roll_eyes:
I
Merged #1165 into master.
--
Closed #1162 via #1165.
--
The generator mechanism needs to learn to deal with multiple files in one go,
preserving the per-file nature of the resulting data. We're not making any
changes that would make that transition any more difficult than it is, so this
is in no way 4.16 material.
In principle, while some new
@pmatilai if debuginfo qualifies, why would the metainfo one not? They are
essentially doing the very same simple thing.
Another point in favor is speeding up the generators. The desktop one is
very simple and can save a bunch of fork()s, so why not replace it?
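For context, "parametric" here means the generator is a macro that rpmbuild expands in-process, instead of an external helper script that gets fork+exec'ed per file. A sketch of what such a file-attribute fragment could look like (all names are invented for illustration; this is not the real desktop-file generator):

```spec
# Hypothetical .attr fragment: the provides generator is a parametric
# macro expanded in-process, so no helper process is forked per file.
%__mydesktop_path ^%{_datadir}/applications/.*\.desktop$
%__mydesktop_provides() %{lua: print("application(" ..
    rpm.expand("%{basename:%1}") .. ")")}
```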
--
Submitting @ignatenkobrain's patch as a separate PR because this is a clear
and simple bugfix that deserves to be fixed fast, regardless of any other
changes.
--
fn already contains the full path to the file, so there is no need to prepend
it once more. This was actually breaking things.
Before:
D: Calling %{__pythonname_provides %{?__pythonname_provides_opts}}() on
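To illustrate the bug pattern (this is a sketch with an invented helper, not the actual rpm code): prepending buildRoot to an fn that already carries the full path doubles the prefix, so the join has to be guarded.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical sketch, not the real rpm code: join buildRoot and fn,
 * but skip the prepend when fn already carries the buildRoot prefix. */
static char *join_path(const char *buildRoot, const char *fn)
{
    size_t rlen = strlen(buildRoot);
    char *out = malloc(rlen + strlen(fn) + 2);
    if (out == NULL)
        return NULL;
    if (strncmp(fn, buildRoot, rlen) == 0) {
        /* fn already contains the full path: use it as-is */
        strcpy(out, fn);
    } else {
        /* otherwise prepend buildRoot, inserting a slash if needed */
        sprintf(out, "%s%s%s", buildRoot, (fn[0] == '/') ? "" : "/", fn);
    }
    return out;
}
```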
Mm.
We don't want to convert everything to parametric macros just because we *can*,
because they're harder to debug and otherwise work on for the casual
observer. Only convert those where it actually makes a difference: those that
affect a large number of files per package. Of these,