Re: wip-ungrafting builds stuck

2021-04-30 Thread Leo Famulari
On Fri, Apr 30, 2021 at 06:32:54PM +0200, Ludovic Courtès wrote:
> I’ve just merged ‘wip-ungrafting’ in master!  It was at 76% according to
> <https://ci.guix.gnu.org/jobset/ungrafting>, with mostly ARM builds
> missing compared to ‘master’.  Now we have fresh binaries to download
> (or build)!

Hooray!

> Thanks Leo for getting it into shape!

Hopefully it is in good shape. All I did was let the build farm work
and then upgrade / reconfigure my x86_64 systems based on it.

I don't expect any problems. I decided what to ungraft based on the
criteria explained in this thread:

https://lists.gnu.org/archive/html/guix-devel/2021-04/msg00331.html



Re: wip-ungrafting builds stuck

2021-04-30 Thread Ludovic Courtès
Hey there!

I’ve just merged ‘wip-ungrafting’ in master!  It was at 76% according to
<https://ci.guix.gnu.org/jobset/ungrafting>, with mostly ARM builds
missing compared to ‘master’.  Now we have fresh binaries to download
(or build)!

For the record, ‘wip-ungrafting’ was merged in ‘version-1.3.0’ a few
days ago already.

Thanks Leo for getting it into shape!

Ludo’.



Re: Why is glib still grafted on the 'wip-ungrafting' branch? (was Re: wip-ungrafting builds stuck)

2021-04-22 Thread Leo Famulari
On Thu, Apr 22, 2021 at 12:27:52PM -0400, Mark H Weaver wrote:
> I don't understand why it's relevant how many patches are involved.  It
> sounds like if I had concatenated all of the CVE-2021-27219 patches into
> a single file, you would have judged that as "simple", and therefore
> ungrafted it, although it makes no substantive difference.

I know you understand the subtle risks of grafting compared to
rebuilding packages with the grafted changes. Even when something works
as a graft, or seems to work as a graft, there is no guarantee that it
will keep working once we absorb the graft and rebuild all dependent
packages.
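
To make that concrete, a graft is just a 'replacement' field on a
package.  A minimal sketch, with made-up package and patch names (not
taken from the actual tree):

  (define-public libfoo
    (package
      (name "libfoo")
      (version "1.2.3")
      ;; ...source, build system, and inputs elided...
      (replacement libfoo/fixed)))

  ;; Same package plus the security fix.  Guix rewrites references in
  ;; already-built dependents to point at this variant, *without*
  ;; rebuilding them.
  (define libfoo/fixed
    (package
      (inherit libfoo)
      (source (origin
                (inherit (package-source libfoo))
                (patches (search-patches "libfoo-CVE-XXXX-YYYY.patch"))))))

Absorbing the graft means applying the patch to 'libfoo' itself,
dropping the 'replacement' field, and rebuilding every dependent; that
rebuild is exactly where problems hidden by the graft can surface.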

I decided to use this "simple change" heuristic based on my own
experience working with grafts. Experience grants intuition, and my
intuition tells me that grafts with fewer lines of changed code are
less likely to cause build failures or to change the behaviour of a
package beyond the desired security fix.

Remember, the goal of this branch was to attempt to *quickly* absorb
some grafts. I had to use a heuristic approach, both in deciding which
grafts to absorb and in explaining my decisions to you (I did not
expect you to misunderstand).

I could have told you that I selected these grafts based on "number of
lines of changed code", but it was easier to write "number of patches".

If you had concatenated those patches, I would have noticed that the
file was gigantic and chosen not to ungraft it at this time.

And to preempt the reply that you are sure to send, yes, I actually
looked at the content of the patches when making my decisions.



Re: Why is glib still grafted on the 'wip-ungrafting' branch? (was Re: wip-ungrafting builds stuck)

2021-04-22 Thread Mark H Weaver
Hi Leo,

Leo Famulari  writes:

> On Wed, Apr 21, 2021 at 04:47:06PM -0400, Mark H Weaver wrote:
>> I just noticed that 'glib' is still grafted on the 'wip-ungrafting'
>> branch.  Was that intentional?
>> 
>> https://git.sv.gnu.org/cgit/guix.git/tree/gnu/packages/glib.scm?h=wip-ungrafting&id=e12210dc92098d8581cea3007d57dbb6be16bb41#n171
>
> Yes. For that branch I only selected grafts that I judged to be
> "simple". There are many other grafts still in place on that branch.

Okay, thanks for the explanation.

> My criteria for simplicity are grafts that either apply one or two
> patches, or are minor version upgrades of projects that are known to
> care about ABI compatibility.

I don't understand why it's relevant how many patches are involved.  It
sounds like if I had concatenated all of the CVE-2021-27219 patches into
a single file, you would have judged that as "simple", and therefore
ungrafted it, although it makes no substantive difference.

Anyway, it makes no difference to me; I'll continue doing my own thing
on my private branch.  I just wanted to make sure that it wasn't an
oversight.

 Thanks,
   Mark



Re: Why is glib still grafted on the 'wip-ungrafting' branch? (was Re: wip-ungrafting builds stuck)

2021-04-21 Thread Leo Famulari
On Wed, Apr 21, 2021 at 04:47:06PM -0400, Mark H Weaver wrote:
> I just noticed that 'glib' is still grafted on the 'wip-ungrafting'
> branch.  Was that intentional?
> 
> https://git.sv.gnu.org/cgit/guix.git/tree/gnu/packages/glib.scm?h=wip-ungrafting&id=e12210dc92098d8581cea3007d57dbb6be16bb41#n171

Yes. For that branch I only selected grafts that I judged to be
"simple". There are many other grafts still in place on that branch.

My criteria for simplicity are grafts that either apply one or two
patches, or are minor version upgrades of projects that are known to
care about ABI compatibility.
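
As a hypothetical illustration of the second kind, an ABI-compatible
minor upgrade is grafted by pointing 'replacement' at an inherited
variant that only bumps the version (the names and hash below are
placeholders, not a real package):

  (define libbar/fixed
    (package
      (inherit libbar)
      (version "2.4.1")                 ;ABI-compatible bump from 2.4.0
      (source (origin
                (inherit (package-source libbar))
                (uri "mirror://example/libbar-2.4.1.tar.xz")
                ;; 52 zeros as a stand-in; the real hash would come
                ;; from 'guix download'.
                (sha256
                 (base32
                  "0000000000000000000000000000000000000000000000000000"))))))

together with '(replacement libbar/fixed)' on the unfixed libbar.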

We want to ungraft as much as possible for the upcoming release, to
improve performance of package operations.

However, we lack the time and humanpower to validate the ungrafting of
the more complicated grafts in time for the release. Some of the
remaining grafts should never have been made, in my opinion, and I want
to discuss our policies on this subject — after the release.

In any case, I'm not confident that we will include wip-ungrafting in
the release. The build failure rate of the wip-ungrafting branch is
higher than 10%, which I think is too high:

https://ci.guix.gnu.org/jobset/ungrafting



Why is glib still grafted on the 'wip-ungrafting' branch? (was Re: wip-ungrafting builds stuck)

2021-04-21 Thread Mark H Weaver
I just noticed that 'glib' is still grafted on the 'wip-ungrafting'
branch.  Was that intentional?

https://git.sv.gnu.org/cgit/guix.git/tree/gnu/packages/glib.scm?h=wip-ungrafting&id=e12210dc92098d8581cea3007d57dbb6be16bb41#n171

  Mark



Re: wip-ungrafting builds stuck

2021-04-20 Thread Maxim Cournoyer
Hi,

Mark H Weaver  writes:

> Mathieu Othacehe  writes:
>
>>> Any idea what could be wrong, Mathieu?  What would you suggest to do
>>> when investigating such issues?
>>
>> Yes I noticed it. The main problem here is that almost all workers are
>> stuck building Rust.
>>
>> I see two actions here:
>>
>> 1. Understand why Rust is taking so long to build.
>
> The attached patch, which I've been using on my private branch of Guix
> for a long time and is fully tested, would significantly speed up the
> Rust bootstrap.  I never submitted it because I wasn't sure it would be
> of interest.
>
>Mark

FWIW, the tests are already disabled on core-updates, which reduced the
build time by 33%.  Also, we now bootstrap from 1.29 there instead of
1.19.  The net result is a 50% faster bootstrap (8 hours) compared to
master (16 hours) on a Ryzen 3900X processor.  That's still a lot of
time.  I hope that the effort to produce a GCC front end for Rust
succeeds [0].  Otherwise, mrustc, which we already use to bootstrap
Rust, can now bootstrap from 1.39 (though that support is not yet in a
release), so we should look into making use of it [1].
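
For readers unfamiliar with the chain: each Rust release is built with
the previous one, so every link we drop at the bottom saves a full
compiler build.  Roughly, the helper in gnu/packages/rust.scm looks
like this (a from-memory sketch, not the exact code):

  (define (rust-bootstrapped-package base-rust version checksum)
    "Return a Rust package VERSION built with BASE-RUST, the previous
release in the chain."
    (package
      (inherit base-rust)
      (version version)
      (source (origin
                (inherit (package-source base-rust))
                (uri (string-append "https://static.rust-lang.org/dist/"
                                    "rustc-" version "-src.tar.gz"))
                (sha256 (base32 checksum))))
      (native-inputs
       ;; Compile with the previous release's rustc and cargo.
       (alist-replace "cargo-bootstrap" (list base-rust "cargo")
                      (alist-replace "rustc-bootstrap" (list base-rust)
                                     (package-native-inputs base-rust))))))

So bootstrapping from 1.29 instead of 1.19 removes ten intermediate
compiler builds, and an mrustc able to build 1.39 would remove ten more.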

Maxim

[0]  https://github.com/Rust-GCC/gccrs
[1]  https://github.com/thepowersgang/mrustc/



Re: wip-ungrafting builds stuck

2021-04-18 Thread Mark H Weaver
Mathieu Othacehe  writes:

>> Any idea what could be wrong, Mathieu?  What would you suggest to do
>> when investigating such issues?
>
> Yes I noticed it. The main problem here is that almost all workers are
> stuck building Rust.
>
> I see two actions here:
>
> 1. Understand why Rust is taking so long to build.

The attached patch, which I've been using on my private branch of Guix
for a long time and is fully tested, would significantly speed up the
Rust bootstrap.  I never submitted it because I wasn't sure it would be
of interest.

   Mark

From cfe7ed9732e77eacf7f9e6b6db0d731b2f7d100e Mon Sep 17 00:00:00 2001
From: Mark H Weaver 
Date: Sun, 19 Jul 2020 23:13:28 -0400
Subject: [PATCH] gnu: rust: Disable check phases of versions before 1.40.

This saves 49 hours of total build time on a Thinkpad X200.

* gnu/packages/rust.scm (rust-1.19)[arguments]: Add "#:tests? #f".
(rust-1.20, rust-1.26)[arguments]: Modify the custom 'check' phase to honor
the '#:tests?' argument.
(rust-1.40)[arguments]: Override '#:tests?' argument to be #t.
---
 gnu/packages/rust.scm | 40 ++++++++++++++++++++++++++--------------
 1 file changed, 26 insertions(+), 14 deletions(-)

diff --git a/gnu/packages/rust.scm b/gnu/packages/rust.scm
index 3952a17908..2b9f26c204 100644
--- a/gnu/packages/rust.scm
+++ b/gnu/packages/rust.scm
@@ -236,6 +236,8 @@ safety and thread safety guarantees.")
     (arguments
      `(#:imported-modules ,%cargo-utils-modules ;for `generate-all-checksums'
        #:modules ((guix build utils) (ice-9 match) (guix build gnu-build-system))
+       #:tests? #f  ;Disable the test suite for early versions of rust to save
+                    ;compile time.
        #:phases
        (modify-phases %standard-phases
          (add-after 'unpack 'set-env
@@ -596,11 +598,16 @@ jemalloc = \"" jemalloc "/lib/libjemalloc_pic.a" "\"
               (invoke "./x.py" "build")
               (invoke "./x.py" "build" "src/tools/cargo")))
           (replace 'check
-            (lambda* _
-              ;; Disable parallel execution to prevent EAGAIN errors when
-              ;; running tests.
-              (invoke "./x.py" "-j1" "test" "-vv")
-              (invoke "./x.py" "-j1" "test" "src/tools/cargo")
+            (lambda* (#:key (tests? #t) #:allow-other-keys)
+              (if tests?
+                  (let ((parallel-job-spec
+                         ;; Disable parallel execution to prevent EAGAIN
+                         ;; errors when running tests.
+                         "-j1"))
+                    (invoke "./x.py" parallel-job-spec "test" "-vv")
+                    (invoke "./x.py" parallel-job-spec "test"
+                            "src/tools/cargo"))
+                  (format #t "test suite not run~%"))
               #t))
           (replace 'install
             (lambda* (#:key outputs #:allow-other-keys)
@@ -774,15 +781,16 @@ jemalloc = \"" jemalloc "/lib/libjemalloc_pic.a" "\"
           ;; binaryen was replaced with LLD project from LLVM
           (delete 'dont-build-native)
           (replace 'check
-            (lambda* _
-              ;; Enable parallel execution.
-              (let ((parallel-job-spec
-                     (string-append "-j" (number->string
-                                          (min 4
-                                               (parallel-job-count))))))
-                (invoke "./x.py" parallel-job-spec "test" "-vv")
-                (invoke "./x.py" parallel-job-spec "test"
-                        "src/tools/cargo"))))
+            (lambda* (#:key (tests? #t) #:allow-other-keys)
+              (if tests?
+                  (let ((parallel-job-spec
+                         ;; Enable parallel execution.
+                         (format #f "-j~a" (min 4 (parallel-job-count)))))
+                    (invoke "./x.py" parallel-job-spec "test" "-vv")
+                    (invoke "./x.py" parallel-job-spec "test"
+                            "src/tools/cargo"))
+                  (format #t "test suite not run~%"))
+              #t))
           (replace 'remove-unsupported-tests
             (lambda* _
               ;; Our ld-wrapper cannot process non-UTF8 bytes in LIBRARY_PATH.
@@ -1226,6 +1234,10 @@ move around."
        ;; which makes this workaround only necessary for this release.
        (cons* #:validate-runpath? #f
               (substitute-keyword-arguments (package-arguments base-rust)
+                ((#:tests? _ #t)
+                 ;; Enable the test suite, which was disabled for earlier versions
+                 ;; of rust to save compile time.
+                 #t)
                 ((#:phases phases)
                  `(modify-phases ,phases
                     ;; We often need to patch tests with various Guix-specific paths.
-- 
2.31.1



Re: wip-ungrafting builds stuck

2021-04-18 Thread Mathieu Othacehe


Hello,

> Any idea what could be wrong, Mathieu?  What would you suggest to do
> when investigating such issues?

Yes I noticed it. The main problem here is that almost all workers are
stuck building Rust.

I see two actions here:

1. Understand why Rust is taking so long to build.

2. Implement the BuildDependencies table to avoid this kind of
situation, where all workers are stuck building the same thing, as
discussed here: https://issues.guix.gnu.org/46402.  (A rough sketch of
the idea follows.)
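
To give an idea, here is a hypothetical sketch in Guile of the check
such a table would enable.  This is not the actual Cuirass API; the
record type below is a made-up stand-in:

  (use-modules (srfi srfi-1) (srfi srfi-9))

  ;; Minimal stand-in for a build and its recorded dependencies.
  (define-record-type <build>
    (make-build derivation dependencies)
    build?
    (derivation   build-derivation)     ;e.g. "/gnu/store/...-rust.drv"
    (dependencies build-dependencies))  ;derivations it waits on

  (define (schedulable? build active-builds)
    "Return #t when none of BUILD's dependencies is already queued or
running on another worker."
    (null? (lset-intersection string=?
                              (build-dependencies build)
                              (map build-derivation active-builds))))

With that check, a worker asking for work would skip Rust dependents
and pick an unrelated derivation instead of piling onto the same build.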

Thanks,

Mathieu



wip-ungrafting builds stuck

2021-04-18 Thread Ludovic Courtès
Hi!

The ‘wip-ungrafting’ branch that Leo set up has been building for ~36h.
It was at 26% 24 hours ago and it’s now stuck at 33%, even though all
but two workers are idle.

<https://ci.guix.gnu.org/jobset/ungrafting> shows that many x86_64
builds are missing.

Any idea what could be wrong, Mathieu?  What would you suggest to do
when investigating such issues?

Thanks,
Ludo’.