Re: Rock64 segfaults (was: Merging ‘staging’?)

2022-07-15 Thread Timotej Lazar
Hi,

sorry to hijack the thread – I found your post when debugging random
segfaults on the rock64, and just wanted to post a possible solution for
anyone having the same problem.

"pelzflorian (Florian Pelz)"  [2022-06-09 
19:19:30+0200]:
> (The build of llvm@11 also needed a few retries because gcc randomly
> fails sometimes (once with a segfault).  That is not a Guix bug
> though, I think, but peculiarities of the rock64.)

This seems to be a hardware issue that can be fixed by downclocking the
memory¹. Mainline u-boot has two variants of the rk3328-sdram-lpddr3
.dtsi file, so I just did

(define u-boot-rock64-rk3328/666
  (package
    (inherit u-boot-rock64-rk3328)
    (arguments
     (substitute-keyword-arguments (package-arguments u-boot-rock64-rk3328)
       ((#:phases phases)
        `(modify-phases ,phases
           ;; Use the 666 MHz LPDDR3 timings instead of the default 1600 MHz.
           (add-after 'unpack 'change-ddr-clock
             (lambda _
               (substitute* "arch/arm/dts/rk3328-rock64-u-boot.dtsi"
                 (("rk3328-sdram-lpddr3-1600.dtsi")
                  "rk3328-sdram-lpddr3-666.dtsi"))))))))))

and used that for the bootloader:

(bootloader
  (bootloader-configuration
    (bootloader
      (bootloader
        (inherit u-boot-rock64-rk3328-bootloader)
        (package u-boot-rock64-rk3328/666)))
    …
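
To double-check everything before rebooting, the whole configuration
(including the modified u-boot) can be built ahead of time with

  guix system build /path/to/config.scm

where the path is simply wherever your system configuration lives.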

With this change I was able to compile both gcc and llvm several times;
before, compilation would reliably crash within an hour unless it ran on
a single core. I assume performance at the lower memory clock is
considerably worse, but I haven’t done any measurements.
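
(If anyone wants a rough number, timing a local rebuild of a package
that is already in the store should give an idea, e.g.

  time guix build --check --no-grafts llvm@11

since --check forces the package to be rebuilt locally even when a
result is already present.)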

It’s very nice that I can include this in my OS configuration and then
pretty much forget about it. Big thanks to everyone working on Guix for
a system where making such changes is so simple!

Regards,
Timotej

¹ https://forum.armbian.com/topic/15082-rock64-focal-fossa-memory-frequency/



Re: Release ? (was: Merging ‘staging’?)

2022-06-14 Thread Efraim Flashner
On Tue, Jun 14, 2022 at 12:32:13PM +0200, zimoun wrote:
> Hi,
> 
> On Mon, 13 Jun 2022 at 09:03, Ludovic Courtès  wrote:
> 
> > Merged, enjoy!  :-)
> 
> Cool!
> 
> 
> > Next up: release and ‘core-updates’.
> 
> Do you want to do a ’core-updates’ merge before the release?  I think a
> release in July would be very welcome.  WDYT?

I think we should do the release first and then core-updates, before we
get bogged down fixing packages in core-updates.

-- 
Efraim Flashner  אפרים פלשנר
GPG key = A28B F40C 3E55 1372 662D  14F7 41AA E7DC CA3D 8351
Confidentiality cannot be guaranteed on emails sent or received unencrypted




Release ? (was: Merging ‘staging’?)

2022-06-14 Thread zimoun
Hi,

On Mon, 13 Jun 2022 at 09:03, Ludovic Courtès  wrote:

> Merged, enjoy!  :-)

Cool!


> Next up: release and ‘core-updates’.

Do you want to do a ’core-updates’ merge before the release?  I think a
release in July would be very welcome.  WDYT?


Cheers,
simon



Re: Merging ‘staging’?

2022-06-13 Thread Thiago Jung Bauermann


Hello,

Ludovic Courtès  writes:

> Hello Guix,
>
> Ludovic Courtès  skribis:
>
>> Which means I’ll merge ‘staging’ in ‘master’ tomorrow morning if nothing
>> has collapsed by then.  :-)
>
> Merged, enjoy!  :-)

Nice!

> Thanks to everyone who helped on the way!

Thank you for merging the branch!

> Next up: release and ‘core-updates’.

Exciting times. :-)

-- 
Thanks
Thiago



Re: Merging ‘staging’?

2022-06-13 Thread Ludovic Courtès
Hello Guix,

Ludovic Courtès  skribis:

> Which means I’ll merge ‘staging’ in ‘master’ tomorrow morning if nothing
> has collapsed by then.  :-)

Merged, enjoy!  :-)

Thanks to everyone who helped on the way!

Next up: release and ‘core-updates’.

Ludo’.



Re: Merging ‘staging’?

2022-06-12 Thread Ludovic Courtès
Hi,

Efraim Flashner  skribis:

> Let's do it

S… turns out commit e31ab8c24848a7691a838af8df61d3e7097cddbc on
‘master’ inadvertently triggered a rebuild of 2K packages.

It’s too late to revert (they’ve been built anyway), but I’ve merged
‘master’ in ‘staging’ so they can be built there.

Which means I’ll merge ‘staging’ in ‘master’ tomorrow morning if nothing
has collapsed by then.  :-)

Ludo’.



Re: Merging ‘staging’?

2022-06-12 Thread Ludovic Courtès
Hi,

Thiago Jung Bauermann  skribis:

> Sorry for the delay. I've built some packages from the staging branch on
> powerpc64le-linux (including Emacs, which brings in a lot of stuff) and
> it seems good.

That’s good news, thanks for testing!

Ludo’.



Re: Merging ‘staging’?

2022-06-11 Thread Thiago Jung Bauermann


Hello,

Ludovic Courtès  writes:

> ‘guix weather -s i686-linux’ says 89% (which is underestimated, because
> it wrongly checks for all the packages, including unsupported
> packages), which sounds good.
>
> We have to check for AArch64 & co.  Any takers?

Sorry for the delay. I've built some packages from the staging branch on
powerpc64le-linux (including Emacs, which brings in a lot of stuff) and
it seems good.

-- 
Thanks
Thiago



Re: Merging ‘staging’?

2022-06-11 Thread Efraim Flashner



On June 11, 2022 9:53:14 AM UTC, "Ludovic Courtès"  wrote:
>Hi,
>
>Efraim Flashner  skribis:
>
>> My main concern is that so few of the missing items are queued to be
>> built.
>
>I wonder if that info is accurate.
>
>For instance, https://ci.guix.gnu.org/build/716980/details shows the
>derivation for /gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3.
>It is queued, just not being processed (yet).
>
>> I've restarted some of them manually, but we're going to need to tell
>> cuirass to rebuild those packages.
>
>It went from 17.1% to 17.3% in two days, even though the AArch64
>machines have been busy all along it seems.  Maybe they’ve been
>processing the backlog that had accumulated on ‘master’ rather than the
>things we care about.
>
>--8<---cut here---start->8---
>$ ./pre-inst-env  guix weather -s aarch64-linux -c200
>computing 18,932 package derivations for aarch64-linux...
>looking for 19,704 store items on https://ci.guix.gnu.org...
>https://ci.guix.gnu.org
>  17.3% substitutes available (3,410 out of 19,704)
>  at least 22,359.4 MiB of nars (compressed)
>  24,158.2 MiB on disk (uncompressed)
>  0.008 seconds per request (127.0 seconds in total)
>  128.7 requests per second
>
>  0.0% (0 out of 16,294) of the missing items are queued
>  at least 1,000 queued builds
>  powerpc64le-linux: 987 (98.7%)
>  aarch64-linux: 12 (1.2%)
>  x86_64-linux: 1 (.1%)
>  build rate: 40.03 builds per hour
>  x86_64-linux: 16.83 builds per hour
>  aarch64-linux: 11.23 builds per hour
>  i686-linux: 9.03 builds per hour
>  powerpc64le-linux: 3.04 builds per hour
>1496 packages are missing from 'https://ci.guix.gnu.org' for 'aarch64-linux', 
>among which:
>  14236  libxft@2.3.3
> /gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
>  12379  nghttp2@1.44.0
> /gnu/store/4cs553aj8x1xbavfzfdh4gn5idf5ij81-nghttp2-1.44.0-lib 
> /gnu/store/4phvjrp12n8bfpncyp89414q2pa4d46q-nghttp2-1.44.0 
>  10066  icu4c@69.1
> /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
>  7723 jbig2dec@0.19   
> /gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
>  6989 ninja@1.10.2
> /gnu/store/jdq3xrcj0am4j698xr81hwgj8skcn7xq-ninja-1.10.2 
>  6927 python-libxml2@2.9.12   
> /gnu/store/swgvpb6pb6nf84a08m8flmy7fjhxwvni-python-libxml2-2.9.12 
>  6924 mallard-ducktype@1.0.2  
> /gnu/store/662lari1slhz56q8333sdczq1av9f799-mallard-ducktype-1.0.2 
>  6552 libxt@1.2.1 
> /gnu/store/rqqkla9kqjq3182gdqynkq2jb4jf2bl5-libxt-1.2.1-doc 
> /gnu/store/82pzzkny0gwdfkfa5zvnyy4vbzxqnwyy-libxt-1.2.1 
>  6503 python-fonttools@4.28.5 
> /gnu/store/vp3g1ddrv067gay8zk1xd7c0kcgq1cv5-python-fonttools-4.28.5 
>  5120 python-wheel@0.37.0 
> /gnu/store/0rmkfrmrjsvf2hvl9px7h3g91zypmycz-python-wheel-0.37.0 
>  5113 python-flit-core-bootstrap@3.5.1
> /gnu/store/pcwmq56r3wb8wdv8blj8ajfsmvcj6wm1-python-flit-core-bootstrap-3.5.1 
>  4555 openblas@0.3.20 
> /gnu/store/wc8cbkcvgqkdf65fj7drb82plkf8igyw-openblas-0.3.20 
>  4231 libxfixes@6.0.0 
> /gnu/store/h70i9gyvif5kzpw7a8bdaxhw7sqiv7wz-libxfixes-6.0.0 
>  4041 python-markupsafe@2.0.1 
> /gnu/store/1k63s700a1sy3pvcms0kax7csqsjzxdr-python-markupsafe-2.0.1 
>  3788 libxrandr@1.5.2 
> /gnu/store/9gqkb75sr60lxic9a3nx543a6wxz0866-libxrandr-1.5.2 
>  3563 eudev@3.2.11
> /gnu/store/nvdlxv30l6f06nclls7j6zi4s8cm1jnc-eudev-3.2.11 
> /gnu/store/5ja9bj947iir15ypmqzrwxk04w6vhiyz-eudev-3.2.11-static 
>  3360 libpsl@0.21.1   
> /gnu/store/5jyxcfsgk0hsj6rzfsz84wnap6bn57ka-libpsl-0.21.1 
>  3329 libevent@2.1.12 
> /gnu/store/28iivyxgqvnp2alx2fyqqx18md7pwja8-libevent-2.1.12-bin 
> /gnu/store/cvvsdqkcb8z78bh7rhzirv2n4sb4svgg-libevent-2.1.12 
>  3320 libxkbfile@1.1.0
> /gnu/store/kk0xk8j8ffgxvfs5kz1vxp2ipz28xwh4-libxkbfile-1.1.0 
>  3297 xcb-util@0.4.0  
> /gnu/store/n2x5kvy5zxbnh28ak7rbvw6brp97z1c9-xcb-util-0.4.0 
>  3279 xcb-util-wm@0.4.1   
> /gnu/store/hhasmz0l5f10f9nfg0nx378c7bcgf9v0-xcb-util-wm-0.4.1 
>  3263 libxres@1.2.1   
> /gnu/store/5mpjhsyq2bbri006g7lh6h82239s8fb8-libxres-1.2.1 
>  3215 python-flit-core@3.5.1  
> /gnu/store/zldar3i40q3l40fypm8n6fh3kpddxhzq-python-flit-core-3.5.1 
>  3213 python-pytz@2022.1  
> /gnu/store/byyn7jzs85r2n11hdbk98dk2ikb74z9n-python-pytz-2022.1 
>  3205 python-pycparser@2.21   
> /gnu/store/4a5l67ama5msmd2qi8w7mvlq64b04znb-python-pycparser-2.21-doc 
> /gnu/store/98g8chr9pcxyz0czp9fq2w3q8rm2sn14-python-pycparser-2.21 
>  3201 python-idna@3.3 
> /gnu/store/fj5sc68s4g0rh5gycnzvx2nkmviphfyb-python-idna-3.3 
>  3170 python-pretend@1.0.9
> /gnu/store/ksyw3ffmmhizh25ql0jp7yfv5llg55km-python-pretend-1.0.9 
>  3132 python-semantic-version@2.8.5   
> /gnu/store/0g95jbn19fwr52f5l5j1ibv7h9g7dsyl-python-semantic-version-2.8.5 
>  3127 python-asn1crypto@1.4.0 
> /gnu/store/12xs95mcz9v9swwyinsagfdn0qa74rgw-python-asn1crypto-1.4.0 
>  3126 python-cryptography-vectors@3.4.8   
> /gnu/store/5ww2a12dzyq8fbazwkqhn26wv5ahk1xj-python-cryptography-vectors-3.4.8 
>  2754 

Re: Merging ‘staging’?

2022-06-11 Thread Tom Fitzhenry


Ludovic Courtès  writes:
> It went from 17.1% to 17.3% in two days, even though the AArch64
> machines have been busy all along it seems.  Maybe they’ve been
> processing the backlog that had accumulated on ‘master’ rather than the
> things we care about.

Per https://issues.guix.gnu.org/55848, it looks like
grunewald/kreuzberg/pankow are busy processing jobs, but each time
failing after 15 minutes with the same network-level error.

> Objections?

SGTM!



Re: Merging ‘staging’?

2022-06-11 Thread Ludovic Courtès
Hi,

Efraim Flashner  skribis:

> My main concern is that so few of the missing items are queued to be
> built.

I wonder if that info is accurate.

For instance, https://ci.guix.gnu.org/build/716980/details shows the
derivation for /gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3.
It is queued, just not being processed (yet).

> I've restarted some of them manually, but we're going to need to tell
> cuirass to rebuild those packages.

It went from 17.1% to 17.3% in two days, even though the AArch64
machines have been busy all along it seems.  Maybe they’ve been
processing the backlog that had accumulated on ‘master’ rather than the
things we care about.

--8<---cut here---start->8---
$ ./pre-inst-env  guix weather -s aarch64-linux -c200
computing 18,932 package derivations for aarch64-linux...
looking for 19,704 store items on https://ci.guix.gnu.org...
https://ci.guix.gnu.org
  17.3% substitutes available (3,410 out of 19,704)
  at least 22,359.4 MiB of nars (compressed)
  24,158.2 MiB on disk (uncompressed)
  0.008 seconds per request (127.0 seconds in total)
  128.7 requests per second

  0.0% (0 out of 16,294) of the missing items are queued
  at least 1,000 queued builds
  powerpc64le-linux: 987 (98.7%)
  aarch64-linux: 12 (1.2%)
  x86_64-linux: 1 (.1%)
  build rate: 40.03 builds per hour
  x86_64-linux: 16.83 builds per hour
  aarch64-linux: 11.23 builds per hour
  i686-linux: 9.03 builds per hour
  powerpc64le-linux: 3.04 builds per hour
1496 packages are missing from 'https://ci.guix.gnu.org' for 'aarch64-linux', 
among which:
  14236 libxft@2.3.3
/gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
  12379 nghttp2@1.44.0  
/gnu/store/4cs553aj8x1xbavfzfdh4gn5idf5ij81-nghttp2-1.44.0-lib 
/gnu/store/4phvjrp12n8bfpncyp89414q2pa4d46q-nghttp2-1.44.0 
  10066 icu4c@69.1  /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
  7723  jbig2dec@0.19   
/gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
  6989  ninja@1.10.2
/gnu/store/jdq3xrcj0am4j698xr81hwgj8skcn7xq-ninja-1.10.2 
  6927  python-libxml2@2.9.12   
/gnu/store/swgvpb6pb6nf84a08m8flmy7fjhxwvni-python-libxml2-2.9.12 
  6924  mallard-ducktype@1.0.2  
/gnu/store/662lari1slhz56q8333sdczq1av9f799-mallard-ducktype-1.0.2 
  6552  libxt@1.2.1 
/gnu/store/rqqkla9kqjq3182gdqynkq2jb4jf2bl5-libxt-1.2.1-doc 
/gnu/store/82pzzkny0gwdfkfa5zvnyy4vbzxqnwyy-libxt-1.2.1 
  6503  python-fonttools@4.28.5 
/gnu/store/vp3g1ddrv067gay8zk1xd7c0kcgq1cv5-python-fonttools-4.28.5 
  5120  python-wheel@0.37.0 
/gnu/store/0rmkfrmrjsvf2hvl9px7h3g91zypmycz-python-wheel-0.37.0 
  5113  python-flit-core-bootstrap@3.5.1
/gnu/store/pcwmq56r3wb8wdv8blj8ajfsmvcj6wm1-python-flit-core-bootstrap-3.5.1 
  4555  openblas@0.3.20 
/gnu/store/wc8cbkcvgqkdf65fj7drb82plkf8igyw-openblas-0.3.20 
  4231  libxfixes@6.0.0 
/gnu/store/h70i9gyvif5kzpw7a8bdaxhw7sqiv7wz-libxfixes-6.0.0 
  4041  python-markupsafe@2.0.1 
/gnu/store/1k63s700a1sy3pvcms0kax7csqsjzxdr-python-markupsafe-2.0.1 
  3788  libxrandr@1.5.2 
/gnu/store/9gqkb75sr60lxic9a3nx543a6wxz0866-libxrandr-1.5.2 
  3563  eudev@3.2.11
/gnu/store/nvdlxv30l6f06nclls7j6zi4s8cm1jnc-eudev-3.2.11 
/gnu/store/5ja9bj947iir15ypmqzrwxk04w6vhiyz-eudev-3.2.11-static 
  3360  libpsl@0.21.1   
/gnu/store/5jyxcfsgk0hsj6rzfsz84wnap6bn57ka-libpsl-0.21.1 
  3329  libevent@2.1.12 
/gnu/store/28iivyxgqvnp2alx2fyqqx18md7pwja8-libevent-2.1.12-bin 
/gnu/store/cvvsdqkcb8z78bh7rhzirv2n4sb4svgg-libevent-2.1.12 
  3320  libxkbfile@1.1.0
/gnu/store/kk0xk8j8ffgxvfs5kz1vxp2ipz28xwh4-libxkbfile-1.1.0 
  3297  xcb-util@0.4.0  
/gnu/store/n2x5kvy5zxbnh28ak7rbvw6brp97z1c9-xcb-util-0.4.0 
  3279  xcb-util-wm@0.4.1   
/gnu/store/hhasmz0l5f10f9nfg0nx378c7bcgf9v0-xcb-util-wm-0.4.1 
  3263  libxres@1.2.1   
/gnu/store/5mpjhsyq2bbri006g7lh6h82239s8fb8-libxres-1.2.1 
  3215  python-flit-core@3.5.1  
/gnu/store/zldar3i40q3l40fypm8n6fh3kpddxhzq-python-flit-core-3.5.1 
  3213  python-pytz@2022.1  
/gnu/store/byyn7jzs85r2n11hdbk98dk2ikb74z9n-python-pytz-2022.1 
  3205  python-pycparser@2.21   
/gnu/store/4a5l67ama5msmd2qi8w7mvlq64b04znb-python-pycparser-2.21-doc 
/gnu/store/98g8chr9pcxyz0czp9fq2w3q8rm2sn14-python-pycparser-2.21 
  3201  python-idna@3.3 
/gnu/store/fj5sc68s4g0rh5gycnzvx2nkmviphfyb-python-idna-3.3 
  3170  python-pretend@1.0.9
/gnu/store/ksyw3ffmmhizh25ql0jp7yfv5llg55km-python-pretend-1.0.9 
  3132  python-semantic-version@2.8.5   
/gnu/store/0g95jbn19fwr52f5l5j1ibv7h9g7dsyl-python-semantic-version-2.8.5 
  3127  python-asn1crypto@1.4.0 
/gnu/store/12xs95mcz9v9swwyinsagfdn0qa74rgw-python-asn1crypto-1.4.0 
  3126  python-cryptography-vectors@3.4.8   
/gnu/store/5ww2a12dzyq8fbazwkqhn26wv5ahk1xj-python-cryptography-vectors-3.4.8 
  2754  python-pyyaml@6.0   
/gnu/store/pxxnkbrljhizypjpfdl27djjzsyflnic-python-pyyaml-6.0 
  2542  python-dnspython@2.1.0  
/gnu/store/m66znr3nwqb7w132yiw9iznzqif2z84r-python-dnspython-2.1.0 
  2539  

Re: Merging ‘staging’?

2022-06-11 Thread pelzflorian (Florian Pelz)
On Thu, Jun 09, 2022 at 09:02:14PM +0200, pelzflorian (Florian Pelz) wrote:
> I will try again with staging at 091eb323ba27.  But it will take a
> long time; feel free to proceed regardless.

Everything in my manifest built and runs on my rock64 aarch64 machine.  Thank you all!

Regards,
Florian



Re: Merging ‘staging’?

2022-06-10 Thread Efraim Flashner
On Wed, Jun 08, 2022 at 11:24:22PM +0200, Ludovic Courtès wrote:
> Hey Efraim,
> 
> Efraim Flashner  skribis:
> 
> > On Mon, Jun 06, 2022 at 11:17:47PM +0200, Ludovic Courtès wrote:
> >> Hi,
> >> 
> >> Ludovic Courtès  skribis:
> >> 
> >> ‘guix weather -s i686-linux’ says 89% (which is underestimated, because
> >> it wrongly checks for all the packages, including unsupported
> >> packages), which sounds good.
> >> 
> >> We have to check for AArch64 & co.  Any takers?
> >> 
> >> Overall it seems to me we should be able to merge ‘staging’ within a
> >> couple of days.  Thoughts?
> >
> > Currently ci.guix.gnu.org isn't building any of the aarch64 packages,
> > and it looks like it hasn't since about May 26th. Once those start
> > building again I expect we'll see it's doing well. Minus perhaps the
> > rust stuff since I'm not sure the offload build machines can handle
> > that.
> 
> Hmm the situation is bad indeed:
> 
> --8<---cut here---start->8---
> $ ./pre-inst-env  guix weather -s aarch64-linux -c200
> computing 18,932 package derivations for aarch64-linux...
> looking for 19,704 store items on https://ci.guix.gnu.org...
> https://ci.guix.gnu.org
>   17.1% substitutes available (3,360 out of 19,704)
>   at least 22,303.1 MiB of nars (compressed)
>   24,029.2 MiB on disk (uncompressed)
>   0.006 seconds per request (114.2 seconds in total)
>   172.5 requests per second
> 
>   5.3% (870 out of 16,344) of the missing items are queued
>   at least 1,000 queued builds
>   aarch64-linux: 840 (84.0%)
>   x86_64-linux: 84 (8.4%)
>   powerpc64le-linux: 72 (7.2%)
>   armhf-linux: 4 (.4%)
>   build rate: 143.62 builds per hour
>   i686-linux: 99.96 builds per hour
>   x86_64-linux: 31.51 builds per hour
>   aarch64-linux: 25.66 builds per hour
>   powerpc64le-linux: 3.14 builds per hour
> 729 packages are missing from 'https://ci.guix.gnu.org' for 'aarch64-linux', 
> among which:
>   14236   libxft@2.3.3
> /gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
>   10066   icu4c@69.1  
> /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
>   7723  jbig2dec@0.19
> /gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
>   6552  libxt@1.2.1
> /gnu/store/rqqkla9kqjq3182gdqynkq2jb4jf2bl5-libxt-1.2.1-doc 
> /gnu/store/82pzzkny0gwdfkfa5zvnyy4vbzxqnwyy-libxt-1.2.1 
>   4555  openblas@0.3.20
> /gnu/store/wc8cbkcvgqkdf65fj7drb82plkf8igyw-openblas-0.3.20 
>   4231  libxfixes@6.0.0
> /gnu/store/h70i9gyvif5kzpw7a8bdaxhw7sqiv7wz-libxfixes-6.0.0 
>   3788  libxrandr@1.5.2
> /gnu/store/9gqkb75sr60lxic9a3nx543a6wxz0866-libxrandr-1.5.2 
>   3320  libxkbfile@1.1.0
> /gnu/store/kk0xk8j8ffgxvfs5kz1vxp2ipz28xwh4-libxkbfile-1.1.0 
>   3297  xcb-util@0.4.0
> /gnu/store/n2x5kvy5zxbnh28ak7rbvw6brp97z1c9-xcb-util-0.4.0 
>   3279  xcb-util-wm@0.4.1
> /gnu/store/hhasmz0l5f10f9nfg0nx378c7bcgf9v0-xcb-util-wm-0.4.1 
>   3263  libxres@1.2.1
> /gnu/store/5mpjhsyq2bbri006g7lh6h82239s8fb8-libxres-1.2.1 
>   1383  xprop@1.2.5
> /gnu/store/61x7vscvrz8zqda73dlq5x35anhz8f8k-xprop-1.2.5 
>   1381  xdg-user-dirs@0.17
> /gnu/store/gab0ycim14xflaxzp8arg8hfapx73z3l-xdg-user-dirs-0.17 
>   1368  docbook2x@0.8.8
> /gnu/store/cclsrji5p14aj8bg3kmhkj5532k7l5m4-docbook2x-0.8.8 
>   1286  perl-net-ssleay@1.92
> /gnu/store/cn77pic9249j7h002a9701v0xzpi28yk-perl-net-ssleay-1.92 
>   1179  emacs-minimal@28.1
> /gnu/store/c768r60pj3f2af2ysbjqdnlxwpgwmmpp-emacs-minimal-28.1 
>   1121  go-std@1.17.9
> /gnu/store/fqdkzg6nlzhj93ysjqxrmqsz8srq2l9r-go-std-1.17.9 
>   1021  ruby-activemodel@6.1.3
> /gnu/store/12gm34s68siqdfagc16ldjdp03klc5xw-ruby-activemodel-6.1.3 
>   1005  ruby-shoulda-matchers@2.8.0
> /gnu/store/zcbdlp0qqb3axsmfvd8rdmn70kwdiw11-ruby-shoulda-matchers-2.8.0 
>   1001  ruby-webmock@2.3.2
> /gnu/store/jfrng4404avghlkyxw2y5hqpdn8v3mih-ruby-webmock-2.3.2 
>    976  ruby-sawyer@0.8.2
> /gnu/store/w0aaby4lvdcz0dr5006mx2ik6l1v2s5l-ruby-sawyer-0.8.2 
>    634  jikes@1.22
> /gnu/store/nwkrhbzbd8s9dysgik58af93b2kjm4w3-jikes-1.22 
>    622  nss-certs@3.71
> /gnu/store/01k1v00g7vc7n90y5yr9bacrnr3ml46p-nss-certs-3.71 
>    576  texlive-latex-cmap@59745
> /gnu/store/zdws0y31a1hhgxb80qc2lx4smmp9pmgc-texlive-latex-cmap-59745 
>    558  libxft@2.3.3
> /gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
>    491  xmodmap@1.0.10
> /gnu/store/rm8ww2flazqypwiagizbbsslv0g86h67-xmodmap-1.0.10 
>    489  ucd@14.0.0
> /gnu/store/f3hkrqnsj9dadliaixrlh17z4c9bsjfl-ucd-14.0.0 
>    400  jbig2dec@0.19
> /gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
>    317  icu4c@69.1
> /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
>    309  libxt@1.2.1
> 

Re: Merging ‘staging’?

2022-06-10 Thread pelzflorian (Florian Pelz)
On Thu, Jun 09, 2022 at 09:02:14PM +0200, pelzflorian (Florian Pelz) wrote:
> I will try again with staging at 091eb323ba27.

llvm@11 was built successfully.  Sorry for the noise.

Regards,
Florian



Re: Merging ‘staging’?

2022-06-09 Thread pelzflorian (Florian Pelz)
On Thu, Jun 09, 2022 at 08:41:21PM +0300, Efraim Flashner wrote:
> I know I've built llvm@11 and mesa on aarch64 hardware for staging.

Oh I see I'm missing the last merge; I'm still at commit b422687cbd.

I will try again with staging at 091eb323ba27.  But it will take a
long time; feel free to proceed regardless.

Regards,
Florian



Re: Merging ‘staging’?

2022-06-09 Thread Efraim Flashner
On Thu, Jun 09, 2022 at 07:19:30PM +0200, pelzflorian (Florian Pelz) wrote:
> On Mon, Jun 06, 2022 at 11:17:47PM +0200, Ludovic Courtès wrote:
> > We have to check for AArch64 & co.  Any takers?
> > 
> > Overall it seems to me we should be able to merge ‘staging’ within a
> > couple of days.  Thoughts?
> > 
> > Ludo’.
> > 
> 
> I mostly succeeded in updating my rock64 aarch64 machine
> 
> guix time-machine --branch=staging -- package -m 
> ~/keep/guixsd/rock64-manifest.scm
> 
> but building llvm@11 fails (needed for mesa, I think).  The log ends with:
> 
> [...]
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> [ 97%] Built target verify-uselistorder
> make  -f tools/yaml2obj/CMakeFiles/yaml2obj.dir/build.make 
> tools/yaml2obj/CMakeFiles/yaml2obj.dir/depend
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
> /gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
> cmake_depends "Unix Makefiles" 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/tools/yaml2obj 
> /tmp/guix-build-llvm-11.0.0.drv-0/build 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj/CMakeFiles/yaml2obj.dir/DependInfo.cmake
>  --color=
> Consolidate compiler generated dependencies of target yaml2obj
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> make  -f tools/yaml2obj/CMakeFiles/yaml2obj.dir/build.make 
> tools/yaml2obj/CMakeFiles/yaml2obj.dir/build
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> [ 98%] Linking CXX executable ../../bin/yaml2obj
> cd /tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj && 
> /gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
> cmake_link_script CMakeFiles/yaml2obj.dir/link.txt --verbose=1
> /gnu/store/dbcbcaxq20kbkhh2mr8k98qfnymq22kp-gcc-10.3.0/bin/c++  -fPIC 
> -fvisibility-inlines-hidden -Werror=date-time -Wall -Wextra 
> -Wno-unused-parameter -Wwrite-strings -Wcast-qual 
> -Wno-missing-field-initializers -pedantic -Wno-long-long 
> -Wimplicit-fallthrough -Wno-maybe-uninitialized -Wno-class-memaccess 
> -Wno-redundant-move -Wno-noexcept-type -Wdelete-non-virtual-dtor -Wno-comment 
> -ffunction-sections -fdata-sections -O3 -DNDEBUG  -Wl,-allow-shlib-undefined  
> -Wl,-O3 -Wl,--gc-sections CMakeFiles/yaml2obj.dir/yaml2obj.cpp.o -o 
> ../../bin/yaml2obj  
> -Wl,-rpath,/tmp/guix-build-llvm-11.0.0.drv-0/build/lib 
> ../../lib/libLLVMObjectYAML.so.11 -lpthread ../../lib/libLLVMSupport.so.11 
> -Wl,-rpath-link,/tmp/guix-build-llvm-11.0.0.drv-0/build/lib 
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> [ 98%] Built target yaml2obj
> make  -f examples/Bye/CMakeFiles/Bye.dir/build.make 
> examples/Bye/CMakeFiles/Bye.dir/depend
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
> /gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
> cmake_depends "Unix Makefiles" 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/examples/Bye 
> /tmp/guix-build-llvm-11.0.0.drv-0/build 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/examples/Bye 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/examples/Bye/CMakeFiles/Bye.dir/DependInfo.cmake
>  --color=
> Consolidate compiler generated dependencies of target Bye
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> make  -f examples/Bye/CMakeFiles/Bye.dir/build.make 
> examples/Bye/CMakeFiles/Bye.dir/build
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> make[2]: Nothing to be done for 'examples/Bye/CMakeFiles/Bye.dir/build'.
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> [ 98%] Built target Bye
> make  -f unittests/Passes/CMakeFiles/TestPlugin.dir/build.make 
> unittests/Passes/CMakeFiles/TestPlugin.dir/depend
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
> /gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
> cmake_depends "Unix Makefiles" 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
> /tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/unittests/Passes 
> /tmp/guix-build-llvm-11.0.0.drv-0/build 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/unittests/Passes 
> /tmp/guix-build-llvm-11.0.0.drv-0/build/unittests/Passes/CMakeFiles/TestPlugin.dir/DependInfo.cmake
>  --color=
> Consolidate compiler generated dependencies of target TestPlugin
> make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> make  -f unittests/Passes/CMakeFiles/TestPlugin.dir/build.make 
> unittests/Passes/CMakeFiles/TestPlugin.dir/build
> make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
> make[2]: Nothing to 

Re: Merging ‘staging’?

2022-06-09 Thread pelzflorian (Florian Pelz)
On Mon, Jun 06, 2022 at 11:17:47PM +0200, Ludovic Courtès wrote:
> We have to check for AArch64 & co.  Any takers?
> 
> Overall it seems to me we should be able to merge ‘staging’ within a
> couple of days.  Thoughts?
> 
> Ludo’.
> 

I mostly succeeded in updating my rock64 aarch64 machine

guix time-machine --branch=staging -- package -m 
~/keep/guixsd/rock64-manifest.scm

but building llvm@11 fails (needed for mesa, I think).  The log ends with:

[...]
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
[ 97%] Built target verify-uselistorder
make  -f tools/yaml2obj/CMakeFiles/yaml2obj.dir/build.make 
tools/yaml2obj/CMakeFiles/yaml2obj.dir/depend
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
/gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
cmake_depends "Unix Makefiles" 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/tools/yaml2obj 
/tmp/guix-build-llvm-11.0.0.drv-0/build 
/tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj 
/tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj/CMakeFiles/yaml2obj.dir/DependInfo.cmake
 --color=
Consolidate compiler generated dependencies of target yaml2obj
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
make  -f tools/yaml2obj/CMakeFiles/yaml2obj.dir/build.make 
tools/yaml2obj/CMakeFiles/yaml2obj.dir/build
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
[ 98%] Linking CXX executable ../../bin/yaml2obj
cd /tmp/guix-build-llvm-11.0.0.drv-0/build/tools/yaml2obj && 
/gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
cmake_link_script CMakeFiles/yaml2obj.dir/link.txt --verbose=1
/gnu/store/dbcbcaxq20kbkhh2mr8k98qfnymq22kp-gcc-10.3.0/bin/c++  -fPIC 
-fvisibility-inlines-hidden -Werror=date-time -Wall -Wextra 
-Wno-unused-parameter -Wwrite-strings -Wcast-qual 
-Wno-missing-field-initializers -pedantic -Wno-long-long -Wimplicit-fallthrough 
-Wno-maybe-uninitialized -Wno-class-memaccess -Wno-redundant-move 
-Wno-noexcept-type -Wdelete-non-virtual-dtor -Wno-comment -ffunction-sections 
-fdata-sections -O3 -DNDEBUG  -Wl,-allow-shlib-undefined  -Wl,-O3 
-Wl,--gc-sections CMakeFiles/yaml2obj.dir/yaml2obj.cpp.o -o ../../bin/yaml2obj  
-Wl,-rpath,/tmp/guix-build-llvm-11.0.0.drv-0/build/lib 
../../lib/libLLVMObjectYAML.so.11 -lpthread ../../lib/libLLVMSupport.so.11 
-Wl,-rpath-link,/tmp/guix-build-llvm-11.0.0.drv-0/build/lib 
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
[ 98%] Built target yaml2obj
make  -f examples/Bye/CMakeFiles/Bye.dir/build.make 
examples/Bye/CMakeFiles/Bye.dir/depend
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
/gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
cmake_depends "Unix Makefiles" 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/examples/Bye 
/tmp/guix-build-llvm-11.0.0.drv-0/build 
/tmp/guix-build-llvm-11.0.0.drv-0/build/examples/Bye 
/tmp/guix-build-llvm-11.0.0.drv-0/build/examples/Bye/CMakeFiles/Bye.dir/DependInfo.cmake
 --color=
Consolidate compiler generated dependencies of target Bye
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
make  -f examples/Bye/CMakeFiles/Bye.dir/build.make 
examples/Bye/CMakeFiles/Bye.dir/build
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
make[2]: Nothing to be done for 'examples/Bye/CMakeFiles/Bye.dir/build'.
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
[ 98%] Built target Bye
make  -f unittests/Passes/CMakeFiles/TestPlugin.dir/build.make 
unittests/Passes/CMakeFiles/TestPlugin.dir/depend
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
cd /tmp/guix-build-llvm-11.0.0.drv-0/build && 
/gnu/store/6lfyb68pdy0b1vggzbvw8grkv2ws6vhl-cmake-minimal-3.21.4/bin/cmake -E 
cmake_depends "Unix Makefiles" 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src 
/tmp/guix-build-llvm-11.0.0.drv-0/llvm-11.0.0.src/unittests/Passes 
/tmp/guix-build-llvm-11.0.0.drv-0/build 
/tmp/guix-build-llvm-11.0.0.drv-0/build/unittests/Passes 
/tmp/guix-build-llvm-11.0.0.drv-0/build/unittests/Passes/CMakeFiles/TestPlugin.dir/DependInfo.cmake
 --color=
Consolidate compiler generated dependencies of target TestPlugin
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
make  -f unittests/Passes/CMakeFiles/TestPlugin.dir/build.make 
unittests/Passes/CMakeFiles/TestPlugin.dir/build
make[2]: Entering directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
make[2]: Nothing to be done for 
'unittests/Passes/CMakeFiles/TestPlugin.dir/build'.
make[2]: Leaving directory '/tmp/guix-build-llvm-11.0.0.drv-0/build'
[ 98%] Built target TestPlugin
make  -f unittests/Support/DynamicLibrary/CMakeFiles/SecondLib.dir/build.make 

Re: Merging ‘staging’?

2022-06-08 Thread Ludovic Courtès
Hey Efraim,

Efraim Flashner  skribis:

> On Mon, Jun 06, 2022 at 11:17:47PM +0200, Ludovic Courtès wrote:
>> Hi,
>> 
>> Ludovic Courtès  skribis:
>> 
>> ‘guix weather -s i686-linux’ says 89% (which is underestimated, because
>> it wrongly checks for all the packages, including unsupported
>> packages), which sounds good.
>> 
>> We have to check for AArch64 & co.  Any takers?
>> 
>> Overall it seems to me we should be able to merge ‘staging’ within a
>> couple of days.  Thoughts?
>
> Currently ci.guix.gnu.org isn't building any of the aarch64 packages,
> and it looks like it hasn't since about May 26th. Once those start
> building again I expect we'll see it's doing well. Minus perhaps the
> rust stuff since I'm not sure the offload build machines can handle
> that.

Hmm the situation is bad indeed:

--8<---cut here---start->8---
$ ./pre-inst-env  guix weather -s aarch64-linux -c200
computing 18,932 package derivations for aarch64-linux...
looking for 19,704 store items on https://ci.guix.gnu.org...
https://ci.guix.gnu.org
  17.1% substitutes available (3,360 out of 19,704)
  at least 22,303.1 MiB of nars (compressed)
  24,029.2 MiB on disk (uncompressed)
  0.006 seconds per request (114.2 seconds in total)
  172.5 requests per second

  5.3% (870 out of 16,344) of the missing items are queued
  at least 1,000 queued builds
  aarch64-linux: 840 (84.0%)
  x86_64-linux: 84 (8.4%)
  powerpc64le-linux: 72 (7.2%)
  armhf-linux: 4 (.4%)
  build rate: 143.62 builds per hour
  i686-linux: 99.96 builds per hour
  x86_64-linux: 31.51 builds per hour
  aarch64-linux: 25.66 builds per hour
  powerpc64le-linux: 3.14 builds per hour
729 packages are missing from 'https://ci.guix.gnu.org' for 'aarch64-linux', 
among which:
  14236 libxft@2.3.3
/gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
  10066 icu4c@69.1  /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
  7723  jbig2dec@0.19   
/gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
  6552  libxt@1.2.1 
/gnu/store/rqqkla9kqjq3182gdqynkq2jb4jf2bl5-libxt-1.2.1-doc 
/gnu/store/82pzzkny0gwdfkfa5zvnyy4vbzxqnwyy-libxt-1.2.1 
  4555  openblas@0.3.20 
/gnu/store/wc8cbkcvgqkdf65fj7drb82plkf8igyw-openblas-0.3.20 
  4231  libxfixes@6.0.0 
/gnu/store/h70i9gyvif5kzpw7a8bdaxhw7sqiv7wz-libxfixes-6.0.0 
  3788  libxrandr@1.5.2 
/gnu/store/9gqkb75sr60lxic9a3nx543a6wxz0866-libxrandr-1.5.2 
  3320  libxkbfile@1.1.0
/gnu/store/kk0xk8j8ffgxvfs5kz1vxp2ipz28xwh4-libxkbfile-1.1.0 
  3297  xcb-util@0.4.0  
/gnu/store/n2x5kvy5zxbnh28ak7rbvw6brp97z1c9-xcb-util-0.4.0 
  3279  xcb-util-wm@0.4.1   
/gnu/store/hhasmz0l5f10f9nfg0nx378c7bcgf9v0-xcb-util-wm-0.4.1 
  3263  libxres@1.2.1   
/gnu/store/5mpjhsyq2bbri006g7lh6h82239s8fb8-libxres-1.2.1 
  1383  xprop@1.2.5 /gnu/store/61x7vscvrz8zqda73dlq5x35anhz8f8k-xprop-1.2.5 
  1381  xdg-user-dirs@0.17  
/gnu/store/gab0ycim14xflaxzp8arg8hfapx73z3l-xdg-user-dirs-0.17 
  1368  docbook2x@0.8.8 
/gnu/store/cclsrji5p14aj8bg3kmhkj5532k7l5m4-docbook2x-0.8.8 
  1286  perl-net-ssleay@1.92
/gnu/store/cn77pic9249j7h002a9701v0xzpi28yk-perl-net-ssleay-1.92 
  1179  emacs-minimal@28.1  
/gnu/store/c768r60pj3f2af2ysbjqdnlxwpgwmmpp-emacs-minimal-28.1 
  1121  go-std@1.17.9   
/gnu/store/fqdkzg6nlzhj93ysjqxrmqsz8srq2l9r-go-std-1.17.9 
  1021  ruby-activemodel@6.1.3  
/gnu/store/12gm34s68siqdfagc16ldjdp03klc5xw-ruby-activemodel-6.1.3 
  1005  ruby-shoulda-matchers@2.8.0 
/gnu/store/zcbdlp0qqb3axsmfvd8rdmn70kwdiw11-ruby-shoulda-matchers-2.8.0 
  1001  ruby-webmock@2.3.2  
/gnu/store/jfrng4404avghlkyxw2y5hqpdn8v3mih-ruby-webmock-2.3.2 
   976  ruby-sawyer@0.8.2   
/gnu/store/w0aaby4lvdcz0dr5006mx2ik6l1v2s5l-ruby-sawyer-0.8.2 
   634  jikes@1.22  /gnu/store/nwkrhbzbd8s9dysgik58af93b2kjm4w3-jikes-1.22 
   622  nss-certs@3.71  
/gnu/store/01k1v00g7vc7n90y5yr9bacrnr3ml46p-nss-certs-3.71 
   576  texlive-latex-cmap@59745
/gnu/store/zdws0y31a1hhgxb80qc2lx4smmp9pmgc-texlive-latex-cmap-59745 
   558  libxft@2.3.3
/gnu/store/2smylcjq5cppxcrj2mn229hlmh6bf7w4-libxft-2.3.3 
   491  xmodmap@1.0.10  
/gnu/store/rm8ww2flazqypwiagizbbsslv0g86h67-xmodmap-1.0.10 
   489  ucd@14.0.0  /gnu/store/f3hkrqnsj9dadliaixrlh17z4c9bsjfl-ucd-14.0.0 
   400  jbig2dec@0.19   
/gnu/store/lrpczm2m0agx0x5rp6kmn8c46mc85l35-jbig2dec-0.19 
   317  icu4c@69.1  /gnu/store/n9nx5rmnhdf3hdswhvns31diayq9vfq2-icu4c-69.1 
   309  libxt@1.2.1 
/gnu/store/rqqkla9kqjq3182gdqynkq2jb4jf2bl5-libxt-1.2.1-doc 
/gnu/store/82pzzkny0gwdfkfa5zvnyy4vbzxqnwyy-libxt-1.2.1 
   290  libxfixes@6.0.0 
/gnu/store/h70i9gyvif5kzpw7a8bdaxhw7sqiv7wz-libxfixes-6.0.0 
   272  libunwind-julia@1.3.1   
/gnu/store/i8c1hpdxdjb86vn4bqjiml5gad6mnrw5-libunwind-julia-1.3.1 
   268  ocaml@4.14.0
/gnu/store/zvz0xlwm7mvipr7c8q15fw7r6r25nn4s-ocaml-4.14.0 
   256  mailutils@3.15  
/gnu/store/6c5sd4z84p8mbck8b5vn4vd6s05mkbrc-mailutils-3.15 
   246  libxrandr@1.5.2 

Re: Merging ‘staging’?

2022-06-08 Thread Efraim Flashner
On Mon, Jun 06, 2022 at 11:17:47PM +0200, Ludovic Courtès wrote:
> Hi,
> 
> Ludovic Courtès  skribis:
> 
> ‘guix weather -s i686-linux’ says 89% (which is underestimated, because
> it wrongly checks for all the packages, including unsupported
> packages), which sounds good.
> 
> We have to check for AArch64 & co.  Any takers?
> 
> Overall it seems to me we should be able to merge ‘staging’ within a
> couple of days.  Thoughts?

Currently ci.guix.gnu.org isn't building any of the aarch64 packages,
and it looks like it hasn't since about May 26th. Once those start
building again I expect we'll see it's doing well. Minus perhaps the
rust stuff since I'm not sure the offload build machines can handle
that.

-- 
Efraim Flashner  אפרים פלשנר
GPG key = A28B F40C 3E55 1372 662D  14F7 41AA E7DC CA3D 8351
Confidentiality cannot be guaranteed on emails sent or received unencrypted




Merging ‘staging’?

2022-06-06 Thread Ludovic Courtès
Hi,

Ludovic Courtès  skribis:

> Thanks for the heads-up!  I think merging ‘staging’ should be
> top-priority.  People are welcome to upgrade/reconfigure from it with:
>
>   guix time-machine --branch=staging -- …
>
> Remaining things to check:
>
>   - [ ] system tests

I set up a jobset and we’re doing okay compared to ‘master’:

  https://ci.guix.gnu.org/jobset/staging-tests

The main regression is the timescaledb test, because timescaledb fails
its tests on that branch (help welcome!).
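
For anyone who wants to take a look, the failing system test can be run
on its own from a Git checkout with

  make check-system TESTS="timescaledb"

as described in the “Running the Test Suite” section of the manual.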

Then there’s what looks like a transient failure of the installation
test, which is more worrisome but not a regression I believe:

--8<---cut here---start->8---
Jun  5 17:44:46 localhost installer[252]: command ("guix" "system" "init" 
"--fallback" "--no-grafts" "--no-substitutes" "/mnt/etc/config.scm" "/mnt") 
succeeded 

conversation expecting pattern ((quote installation-complete))
Jun  5 17:44:46 localhost shepherd[1]: Service guix-daemon has been stopped. 

Jun  5 17:44:46 localhost shepherd[1]: Service guix-daemon has been started. 

Jun  5 17:44:46 localhost installer[252]: crashing due to uncaught exception: 
system-error ("umount" "~S: ~A" ("/remove" "Device or resource busy") (16)) 

Jun  5 17:44:47 localhost installer[252]: running form # 
("Unexpected problem") with 1 clients 

Jun  5 17:44:47 localhost installer[252]: form # 
("Unexpected problem"): client 20 replied #t 

/gnu/store/87wf3r1v18an9cj8d2jkh0240g07dqd6-shepherd-marionette.scm:1:1718: 
ERROR:
  1. :
  pattern: ((quote installation-complete))
  sexp: (contents-dialog (title "Unexpected problem") (text "The installer 
has encountered an unexpected problem. The backtrace is displayed below. You 
may choose to exit or create a dump archive.") (content "In 
./gnu/installer/steps.scm:\n   144:13 19 (run ((substitutes . #f) (network 
(select-technology . #< name: \"Wired\" type: \"ethernet\" 
powered?: #t connected?: #t>) (power-technology . #) (# . #<) 
???) ???) ???)\n   144:13 18 (run ((user #< name: \"root\" real-name: 
\"\" group: \"users\" password:  home-directory: \"/root\"> #< 
name: \"alice\" real-name: \"Alice\" group: \"users\" password:  ???) 
???) ???)\n   144:13 17 (run ((services #< name: \"GNOME\" 
type: desktop recommended?: #f snippet: ((service gnome-desktop-service-type)) 
packages: ()> #< name: \"Xfce\" ty???> ???) ???) ???)\n   
142:23 16 (run ((partition #< name: #f type: normal file-name: 
\"/dev/vda1\" disk-file-name: \"/dev/vda\" crypt-label: #f crypt-password: #f 
fs-type: ext4 bootable?: #t esp?:???> ???) ???) ???)\nIn 
./gnu/installer/newt/final.scm:\n   128:11 15 (run-final-page ((partition 
#< name: #f type: normal file-name: \"/dev/vda1\" 
disk-file-name: \"/dev/vda\" crypt-label: #f crypt-password: #f fs-type: ext4 
bootable???> ???) ???) ???)\n   101:21 14 (run-install-shell \"en_HK.utf8\" 
#:users _)\nIn ./gnu/installer/final.scm:\n191:5 13 (install-system 
\"en_HK.utf8\" #:users _)\n   113:13 12 (call-with-mnt-container #)\nIn 
gnu/build/linux-container.scm:\n   265:16 11 (run-container _ _ (mnt) _ 
# #:guest-uid _ 
#:guest-gid _)\nIn ./gnu/installer/final.scm:\n   222:13 10 (_)\nIn 
gnu/build/install.scm:\n270:5  9 (unmount-cow-store \"/mnt\" 
\"/tmp/guix-inst\")\nIn guix/build/syscalls.scm:\n588:8  8 (_ _ _ 
#:update-mtab? _)\nIn ice-9/boot-9.scm:\n  1685:16  7 (raise-exception _ 
#:continuable? _)\n  1780:13  6 (_ #< components: 
(#<> #< origin: \"umount\"> #< message: \"~S: 
~A\"> #< irritants: (\"/remove\" \"Device or resource busy\")> 
#<???>)\nIn ice-9/eval.scm:\n619:8  5 (_ #(#(#(#) system-error (\"umount\" \"~S: ~A\" (\"/remove\" \"Device or 
resource busy\") (16))) #> 
# ???))\n   626:19  4 (_ #(#(#(#) system-error (\"umount\" \"~S: ~A\" (\"/remove\" \"Device or 
resource busy\") (16))) #> 
# ???))\nIn ./gnu/installer/dump.scm:\n 54:4  3 (prepare-dump 
system-error (\"umount\" \"~S: ~A\" (\"/remove\" \"Device or resource busy\") 
(16)) #:result _)\nIn ice-9/ports.scm:\n   433:17  2 (call-with-output-file _ _ 
#:binary _ #:encoding _)\nIn ./gnu/installer/dump.scm:\n56:27  1 (_ 
#)\nIn unknown file:\n   0 (make-stack 
#t)\n./gnu/installer/dump.scm:58:36: In procedure umount: \"/remove\": Device 
or resource busy\n"))
Jun  5 17:44:47 localhost installer[196]: unmounting "/mnt/" 

Backtrace:

Re: Merging staging?

2019-01-10 Thread Ludovic Courtès
Hi,

Mark H Weaver  skribis:

> Mark H Weaver  writes:
>
>> Efraim Flashner  writes:
>>
>>> So I think I'd like to see a comparison on hydra of staging vs master
>>> and if it's good enough we go ahead and merge
>>
>> I just asked Hydra to produce another evaluation of 'staging'.  When it
>> has finished doing so, we'll know more.
>
> Hydra failed to create the evaluation, with the following error:
>
> hydra-eval-guile-jobs returned exit code 1:
> adding `/gnu/store/qcssqgcyk67v146c9n2byw1acnx7d693-git-export' to the load 
> path
> Backtrace:
> In ice-9/eval.scm:
> 619:8 19 (_ #(#(#)))
> In ice-9/command-line.scm:
>181:18 18 (_ #)
> In unknown file:
>   17 (eval (apply (module-ref (resolve-interface (?)) #) #) #)
> In /usr/local/bin/hydra-eval-guile-jobs:
>244:18 16 (eval-guile-jobs . _)
> In ice-9/eval.scm:
> 619:8 15 (_ #(#(#(#) # ?) ?))
>626:19 14 (_ #(#(#(#) # ?) ?))
> In guix/store.scm:
>   1698:24 13 (run-with-store _ _ #:guile-for-build _ #:system _ # _)
> In guix/channels.scm:
> 402:2 12 (_ _)
> 394:2 11 (_ _)
> 310:2 10 (_ _)
> In unknown file:
>9 (_ # # ?)
> In guix/gexp.scm:
> 702:2  8 (_ _)
> In guix/monads.scm:
> 471:9  7 (_ _)
> In guix/gexp.scm:
>572:13  6 (_ _)
> In guix/store.scm:
>1571:8  5 (_ _)
>   1594:38  4 (_ #)
> In guix/packages.scm:
>873:16  3 (cache! # # ?)
>   1195:22  2 (thunk)
>   1128:25  1 (bag->derivation # #< ?)
> In srfi/srfi-1.scm:
>592:17  0 (map1 (("source" # url: "?>) ?))
>
> srfi/srfi-1.scm:592:17: In procedure map1:
> Throw to key `srfi-34' with args `(# "ghostscript-CVE-2018-16509.patch: patch not found"] 49753e0>)'.

Presumably this happens while trying to build the inferior that will
then compute the derivations.

I suppose there’s a load path issue or a .go vs .scm mtime issue that
leads hydra-eval-guile-jobs to load the wrong gnu/packages/*.{scm,go}
files, eventually leading to that error.
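
A quick and dirty way to check for that on the machine would be
something along these lines (the exact paths depend on where the
checkout and its compiled files live):

  # report .scm files newer than their compiled .go counterpart
  for scm in gnu/packages/*.scm; do
    go="${scm%.scm}.go"
    [ -f "$go" ] && [ "$scm" -nt "$go" ] && echo "stale: $go"
  done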

This is weird because that works well for ‘master’ on hydra.

Ludo’.



Re: Merging staging?

2019-01-09 Thread Mark H Weaver
Mark H Weaver  writes:

> Efraim Flashner  writes:
>
>> So I think I'd like to see a comparison on hydra of staging vs master
>> and if it's good enough we go ahead and merge
>
> I just asked Hydra to produce another evaluation of 'staging'.  When it
> has finished doing so, we'll know more.

Hydra failed to create the evaluation, with the following error:

--8<---cut here---start->8---
hydra-eval-guile-jobs returned exit code 1:
adding `/gnu/store/qcssqgcyk67v146c9n2byw1acnx7d693-git-export' to the load path
Backtrace:
In ice-9/eval.scm:
619:8 19 (_ #(#(#)))
In ice-9/command-line.scm:
   181:18 18 (_ #)
In unknown file:
  17 (eval (apply (module-ref (resolve-interface (?)) #) #) #)
In /usr/local/bin/hydra-eval-guile-jobs:
   244:18 16 (eval-guile-jobs . _)
In ice-9/eval.scm:
619:8 15 (_ #(#(#(#) # ?) ?))
   626:19 14 (_ #(#(#(#) # ?) ?))
In guix/store.scm:
  1698:24 13 (run-with-store _ _ #:guile-for-build _ #:system _ # _)
In guix/channels.scm:
402:2 12 (_ _)
394:2 11 (_ _)
310:2 10 (_ _)
In unknown file:
   9 (_ # # ?)
In guix/gexp.scm:
702:2  8 (_ _)
In guix/monads.scm:
471:9  7 (_ _)
In guix/gexp.scm:
   572:13  6 (_ _)
In guix/store.scm:
   1571:8  5 (_ _)
  1594:38  4 (_ #)
In guix/packages.scm:
   873:16  3 (cache! # # ?)
  1195:22  2 (thunk)
  1128:25  1 (bag->derivation # #< ?)
In srfi/srfi-1.scm:
   592:17  0 (map1 (("source" # url: "?>) ?))

srfi/srfi-1.scm:592:17: In procedure map1:
Throw to key `srfi-34' with args `(#)'.

Some deprecated features have been used.  Set the environment
variable GUILE_WARN_DEPRECATED to "detailed" and rerun the
program to get more information.  Set it to "no" to suppress
this message.
--8<---cut here---end--->8---

I have no idea why this error happened, because I cannot find any
reference to that patch in the current 'staging' branch.

Ludovic, any idea what happened here?

   Mark



Re: Merging staging?

2019-01-09 Thread Ludovic Courtès
Hello!

Efraim Flashner  skribis:

> On Thu, Dec 20, 2018 at 03:58:09PM +0100, Julien Lepiller wrote:
>> Hi Guix,
>> 
>> I'd like to get staging merged soon, as it wasn't for quite some time. Here
>> are some stats about the current state of substitutes for staging:
>> 
>> According to guix weather, we have:
>> 
>> | architecture | berlin | hydra |
>> +--------------+--------+-------+
>> | x86_64       | 36.5%  | 81.7% |
>> | i686         | 23.8%  | 71.0% |
>> | aarch64      | 22.2%  | 00.0% |
>> | armhf        | 17.0%  | 45.6% |
>> 
>> What should the next step be?
>> 
>
> I have two patches that I should be done with by the end of today, to
> add some more flags to libdrm and mesa for armhf and aarch64. I can hold
> on to them for later, or I can go ahead and push them when they're ready.
> They shouldn't cause any rebuilds on x86_64 or i686.

So what’s the status of this branch now, does anyone know?  :-)

Seriously though, we should do something about it.

Ludo’.



Re: Merging staging?

2019-01-09 Thread Efraim Flashner



On January 9, 2019 7:15:56 PM UTC, Mark H Weaver  wrote:
>Hi Efraim,
>
>Efraim Flashner  writes:
>
>> I merged master into staging today and got the following chart:
>>
>> | architecture | berlin | hydra |
>> +--------------+--------+-------+
>> | x86_64       | 42.2%  | 78.8% |
>> | i686         | 28.2%  | 64.3% |
>> | aarch64      | 0%     | 23.0% |
>> | armhf        | 16.8%  | 50.4% |
>>
>> but compared to master:
>>
>> | architecture | berlin | hydra |
>> +--------------+--------+-------+
>> | x86_64       | 57.5%  | 91.9% |
>> | i686         | 44.4%  | 77.8% |
>> | aarch64      | 0%     | 41.8% |
>> | armhf        | 30.3%  | 69.6% |
>
>Something looks wrong here, because Hydra has never supported aarch64,
>i.e. it does not create jobs for that system and has never been
>connected to an aarch64 build slave, so I don't see how it could
>possibly have more than 0% coverage on that system.
>
>> So I think I'd like to see a comparison on hydra of staging vs master
>> and if it's good enough we go ahead and merge
>
>I just asked Hydra to produce another evaluation of 'staging'.  When it
>has finished doing so, we'll know more.
>
>  Thanks!
>Mark

And I thought I triple-checked I had them in the right columns!

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.



Re: Merging staging?

2019-01-09 Thread Mark H Weaver
Hi Efraim,

Efraim Flashner  writes:

> I merged master into staging today and got the following chart:
>
> | architecture | berlin | hydra |
> +--------------+--------+-------+
> | x86_64       | 42.2%  | 78.8% |
> | i686         | 28.2%  | 64.3% |
> | aarch64      | 0%     | 23.0% |
> | armhf        | 16.8%  | 50.4% |
>
> but compared to master:
>
> | architecture | berlin | hydra |
> +--------------+--------+-------+
> | x86_64       | 57.5%  | 91.9% |
> | i686         | 44.4%  | 77.8% |
> | aarch64      | 0%     | 41.8% |
> | armhf        | 30.3%  | 69.6% |

Something looks wrong here, because Hydra has never supported aarch64,
i.e. it does not create jobs for that system and has never been
connected to an aarch64 build slave, so I don't see how it could
possibly have more than 0% coverage on that system.

> So I think I'd like to see a comparison on hydra of staging vs master
> and if it's good enough we go ahead and merge

I just asked Hydra to produce another evaluation of 'staging'.  When it
has finished doing so, we'll know more.

  Thanks!
Mark



Re: Merging staging?

2019-01-09 Thread Efraim Flashner
On Sun, Dec 23, 2018 at 04:00:48PM +0200, Efraim Flashner wrote:
> On Thu, Dec 20, 2018 at 03:58:09PM +0100, Julien Lepiller wrote:
> > Hi Guix,
> > 
> > I'd like to get staging merged soon, as it wasn't for quite some time. Here
> > are some stats about the current state of substitutes for staging:
> > 
> > According to guix weather, we have:
> > 
> > | architecture | berlin | hydra |
> > +--------------+--------+-------+
> > | x86_64       | 36.5%  | 81.7% |
> > | i686         | 23.8%  | 71.0% |
> > | aarch64      | 22.2%  | 00.0% |
> > | armhf        | 17.0%  | 45.6% |
> > 
> > What should the next step be?
> > 
> 

I merged master into staging today and got the following chart:

| architecture | berlin | hydra |
+--------------+--------+-------+
| x86_64       | 42.2%  | 78.8% |
| i686         | 28.2%  | 64.3% |
| aarch64      | 0%     | 23.0% |
| armhf        | 16.8%  | 50.4% |

but compared to master:

| architecture | berlin | hydra |
+--------------+--------+-------+
| x86_64       | 57.5%  | 91.9% |
| i686         | 44.4%  | 77.8% |
| aarch64      | 0%     | 41.8% |
| armhf        | 30.3%  | 69.6% |

So I think I'd like to see a comparison on hydra of staging vs master
and if it's good enough we go ahead and merge


-- 
Efraim Flashner  אפרים פלשנר
GPG key = A28B F40C 3E55 1372 662D  14F7 41AA E7DC CA3D 8351
Confidentiality cannot be guaranteed on emails sent or received unencrypted




Re: Merging staging?

2018-12-23 Thread Efraim Flashner
On Thu, Dec 20, 2018 at 03:58:09PM +0100, Julien Lepiller wrote:
> Hi Guix,
> 
> I'd like to get staging merged soon, as it wasn't for quite some time. Here
> are some stats about the current state of substitutes for staging:
> 
> According to guix weather, we have:
> 
> | architecture | berlin | hydra |
> +--------------+--------+-------+
> | x86_64       | 36.5%  | 81.7% |
> | i686         | 23.8%  | 71.0% |
> | aarch64      | 22.2%  | 00.0% |
> | armhf        | 17.0%  | 45.6% |
> 
> What should the next step be?
> 

I have two patches that I should be done with by the end of today, to
add some more flags to libdrm and mesa for armhf and aarch64. I can hold
on to them for later, or I can go ahead and push them when they're ready.
They shouldn't cause any rebuilds on x86_64 or i686.

-- 
Efraim Flashner  אפרים פלשנר
GPG key = A28B F40C 3E55 1372 662D  14F7 41AA E7DC CA3D 8351
Confidentiality cannot be guaranteed on emails sent or received unencrypted




Re: Merging staging?

2018-12-21 Thread Ludovic Courtès
Hello Mark,

Mark H Weaver  skribis:

> It's a good question.  I have several hypotheses:

These are all valid but there’s a couple more to consider.  ;-)

Specifically we’ve had ENOSPC issues on some build nodes lately, and as
I wrote elsewhere, ‘guix offload’ would report them as “permanent
failures”.  Thus guix-daemon on berlin would cache those failures and
never retry afterwards.  This is fixed by commit
b96e05aefd7a4f734cfec3b27c2d38320d43b687.
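
(In the meantime, failures that were already cached can be inspected
and dropped by hand with ‘guix gc --list-failures’ and ‘guix gc
--clear-failures ITEMS’, assuming the daemon there indeed runs with
‘--cache-failures’.)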

Commit 63b0c3eaccdf1816b419632cd7fe721934d2eb27 also arranges so we
don’t choose machines low on disk space.

Another issue I’ve noticed is “database is locked” offloading crashes,
fixed by bdf860c2e99077d431da0cc1db4fc14db2a35d31.  We probably don’t
get these on hydra.gnu.org because we’re running a version that predates
the replacement of the ‘guix-register’ program by (guix store database).

There’s a few more issues about offloading in the bug tracker.  I
suspect these explain the low availability of substitutes to a large
extent.

Ludo’.



Re: Merging staging?

2018-12-21 Thread Gábor Boskovits
Hello,

On Fri, Dec 21, 2018 at 2:18, Mark H Weaver  wrote:

> Hi Julien,
>
> I've rearranged your reply from "top-posting" style to "bottom-posting"
> style.  Please consider using bottom-posting in the future.
>
> I wrote:
>
> > Julien Lepiller  writes:
> >
> >> I'd like to get staging merged soon, as it wasn't for quite some
> >> time. Here are some stats about the current state of substitutes for
> >> staging:
> >>
> >> According to guix weather, we have:
> >>
> >> | architecture | berlin | hydra |
> >> +--------------+--------+-------+
> >> | x86_64       | 36.5%  | 81.7% |
> >> | i686         | 23.8%  | 71.0% |
> >> | aarch64      | 22.2%  | 00.0% |
> >> | armhf        | 17.0%  | 45.6% |
> >>
> >> What should the next step be?
> >
> > I think we should wait until the coverage on armhf and aarch64 have
> > become larger, for the sake of users on those systems.
> >
> > Also, I've seen some commits that make me wonder if hydra is still
> > being configured as an authorized substitute server on new Guix
> > installations.
> > Do you know?
> >
> > If 'berlin' is the only substitute server by default, then we certainly
> > need to wait for those numbers to get higher, no?
> >
> > What do you think?
>
> Julien Lepiller  responded:
>
> > I agree, but I wonder if there is a reason for these to be so low?
>
> It's a good question.  I have several hypotheses:
>
> * Unfortunately, it is fairly common for builds for important core
>   packages to spuriously fail, often due to unreliable test suites, and
>   to cause thousands of other important dependent packages to fail.
>   When this happens on Hydra, I can see what's going on, and restart the
>   build and all of its dependents.
>

This is currently a problem: we can't see which dependency caused the
failure.


>   I wouldn't be surprised if some important core packages spuriously
>   failed to build on Berlin, but we have no effective way to see what
>   happened there.  If that's the case, the 'guix weather' numbers above
>   might never get much higher no matter how long we wait.
>
> * Berlin's build slots may have been occupied for long periods of time
>   by 'test.*' jobs stuck in an endless "waiting for udevd..." loop, as
>   described in .
>
>   Hydra's web interface allows me to monitor active jobs and manually
>   kill those stuck jobs when I find them.  I don't know how to do that
>   on Berlin.
>
> * Especially on armhf and aarch64, where Berlin has very little build
>   capacity, and new builds are being added to Berlin's build queue much
>   faster than they can be built, it is quite possible that Berlin is
>   spending most of its effort on long-outdated builds.
>
>   On Hydra, I can see when this is happening, and often intervene by
>   cancelling large numbers of outdated builds on armhf, so that it
>   remains focused on the most popular and up-to-date packages.
>
We are currently missing an admin interface on berlin; we would need
one, since cancelling a job should be a privileged operation.


> * On WIP branches like 'core-updates' and 'staging', when a new
>   evaluation is done, I cancel all outdated Hydra jobs on those
>   branches.  I don't know if anything similar is done on Berlin.
>
> In summary, there are several things that I regularly do to make
> efficient use of Hydra's limited build capacity.  I periodically look at
> Berlin's web interface to see how it has progressed, but it is currently
> mostly a black box to me.  I see no effective way to focus its limited
> resources on the most important builds, or to see when build slots are
> stuck.
>
>  Regards,
>Mark
>
I am currently looking into how to improve the situation.  Suggestions
are welcome.

Gábor

>
>


Re: Merging staging?

2018-12-20 Thread Mark H Weaver
Hi Julien,

I've rearranged your reply from "top-posting" style to "bottom-posting"
style.  Please consider using bottom-posting in the future.

I wrote:

> Julien Lepiller  writes:
>
>> I'd like to get staging merged soon, as it wasn't for quite some
>> time. Here are some stats about the current state of substitutes for
>> staging:
>>
>> According to guix weather, we have:
>>
>> | architecture | berlin | hydra |
>> +--------------+--------+-------+
>> | x86_64       | 36.5%  | 81.7% |
>> | i686         | 23.8%  | 71.0% |
>> | aarch64      | 22.2%  | 00.0% |
>> | armhf        | 17.0%  | 45.6% |
>>
>> What should the next step be?
>
> I think we should wait until the coverage on armhf and aarch64 have
> become larger, for the sake of users on those systems.
>
> Also, I've seen some commits that make me wonder if hydra is still
> being configured as an authorized substitute server on new Guix
> installations.
> Do you know?
>
> If 'berlin' is the only substitute server by default, then we certainly
> need to wait for those numbers to get higher, no?
>
> What do you think?

Julien Lepiller  responded:

> I agree, but I wonder if there is a reason for these to be so low?

It's a good question.  I have several hypotheses:

* Unfortunately, it is fairly common for builds for important core
  packages to spuriously fail, often due to unreliable test suites, and
  to cause thousands of other important dependent packages to fail.
  When this happens on Hydra, I can see what's going on, and restart the
  build and all of its dependents.

  I wouldn't be surprised if some important core packages spuriously
  failed to build on Berlin, but we have no effective way to see what
  happened there.  If that's the case, the 'guix weather' numbers above
  might never get much higher no matter how long we wait.

* Berlin's build slots may have been occupied for long periods of time
  by 'test.*' jobs stuck in an endless "waiting for udevd..." loop, as
  described in .

  Hydra's web interface allows me to monitor active jobs and manually
  kill those stuck jobs when I find them.  I don't know how to do that
  on Berlin.

* Especially on armhf and aarch64, where Berlin has very little build
  capacity, and new builds are being added to Berlin's build queue much
  faster than they can be built, it is quite possible that Berlin is
  spending most of its effort on long-outdated builds.

  On Hydra, I can see when this is happening, and often intervene by
  cancelling large numbers of outdated builds on armhf, so that it
  remains focused on the most popular and up-to-date packages.

* On WIP branches like 'core-updates' and 'staging', when a new
  evaluation is done, I cancel all outdated Hydra jobs on those
  branches.  I don't know if anything similar is done on Berlin.

In summary, there are several things that I regularly do to make
efficient use of Hydra's limited build capacity.  I periodically look at
Berlin's web interface to see how it has progressed, but it is currently
mostly a black box to me.  I see no effective way to focus its limited
resources on the most important builds, or to see when build slots are
stuck.

 Regards,
   Mark



Re: Merging staging?

2018-12-20 Thread Julien Lepiller
I agree, but I wonder if there is a reason for these to be so low?

On December 20, 2018 at 19:36:54 GMT+01:00, Mark H Weaver  wrote:
>Hi Julien,
>
>Julien Lepiller  writes:
>
>> I'd like to get staging merged soon, as it wasn't for quite some
>> time. Here are some stats about the current state of substitutes for
>> staging:
>>
>> According to guix weather, we have:
>>
>> | architecture | berlin | hydra |
>> +--------------+--------+-------+
>> | x86_64       | 36.5%  | 81.7% |
>> | i686         | 23.8%  | 71.0% |
>> | aarch64      | 22.2%  | 00.0% |
>> | armhf        | 17.0%  | 45.6% |
>>
>> What should the next step be?
>
>I think we should wait until the coverage on armhf and aarch64 have
>become larger, for the sake of users on those systems.
>
>Also, I've seen some commits that make me wonder if hydra is still
>being
>configured as an authorized substitute server on new Guix
>installations.
>Do you know?
>
>If 'berlin' is the only substitute server by default, then we certainly
>need to wait for those numbers to get higher, no?
>
>What do you think?
>
>Regards,
>  Mark



Re: Merging staging?

2018-12-20 Thread Mark H Weaver
Hi Julien,

Julien Lepiller  writes:

> I'd like to get staging merged soon, as it wasn't for quite some
> time. Here are some stats about the current state of substitutes for
> staging:
>
> According to guix weather, we have:
>
> | architecture | berlin | hydra |
> +--------------+--------+-------+
> | x86_64       | 36.5%  | 81.7% |
> | i686         | 23.8%  | 71.0% |
> | aarch64      | 22.2%  | 00.0% |
> | armhf        | 17.0%  | 45.6% |
>
> What should the next step be?

I think we should wait until the coverage on armhf and aarch64 have
become larger, for the sake of users on those systems.

Also, I've seen some commits that make me wonder if hydra is still being
configured as an authorized substitute server on new Guix installations.
Do you know?

If 'berlin' is the only substitute server by default, then we certainly
need to wait for those numbers to get higher, no?

What do you think?

Regards,
  Mark



Merging staging?

2018-12-20 Thread Julien Lepiller

Hi Guix,

I'd like to get staging merged soon, as it wasn't for quite some time. 
Here are some stats about the current state of substitutes for staging:


According to guix weather, we have:

| architecture | berlin | hydra |
+--------------+--------+-------+
| x86_64       | 36.5%  | 81.7% |
| i686         | 23.8%  | 71.0% |
| aarch64      | 22.2%  | 00.0% |
| armhf        | 17.0%  | 45.6% |
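
(Each column is a ‘guix weather’ run against the corresponding
substitute server, roughly:

  guix weather --system=armhf-linux --substitute-urls=URL

with URL standing in for berlin's or hydra's substitute URL, and
likewise for the other architectures.)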

What should the next step be?