https://git.altlinux.org/tasks/412270/logs/events.1.1.log
https://packages.altlinux.org/tasks/412270

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp     6:44     -    5:20

2026-Mar-22 16:12:12 :: test-only task #412270 for sisyphus started by vt:
#100 build 8470-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Mar-22 16:12:10
2026-Mar-22 16:12:14 :: [aarch64] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-22 16:12:14 :: [x86_64] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-22 16:12:14 :: [i586] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-22 16:12:21 :: [i586] #100 llama.cpp.git 8470-alt1: build SKIPPED
build/100/x86_64/log:[00:02:24] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:24] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-22 16:17:34 :: [x86_64] #100 llama.cpp.git 8470-alt1: build OK
2026-Mar-22 16:18:58 :: [aarch64] #100 llama.cpp.git 8470-alt1: build OK
2026-Mar-22 16:19:06 :: 100: build check OK
2026-Mar-22 16:19:07 :: build check OK
2026-Mar-22 16:19:21 :: #100: llama.cpp.git 8470-alt1: version check OK
2026-Mar-22 16:19:21 :: build version check OK
--- llama.cpp-cpu-8470-alt1.x86_64.rpm.share    2026-03-22 16:19:25.113215444 +0000
+++ llama.cpp-cpu-8470-alt1.aarch64.rpm.share   2026-03-22 16:19:26.482228057 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md     100644  UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt     100644  ASCII English text
 /usr/share/doc/llama.cpp/docs  40755   directory
warning (#100): non-identical /usr/share part
2026-Mar-22 16:19:43 :: noarch check OK
2026-Mar-22 16:19:45 :: plan: src +1 -1 =21773, aarch64 +8 -8 =38617, x86_64 +10 -10 =39627
#100 llama.cpp 8192-alt1 -> 1:8470-alt1
 Sun Mar 22 2026 Vitaly Chikunov <vt@altlinux> 1:8470-alt1
 - Update to b8470 (2026-03-22).
2026-Mar-22 16:20:32 :: patched apt indices
2026-Mar-22 16:20:42 :: created next repo
2026-Mar-22 16:20:53 :: duplicate provides check OK
2026-Mar-22 16:21:34 :: dependencies check OK
2026-Mar-22 16:22:15 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-22 16:22:26 :: [x86_64] #100 libllama: install check OK
2026-Mar-22 16:22:31 :: [x86_64] #100 libllama-debuginfo: install check OK
2026-Mar-22 16:22:35 :: [aarch64] #100 libllama: install check OK
        x86_64: libllama-devel=1:8470-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-22 16:22:36 :: [x86_64] #100 libllama-devel: install check OK
2026-Mar-22 16:22:46 :: [aarch64] #100 libllama-debuginfo: install check OK
2026-Mar-22 16:22:54 :: [x86_64] #100 llama.cpp: install check OK
        aarch64: libllama-devel=1:8470-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-22 16:22:56 :: [aarch64] #100 libllama-devel: install check OK
2026-Mar-22 16:23:00 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Mar-22 16:23:08 :: [aarch64] #100 llama.cpp: install check OK
2026-Mar-22 16:23:09 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-22 16:23:19 :: [aarch64] #100 llama.cpp-cpu: install check OK
2026-Mar-22 16:23:26 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Mar-22 16:23:35 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Mar-22 16:23:45 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Mar-22 16:23:47 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2026-Mar-22 16:23:51 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Mar-22 16:23:58 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-22 16:24:02 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Mar-22 16:24:19 :: [x86_64-i586] generated apt indices
2026-Mar-22 16:24:19 :: [x86_64-i586] created next repo
2026-Mar-22 16:24:30 :: [x86_64-i586] dependencies check OK
2026-Mar-22 16:24:31 :: gears inheritance check OK
2026-Mar-22 16:24:31 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Mar-22 16:24:32 :: acl check OK
2026-Mar-22 16:24:45 :: created contents_index files
2026-Mar-22 16:24:53 :: created hash files: aarch64 src x86_64
2026-Mar-22 16:24:56 :: task #412270 for sisyphus TESTED
_______________________________________________
Sisyphus-incominger mailing list
[email protected]
https://lists.altlinux.org/mailman/listinfo/sisyphus-incominger