Hello community,

here is the log from the commit of package llamacpp for openSUSE:Factory checked in at 2025-08-25 20:38:58
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/llamacpp (Old)
 and      /work/SRC/openSUSE:Factory/.llamacpp.new.30751 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "llamacpp"

Mon Aug 25 20:38:58 2025 rev:17 rq:1301212 version:6188

Changes:
--------
--- /work/SRC/openSUSE:Factory/llamacpp/llamacpp.changes        2025-08-13 16:32:45.754034455 +0200
+++ /work/SRC/openSUSE:Factory/.llamacpp.new.30751/llamacpp.changes     2025-08-25 20:41:04.135558407 +0200
@@ -1,0 +2,19 @@
+Sun Aug 17 22:17:38 UTC 2025 - Eyad Issa <[email protected]>
+
+- Update to version 6188:
+  * Vulkan backend improvements: larger workgroups, optimized
+    argsort, fused adds, bounds checking, out-of-bounds and compile
+    warning fixes, performance logging.
+  * OpenCL backend: initial FA and mxfp4 support.
+  * Model support: LiquidAI LFM2-VL vision family, 18-layer Gemma
+    3-270m model type.
+  * Common: fixed double BOS, improved chat templates, added
+    override-tensor and CPU MoE draft parameters.
+  * GGML: initial IBM zDNN backend, rope_multi update, conv_1d_dw
+    bug fix, block_iq4_nlx8 repack, improved Mistral integration.
+  * Server: SWA checkpoints, -td/-tbd parameters, harmony thought
+    message filtering.
+  * Perplexity: improved error hints and constraint reporting.
+  * GPT-OSS: harmony parsing implemented.
+
+-------------------------------------------------------------------

Old:
----
  llamacpp-6139.tar.gz

New:
----
  llamacpp-6188.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ llamacpp.spec ++++++
--- /var/tmp/diff_new_pack.pglQFW/_old  2025-08-25 20:41:04.847588237 +0200
+++ /var/tmp/diff_new_pack.pglQFW/_new  2025-08-25 20:41:04.851588404 +0200
@@ -20,7 +20,7 @@
 %global backend_dir %{_libdir}/ggml
 
 Name:           llamacpp
-Version:        6139
+Version:        6188
 Release:        0
 Summary:        Inference of Meta's LLaMA model (and others) in pure C/C++
 License:        MIT

++++++ llamacpp-6139.tar.gz -> llamacpp-6188.tar.gz ++++++
/work/SRC/openSUSE:Factory/llamacpp/llamacpp-6139.tar.gz /work/SRC/openSUSE:Factory/.llamacpp.new.30751/llamacpp-6188.tar.gz differ: char 16, line 1
