Script 'mail_helper' called by obssrc
Hello community,

here is the log from the commit of package llamacpp for openSUSE:Factory 
checked in at 2025-09-02 17:58:24
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/llamacpp (Old)
 and      /work/SRC/openSUSE:Factory/.llamacpp.new.1977 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "llamacpp"

Tue Sep  2 17:58:24 2025 rev:18 rq:1302234 version:6269

Changes:
--------
--- /work/SRC/openSUSE:Factory/llamacpp/llamacpp.changes        2025-08-25 20:41:04.135558407 +0200
+++ /work/SRC/openSUSE:Factory/.llamacpp.new.1977/llamacpp.changes      2025-09-02 17:58:46.163624899 +0200
@@ -1,0 +2,25 @@
+Mon Aug 25 13:29:14 UTC 2025 - Eyad Issa <eyadlore...@gmail.com>
+
+- Update to version 6269:
+  * Model and conversion: support for Seed-OSS, GPT-OSS
+    response_format, interns1-mini, Ernie 4.5, gpt-oss type
+    strings, improved Mistral templates, new model conversion
+    tool/example with torch-cpu.
+  * Vulkan backend: multiple optimizations (rms_norm, mul_mat_id,
+    synchronization, conv2d, subgroup ops), new ops (exp,
+    conv_2d_dw f16, ggml_mean).
+  * GGML/CPU: added conv3d op, WebGPU quantization support,
+    Q5_0/Q5_1 on s390x, mxfp4 intrinsics on ppc64le.
+  * Server and chat: multimodal completion and embeddings
+    JSON support, improved OpenAI API compatibility and usage
+    statistics, disabled context shift by default, fixed ordering
+    of tasks, webui issues, debug assertions, clarified
+    reasoning_format.
+  * KV cache: unified handling improvements, support for reuse,
+    removal of deprecated APIs, simplifications.
+  * Miscellaneous: fixed logging of non-ASCII characters, removed
+    deprecated or unused code and build artifacts.
+  * Full commit log:
+    https://github.com/ggml-org/llama.cpp/compare/b6188...b6269
+
+-------------------------------------------------------------------

Old:
----
  llamacpp-6188.tar.gz

New:
----
  llamacpp-6269.tar.gz

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ llamacpp.spec ++++++
--- /var/tmp/diff_new_pack.CB0s1V/_old  2025-09-02 17:58:46.911656378 +0200
+++ /var/tmp/diff_new_pack.CB0s1V/_new  2025-09-02 17:58:46.911656378 +0200
@@ -20,7 +20,7 @@
 %global backend_dir %{_libdir}/ggml
 
 Name:           llamacpp
-Version:        6188
+Version:        6269
 Release:        0
 Summary:        Inference of Meta's LLaMA model (and others) in pure C/C++
 License:        MIT

++++++ llamacpp-6188.tar.gz -> llamacpp-6269.tar.gz ++++++
/work/SRC/openSUSE:Factory/llamacpp/llamacpp-6188.tar.gz /work/SRC/openSUSE:Factory/.llamacpp.new.1977/llamacpp-6269.tar.gz differ: char 15, line 1
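The changelog above notes improved OpenAI API compatibility in the llama.cpp server. As a rough illustration only (not part of this package update), a client can target a locally running llama-server through its OpenAI-style chat endpoint; the base URL, port, and model name below are placeholders for a local setup:

```python
import json
import urllib.request

def build_chat_request(prompt, model="default",
                       base_url="http://localhost:8080"):
    """Build an OpenAI-style chat completion request for llama-server.

    The /v1/chat/completions path follows the OpenAI chat API that
    llama.cpp's server aims to be compatible with; base_url and model
    are assumed values for a local instance, not project defaults.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Build (but do not send) a request; sending it requires a running server.
req = build_chat_request("Hello")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` would return a JSON body whose exact fields depend on the server version.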
