Script 'mail_helper' called by obssrc

Hello community,

here is the log from the commit of package llamacpp for openSUSE:Factory checked in at 2025-02-16 22:40:49
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/llamacpp (Old)
 and      /work/SRC/openSUSE:Factory/.llamacpp.new.8181 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Package is "llamacpp"

Sun Feb 16 22:40:49 2025 rev:4 rq:1246038 version:4719

Changes:
--------
--- /work/SRC/openSUSE:Factory/llamacpp/llamacpp.changes        2025-02-03 21:46:12.681171367 +0100
+++ /work/SRC/openSUSE:Factory/.llamacpp.new.8181/llamacpp.changes      2025-02-16 22:48:55.361043252 +0100
@@ -1,0 +2,8 @@
+Sat Feb 15 01:03:56 UTC 2025 - eyadlore...@gmail.com
+
+- Update to version 4719:
+  * Too many changes to list here. Please refer to the upstream
+    changelog for more information.
+    https://github.com/ggerganov/llama.cpp/compare/b4589...b4719
+
+-------------------------------------------------------------------

Old:
----
  llamacpp-4589.obscpio

New:
----
  llamacpp-4719.obscpio

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ llamacpp.spec ++++++
--- /var/tmp/diff_new_pack.IZWouP/_old  2025-02-16 22:48:59.229204060 +0100
+++ /var/tmp/diff_new_pack.IZWouP/_new  2025-02-16 22:48:59.233204226 +0100
@@ -17,7 +17,7 @@
 Name:           llamacpp
-Version:        4589
+Version:        4719
 Release:        0
 Summary:        llama-cli tool to run inference using the llama.cpp library
 License:        MIT
@@ -142,10 +142,6 @@
 # used for shader compilation only
 rm %{buildroot}%{_bindir}/vulkan-shaders-gen
 
-# fix .pc file paths
-mkdir -p %{buildroot}/%{_libdir}/pkgconfig
-mv %{buildroot}%{_prefix}/lib/pkgconfig/* %{buildroot}/%{_libdir}/pkgconfig/
-
 # remove .py extension
 mv %{buildroot}%{_bindir}/convert_hf_to_gguf.py %{buildroot}%{_bindir}/convert_hf_to_gguf

++++++ _service ++++++
--- /var/tmp/diff_new_pack.IZWouP/_old  2025-02-16 22:48:59.277206055 +0100
+++ /var/tmp/diff_new_pack.IZWouP/_new  2025-02-16 22:48:59.277206055 +0100
@@ -4,7 +4,7 @@
     <param name="filename">llamacpp</param>
     <param name="url">https://github.com/ggerganov/llama.cpp.git</param>
     <param name="scm">git</param>
-    <param name="revision">b4589</param>
+    <param name="revision">b4719</param>
     <param name="versionformat">@PARENT_TAG@</param>
     <param name="versionrewrite-pattern">b(.*)</param>
     <param name="changesgenerate">enable</param>

++++++ _servicedata ++++++
--- /var/tmp/diff_new_pack.IZWouP/_old  2025-02-16 22:48:59.305207219 +0100
+++ /var/tmp/diff_new_pack.IZWouP/_new  2025-02-16 22:48:59.309207386 +0100
@@ -1,6 +1,6 @@
 <servicedata>
   <service name="tar_scm">
     <param name="url">https://github.com/ggerganov/llama.cpp.git</param>
-    <param name="changesrevision">eb7cf15a808d4d7a71eef89cc6a9b96fe82989dc</param></service></servicedata>
+    <param name="changesrevision">89daa2564f6eab33be53c6a1b39273af536d6bb3</param></service></servicedata>
(No newline at EOF)

++++++ llamacpp-4589.obscpio -> llamacpp-4719.obscpio ++++++
++++ 35699 lines of diff (skipped)

++++++ llamacpp.obsinfo ++++++
--- /var/tmp/diff_new_pack.IZWouP/_old  2025-02-16 22:49:01.749308825 +0100
+++ /var/tmp/diff_new_pack.IZWouP/_new  2025-02-16 22:49:01.753308992 +0100
@@ -1,5 +1,5 @@
 name: llamacpp
-version: 4589
-mtime: 1738176344
-commit: eb7cf15a808d4d7a71eef89cc6a9b96fe82989dc
+version: 4719
+mtime: 1739565968
+commit: 89daa2564f6eab33be53c6a1b39273af536d6bb3
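
Note on the `_service` settings seen in the diff above: `versionformat` set to `@PARENT_TAG@` takes the upstream git tag (e.g. `b4719`), and `versionrewrite-pattern` with `b(.*)` strips the leading `b`, which is why the package version in this mail is `4719`. A minimal sketch of that rewrite, assuming (as is common for obs-service-tar_scm configs without an explicit `versionrewrite-replacement`) that the first capture group becomes the version; the `rewrite_version` helper is hypothetical, for illustration only:

```python
import re

def rewrite_version(tag: str, pattern: str = r"b(.*)") -> str:
    # Hypothetical helper mimicking versionrewrite-pattern:
    # match the whole tag and keep the first capture group.
    m = re.fullmatch(pattern, tag)
    # Fall back to the unmodified tag if the pattern does not match.
    return m.group(1) if m else tag

print(rewrite_version("b4719"))  # -> 4719
```

This matches the old and new revisions in the diff: `b4589` became version `4589`, and `b4719` became version `4719`.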