Package: llama.cpp-tools
Version: 8064+dfsg-1

When starting the llama.cpp server to provide an API and a simple web
service on a network, the web service does not work because the
requests are sent to the wrong location.

I start my server like this:

  LC_ALL=C llama-server -ngl 256  -c $(( 42 * 1024)) --temp 0.7 \
    --repeat_penalty 1.1 -n -1 \
    -m ~/models-llama/Qwen3-Coder-30B-A3B-Instruct-Q5_K_S.gguf

The cause is the hardcoded use of http://127.0.0.1:8080 as the API
endpoint in the web service JavaScript.

Applying the following patch fixes it locally, but it would be better if
the baseURL were dynamically calculated from the origin of the web page,
and perhaps overridable via some local configuration.

--- /usr/share/llama.cpp-tools/llama-server/themes/simplechat/simplechat.js.orig	2026-02-19 06:30:45.484328145 +0100
+++ /usr/share/llama.cpp-tools/llama-server/themes/simplechat/simplechat.js	2026-02-19 06:33:44.000033528 +0100
@@ -722,7 +722,7 @@
 class Me {
 
     constructor() {
-        this.baseURL = "http://127.0.0.1:8080";;
+        this.baseURL = "https://server.example.com:8080";;
         this.defaultChatIds = [ "Default", "Other" ];
         this.multiChat = new MultiChatUI();
         this.bStream = true;
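
A dynamic alternative could look like the sketch below. It derives the
base URL from a Location-like object (in the browser one would pass
window.location) and accepts an optional override; the function name
deriveBaseURL and the override parameter are illustrative, not part of
the existing simplechat.js code.

```javascript
// Sketch: derive the API base URL from the page's own origin instead of
// hardcoding http://127.0.0.1:8080. `loc` is a Location-like object;
// in the browser, pass window.location. An optional `override` (e.g.
// read from a local configuration file) takes precedence if given.
function deriveBaseURL(loc, override) {
    if (override) {
        return override;
    }
    // Modern browsers expose loc.origin; fall back to protocol + host.
    return loc.origin ? loc.origin : `${loc.protocol}//${loc.host}`;
}
```

In the Me constructor this would become something like
this.baseURL = deriveBaseURL(window.location), so the web service
always talks to whichever host actually served the page.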

-- 
Happy hacking
Petter Reinholdtsen
