qiqizjl commented on PR #12841:
URL: https://github.com/apache/apisix/pull/12841#issuecomment-3690993845

   Hi @Baoyuantop, thanks for the review!
   
   You are right that PR #12549 changed the default values to `0` to fix JSON 
log format issues. However, this introduced a problem with Prometheus metrics 
reporting.
   
   Here is the issue:
   
   **Current state after PR #12549:**
   - `ngx_tpl.lua` sets default values to `0`
   - `exporter.lua` (lines 376-394) checks `if vars.llm_prompt_tokens ~= ""` to decide whether to report metrics
   - Since the default is the string `"0"`, which is not an empty string, the check always passes and metrics are **always reported**, even when no LLM plugin is enabled (see the sketch below)
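   
   For reference, here is a minimal sketch of how that check behaves. Only the `~= ""` guard is taken from `exporter.lua`; `report_llm_metric` is a placeholder of mine, not the actual exporter code:
   
   ```lua
   -- Illustrative sketch only; the real logic lives in exporter.lua (~lines 376-394).
   -- report_llm_metric is a placeholder, not an APISIX function.
   local function report_llm_metric(name, value)
       ngx.log(ngx.INFO, "would report ", name, " = ", value)
   end
   
   local vars = ngx.var
   
   -- With ngx_tpl.lua defaulting the variable to "0", this condition is always
   -- true, so a sample is reported on every request.
   if vars.llm_prompt_tokens ~= "" then
       report_llm_metric("llm_prompt_tokens", tonumber(vars.llm_prompt_tokens) or 0)
   end
   ```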
   
   **The problem we are solving:**
   When the LLM plugin is not enabled:
   - `llm_prompt_tokens` = `0` (default)
   - `exporter.lua` sees `"0" ~= ""` → reports a metric with value `0`
   - This creates invalid/unnecessary Prometheus metrics (see the standalone snippet below)
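   
   To make the comparison concrete, this is plain, standalone Lua (nothing APISIX-specific) showing why the `~= ""` guard cannot tell "plugin not enabled" apart from the `0` default:
   
   ```lua
   -- Plain Lua, runnable with `lua` or `resty -e`; no APISIX code involved.
   -- nginx variables are strings, so a default of 0 reaches Lua as "0".
   local default_value = "0"
   
   print(default_value ~= "")      --> true: the exporter's guard passes
   print(tonumber(default_value))  --> 0: a meaningless 0 sample gets emitted
   ```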
   
   **Our solution:**
   - Change the defaults back to `""` (empty string)
   - Update `openai-base.lua` to check `== ""` instead of `== "0"` (sketched below)
   - `exporter.lua` already checks `~= ""`, so it will correctly skip reporting when no LLM plugin is active
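   
   Roughly, the `openai-base.lua` side of the change looks like this; the branch body is omitted/hypothetical here, only the comparison is what this PR touches:
   
   ```lua
   -- Sketch of the comparison change in openai-base.lua; only the guard itself
   -- is taken from this PR, the branch body is elided.
   
   -- old guard, written against the "0" defaults from PR #12549:
   -- if ngx.var.llm_prompt_tokens == "0" then ... end
   
   -- new guard, written against the restored "" defaults:
   if ngx.var.llm_prompt_tokens == "" then
       -- the variable was never set by an LLM plugin on this request
   end
   ```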
   
   This ensures:
   1. Prometheus metrics are only reported when LLM functionality is actually used
   2. The `openai-base.lua` logic stays consistent with the restored empty-string defaults
   
   Does this make sense?

