hw872715125 opened a new issue, #12532:
URL: https://github.com/apache/apisix/issues/12532

   ### Current Behavior
   
   I have deployed APISIX with Docker and am using the ai-proxy and ai-prompt-decorator plugins. While testing the API, I found that when ai-prompt-decorator is enabled, the user's historical input is automatically appended to the messages recorded in the logs.
   
   <img width="966" height="95" alt="Image" 
src="https://github.com/user-attachments/assets/2807851a-37b3-46f7-87ff-48e728b99d3c";
 />
   
   When I do not use the ai-prompt-decorator plugin, the user's historical 
input content is not automatically appended.
   
   ### Expected Behavior
   
   Normally, the ai-prompt-decorator plugin should only prepend the configured system prompt; it should not append the user's historical input.
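
   For the single user message in the test request below ("bye"), the decorated request body forwarded upstream would therefore be expected to contain only the prepended system prompt plus that one message, roughly like this (illustrative sketch; the system content is whatever ${fallen_angle_daily_prompt} expands to):

   {
     "messages": [
       { "role": "system", "content": "<fallen_angle_daily_prompt>" },
       { "role": "user", "content": "bye" }
     ]
   }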
   
   ### Error Logs
   
   _No response_
   
   ### Steps to Reproduce
   
   This is my route configuration.
   curl "http://127.0.0.1:9180/apisix/admin/routes"; -X PUT \
     -H "X-API-KEY: ${admin_key}" \
     -d '{
       "id": "ai-chat-route",
       "uri": "/v1/chat/completions/daily",
       "hosts": ["test.webber.com"],
       "methods": ["POST", "OPTIONS"],
       "plugins": {
           "limit-count": {
               "count": 3,
               "time_window": 5,
               "key_type": "var_combination",
               "key": "$remote_addr $http_user_id",
               "rejected_code": 429,
               "rejected_msg": "Too many requests",
               "policy": "local"
             },
             "cors": {},
             "jwt-auth": {
               "header": "Authorization",
               "hide_credentials": true
             },
             "prometheus": {
               "prefer_name": true
             },
             "file-logger": {
               "path": "logs/file.log",
               "include_req_body": true,
               "include_resp_body": true
              },
             "ai-proxy": {
               "provider": "openai-compatible",
               "auth": {
                 "header": {
                   "Authorization": "Bearer '"${api_key}"'"
                 }
               },
               "override": {
                 "endpoint": 
"https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions";
               },
               "logging": {
                   "summaries": true,
                   "payloads": true
               },
               "timeout": 180000
             },
             "ai-prompt-decorator": {
               "prepend":[
                 {
                   "role": "system",
                   "content": "'"${fallen_angle_daily_prompt}"'"
                 }
               ]
             }
         }
     }'
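
   To help isolate the behavior, a stripped-down route that keeps only file-logger, ai-proxy, and ai-prompt-decorator should show whether the decorator alone causes the history to be appended. This is an illustrative sketch reusing the placeholders above; the route id, uri, and log path here are made up for the test:

   curl "http://127.0.0.1:9180/apisix/admin/routes" -X PUT \
     -H "X-API-KEY: ${admin_key}" \
     -d '{
       "id": "ai-chat-route-minimal",
       "uri": "/v1/chat/completions/minimal",
       "methods": ["POST"],
       "plugins": {
         "file-logger": {
           "path": "logs/file-minimal.log",
           "include_req_body": true,
           "include_resp_body": true
         },
         "ai-proxy": {
           "provider": "openai-compatible",
           "auth": {
             "header": {
               "Authorization": "Bearer '"${api_key}"'"
             }
           },
           "override": {
             "endpoint": "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions"
           },
           "timeout": 180000
         },
         "ai-prompt-decorator": {
           "prepend": [
             {
               "role": "system",
               "content": "'"${fallen_angle_daily_prompt}"'"
             }
           ]
         }
       }
     }'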
   
   This is my API test code.
   
   var myHeaders = new Headers();
   
   myHeaders.append("Content-Type", "application/json");
   myHeaders.append("Authorization", "jjww")
   
   var raw = JSON.stringify({
      "model": "qwen-max-latest",
      "messages": [
         {
            "role": "user",
            "content": "bye"
         }
      ],
      "temperature": 0.7,
      "top_p": 0.8,
      "enable_search": true,
      "search_options": {
         "forced_search": true
      },
      "stream": true,
      "stream_options": {
         "include_usage": true
      }
   });
   
   var requestOptions = {
      method: 'POST',
      headers: myHeaders,
      body: raw,
      redirect: 'follow'
   };
   
   fetch("https://test.webber.com/v1/chat/completions/daily";, requestOptions)
      .then(response => response.text())
      .then(result => console.log(result))
      .catch(error => console.log('error', error));
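
   For reference, the same test request expressed as a curl command (an equivalent of the fetch call above, using the same headers and body):

   curl "https://test.webber.com/v1/chat/completions/daily" -X POST \
     -H "Content-Type: application/json" \
     -H "Authorization: jjww" \
     -d '{
       "model": "qwen-max-latest",
       "messages": [
         { "role": "user", "content": "bye" }
       ],
       "temperature": 0.7,
       "top_p": 0.8,
       "enable_search": true,
       "search_options": { "forced_search": true },
       "stream": true,
       "stream_options": { "include_usage": true }
     }'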
   
   ### Environment
   
   <img width="1459" height="170" alt="Image" 
src="https://github.com/user-attachments/assets/d43c57da-429b-48ff-a6e5-672d89e4bbf0";
 />
   

