Plus-L opened a new issue, #12142:
URL: https://github.com/apache/apisix/issues/12142

   ### Current State
   
   Hello, I encountered an issue while using the APISIX `ai-proxy-multi` plugin. Following the documentation at
   https://apisix.apache.org/zh/docs/apisix/plugins/ai-proxy-multi/#send-request-to-an-openai-compatible-llm
   resulted in an error message.
   curl "http://127.0.0.1:9180/apisix/admin/routes"; -X PUT \
     -H "X-API-KEY: ${ADMIN_API_KEY}" \
     -d '{
       "id": "ai-proxy-multi-route",
       "uri": "/anything",
       "methods": ["POST"],
       "plugins": {
         "ai-proxy-multi": {
           "providers": [
             {
               "name": "openai-compatible",
               "model": "qwen-plus",
               "weight": 1,
               "priority": 1,
               "auth": {
                 "header": {
                   "Authorization": "Bearer '"$OPENAI_API_KEY"'"
                 }
               },
               "override": {
                 "endpoint": 
"https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions";
               }
             }
           ],
           "passthrough": false
         }
       },
       "upstream": {
         "type": "roundrobin",
         "nodes": {
           "httpbin.org": 1
         }
       }
     }'
   
   Running this returns:
   ```
   {
     "error_msg": "failed to check the configuration of plugin ai-proxy-multi err: property \"instances\" is required"
   }
   ```
   
   ### Desired State
   
   Based on the error message, I changed `providers` to `instances` and configured the provider within a single instance:
   ```
   {
       "id": "ai-proxy-multi-route",
       "uri": "/chat/completions",
       "methods": [
           "POST"
       ],
       "plugins": {
           "ai-proxy-multi": {
               "instances": [
                   {
                       "name": "openai-compatible",
                       "provider": "openai-compatible",
                       "model": "deepseek/deepseek-chat-v3-0324:free",
                       "weight": 1,
                       "priority": 1,
                       "auth": {
                           "header": {
                               "Authorization": "Bearer xxx"
                           }
                       },
                       "endpoint": "https://openrouter.ai/api/v1/chat/completions"
                   }
               ],
               "passthrough": false
           }
       },
       "upstream": {
           "type": "roundrobin",
           "scheme": "https",
           "nodes": {
               "openrouter.ai:443": 1
           }
       }
   }
   ```
   After this modification the route was created successfully, but calling `/chat/completions` produced an error:
   ```
   2025/04/15 08:46:50 [error] 51#51: *1514703 lua entry thread aborted: runtime error: .../local/apisix//deps/share/lua/5.1/resty/http_connect.lua:179: attempt to concatenate local 'request_host' (a nil value)
   stack traceback:
   coroutine 0:
        .../local/apisix//deps/share/lua/5.1/resty/http_connect.lua: in function 'connect'
        /usr/local/apisix/apisix/plugins/ai-drivers/openai-base.lua:78: in function 'request'
        /usr/local/apisix/apisix/plugins/ai-proxy/base.lua:47: in function 'phase_func'
        /usr/local/apisix/apisix/plugin.lua:1205: in function 'common_phase'
        /usr/local/apisix/apisix/init.lua:458: in function 'handle_upstream'
        /usr/local/apisix/apisix/init.lua:723: in function 'http_access_phase'
        access_by_lua(nginx.conf:323):2: in main chunk, client: xxxx, server: _, request: "POST /chat/completions HTTP/1.1", host: "xxxx"
   ```
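   One thing I noticed: in the documented `providers` example the endpoint is nested under `override`, while in my modified config I placed `endpoint` at the top level of the instance. If the driver only reads `override.endpoint`, the request host would never be set, which might explain the nil `request_host`. Below is a sketch of the same configuration with the endpoint moved under `override`; this is only a guess based on the first example, and I have not verified it against the plugin schema:
   ```
   {
       "id": "ai-proxy-multi-route",
       "uri": "/chat/completions",
       "methods": [
           "POST"
       ],
       "plugins": {
           "ai-proxy-multi": {
               "instances": [
                   {
                       "name": "openai-compatible",
                       "provider": "openai-compatible",
                       "model": "deepseek/deepseek-chat-v3-0324:free",
                       "weight": 1,
                       "priority": 1,
                       "auth": {
                           "header": {
                               "Authorization": "Bearer xxx"
                           }
                       },
                       "override": {
                           "endpoint": "https://openrouter.ai/api/v1/chat/completions"
                       }
                   }
               ],
               "passthrough": false
           }
       },
       "upstream": {
           "type": "roundrobin",
           "scheme": "https",
           "nodes": {
               "openrouter.ai:443": 1
           }
       }
   }
   ```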

