kayx23 commented on code in PR #12878:
URL: https://github.com/apache/apisix/pull/12878#discussion_r2680753084


##########
docs/en/latest/plugins/ai-proxy-multi.md:
##########
@@ -58,7 +58,7 @@ In addition, the Plugin also supports logging LLM request information in the acc
 | balancer.key                       | string         | False    |                                   |              | Used when `type` is `chash`. When `hash_on` is set to `header` or `cookie`, `key` is required. When `hash_on` is set to `consumer`, `key` is not required as the consumer name will be used as the key automatically. |
 | instances                          | array[object]  | True     |                                   |              | LLM instance configurations. |
 | instances.name                     | string         | True     |                                   |              | Name of the LLM service instance. |
-| instances.provider                 | string         | True     |                                   | [openai, deepseek, azure-openai, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `api.openai.com`. When set to `deepseek`, the Plugin will proxy the request to `api.deepseek.com`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `api.aimlapi.com` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |
+| instances.provider                 | string         | True     |                                   | [openai, deepseek, azure-openai, aimlapi, openrouter, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `api.openai.com`. When set to `deepseek`, the Plugin will proxy the request to `api.deepseek.com`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `api.aimlapi.com` by default. When set to `openrouter`, the Plugin uses the OpenAI-compatible driver and proxies the request to `openrouter.ai` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |

Review Comment:
   ok
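
For context, a minimal route using the new `openrouter` provider value might look like the sketch below. Only `instances.name`, `instances.provider`, and `override` are confirmed by the quoted table; the Admin API call shape, the instance name, `weight`, `auth`, `options`, the model, and the API key variable are illustrative assumptions and should be checked against the plugin documentation.

```shell
# Hypothetical sketch (not taken from the PR): a route with ai-proxy-multi
# pointing at the new "openrouter" provider. Field names beyond
# instances.name/instances.provider, the model, and the env vars are assumptions.
curl "http://127.0.0.1:9180/apisix/admin/routes/1" -X PUT \
  -H "X-API-KEY: ${ADMIN_API_KEY}" \
  -d '{
    "uri": "/anything",
    "plugins": {
      "ai-proxy-multi": {
        "instances": [
          {
            "name": "openrouter-chat",
            "provider": "openrouter",
            "weight": 1,
            "auth": {
              "header": {
                "Authorization": "Bearer '"$OPENROUTER_API_KEY"'"
              }
            },
            "options": {
              "model": "deepseek/deepseek-chat"
            }
          }
        ]
      }
    }
  }'
```

Per the table row above, with `provider` set to `openrouter` the Plugin uses the OpenAI-compatible driver and proxies to `openrouter.ai` by default, unless a custom endpoint is configured in `override`.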


