bzp2010 commented on code in PR #12565:
URL: https://github.com/apache/apisix/pull/12565#discussion_r2309780132


##########
docs/en/latest/plugins/ai-request-rewrite.md:
##########
@@ -36,7 +36,7 @@ The `ai-request-rewrite` plugin intercepts client requests before they are forwa
 | **Field**                 | **Required** | **Type** | **Description**                                                                      |
 | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
 | prompt                    | Yes          | String   | The prompt send to LLM service.                                                      |
-| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |
+| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek,azure-openai, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |

Review Comment:
   ```suggestion
   | provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |
   ```
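
   For context, `prompt` and `provider` are the two fields this table documents on a route's plugin block. Below is a minimal, hypothetical sketch of how they might be set; the prompt text and provider choice are placeholders, and the credential and model settings the plugin also needs are intentionally omitted since they are outside the scope of this table.

   ```yaml
   # Hypothetical sketch of the two documented fields on a route.
   # Real configurations also require provider credentials and model
   # options that are not covered by this table.
   plugins:
     ai-request-rewrite:
       prompt: "Remove any personally identifiable information from the request body."  # placeholder prompt
       provider: "openai-compatible"  # any of the options listed in the table above
   ```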



##########
docs/en/latest/plugins/ai-proxy.md:
##########
@@ -51,7 +51,7 @@ In addition, the Plugin also supports logging LLM request information in the acc
 
 | Name               | Type    | Required | Default | Valid values                               | Description |
 |--------------------|--------|----------|---------|------------------------------------------|-------------|
-| provider          | string  | True     |         | [openai, deepseek, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `https://api.openai.com/chat/completions`. When set to `deepseek`, the Plugin will proxy the request to `https://api.deepseek.com/chat/completions`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `https://api.aimlapi.com/v1/chat/completions` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |
+| provider          | string  | True     |         | [openai, deepseek,azure-openai, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `https://api.openai.com/chat/completions`. When set to `deepseek`, the Plugin will proxy the request to `https://api.deepseek.com/chat/completions`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `https://api.aimlapi.com/v1/chat/completions` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |

Review Comment:
   ```suggestion
   | provider          | string  | True     |         | [openai, deepseek, azure-openai, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `https://api.openai.com/chat/completions`. When set to `deepseek`, the Plugin will proxy the request to `https://api.deepseek.com/chat/completions`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `https://api.aimlapi.com/v1/chat/completions` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |
   ```
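
   For illustration, here is a hypothetical sketch of how the `provider` value interacts with `override`, which the description above says supplies the custom endpoint when `openai-compatible` is selected. The endpoint URL is a placeholder, the `endpoint` sub-field name is an assumption not taken from this diff, and the credential and model settings the plugin also needs are omitted.

   ```yaml
   # Hypothetical sketch: with provider set to openai-compatible, requests are
   # proxied to the custom endpoint given under `override` instead of one of
   # the built-in provider URLs listed in the table above.
   plugins:
     ai-proxy:
       provider: "openai-compatible"
       override:
         endpoint: "https://example.internal/v1/chat/completions"  # placeholder URL; sub-field name assumed
       # auth and model options omitted for brevity
   ```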


