Copilot commented on code in PR #12881:
URL: https://github.com/apache/apisix/pull/12881#discussion_r2679326118


##########
docs/en/latest/plugins/ai-request-rewrite.md:
##########
@@ -36,7 +36,7 @@ The `ai-request-rewrite` plugin intercepts client requests before they are forwa
 | **Field**                 | **Required** | **Type** | **Description**                                                                      |
 | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
 | prompt                    | Yes          | String   | The prompt send to LLM service.                                                      |
-| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |
+| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, anthropic-openai, openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. When `anthropic-openai` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.anthropic.com/v1/chat/completions`.   |

Review Comment:
   The provider name is misspelled as "deekseek" but should be "deepseek" to 
match the actual provider enum value.
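
   For reference, a route enabling the new provider (per the documented defaults above) would look roughly like this. This is an illustrative sketch, not part of the PR; the `prompt` value and the `auth` block are assumptions based on the plugin's wider schema, and only `prompt` and `provider` come from the table in this diff:

   ```json
   {
       "uri": "/anything",
       "plugins": {
           "ai-request-rewrite": {
               "prompt": "Rewrite the request body in formal English",
               "provider": "anthropic-openai",
               "auth": {
                   "header": {
                       "Authorization": "Bearer token"
                   }
               }
           }
       }
   }
   ```

   With `provider` set to `anthropic-openai` and no override, the plugin would use the documented default endpoint `https://api.anthropic.com/v1/chat/completions`.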



##########
t/plugin/ai-proxy-anthropic-openai.t:
##########
@@ -0,0 +1,298 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+use t::APISIX 'no_plan';
+
+log_level("info");
+repeat_each(1);
+no_long_string();
+no_root_location();
+
+
+my $resp_file = 't/assets/ai-proxy-response.json';
+open(my $fh, '<', $resp_file) or die "Could not open file '$resp_file' $!";
+my $resp = do { local $/; <$fh> };
+close($fh);
+
+print "Hello, World!\n";
+print $resp;
+
+
+add_block_preprocessor(sub {
+    my ($block) = @_;
+
+    if (!defined $block->request) {
+        $block->set_value("request", "GET /t");
+    }
+
+    my $user_yaml_config = <<_EOC_;
+plugins:
+  - ai-proxy-multi
+  - prometheus
+_EOC_
+    $block->set_value("extra_yaml_config", $user_yaml_config);
+
+    my $http_config = $block->http_config // <<_EOC_;
+        server {
+            server_name anthropic;
+            listen 6725;
+
+            default_type 'application/json';
+
+            location /v1/chat/completions {
+                content_by_lua_block {
+                    local json = require("cjson.safe")
+
+                    if ngx.req.get_method() ~= "POST" then
+                        ngx.status = 400
+                        ngx.say("Unsupported request method: ", ngx.req.get_method())
+                    end
+                    ngx.req.read_body()
+                    local body, err = ngx.req.get_body_data()
+                    body, err = json.decode(body)
+
+                    local test_type = ngx.req.get_headers()["test-type"]
+                    if test_type == "options" then
+                        if body.foo == "bar" then
+                            ngx.status = 200
+                            ngx.say("options works")
+                        else
+                            ngx.status = 500
+                            ngx.say("model options feature doesn't work")
+                        end
+                        return
+                    end
+
+                    local header_auth = ngx.req.get_headers()["authorization"]
+                    local query_auth = ngx.req.get_uri_args()["apikey"]
+
+                    if header_auth ~= "Bearer token" and query_auth ~= "apikey" then
+                        ngx.status = 401
+                        ngx.say("Unauthorized")
+                        return
+                    end
+
+                    if header_auth == "Bearer token" or query_auth == "apikey" then
+                        ngx.req.read_body()
+                        local body, err = ngx.req.get_body_data()
+                        body, err = json.decode(body)
+
+                        if not body.messages or #body.messages < 1 then
+                            ngx.status = 400
+                            ngx.say([[{ "error": "bad request"}]])
+                            return
+                        end
+                        if body.messages[1].content == "write an SQL query to get all rows from student table" then
+                            ngx.print("SELECT * FROM STUDENTS")
+                            return
+                        end
+
+                        ngx.status = 200
+                        ngx.say([[$resp]])
+                        return
+                    end
+
+
+                    ngx.status = 503
+                    ngx.say("reached the end of the test suite")
+                }
+            }
+
+            location /random {
+                content_by_lua_block {
+                    ngx.say("path override works")
+                }
+            }
+        }
+_EOC_
+
+    $block->set_value("http_config", $http_config);
+});
+
+run_tests();
+
+__DATA__
+
+=== TEST 1: set route with right auth header
+--- config
+    location /t {
+        content_by_lua_block {
+            local t = require("lib.test_admin").test
+            local code, body = t('/apisix/admin/routes/1',
+                 ngx.HTTP_PUT,
+                 [[{
+                    "uri": "/anything",
+                    "plugins": {
+                        "ai-proxy-multi": {
+                            "instances": [
+                                {
+                                    "name": "anthropic-openai",
+                                    "provider": "anthropic-openai",
+                                    "weight": 1,
+                                    "auth": {
+                                        "header": {
+                                            "Authorization": "Bearer token"
+                                        }
+                                    },
+                                    "options": {
+                                        "model": "claude-sonnet-4-5",
+                                        "max_tokens": 512,
+                                        "temperature": 1.0
+                                    },
+                                    "override": {
+                                        "endpoint": "http://localhost:6725/v1/chat/completions"
+                                    }
+                                }
+                            ],
+                            "ssl_verify": false
+                        }
+                    }
+                }]]
+            )
+
+            if code >= 300 then
+                ngx.status = code
+            end
+            ngx.say(body)
+        }
+    }
+--- response_body
+passed
+
+
+
+=== TEST 2: send request
+--- request
+POST /anything
+{ "messages": [ { "role": "system", "content": "You are a mathematician" }, { "role": "user", "content": "What is 1+1?"} ] }
+--- more_headers
+Authorization: Bearer token
+--- error_code: 200
+--- response_body eval
+qr/\{ "content": "1 \+ 1 = 2\.", "role": "assistant" \}/
+
+
+
+=== TEST 3: set route with stream = true (SSE)
+--- config
+    location /t {
+        content_by_lua_block {
+            local t = require("lib.test_admin").test
+            local code, body = t('/apisix/admin/routes/1',
+                 ngx.HTTP_PUT,
+                 [[{
+                    "uri": "/anything",
+                    "plugins": {
+                        "ai-proxy-multi": {
+                            "instances": [
+                                {
+                                    "name": "anthropic-openai",
+                                    "provider": "anthropic-openai",
+                                    "weight": 1,
+                                    "auth": {
+                                        "header": {
+                                            "Authorization": "Bearer token"
+                                        }
+                                    },
+                                    "options": {
+                                        "model": "claude-sonnet-4-5",
+                                        "max_tokens": 512,
+                                        "temperature": 1.0,
+                                        "stream": true
+                                    },
+                                    "override": {
+                                        "endpoint": "http://localhost:7737/v1/chat/completions"

Review Comment:
   The endpoint in TEST 3 uses port 7737, but the mock server is configured to listen on port 6725 (line 53). This will cause TEST 4 to fail because it tries to send requests to the endpoint configured in TEST 3. The endpoint should be changed to "http://localhost:6725/v1/chat/completions" to match the mock server configuration.
   ```suggestion
                                        "endpoint": "http://localhost:6725/v1/chat/completions"
   ```



##########
docs/zh/latest/plugins/ai-request-rewrite.md:
##########
@@ -36,7 +36,7 @@ description: The ai-request-rewrite plugin intercepts client requests before they are forwarded to the upstream
 | **Field**                 | **Required** | **Type** | **Description**                                                                      |
 | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
 | prompt                    | Yes          | String   | The prompt sent to the LLM service.                                                  |
-| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |
+| provider                  | Yes          | String   | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, anthropic-openai, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`.   |

Review Comment:
   The provider name is misspelled as "deekseek" but should be "deepseek" to 
match the actual provider enum value.



##########
docs/zh/latest/plugins/ai-proxy-multi.md:
##########
@@ -58,7 +58,7 @@ description: The ai-proxy-multi plugin provides load balancing, retries, failover, and
 | balancer.key                       | string         | No     |                                   |              | Used when `type` is `chash`. `key` is required when `hash_on` is set to `header` or `cookie`; it is not required when `hash_on` is set to `consumer`, since the consumer name is automatically used as the key. |
 | instances                          | array[object]  | Yes    |                                   |              | LLM instance configurations. |
 | instances.name                     | string         | Yes    |                                   |              | Name of the LLM service instance. |
-| instances.provider                 | string         | Yes    |                                   | [openai, deepseek, azure-openai, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the plugin proxies requests to `api.openai.com`. When set to `deepseek`, the plugin proxies requests to `api.deepseek.com`. When set to `aimlapi`, the plugin uses the OpenAI-compatible driver and proxies requests to `api.aimlapi.com` by default. When set to `openai-compatible`, the plugin proxies requests to the custom endpoint configured in `override`. |
+| instances.provider                 | string         | Yes    |                                   | [openai, deepseek, azure-openai, aimlapi, openrouter, openai-compatible, anthropic-openai] | LLM service provider. When set to `openai`, the plugin proxies requests to `api.openai.com`. When set to `deepseek`, the plugin proxies requests to `api.deepseek.com`. When set to `aimlapi`, the plugin uses the OpenAI-compatible driver and proxies requests to `api.aimlapi.com` by default. When set to `openrouter`, the plugin uses the OpenAI-compatible driver and proxies requests to `openrouter.ai` by default. When set to `anthropic-openai`, the plugin uses the OpenAI-compatible driver and proxies requests to `api.anthropic.com` by default. When set to `openai-compatible`, the plugin proxies requests to the custom endpoint configured in `override`. |

Review Comment:
   Documentation mentions "openrouter" as a supported provider, but this is 
inconsistent with the actual implementation. The PR only adds support for 
"anthropic-openai", not "openrouter". Either the "openrouter" provider support 
needs to be implemented with corresponding schema updates and driver file, or 
it should be removed from the documentation.



##########
docs/en/latest/plugins/ai-proxy-multi.md:
##########
@@ -58,7 +58,7 @@ In addition, the Plugin also supports logging LLM request information in the acc
 | balancer.key                       | string         | False    |                                    |              | Used when `type` is `chash`. When `hash_on` is set to `header` or `cookie`, `key` is required. When `hash_on` is set to `consumer`, `key` is not required as the consumer name will be used as the key automatically. |
 | instances                          | array[object]  | True     |                                    |              | LLM instance configurations. |
 | instances.name                     | string         | True     |                                    |              | Name of the LLM service instance. |
-| instances.provider                 | string         | True     |                                    | [openai, deepseek, azure-openai, aimlapi, openai-compatible] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `api.openai.com`. When set to `deepseek`, the Plugin will proxy the request to `api.deepseek.com`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `api.aimlapi.com` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |
+| instances.provider                 | string         | True     |                                    | [openai, deepseek, azure-openai, aimlapi, openrouter, openai-compatible, anthropic-openai] | LLM service provider. When set to `openai`, the Plugin will proxy the request to `api.openai.com`. When set to `deepseek`, the Plugin will proxy the request to `api.deepseek.com`. When set to `aimlapi`, the Plugin uses the OpenAI-compatible driver and proxies the request to `api.aimlapi.com` by default. When set to `openrouter`, the Plugin uses the OpenAI-compatible driver and proxies the request to `openrouter.ai` by default. When set to `anthropic-openai`, the Plugin will proxy the request to `api.anthropic.com` by default. When set to `openai-compatible`, the Plugin will proxy the request to the custom endpoint configured in `override`. |

Review Comment:
   Documentation mentions "openrouter" as a supported provider, but this is 
inconsistent with the actual implementation. The PR only adds support for 
"anthropic-openai", not "openrouter". Either the "openrouter" provider support 
needs to be implemented with corresponding schema updates and driver file, or 
it should be removed from the documentation.
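
   As reviewed, only `anthropic-openai` is backed by an implementation. A minimal `ai-proxy-multi` instance using it would look like the following; this sketch mirrors the TEST 1 fixture in this PR, except that the mock `override.endpoint` is dropped so the documented default applies:

   ```json
   {
       "uri": "/anything",
       "plugins": {
           "ai-proxy-multi": {
               "instances": [
                   {
                       "name": "anthropic-openai",
                       "provider": "anthropic-openai",
                       "weight": 1,
                       "auth": {
                           "header": {
                               "Authorization": "Bearer token"
                           }
                       },
                       "options": {
                           "model": "claude-sonnet-4-5",
                           "max_tokens": 512,
                           "temperature": 1.0
                       }
                   }
               ]
           }
       }
   }
   ```

   With no `override.endpoint`, requests would go to the default `https://api.anthropic.com/v1/chat/completions`; an equivalent `openrouter` example cannot be given until that provider is actually implemented.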



##########
t/plugin/ai-proxy-anthropic-openai.t:
##########
@@ -0,0 +1,298 @@
+
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+
+use t::APISIX 'no_plan';
+
+log_level("info");
+repeat_each(1);
+no_long_string();
+no_root_location();
+
+
+my $resp_file = 't/assets/ai-proxy-response.json';
+open(my $fh, '<', $resp_file) or die "Could not open file '$resp_file' $!";
+my $resp = do { local $/; <$fh> };
+close($fh);
+
+print "Hello, World!\n";
+print $resp;
+
+

Review Comment:
   Debug print statements should be removed from test files before merging. 
These print statements appear to be left over from development/debugging.
   ```suggestion
   
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
