This is an automated email from the ASF dual-hosted git repository.
shreemaanabhishek pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/apisix.git
The following commit(s) were added to refs/heads/master by this push:
new a91e79a75 docs: fix the ai-proxy override.endpoint (#11700)
a91e79a75 is described below
commit a91e79a7527288540d61bb94c2ea1179854de2de
Author: Vedran Vidovic <[email protected]>
AuthorDate: Mon Nov 4 16:31:36 2024 +0100
docs: fix the ai-proxy override.endpoint (#11700)
---
docs/en/latest/plugins/ai-proxy.md | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/docs/en/latest/plugins/ai-proxy.md b/docs/en/latest/plugins/ai-proxy.md
index a6a4e3542..0f68911bb 100644
--- a/docs/en/latest/plugins/ai-proxy.md
+++ b/docs/en/latest/plugins/ai-proxy.md
@@ -62,7 +62,7 @@ Proxying requests to OpenAI is supported now. Other LLM services will be support
 | model.options.temperature | No | Number | Matching temperature for models. Range: 0.0 - 5.0 |
 | model.options.top_p | No | Number | Top-p probability mass. Range: 0 - 1 |
 | model.options.stream | No | Boolean | Stream response by SSE. Default: false |
-| model.override.endpoint | No | String | Override the endpoint of the AI provider |
+| override.endpoint | No | String | Override the endpoint of the AI provider |
 | passthrough | No | Boolean | If enabled, the response from LLM will be sent to the upstream. Default: false |
 | timeout | No | Integer | Timeout in milliseconds for requests to LLM. Range: 1 - 60000. Default: 3000 |
 | keepalive | No | Boolean | Enable keepalive for requests to LLM. Default: true |