This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-agents.git


The following commit(s) were added to refs/heads/main by this push:
     new c1f7670e [doc] Add Ollama server 0.9.0+ version requirement (#512)
c1f7670e is described below

commit c1f7670ec5f4ce00142a16e27edec3713ebd7766
Author: Eugene <[email protected]>
AuthorDate: Mon Feb 2 15:29:50 2026 +0800

    [doc] Add Ollama server 0.9.0+ version requirement (#512)
---
 docs/content/docs/development/chat_models.md               | 4 ++++
 docs/content/docs/get-started/quickstart/react_agent.md    | 8 ++++++--
 docs/content/docs/get-started/quickstart/workflow_agent.md | 8 ++++++--
 3 files changed, 16 insertions(+), 4 deletions(-)

diff --git a/docs/content/docs/development/chat_models.md b/docs/content/docs/development/chat_models.md
index bfe72338..4a43c8d3 100644
--- a/docs/content/docs/development/chat_models.md
+++ b/docs/content/docs/development/chat_models.md
@@ -495,6 +495,10 @@ Ollama provides local chat models that run on your machine, offering privacy, co
 
 #### Prerequisites
 
+{{< hint info >}}
+Ollama server **0.9.0** or higher is required.
+{{< /hint >}}
+
 1. Install Ollama from [https://ollama.com/](https://ollama.com/)
 2. Start the Ollama server: `ollama serve`
 3. Download a chat model: `ollama pull qwen3:8b`
diff --git a/docs/content/docs/get-started/quickstart/react_agent.md b/docs/content/docs/get-started/quickstart/react_agent.md
index f1f763c3..dda03435 100644
--- a/docs/content/docs/get-started/quickstart/react_agent.md
+++ b/docs/content/docs/get-started/quickstart/react_agent.md
@@ -265,10 +265,14 @@ If you can't navigate to the web UI at [localhost:8081](localhost:8081), you can
 
 Download and install Ollama from the official [website](https://ollama.com/download).
 
-Then run the qwen3:8b model, which is required by the quickstart examples
+{{< hint info >}}
+Ollama server **0.9.0** or higher is required.
+{{< /hint >}}
+
+Then pull the qwen3:8b model, which is required by the quickstart examples
 
 ```bash
-ollama run qwen3:8b
+ollama pull qwen3:8b
 ```
 
 ### Submit Flink Agents Job to Standalone Flink Cluster
diff --git a/docs/content/docs/get-started/quickstart/workflow_agent.md b/docs/content/docs/get-started/quickstart/workflow_agent.md
index 65fa5e8e..082684a3 100644
--- a/docs/content/docs/get-started/quickstart/workflow_agent.md
+++ b/docs/content/docs/get-started/quickstart/workflow_agent.md
@@ -394,10 +394,14 @@ If you can't navigate to the web UI at [localhost:8081](localhost:8081), you can
 
 Download and install Ollama from the official [website](https://ollama.com/download).
 
-Then run the qwen3:8b model, which is required by the quickstart examples
+{{< hint info >}}
+Ollama server **0.9.0** or higher is required.
+{{< /hint >}}
+
+Then pull the qwen3:8b model, which is required by the quickstart examples
 
 ```bash
-ollama run qwen3:8b
+ollama pull qwen3:8b
 ```
 
 ### Submit Flink Agents Job to Standalone Flink Cluster
