This is an automated email from the ASF dual-hosted git repository.
xtsong pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-agents.git
The following commit(s) were added to refs/heads/main by this push:
new d514886 [doc] Add instructions for java in Prompt, Tool and ChatModel doc. (#263)
d514886 is described below
commit d514886fff3df49b514e7297e9a9715dfd0c5dff
Author: Wenjin Xie <[email protected]>
AuthorDate: Thu Oct 9 17:36:21 2025 +0800
[doc] Add instructions for java in Prompt, Tool and ChatModel doc. (#263)
---
docs/content/docs/development/chat_models.md | 198 ++++++++++++++++++++++++++-
docs/content/docs/development/prompts.md | 144 ++++++++++++++++++-
docs/content/docs/development/tool_use.md | 76 +++++++++-
3 files changed, 406 insertions(+), 12 deletions(-)
diff --git a/docs/content/docs/development/chat_models.md b/docs/content/docs/development/chat_models.md
index 741eadd..394e589 100644
--- a/docs/content/docs/development/chat_models.md
+++ b/docs/content/docs/development/chat_models.md
@@ -30,19 +30,19 @@ Chat models enable agents to communicate with Large Language Models (LLMs) for n
## Getting Started
-To use chat models in your agents, you need to define both a connection and setup using decorators, then interact with the model through events.
+To use chat models in your agents, you need to define both a connection and a setup using decorators (Python) or annotations (Java), then interact with the model through events.
-### Resource Decorators
+### Resource Declaration
-Flink Agents provides decorators to simplify chat model setup within agents:
+Flink Agents provides decorators in Python and annotations in Java to simplify chat model setup within agents:
-#### @chat_model_connection
+#### @chat_model_connection/@ChatModelConnection
-The `@chat_model_connection` decorator marks a method that creates a chat model connection. This is typically defined once and shared across multiple chat model setups.
+The `@chat_model_connection` decorator or `@ChatModelConnection` annotation marks a method that creates a chat model connection. This is typically defined once and shared across multiple chat model setups.
-#### @chat_model_setup
+#### @chat_model_setup/@ChatModelSetup
-The `@chat_model_setup` decorator marks a method that creates a chat model setup. This references a connection and adds chat-specific configuration like prompts and tools.
+The `@chat_model_setup` decorator or `@ChatModelSetup` annotation marks a method that creates a chat model setup. This references a connection and adds chat-specific configuration like prompts and tools.
### Chat Events
@@ -55,6 +55,9 @@ Chat models communicate through built-in events:
Here's how to define and use chat models in a workflow agent:
+{{< tabs "Usage Example" >}}
+
+{{< tab "Python" >}}
```python
class MyAgent(Agent):
@@ -96,6 +99,46 @@ class MyAgent(Agent):
# Handle the LLM's response
# Process the response as needed for your use case
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class MyAgent extends Agent {
+ @ChatModelConnection
+ public static ResourceDescriptor ollamaConnection() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelConnection.class.getName())
+ .addInitialArgument("endpoint", "http://localhost:11434")
+ .build();
+ }
+
+ @ChatModelSetup
+ public static ResourceDescriptor ollamaChatModel() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelSetup.class.getName())
+ .addInitialArgument("connection", "ollamaConnection")
+ .addInitialArgument("model", "qwen3:8b")
+ .build();
+ }
+
+ @Action(listenEvents = {InputEvent.class})
+ public static void processInput(InputEvent event, RunnerContext ctx) throws Exception {
+ ChatMessage userMessage =
+ new ChatMessage(MessageRole.USER, String.format("input: {%s}", event.getInput()));
+ ctx.sendEvent(new ChatRequestEvent("ollamaChatModel", List.of(userMessage)));
+ }
+
+ @Action(listenEvents = {ChatResponseEvent.class})
+ public static void processResponse(ChatResponseEvent event, RunnerContext ctx)
+ throws Exception {
+ String response = event.getResponse().getContent();
+ // Handle the LLM's response
+ // Process the response as needed for your use case
+ }
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
+
## Built-in Providers
@@ -103,6 +146,10 @@ class MyAgent(Agent):
Anthropic provides cloud-based chat models featuring the Claude family, known for their strong reasoning, coding, and safety capabilities.
+{{< hint warning >}}
+Anthropic is currently only supported in Python.
+{{< /hint >}}
+
#### Prerequisites
1. Get an API key from [Anthropic Console](https://console.anthropic.com/)
@@ -182,13 +229,34 @@ Ollama provides local chat models that run on your machine, offering privacy, co
#### OllamaChatModelConnection Parameters
+{{< tabs "OllamaChatModelConnection Parameters" >}}
+
+{{< tab "Python" >}}
+
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `base_url` | str | `"http://localhost:11434"` | Ollama server URL |
| `request_timeout` | float | `30.0` | HTTP request timeout in seconds |
+{{< /tab >}}
+
+{{< tab "Java" >}}
+
+| Parameter | Type | Default | Description |
+|------------------|--------|----------------------------|-------------|
+| `endpoint` | String | `"http://localhost:11434"` | Ollama server URL |
+| `requestTimeout` | long | `10` | HTTP request timeout in seconds |
+
+{{< /tab >}}
+
+{{< /tabs >}}
+
#### OllamaChatModelSetup Parameters
+{{< tabs "OllamaChatModelSetup Parameters" >}}
+
+{{< tab "Python" >}}
+
| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| `connection` | str | Required | Reference to connection method name |
@@ -200,9 +268,25 @@ Ollama provides local chat models that run on your machine, offering privacy, co
| `keep_alive` | str \| float | `"5m"` | How long to keep model loaded in memory |
| `extract_reasoning` | bool | `True` | Extract reasoning content from response |
| `additional_kwargs` | dict | `{}` | Additional Ollama API parameters |
+{{< /tab >}}
+
+{{< tab "Java" >}}
+
+| Parameter | Type | Default | Description |
+|-----------|------------------|---------|-------------|
+| `connection` | String | Required | Reference to connection method name |
+| `model` | String | Required | Name of the chat model to use |
+| `prompt` | Prompt \| String | None | Prompt template or reference to prompt resource |
+| `tools` | List&lt;String&gt; | None | List of tool names available to the model |
+{{< /tab >}}
+
+{{< /tabs >}}
#### Usage Example
+{{< tabs "Ollama Usage Example" >}}
+
+{{< tab "Python" >}}
```python
class MyAgent(Agent):
@@ -230,6 +314,34 @@ class MyAgent(Agent):
...
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class MyAgent extends Agent {
+ @ChatModelConnection
+ public static ResourceDescriptor ollamaConnection() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelConnection.class.getName())
+ .addInitialArgument("endpoint", "http://localhost:11434")
+ .addInitialArgument("requestTimeout", 120)
+ .build();
+ }
+
+ @ChatModelSetup
+ public static ResourceDescriptor ollamaChatModel() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelSetup.class.getName())
+ .addInitialArgument("connection", "ollamaConnection")
+ .addInitialArgument("model", "qwen3:8b")
+ .build();
+ }
+
+ ...
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
+
#### Available Models
@@ -249,6 +361,10 @@ Model availability and specifications may change. Always check the official Olla
OpenAI provides cloud-based chat models with state-of-the-art performance for a wide range of natural language tasks.
+{{< hint warning >}}
+OpenAI is currently only supported in Python.
+{{< /hint >}}
+
#### Prerequisites
1. Get an API key from [OpenAI Platform](https://platform.openai.com/)
@@ -328,6 +444,10 @@ Model availability and specifications may change. Always check the official Open
Tongyi provides cloud-based chat models from Alibaba Cloud, offering powerful Chinese and English language capabilities.
+{{< hint warning >}}
+Tongyi is currently only supported in Python.
+{{< /hint >}}
+
#### Prerequisites
1. Get an API key from [Alibaba Cloud DashScope](https://dashscope.aliyun.com/)
@@ -405,6 +525,9 @@ If you want to use chat models not offered by the built-in providers, you can ex
Handles the connection to chat model services and provides the core chat functionality.
+{{< tabs "Custom BaseChatModelConnection" >}}
+
+{{< tab "Python" >}}
```python
class MyChatModelConnection(BaseChatModelConnection):
@@ -421,11 +544,50 @@ class MyChatModelConnection(BaseChatModelConnection):
# - Returns: ChatMessage with the model's response
pass
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class MyChatModelConnection extends BaseChatModelConnection {
+
+ /**
+ * Creates a new chat model connection.
+ *
+ * @param descriptor a resource descriptor containing the initial parameters
+ * @param getResource a function to resolve resources (e.g., tools) by name and type
+ */
+ public MyChatModelConnection(
+ ResourceDescriptor descriptor, BiFunction<String, ResourceType, Resource> getResource) {
+ super(descriptor, getResource);
+ // get custom arguments from descriptor
+ String endpoint = descriptor.getArgument("endpoint");
+ ...
+ }
+
+ @Override
+ public ChatMessage chat(
+ List<ChatMessage> messages, List<Tool> tools, Map<String, Object> arguments) {
+ // Core method: send messages to LLM and return response
+ // - messages: Input message sequence
+ // - tools: Optional list of tools available to the model
+ // - arguments: Additional parameters from ChatModelSetup
+ // - Returns: ChatMessage with the model's response
+ throw new UnsupportedOperationException("implement the model call here"); // placeholder so the skeleton compiles
+ }
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
+
### BaseChatModelSetup
The setup class acts as a high-level configuration interface that defines which connection to use and how to configure the chat model.
+{{< tabs "Custom BaseChatModelSetup" >}}
+
+{{< tab "Python" >}}
```python
class MyChatModelSetup(BaseChatModelSetup):
# Add your custom configuration fields here
@@ -436,3 +598,25 @@ class MyChatModelSetup(BaseChatModelSetup):
# This dictionary is passed as **kwargs to the chat() method
return {"model": self.model, "temperature": 0.7, ...}
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class MyChatModelSetup extends BaseChatModelSetup {
+ // Add your custom configuration fields here
+
+ @Override
+ public Map<String, Object> getParameters() {
+ Map<String, Object> params = new HashMap<>();
+ params.put("model", model);
+ ...
+ // Return model-specific configuration passed to chat()
+ // This map is passed as the arguments parameter of the chat() method
+ return params;
+ }
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
+
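+Once defined, the custom classes are referenced the same way as the built-in providers. Below is a minimal, untested sketch in Java, assuming the `MyChatModelConnection` and `MyChatModelSetup` classes above plus a hypothetical endpoint and model name:
+
+```java
+public class MyCustomModelAgent extends Agent {
+    // Connection: point the descriptor at the custom connection class.
+    @ChatModelConnection
+    public static ResourceDescriptor myConnection() {
+        return ResourceDescriptor.Builder.newBuilder(MyChatModelConnection.class.getName())
+                .addInitialArgument("endpoint", "http://localhost:8000") // hypothetical endpoint
+                .build();
+    }
+
+    // Setup: reference the connection above by its method name.
+    @ChatModelSetup
+    public static ResourceDescriptor myChatModel() {
+        return ResourceDescriptor.Builder.newBuilder(MyChatModelSetup.class.getName())
+                .addInitialArgument("connection", "myConnection")
+                .addInitialArgument("model", "my-model") // hypothetical model name
+                .build();
+    }
+}
+```
+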
diff --git a/docs/content/docs/development/prompts.md b/docs/content/docs/development/prompts.md
index c28f9ff..17ffe3c 100644
--- a/docs/content/docs/development/prompts.md
+++ b/docs/content/docs/development/prompts.md
@@ -40,12 +40,18 @@ Local prompts are templates defined directly in your code. They support variable
MCP (Model Context Protocol) prompts are managed by external MCP servers. They enable dynamic prompt retrieval, centralized prompt management, and integration with external prompt repositories.
+{{< hint warning >}}
+MCP Prompt is currently only supported in Python.
+{{< /hint >}}
## Local Prompt
### Creating from Text
The simplest way to create a prompt is from a text string using `Prompt.from_text()`:
+{{< tabs "Creating from Text" >}}
+
+{{< tab "Python" >}}
```python
product_suggestion_prompt_str = """
Based on the rating distribution and user dissatisfaction reasons, generate three actionable suggestions for product improvement.
@@ -72,6 +78,36 @@ input:
product_suggestion_prompt = Prompt.from_text(product_suggestion_prompt_str)
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+// Prompt for product suggestion agent
+String PRODUCT_SUGGESTION_PROMPT_STR =
+ "Based on the rating distribution and user dissatisfaction reasons, generate three actionable suggestions for product improvement.\n\n"
+ + "Input format:\n"
+ + "{\n"
+ + " \"id\": \"1\",\n"
+ + " \"score_histogram\": [\"10%\", \"20%\", \"10%\", \"15%\", \"45%\"],\n"
+ + " \"unsatisfied_reasons\": [\"reason1\", \"reason2\", \"reason3\"]\n"
+ + "}\n\n"
+ + "Ensure that your response can be parsed as JSON; use the following format as an example:\n"
+ + "{\n"
+ + " \"suggestion_list\": [\n"
+ + " \"suggestion1\",\n"
+ + " \"suggestion2\",\n"
+ + " \"suggestion3\"\n"
+ + " ]\n"
+ + "}\n\n"
+ + "input:\n"
+ + "{input}";
+
+Prompt productSuggestionPrompt = new Prompt(PRODUCT_SUGGESTION_PROMPT_STR);
+```
+{{< /tab >}}
+
+{{< /tabs >}}
**Key points:**
- Use `{variable_name}` for template variables that will be substituted at runtime
@@ -81,6 +117,9 @@ product_suggestion_prompt = Prompt.from_text(product_suggestion_prompt_str)
For more control, create prompts from a sequence of `ChatMessage` objects using `Prompt.from_messages()`:
+{{< tabs "Creating from Messages" >}}
+
+{{< tab "Python" >}}
```python
review_analysis_prompt = Prompt.from_messages(
messages=[
@@ -114,6 +153,35 @@ review_analysis_prompt = Prompt.from_messages(
],
)
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+Prompt reviewAnalysisPrompt =
+ new Prompt(
+ Arrays.asList(
+ new ChatMessage(
+ MessageRole.SYSTEM,
+ "Analyze the user review and product information to determine a "
+ + "satisfaction score (1-5) and potential reasons for dissatisfaction.\n\n"
+ + "Example input format:\n"
+ + "{\n"
+ + " \"id\": \"12345\",\n"
+ + " \"review\": \"The headphones broke after one week of use. Very poor quality.\"\n"
+ + "}\n\n"
+ + "Ensure your response can be parsed as JSON, using this format as an example:\n"
+ + "{\n"
+ + " \"id\": \"12345\",\n"
+ + " \"score\": 1,\n"
+ + " \"reasons\": [\n"
+ + " \"poor quality\"\n"
+ + " ]\n"
+ + "}"),
+ new ChatMessage(MessageRole.USER, "\"input\":\n" + "{input}")));
+```
+{{< /tab >}}
+
+{{< /tabs >}}
**Key points:**
- Define multiple messages with different roles (SYSTEM, USER)
@@ -121,8 +189,11 @@ review_analysis_prompt = Prompt.from_messages(
### Using Prompts in Agents
-Register a prompt as an agent resource using the `@prompt` decorator:
+Register a prompt as an agent resource using the `@prompt` decorator in Python (or the `@Prompt` annotation in Java):
+{{< tabs "Using Prompts in Agents" >}}
+
+{{< tab "Python" >}}
```python
class ReviewAnalysisAgent(Agent):
@@ -188,6 +259,73 @@ class ReviewAnalysisAgent(Agent):
msg = ChatMessage(role=MessageRole.USER, extra_args={"input": content})
ctx.send_event(ChatRequestEvent(model="review_analysis_model", messages=[msg]))
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class ReviewAnalysisAgent extends Agent {
+
+ private static final ObjectMapper MAPPER = new ObjectMapper();
+
+ @Prompt
+ public static org.apache.flink.agents.api.prompt.Prompt reviewAnalysisPrompt() {
+ return new org.apache.flink.agents.api.prompt.Prompt(
+ Arrays.asList(
+ new ChatMessage(
+ MessageRole.SYSTEM,
+ "Analyze the user review and product information to determine a "
+ + "satisfaction score (1-5) and potential reasons for dissatisfaction.\n\n"
+ + "Example input format:\n"
+ + "{\n"
+ + " \"id\": \"12345\",\n"
+ + " \"review\": \"The headphones broke after one week of use. Very poor quality.\"\n"
+ + "}\n\n"
+ + "Ensure your response can be parsed as JSON, using this format as an example:\n"
+ + "{\n"
+ + " \"id\": \"12345\",\n"
+ + " \"score\": 1,\n"
+ + " \"reasons\": [\n"
+ + " \"poor quality\"\n"
+ + " ]\n"
+ + "}"),
+ new ChatMessage(MessageRole.USER, "\"input\":\n" + "{input}")));
+ }
+
+ @ChatModelSetup
+ public static ResourceDescriptor reviewAnalysisModel() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelSetup.class.getName())
+ .addInitialArgument("connection", "ollamaChatModelConnection")
+ .addInitialArgument("model", "qwen3:8b")
+ .addInitialArgument("prompt", "reviewAnalysisPrompt")
+ .addInitialArgument("tools", Collections.singletonList("notifyShippingManager"))
+ .addInitialArgument("extract_reasoning", "true")
+ .build();
+ }
+
+ /** Process input event and send chat request for review analysis. */
+ @Action(listenEvents = {InputEvent.class})
+ public static void processInput(InputEvent event, RunnerContext ctx) throws Exception {
+ String input = (String) event.getInput();
+ MAPPER.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
+ CustomTypesAndResources.ProductReview inputObj =
+ MAPPER.readValue(input, CustomTypesAndResources.ProductReview.class);
+
+ ctx.getShortTermMemory().set("id", inputObj.getId());
+
+ String content =
+ String.format(
+ "{\n" + "\"id\": %s,\n" + "\"review\": \"%s\"\n" + "}",
+ inputObj.getId(), inputObj.getReview());
+ ChatMessage msg = new ChatMessage(MessageRole.USER, "", Map.of("input", content));
+
+ ctx.sendEvent(new ChatRequestEvent("reviewAnalysisModel", List.of(msg)));
+ }
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
Prompts use `{variable_name}` syntax for template variables. Variables are filled from `ChatMessage.extra_args`. The prompt is automatically applied when the chat model is invoked.
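In Java, the template variables are supplied through the extra-args map accepted by the `ChatMessage` constructor, as the agent above shows. A minimal sketch under that assumption:

```java
// Sketch (assuming the reviewAnalysisPrompt and reviewAnalysisModel resources
// defined above): the "{input}" placeholder in the prompt is filled from the
// extra-args map when the chat model is invoked.
String content = "{\"id\": \"12345\", \"review\": \"Arrived broken.\"}"; // hypothetical input
ChatMessage msg = new ChatMessage(MessageRole.USER, "", Map.of("input", content));
ctx.sendEvent(new ChatRequestEvent("reviewAnalysisModel", List.of(msg)));
```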
@@ -197,6 +335,10 @@ Prompts use `{variable_name}` syntax for template variables. Variables are fille
MCP (Model Context Protocol) is a standardized protocol for integrating AI applications with external data sources and tools. MCP prompts allow dynamic prompt retrieval from MCP servers.
{{< /hint >}}
+{{< hint warning >}}
+MCP Prompt is currently only supported in Python.
+{{< /hint >}}
+
MCP prompts are managed by external MCP servers and automatically discovered when you define an MCP server connection in your agent.
### Define MCP Server with Prompts
diff --git a/docs/content/docs/development/tool_use.md b/docs/content/docs/development/tool_use.md
index 2e2a745..11f1fb4 100644
--- a/docs/content/docs/development/tool_use.md
+++ b/docs/content/docs/development/tool_use.md
@@ -28,16 +28,19 @@ Flink Agents provides a flexible and extensible tool use mechanism. Developers c
## Local Function as Tool
-Developer can define the tool as a local Python function, and there are two ways to define and register an local function as a tool:
+Developers can define a tool as a local Python/Java function, and there are two ways to define and register a local function as a tool:
{{< hint info >}}
-Flink Agents uses the docstring of the tool function to generate the tool metadata. The docstring of the python function should accurately describe the tool's purpose, parameters, and return value, so that the LLM can understand the tool and use it effectively.
+Flink Agents uses the docstring of the Python tool function to generate the tool metadata. The docstring should accurately describe the tool's purpose, parameters, and return value, so that the LLM can understand the tool and use it effectively. In Java, the `description` attribute of the `@Tool` annotation and the `@ToolParam` annotations serve the same purpose (see the sketch after this hint).
{{< /hint >}}
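A minimal sketch of the Java side, using a hypothetical tool to show where that metadata lives:

```java
// Sketch: in Java the annotation attributes below, not a docstring, are the
// tool metadata presented to the LLM.
@Tool(description = "Look up the current shipping status for an order.") // hypothetical tool
public static String getShippingStatus(@ToolParam(name = "orderId") String orderId) {
    return "in transit"; // placeholder implementation
}
```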
### Define Tool as Static Method in Agent Class
-Developer can define the tool as a static method in the agent class while defining the workflow agent, and use the `@tool` annotation to mark the method as a tool. The tool can be referenced by its name in the `tools` list of the `ResourceDescriptor` when creating the chat model in the agent.
+Developers can define a tool as a static method in the agent class while defining the workflow agent, and use the `@tool` decorator in Python (or the `@Tool` annotation in Java) to mark the method as a tool. The tool can then be referenced by its name in the `tools` list of the `ResourceDescriptor` when creating the chat model in the agent.
+{{< tabs "Define Tool as Static Method in Agent Class" >}}
+
+{{< tab "Python" >}}
```python
class ReviewAnalysisAgent(Agent):
@@ -68,9 +71,36 @@ class ReviewAnalysisAgent(Agent):
...
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+public class ReviewAnalysisAgent extends Agent {
+
+ @Tool(description = "Notify the shipping manager when product received a negative review due to shipping damage.")
+ public static void notifyShippingManager(
+ @ToolParam(name = "id") String id, @ToolParam(name = "review") String review) {
+ CustomTypesAndResources.notifyShippingManager(id, review);
+ }
+
+ @ChatModelSetup
+ public static ResourceDescriptor reviewAnalysisModel() {
+ return ResourceDescriptor.Builder.newBuilder(OllamaChatModelSetup.class.getName())
+ .addInitialArgument("connection", "ollamaChatModelConnection")
+ ...
+ .addInitialArgument("tools", Collections.singletonList("notifyShippingManager")) // reference the tool by its name
+ .build();
+ }
+
+ ...
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
**Key points:**
-- Use `@tool` decorator to define the tool
+- Use the `@tool` decorator in Python (or the `@Tool` annotation in Java) to define the tool
- Reference the tool by its name in the `tools` list of the `ResourceDescriptor`
@@ -78,6 +108,9 @@ class ReviewAnalysisAgent(Agent):
Developer can register the tool to the execution environment, and then reference the tool by its name. This allows the tool to be reused by multiple agents.
+{{< tabs "Register Tool to Execution Environment" >}}
+
+{{< tab "Python" >}}
```python
def notify_shipping_manager(id: str, review: str) -> None:
    """Notify the shipping manager when product received a negative review due to
@@ -110,6 +143,37 @@ review_analysis_react_agent = ReActAgent(
...
)
```
+{{< /tab >}}
+
+{{< tab "Java" >}}
+```java
+@Tool(description = "Notify the shipping manager when product received a negative review due to shipping damage.")
+public static void notifyShippingManager(
+ @ToolParam(name = "id") String id, @ToolParam(name = "review") String review) {
+ ...
+}
+
+// Add notify shipping manager tool to the execution environment.
+agentsEnv
+ .addResource(
+ "notifyShippingManager",
+ ResourceType.TOOL,
+ org.apache.flink.agents.api.tools.Tool.fromMethod(
+ ReActAgentExample.class.getMethod(
+ "notifyShippingManager", String.class, String.class)));
+
+// Create react agent with notify shipping manager tool.
+ReActAgent reviewAnalysisReactAgent = new ReActAgent(
+ ResourceDescriptor.Builder.newBuilder(OllamaChatModelSetup.class.getName())
+ .addInitialArgument(
+ "tools", Collections.singletonList("notifyShippingManager")) // reference the tool by its name
+ ...
+ .build(),
+ ...);
+```
+{{< /tab >}}
+
+{{< /tabs >}}
**Key points:**
- Use `AgentsExecutionEnvironment.add_resource` to register the tool to the execution environment
@@ -121,6 +185,10 @@ review_analysis_react_agent = ReActAgent(
MCP (Model Context Protocol) is a standardized protocol for integrating AI applications with external data sources and tools. MCP tools allow dynamic tool retrieval from MCP servers.
{{< /hint >}}
+{{< hint warning >}}
+MCP Tool is currently only supported in Python.
+{{< /hint >}}
+
MCP tools are managed by external MCP servers and automatically discovered when you define an MCP server connection in your agent.
### Define MCP Server with Tools