xintongsong commented on code in PR #459:
URL: https://github.com/apache/flink-agents/pull/459#discussion_r2711754822


##########
docs/content/docs/development/mcp.md:
##########
@@ -187,4 +183,27 @@ public class ReviewAnalysisAgent extends Agent {
 
 **Key points:**
 - All tools and prompts from the MCP server are automatically registered.
-- Reference MCP prompts and tools by their names, like reference [local 
prompt]({{< ref "docs/development/prompts#using-prompts-in-agents" >}}) and 
[function tool]({{< ref 
"docs/development/tool_use#define-tool-as-static-method-in-agent-class" >}}) .
\ No newline at end of file
+- Reference MCP prompts and tools by their names, just like referencing a [local prompt]({{< ref "docs/development/prompts#using-prompts-in-agents" >}}) or a [function tool]({{< ref "docs/development/tool_use#define-tool-as-static-method-in-agent-class" >}}).
+
+## Appendix
+
+### MCP SDK
+
+Flink Agents offers two implementations of MCP support, built on the MCP SDKs for different languages (Python and Java). Typically, users do not need to be aware of this, as the framework automatically determines the appropriate implementation based on the agent language and JDK version. The default behavior is as follows:
+
+| Agent Language | JDK Version      | Default Implementation |
+|----------------|------------------|------------------------|
+| Python         | Any              | Python SDK  |
+| Java           | JDK 17+          | Java SDK    |
+| Java           | JDK 16 and below | Python SDK  |
+
+#### Overriding the Default (Java API Only)

Review Comment:
   Why is this Java only?



##########
docs/content/docs/development/chat_models.md:
##########
@@ -603,6 +603,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only)
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use cross-language chat models, use the special wrapper model provider and 
specify the original provider as an argument:
+
+- **From Python to Java**: Use `Constant.JAVA_CHAT_MODEL_CONNECTION` and 
`Constant.JAVA_CHAT_MODEL_SETUP`, specifying the Java provider class via the 
`java_clazz` parameter
+- **From Java to Python**: Use `Constant.PYTHON_CHAT_MODEL_CONNECTION` and 
`Constant.PYTHON_CHAT_MODEL_SETUP`, specifying the Python provider via `module` 
and `clazz` parameters
+
+
+
+### Usage Example
+
+{{< tabs "Cross-Language Chat Model Usage Example" >}}
+
+{{< tab "Using Java Chat Model in Python" >}}
+```python
+class MyAgent(Agent):
+
+    @chat_model_connection
+    @staticmethod
+    def java_chat_model_connection() -> ResourceDescriptor:
+        """
+        ChatModelConnection responsible for ollama model service connection.
+        
+        This wraps the Java OllamaChatModelConnection class. In pure Java, 
+        the equivalent code would be:
+        
+            @ChatModelConnection
+            public static ResourceDescriptor ollamaConnection() {
+                return ResourceDescriptor.Builder
+                    .newBuilder(Constant.OllamaChatModelConnection)
+                    .addInitialArgument("endpoint", "http://localhost:11434")
+                    .addInitialArgument("requestTimeout", 120)
+                    .build();
+            }
+        """
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_CHAT_MODEL_CONNECTION,
+            java_clazz="org.apache.flink.agents.integrations.chatmodels.ollama.OllamaChatModelConnection",
+            endpoint="http://localhost:11434",
+            requestTimeout=120,
+        )
+    
+    
+    @chat_model_setup
+    @staticmethod
+    def java_chat_model() -> ResourceDescriptor:
+        """ChatModel which focuses on math and reuses ChatModelConnection."""
+        return ResourceDescriptor(

Review Comment:
   This also needs an example.
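   If the intent is that the chat model setup also deserves a concrete snippet, a sketch could look roughly like the one below. This is only an illustration: the `OllamaChatModelSetup` class name and the `connection`/`model` arguments are assumptions, not verified against the Java integration module.
   ```python
   @chat_model_setup
   @staticmethod
   def java_chat_model() -> ResourceDescriptor:
       """Wraps a Java chat model setup and reuses the connection declared above."""
       return ResourceDescriptor(
           clazz=Constant.JAVA_CHAT_MODEL_SETUP,
           # Class name and arguments below are assumed for illustration only.
           java_clazz="org.apache.flink.agents.integrations.chatmodels.ollama.OllamaChatModelSetup",
           connection="java_chat_model_connection",
           model="qwen3:8b",
       )
   ```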



##########
docs/content/docs/development/chat_models.md:
##########
@@ -603,6 +603,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only)

Review Comment:
   ```suggestion
   Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only).
   ```



##########
docs/content/docs/development/chat_models.md:
##########
@@ -603,6 +603,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only)
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use cross-language chat models, use the special wrapper model provider and 
specify the original provider as an argument:

Review Comment:
   ```suggestion
   To leverage chat model support provided in a different language, you need to declare the resource within a built-in cross-language wrapper and specify the target provider as an argument:
   ```



##########
docs/content/docs/development/chat_models.md:
##########
@@ -603,6 +603,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only)
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use cross-language chat models, use the special wrapper model provider and 
specify the original provider as an argument:
+
+- **From Python to Java**: Use `Constant.JAVA_CHAT_MODEL_CONNECTION` and 
`Constant.JAVA_CHAT_MODEL_SETUP`, specifying the Java provider class via the 
`java_clazz` parameter
+- **From Java to Python**: Use `Constant.PYTHON_CHAT_MODEL_CONNECTION` and 
`Constant.PYTHON_CHAT_MODEL_SETUP`, specifying the Python provider via `module` 
and `clazz` parameters
+
+
+
+### Usage Example
+
+{{< tabs "Cross-Language Chat Model Usage Example" >}}
+
+{{< tab "Using Java Chat Model in Python" >}}
+```python
+class MyAgent(Agent):
+
+    @chat_model_connection
+    @staticmethod
+    def java_chat_model_connection() -> ResourceDescriptor:
+        """
+        ChatModelConnection responsible for ollama model service connection.
+        
+        This wraps the Java OllamaChatModelConnection class. In pure Java, 
+        the equivalent code would be:
+        
+            @ChatModelConnection
+            public static ResourceDescriptor ollamaConnection() {
+                return ResourceDescriptor.Builder
+                    .newBuilder(Constant.OllamaChatModelConnection)
+                    .addInitialArgument("endpoint", "http://localhost:11434")
+                    .addInitialArgument("requestTimeout", 120)
+                    .build();
+            }
+        """

Review Comment:
   Only the descriptor is needed.
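   E.g., the docstring could be trimmed so the method only documents and returns the descriptor, roughly (same values as in the current diff, just without the embedded Java equivalent):
   ```python
   @chat_model_connection
   @staticmethod
   def java_chat_model_connection() -> ResourceDescriptor:
       """Wraps the Java OllamaChatModelConnection for the Ollama model service."""
       return ResourceDescriptor(
           clazz=Constant.JAVA_CHAT_MODEL_CONNECTION,
           java_clazz="org.apache.flink.agents.integrations.chatmodels.ollama.OllamaChatModelConnection",
           endpoint="http://localhost:11434",
           requestTimeout=120,
       )
   ```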



##########
docs/content/docs/development/embedding_models.md:
##########
@@ -238,6 +238,131 @@ Current popular models include:
 Model availability and specifications may change. Always check the official 
OpenAI documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Resources

Review Comment:
   Embedding model and vector store also need to be updated.



##########
docs/content/docs/development/chat_models.md:
##########
@@ -603,6 +603,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only)
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use cross-language chat models, use the special wrapper model provider and 
specify the original provider as an argument:
+
+- **From Python to Java**: Use `Constant.JAVA_CHAT_MODEL_CONNECTION` and 
`Constant.JAVA_CHAT_MODEL_SETUP`, specifying the Java provider class via the 
`java_clazz` parameter

Review Comment:
   ```suggestion
   - **Using Java chat models in Python**: Use 
`Constant.JAVA_CHAT_MODEL_CONNECTION` and `Constant.JAVA_CHAT_MODEL_SETUP`, 
specifying the Java chat model connection / setup class via the `java_clazz` 
parameter
   ```


