This is an automated email from the ASF dual-hosted git repository.

xtsong pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/flink-agents.git


The following commit(s) were added to refs/heads/main by this push:
     new a15e7643 [doc] Introduce cross-language resources (#459)
a15e7643 is described below

commit a15e7643681620d0673638afca2869a2ffc207d8
Author: Eugene <[email protected]>
AuthorDate: Fri Jan 23 14:28:42 2026 +0800

    [doc] Introduce cross-language resources (#459)
---
 docs/content/docs/development/chat_models.md      | 159 +++++++++++++++++++++-
 docs/content/docs/development/embedding_models.md | 131 +++++++++++++++++-
 docs/content/docs/development/mcp.md              |  27 +++-
 docs/content/docs/development/vector_stores.md    | 132 +++++++++++++++++-
 docs/content/docs/faq/faq.md                      |   8 ++
 5 files changed, 440 insertions(+), 17 deletions(-)

diff --git a/docs/content/docs/development/chat_models.md 
b/docs/content/docs/development/chat_models.md
index ebfdd397..261d1348 100644
--- a/docs/content/docs/development/chat_models.md
+++ b/docs/content/docs/development/chat_models.md
@@ -146,8 +146,8 @@ public class MyAgent extends Agent {
 
 Azure AI provides cloud-based chat models through Azure AI Inference API, 
supporting various models including GPT-4, GPT-4o, and other Azure-hosted 
models.
 
-{{< hint warning >}}
-Azure AI is only supported in Java currently.
+{{< hint info >}}
+Azure AI is currently supported in the Java API only. To use Azure AI from Python agents, see [Using Cross-Language Providers](#using-cross-language-providers).
 {{< /hint >}}
 
 #### Prerequisites
@@ -235,8 +235,8 @@ Model availability and specifications may change. Always 
check the official Azur
 
 Anthropic provides cloud-based chat models featuring the Claude family, known 
for their strong reasoning, coding, and safety capabilities.
 
-{{< hint warning >}}
-Anthropic is only supported in python currently.
+{{< hint info >}}
+Anthropic is currently supported in the Python API only. To use Anthropic from Java agents, see [Using Cross-Language Providers](#using-cross-language-providers).
 {{< /hint >}}
 
 #### Prerequisites
@@ -611,8 +611,8 @@ Model availability and specifications may change. Always 
check the official Open
 
 Tongyi provides cloud-based chat models from Alibaba Cloud, offering powerful 
Chinese and English language capabilities.
 
-{{< hint warning >}}
-Tongyi is only supported in python currently.
+{{< hint info >}}
+Tongyi is currently supported in the Python API only. To use Tongyi from Java agents, see [Using Cross-Language Providers](#using-cross-language-providers).
 {{< /hint >}}
 
 #### Prerequisites
@@ -680,6 +680,153 @@ Some popular options include:
 Model availability and specifications may change. Always check the official 
DashScope documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language chat model integration, allowing you to 
use chat models implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a chat model 
provider is only available in one language (e.g., Tongyi is currently 
Python-only).
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use chat model support provided in a different language, declare the resource with a built-in cross-language wrapper and specify the target provider as an argument:
+
+- **Using Java chat models in Python**: Use 
`Constant.JAVA_CHAT_MODEL_CONNECTION` and `Constant.JAVA_CHAT_MODEL_SETUP`, 
specifying the Java provider class via the `java_clazz` parameter
+- **Using Python chat models in Java**: Use 
`Constant.PYTHON_CHAT_MODEL_CONNECTION` and `Constant.PYTHON_CHAT_MODEL_SETUP`, 
specifying the Python provider via `module` and `clazz` parameters
+
+### Usage Example
+
+{{< tabs "Cross-Language Chat Model Usage Example" >}}
+
+{{< tab "Using Java Chat Model in Python" >}}
+```python
+class MyAgent(Agent):
+
+    @chat_model_connection
+    @staticmethod
+    def java_chat_model_connection() -> ResourceDescriptor:
+        # In pure Java, the equivalent ResourceDescriptor would be:
+        # ResourceDescriptor.Builder
+        #     .newBuilder(Constant.OllamaChatModelConnection)
+        #     .addInitialArgument("endpoint", "http://localhost:11434")
+        #     .addInitialArgument("requestTimeout", 120)
+        #     .build();
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_CHAT_MODEL_CONNECTION,
+            
java_clazz="org.apache.flink.agents.integrations.chatmodels.ollama.OllamaChatModelConnection",
+            endpoint="http://localhost:11434",
+            requestTimeout=120,
+        )
+
+    @chat_model_setup
+    @staticmethod
+    def java_chat_model() -> ResourceDescriptor:
+        # In pure Java, the equivalent ResourceDescriptor would be:
+        # ResourceDescriptor.Builder
+        #     .newBuilder(Constant.OllamaChatModelSetup)
+        #     .addInitialArgument("connection", "java_chat_model_connection")
+        #     .addInitialArgument("model", "qwen3:8b")
+        #     .addInitialArgument("prompt", "my_prompt")
+        #     .addInitialArgument("tools", List.of("my_tool1", "my_tool2"))
+        #     .addInitialArgument("extractReasoning", true)
+        #     .build();
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_CHAT_MODEL_SETUP,
+            java_clazz="org.apache.flink.agents.integrations.chatmodels.ollama.OllamaChatModelSetup",
+            connection="java_chat_model_connection",
+            model="qwen3:8b",
+            prompt="my_prompt",
+            tools=["my_tool1", "my_tool2"],
+            extract_reasoning=True,
+        )
+
+    @action(InputEvent)
+    @staticmethod
+    def process_input(event: InputEvent, ctx: RunnerContext) -> None:
+        # Create a chat request with user message
+        user_message = ChatMessage(
+            role=MessageRole.USER,
+            content=f"input: {event.input}"
+        )
+        ctx.send_event(
+            ChatRequestEvent(model="java_chat_model", messages=[user_message])
+        )
+
+    @action(ChatResponseEvent)
+    @staticmethod
+    def process_response(event: ChatResponseEvent, ctx: RunnerContext) -> None:
+        response_content = event.response.content
+        # Handle the LLM's response
+        # Process the response as needed for your use case
+```
+{{< /tab >}}
+
+{{< tab "Using Python Chat Model in Java" >}}
+```java
+public class MyAgent extends Agent {
+
+    @ChatModelConnection
+    public static ResourceDescriptor pythonChatModelConnection() {
+        // In pure Python, the equivalent ResourceDescriptor would be:
+        // ResourceDescriptor(
+        //     clazz=Constant.OLLAMA_CHAT_MODEL_CONNECTION,
+        //     request_timeout=120.0
+        // )
+        return 
ResourceDescriptor.Builder.newBuilder(Constant.PYTHON_CHAT_MODEL_CONNECTION)
+                .addInitialArgument(
+                        "module", 
"flink_agents.integrations.chat_models.ollama_chat_model")
+                .addInitialArgument("clazz", "OllamaChatModelConnection")
+                .addInitialArgument("request_timeout", 120.0)
+                .build();
+    }
+
+    @ChatModelSetup
+    public static ResourceDescriptor pythonChatModel() {
+        // In pure Python, the equivalent ResourceDescriptor would be:
+        // ResourceDescriptor(
+        //     clazz=Constant.OLLAMA_CHAT_MODEL_SETUP,
+        //     connection="pythonChatModelConnection",
+        //     model="qwen3:8b",
+        //     tools=["tool1", "tool2"],
+        //     extract_reasoning=True
+        // )
+        return 
ResourceDescriptor.Builder.newBuilder(Constant.PYTHON_CHAT_MODEL_SETUP)
+                .addInitialArgument(
+                        "module", 
"flink_agents.integrations.chat_models.ollama_chat_model")
+                .addInitialArgument("clazz", "OllamaChatModelSetup")
+                .addInitialArgument("connection", "pythonChatModelConnection")
+                .addInitialArgument("model", "qwen3:8b")
+                .addInitialArgument("tools", List.of("tool1", "tool2"))
+                .addInitialArgument("extract_reasoning", true)
+                .build();
+    }
+
+    @Action(listenEvents = {InputEvent.class})
+    public static void processInput(InputEvent event, RunnerContext ctx) 
throws Exception {
+        ChatMessage userMessage =
+                new ChatMessage(MessageRole.USER, String.format("input: %s", event.getInput()));
+        ctx.sendEvent(new ChatRequestEvent("pythonChatModel", 
List.of(userMessage)));
+    }
+
+    @Action(listenEvents = {ChatResponseEvent.class})
+    public static void processResponse(ChatResponseEvent event, RunnerContext 
ctx)
+            throws Exception {
+        String response = event.getResponse().getContent();
+        // Handle the LLM's response
+        // Process the response as needed for your use case
+    }
+}
+```
+{{< /tab >}}
+
+{{< /tabs >}}
+
 ## Custom Providers
 
 {{< hint warning >}}
diff --git a/docs/content/docs/development/embedding_models.md 
b/docs/content/docs/development/embedding_models.md
index 288d300d..400c477d 100644
--- a/docs/content/docs/development/embedding_models.md
+++ b/docs/content/docs/development/embedding_models.md
@@ -24,10 +24,6 @@ under the License.
 
 # Embedding Models
 
-{{< hint info >}}
-Embedding models are currently supported in the Python API only. Java API 
support is planned for future releases.
-{{< /hint >}}
-
 {{< hint info >}}
 This page covers text-based embedding models. Flink agents does not currently 
support multimodal embeddings.
 {{< /hint >}}
@@ -172,6 +168,10 @@ Model availability and specifications may change. Always 
check the official Olla
 
 OpenAI provides cloud-based embedding models with state-of-the-art performance.
 
+{{< hint info >}}
+OpenAI embedding models are currently supported in the Python API only. To use 
OpenAI from Java agents, see [Using Cross-Language 
Providers](#using-cross-language-providers).
+{{< /hint >}}
+
 #### Prerequisites
 
 1. Get an API key from [OpenAI Platform](https://platform.openai.com/)
@@ -238,6 +238,129 @@ Current popular models include:
 Model availability and specifications may change. Always check the official 
OpenAI documentation for the latest information before implementing in 
production.
 {{< /hint >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language embedding model integration, allowing you 
to use embedding models implemented in one language (Java or Python) from 
agents written in the other language. This is particularly useful when an 
embedding model provider is only available in one language (e.g., OpenAI 
embedding is currently Python-only).
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use embedding model support provided in a different language, declare the resource with a built-in cross-language wrapper and specify the target provider as an argument:
+
+- **Using Java embedding models in Python**: Use 
`Constant.JAVA_EMBEDDING_MODEL_CONNECTION` and 
`Constant.JAVA_EMBEDDING_MODEL_SETUP`, specifying the Java provider class via 
the `java_clazz` parameter
+- **Using Python embedding models in Java**: Use 
`Constant.PYTHON_EMBEDDING_MODEL_CONNECTION` and 
`Constant.PYTHON_EMBEDDING_MODEL_SETUP`, specifying the Python provider via 
`module` and `clazz` parameters
+
+### Usage Example
+
+{{< tabs "Cross-Language Embedding Model Usage Example" >}}
+
+{{< tab "Using Java Embedding Model in Python" >}}
+
+```python
+class MyAgent(Agent):
+
+    @embedding_model_connection
+    @staticmethod
+    def java_embedding_connection() -> ResourceDescriptor:
+        # In pure Java, the equivalent ResourceDescriptor would be:
+        # ResourceDescriptor.Builder
+        #     .newBuilder(Constant.OllamaEmbeddingModelConnection)
+        #     .addInitialArgument("host", "http://localhost:11434")
+        #     .build();
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_EMBEDDING_MODEL_CONNECTION,
+            
java_clazz="org.apache.flink.agents.integrations.embeddingmodels.ollama.OllamaEmbeddingModelConnection",
+            host="http://localhost:11434"
+        )
+
+    @embedding_model_setup
+    @staticmethod
+    def java_embedding_model() -> ResourceDescriptor:
+        # In pure Java, the equivalent ResourceDescriptor would be:
+        # ResourceDescriptor.Builder
+        #     .newBuilder(Constant.OllamaEmbeddingModelSetup)
+        #     .addInitialArgument("connection", "java_embedding_connection")
+        #     .addInitialArgument("model", "nomic-embed-text")
+        #     .build();
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_EMBEDDING_MODEL_SETUP,
+            
java_clazz="org.apache.flink.agents.integrations.embeddingmodels.ollama.OllamaEmbeddingModelSetup",
+            connection="java_embedding_connection",
+            model="nomic-embed-text"
+        )
+
+    @action(InputEvent)
+    @staticmethod
+    def process_input(event: InputEvent, ctx: RunnerContext) -> None:
+        # Use the Java embedding model from Python
+        embedding_model = ctx.get_resource("java_embedding_model", 
ResourceType.EMBEDDING_MODEL)
+        embedding = embedding_model.embed(str(event.input))
+        # Process the embedding vector as needed
+```
+
+{{< /tab >}}
+
+{{< tab "Using Python Embedding Model in Java" >}}
+
+```java
+public class MyAgent extends Agent {
+
+    @EmbeddingModelConnection
+    public static ResourceDescriptor pythonEmbeddingConnection() {
+        // In pure Python, the equivalent ResourceDescriptor would be:
+        // ResourceDescriptor(
+        //     clazz=Constant.OLLAMA_EMBEDDING_MODEL_CONNECTION,
+        //     base_url="http://localhost:11434"
+        // )
+        return 
ResourceDescriptor.Builder.newBuilder(Constant.PYTHON_EMBEDDING_MODEL_CONNECTION)
+                .addInitialArgument(
+                        "module", 
+                        
"flink_agents.integrations.embedding_models.local.ollama_embedding_model")
+                .addInitialArgument("clazz", "OllamaEmbeddingModelConnection")
+                .addInitialArgument("base_url", "http://localhost:11434")
+                .build();
+    }
+
+    @EmbeddingModelSetup
+    public static ResourceDescriptor pythonEmbeddingModel() {
+        // In pure Python, the equivalent ResourceDescriptor would be:
+        // ResourceDescriptor(
+        //     clazz=Constant.OLLAMA_EMBEDDING_MODEL_SETUP,
+        //     connection="ollama_connection",
+        //     model="nomic-embed-text"
+        // )
+        return 
ResourceDescriptor.Builder.newBuilder(Constant.PYTHON_EMBEDDING_MODEL_SETUP)
+                .addInitialArgument(
+                        "module", 
+                        
"flink_agents.integrations.embedding_models.local.ollama_embedding_model")
+                .addInitialArgument("clazz", "OllamaEmbeddingModelSetup")
+                .addInitialArgument("connection", "pythonEmbeddingConnection")
+                .addInitialArgument("model", "nomic-embed-text")
+                .build();
+    }
+
+    @Action(listenEvents = {InputEvent.class})
+    public static void processInput(InputEvent event, RunnerContext ctx) 
throws Exception {
+        // Use the Python embedding model from Java
+        BaseEmbeddingModelSetup embeddingModel = 
+            (BaseEmbeddingModelSetup) ctx.getResource(
+                "pythonEmbeddingModel", 
+                ResourceType.EMBEDDING_MODEL);
+        float[] embedding = embeddingModel.embed((String) event.getInput());
+        // Process the embedding vector as needed
+    }
+}
+```
+
+{{< /tab >}}
+
+{{< /tabs >}}
+
 ## Custom Providers
 
 {{< hint warning >}}
diff --git a/docs/content/docs/development/mcp.md 
b/docs/content/docs/development/mcp.md
index b44b47a4..de5bf697 100644
--- a/docs/content/docs/development/mcp.md
+++ b/docs/content/docs/development/mcp.md
@@ -26,10 +26,6 @@ under the License.
 
 MCP (Model Context Protocol) is a standardized protocol for integrating AI 
applications with external data sources and tools. Flink Agents provides the 
support for using prompts and tools from MCP server.
 
-{{< hint warning >}}
-**JDK Requirement (Java API Only):** If you are using the **Java API** to 
develop Flink Agents jobs with MCP, you need **JDK 17 or higher**. This 
requirement does not apply to **Python API** users - the Python SDK has its own 
MCP implementation and works with JDK 11+.
-{{< /hint >}}
-
 ## Declare MCP Server in Agent
 
 Developer can declare a mcp server by decorator/annotation when creating an 
Agent.
@@ -187,4 +183,25 @@ public class ReviewAnalysisAgent extends Agent {
 
 **Key points:**
 - All tools and prompts from the MCP server are automatically registered.
-- Reference MCP prompts and tools by their names, like reference [local 
prompt]({{< ref "docs/development/prompts#using-prompts-in-agents" >}}) and 
[function tool]({{< ref 
"docs/development/tool_use#define-tool-as-static-method-in-agent-class" >}}) .
\ No newline at end of file
+- Reference MCP prompts and tools by their names, just like referencing a [local prompt]({{< ref "docs/development/prompts#using-prompts-in-agents" >}}) or a [function tool]({{< ref "docs/development/tool_use#define-tool-as-static-method-in-agent-class" >}}).
+
+## Appendix
+
+### MCP SDK
+
+Flink Agents offers two implementations of MCP support, based on the MCP SDKs of the two languages (Python and Java). Typically, users do not need to be aware of this, as the framework automatically selects the appropriate implementation based on the agent language and JDK version. The default behavior is as follows:
+
+| Agent Language | JDK Version      | Default Implementation |
+|----------------|------------------|------------------------|
+| Python         | Any              | Python SDK  |
+| Java           | JDK 17+          | Java SDK    |
+| Java           | JDK 16 and below | Python SDK  |
+
+As shown in the table above, for Java agents running on JDK 17+, the framework 
automatically uses the Java SDK implementation. If you need to use the Python 
SDK instead (not recommended), you can set the `lang` parameter to `"python"` 
in the `@MCPServer` annotation:
+```java
+@MCPServer(lang = "python")
+public static ResourceDescriptor myMcp() {
+    // ...
+}
+```
\ No newline at end of file
diff --git a/docs/content/docs/development/vector_stores.md 
b/docs/content/docs/development/vector_stores.md
index e628108d..1ef71d7c 100644
--- a/docs/content/docs/development/vector_stores.md
+++ b/docs/content/docs/development/vector_stores.md
@@ -254,7 +254,7 @@ public class MyAgent extends Agent {
 [Chroma](https://www.trychroma.com/home) is an open-source vector database 
that provides efficient storage and querying of embeddings with support for 
multiple deployment modes.
 
 {{< hint info >}}
-Chroma is currently supported in the Python API only.
+Chroma is currently supported in the Python API only. To use Chroma from Java 
agents, see [Using Cross-Language Providers](#using-cross-language-providers).
 {{< /hint >}}
 
 #### Prerequisites
@@ -390,7 +390,7 @@ def chroma_store() -> ResourceDescriptor:
 [Elasticsearch](https://www.elastic.co/elasticsearch/) is a distributed, 
RESTful search and analytics engine that supports vector search through dense 
vector fields and K-Nearest Neighbors (KNN).
 
 {{< hint info >}}
-Elasticsearch is currently supported in the Java API only.
+Elasticsearch is currently supported in the Java API only. To use 
Elasticsearch from Python agents, see [Using Cross-Language 
Providers](#using-cross-language-providers).
 {{< /hint >}}
 
 #### Prerequisites
@@ -445,6 +445,134 @@ public static ResourceDescriptor vectorStore() {
 
 {{< /tabs >}}
 
+## Using Cross-Language Providers
+
+Flink Agents supports cross-language vector store integration, allowing you to 
use vector stores implemented in one language (Java or Python) from agents 
written in the other language. This is particularly useful when a vector store 
provider is only available in one language (e.g., Elasticsearch is currently 
Java-only, Chroma is currently Python-only).
+
+{{< hint warning >}}
+**Limitations:**
+- Cross-language resources are currently supported only when [running in 
Flink]({{< ref "docs/operations/deployment#run-in-flink" >}}), not in local 
development mode
+- Complex object serialization between languages may have limitations
+{{< /hint >}}
+
+### How To Use
+
+To use vector store support provided in a different language, declare the resource with a built-in cross-language wrapper and specify the target provider as an argument:
+
+- **Using Java vector stores in Python**: Use 
`Constant.JAVA_COLLECTION_MANAGEABLE_VECTOR_STORE`, specifying the Java 
provider class via the `java_clazz` parameter
+- **Using Python vector stores in Java**: Use 
`Constant.PYTHON_COLLECTION_MANAGEABLE_VECTOR_STORE`, specifying the Python 
provider via `module` and `clazz` parameters
+
+### Usage Example
+
+{{< tabs "Cross-Language Vector Store Usage Example" >}}
+
+{{< tab "Using Java Vector Store in Python" >}}
+
+```python
+class MyAgent(Agent):
+
+    # Define embedding model (can be Java or Python implementation)
+    @embedding_model_connection
+    @staticmethod
+    def my_embedding_connection() -> ResourceDescriptor:
+        # Configure embedding model connection as needed
+        pass
+
+    @embedding_model_setup
+    @staticmethod
+    def my_embedding_model() -> ResourceDescriptor:
+        # Configure embedding model setup as needed
+        pass
+
+    # Use Java vector store with embedding model
+    @vector_store
+    @staticmethod
+    def java_vector_store() -> ResourceDescriptor:
+        # In pure Java, the equivalent ResourceDescriptor would be:
+        # ResourceDescriptor.Builder
+        #     .newBuilder(Constant.ElasticsearchVectorStore)
+        #     .addInitialArgument("embedding_model", "my_embedding_model")
+        #     .addInitialArgument("host", "http://localhost:9200")
+        #     .addInitialArgument("index", "my_documents")
+        #     .addInitialArgument("dims", 768)
+        #     .build();
+        return ResourceDescriptor(
+            clazz=Constant.JAVA_COLLECTION_MANAGEABLE_VECTOR_STORE,
+            
java_clazz="org.apache.flink.agents.integrations.vectorstores.elasticsearch.ElasticsearchVectorStore",
+            embedding_model="my_embedding_model",
+            host="http://localhost:9200",
+            index="my_documents",
+            dims=768
+        )
+
+    @action(InputEvent)
+    @staticmethod
+    def process_input(event: InputEvent, ctx: RunnerContext) -> None:
+        # Use Java vector store from Python
+        vector_store = ctx.get_resource("java_vector_store", 
ResourceType.VECTOR_STORE)
+        
+        # Perform semantic search
+        query = VectorStoreQuery(query_text=str(event.input), limit=3)
+        result = vector_store.query(query)
+        
+        # Process the retrieved documents
+```
+
+{{< /tab >}}
+
+{{< tab "Using Python Vector Store in Java" >}}
+
+```java
+public class MyAgent extends Agent {
+
+    // Define embedding model (can be Java or Python implementation)
+    @EmbeddingModelConnection
+    public static ResourceDescriptor myEmbeddingConnection() {
+        // Configure embedding model connection as needed
+        return null;
+    }
+
+    @EmbeddingModelSetup
+    public static ResourceDescriptor myEmbeddingModel() {
+        // Configure embedding model setup as needed
+        return null;
+    }
+    
+    @VectorStore
+    public static ResourceDescriptor pythonVectorStore() {
+        // In pure Python, the equivalent ResourceDescriptor would be:
+        // ResourceDescriptor(
+        //     clazz=Constant.CHROMA_VECTOR_STORE,
+        //     embedding_model="my_embedding_model",
+        // )
+        return ResourceDescriptor.Builder.newBuilder(Constant.PYTHON_COLLECTION_MANAGEABLE_VECTOR_STORE)
+                .addInitialArgument(
+                        "module", 
+                        
"flink_agents.integrations.vector_stores.chroma.chroma_vector_store")
+                .addInitialArgument("clazz", "ChromaVectorStore")
+                .addInitialArgument("embedding_model", "myEmbeddingModel")
+                .build();
+    }
+
+    @Action(listenEvents = {InputEvent.class})
+    public static void processInput(InputEvent event, RunnerContext ctx) 
throws Exception {
+        // Use Python vector store from Java
+        VectorStore vectorStore = 
+            (VectorStore) ctx.getResource("pythonVectorStore", 
ResourceType.VECTOR_STORE);
+        
+        // Perform semantic search
+        VectorStoreQuery query = new VectorStoreQuery((String) 
event.getInput(), 3);
+        VectorStoreQueryResult result = vectorStore.query(query);
+        
+        // Process the retrieved documents
+    }
+}
+```
+
+{{< /tab >}}
+
+{{< /tabs >}}
+
 ## Custom Providers
 
 {{< hint warning >}}
diff --git a/docs/content/docs/faq/faq.md b/docs/content/docs/faq/faq.md
index ae90333c..a0f3a038 100644
--- a/docs/content/docs/faq/faq.md
+++ b/docs/content/docs/faq/faq.md
@@ -50,3 +50,11 @@ To ensure stability and compatibility when running Flink 
Agents jobs, please be
     ```
 
   If you see an error like this, switch immediately to one of the officially 
recommended installation methods and confirm that you're using a supported 
Python version.
+
+## Q2: Why do cross-language resources not work in local development mode?
+
+Cross-language resources, such as using Java resources from Python or vice 
versa, are currently supported only when running in Flink. Local development 
mode does not support cross-language resources.
+
+This limitation exists because cross-language communication requires the Flink 
runtime environment to effectively bridge Java and Python processes. In local 
development mode, this bridge is unavailable.
+
+To use cross-language resources, deploy the job to a Flink cluster (for example, a standalone cluster) and test the functionality there.
