avichaym opened a new issue, #523:
URL: https://github.com/apache/flink-agents/issues/523

   ### Search before asking
   
   - [x] I searched in the 
[issues](https://github.com/apache/flink-agents/issues) and found nothing 
similar.
   
   ### Description
   
   ## Motivation
   
   Flink Agents currently supports Ollama, OpenAI, Anthropic (direct), and 
Azure AI as chat model providers, and Ollama/OpenAI for embeddings. There is no 
integration for Amazon Bedrock, which is the primary LLM gateway for AWS 
customers.
   
   ## Proposed Changes
   
   Add two new integration modules:
   
   - **Chat model** (`integrations/chat-models/bedrock/`) — Uses the Bedrock 
Converse API with native tool-calling support; SigV4 auth is handled via the 
AWS SDK's `DefaultCredentialsProvider`. Supports every Bedrock model reachable 
through the Converse API (Claude, Llama, Mistral, Titan, etc.).
   
   - **Embedding model** (`integrations/embedding-models/bedrock/`) — Uses 
Titan Text Embeddings V2 via InvokeModel. The batch `embed(List<String>)` call 
parallelizes per-string requests on a configurable thread pool 
(`embed_concurrency` parameter, default 4).
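   For context on the chat-model bullet above, a minimal sketch of the JSON body the Converse API expects for a single user turn (shape per the Bedrock Converse API docs; the class and method names here are hypothetical, not part of the proposed module, and real requests would additionally be SigV4-signed):

   ```java
   /** Hypothetical illustration of a Converse request body for one user turn. */
   public class ConverseRequestSketch {

       /** Escapes backslashes and quotes so the text embeds safely in JSON. */
       static String escape(String s) {
           return s.replace("\\", "\\\\").replace("\"", "\\\"");
       }

       /** Minimal body for POST /model/{modelId}/converse. */
       public static String userTurnBody(String text, int maxTokens) {
           return "{\"messages\":[{\"role\":\"user\",\"content\":[{\"text\":\""
                   + escape(text) + "\"}]}],"
                   + "\"inferenceConfig\":{\"maxTokens\":" + maxTokens + "}}";
       }

       public static void main(String[] args) {
           // One uniform request shape regardless of the underlying model
           // (Claude, Llama, Mistral, Titan, ...) is what makes Converse
           // attractive as a single chat-model integration point.
           System.out.println(userTurnBody("Hello, Bedrock!", 512));
       }
   }
   ```

   The appeal of Converse over per-provider InvokeModel payloads is exactly this: one message/tool schema across all supported model families.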
   
   I have a working implementation with unit tests ready to submit as a PR.
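   The batch `embed()` fan-out described for the embedding module could be sketched roughly as below (assumption: the class, the `embedOne` callback, and the order-preserving strategy are illustrative stand-ins, not the actual implementation; in the real module each call would be a Titan `InvokeModel` request):

   ```java
   import java.util.ArrayList;
   import java.util.List;
   import java.util.concurrent.ExecutorService;
   import java.util.concurrent.Executors;
   import java.util.concurrent.Future;
   import java.util.function.Function;

   /** Hypothetical sketch of a thread-pooled batch embed() call. */
   public class BatchEmbedSketch {

       /**
        * Embeds each input on a fixed-size pool (the proposed
        * embed_concurrency knob), preserving input order in the result.
        */
       public static List<float[]> embedAll(List<String> inputs,
                                            int concurrency,
                                            Function<String, float[]> embedOne)
               throws Exception {
           ExecutorService pool = Executors.newFixedThreadPool(concurrency);
           try {
               List<Future<float[]>> futures = new ArrayList<>();
               for (String input : inputs) {
                   futures.add(pool.submit(() -> embedOne.apply(input)));
               }
               List<float[]> result = new ArrayList<>(inputs.size());
               for (Future<float[]> f : futures) {
                   result.add(f.get()); // join in submission order
               }
               return result;
           } finally {
               pool.shutdown();
           }
       }

       public static void main(String[] args) throws Exception {
           // Stand-in embedder; the real one would parse the "embedding"
           // array out of the Titan InvokeModel response.
           List<float[]> out = embedAll(List.of("a", "bb", "ccc"), 4,
                   s -> new float[] { s.length() });
           System.out.println(out.get(2)[0]); // 3.0
       }
   }
   ```

   Collecting futures in submission order keeps results aligned with inputs even when individual requests complete out of order.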
   
   
   ### Are you willing to submit a PR?
   
   - [x] I'm willing to submit a PR!


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
