tsaiggo opened a new pull request, #290:
URL: https://github.com/apache/flink-agents/pull/290

   <!--
   * Thank you very much for contributing to Flink Agents.
   * Please add the relevant components in the PR title. E.g., [api], 
[runtime], [java], [python], [hotfix], etc.
   -->
   
   <!-- Please link the PR to the relevant issue(s). Hotfix doesn't need this. 
-->
   Linked issue: https://github.com/apache/flink-agents/issues/286
   
   ### Purpose of change
   
   <!-- What is the purpose of this change? -->
   This PR adds support for the Azure AI chat model as a foundation model option for agents.
   
   ### Tests
   
   <!-- How is this change verified? -->
   I added one test to verify this change, but I ran into some issues while 
running the test cases:
   
   1. When I tried to run the `AgentWithOllamaExample`, the output stream didn't produce any data, and the same thing happened when I tested my new `AgentWithAzureAIExample`. My assumption is that this isn't related to the chat model implementation itself, but is more likely a higher-level issue (perhaps in the core agent logic or the streaming setup).
   2. Because of this, I switched to a different test case, 
`ReActAgentExample`. I modified it locally to use my Azure AI implementation, 
and it worked perfectly. However, I did not commit this change, as my PR should 
not introduce breaking changes to existing examples.
   3. Finally, since this PR integrates a cloud-hosted model, I cannot check in 
my personal endpoint or API key. Therefore, I created the new 
`AgentWithAzureAIExample` and leveraged the `MockChatModel` (from 
`AgentWithResource`) to add mock tests. The "happy path" for this mock test 
passes successfully.
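   The mock-based "happy path" test in point 3 can be sketched roughly as follows. All names below (`MockChatModel`, `ChatMessage`, `run_agent_once`) are hypothetical placeholders illustrating the pattern, not the actual `MockChatModel` from `AgentWithResource` or the real Flink Agents interfaces:

```python
from dataclasses import dataclass

# Hypothetical stand-ins, not the actual Flink Agents interfaces;
# they only illustrate the mock-chat-model testing pattern.
@dataclass
class ChatMessage:
    role: str
    content: str

class MockChatModel:
    """Returns a canned reply instead of calling a remote endpoint, so the
    agent wiring can be exercised without an Azure endpoint or API key."""

    def __init__(self, canned_reply: str):
        self.canned_reply = canned_reply
        self.requests = []  # recorded prompts, for assertions in tests

    def chat(self, messages):
        self.requests.append(list(messages))
        return ChatMessage(role="assistant", content=self.canned_reply)

def run_agent_once(model, user_input: str) -> str:
    """Stand-in for a single agent step that delegates to its chat model."""
    reply = model.chat([ChatMessage(role="user", content=user_input)])
    return reply.content

# "Happy path": the agent returns whatever the mock model was primed with,
# and the prompt the agent built can be inspected afterwards.
mock = MockChatModel(canned_reply="mocked answer")
assert run_agent_once(mock, "hello") == "mocked answer"
assert mock.requests[0][0].role == "user"
```

   The point of the pattern is that the test pins down the agent-to-model contract (what the agent sends and what it does with the reply) while keeping the cloud-hosted endpoint and credentials out of the repository.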
   
   
   ### API
   
   <!-- Does this change touch any public APIs? -->
   No
   
   ### Documentation
   
   <!-- Should this change be covered by the user documentation?-->
   Yes. The user documentation should be updated to mention the new Azure AI chat model, since we now offer more model options.
   

