yunfengzhou-hub opened a new pull request, #27155:
URL: https://github.com/apache/flink/pull/27155

   ## What is the purpose of the change
   
   This PR proposes to pass several OpenAI API parameters through the model function's parameters, in order to better support controlling the randomness and output format of the Chat Model Function.
   
   
   ## Brief change log
   
   The following parameters are proposed to be passed (see the usage sketch below the list):
   
   - presence-penalty: Number between -2.0 and 2.0. Positive values penalize 
new tokens based on whether they appear in the text so far, increasing the 
model's likelihood to talk about new topics.
   - n: How many chat completion choices to generate for each input message.
   - seed: The random seed used when generating results, so that repeated requests can produce more deterministic output.
   - response-format: The format of the response, e.g., 'text' or 'json_object'.
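
   For illustration, below is a minimal Flink SQL sketch of how these settings could be supplied when defining an OpenAI chat model. Only the four options listed above come from this PR; the model name, schema, and the remaining option keys (provider, endpoint, api-key, model) are assumptions for the example and may not match the connector's actual option names:

   ```sql
   -- Hypothetical model definition; only presence-penalty, n, seed and
   -- response-format are the options introduced by this PR.
   CREATE MODEL chat_model
   INPUT (prompt STRING)
   OUTPUT (response STRING)
   WITH (
     'provider' = 'openai',
     'endpoint' = 'https://api.openai.com/v1/chat/completions',
     'api-key' = '<api-key>',
     'model' = 'gpt-4o-mini',
     'presence-penalty' = '1.0',   -- nudge the model toward new topics
     'n' = '1',                    -- number of completion choices per input
     'seed' = '42',                -- best-effort deterministic sampling
     'response-format' = 'text'    -- or 'json_object'
   );

   -- Hypothetical invocation of the model on a table of prompts.
   SELECT * FROM ML_PREDICT(TABLE prompts, MODEL chat_model, DESCRIPTOR(prompt));
   ```

   The sketch only illustrates the intent of the four new settings; how exactly they are passed is defined by the changes in this PR.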
   
   
   ## Verifying this change
   
   Unit tests are added to `OpenAIChatModelTest` to verify the newly introduced parameters.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): yes
     - The public API, i.e., is any changed class annotated with 
`@Public(Evolving)`: no
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): yes
     - Anything that affects deployment or recovery: JobManager (and its 
components), Checkpointing, Kubernetes/Yarn, ZooKeeper: no
     - The S3 file system connector: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? yes
     - If yes, how is the feature documented? docs
   

