[jira] [Commented] (IMPALA-12920) Support ai_generate_text built-in function for OpenAI's LLMs

2024-04-11 Thread ASF subversion and git services (Jira)


[ 
https://issues.apache.org/jira/browse/IMPALA-12920?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17836161#comment-17836161
 ] 

ASF subversion and git services commented on IMPALA-12920:
--

Commit 9837637d9342a49288a13a421d4e749818da1432 in impala's branch 
refs/heads/master from Yida Wu
[ https://gitbox.apache.org/repos/asf?p=impala.git;h=9837637d9 ]

IMPALA-12920: Support ai_generate_text built-in function for OpenAI's chat 
completion API

Added support for the following built-in functions:
- ai_generate_text_default(prompt)
- ai_generate_text(ai_endpoint, prompt, ai_model,
  ai_api_key_jceks_secret, additional_params)

'ai_endpoint', 'ai_model' and 'ai_api_key_jceks_secret' are flagfile
options. The 'ai_generate_text_default(prompt)' syntax expects all of
these to be set to proper values. The other syntax tries to use the
provided input parameter values, but falls back to the instance-level
values if the inputs are NULL or empty.
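
As a minimal sketch of the default syntax (the flag values below are
placeholder assumptions, not taken from this change), once the three
flagfile options are set a query only needs to supply the prompt:
-- assumed coordinator startup flags (hypothetical values):
--   --ai_endpoint=https://api.openai.com/v1/chat/completions
--   --ai_model=gpt-3.5-turbo
--   --ai_api_key_jceks_secret=open-ai-key
select ai_generate_text_default("hello")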

Only public OpenAI (api.openai.com) and Azure OpenAI (openai.azure.com)
API endpoints are currently supported.
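
As a hypothetical illustration of targeting an Azure OpenAI deployment
(the resource name, deployment name, api-version, model and JCEKS secret
name below are placeholders, and the exact URL shape is an assumption,
not part of this change):
select ai_generate_text(
"https://my-resource.openai.azure.com/openai/deployments/my-deployment/chat/completions?api-version=2024-02-01",
"hello", "gpt-35-turbo", "azure-open-ai-key",
'{"temperature": 0.5}')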

Exposed these functions in FunctionContext so that they can also be
called from UDFs:
- ai_generate_text_default(context, model)
- ai_generate_text(context, ai_endpoint, prompt, ai_model,
  ai_api_key_jceks_secret, additional_params)

Testing:
- Added unit tests for AiGenerateTextInternal function
- Added fe test for JniFrontend::getSecretFromKeyStore
- Ran manual tests to make sure Impala can talk to OpenAI LLMs using the
'ai_generate_text' built-in function. Example SQL:
select ai_generate_text("https://api.openai.com/v1/chat/completions;,
"hello", "gpt-3.5-turbo", "open-ai-key",
'{"temperature": 0.9, "model": "gpt-4"}')
- Tested using standalone UDF SDK and made sure that the UDFs can invoke
  BuiltInFunctions (ai_generate_text and ai_generate_text_default)

Change-Id: Id4446957f6030bab1f985fdd69185c3da07d7c4b
Reviewed-on: http://gerrit.cloudera.org:8080/21168
Reviewed-by: Impala Public Jenkins 
Tested-by: Impala Public Jenkins 


> Support ai_generate_text built-in function for OpenAI's LLMs
> 
>
> Key: IMPALA-12920
> URL: https://issues.apache.org/jira/browse/IMPALA-12920
> Project: IMPALA
>  Issue Type: Task
>Reporter: Abhishek Rawat
>Assignee: Abhishek Rawat
>Priority: Major
>
> Built-in function which can help communicate with [OpenAI's chat completion 
> API|https://platform.openai.com/docs/api-reference/chat] endpoint through SQL.






[jira] [Commented] (IMPALA-12920) Support ai_generate_text built-in function for OpenAI's LLMs

2024-04-03 Thread Michael Smith (Jira)


[ 
https://issues.apache.org/jira/browse/IMPALA-12920?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17833757#comment-17833757
 ] 

Michael Smith commented on IMPALA-12920:


We'll need some documentation for this.

> Support ai_generate_text built-in function for OpenAI's LLMs
> 
>
> Key: IMPALA-12920
> URL: https://issues.apache.org/jira/browse/IMPALA-12920
> Project: IMPALA
>  Issue Type: Task
>Reporter: Abhishek Rawat
>Assignee: Abhishek Rawat
>Priority: Major
>
> Built-in function which can help communicate with [OpenAI's chat completion 
> API|https://platform.openai.com/docs/api-reference/chat] endpoint through SQL.


