[ 
https://issues.apache.org/jira/browse/MINIFICPP-2556?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Gábor Gyimesi updated MINIFICPP-2556:
-------------------------------------
    Description: Create a processor using the llama.cpp library to run 
inference on a local model, with a specific prompt applied to the flow file 
content.  (was: Create a processor using the llama.cpp library to run 
inference on specific prompt using the flow file data.)

> Create llama.cpp processor for LLM inference
> --------------------------------------------
>
>                 Key: MINIFICPP-2556
>                 URL: https://issues.apache.org/jira/browse/MINIFICPP-2556
>             Project: Apache NiFi MiNiFi C++
>          Issue Type: New Feature
>            Reporter: Gábor Gyimesi
>            Assignee: Gábor Gyimesi
>            Priority: Major
>
> Create a processor using the llama.cpp library to run inference on a local 
> model, with a specific prompt applied to the flow file content.



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
