adamdebreceni commented on code in PR #1987:
URL: https://github.com/apache/nifi-minifi-cpp/pull/1987#discussion_r2526503275


##########
extensions/llamacpp/processors/DefaultLlamaContext.cpp:
##########
@@ -88,7 +87,7 @@ std::optional<std::string> DefaultLlamaContext::applyTemplate(const std::vector<
   std::transform(messages.begin(), messages.end(), std::back_inserter(llama_messages),
                  [](const LlamaChatMessage& msg) { return llama_chat_message{.role = msg.role.c_str(), .content = msg.content.c_str()}; });
   std::string text;
-  text.resize(utils::configuration::DEFAULT_BUFFER_SIZE);
+  text.resize(4096);

Review Comment:
   For now I have added a constant with the same name in this file so it remains searchable. In the future I think we would like to use the `size_t getBufferSize(const Configure& configuration);` from `ConfigurationUtils.h` instead, but it is not yet clear to me how we are going to do that (maybe store it extension-locally on extension init, or get it from the context instead). The `Configure` interface is quite heavy, and I don't think we want it to appear in its current form in the "c" extensions.
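   A minimal sketch of the searchable-constant approach described above (the constant name mirrors `utils::configuration::DEFAULT_BUFFER_SIZE`; the surrounding helper function is illustrative, not the actual patch):

```cpp
#include <cstddef>
#include <string>

namespace {
// Local, searchable mirror of utils::configuration::DEFAULT_BUFFER_SIZE,
// kept here to avoid pulling the heavy Configure interface into this
// extension (sketch; the real value would come from getBufferSize()
// once a lightweight way to query it is decided).
constexpr size_t DEFAULT_BUFFER_SIZE = 4096;
}  // namespace

// Illustrative helper: pre-size a scratch buffer the way applyTemplate does.
std::string makeTemplateBuffer() {
  std::string text;
  text.resize(DEFAULT_BUFFER_SIZE);
  return text;
}
```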



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
