SplitfireUptown commented on code in PR #8551:
URL: https://github.com/apache/seatunnel/pull/8551#discussion_r1955770872
##########
seatunnel-transforms-v2/src/test/java/org/apache/seatunnel/transform/llm/LLMRequestJsonTest.java:
##########
@@ -209,4 +208,50 @@ void testCustomRequestJson() throws IOException {
"{\"messages\":[{\"role\":\"system\",\"content\":\"Determine
whether someone is Chinese or American by their
name\"},{\"role\":\"user\",\"content\":\"{\\\"id\\\":1,
\\\"name\\\":\\\"John\\\"}\"}],\"model\":\"custom-model\"}",
OBJECT_MAPPER.writeValueAsString(node));
}
+
+ @Test
+ void testCustomOllamaRequestJson() throws IOException {
Review Comment:
Hmm, the new code was added inside
org.apache.seatunnel.transform.nlpmodel.llm.remote.custom.CustomModel#chatWithModel,
so to exercise it I need to call
org.apache.seatunnel.transform.nlpmodel.llm.remote.AbstractModel#inference from
the test code. But testing it that way requires a server that mocks Ollama
responses, because the HTTP request code and the new code live in the same
function. I've tried many times and can't figure out how to drive it from
outside in a test. Can you help me, please?
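One common way around this, when the HTTP call and the response handling sit in one method, is to stand up a throwaway local HTTP server in the test and point the model's endpoint URL at it. Below is a minimal, self-contained sketch using the JDK's built-in `com.sun.net.httpserver.HttpServer` and `java.net.http.HttpClient` (Java 11+). The `/api/chat` path and the canned response shape are assumptions loosely modeled on Ollama's chat API, and the commented-out `CustomModel` wiring is hypothetical — adapt both to the actual constructor and endpoint used in this PR.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Sketch: a local mock "Ollama" server for tests. The response body and the
// /api/chat path are assumptions, not the project's actual API contract.
public class MockOllamaServerSketch {
    public static void main(String[] args) throws IOException, InterruptedException {
        String cannedBody =
                "{\"message\":{\"role\":\"assistant\",\"content\":\"Chinese\"}}";

        // Port 0 lets the OS pick a free port, so parallel test runs don't clash.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/api/chat", exchange -> {
            byte[] body = cannedBody.getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        String baseUrl = "http://localhost:" + server.getAddress().getPort();

        // In a real test you would instead pass baseUrl into the model, e.g.
        // (hypothetical constructor): new CustomModel(..., baseUrl + "/api/chat", ...)
        // and then call model.inference(...).
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/api/chat"))
                .POST(HttpRequest.BodyPublishers.ofString("{\"model\":\"custom-model\"}"))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());

        server.stop(0);
    }
}
```

The longer-term fix is usually to split `chatWithModel` so the request-building and response-parsing logic are separate methods from the HTTP call itself; then the parsing can be unit-tested directly without any server at all.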
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.