Copilot commented on code in PR #932:
URL: https://github.com/apache/dubbo-go-samples/pull/932#discussion_r2387672583


##########
llm/go-server/cmd/server.go:
##########
@@ -114,18 +118,18 @@ func (s *ChatServer) Chat(ctx context.Context, req *chat.ChatRequest, stream cha
                messages = append(messages, messageContent)
        }
 
-       _, err = s.llm.GenerateContent(
+       err = s.llmService.GenerateContent(
                ctx,
                messages,
-               llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
+               func(ctx context.Context, chunk []byte) error {
                        if chunk == nil {
                                return nil
                        }
                        return stream.Send(&chat.ChatResponse{
                                Content: string(chunk),
                                Model:   cfg.ModelName,
                        })
-               }),
+               },
        )

Review Comment:
   The GenerateContent method signature changed from returning a result and an error to returning only an error, but the streaming callback should still be wrapped in the llms.WithStreamingFunc() option so the call matches the langchaingo API.



##########
llm/config/config.go:
##########
@@ -104,12 +141,51 @@ func Load(envFile string) (*Config, error) {
                        return
                }
 
+               // Load LLM base URL and API key
+               llmBaseURL := os.Getenv("LLM_BASE_URL")
+               llmAPIKey := os.Getenv("LLM_API_KEY")
+
+               // For backward compatibility with Ollama
                ollamaURL := os.Getenv("OLLAMA_URL")
-               if ollamaURL == "" {
-                       configErr = fmt.Errorf("OLLAMA_URL is not set")
+               if llmBaseURL == "" && ollamaURL != "" {
+                       // Use OLLAMA_URL as fallback for LLM_BASE_URL
+                       llmBaseURL = ollamaURL
+               }
+
+               // Set default URL for providers if not configured
+               if llmBaseURL == "" && config.LLMProvider == "ollama" {
+                       llmBaseURL = "http://localhost:11434"
+               }
+               if llmBaseURL == "" && config.LLMProvider == "openai" {
+                       llmBaseURL = "https://api.openai.com/v1"
+               }

Review Comment:
   These magic URL strings should be defined as constants. Consider defining provider-specific default URLs as package-level constants for better maintainability.
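A minimal sketch of what the comment suggests. The constant names and the `defaultBaseURL` helper are illustrative, not part of the PR; the idea is to lift the literals out of `Load` so each default URL is defined once at package level.

```go
package main

import "fmt"

// Provider-specific default base URLs, defined once at package level
// instead of as magic strings inside Load. Names are hypothetical.
const (
	defaultOllamaBaseURL = "http://localhost:11434"
	defaultOpenAIBaseURL = "https://api.openai.com/v1"
)

// defaultBaseURL returns the fallback base URL for a provider, or ""
// when the provider has no known default.
func defaultBaseURL(provider string) string {
	switch provider {
	case "ollama":
		return defaultOllamaBaseURL
	case "openai":
		return defaultOpenAIBaseURL
	default:
		return ""
	}
}

func main() {
	fmt.Println(defaultBaseURL("ollama"))
	fmt.Println(defaultBaseURL("openai"))
}
```

With a helper like this, the two `if llmBaseURL == "" && config.LLMProvider == ...` branches collapse into a single fallback: `if llmBaseURL == "" { llmBaseURL = defaultBaseURL(config.LLMProvider) }`.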



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

