imbajin commented on code in PR #302:
URL: https://github.com/apache/incubator-hugegraph-ai/pull/302#discussion_r2447065027


##########
hugegraph-llm/README.md:
##########
@@ -248,6 +170,79 @@ The system supports both English and Chinese prompts. To switch languages:
 
 **LLM Provider Support**: This project uses [LiteLLM](https://docs.litellm.ai/docs/providers) for multi-provider LLM support.
 
+### Programmatic Examples (new workflow engine)
+
+If you previously used high-level classes like `RAGPipeline` or `KgBuilder`, the project now exposes stable flows through the `Scheduler` API. Use `SchedulerSingleton.get_instance().schedule_flow(...)` to invoke workflows programmatically. Below are concise, working examples that match the new architecture.
+
+1) RAG (graph-only) query example
+
+```python
+from hugegraph_llm.flows.scheduler import SchedulerSingleton
+
+scheduler = SchedulerSingleton.get_instance()
+res = scheduler.schedule_flow(
+  "rag_graph_only",
+  query="Tell me about Al Pacino.",
+  graph_only_answer=True,
+  vector_only_answer=False,
+  raw_answer=False,
+  gremlin_tmpl_num=-1,
+  gremlin_prompt=None,
+)
+
+print(res.get("graph_only_answer"))
+```
+
+2) RAG (vector-only) query example
+
+```python
+from hugegraph_llm.flows.scheduler import SchedulerSingleton
+
+scheduler = SchedulerSingleton.get_instance()
+res = scheduler.schedule_flow(
+  "rag_vector_only",
+  query="Summarize the career of Ada Lovelace.",
+  vector_only_answer=True,
+  vector_search=True
+)
+
+print(res.get("vector_only_answer"))
+```
+
+3) Text -> Gremlin (text2gremlin) example
+
+```python
+from hugegraph_llm.flows.scheduler import SchedulerSingleton
+
+scheduler = SchedulerSingleton.get_instance()
+response = scheduler.schedule_flow(
+  "text2gremlin",
+  "find people who worked with Alan Turing",
+  2,  # example_num
+  "hugegraph",  # schema_input (graph name or schema)
+  None,  # gremlin_prompt_input (optional)
+  ["template_gremlin", "raw_gremlin"],
+)
+
+print(response.get("template_gremlin"))
+```
+
+4) Build example index (used by text2gremlin examples)
+
+```python
+from hugegraph_llm.flows.scheduler import SchedulerSingleton
+
+examples = [{"id": "natural language query", "gremlin": "g.V().hasLabel('person').valueMap()"}]
+res = SchedulerSingleton.get_instance().schedule_flow("build_examples_index", examples)
+print(res)
+```
+

Review Comment:
   ⚠️ **Important: Migration guide incomplete**
   
   While the migration note is helpful, it lacks concrete examples:
   - No side-by-side comparison of old vs new API
   - Doesn't explain parameter mapping
   - Missing error handling differences
   
   **Recommendation**: Add concrete migration examples:
   
   ```python
   # OLD (deprecated)
   from hugegraph_llm.operators.graph_rag_task import RAGPipeline
   graph_rag = RAGPipeline()
   result = (graph_rag
       .extract_keywords(text="Tell me about Al Pacino.")
       .keywords_to_vid()
       .query_graphdb(max_deep=2, max_graph_items=30)
       .merge_dedup_rerank()
       .synthesize_answer()
       .run())
   
   # NEW (recommended)
   from hugegraph_llm.flows.scheduler import SchedulerSingleton
   scheduler = SchedulerSingleton.get_instance()
   result = scheduler.schedule_flow(
       "rag_graph_only",
       query="Tell me about Al Pacino.",
       graph_only_answer=True,
       max_graph_items=30
   )
   ```
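   On the error-handling point listed above: the chained `RAGPipeline` style could raise from any intermediate step, while `schedule_flow` gives callers a single boundary to guard. A minimal self-contained sketch of that difference (the `schedule_flow` below is a stub with a made-up `ValueError`, not the project's real scheduler or its actual exception types):

```python
# Hypothetical sketch: one error boundary around a flow invocation.
# `schedule_flow` is stubbed so the example runs standalone.

def schedule_flow(flow_name, **kwargs):
    # Stub standing in for SchedulerSingleton.get_instance().schedule_flow
    if flow_name not in {"rag_graph_only", "rag_vector_only"}:
        raise ValueError(f"unknown flow: {flow_name}")
    return {"graph_only_answer": f"answer for {kwargs.get('query')}"}

def run_rag(query):
    try:
        res = schedule_flow("rag_graph_only", query=query, graph_only_answer=True)
        return res.get("graph_only_answer")
    except ValueError as exc:
        # With the old chained pipeline, a failure could surface from any
        # intermediate step; here there is a single call site to guard.
        return f"flow failed: {exc}"

print(run_rag("Tell me about Al Pacino."))
```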



##########
hugegraph-llm/README.md:
##########
@@ -248,6 +170,79 @@ The system supports both English and Chinese prompts. To switch languages:
 
 **LLM Provider Support**: This project uses [LiteLLM](https://docs.litellm.ai/docs/providers) for multi-provider LLM support.
 
+### Programmatic Examples (new workflow engine)
+
+If you previously used high-level classes like `RAGPipeline` or `KgBuilder`, the project now exposes stable flows through the `Scheduler` API. Use `SchedulerSingleton.get_instance().schedule_flow(...)` to invoke workflows programmatically. Below are concise, working examples that match the new architecture.
+
+1) RAG (graph-only) query example
+
+```python
+from hugegraph_llm.flows.scheduler import SchedulerSingleton
+
+scheduler = SchedulerSingleton.get_instance()
+res = scheduler.schedule_flow(

Review Comment:
   ⚠️ **Important: Hardcoded flow names reduce maintainability**
   
   Flow names like `"rag_graph_only"`, `"rag_vector_only"`, and `"text2gremlin"` are hardcoded strings throughout the examples. This is error-prone and makes refactoring difficult.
   
   **Recommendation**: Define flow names as constants or enums:
   
   ```python
   # In flows/__init__.py or flows/constants.py
   class FlowName:
       RAG_GRAPH_ONLY = "rag_graph_only"
       RAG_VECTOR_ONLY = "rag_vector_only"
       TEXT2GREMLIN = "text2gremlin"
       BUILD_EXAMPLES_INDEX = "build_examples_index"
   
   # Usage in examples:
   from hugegraph_llm.flows import FlowName
   res = scheduler.schedule_flow(
       FlowName.RAG_GRAPH_ONLY,
       query="Tell me about Al Pacino.",
       ...
   )
   ```
   
   This provides IDE autocomplete, type safety, and easier maintenance.
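   Building on the recommendation above, the stdlib `enum` module can give the same names real enum semantics while staying drop-in compatible with string-based callers. A sketch (the module location is hypothetical; the only assumption is that `schedule_flow` accepts a plain string, which a `str`-subclass enum satisfies):

```python
from enum import Enum

# Hypothetical constants module, e.g. hugegraph_llm/flows/constants.py
class FlowName(str, Enum):
    RAG_GRAPH_ONLY = "rag_graph_only"
    RAG_VECTOR_ONLY = "rag_vector_only"
    TEXT2GREMLIN = "text2gremlin"
    BUILD_EXAMPLES_INDEX = "build_examples_index"

# Because FlowName subclasses str, members compare equal to the raw
# strings, so scheduler code that matches on "rag_graph_only" keeps
# working unchanged during migration.
assert FlowName.RAG_GRAPH_ONLY == "rag_graph_only"
print(FlowName.TEXT2GREMLIN.value)  # "text2gremlin"
```

   On Python 3.11+, `enum.StrEnum` achieves the same with slightly less boilerplate.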



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

