tvalentyn commented on code in PR #28474:
URL: https://github.com/apache/beam/pull/28474#discussion_r1327788133


##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -593,9 +594,11 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to pretrained model on Hugging Face Models Hub to use custom
-        model for the chosen task. If the model already defines the task then
-        no need to specify the task parameter.
+      model : path to the pretrained model-id on Hugging Face Models Hub
+        to use custom model for the chosen task. If the model already defines
+        the task then no need to specify the task parameter. Use the model_id
+        string instead of an actual model here. Model specific kwargs can be
+        specified with model_kwargs using load_pipeline_args.

Review Comment:
   it would help to format the text in docstrings; for example, take a look at 
the Hugging Face docs at 
https://huggingface.co/docs/transformers/main_classes/model. Note how string 
constants are formatted in italics (I think it's done via star-escaping, e.g. 
`*model id*`) and code/arguments are escaped with backticks.
   
   model (str): path to the pre-trained *model id* on Hugging Face Models Hub
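   
   For instance, the `model` paragraph could read roughly like this (a sketch only; the exact wording is up to you):
   ```
   model (str): path to the pre-trained *model id* on the Hugging Face
     Models Hub. Model-specific kwargs can be specified as `model_kwargs`
     inside `load_pipeline_args`.
   ```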



##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -593,9 +594,11 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to pretrained model on Hugging Face Models Hub to use custom
-        model for the chosen task. If the model already defines the task then
-        no need to specify the task parameter.
+      model : path to the pretrained model-id on Hugging Face Models Hub
+        to use custom model for the chosen task. If the model already defines
+        the task then no need to specify the task parameter. Use the model_id
+        string instead of an actual model here. Model specific kwargs can be
+        specified with model_kwargs using load_pipeline_args.

Review Comment:
   for my education, is it done like 
`load_pipeline_args={'model_kwargs': {'key': 'value'}}`, or is `model_kwargs` 
passed directly into `load_pipeline_args`? If you think it makes sense, you 
could add a snippet, for example "... e.g. 
`load_pipeline_args={'model_kwargs': {'key': 'value'}}`".
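   
   If it is the nested form, a snippet roughly along these lines could go in the docstring (a sketch only, assuming `load_pipeline_args` is forwarded to `transformers.pipeline()`; the model id and kwargs below are placeholders):
   ```python
   from apache_beam.ml.inference.huggingface_inference import HuggingFacePipelineModelHandler

   # model_kwargs is nested inside the load_pipeline_args dict rather than
   # passed as a separate parameter.
   model_handler = HuggingFacePipelineModelHandler(
       task="text-generation",
       model="gpt2",  # model id on the Hugging Face Models Hub
       load_pipeline_args={"model_kwargs": {"torch_dtype": "auto"}},
   )
   ```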
   
   



##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -593,9 +594,11 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to pretrained model on Hugging Face Models Hub to use custom
-        model for the chosen task. If the model already defines the task then
-        no need to specify the task parameter.
+      model : path to the pretrained model-id on Hugging Face Models Hub
+        to use custom model for the chosen task. If the model already defines
+        the task then no need to specify the task parameter. Use the model_id
+        string instead of an actual model here. Model specific kwargs can be

Review Comment:
   ```suggestion
           string instead of an actual model here. Model-specific kwargs can be
   ```



##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -593,9 +594,11 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to pretrained model on Hugging Face Models Hub to use custom
-        model for the chosen task. If the model already defines the task then
-        no need to specify the task parameter.
+      model : path to the pretrained model-id on Hugging Face Models Hub

Review Comment:
   also, please add a type annotation `(str):`, I am guessing.



##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -593,9 +594,11 @@ def __init__(
     Args:
       task (str or enum.Enum): task supported by HuggingFace Pipelines.
         Accepts a string task or an enum.Enum from PipelineTask.
-      model : path to pretrained model on Hugging Face Models Hub to use custom
-        model for the chosen task. If the model already defines the task then
-        no need to specify the task parameter.
+      model : path to the pretrained model-id on Hugging Face Models Hub

Review Comment:
   is this a full path or just the model id itself?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
