damccorm commented on code in PR #29223:
URL: https://github.com/apache/beam/pull/29223#discussion_r1378968224


##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -647,8 +647,15 @@ def __init__(
     _validate_constructor_args_hf_pipeline(self._task, self._model)
 
   def _deduplicate_device_value(self, device: str):
+    current_device = device.upper() if device else None
     if 'device' not in self._load_pipeline_args:
-      if device == 'CPU':
+      if (not current_device and current_device != 'CPU' and

Review Comment:
   This should be `if current_device and ...` (remove the `not`). As written, 
this will throw in the default case, where `device` is None.
   
   Also, we should move this check to the top level (outside of the `if 'device' not 
in self._load_pipeline_args:` block).
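   
   For concreteness, a rough sketch of that shape (the raise, the second half of 
the condition, and anything else not visible in the diff above are assumptions, 
not the PR's actual code):
   
   ```python
   def _deduplicate_device_value(self, device):
     current_device = device.upper() if device else None
   
     # Validate the user-supplied value once, at the top level, rather than
     # nesting it under the `'device' not in self._load_pipeline_args` branch.
     # Note `current_device and ...` instead of `not current_device and ...`,
     # so the default case (device=None) falls through instead of raising.
     if (current_device and current_device != 'CPU' and
         current_device != 'GPU'):  # assumed remainder of the condition
       raise ValueError(
           "Invalid device value %s. Only 'CPU' and 'GPU' are supported." %
           device)
   
     if 'device' not in self._load_pipeline_args:
       ...  # existing CPU/GPU resolution logic from the PR
   ```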



##########
sdks/python/apache_beam/ml/inference/huggingface_inference.py:
##########
@@ -659,7 +666,7 @@ def _deduplicate_device_value(self, device: str):
               "but GPUs are not available. Switching to CPU.")
           self._load_pipeline_args['device'] = 'cpu'
     else:
-      if device:
+      if current_device:
         _LOGGER.warning(

Review Comment:
   After these updates, I'm actually inclined to just throw here. There's no 
way this is an intentional, valid configuration, since we're defaulting `device` 
to None.
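   
   Concretely, the `else` branch would go from a warning to something like the 
sketch below (the exception type and message are illustrative, not from the PR; 
the `...` stands in for the existing CPU/GPU resolution logic):
   
   ```python
   def _deduplicate_device_value(self, device):
     current_device = device.upper() if device else None
     if 'device' not in self._load_pipeline_args:
       ...  # existing CPU/GPU resolution logic
     else:
       if current_device:
         # Fail fast instead of _LOGGER.warning(...): `device` defaults to
         # None, so reaching this point means it was specified both as an
         # argument and inside load_pipeline_args.
         raise ValueError(
             "'device' was specified both as an argument and within "
             "load_pipeline_args. Please specify it in only one place.")
   ```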


