yeandy commented on code in PR #17470:
URL: https://github.com/apache/beam/pull/17470#discussion_r868007567


##########
sdks/python/apache_beam/ml/inference/pytorch.py:
##########
@@ -45,11 +46,22 @@ def run_inference(self, batch: List[torch.Tensor],
     This method stacks the list of Tensors in a vectorized format to optimize
     the inference call.
     """
-
-    batch = torch.stack(batch)
-    if batch.device != self._device:
-      batch = batch.to(self._device)
-    predictions = model(batch)
+    if isinstance(batch[0], dict):

Review Comment:
   Sounds good to add `typing.Dict[str, torch.Tensor]`.
   
   Handling only the general case (kwargs) would force users with simple 
   (single-argument) inputs to add another layer of transformations. I feel that 
   may deviate from what many `pytorch` users expect as a straightforward input 
   format, and could be a turn-off in terms of ease of use. 
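   
   For reference, here is a minimal sketch of how both input shapes could be 
   supported in one inference path. This is illustrative only, not the PR's 
   implementation; `_run_inference`, `model`, and `device` are stand-ins for the 
   handler's loaded model and configured device.
   
   ```python
   from typing import Dict, List, Union
   
   import torch
   
   
   def _run_inference(
       model: torch.nn.Module,
       batch: Union[List[torch.Tensor], List[Dict[str, torch.Tensor]]],
       device: torch.device) -> torch.Tensor:
     if isinstance(batch[0], dict):
       # Keyword-style inputs: stack each named tensor across the batch and
       # pass the batched tensors to the model as kwargs.
       keyed = {
           key: torch.stack([example[key] for example in batch]).to(device)
           for key in batch[0]
       }
       return model(**keyed)
     # Simple single-argument inputs: stack into one batched tensor.
     stacked = torch.stack(batch)
     if stacked.device != device:
       stacked = stacked.to(device)
     return model(stacked)
   ```
   
   With this shape, a user with a single-input model keeps passing 
   `List[torch.Tensor]` unchanged, while a multi-input model can pass 
   `List[Dict[str, torch.Tensor]]`.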


