rszper commented on code in PR #22250:
URL: https://github.com/apache/beam/pull/22250#discussion_r922553248


##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -171,7 +171,7 @@ In some cases, the `PredictionResults` output might not include the correct pred
 
 The RunInference API currently expects outputs to be an `Iterable[Any]`. Example return types are `Iterable[Tensor]` or `Iterable[Dict[str, Tensor]]`. When RunInference zips the inputs with the predictions, the predictions iterate over the dictionary keys instead of the batch elements. The result is that the key name is preserved but the prediction tensors are discarded. For more information, see the [Pytorch RunInference PredictionResult is a Dict](https://github.com/apache/beam/issues/22240) issue in the Apache Beam GitHub project.
 
-To work with the current RunInference implementation, you can create a wrapper class that overrides the `model(input)` call. In PyTorch, for example, your wrapper would override the `forward()` function and return an output with the appropriate format of `List[Dict[str, torch.Tensor]]`. For more information, see our [HuggingFace language modeling example](https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/inference/pytorch_language_modeling.py#L49).
+To work with the current RunInference implementation, you can create a wrapper class that overrides the `model(input)` call. In PyTorch, for example, your wrapper would override the `forward()` function and return an output with the appropriate format of `List[Dict[str, torch.Tensor]]`. For more information, see our [HuggingFace language modeling example](https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/inference/pytorch_language_modeling.py#L49) and our [Bert language modeling example](https://github.com/apache/beam/blob/master/sdks/python/apache_beam/examples/inference/pytorch_language_modeling.py).

Review Comment:
   Oops. This should be fixed now.
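
For readers following the paragraph quoted in the diff above, here is a minimal, hypothetical sketch of the wrapper-class workaround it describes. The class name `UnbatchingModelWrapper` and the assumption that the wrapped model returns a plain dict of batched tensors are illustrative, not taken from the Beam examples: `forward()` re-slices the batched dict into a `List[Dict[str, torch.Tensor]]` so RunInference can pair one dict with each input element.

```python
import torch


class UnbatchingModelWrapper(torch.nn.Module):
    """Hypothetical wrapper: turns a dict of batched tensors into a list with
    one Dict[str, torch.Tensor] per batch element, the output format that
    RunInference expects when zipping predictions with inputs."""

    def __init__(self, model: torch.nn.Module):
        super().__init__()
        self._model = model

    def forward(self, **kwargs):
        # Assumed model output shape: {"logits": Tensor[batch, ...], ...}
        output = self._model(**kwargs)
        # Re-slice the batched dict into one dict per batch element.
        return [
            dict(zip(output.keys(), values))
            for values in zip(*output.values())
        ]
```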


