yeandy commented on code in PR #22069:
URL: https://github.com/apache/beam/pull/22069#discussion_r918286143


##########
sdks/python/apache_beam/examples/inference/README.md:
##########
@@ -30,57 +30,66 @@ because the `apache_beam.examples.inference` module was added in that release.
 pip install apache-beam==2.40.0
 ```
 
+**Note:** You cannot batch elements of different sizes, because [`torch.stack()` expects tensors of the same length](https://github.com/pytorch/nestedtensor). Either elements need to be a fixed size, or you need to disable batching. To disable batching, set the maximum batch size to one: `max_batch_size=1`.
+
 ### PyTorch dependencies
 
+The following installation requirements are for the files used in these examples.
+
 The RunInference API supports the PyTorch framework. To use PyTorch locally, first install `torch`.
 
 ```
 pip install torch==1.11.0

Review Comment:
   Please bump this version down to 1.10.0

##########
sdks/python/apache_beam/examples/inference/README.md:
##########
@@ -30,57 +30,66 @@ because the `apache_beam.examples.inference` module was added in that release.
 pip install apache-beam==2.40.0
 ```
 
+**Note:** You cannot batch elements of different sizes, because [`torch.stack()` expects tensors of the same length](https://github.com/pytorch/nestedtensor). Either elements need to be a fixed size, or you need to disable batching. To disable batching, set the maximum batch size to one: `max_batch_size=1`.
+
 ### PyTorch dependencies
 
+The following installation requirements are for the files used in these examples.
+
 The RunInference API supports the PyTorch framework. To use PyTorch locally, first install `torch`.
 
 ```
 pip install torch==1.11.0

Review Comment:
   @rszper Please bump this version down to 1.10.0


--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
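
A minimal sketch (not part of the PR or the quoted README) of one way to apply the `max_batch_size=1` advice from the note in the diff above: subclass the PyTorch model handler and override `batch_elements_kwargs()`, whose return value RunInference passes to `BatchElements`. The model path, model class, and parameters below are placeholders, not values from the Beam examples.

```python
import apache_beam as beam
from apache_beam.ml.inference.base import RunInference
from apache_beam.ml.inference.pytorch_inference import PytorchModelHandlerTensor


class NoBatchingModelHandler(PytorchModelHandlerTensor):
  """Model handler that disables batching so torch.stack() never has to
  combine tensors of different lengths."""

  def batch_elements_kwargs(self):
    # Cap every batch at a single element.
    return {'max_batch_size': 1}


# Hypothetical usage inside a pipeline (MyModel, `examples`, and the gs://
# path are placeholders):
#
# with beam.Pipeline() as p:
#   _ = (
#       p
#       | beam.Create(examples)  # tensors of varying lengths
#       | RunInference(
#           NoBatchingModelHandler(
#               state_dict_path='gs://my-bucket/model_state_dict.pth',
#               model_class=MyModel,
#               model_params={})))
```

The alternative mentioned in the README note, padding or otherwise fixing element sizes ahead of RunInference, avoids the single-element batches and keeps whatever throughput benefit batching provides.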
