rszper commented on code in PR #24125:
URL: https://github.com/apache/beam/pull/24125#discussion_r1022141133


##########
examples/notebooks/beam-ml/custom_remote_inference.ipynb:
##########
@@ -279,15 +291,19 @@
       "source": [
         "### Batching\n",
         "\n",
-        "Before we can chain all the different steps together in a pipeline, there is one more thing we need to understand: batching. When running inference with your model (both in Beam itself or in an external API), you can batch your input together to allow for more efficient execution of your model. When using a custom DoFn, you need to take care of the batching yourself, in contrast with the RunInference API which takes care of this for you.\n",
+        "Before we can chain together the pipeline steps, we need to understand batching.\n",
+        "When running inference with your model, either in Apache Beam or in an external API, you can batch your input to increase the efficiency of the model execution.\n",
+        "When using a custom DoFn, as in this example, you need to manage the batching.\n",
         "\n",
-        "In order to achieve this in our pipeline: we will introduce one more step in our pipeline, a `BatchElements` transform that will group elements together to form a batch of the desired size.\n",
+        "To manage the batching in this pipeline, include a `BatchElements` transform to group elements together and form a batch of the desired size.\n",
         "\n",
-        "⚠️ If you have a streaming pipeline, you may considering using [GroupIntoBatches](https://beam.apache.org/documentation/transforms/python/aggregation/groupintobatches/) as `BatchElements` doesn't batch things across bundles. `GroupIntoBatches` requires choosing a key within which things are batched.\n",
+        "**Caution:**\n",

Review Comment:
   Happily removing the caution.
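For context on what the `BatchElements` step in the diff above does: it groups incoming elements into lists of up to a chosen size before they reach the inference DoFn. Below is a minimal plain-Python sketch of that batching behavior (the `batch_elements` helper is illustrative, not Beam's actual implementation; a real pipeline would use `beam.BatchElements(min_batch_size=..., max_batch_size=...)`):

```python
# Illustrative stand-in for the batching a BatchElements transform
# performs within a bundle: collect elements into lists of up to
# `batch_size` items, emitting a final partial batch if one remains.
from typing import Iterable, Iterator, List, TypeVar

T = TypeVar("T")

def batch_elements(items: Iterable[T], batch_size: int) -> Iterator[List[T]]:
    batch: List[T] = []
    for item in items:
        batch.append(item)
        if len(batch) == batch_size:
            yield batch  # emit a full batch
            batch = []
    if batch:
        yield batch  # emit the trailing partial batch
```

For example, `list(batch_elements(range(5), 2))` yields `[[0, 1], [2, 3], [4]]`, which is why a downstream inference DoFn must handle lists of elements rather than single elements.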



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
