rezarokni commented on code in PR #22949:
URL: https://github.com/apache/beam/pull/22949#discussion_r959439375


##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,53 @@ For detailed instructions explaining how to build and run 
a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache 
Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 Please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java)
 for the Java wrapper transform to use and please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java)
 for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and 
later through Apache Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 For information about the Java wrapper transform, see 
[RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java).
 For example pipelines, see 
[RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support
+
+To use TensorFlow with the RunInference API, you need to create a model 
handler from within `tfx_bsl`, import the required modules, and add the 
necessary code to your pipeline.
+
+First, create a model handler from within `tfx_bsl`. The model handler can be 
keyed or unkeyed.
+For more information, see 
[run_inference.py](https://github.com/tensorflow/tfx-bsl/blob/d1fca25e5eeaac9ef0111ec13e7634df836f36f6/tfx_bsl/public/beam/run_inference.py)
 in the TensorFlow GitHub repository.
+
+```
+# CreateModelHandler takes an inference_spec_type (the model inference
+# endpoint) and returns a Beam RunInference ModelHandler for TensorFlow.
+tf_handler = CreateModelHandler(inference_spec_type)
+
+# unkeyed
+RunInference(tf_handler)
+
+# keyed (KeyedModelHandler lives in apache_beam.ml.inference.base)
+RunInference(KeyedModelHandler(tf_handler))
+```
+
+Next, in your pipeline, import the required modules:
+
+```
+from tensorflow_serving.apis import prediction_log_pb2
+from apache_beam.ml.inference.base import RunInference, KeyedModelHandler
+from tfx_bsl.public.beam.run_inference import CreateModelHandler
+```
+
+Finally, add the code to your pipeline. This example shows a pipeline that 
uses a model that multiplies its input by five.

Review Comment:
   See previous comment.



##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,29 @@ For detailed instructions explaining how to build and run 
a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache 
Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 Please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java)
 for the Java wrapper transform to use and please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java)
 for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and 
later through Apache Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 For information about the Java wrapper transform, see 
[RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java).
 For example pipelines, see 
[RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support
+
+To use TensorFlow with the RunInference API, create a model handler from 
within `tfx_bsl`. The model handler can be keyed or unkeyed.
+For more information, see 
[run_inference.py](https://github.com/tensorflow/tfx-bsl/blob/d1fca25e5eeaac9ef0111ec13e7634df836f36f6/tfx_bsl/public/beam/run_inference.py)
 in the TensorFlow GitHub repository.
+
+```
+tf_handler = CreateModelHandler(inference_spec_type)

Review Comment:
   We will need to link the official notebook linked from github. But we can do 
this as a separate PR as that will take a while to get done.  



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
