ryanthompson591 commented on code in PR #22949:
URL: https://github.com/apache/beam/pull/22949#discussion_r958504571


##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,29 @@ For detailed instructions explaining how to build and run 
a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache 
Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 Please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java)
 for the Java wrapper transform to use and please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java)
 for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and 
later through Apache Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 For information about the Java wrapper transform, see 
[RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java).
 For example pipelines, see 
[RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support
+
+To use TensorFlow with the RunInference API, create a model handler from 
within `tfx_bsl`. The model handler can be keyed or unkeyed.
+For more information, see 
[run_inference.py](https://github.com/tensorflow/tfx-bsl/blob/d1fca25e5eeaac9ef0111ec13e7634df836f36f6/tfx_bsl/public/beam/run_inference.py)
 in the TensorFlow GitHub repository.
+
+```
+tf_handler = CreateModelHandler(inference_spec_type)
+
+# unkeyed
+beam.run_inference(tf_handler)
+
+# keyed
+beam.run_inference(beam.ml.inference.KeyedHandler(tf_handler))
+
+Args:
+  inference_spec_type: Model inference endpoint
+Returns:
+  A Beam RunInference ModelHandler for TensorFlow
+"""
+return run_inference.create_model_handler(inference_spec_type, None, None)

Review Comment:
   This return statement doesn't seem illustrative. It looks like the implementation code was copied over by accident: the `Args:`/`Returns:` docstring and the `return` line belong to the `CreateModelHandler` implementation, not to a usage example.
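   If it helps, here's a rough sketch of the shape an illustrative snippet could take. Note this is a library-free stand-in, not the real tfx-bsl or Beam API: `FakeModelHandler` and `KeyedHandlerStandIn` are hypothetical classes that only mimic what `CreateModelHandler` and a keyed wrapper do, so the actual doc example would still need to be checked against the real interfaces.

```python
# Library-free stand-in sketch; FakeModelHandler and KeyedHandlerStandIn
# are hypothetical, mimicking the shape of the real APIs only.

class FakeModelHandler:
    """Stand-in for the handler that CreateModelHandler would return."""

    def run_inference(self, batch):
        # Pretend "inference": return the length of each input example.
        return [len(example) for example in batch]


class KeyedHandlerStandIn:
    """Mimics a keyed wrapper: strips keys, runs the wrapped handler on
    the values, then re-attaches each key to its prediction."""

    def __init__(self, unkeyed_handler):
        self._unkeyed = unkeyed_handler

    def run_inference(self, keyed_batch):
        keys, values = zip(*keyed_batch)
        predictions = self._unkeyed.run_inference(list(values))
        return list(zip(keys, predictions))


tf_handler = FakeModelHandler()

# Unkeyed: the handler sees raw examples.
print(tf_handler.run_inference([[1, 2], [3]]))  # [2, 1]

# Keyed: the wrapper pairs each prediction with its key.
keyed_handler = KeyedHandlerStandIn(tf_handler)
print(keyed_handler.run_inference([("a", [1, 2]), ("b", [3])]))
# [('a', 2), ('b', 1)]
```

   The point for the doc is just the contrast between the two call sites: unkeyed input goes straight to the handler, keyed input goes through the wrapper.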



##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,29 @@ For detailed instructions explaining how to build and run 
a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache 
Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 Please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java)
 for the Java wrapper transform to use and please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java)
 for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and 
later through Apache Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 For information about the Java wrapper transform, see 
[RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java).
 For example pipelines, see 
[RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support

Review Comment:
   I'm not sure what exactly you want as a canonical example. Do you mean 
sample code that would go into this doc, or something else?
   
   



##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -165,7 +165,29 @@ For detailed instructions explaining how to build and run 
a pipeline that uses M
 
 ## Beam Java SDK support
 
-RunInference API is available to Beam Java SDK 2.41.0 and later through Apache 
Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 Please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java)
 for the Java wrapper transform to use and please see 
[here](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java)
 for some example pipelines.
+The RunInference API is available with the Beam Java SDK versions 2.41.0 and 
later through Apache Beam's [Multi-language Pipelines 
framework](https://beam.apache.org/documentation/programming-guide/#multi-language-pipelines).
 For information about the Java wrapper transform, see 
[RunInference.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/main/java/org/apache/beam/sdk/extensions/python/transforms/RunInference.java).
 For example pipelines, see 
[RunInferenceTransformTest.java](https://github.com/apache/beam/blob/master/sdks/java/extensions/python/src/test/java/org/apache/beam/sdk/extensions/python/transforms/RunInferenceTransformTest.java).
+
+## TensorFlow support
+
+To use TensorFlow with the RunInference API, create a model handler from 
within `tfx_bsl`. The model handler can be keyed or unkeyed.
+For more information, see 
[run_inference.py](https://github.com/tensorflow/tfx-bsl/blob/d1fca25e5eeaac9ef0111ec13e7634df836f36f6/tfx_bsl/public/beam/run_inference.py)
 in the TensorFlow GitHub repository.
+
+```
+tf_handler = CreateModelHandler(inference_spec_type)

Review Comment:
   In this example, it may make sense to add some information about importing the correct libraries from tfx-bsl.
   
   The most accurate way to get this done would probably be to build a simple notebook and then import the code from it here. I can try to make some time this week to do that. It should be possible to take the notebook you created that demonstrates the tfx-bsl interface and swap in the Beam interface.
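   For reference, the import lines the snippet would likely need look something like the following. These module paths are an assumption (as of tfx-bsl 1.x and Beam 2.41) and should be verified in the notebook before going into the doc:

```
from tfx_bsl.public.beam.run_inference import CreateModelHandler
from tfx_bsl.public.proto import model_spec_pb2
from apache_beam.ml.inference.base import RunInference, KeyedModelHandler
```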



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
