rszper commented on code in PR #26472:
URL: https://github.com/apache/beam/pull/26472#discussion_r1180635425
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
Review Comment:
```suggestion
"2. Provide a path to the saved weights of the model.\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
+ "3. Using a URL for pretrained model on TensorFlow Hub (See this
[notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
Review Comment:
```suggestion
"3. Provide a URL for pretrained model on TensorFlow Hub. For an
example workflow, see [Apache Beam RunInference with TensorFlow and TensorFlow
Hub](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb).\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
+ "3. Using a URL for pretrained model on TensorFlow Hub (See this
[notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
+ "\n",
"This notebook demonstrates the following steps:\n",
"- Build a simple TensorFlow model.\n",
"- Set up example data.\n",
- "- Run those examples with the built-in model handlers and get a
prediction inside an Apache Beam pipeline.\n",
+ "- Run those examples with the built-in model handlers using:\n",
+ " * Saved Model\n",
+ " * Saved Weights\n",
Review Comment:
```suggestion
" * saved weights\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
+ "3. Using a URL for pretrained model on TensorFlow Hub (See this
[notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
+ "\n",
"This notebook demonstrates the following steps:\n",
"- Build a simple TensorFlow model.\n",
"- Set up example data.\n",
- "- Run those examples with the built-in model handlers and get a
prediction inside an Apache Beam pipeline.\n",
+ "- Run those examples with the built-in model handlers using:\n",
+ " * Saved Model\n",
Review Comment:
```suggestion
" * a saved model\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
Review Comment:
```suggestion
"1. Provide a path to the saved model.\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
+ "3. Using a URL for pretrained model on TensorFlow Hub (See this
[notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
+ "\n",
"This notebook demonstrates the following steps:\n",
"- Build a simple TensorFlow model.\n",
"- Set up example data.\n",
- "- Run those examples with the built-in model handlers and get a
prediction inside an Apache Beam pipeline.\n",
+ "- Run those examples with the built-in model handlers using:\n",
+ " * Saved Model\n",
+ " * Saved Weights\n",
+ "\n",
+ " and get a prediction inside an Apache Beam pipeline.\n",
Review Comment:
If you make the change above, remove this line.
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -313,14 +326,34 @@
"metadata": {
"id": "2JbE7WkGcAkK"
},
- "execution_count": 8,
+ "execution_count": 18,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "Instead of saving the entire model, you can just save the model
weights for inference. This is slightly lightweight than saving and loading the
entire model. However, you need to pass the function to build TensorFlow model
to the `TFModelHandlerNumpy` / `TFModelHandlerTensor` class along with
`ModelType.SAVED_WEIGHTS`."
+ ],
+ "metadata": {
+ "id": "g_qVtXPeUcMS"
+ }
+ },
+ {
+ "cell_type": "code",
+ "source": [
+ "model.save_weights(save_weights_dir_multiply)"
+ ],
+ "metadata": {
+ "id": "Kl1C_NwaUbiv"
+ },
+ "execution_count": 19,
"outputs": []
},
{
"cell_type": "markdown",
"source": [
"## Run the pipeline\n",
- "Use the following code to run the pipeline."
+ "#### Use the following code to run the pipeline by specifying path to
the trained TensorFlow model."
Review Comment:
```suggestion
"Use the following code to run the pipeline by specifying the path
to the trained TensorFlow model."
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -209,36 +219,39 @@
"base_uri": "https://localhost:8080/"
},
"id": "SH7iq3zeBBJ-",
- "outputId": "e15cab6b-1271-4b0b-bac3-ba76f8991077"
+ "outputId": "5a3d3ce4-f9d8-4d87-a1bc-05afc3c9b06e"
},
"source": [
"# Create training data that represents the 5 times multiplication
table for the numbers 0 to 99.\n",
"# x is the data and y is the labels.\n",
"x = numpy.arange(0, 100) # Examples\n",
"y = x * 5 # Labels\n",
"\n",
- "# Build a simple linear regression model.\n",
+ "# create_model builds a simple linear regression model.\n",
Review Comment:
```suggestion
"# Use create_model to build a simple linear regression model.\n",
```
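For context on the `create_model` name this comment refers to: a minimal sketch of such a builder plus the 5x-table training step the hunk describes. The layer size, optimizer, and epoch count are assumptions for illustration, not necessarily the notebook's exact values.

```python
import numpy
import tensorflow as tf

def create_model():
  # One Dense unit is enough to fit y = 5 * x as a linear regression.
  model = tf.keras.Sequential(
      [tf.keras.layers.Dense(units=1, input_shape=(1,))])
  model.compile(
      optimizer=tf.keras.optimizers.Adam(), loss='mean_absolute_error')
  return model

# The 5 times multiplication table for 0 to 99; reshape to (n, 1) so each
# example is a single-feature row.
x = numpy.arange(0, 100, dtype=numpy.float32).reshape(-1, 1)
y = x * 5

model = create_model()
model.fit(x, y, epochs=100, verbose=0)  # epoch count is an assumption
```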
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -395,10 +428,55 @@
"output_type": "stream",
"name": "stdout",
"text": [
- "example is 20.0 prediction is [51.815357]\n",
- "example is 40.0 prediction is [101.63492]\n",
- "example is 60.0 prediction is [151.45448]\n",
- "example is 90.0 prediction is [226.18384]\n"
+ "example is 20.0 prediction is [21.896107]\n",
+ "example is 40.0 prediction is [41.795692]\n",
+ "example is 60.0 prediction is [61.69528]\n",
+ "example is 90.0 prediction is [91.544655]\n"
+ ]
+ }
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "#### Use the following code to run the pipeline with the saved
weights of a TensorFlow model.\n",
Review Comment:
```suggestion
"Use the following code to run the pipeline with the saved weights
of a TensorFlow model.\n",
```
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -313,14 +326,34 @@
"metadata": {
"id": "2JbE7WkGcAkK"
},
- "execution_count": 8,
+ "execution_count": 18,
+ "outputs": []
+ },
+ {
+ "cell_type": "markdown",
+ "source": [
+ "Instead of saving the entire model, you can just save the model
weights for inference. This is slightly lightweight than saving and loading the
entire model. However, you need to pass the function to build TensorFlow model
to the `TFModelHandlerNumpy` / `TFModelHandlerTensor` class along with
`ModelType.SAVED_WEIGHTS`."
Review Comment:
```suggestion
"Instead of saving the entire model, you can save the model weights
for inference. This method might be simpler than saving and loading the entire
model. However, you need to pass the function to build the TensorFlow model to
the `TFModelHandlerNumpy` or `TFModelHandlerTensor` class, as well as
`ModelType.SAVED_WEIGHTS`."
```
A couple of things are unclear to me in this sentence:
1. Should this be "or" or "and": the `TFModelHandlerNumpy` or
`TFModelHandlerTensor` class?
2. Is `ModelType.SAVED_WEIGHTS` also passed to `TFModelHandlerNumpy` or
`TFModelHandlerTensor`? We should describe what happens with
`ModelType.SAVED_WEIGHTS` in a new sentence and be clear about what we're
doing with it.
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -73,10 +73,19 @@
"\n",
"If your model uses `tf.Example` as an input, see the [Apache Beam
RunInference with
`tfx-bsl`](https://colab.research.google.com/github/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_tensorflow_with_tfx.ipynb)
notebook.\n",
"\n",
+ "There are three ways to load a TensorFlow model:\n",
+ "1. Using a path to the saved model.\n",
+ "2. Using a path to the saved weights of model.\n",
+ "3. Using a URL for pretrained model on TensorFlow Hub (See this
[notebook](https://github.com/apache/beam/blob/master/examples/notebooks/beam-ml/run_inference_with_tensorflow_hub.ipynb))\n",
+ "\n",
"This notebook demonstrates the following steps:\n",
"- Build a simple TensorFlow model.\n",
"- Set up example data.\n",
- "- Run those examples with the built-in model handlers and get a
prediction inside an Apache Beam pipeline.\n",
+ "- Run those examples with the built-in model handlers using:\n",
Review Comment:
```suggestion
"- Run those examples with the built-in model handlers using one of
the following methods, and then get a prediction inside an Apache Beam
pipeline.\n",
```
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]