rszper commented on code in PR #26472:
URL: https://github.com/apache/beam/pull/26472#discussion_r1180769310
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -332,7 +330,16 @@
{
"cell_type": "markdown",
"source": [
- "Instead of saving the entire model, you can just save the model weights for inference. This is slightly lightweight than saving and loading the entire model. However, you need to pass the function to build TensorFlow model to the `TFModelHandlerNumpy` / `TFModelHandlerTensor` class along with `ModelType.SAVED_WEIGHTS`."
+ "Instead of saving the entire model, you can just [save the model weights for inference](https://www.tensorflow.org/guide/keras/save_and_serialize#saving_loading_only_the_models_weights_values). This is useful in cases when you're using the model just for inference and won't need any compilation information or optimizer state. This also allows loading the weights with new model in case of transfer learning applications.\n",
Review Comment:
```suggestion
    "Instead of saving the entire model, you can [save the model weights for inference](https://www.tensorflow.org/guide/keras/save_and_serialize#saving_loading_only_the_models_weights_values). You can use this method when you need the model for inference but don't need any compilation information or optimizer state. In addition, when using transfer learning applications, you can use this method to load the weights with new models.\n",
```
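The weights-only workflow the suggestion describes can be backed by a minimal TensorFlow sketch. The model architecture and file path below are hypothetical, purely for illustration: only the weights are persisted (no optimizer state), so the same architecture must be rebuilt before loading, as in inference or transfer-learning use.

```python
import os
import tempfile

import numpy as np
import tensorflow as tf

# Hypothetical model-building function; the architecture must match
# between saving and loading when only the weights are persisted.
def build_model() -> tf.keras.Model:
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(5,)),
        tf.keras.layers.Dense(1),
    ])

model = build_model()
weights_path = os.path.join(tempfile.mkdtemp(), "model.weights.h5")
model.save_weights(weights_path)  # weights only: no compile info or optimizer state

# Restore the weights into a freshly built model of the same architecture.
fresh = build_model()
fresh.load_weights(weights_path)
```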
##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -332,7 +330,16 @@
{
"cell_type": "markdown",
"source": [
- "Instead of saving the entire model, you can just save the model weights for inference. This is slightly lightweight than saving and loading the entire model. However, you need to pass the function to build TensorFlow model to the `TFModelHandlerNumpy` / `TFModelHandlerTensor` class along with `ModelType.SAVED_WEIGHTS`."
+ "Instead of saving the entire model, you can just [save the model weights for inference](https://www.tensorflow.org/guide/keras/save_and_serialize#saving_loading_only_the_models_weights_values). This is useful in cases when you're using the model just for inference and won't need any compilation information or optimizer state. This also allows loading the weights with new model in case of transfer learning applications.\n",
+ "\n",
+ "With this approach, you need to pass the function to build TensorFlow model to the `TFModelHandler` class you intend to use (`TFModelHandlerNumpy` / `TFModelHandlerTensor`) along with `model_type=ModelType.SAVED_WEIGHTS`.\n",
Review Comment:
```suggestion
    "With this approach, you need to pass the function to build the TensorFlow model to the `TFModelHandler` class that you're using, either `TFModelHandlerNumpy` or `TFModelHandlerTensor`. You also need to pass `model_type=ModelType.SAVED_WEIGHTS` to the class.\n",
```
Review Comment:
I took a stab at updating this, but please verify that what I wrote is
technically accurate.
--
This is an automated message from the Apache Git Service.