Ron,
If you are talking about the TensorFlow SavedModel format, I personally think that
it is overkill for model serving. My preferred option is to use the traditional TF
export, which can be optimized for serving.
As for processing, I am using the TF Java APIs, which basically are a population of
the
Yes, you're right. I believe this is the use case that I'm after. So if I
understand correctly, transforms that do aggregations just assume that the
batch of data being aggregated is passed as part of a tensor column. Is it
possible to hook up a lookup call to another Tensorflow Serving
I do have a Beam-based model serving implementation, which can take PMML or
TensorFlow models.
It listens on Kafka for both models and the data stream, and can serve any
number of models.
The models can be produced by any external application that exports a complete
model pipeline.
The complete write
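To make the pattern above concrete, here is a minimal Python sketch of the "model stream plus data stream" idea: models arrive on one stream, data on another, and any number of models can be served out of an in-memory registry. This is only an illustration of the pattern, not the author's actual Beam/Kafka implementation; the names (ModelRegistry, update, score) are made up for this example.

```python
class ModelRegistry:
    """Holds the latest model per model id, updated from the model stream."""

    def __init__(self):
        self._models = {}

    def update(self, model_id, model_fn):
        # In the real system this event would arrive on a Kafka "models" topic,
        # carrying an exported model pipeline (e.g. PMML or a TF export).
        self._models[model_id] = model_fn

    def score(self, model_id, features):
        # Data-stream records are scored with whatever model is current for
        # that id; records for models not yet received return None here.
        model = self._models.get(model_id)
        return model(features) if model else None


registry = ModelRegistry()
# Stand-in for a deserialized model; a real one would wrap a scoring engine.
registry.update("fraud", lambda xs: sum(xs) / len(xs))

print(registry.score("fraud", [1.0, 2.0, 3.0]))  # 2.0
print(registry.score("churn", [1.0]))            # None (no model received yet)
```

In a Beam pipeline the same idea is usually expressed by joining the model stream and the data stream (for example via state or side inputs), so a newly arrived model immediately replaces the old one without restarting the job.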
Hi, I was wondering if anyone has encountered or used Beam in the following
manner:
1. During machine learning training, use Beam to create the event table. The
flow may consist of some joins, aggregations, row-based transformations, etc.
2. Once the model is created, deploy the model to