Hi Sergio,
By the way, you can also use TensorFrames, which lets you use TensorFlow
directly with Spark DataFrames, with more direct access. I discussed that
with Tim Hunter from Databricks, who is working on TensorFrames.
Back on Beam, here's what you could do:
1. you expose the service as a microservice
Hi JB,
On Tue, Nov 22, 2016 at 11:14 AM, Jean-Baptiste Onofré
wrote:
>
> A DoFn executes per element (optionally with hooks on StartBundle,
> FinishBundle, and Teardown). It's basically the way it works in the IO
> WriteFn: we create the connection in StartBundle and send the elements
> in batches
>
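That bundle lifecycle can be sketched in plain Java. This is illustrative only: `BatchingWriter`, its `Connection` class, and the batch size are made-up stand-ins that mimic the Beam hooks, not the actual DoFn API (in real Beam code these would be methods annotated with @StartBundle, @ProcessElement, @FinishBundle, and @Teardown).

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the per-bundle pattern: open a connection when the bundle
// starts, buffer elements, flush them in batches, and flush the
// remainder when the bundle finishes.
public class BatchingWriter {
    // Stand-in for a real client connection (hypothetical).
    static class Connection {
        final List<String> sent = new ArrayList<>();
        void send(List<String> batch) { sent.addAll(batch); }
        void close() { /* release resources */ }
    }

    static final int BATCH_SIZE = 2; // illustrative batch size
    Connection connection;
    List<String> batch;

    // Corresponds to @StartBundle: acquire the connection, reset the buffer.
    void startBundle() {
        connection = new Connection();
        batch = new ArrayList<>();
    }

    // Corresponds to @ProcessElement: buffer, flush when the batch is full.
    void processElement(String element) {
        batch.add(element);
        if (batch.size() >= BATCH_SIZE) {
            connection.send(batch);
            batch = new ArrayList<>();
        }
    }

    // Corresponds to @FinishBundle: flush whatever is left in the buffer.
    void finishBundle() {
        if (!batch.isEmpty()) {
            connection.send(batch);
            batch = new ArrayList<>();
        }
    }

    // Corresponds to @Teardown: release the connection.
    void teardown() {
        connection.close();
    }

    public static void main(String[] args) {
        BatchingWriter w = new BatchingWriter();
        w.startBundle();
        w.processElement("a");
        w.processElement("b"); // batch full here, flushed to the connection
        w.processElement("c");
        w.finishBundle();      // remainder flushed
        w.teardown();
        System.out.println(w.connection.sent); // [a, b, c]
    }
}
```

The point of the pattern is that connection setup and teardown happen once per bundle, not once per element, while elements are still sent in bounded batches.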