Hi Fabian,

You have indeed found something we have not yet documented, mainly because
we have not yet tried it out ourselves.
The main class that gets called when running Beam pipelines is
"org.apache.hop.beam.run.MainBeam".

I was hoping the "Import as pipeline" button on a job would give you
everything you need to execute this, but it does not.
I'll take a closer look in the coming days to see what is needed to use
this functionality; it could be that we need to export the template based
on a pipeline.
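
In the meantime, here is a rough and untested sketch of what a flex
template Dockerfile could look like, assuming the standard Java launcher
base image from Google and a Hop fat jar named "hop-fatjar.jar" (the jar
name and paths are just placeholders):

    FROM gcr.io/dataflow-templates-base/java11-template-launcher-base:latest

    ARG WORKDIR=/dataflow/template
    RUN mkdir -p ${WORKDIR}
    WORKDIR ${WORKDIR}

    # Fat jar exported from Hop; the file name is a placeholder
    COPY hop-fatjar.jar ${WORKDIR}/

    # Point the flex template launcher at Hop's Beam entry point
    ENV FLEX_TEMPLATE_JAVA_MAIN_CLASS="org.apache.hop.beam.run.MainBeam"
    ENV FLEX_TEMPLATE_JAVA_CLASSPATH="${WORKDIR}/hop-fatjar.jar"

    ENTRYPOINT ["/opt/google/dataflow/java_template_launcher"]

The image and template spec would then be built with "gcloud dataflow
flex-template build". Which arguments MainBeam needs to receive (pipeline,
metadata, run configuration) is exactly the part I still have to verify.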

Kr,
Hans

On Wed, 10 Aug 2022 at 15:46, Fabian Peters <[email protected]> wrote:

> Hi all!
>
> Thanks to Hans' work on the REST transform, I can now deploy my jobs to
> Dataflow.
>
> Next, I'd like to schedule a batch job
> <https://cloud.google.com/community/tutorials/schedule-dataflow-jobs-with-cloud-scheduler>,
> but for this I need to create a template
> <https://cloud.google.com/dataflow/docs/concepts/dataflow-templates>.
> I've searched the Hop documentation but haven't found anything on this.
> I'm guessing that flex templates
> <https://cloud.google.com/dataflow/docs/guides/templates/using-flex-templates#create_a_flex_template>
> are the way to go, due to the fat jar, but I'm wondering what to pass as
> the FLEX_TEMPLATE_JAVA_MAIN_CLASS.
>
> cheers
>
> Fabian
>
