Hi Fernando,

I have never tried with Python.
For Java, we do the following in a scheduled Jenkins Job:

    export GOOGLE_APPLICATION_CREDENTIALS=...

    java -cp $artifact_location \
         $full_qualified_class_name \
         $pipeline_arg \
         --runner=DataflowRunner \
         --jobName=$job_name \
         --project=... \
         --tempLocation=gs://... \
         --autoscalingAlgorithm=THROUGHPUT_BASED \
         --streaming=false \
         --region=europe-west1 \
         --subnetwork=...
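
For Python, I would expect the equivalent to be passing the same options on the command line to a module along these lines (an untested sketch; the module and the transforms are placeholders):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def run(argv=None):
        # Picks up the same flags as the Java launch above, e.g.
        # --runner=DataflowRunner --jobName=... --project=... --tempLocation=gs://...
        options = PipelineOptions(argv)
        with beam.Pipeline(options=options) as pipeline:
            # Placeholder transform; replace with the real pipeline.
            (pipeline
             | "Create" >> beam.Create(["hello"])
             | "Print" >> beam.Map(print))

    if __name__ == "__main__":
        run()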

I hope this is of some help.

Best Regards,
Leonardo Campos

On 17.01.2020 13:14, Fernando wrote:
Hi All,

We're migrating our pipelines to Google Dataflow, and we have several
of them.

I've done some research and found two possible solutions:

- A Cloud Function in Node.JS that receives a POST and spawns the pipeline
- A Dataflow template saved on GCS and a Cloud Function to launch it (see the sketch below)
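
For the second option, this is roughly what I have in mind for the Cloud Function (a rough Python sketch using the Dataflow REST API; project, bucket, template path, and parameter names are placeholders):

    from googleapiclient.discovery import build

    def launch_template(request):
        # HTTP-triggered Cloud Function that launches a Dataflow template
        # via the Dataflow REST API (v1b3). All names below are placeholders.
        service = build("dataflow", "v1b3")
        response = service.projects().locations().templates().launch(
            projectId="my-project",
            location="europe-west1",
            gcsPath="gs://my-bucket/templates/my-template",
            body={
                "jobName": "my-job",
                "parameters": {"input": "gs://my-bucket/input"},
                "environment": {"tempLocation": "gs://my-bucket/temp"},
            },
        ).execute()
        return str(response)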

If anyone has experience with this and some examples, I'd appreciate them. I'm
looking for the simplest solution that can be deployed with CI/CD.

Thanks in advance.

___
BR
Fernando
