Hi Kaxil,

I would recommend using the SSHOperator to start the Spark job on the
master node of the HDInsight cluster.
This avoids the problems associated with Livy, and doesn't require you to
open ports or copy the Hadoop configuration to your Airflow machine.
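As a rough illustration (not tested against HDInsight, and the connection
id, script path, and job options below are all made up), the task would
just build a spark-submit command line and hand it to the SSHOperator:

```python
# Sketch: assemble the spark-submit command that an SSHOperator task
# would run on the HDInsight master node over SSH.

def build_spark_submit_command(app_path, master="yarn", deploy_mode="cluster",
                               conf=None, app_args=None):
    """Assemble a spark-submit command line to run over SSH."""
    parts = ["spark-submit", "--master", master, "--deploy-mode", deploy_mode]
    for key, value in (conf or {}).items():
        parts += ["--conf", f"{key}={value}"]
    parts.append(app_path)
    parts += list(app_args or [])
    return " ".join(parts)

command = build_spark_submit_command(
    "/home/sshuser/jobs/etl.py",             # hypothetical script path
    conf={"spark.executor.memory": "4g"},    # hypothetical job config
    app_args=["--date", "{{ ds }}"],         # SSHOperator templates `command`
)

# In the DAG this would become something like (Airflow 1.x contrib import):
# from airflow.contrib.operators.ssh_operator import SSHOperator
# submit = SSHOperator(task_id="submit_spark_job",
#                      ssh_conn_id="hdinsight_master",  # hypothetical conn id
#                      command=command,
#                      dag=dag)
```

The SSH connection ("hdinsight_master" above) just points at the cluster's
master node with the usual SSH credentials, so nothing cluster-specific
has to live on the Airflow host.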

Niels

2018-06-22 14:17 GMT+02:00 Naik Kaxil <k.n...@reply.com>:

> Hi all,
>
>
>
> Has anyone used the SparkSubmitOperator to submit Spark jobs on Azure
> HDInsight cluster? Are you using Livy or spark-submit to run remote Spark
> jobs?
>
>
>
> Regards,
>
> Kaxil
>
>
> Kaxil Naik
>
> Data Reply
> 2nd Floor, Nova South
> 160 Victoria Street, Westminster
> London SW1E 5LB - UK
> phone: +44 (0)20 7730 6000
> k.n...@reply.com
> www.reply.com
>
