If you run in client (i.e. non-cluster) deploy mode, the driver logs are
written to stdout, so Airflow would capture them.
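
For illustration, here is a minimal sketch of that approach (Airflow 1.x
contrib SSHOperator; the connection id and the job path are hypothetical
placeholders, not anything agreed in this thread):

    # A minimal sketch, not a drop-in DAG: "hdinsight_master" is assumed to
    # be an SSH connection configured in Airflow pointing at the HDInsight
    # master node, and the application path is a placeholder.
    from datetime import datetime
    from airflow import DAG
    from airflow.contrib.operators.ssh_operator import SSHOperator

    dag = DAG("hdinsight_spark", start_date=datetime(2018, 6, 1),
              schedule_interval=None)

    submit = SSHOperator(
        task_id="spark_submit_via_ssh",
        ssh_conn_id="hdinsight_master",   # SSH conn to the master node
        command="spark-submit --master yarn --deploy-mode client "
                "/path/to/job.py",        # client mode: driver logs to stdout
        dag=dag,
    )

With --deploy-mode client the driver runs inside the spark-submit process on
the master node, so its logs stream back over SSH and land in the Airflow
task log.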


On Fri 22 Jun 2018, 18:04 Naik Kaxil <k.n...@reply.com> wrote:

> Thanks @Niels and @Kyle.
>
> @Niels - I agree, I don't want to copy Hadoop configurations to my Airflow
> VM. In this case (using the SSHOperator), Airflow would just be receiving
> stdout, right, as opposed to the driver logs?
>
> @Kyle - If you can, it would definitely be useful to contribute a
> LivyOperator to Airflow.
>
> Regards,
> Kaxil
>
> On 22/06/2018, 13:34, "Niels Zeilemaker" <ni...@zeilemaker.nl> wrote:
>
>     Hi Kaxil,
>
>     I would recommend using the SSHOperator to start the Spark job on the
>     master node of the HDInsight cluster. This avoids the problems
>     associated with Livy, and doesn't require you to open ports or copy
>     the Hadoop configuration to your Airflow machine.
>
>     Niels
>
>     2018-06-22 14:17 GMT+02:00 Naik Kaxil <k.n...@reply.com>:
>
>     > Hi all,
>     >
>     >
>     >
>     > Has anyone used the SparkSubmitOperator to submit Spark jobs on an
>     > Azure HDInsight cluster? Are you using Livy or spark-submit to run
>     > remote Spark jobs?
>     >
>     >
>     >
>     > Regards,
>     >
>     > Kaxil
>     >
>     >
>
> Kaxil Naik
>
> Data Reply
> 2nd Floor, Nova South
> 160 Victoria Street, Westminster
> London SW1E 5LB - UK
> phone: +44 (0)20 7730 6000
> k.n...@reply.com
> www.reply.com
>
