Use the `bin/spark-submit --jars` option.
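For example, a minimal sketch of such an invocation (the master URL, main class, jar names, and dependency paths below are placeholders, not from this thread):

```shell
# spark-submit ships the application jar and any --jars dependencies to the
# workers for you, so there is no need to copy them to each node by hand.
# spark://master:7077, com.example.MyApp, and the jar paths are placeholders.
bin/spark-submit \
  --master spark://master:7077 \
  --class com.example.MyApp \
  --jars /path/to/dep1.jar,/path/to/dep2.jar \
  /path/to/myapp.jar
```

This requires a running cluster, so treat it as a template rather than something to paste verbatim.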
On Thu, Oct 19, 2017 at 11:54 PM, 郭鹏飞 wrote:
> You can use the bin/spark-submit tool to submit your jar to the cluster.
>
> > On Oct 19, 2017, at 11:24 PM, Uğur Sopaoğlu wrote:
> >
> > Hello,
> >
> > I have a very easy problem. Whenever I run a Spark job, I must copy the
> > jar file to all worker nodes. Is there a simpler way to do this?
> >
You can use the bin/spark-submit tool to submit your jar to the cluster.
> On Oct 19, 2017, at 11:24 PM, Uğur Sopaoğlu wrote:
>
> Hello,
>
> I have a very easy problem. Whenever I run a Spark job, I must copy the jar
> file to all worker nodes. Is there a simpler way to do this?
>
> --
> Uğur Sopaoğlu
--
A simple way is to mount a network volume at the same path on every node, so
each worker sees the jar without manual copying.
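A hedged sketch of what that mount could look like with NFS (the server name, export path, and mount point are assumptions, not details from this thread):

```shell
# On each worker node, mount the same shared export at the same local path.
# nfs-server:/export/spark-jars and /opt/spark-jars are placeholders.
sudo mkdir -p /opt/spark-jars
sudo mount -t nfs nfs-server:/export/spark-jars /opt/spark-jars

# To make the mount survive reboots, an equivalent /etc/fstab entry would be:
# nfs-server:/export/spark-jars  /opt/spark-jars  nfs  defaults  0 0
```

Any shared filesystem visible at an identical path on all nodes would serve the same purpose; NFS is just one common choice.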
On Thu, 19 Oct 2017 at 8:24 PM Uğur Sopaoğlu wrote:
> Hello,
>
> I have a very easy problem. Whenever I run a Spark job, I must copy the jar
> file to all worker nodes. Is there a simpler way to do this?
>
>
> --
> Uğur Sopaoğlu
This is a good place to start from:
https://spark.apache.org/docs/latest/submitting-applications.html
Best,
On Thu, Oct 19, 2017 at 5:24 PM, Uğur Sopaoğlu wrote:
> Hello,
>
> I have a very easy problem. Whenever I run a Spark job, I must copy the jar
> file to all worker nodes. Is there a simpler way to do this?
Hello,
I have a very easy problem. Whenever I run a Spark job, I must copy the jar
file to all worker nodes. Is there a simpler way to do this?
--
Uğur Sopaoğlu