Re: jar file problem

2017-10-19 Thread Weichen Xu
Use the `bin/spark-submit --jars` option.

On Thu, Oct 19, 2017 at 11:54 PM, 郭鹏飞 wrote:
> You can use the bin/spark-submit tool to submit your jar to the cluster.
>
>> On October 19, 2017, at 11:24 PM, Uğur Sopaoğlu wrote:
>>
>> Hello,
>>
>> I have a very simple problem. When I run a Spark job, I have to copy the jar file to all worker nodes. Is there a simpler way to do this?
>>
>> --
>> Uğur Sopaoğlu
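As a rough sketch of what that might look like (the class name, master URL, and jar paths below are placeholders, not taken from this thread):

  # --jars ships the listed dependency jars to the driver and executors along
  # with the application jar, so they do not have to be copied to each worker by hand.
  ./bin/spark-submit \
    --class com.example.MyApp \
    --master spark://master-host:7077 \
    --jars /path/to/dep1.jar,/path/to/dep2.jar \
    /path/to/my-app.jar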

Re: jar file problem

2017-10-19 Thread 郭鹏飞
You can use the bin/spark-submit tool to submit your jar to the cluster.

> On October 19, 2017, at 11:24 PM, Uğur Sopaoğlu wrote:
>
> Hello,
>
> I have a very simple problem. When I run a Spark job, I have to copy the jar file to all worker nodes. Is there a simpler way to do this?
>
> --
> Uğur Sopaoğlu
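For illustration only, the basic shape of such a submission (class name, master URL, and jar path are placeholders):

  # Submit the application jar to a standalone cluster master.
  ./bin/spark-submit \
    --class com.example.MyApp \
    --master spark://master-host:7077 \
    /path/to/my-app.jar

Depending on the cluster manager and deploy mode, the application jar may need to be reachable from the worker nodes (for example via HDFS or a shared path); the submitting-applications page linked later in the thread covers the details.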

Re: jar file problem

2017-10-19 Thread Imran Rajjad
A simple way is to mount a network volume at the same path on all nodes, which makes things easy.

On Thu, 19 Oct 2017 at 8:24 PM Uğur Sopaoğlu wrote:
> Hello,
>
> I have a very simple problem. When I run a Spark job, I have to copy the jar file to all worker nodes. Is there a simpler way to do this?
>
> --
> Uğur Sopaoğlu
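As a sketch of that approach (the NFS server name, export, and mount point are made up for illustration):

  # On every Spark node, mount the same shared export at the same path,
  # e.g. an NFS share (this could also go in /etc/fstab to survive reboots):
  sudo mkdir -p /mnt/spark-jars
  sudo mount -t nfs nfs-server:/export/spark-jars /mnt/spark-jars

  # The job jar then sits at an identical path on every node, so it can be
  # referenced directly without copying it around:
  ./bin/spark-submit \
    --class com.example.MyApp \
    --master spark://master-host:7077 \
    /mnt/spark-jars/my-app.jar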

Re: jar file problem

2017-10-19 Thread Riccardo Ferrari
This is a good place to start from: https://spark.apache.org/docs/latest/submitting-applications.html

Best,

On Thu, Oct 19, 2017 at 5:24 PM, Uğur Sopaoğlu wrote:
> Hello,
>
> I have a very simple problem. When I run a Spark job, I have to copy the jar file to all worker nodes. Is there a simpler way to do this?
>
> --
> Uğur Sopaoğlu
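The page linked above walks through spark-submit usage; the general shape of an invocation is roughly:

  ./bin/spark-submit \
    --class <main-class> \
    --master <master-url> \
    --deploy-mode <deploy-mode> \
    --conf <key>=<value> \
    <application-jar> \
    [application-arguments]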

jar file problem

2017-10-19 Thread Uğur Sopaoğlu
Hello,

I have a very simple problem. When I run a Spark job, I have to copy the jar file to all worker nodes. Is there a simpler way to do this?

--
Uğur Sopaoğlu