You've got to start the shuffle service on all your workers. There's a script
for that in the 'sbin' directory.
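For anyone landing on this thread later: the advice above refers to the external shuffle service that ships with Spark's standalone scripts. A minimal deployment sketch (to be run on each worker node; `$SPARK_HOME` assumed to point at the Spark installation):

```shell
# On every worker node, start the external shuffle service
# using the script that ships in Spark's sbin directory.
$SPARK_HOME/sbin/start-shuffle-service.sh

# The service listens on spark.shuffle.service.port (7337 by default),
# so executors must be able to reach that port on each worker.
```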
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Using-dynamic-allocation-and-shuffle-service-in-Standalone-Mode-tp26430p2643
>> *From: *Yuval Itzchakov
>> *Sent: *Tuesday, March 8, 2016 2:17 PM
>> *To: *Silvio Fiorito ; user@spark.apache.org
>> *Subject: *Re: Using dynamic allocation and shuffle service in
>> Standalone Mode
>>
>> Actually, I assumed that setting the flag in the spark job would turn on
>> the shuffle service in the workers. I now understand that assumption was
>> wrong.
>>
> There's a script under sbin, start-shuffle-service.sh. Run
> that on each of your worker nodes.
>
> *From: *Yuval Itzchakov
> *Sent: *Tuesday, March 8, 2016 2:17 PM
> *To: *Silvio Fiorito ; user@spark.apache.org
> *Subject: *Re: Using dynamic allocation and shuffle service in Standalone Mode
Actually, I assumed that setting the flag in the spark job would turn on the
shuffle service in the workers. I now understand that assumption was wrong.
Is there any way to set the flag via t…
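For context, the flags being discussed are normally set cluster-wide rather than per job; a minimal spark-defaults.conf sketch (values illustrative):

```properties
# Enable dynamic allocation on the application side.
spark.dynamicAllocation.enabled   true
# Dynamic allocation requires the external shuffle service,
# which must be started separately on each worker.
spark.shuffle.service.enabled     true
# Port the external shuffle service listens on (default 7337).
spark.shuffle.service.port        7337
```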
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> I verified all relevant ports are open.
> You started the shuffle service on your workers, correct?
> Can you confirm they’re still running and haven’t exited?
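One quick way to check that point on each worker (a sketch; assumes shell access to the worker hosts and the default shuffle-service port 7337, with `<worker-host>` as a placeholder):

```shell
# Is the ExternalShuffleService JVM still alive on this worker?
jps | grep ExternalShuffleService

# Is the shuffle-service port reachable from an executor host?
nc -z <worker-host> 7337 && echo "shuffle service reachable"
```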
>
> *From: *Yuval.Itzchakov
> *Sent: *Tuesday, March 8, 2016 12:41 PM
> *To: *user@spark.apache.org
> *Subject: *Using dynamic allocation and shuffle service in Standalone Mode
>
> Has anyone else experienced such a failure?
>
> Yuval.
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Using-dynamic-allocation-and-shuffle-service-in-Standalone-Mode-tp26430.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.