Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread 谢廷稳
OK, yarn.scheduler.maximum-allocation-mb is 16384. I have run it again; the command to run it is: ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn-cluster --driver-memory 4g --executor-memory 8g lib/spark-examples*.jar 200 … 15/11/24 16:15:56 INFO …

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread Sabarish Sasidharan
If YARN has only 50 cores, then it can support at most 49 executors plus 1 container for the driver application master (assuming the default of 1 core per executor). Regards, Sab. On 24-Nov-2015 1:58 pm, 谢廷稳 wrote: > OK, yarn.scheduler.maximum-allocation-mb is 16384. …

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread Saisai Shao
Did you set the configuration "spark.dynamicAllocation.initialExecutors"? You can set spark.dynamicAllocation.initialExecutors to 50 and try again. I guess you might be hitting this issue since you're running 1.5.0: https://issues.apache.org/jira/browse/SPARK-9092. But it still cannot explain …
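
A minimal sketch of the suggested workaround, assuming the property is set through SparkConf (it could equally go in spark-defaults.conf or be passed with --conf on spark-submit); the value 50 mirrors the executor count used in this thread:

    import org.apache.spark.SparkConf

    // Workaround sketch for Spark 1.5.0: pin the initial executor target explicitly
    // so the first request is actually sent to YARN (see SPARK-9092 / SPARK-10790).
    val conf = new SparkConf()
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.dynamicAllocation.initialExecutors", "50")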

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread 谢廷稳
@Sab Thank you for your reply, but the cluster has 6 nodes with 300 cores in total, and the Spark application did not request resources from YARN. @SaiSai I have run it successfully with "spark.dynamicAllocation.initialExecutors" set to 50, but in …

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread Saisai Shao
The document is right. A bug introduced in https://issues.apache.org/jira/browse/SPARK-9092 makes this configuration fail to work. It is fixed in https://issues.apache.org/jira/browse/SPARK-10790, so you could change to a newer version of Spark. On Tue, Nov 24, 2015 at 5:12 PM, 谢廷稳 …

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-24 Thread 谢廷稳
Thank you very much; after changing to a newer version, it works well! 2015-11-24 17:15 GMT+08:00 Saisai Shao: > The document is right. A bug introduced in https://issues.apache.org/jira/browse/SPARK-9092 makes this configuration fail to work. …

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread Saisai Shao
Hi Tingwen, would you mind sharing your changes in ExecutorAllocationManager#addExecutors()? From my understanding and testing, dynamic allocation works when you set the min and max number of executors to the same number. Please check your Spark and YARN logs to make sure the executors …
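
For context, a minimal sketch of the min == max setup being described, written against SparkConf (the same keys can go in spark-defaults.conf); the value 50 matches the executor count discussed in this thread, and the external shuffle service line is the standard prerequisite for dynamic allocation on YARN rather than a detail taken from the poster's cluster:

    import org.apache.spark.SparkConf

    // Dynamic allocation with the floor and ceiling pinned to the same value.
    // The external shuffle service must be running on each NodeManager for
    // dynamic allocation to work on YARN.
    val conf = new SparkConf()
      .set("spark.dynamicAllocation.enabled", "true")
      .set("spark.shuffle.service.enabled", "true")
      .set("spark.dynamicAllocation.minExecutors", "50")
      .set("spark.dynamicAllocation.maxExecutors", "50")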

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread Saisai Shao
I don't think it is a bug; maybe something is wrong with your Spark/YARN configuration. On Tue, Nov 24, 2015 at 12:13 PM, 谢廷稳 wrote: > OK, the YARN cluster is used only by me; it has 6 nodes which can run over 100 executors, and the YARN RM logs showed that the Spark …

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread 谢廷稳
Hi Saisai, would you mind giving me some tips about this problem? After checking the YARN RM logs, I think the Spark application didn't request resources from it, so I guess this problem is not on YARN's side. The Spark conf of my cluster is listed in the following: …

Re: Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread cherrywayb...@gmail.com
Can you show the values of these parameters in your env? yarn.nodemanager.resource.cpu-vcores and yarn.nodemanager.resource.memory-mb. cherrywayb...@gmail.com From: 谢廷稳 Date: 2015-11-24 12:13 To: Saisai Shao CC: spark users Subject: Re: A Problem About Running Spark 1.5 on YARN with Dynamic …

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread 谢廷稳
Hi Saisai, I'm sorry I did not describe it clearly. The YARN debug log said I have 50 executors, but the ResourceManager showed that I only have 1 container, for the AppMaster. I have checked the YARN RM logs; after the AppMaster changed state from ACCEPTED to RUNNING, there were no more log entries about this job …

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread 谢廷稳
Hi SaiSai, I have changed "if (numExecutorsTarget >= maxNumExecutors)" to "if (numExecutorsTarget > maxNumExecutors)" in the first line of ExecutorAllocationManager#addExecutors(), and it runs well. In my opinion, when I set minExecutors equal to maxExecutors, the first time it tries to add …
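
For readers without the 1.5 source at hand, a simplified, self-contained sketch of the guard being discussed; it paraphrases the shape of ExecutorAllocationManager#addExecutors rather than copying it, and the surrounding class and field initialization are illustrative only:

    // Simplified sketch (not verbatim) of the check discussed in this thread.
    // The real manager starts its target at initialExecutors, which defaults to
    // minExecutors, so with min == max the target already sits at the cap and
    // this branch is taken on the first call, which is the behaviour the poster describes.
    class AllocationSketch(maxNumExecutors: Int, minExecutors: Int) {
      private var numExecutorsTarget: Int = minExecutors
      private var numExecutorsToAdd: Int = 1

      def addExecutors(maxNumExecutorsNeeded: Int): Int = {
        if (numExecutorsTarget >= maxNumExecutors) { // the change above relaxes ">=" to ">"
          numExecutorsToAdd = 1
          return 0 // bail out without asking the cluster manager for containers
        }
        // ...otherwise raise numExecutorsTarget and request containers from YARN...
        0
      }
    }

As the follow-up below points out, relaxing the comparison mainly skips this early return; it does not by itself show that the containers are actually granted and started by YARN.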

Re: A Problem About Running Spark 1.5 on YARN with Dynamic Allocation

2015-11-23 Thread Saisai Shao
I think this behavior is expected: since you already have 50 executors launched, there is no need to acquire additional executors. Your change is not solid; it is just hiding the log. Again, I think you should check the YARN and Spark logs to see whether the executors are started correctly. Why resource is …