[ https://issues.apache.org/jira/browse/SPARK-19090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15816920#comment-15816920 ]

Yuming Wang commented on SPARK-19090:
-------------------------------------

Try this; it works for me:
{code}
sbin/start-thriftserver.sh \
  --executor-memory 12g \
  --driver-memory 8g \
  --executor-cores 5 \
  --num-executors 130 \
  --hiveconf hive.server2.thrift.port=20402 \
  --hiveconf hive.server2.thrift.bind.host=192.168.28.200 \
  --conf spark.scheduler.listenerbus.eventqueue.size=150000 \
  --conf spark.ui.retainedTasks=200000 \
  --conf spark.sql.codegen.wholeStage=true \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=130 \
  --conf spark.dynamicAllocation.executorIdleTimeout=200 \
  --conf spark.yarn.executor.memoryOverhead=4096 \
  --conf "spark.executor.extraJavaOptions=-XX:+UseParallelGC -XX:+UseParallelOldGC -XX:+PrintFlagsFinal -XX:+PrintReferenceGC -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+PrintAdaptiveSizePolicy -XX:+UnlockDiagnosticVMOptions"
{code}
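
If it helps, a quick way to confirm the server came up and is bound where expected is to connect with beeline against the host and port used above (the JDBC URL below just reuses those values; adjust for your environment):

{code}
beeline -u jdbc:hive2://192.168.28.200:20402
{code}

With dynamic allocation enabled as above, executors are released after they have been idle for spark.dynamicAllocation.executorIdleTimeout (200s here) and the pool is capped at spark.dynamicAllocation.maxExecutors (130).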

> Dynamic Resource Allocation not respecting spark.executor.cores
> ---------------------------------------------------------------
>
>                 Key: SPARK-19090
>                 URL: https://issues.apache.org/jira/browse/SPARK-19090
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.5.2, 1.6.1, 2.0.1
>            Reporter: nirav patel
>
> When enabling dynamic allocation with YARN, I see that all executors use
> only 1 core even if I set "spark.executor.cores" to 6. If dynamic
> allocation is disabled, each executor gets 6 cores, i.e. it respects
> "spark.executor.cores". I have tested this against Spark 1.5. I think the
> behavior will be the same with 2.x as well.
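
For reference, a minimal sketch of the kind of launch the report describes (standard spark-shell/spark-submit options on YARN; the 6-core figure comes from the report, the executor cap is just an illustrative value):

{code}
# Dynamic allocation on YARN with an explicit per-executor core count.
# The external shuffle service must be running on each NodeManager.
spark-shell --master yarn \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.maxExecutors=10 \
  --conf spark.executor.cores=6
{code}

The Cores column on the Executors tab of the Spark UI shows what each executor was actually granted. Note that the YARN ResourceManager UI may report 1 vcore per container when the scheduler uses the DefaultResourceCalculator, regardless of spark.executor.cores, which can look like the behavior described here.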


