Hi all,

I am confused about why a Shark window (session) that is not executing any queries still occupies cluster resources. While a Shark window stays open, how can multiple queries be executed in parallel?

For example:

Spark config:
        total nodes: 3
        total cores: 9
Shark config:
        SPARK_JAVA_OPTS+="-Dspark.scheduler.allocation.file=/opt/spark-0.9.1-bin-hadoop1/conf/fairscheduler.xml "
        SPARK_JAVA_OPTS+="-Dspark.cores.max=9 "
        SPARK_JAVA_OPTS+="-Dspark.scheduler.mode=FAIR "
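(For reference, a minimal sketch of what the fairscheduler.xml referenced above might look like; the pool name, weight, and minShare values here are only assumptions for illustration, not my actual file:)

        <?xml version="1.0"?>
        <allocations>
          <!-- assumed example pool; real pool names/values may differ -->
          <pool name="default">
            <schedulingMode>FAIR</schedulingMode>
            <weight>1</weight>
            <minShare>2</minShare>
          </pool>
        </allocations>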

With spark.cores.max=9, I can only open one window to execute queries; but if I set spark.cores.max to less than 9, I think the cluster is not being fully utilized. A sketch of what I mean is below.
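(For illustration only, assuming each Shark session is started with its own SPARK_JAVA_OPTS, capping each session at 3 of the 9 cores would let up to three windows run side by side, but each query would then see only a third of the cluster:)

        # assumed per-session setting; 3 is an arbitrary example value
        SPARK_JAVA_OPTS+="-Dspark.cores.max=3 "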


Thank you for your help!

2014-06-18 



majian 
