Leonidas Fegaras created MRQL-73:
------------------------------------

             Summary: Set the max number of tasks in Spark mode
                 Key: MRQL-73
                 URL: https://issues.apache.org/jira/browse/MRQL-73
             Project: MRQL
          Issue Type: Bug
          Components: Run-Time/Spark
    Affects Versions: 0.9.6
            Reporter: Leonidas Fegaras
            Assignee: Leonidas Fegaras
            Priority: Critical


The number of worker nodes in Spark distributed mode, which is specified by 
the MRQL -nodes parameter, must set the Spark parameters SPARK_WORKER_INSTANCES 
(renamed SPARK_EXECUTOR_INSTANCES in Spark 1.3.*) and SPARK_WORKER_CORES; 
otherwise, Spark will always use all the available cores in the cluster.
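A minimal sketch of the intended fix, assuming the node count requested via 
-nodes is available as a shell variable (the NODES name below is an 
assumption, not MRQL's actual variable):

```shell
# Hypothetical sketch: propagate an MRQL-style node count to the Spark
# environment variables that cap worker instances and cores, so Spark
# does not grab every core in the cluster.
NODES=4                                  # value that would come from -nodes

export SPARK_WORKER_INSTANCES=$NODES     # pre-1.3 variable name
export SPARK_EXECUTOR_INSTANCES=$NODES   # name used by Spark 1.3.*
export SPARK_WORKER_CORES=1              # cores per worker instance

echo "workers=$SPARK_WORKER_INSTANCES cores_per_worker=$SPARK_WORKER_CORES"
```

With SPARK_WORKER_CORES set to 1, the total core usage is bounded by the 
number of requested nodes rather than by the cluster's full capacity.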



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)