[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-13 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16720239#comment-16720239
 ] 

ASF GitHub Bot commented on SPARK-26340:


srowen closed pull request #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290
 
 
   

This is a PR merged from a forked repository. Because GitHub hides the
original diff on merge, it is reproduced below for the sake of provenance:

diff --git a/core/src/main/scala/org/apache/spark/SparkConf.scala b/core/src/main/scala/org/apache/spark/SparkConf.scala
index 21c5cbc04d813..8d135d3e083d7 100644
--- a/core/src/main/scala/org/apache/spark/SparkConf.scala
+++ b/core/src/main/scala/org/apache/spark/SparkConf.scala
@@ -605,6 +605,15 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging with Serializable
       }
     }
 
+    if (contains("spark.executor.cores") && contains("spark.task.cpus")) {
+      val executorCores = getInt("spark.executor.cores", 1)
+      val taskCpus = getInt("spark.task.cpus", 1)
+
+      if (executorCores < taskCpus) {
+        throw new SparkException("spark.executor.cores must not be less than spark.task.cpus.")
+      }
+    }
+
     val encryptionEnabled = get(NETWORK_ENCRYPTION_ENABLED) || get(SASL_ENCRYPTION_ENABLED)
     require(!encryptionEnabled || get(NETWORK_AUTH_ENABLED),
       s"${NETWORK_AUTH_ENABLED.key} must be enabled when enabling encryption.")
diff --git a/core/src/test/scala/org/apache/spark/SparkConfSuite.scala b/core/src/test/scala/org/apache/spark/SparkConfSuite.scala
index df274d949bae3..7cb03deae1391 100644
--- a/core/src/test/scala/org/apache/spark/SparkConfSuite.scala
+++ b/core/src/test/scala/org/apache/spark/SparkConfSuite.scala
@@ -138,6 +138,13 @@ class SparkConfSuite extends SparkFunSuite with LocalSparkContext with ResetSystemProperties
     assert(sc.appName === "My other app")
   }
 
+  test("creating SparkContext with cpus per tasks bigger than cores per executors") {
+    val conf = new SparkConf(false)
+      .set("spark.executor.cores", "1")
+      .set("spark.task.cpus", "2")
+    intercept[SparkException] { sc = new SparkContext(conf) }
+  }
+
   test("nested property names") {
     // This wasn't supported by some external conf parsing libraries
     System.setProperty("spark.test.a", "a")
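The validation added by the patch can be sketched as a standalone check. This is an illustrative model only (hypothetical object and exception type; the real code lives in SparkConf's validation and throws SparkException):

```scala
// Standalone sketch of the cores-vs-cpus validation added to SparkConf.
// Names here (ConfCheck, validate) are illustrative, not the real Spark API.
object ConfCheck {
  def validate(settings: Map[String, String]): Unit = {
    // Only validate when both keys are explicitly set, mirroring the patch.
    if (settings.contains("spark.executor.cores") && settings.contains("spark.task.cpus")) {
      val executorCores = settings("spark.executor.cores").toInt
      val taskCpus = settings("spark.task.cpus").toInt
      if (executorCores < taskCpus) {
        throw new IllegalArgumentException(
          "spark.executor.cores must not be less than spark.task.cpus.")
      }
    }
  }
}
```

With a check like this at configuration time, a misconfigured application fails fast at startup instead of silently never scheduling a task.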


 


This is an automated message from the Apache Git Service.
To respond to the message, please log on GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


> Ensure cores per executor is greater than cpu per task
> --
>
> Key: SPARK-26340
> URL: https://issues.apache.org/jira/browse/SPARK-26340
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 2.2.2, 2.3.2
>Reporter: Nicolas Fraison
>Assignee: Nicolas Fraison
>Priority: Minor
> Fix For: 3.0.0
>
>
> No check is performed to ensure spark.task.cpus is lower than 
> spark.executor.cores, which can lead to jobs that are unable to assign 
> tasks, with no understandable error.
> The check is currently only performed when dynamic allocation is used, in 
> ExecutorAllocationManager.
> Adding the check in TaskSchedulerImpl ensures that an error is raised to 
> the driver.
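To illustrate the failure mode the description refers to, here is a simplified model (illustrative names only, not Spark's actual scheduler code): the scheduler only launches a task on an executor whose free cores cover spark.task.cpus, so a 1-core executor can never satisfy a 2-cpu task and the job makes no progress without any clear error.

```scala
// Simplified model of the per-offer capacity check performed during
// task scheduling. SchedulingSketch and canLaunchTask are hypothetical.
object SchedulingSketch {
  def canLaunchTask(freeCoresOnExecutor: Int, taskCpus: Int): Boolean =
    freeCoresOnExecutor >= taskCpus
}
```

For example, `canLaunchTask(1, 2)` is always false, so with spark.executor.cores=1 and spark.task.cpus=2 no resource offer ever succeeds; hence the value of failing fast at configuration time.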



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717936#comment-16717936
 ] 

ASF GitHub Bot commented on SPARK-26340:


srowen commented on issue #23290: [SPARK-26340][Core] Ensure cores per executor 
is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446348399
 
 
   Yes, it shouldn't be in TaskSchedulerImpl. I think the check is OK, as 
there's no good reason you'd allow 1-core executors when all tasks need 2 
cores. My only concern is that spark.executor.cores defaults to 1 in YARN 
mode, but it's probably best to fail fast if that's the case.





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717636#comment-16717636
 ] 

ASF GitHub Bot commented on SPARK-26340:


MaxGekk commented on issue #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446292798
 
 
   Should the check be moved to `SparkConf` like:
   
https://github.com/apache/spark/blob/78fa1be29bc9fbe98dd0226418aafc221c5e5309/core/src/main/scala/org/apache/spark/SparkConf.scala#L597-L606
 ?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717248#comment-16717248
 ] 

ASF GitHub Bot commented on SPARK-26340:


AmplabJenkins removed a comment on issue #23290: [SPARK-26340][Core] Ensure 
cores per executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446218073
 
 
   Can one of the admins verify this patch?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717243#comment-16717243
 ] 

ASF GitHub Bot commented on SPARK-26340:


AmplabJenkins commented on issue #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446218653
 
 
   Can one of the admins verify this patch?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717242#comment-16717242
 ] 

ASF GitHub Bot commented on SPARK-26340:


AmplabJenkins removed a comment on issue #23290: [SPARK-26340][Core] Ensure 
cores per executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446217908
 
 
   Can one of the admins verify this patch?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717241#comment-16717241
 ] 

ASF GitHub Bot commented on SPARK-26340:


AmplabJenkins commented on issue #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446218073
 
 
   Can one of the admins verify this patch?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717240#comment-16717240
 ] 

ASF GitHub Bot commented on SPARK-26340:


AmplabJenkins commented on issue #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290#issuecomment-446217908
 
 
   Can one of the admins verify this patch?





[jira] [Commented] (SPARK-26340) Ensure cores per executor is greater than cpu per task

2018-12-11 Thread ASF GitHub Bot (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-26340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16717233#comment-16717233
 ] 

ASF GitHub Bot commented on SPARK-26340:


ashangit opened a new pull request #23290: [SPARK-26340][Core] Ensure cores per 
executor is greater than cpu per task
URL: https://github.com/apache/spark/pull/23290
 
 
   Currently this check is only performed for the dynamic allocation use case, 
   in ExecutorAllocationManager.
   
   ## What changes were proposed in this pull request?
   
   Check that the number of CPUs per task is not greater than the number of 
   cores per executor; otherwise throw an exception.
   
   ## How was this patch tested?
   
   Manual tests.
   

