Github user tnachen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/5144#discussion_r28045217
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolRequest.scala ---
    @@ -61,7 +61,7 @@ private[rest] class CreateSubmissionRequest extends SubmitRestProtocolRequest {
         assertProperty[Boolean](key, "boolean", _.toBoolean)
     
       private def assertPropertyIsNumeric(key: String): Unit =
    -    assertProperty[Int](key, "numeric", _.toInt)
    +    assertProperty[Double](key, "numeric", _.toDouble)
    --- End diff ---
    
    This is actually for supporting fractional cores when providing
    --driver-cores.
    Cores in Spark are all integers, but that becomes quite limiting when you
    want to launch multiple jobs, since each job has to take at least 1 core
    (2 in the case of fine-grained mode). Making driver cores a Double lets you
    specify --driver-cores 0.1, and the driver will then use only a 10% share
    of a single core.
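    
    A minimal, self-contained sketch of the difference (assuming a simplified
    assertValue helper in place of Spark's actual assertProperty, which looks
    the value up from the submitted properties): the Double-based conversion
    accepts fractional core counts that the Int-based one would reject.
    
    object FractionalCoresSketch {
      // Hypothetical stand-in for assertProperty: succeeds only if the raw
      // string converts without throwing a NumberFormatException.
      private def assertValue[T](value: String, valueType: String, convert: String => T): Unit =
        try {
          convert(value)
        } catch {
          case _: NumberFormatException =>
            throw new IllegalArgumentException(s"Expected a $valueType value, got '$value'")
        }
    
      def main(args: Array[String]): Unit = {
        assertValue[Double]("0.1", "numeric", _.toDouble)  // accepted: "0.1" parses as a Double
        // assertValue[Int]("0.1", "numeric", _.toInt)     // would throw: "0.1" is not an Int
        println("spark.driver.cores=0.1 passes the numeric assertion with toDouble")
      }
    }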

