Github user vanzin commented on a diff in the pull request:
    --- Diff: core/src/test/scala/org/apache/spark/deploy/SparkSubmitSuite.scala ---
    @@ -775,17 +781,17 @@ class SparkSubmitSuite
       test("SPARK_CONF_DIR overrides spark-defaults.conf") {
    -    forConfDir(Map("spark.executor.memory" -> "2.3g")) { path =>
    +    forConfDir(Map("spark.executor.memory" -> "3g")) { path =>
    --- End diff --
    It's just that now, instead of printing an error to the output, an 
exception is actually thrown.
    [info] SparkSubmitSuite:
    [info] - SPARK_CONF_DIR overrides spark-defaults.conf *** FAILED *** (144 
    [info]   org.apache.spark.SparkException: Executor Memory cores must be a 
positive number
    [info]   at 
    [info]   at 
    That is because:
    scala> c.set("spark.abcde", "2.3g")
    res0: org.apache.spark.SparkConf = org.apache.spark.SparkConf@cd5ff55
    scala> c.getSizeAsBytes("spark.abcde")
    java.lang.NumberFormatException: Size must be specified as bytes (b), 
kibibytes (k), mebibytes (m), gibibytes (g), tebibytes (t), or pebibytes(p). 
E.g. 50b, 100k, or 250m.
    Fractional values are not supported. Input was: 2.3
    Just noticed that the error message is kinda wrong ("Executor Memory 
cores"), but also this whole validation function 
(`validateSubmitArguments`) leaves a lot to be desired...
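To illustrate why "2.3g" blows up: Spark's size parsing (ultimately `JavaUtils.byteStringAsBytes`) only accepts an integral number followed by a unit suffix. Here's a minimal, self-contained sketch of that behavior — a simplified stand-in, not Spark's actual implementation — showing how an integral-only pattern rejects fractional inputs:

```scala
// Illustrative sketch (not Spark's real parser): accepts only integral
// byte sizes with an optional b/k/m/g/t/p suffix, so "2.3g" is rejected
// with a NumberFormatException, mirroring what getSizeAsBytes does.
object SizeParser {
  private val pattern = "([0-9]+)([bkmgtp]?)".r
  private val multipliers = Map(
    ""  -> 1L,
    "b" -> 1L,
    "k" -> 1024L,
    "m" -> 1024L * 1024,
    "g" -> 1024L * 1024 * 1024,
    "t" -> 1024L * 1024 * 1024 * 1024,
    "p" -> 1024L * 1024 * 1024 * 1024 * 1024)

  def byteStringAsBytes(s: String): Long = s.trim.toLowerCase match {
    // Full-string match required: "2.3g" does not match the pattern.
    case pattern(num, unit) => num.toLong * multipliers(unit)
    case _ => throw new NumberFormatException(
      s"Fractional values are not supported. Input was: $s")
  }
}
```

So `byteStringAsBytes("3g")` succeeds while `byteStringAsBytes("2.3g")` throws — which is why the test value had to change from "2.3g" to "3g" in this diff.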

