[ https://issues.apache.org/jira/browse/SPARK-11025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14950157#comment-14950157 ]
Stavros Kontopoulos commented on SPARK-11025:
---------------------------------------------

Falling back to the previous implementation:

System.getProperties.clone().asInstanceOf[java.util.Properties].toMap[String, String]

which simply ignored the empty key. I guess that at the language level Java does not complain, so I think it is OK to ignore it, unless the general strategy is to catch everything that is wrong; I think we should only validate what we actually use. I know a bare -D can only come up as a mistake. I just wanted to bring up what the strategy is, and whether for such minor mistakes we should fail execution when the SparkContext is created, etc.

> Exception key can't be empty at getSystemProperties function in utils
> ----------------------------------------------------------------------
>
>                 Key: SPARK-11025
>                 URL: https://issues.apache.org/jira/browse/SPARK-11025
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.3.0, 1.3.1, 1.4.0, 1.4.1, 1.5.1
>            Reporter: Stavros Kontopoulos
>            Priority: Trivial
>              Labels: easyfix, easytest
>
> In core/src/main/scala/org/apache/spark/util/Utils.scala, the getSystemProperties function fails when someone passes a bare -D to the JVM, because the resulting property key is "" (empty).
> Exception thrown: java.lang.IllegalArgumentException: key can't be empty
> Empty keys should either be ignored or passed through without filtering at that level, as in previous versions.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
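The "ignore empty keys" option discussed above can be sketched as a defensive property reader that skips any entry whose key is empty. This is an illustrative sketch of that approach, not the actual Spark patch; the object name SystemPropsSketch is hypothetical:

```scala
import scala.collection.JavaConverters._

object SystemPropsSketch {
  // Returns an immutable snapshot of the JVM system properties,
  // skipping empty keys (which appear when the JVM is started with a bare -D).
  def getSystemProperties: Map[String, String] = {
    System.getProperties.stringPropertyNames().asScala
      .filter(_.nonEmpty)                              // drop the "" key instead of throwing
      .map(key => (key, System.getProperty(key)))
      .toMap
  }

  def main(args: Array[String]): Unit = {
    val props = getSystemProperties
    // No key in the snapshot is empty, so downstream validation cannot
    // fail with "key can't be empty".
    assert(props.keys.forall(_.nonEmpty))
    println(s"Read ${props.size} system properties")
  }
}
```

Iterating over stringPropertyNames() rather than cloning the whole Properties object also sidesteps the unchecked asInstanceOf cast in the old one-liner.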