[ https://issues.apache.org/jira/browse/SPARK-31193?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-31193:
----------------------------------
    Target Version/s:   (was: 3.1.0)

> set spark.master and spark.app.name conf default value
> ------------------------------------------------------
>
>                 Key: SPARK-31193
>                 URL: https://issues.apache.org/jira/browse/SPARK-31193
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.3.0, 2.3.3, 2.4.0, 2.4.2, 2.4.3, 2.4.4, 2.4.5, 3.1.0
>            Reporter: daile
>            Priority: Major
>
> I see that the spark-submit client sets a default value for the master:
> {code:java}
> // Global defaults. These should be keep to minimum to avoid confusing behavior.
> master = Option(master).getOrElse("local[*]")
> {code}
> but during development and debugging we run into this problem:
> Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
> This conflicts with the default setting above.
>
> {code:java}
> // If we do
> val sparkConf = new SparkConf().setAppName("app")
> // then, when using the client to submit tasks to the cluster, the master
> // will be overwritten by the local one:
> sparkConf.set("spark.master", "local[*]")
> {code}
>
> So we have to do this instead:
> {code:java}
> val sparkConf = new SparkConf().setAppName("app")
> // Because a master set in the program takes priority, we have to check
> // whether spark.master is already set, to avoid clobbering a cluster submit.
> sparkConf.set("spark.master", sparkConf.get("spark.master", "local[*]"))
> {code}
>
> The same applies to spark.app.name.
> Would it be better to handle this for users the way the submit client does?

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
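[Editor's note] The workaround the reporter writes by hand is a "set only if missing" fallback; SparkConf itself exposes `setIfMissing` with these semantics. A minimal sketch of that pattern, using a plain key-value map instead of a real SparkConf (the `ConfSketch` class is a hypothetical stand-in, not Spark API):

```scala
import scala.collection.mutable

// Hypothetical stand-in for SparkConf, to illustrate the fallback semantics.
class ConfSketch {
  private val settings = mutable.Map.empty[String, String]

  // Unconditionally set a key, overwriting any earlier value.
  def set(key: String, value: String): ConfSketch = {
    settings(key) = value
    this
  }

  // Apply the default only when the key has not been set yet
  // (e.g. by spark-submit on a cluster submission).
  def setIfMissing(key: String, value: String): ConfSketch = {
    if (!settings.contains(key)) settings(key) = value
    this
  }

  def get(key: String): Option[String] = settings.get(key)
}

object ConfSketchDemo {
  def main(args: Array[String]): Unit = {
    // Local run: nothing set beforehand, so the default kicks in.
    val local = new ConfSketch().setIfMissing("spark.master", "local[*]")
    println(local.get("spark.master"))   // Some(local[*])

    // Cluster submit: the master was already set; the default is ignored,
    // so the submission is not redirected to local mode.
    val cluster = new ConfSketch()
      .set("spark.master", "yarn")
      .setIfMissing("spark.master", "local[*]")
    println(cluster.get("spark.master")) // Some(yarn)
  }
}
```

With a real SparkConf, `new SparkConf().setIfMissing("spark.master", "local[*]")` achieves the same effect as the reporter's `sparkConf.get("spark.master", "local[*]")` workaround in one call.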