Senthil Kumar created SPARK-36643:
-------------------------------------

             Summary: Add more information in ERROR log while SparkConf is modified when spark.sql.legacy.setCommandRejectsSparkCoreConfs is set
                 Key: SPARK-36643
                 URL: https://issues.apache.org/jira/browse/SPARK-36643
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 3.1.2
            Reporter: Senthil Kumar
Right now, spark.sql.legacy.setCommandRejectsSparkCoreConfs defaults to true in Spark 3.* versions in order to prevent Spark core confs from being changed at runtime. But the error message leaves the user confused about whether Spark confs can be modified in Spark 3.* at all, and if so, how.

Current error message:

{code:java}
Exception in thread "main" org.apache.spark.sql.AnalysisException: Cannot modify the value of a Spark config: spark.driver.host
	at org.apache.spark.sql.RuntimeConfig.requireNonStaticConf(RuntimeConfig.scala:156)
	at org.apache.spark.sql.RuntimeConfig.set(RuntimeConfig.scala:40)
{code}

Adding a little more information (i.e., how the Spark conf can legitimately be modified) to the ERROR log raised when a SparkConf is modified while spark.sql.legacy.setCommandRejectsSparkCoreConfs is 'true' would help avoid this confusion.

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
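To illustrate the proposal, here is a minimal Python sketch (not Spark's actual implementation; the conf set and wording are hypothetical) of a requireNonStaticConf-style check that rejects a core conf with a message telling the user how the conf can legitimately be set:

```python
# Sketch of the proposed clearer error message. NOT Spark's real code:
# CORE_CONFS and the message text are illustrative assumptions.

class CannotModifyConfigError(Exception):
    """Stands in for org.apache.spark.sql.AnalysisException."""


# Hypothetical subset of Spark core confs rejected when
# spark.sql.legacy.setCommandRejectsSparkCoreConfs is true.
CORE_CONFS = {"spark.driver.host", "spark.driver.port", "spark.master"}


def require_non_static_conf(key: str, reject_core_confs: bool = True) -> None:
    """Reject core confs, but explain how they *can* be set."""
    if reject_core_confs and key in CORE_CONFS:
        raise CannotModifyConfigError(
            f"Cannot modify the value of a Spark config: {key}. "
            "Spark core configs must be set before the SparkSession is "
            "created, e.g. via SparkConf, spark-defaults.conf, or "
            "--conf on spark-submit; they cannot be changed with SET."
        )


# A rejected SET now carries actionable guidance instead of a bare error.
try:
    require_non_static_conf("spark.driver.host")
except CannotModifyConfigError as e:
    msg = str(e)
    print(msg)
```

The point is only the shape of the message: the existing exception text is kept as a prefix, and a hint about the supported configuration mechanisms is appended.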