dongjoon-hyun commented on a change in pull request #28661: URL: https://github.com/apache/spark/pull/28661#discussion_r432389246
########## File path: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ##########
@@ -1784,6 +1784,15 @@ object SQLConf {
       .version("3.0.0")
       .fallbackConf(ARROW_EXECUTION_ENABLED)

+  val PYSPARK_JVM_STACKTRACE_ENABLED =
+    buildConf("spark.sql.pyspark.jvmStacktrace.enabled")
+      .doc("When true, it shows the JVM stacktrace in the user-facing PySpark exception " +
+        "together with Python stacktrace. By default, it is disabled and hides JVM stacktrace " +
+        "and shows a Python-friendly exception only.")
+      .version("3.0.0")

Review comment:
   @gatorsmile, are you okay with targeting 3.0.0 here? Although we are in the RC stage, this PR looks worth backporting. (Also, the default is `false`.)

   One question for @HyukjinKwon: do we want this as a dynamic configuration instead of a static one? That is, do we want to be able to switch it on/off at runtime?
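To make the reviewed behavior concrete, here is a minimal pure-Python sketch (not Spark's actual implementation; `format_pyspark_error` and its parameters are hypothetical names) of what the config's doc string describes: with the flag off, the user sees only a Python-friendly message; with it on, the JVM stacktrace is appended.

```python
def format_pyspark_error(python_msg, jvm_stacktrace, jvm_stacktrace_enabled=False):
    """Mimic the described effect of spark.sql.pyspark.jvmStacktrace.enabled.

    The default (False) matches the config's default and hides the JVM side.
    """
    if jvm_stacktrace_enabled:
        # Flag on: show the Python message together with the JVM stacktrace.
        return python_msg + "\n\nJVM stacktrace:\n" + jvm_stacktrace
    # Flag off (default): Python-friendly message only.
    return python_msg


print(format_pyspark_error(
    "AnalysisException: cannot resolve 'x'",
    "at org.apache.spark.sql.catalyst.analysis...",
))
```

On the dynamic-vs-static question: since the entry is registered through `SQLConf.buildConf` rather than a static conf builder, it would presumably be a runtime SQL conf, settable per session (e.g. via `spark.conf.set("spark.sql.pyspark.jvmStacktrace.enabled", "true")`).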