HyukjinKwon commented on a change in pull request #28661:
URL: https://github.com/apache/spark/pull/28661#discussion_r432938342
##########
File path:
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
##########
@@ -1784,6 +1784,15 @@ object SQLConf {
.version("3.0.0")
.fallbackConf(ARROW_EXECUTION_ENABLED)
+ val PYSPARK_JVM_STACKTRACE_ENABLED =
+ buildConf("spark.sql.pyspark.jvmStacktrace.enabled")
+      .doc("When true, it shows the JVM stacktrace in the user-facing PySpark exception " +
+        "together with Python stacktrace. By default, it is disabled and hides JVM stacktrace " +
+        "and shows a Python-friendly exception only.")
+ .version("3.0.0")
Review comment:
It doesn't strictly have to be in Spark 3.0. I just wanted to get feedback from
users sooner, and thought it's worth trying in Spark 3.0 since technically we
only touch the error messages here.
I don't feel super strongly that it has to land in Spark 3.0 - it's okay to
retarget 3.1 if anyone feels strongly it should only go into master. Let me
know :-)
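For context, here is a sketch of how the new flag from the diff above could be enabled. The config key comes from this PR; the `spark-defaults.conf` placement and the runtime `spark.conf.set` call are just the usual ways Spark SQL configs are set, shown here as an illustration:

```
# spark-defaults.conf (hypothetical deployment file)
spark.sql.pyspark.jvmStacktrace.enabled  true

# Or at runtime from a PySpark session (assuming `spark` is an active SparkSession):
#   spark.conf.set("spark.sql.pyspark.jvmStacktrace.enabled", "true")
```

With the flag off (the default per the `.doc` text), PySpark users would see only the Python-friendly exception; turning it on appends the JVM stacktrace as well.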
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]