gerashegalov commented on code in PR #40372: URL: https://github.com/apache/spark/pull/40372#discussion_r1134801751
##########
python/pyspark/errors/exceptions/captured.py:
##########

```diff
@@ -65,8 +65,15 @@ def __str__(self) -> str:
         assert SparkContext._jvm is not None
         jvm = SparkContext._jvm
-        sql_conf = jvm.org.apache.spark.sql.internal.SQLConf.get()
-        debug_enabled = sql_conf.pysparkJVMStacktraceEnabled()
+
+        # SPARK-42752: default to True to see issues with initialization
+        debug_enabled = True
+        try:
+            sql_conf = jvm.org.apache.spark.sql.internal.SQLConf.get()
+            debug_enabled = sql_conf.pysparkJVMStacktraceEnabled()
+        except BaseException:
```

Review Comment:
   I advocate for keeping the likelihood of an unhelpful, unprintable exception during initialization to a minimum. I would not want to revisit this issue for other runtime exceptions.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org