gengliangwang commented on code in PR #53659:
URL: https://github.com/apache/spark/pull/53659#discussion_r2658512543


##########
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:
##########
@@ -452,9 +451,24 @@ private[hive] class SparkSQLCLIDriver extends CliDriver with Logging {
                 case _ => t.getMessage
               }
               err.println(msg)
-              if (format == ErrorMessageFormat.PRETTY &&
-                !sessionState.getIsSilent &&
-                (!t.isInstanceOf[AnalysisException] || t.getCause != null)) {
+              // Print stack traces based on format and error type:
+              // - DEBUG format: Always print stack traces (for debugging)
+              // - PRETTY format: Only for internal errors (SQLSTATE XX***)
+              // - MINIMAL/STANDARD formats: Never print stack traces (JSON only)
+              val shouldPrintStackTrace = format match {

Review Comment:
   ok, on second thought, the inconsistency seems reasonable. 
   However, the new log format `DEBUG` only seems to work for `spark-sql`, not
   for the other CLIs and applications. Shall we simply introduce a new
   configuration for the log output of `spark-sql`?
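
   For reference, a minimal sketch of how the truncated `shouldPrintStackTrace` match could be completed, following the three bullets in the diff comment. The `ErrorMessageFormat.DEBUG` value and the SQLSTATE-based internal-error check are assumptions about this PR, not its actual implementation:

   ```scala
   // Sketch only: assumes the PR adds ErrorMessageFormat.DEBUG and that internal
   // errors are identified by a SQLSTATE of class "XX" on org.apache.spark.SparkThrowable.
   val shouldPrintStackTrace = format match {
     // DEBUG: always print stack traces, regardless of the error type
     case ErrorMessageFormat.DEBUG => true
     // PRETTY: only internal errors (SQLSTATE XX***) get a stack trace
     case ErrorMessageFormat.PRETTY => t match {
       case st: SparkThrowable => Option(st.getSqlState).exists(_.startsWith("XX"))
       case _ => false
     }
     // MINIMAL / STANDARD: the JSON output already carries the error details
     case _ => false
   }
   if (shouldPrintStackTrace && !sessionState.getIsSilent) {
     t.printStackTrace(err)
   }
   ```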
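
   And a hypothetical shape for the configuration suggested above, as it might appear inside `object SQLConf`; the key name and default are illustrative only:

   ```scala
   // Illustrative only: a possible SQLConf entry gating stack-trace output in spark-sql.
   val CLI_PRINT_STACK_TRACE =
     buildConf("spark.sql.cli.printStackTrace")  // hypothetical config key
       .doc("When true, the spark-sql CLI prints the full stack trace of a failed " +
         "command in addition to the formatted error message.")
       .booleanConf
       .createWithDefault(false)
   ```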



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
