LuciferYang commented on code in PR #37976:
URL: https://github.com/apache/spark/pull/37976#discussion_r978238657
##########
sql/hive/src/test/resources/log4j2.properties:
##########
@@ -36,9 +36,9 @@ appender.file.fileName = target/unit-tests.log
appender.file.layout.type = PatternLayout
appender.file.layout.pattern = %d{HH:mm:ss.SSS} %t %p %c{1}: %m%n%ex
-# Set the logger level of File Appender to WARN
+# Set the logger level of File Appender to INFO
appender.file.filter.threshold.type = ThresholdFilter
-appender.file.filter.threshold.level = debug
+appender.file.filter.threshold.level = info
Review Comment:
https://github.com/apache/spark/commit/cd1d4110cfffb413ab585cf1cc8f1264243cb393
changed the threshold from info to debug, but didn't explain why.
After running
```
mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl
-Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive
```
the whole Spark directory is about 22G, of which
`sql/hive/target/unit-tests.log` alone is 12G. However, the debug-level logs in
that file seem worthless. Let's restore the log level to reduce disk space
usage and disk write pressure.
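
For context, this is a minimal sketch of the relevant appender section with the restored threshold. The layout and filter lines come from the diff above; the `appender.file.type`/`appender.file.name` lines are assumptions based on the standard Log4j2 File appender keys:

```
# File appender writing test output to target/unit-tests.log
appender.file.type = File
appender.file.name = File
appender.file.fileName = target/unit-tests.log
appender.file.layout.type = PatternLayout
appender.file.layout.pattern = %d{HH:mm:ss.SSS} %t %p %c{1}: %m%n%ex
# ThresholdFilter denies events below INFO, so DEBUG/TRACE never reach the file
appender.file.filter.threshold.type = ThresholdFilter
appender.file.filter.threshold.level = info
```

With the threshold at `debug`, every DEBUG event emitted during the Hive test suites is written to disk, which is what inflates the log to the size quoted above.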
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]