[ https://issues.apache.org/jira/browse/SPARK-40544?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yuming Wang reassigned SPARK-40544:
-----------------------------------
Assignee: Yang Jie
> The file size of `sql/hive/target/unit-tests.log` is too big
> ------------------------------------------------------------
>
> Key: SPARK-40544
> URL: https://issues.apache.org/jira/browse/SPARK-40544
> Project: Spark
> Issue Type: Improvement
> Components: SQL, Tests
> Affects Versions: 3.4.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
>
> SPARK-6908 changed the file appender log level threshold of the Hive UTs
> from INFO to DEBUG, but didn't explain why.
>
> When I run
> {code:java}
> mvn clean install -Phadoop-3 -Phadoop-cloud -Pmesos -Pyarn -Pkinesis-asl
> -Phive-thriftserver -Pspark-ganglia-lgpl -Pkubernetes -Phive {code}
> the size of the whole Spark directory is about 22 GB, and
> `sql/hive/target/unit-tests.log` alone is 12 GB. The DEBUG-level logs in this
> file seem worthless, yet they take up a lot of disk space. The file appender
> threshold is currently configured as:
>
> {code:java}
> # Set the logger level of File Appender to WARN
> log4j.appender.FA.Threshold = DEBUG {code}
> The original comment for this setting reads {{Set the logger level of File
> Appender to WARN}}, but the WARN level is not actually used.
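>
> A minimal sketch of one possible fix, assuming the log4j 1.x properties syntax
> already used by the Hive test config (whether WARN, to match the comment, or
> INFO is the right threshold is still an open question):
>
> {code:java}
> # Set the logger level of File Appender to WARN
> # Raising the threshold from DEBUG keeps unit-tests.log far smaller
> # and makes the effective level match the existing comment.
> log4j.appender.FA.Threshold = WARN {code}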
>