[ https://issues.apache.org/jira/browse/SPARK-25590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16667640#comment-16667640 ]
Ilan Filonenko commented on SPARK-25590:
----------------------------------------
Is this still the case with `kubernetes-model-4.1.0.jar`, which is what is now
being packaged as a result of the versioning refactor? (However, that version is
pinned to 3.0.0 as per SPARK-25828.)
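A quick way to re-check would be to list the contents of whatever model jar ends up in the assembly (the 4.1.0 name and path below are only assumptions):
{noformat}
$ jar tf /path/to/kubernetes-model-4.1.0.jar | grep log4j
{noformat}
If that still prints log4j.properties, the masking problem described below would presumably still apply.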
> kubernetes-model-2.0.0.jar masks default Spark logging config
> -------------------------------------------------------------
>
> Key: SPARK-25590
> URL: https://issues.apache.org/jira/browse/SPARK-25590
> Project: Spark
> Issue Type: Bug
> Components: Kubernetes
> Affects Versions: 2.4.0
> Reporter: Marcelo Vanzin
> Priority: Major
>
> That jar file, which is packaged when the k8s profile is enabled, has a log4j
> configuration embedded in it:
> {noformat}
> $ jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
> log4j.properties
> {noformat}
> As a result, Spark will always use that log4j configuration instead of its own
> default (log4j-defaults.properties), unless the user overrides it by somehow
> placing their own copy on the classpath ahead of the kubernetes jar.
> You can see this by running spark-shell. With the k8s jar on the classpath:
> {noformat}
> $ ./bin/spark-shell
> ...
> Setting default log level to "WARN".
> {noformat}
> Removing the k8s jar:
> {noformat}
> $ ./bin/spark-shell
> ...
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> {noformat}
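> Until the jar stops shipping that file, a user can likely work around it by getting their own log4j config ahead of the kubernetes jar, or by pointing log4j at a file explicitly. A rough sketch (assuming the stock template in conf/ is picked up before the packaged jars, and using log4j 1.x's standard system property):
> {noformat}
> # put a user config on the classpath via the conf directory
> $ cp conf/log4j.properties.template conf/log4j.properties
> # or bypass classpath ordering entirely
> $ ./bin/spark-shell --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties"
> {noformat}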
> The proper fix would be for the k8s jar to not ship that file, and then just
> upgrade the dependency in Spark, but if there's something easy we can do in
> the meantime...
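> For instance, one low-effort mitigation might be to strip the embedded config out of the packaged jar itself; a sketch only, assuming a standard zip tool and the jar path used above:
> {noformat}
> $ zip -d /path/to/kubernetes-model-2.0.0.jar log4j.properties
> {noformat}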
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]