[ https://issues.apache.org/jira/browse/SPARK-25590?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin updated SPARK-25590:
-----------------------------------
    Description: 
The kubernetes-model-2.0.0.jar file, which is packaged when the k8s profile is 
enabled, has a log4j configuration embedded in it:

{noformat}
$ jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
log4j.properties
{noformat}

As a result, Spark always uses that log4j configuration instead of its own 
default (log4j-defaults.properties), unless the user overrides it by putting 
their own configuration on the classpath ahead of the kubernetes jar.
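
One way to do that override (a sketch, assuming the launcher scripts still put 
the conf directory on the classpath ahead of the jars directory) is to copy the 
bundled template into place:

{noformat}
# Possible workaround (illustrative paths): a log4j.properties in the conf dir
# should be found before the copy embedded in kubernetes-model-2.0.0.jar.
$ cp conf/log4j.properties.template conf/log4j.properties
$ ./bin/spark-shell
{noformat}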

You can see the masking by running spark-shell. With the k8s jar on the classpath:

{noformat}
$ ./bin/spark-shell 
...
Setting default log level to "WARN"
{noformat}

Removing the k8s jar:

{noformat}
$ ./bin/spark-shell 
...
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
{noformat}

The proper fix would be for the k8s jar to stop shipping that file, followed by 
upgrading the dependency in Spark, but something easier that we can do in the 
meantime would be welcome.
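
As a possible stopgap (just an untested sketch; the path below is illustrative), 
the embedded config could be stripped from the copy of the jar that ships in 
the distribution:

{noformat}
# zip -d removes an entry from the archive in place, so the embedded
# log4j.properties no longer shadows Spark's log4j-defaults.properties.
$ zip -d jars/kubernetes-model-2.0.0.jar log4j.properties
{noformat}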

> kubernetes-model-2.0.0.jar masks default Spark logging config
> -------------------------------------------------------------
>
>                 Key: SPARK-25590
>                 URL: https://issues.apache.org/jira/browse/SPARK-25590
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Marcelo Vanzin
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
