GitHub user mridulm commented on the pull request:
https://github.com/apache/spark/pull/148#issuecomment-37774742
Check for use of SPARK_LOG4J_CONF in yarn/
I think primarily in
yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala
(The code has changed quite a bit since I wrote it, so I can only give
pointers, unfortunately.)
On Sun, Mar 16, 2014 at 1:32 PM, Patrick Wendell
<[email protected]>wrote:
> It's unfortunate that Hadoop publishes the log4j.properties file directly
> in its jars. This is exactly why we've avoided doing this in Spark, because
> it creates a weird situation where you can't easily control logging
> preferences.
>
> @sryza <https://github.com/sryza> @mridulm <https://github.com/mridulm> -
> I think what we need is a precedence order here:
>
> - If a user explicitly gives a log4j.properties, use that
>   (@mridul <https://github.com/mridul> - I didn't actually know we allowed
>   this in the YARN launcher, where is the code relevant to that?)
> - If a user jar has a log4j.properties file in it, we should use that.
> - If not, we can use the YARN cluster defaults when running on YARN.
>
> --
> Reply to this email directly or view it on GitHub:
> <https://github.com/apache/spark/pull/148#issuecomment-37769373>