Github user mridulm commented on the pull request:
https://github.com/apache/spark/pull/148#issuecomment-37766354
Sure, or merge it with the user-provided logging config when one is given
(which could be quite tricky, so maybe not a good idea).
I just want to ensure that users have the ability to customize and/or
override logging config for their specific app.
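The precedence being discussed (a client-supplied log4j.properties overrides the shipped container-log4j.properties default) can be sketched as follows. This is an illustrative sketch only, not Spark's actual API; the class and method names are hypothetical.

```java
// Hypothetical sketch of the proposed precedence: the class and method
// names are illustrative, not Spark's actual implementation.
public class Log4jConfigChooser {
    // Default config shipped for YARN containers, per the discussion above.
    static final String DEFAULT_CONFIG = "container-log4j.properties";

    // Returns the log4j config to use for the executor: the user-supplied
    // one if provided, otherwise the shipped default.
    static String chooseConfig(String userConfig) {
        return (userConfig != null && !userConfig.isEmpty())
                ? userConfig      // user override wins
                : DEFAULT_CONFIG; // fall back to the default
    }

    public static void main(String[] args) {
        System.out.println(chooseConfig(null));
        System.out.println(chooseConfig("my-log4j.properties"));
    }
}
```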
On Sun, Mar 16, 2014 at 11:55 AM, Sandy Ryza <[email protected]> wrote:
> @pwendell <https://github.com/pwendell> Spark's build excluding
> log4j.properties is not enough to keep it off the executor classpath.
> Executor classpaths include the Hadoop jars as installed locally on the
> cluster machines. And those include a log4j.properties.
>
> @mridulm <https://github.com/mridulm> My bad, I hadn't noticed that there
> was a way to pass a log4j.properties. In that case, I think the best thing
> would be to use container-log4j.properties by default, but, if the client
> gives one, to use that instead. What do you think?
>
> --
> Reply to this email directly or view it on GitHub
> <https://github.com/apache/spark/pull/148#issuecomment-37766301>.
>