[
https://issues.apache.org/jira/browse/SPARK-8009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14712666#comment-14712666
]
Philipp Hoffmann commented on SPARK-8009:
-----------------------------------------
I think this is already solved by SPARK-8798; I have linked the ticket.
> [Mesos] Allow provisioning of executor logging configuration
> -------------------------------------------------------------
>
> Key: SPARK-8009
> URL: https://issues.apache.org/jira/browse/SPARK-8009
> Project: Spark
> Issue Type: Improvement
> Components: Mesos
> Affects Versions: 1.3.1
> Environment: Mesos executor
> Reporter: Gerard Maas
> Labels: logging, mesos
>
> It's currently not possible to provide a custom logging configuration for the
> Mesos executors.
> Upon startup of the executor JVM, it loads a default config file from the
> Spark assembly, visible by this line in stderr:
> > Using Spark's default log4j profile:
> > org/apache/spark/log4j-defaults.properties
> That line comes from Logging.scala [1], where a default config is loaded if
> none is found on the classpath when the Spark Mesos executor starts up in
> the Mesos sandbox. At that point, none of the application-specific resources
> have been shipped yet, as the executor JVM is just starting up.
> To load a custom configuration file, it must already be in the sandbox
> before the executor JVM starts, and it must be added to the classpath in
> the startup command.
> For the classpath customization, it looks like it should be possible to
> pass a -Dlog4j.configuration property using 'spark.executor.extraClassPath',
> which is picked up at [2] and added to the command that starts the executor
> JVM; but the resource must already be on the host before we can do that.
> Therefore we need some means of 'shipping' the log4j.configuration file to
> the allocated executor.
> This all boils down to the need to ship extra files to the sandbox.
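As a submit-side sketch, under the assumption that the properties file has somehow already landed in the executor sandbox working directory (that shipping mechanism is exactly what this ticket asks for), the relevant settings might look as follows. The master URL and application jar are placeholders, and routing the -D flag through spark.executor.extraJavaOptions is an assumption layered on top of the extraClassPath idea above:

```shell
# Sketch only: assumes log4j-custom.properties is already present in the
# executor sandbox working directory (no mechanism ships it there today).
spark-submit \
  --master mesos://mesos-master:5050 \
  --conf "spark.executor.extraClassPath=." \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j-custom.properties" \
  my-app.jar
```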
> There's a workaround: open up the Spark assembly, replace the
> log4j-defaults.properties file, and pack it up again. That would work,
> although it is rather rudimentary, as people may use the same assembly for
> many jobs.
> Accessing the log4j API programmatically should probably also work (we
> didn't try that yet).
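As a sketch of the programmatic route mentioned above (untried in the ticket, and untested here), application code could point log4j 1.x at a properties file via PropertyConfigurator. The file name is a hypothetical example and would still have to exist in the sandbox:

```scala
import java.io.File
import org.apache.log4j.{Level, LogManager, PropertyConfigurator}

object CustomLogging {
  // Hypothetical file name; the file must already be in the working directory.
  val configPath = "log4j-custom.properties"

  def configure(): Unit = {
    if (new File(configPath).exists()) {
      // Re-read the logging config from the custom file, replacing defaults.
      PropertyConfigurator.configure(configPath)
    } else {
      // Fallback: adjust the root logger level programmatically instead.
      LogManager.getRootLogger.setLevel(Level.WARN)
    }
  }
}
```

This sidesteps the classpath question entirely, but it still depends on the properties file (or the hard-coded fallback) being available on the executor.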
> [1]
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/Logging.scala#L128
> [2]
> https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala#L77
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]