Re: Is there a way to turn on spark eventLog on the worker node?

2014-11-24 Thread Harihar Nahak
You can set the same parameters when launching an application: if you use
spark-submit, pass those variables with --conf, or set them through SparkConf.
Either way you can set up the logs for both the driver and the workers.
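
Roughly like this, for example (just a sketch; the app name, class, jar, and
event log directory below are placeholders, use whatever fits your cluster):

    // From the command line, pass the settings with --conf:
    //   spark-submit --conf spark.eventLog.enabled=true \
    //                --conf spark.eventLog.dir=hdfs:///spark-events \
    //                --class com.example.MyApp myapp.jar
    // Or set the same keys programmatically before creating the context:
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("my-app")
      .set("spark.eventLog.enabled", "true")
      .set("spark.eventLog.dir", "hdfs:///spark-events")
    val sc = new SparkContext(conf)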



-
--Harihar




Re: Is there a way to turn on spark eventLog on the worker node?

2014-11-24 Thread Marcelo Vanzin
Hello,

What exactly are you trying to see? Workers don't generate any events
that would be logged by enabling that config option. Workers generate
logs, and those are generally captured and saved to disk by the cluster
manager without you having to do anything.
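
If it helps, here is a quick driver-side sanity check (just a sketch; it
assumes the app is launched through spark-submit, which supplies the master
URL, and the app name is a placeholder). Event logging is an application-level
setting resolved on the driver, and the per-application event log file ends up
under spark.eventLog.dir; executor output, by contrast, lands in the worker's
own log directory managed by the cluster manager:

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("event-log-check"))
    // Application-level settings, as resolved on the driver:
    println(sc.getConf.get("spark.eventLog.enabled", "false"))
    println(sc.getConf.get("spark.eventLog.dir", "<not set>"))
    sc.stop()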

On Mon, Nov 24, 2014 at 7:46 PM, Xuelin Cao  wrote:
> Hi,
>
> I'm going to debug some Spark applications on our testing platform, and
> it would be helpful if we could see the event log on the worker nodes.
>
> I've tried turning on spark.eventLog.enabled and setting the
> spark.eventLog.dir parameter on the worker node. However, it doesn't work.
>
> I do have event logs on my driver node, and I know how to turn them on.
> However, the same settings don't work on the worker node.
>
> Can anyone clarify whether the event log is only available on the
> driver node?
>



-- 
Marcelo
