Re: Worker node log not showed

2017-06-08 Thread Eike von Seggern
2017-05-31 10:48 GMT+02:00, Paolo Patierno: > No, it's running in standalone mode as a Docker image on Kubernetes. > The only way I found was to access the "stderr" file created under the "work" directory in SPARK_HOME, but ... is it the right way? > I think that is the ...
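(For reference, a minimal sketch of the layout being discussed: in standalone mode each worker writes per-executor logs under its work directory, roughly as below; the app and executor IDs are placeholders.

$SPARK_HOME/work/<app-id>/<executor-id>/stderr
$SPARK_HOME/work/<app-id>/<executor-id>/stdout

The same stderr/stdout files are also linked from each executor's entry in the worker's web UI, which is usually the more convenient way to reach them.)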

Re: Worker node log not showed

2017-06-07 Thread Ryan
Alonso Isidoro Roman <alons...@gmail.com> > Sent: Wednesday, May 31, 2017 8:39 AM > To: Paolo Patierno > Cc: user@spark.apache.org > Subject: Re: Worker node log not showed > Are you running the code with yarn? > If so, figure out the applicationID through the web UI ...

Re: Worker node log not showed

2017-05-31 Thread Paolo Patierno
Alonso Isidoro Roman <alons...@gmail.com> Sent: Wednesday, May 31, 2017 8:39 AM To: Paolo Patierno Cc: user@spark.apache.org Subject: Re: Worker node log not showed Are you running the code with yarn? If so, figure out the applicationID through the web UI, then run the next command: yarn logs your_application_id ...

Re: Worker node log not showed

2017-05-31 Thread Alonso Isidoro Roman
Are you running the code with yarn? If so, figure out the applicationID through the web UI, then run the next command: yarn logs your_application_id  Alonso Isidoro Roman (about.me/alonso.isidoro.roman)
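(For completeness, the yarn CLI expects the application ID via a flag; the ID below is a placeholder, and as noted elsewhere in the thread this only helps when the job actually runs on YARN rather than in standalone mode.

yarn logs -applicationId application_1496214567890_0001)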

Worker node log not showed

2017-05-31 Thread Paolo Patierno
Hi all, I have a simple cluster with one master and one worker. On another machine I launch the driver, where at some point I have the following lines of code:

max.foreachRDD(rdd -> {
    LOG.info("*** max.foreachRDD");
    rdd.foreach(value -> {
        LOG.info("*** rdd.foreach");
    });
});
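For context, here is a minimal, self-contained sketch of that pattern (the class name, socket source, and batch interval are assumptions added for illustration, not taken from the original post). The point it demonstrates is the crux of this thread: the outer LOG.info runs on the driver and shows up in the driver's console, while the LOG.info inside rdd.foreach runs on the executors, so those lines only appear in the worker-side executor logs (e.g. the stderr files under the worker's "work" directory in standalone mode).

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ForeachRddLoggingExample {
    private static final Logger LOG = LoggerFactory.getLogger(ForeachRddLoggingExample.class);

    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("foreachRDD-logging-example");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Hypothetical socket source, just so there is a DStream to work with.
        JavaDStream<String> max = ssc.socketTextStream("localhost", 9999);

        max.foreachRDD(rdd -> {
            // Executed on the driver: appears in the driver's log/console.
            LOG.info("*** max.foreachRDD");
            rdd.foreach(value -> {
                // Executed on the executors: appears in each executor's stderr
                // on the worker, not in the driver's output.
                LOG.info("*** rdd.foreach");
            });
        });

        ssc.start();
        ssc.awaitTermination();
    }
}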