Spark with log4j
Hi Kalpesh,
If you are using Spark on YARN, it may not work, because you are writing logs
to files other than stdout/stderr, which YARN log aggregation does not pick up.
As I understand it, YARN only aggregates logs written to stdout/stderr, and the
local cache will be deleted (within a configured timeframe).
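One way to keep executor output visible to YARN's aggregation, as described above, is to route log4j output to the console (stdout/stderr) rather than to separate files. A minimal log4j.properties sketch along those lines (it mirrors Spark's default conf/log4j.properties.template; the INFO level is illustrative):

```
# Route everything to the console so YARN captures it via stdout/stderr
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```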
Hi Siva,
That command does not print any of the log.info messages I have written in the
application.
Thanks,
Kalpesh Jadhav
From: Siva [mailto:sbhavan...@gmail.com]
Sent: Tuesday, December 22, 2015 6:27 AM
To: Zhan Zhang
Cc: Kalpesh Jadhav; user@spark.apache.org
Subject: Re: Spark
Hi Kalpesh,
Just to add, you could use "yarn logs -applicationId <application ID>" to
see the aggregated logs once the application has finished.
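For example (the application ID below is a made-up placeholder; substitute the ID reported by spark-submit or shown in the YARN ResourceManager UI):

```
# Fetch the aggregated container logs for a finished application
yarn logs -applicationId application_1450000000000_0001 > app.log
```

Note that this only works after the application finishes and YARN has aggregated the logs (yarn.log-aggregation-enable must be true).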
Thanks,
Sivakumar Bhavanari.
On Mon, Dec 21, 2015 at 3:56 PM, Zhan Zhang wrote:
> Hi Kalpesh,
>
> If you are using spark on yarn, it may not work. Because you write log to
>
Hi Kalpesh,
If you are using Spark on YARN, it may not work, because you are writing logs
to files other than stdout/stderr, which YARN log aggregation does not pick up.
As I understand it, YARN only aggregates logs written to stdout/stderr, and the
local cache will be deleted (within a configured timeframe).
To check it, at a
> Still print logs on console only.
>
> Thanks,
>
> Kalpesh Jadhav.
>
> *From:* Ted Yu [mailto:yuzhih...@gmail.com]
> *Sent:* Friday, December 18, 2015 9:15 PM
> *To:* Kalpesh Jadhav
> *Cc:* user
> *Subject:* Re: Spark with log4j
Hi Ted,
Thanks for your response, but it doesn't solve my issue. It still prints logs
to the console only.
Thanks,
Kalpesh Jadhav.
From: Ted Yu [mailto:yuzhih...@gmail.com]
Sent: Friday, December 18, 2015 9:15 PM
To: Kalpesh Jadhav
Cc: user
Subject: Re: Spark with log4j
See this thread:
http://search-hadoop.com/m/q3RTtEor1vYWbsW
which mentioned:
SPARK-11105: Distribute the log4j.properties files from the client to the
executors
FYI
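Building on that ticket, a common workaround on Spark versions that do not ship the file automatically is to distribute a custom log4j.properties with the job yourself. A sketch using spark-submit (your-app.jar is a placeholder; the --files flag and the extraJavaOptions properties are standard Spark configuration):

```
spark-submit \
  --master yarn \
  --files log4j.properties \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  your-app.jar
```

--files ships log4j.properties into each container's working directory, where the -Dlog4j.configuration system property then picks it up on both driver and executors.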
On Fri, Dec 18, 2015 at 7:23 AM, Kalpesh Jadhav <
kalpesh.jad...@citiustech.com> wrote:
> Hi all,
>
>
>
> I am new to spark, I am