>> --conf
>>
>> "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///local/file/log4j.properties"
>>
>> FYI
>>
>> On Fri, Apr 29, 2016 at 6:03 AM, dev loper <spark...@gmail.com> wrote:
>>
>>> Hi Spark Team,
You can find the logs on the executor machine itself, or view them through the web UI.
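In yarn-cluster mode the executor logs live in the YARN containers rather than on the driver machine, so besides the web UI they can be pulled back with YARN's log-aggregation CLI once the application finishes. A minimal sketch; the application ID is a placeholder, and it assumes log aggregation is enabled on the cluster:

```shell
# Fetch the aggregated container logs (stdout/stderr of driver and
# executors) for a finished YARN application.
# Assumes yarn.log-aggregation-enable=true on the cluster.
# <application_id> is a placeholder, e.g. application_1461900000000_0001.
yarn logs -applicationId <application_id>
```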
Hi Spark Team,
I have asked the same question on Stack Overflow, with no luck yet:
http://stackoverflow.com/questions/36923949/where-to-find-logs-within-spark-rdd-processing-function-yarn-cluster-mode?noredirect=1#comment61419406_36923949
I am running my Spark application on a YARN cluster.
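For reference, the `spark.executor.extraJavaOptions` setting quoted above is typically combined with `--files` so the log4j properties file is shipped into each container's working directory, where the executor JVM can find it by its bare name. A sketch of the full submit command; the main class, jar name, and local path are illustrative, not from this thread:

```shell
# Ship a custom log4j.properties to every container and point both the
# driver and executor JVMs at the shipped copy (which lands in each
# container's working directory, hence the relative file: URL).
# com.example.MyApp and myapp.jar are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /local/file/log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
  --class com.example.MyApp \
  myapp.jar
```

With this in place, whatever the RDD processing functions log via log4j ends up in the container logs, retrievable through the YARN web UI or the `yarn logs` command.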