Hopefully someone will give you a more direct answer, but whenever I'm
having issues with log4j I always try -Dlog4j.debug=true. This will tell you
which log4j settings are getting picked up from where. I've spent countless
hours chasing typos in the file, for example.
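
For example, you could tack it onto the JVM options you're already passing
(a sketch based on the spark-submit command quoted below; everything else
stays as you have it):

    --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dlog4j.debug=true" \
    --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties -Dlog4j.debug=true"

The debug output should also tell you whether your SA appender is actually
attached anywhere. In log4j 1.x an appender only fires if some logger
references it, so unless the rest of your file expands main.logger into the
root logger, e.g.

    log4j.rootLogger=INFO,${main.logger}

(that line is my guess at what the full file might need, not something from
your snippet), the SocketAppender is defined but never used.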

On Mon, Sep 7, 2015 at 11:47 AM, Jeetendra Gangele <gangele...@gmail.com>
wrote:

> I also tried placing my customized log4j.properties file under
> src/main/resources, still no luck.
>
> Won't the above step modify the default YARN and Spark log4j.properties?
>
> Anyhow, it's still taking the log4j.properties from YARN.
>
>
>
> On 7 September 2015 at 19:25, Jeetendra Gangele <gangele...@gmail.com>
> wrote:
>
>> Anybody here to help?
>>
>>
>>
>> On 7 September 2015 at 17:53, Jeetendra Gangele <gangele...@gmail.com>
>> wrote:
>>
>>> Hi all, I have been trying to send my application logs to a socket
>>> so that we can feed them to Logstash and check the application logs.
>>>
>>> Here is my log4j.properties file:
>>>
>>> main.logger=RFA,SA
>>>
>>> log4j.appender.SA=org.apache.log4j.net.SocketAppender
>>> log4j.appender.SA.Port=4560
>>> log4j.appender.SA.RemoteHost=hadoop07.housing.com
>>> log4j.appender.SA.ReconnectionDelay=10000
>>> log4j.appender.SA.Application=NM-${user.dir}
>>> # Ignore messages below warning level from Jetty, because it's a bit verbose
>>> log4j.logger.org.spark-project.jetty=WARN
>>> log4j.logger.org.apache.hadoop=WARN
>>>
>>>
>>> I am launching my Spark job using the command below in yarn-cluster mode:
>>>
>>> spark-submit --name data-ingestion --master yarn-cluster \
>>>   --conf spark.custom.configuration.file=hdfs://10.1.6.186/configuration/binning-dev.conf \
>>>   --files /usr/hdp/current/spark-client/Runnable/conf/log4j.properties \
>>>   --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
>>>   --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
>>>   --class com.housing.spark.streaming.Binning \
>>>   /usr/hdp/current/spark-client/Runnable/dsl-data-ingestion-all.jar
>>>
>>>
>>> Can anybody please guide me on why I am not getting the logs on the socket?
>>>
>>>
>>> I followed many pages, listed below, without success:
>>>
>>> http://tech-stories.com/2015/02/12/setting-up-a-central-logging-infrastructure-for-hadoop-and-spark/#comment-208
>>>
>>> http://stackoverflow.com/questions/22918720/custom-log4j-appender-in-hadoop-2
>>>
>>> http://stackoverflow.com/questions/9081625/override-log4j-properties-in-hadoop
>>>
>>
>
