Re: Configuring logging properties for executor

2015-04-20 Thread Michael Ryabtsev
Hi,

I would like to report back on the first option Lan proposed: putting the
log4j.properties file at the root of my application jar.
It does not appear to work in my case, where the application is submitted
to Spark from the application code (not with spark-submit).
It seems that in this case the executor does not see the log4j.properties
located in the application jar and falls back to the default properties
file.
I conclude this from the fact that the log file is not created, and from
the following line in the executor console:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I've also tried adding the configuration below to the application code,
which had no apparent effect:

sparkConf.set("spark.executor.extraJavaOptions",
"-Dlog4j.configuration=log4j.properties");

On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang  wrote:

> Rename your log4j_special.properties file to log4j.properties and place it
> at the root of your jar file; you should be fine.
>
> If you are using Maven to build your jar, place the log4j.properties in
> the src/main/resources folder.
>
> However, please note that if another dependency jar on the classpath also
> contains a log4j.properties file at its root, this approach might not
> work, since the first log4j.properties file found on the classpath is the
> one that gets used.
>
> You can also run spark-submit --files log4j_special.properties …, which
> should transfer your log4j properties file to the worker nodes
> automatically, without you copying it manually.
>
> Lan
>
>
> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev  wrote:
> >
> > Hi all,
> >
> > I need to configure the Spark executor log4j.properties on a standalone
> > cluster.
> > It looks like placing the relevant properties file in the Spark
> > configuration folder and setting spark.executor.extraJavaOptions from
> > my application code:
> > sparkConf.set("spark.executor.extraJavaOptions",
> > "-Dlog4j.configuration=log4j_special.properties");
> > does the job, and the executor logs are written at the required place
> > and level. As far as I understand, this works because the Spark
> > configuration folder is on the class path, so passing the parameter
> > without a path works here.
> > However, I would like to avoid deploying these properties to each
> > worker's Spark configuration folder.
> > I wonder, if I put the properties in my application jar, is there any
> > way of telling the executor to load them?
> >
> > Thanks,


Re: Configuring logging properties for executor

2015-04-20 Thread Michael Ryabtsev
Oh, you are right, thanks.

On Mon, Apr 20, 2015 at 6:31 PM, Lan Jiang  wrote:

> Each application gets its own executor processes, so there should be no
> problem running them in parallel.
>
> Lan
>
>
>
> On Apr 20, 2015, at 10:25 AM, Michael Ryabtsev  wrote:
>
> Hi Lan,
>
> Thanks for the fast response. It could be a solution, if it works. I have
> more than one log4j properties file, for different run modes like
> debug/production, for the executor and for the application core. I think I
> would like to keep them separate. Then, I suppose I should give all the
> other properties files special names and keep the executor configuration
> with the default name? Can I conclude that going this way I will not be
> able to run several applications on the same cluster in parallel?
>
> Regarding spark-submit: I am not using it now (I submit from the code),
> but I think I should consider this option.
>
> Thanks.
>
> On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang  wrote:
>
>> Rename your log4j_special.properties file to log4j.properties and place
>> it at the root of your jar file; you should be fine.
>>
>> If you are using Maven to build your jar, place the log4j.properties in
>> the src/main/resources folder.
>>
>> However, please note that if another dependency jar on the classpath also
>> contains a log4j.properties file at its root, this approach might not
>> work, since the first log4j.properties file found on the classpath is the
>> one that gets used.
>>
>> You can also run spark-submit --files log4j_special.properties …, which
>> should transfer your log4j properties file to the worker nodes
>> automatically, without you copying it manually.
>>
>> Lan
>>
>>
>> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev  wrote:
>> >
>> > Hi all,
>> >
>> > I need to configure the Spark executor log4j.properties on a standalone
>> > cluster.
>> > It looks like placing the relevant properties file in the Spark
>> > configuration folder and setting spark.executor.extraJavaOptions from
>> > my application code:
>> > sparkConf.set("spark.executor.extraJavaOptions",
>> > "-Dlog4j.configuration=log4j_special.properties");
>> > does the job, and the executor logs are written at the required place
>> > and level. As far as I understand, this works because the Spark
>> > configuration folder is on the class path, so passing the parameter
>> > without a path works here.
>> > However, I would like to avoid deploying these properties to each
>> > worker's Spark configuration folder.
>> > I wonder, if I put the properties in my application jar, is there any
>> > way of telling the executor to load them?
>> >
>> > Thanks,


Re: Configuring logging properties for executor

2015-04-20 Thread Lan Jiang
Each application gets its own executor processes, so there should be no
problem running them in parallel.

Lan


> On Apr 20, 2015, at 10:25 AM, Michael Ryabtsev  wrote:
> 
> Hi Lan, 
> 
> Thanks for the fast response. It could be a solution, if it works. I have
> more than one log4j properties file, for different run modes like
> debug/production, for the executor and for the application core. I think I
> would like to keep them separate. Then, I suppose I should give all the
> other properties files special names and keep the executor configuration
> with the default name? Can I conclude that going this way I will not be
> able to run several applications on the same cluster in parallel?
>
> Regarding spark-submit: I am not using it now (I submit from the code),
> but I think I should consider this option.
> 
> Thanks.
> 
> On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang  wrote:
> Rename your log4j_special.properties file to log4j.properties and place it
> at the root of your jar file; you should be fine.
>
> If you are using Maven to build your jar, place the log4j.properties in
> the src/main/resources folder.
>
> However, please note that if another dependency jar on the classpath also
> contains a log4j.properties file at its root, this approach might not
> work, since the first log4j.properties file found on the classpath is the
> one that gets used.
>
> You can also run spark-submit --files log4j_special.properties …, which
> should transfer your log4j properties file to the worker nodes
> automatically, without you copying it manually.
> 
> Lan
> 
> 
> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev  wrote:
> >
> > Hi all,
> >
> > I need to configure the Spark executor log4j.properties on a standalone
> > cluster.
> > It looks like placing the relevant properties file in the Spark
> > configuration folder and setting spark.executor.extraJavaOptions from
> > my application code:
> > sparkConf.set("spark.executor.extraJavaOptions",
> > "-Dlog4j.configuration=log4j_special.properties");
> > does the job, and the executor logs are written at the required place
> > and level. As far as I understand, this works because the Spark
> > configuration folder is on the class path, so passing the parameter
> > without a path works here.
> > However, I would like to avoid deploying these properties to each
> > worker's Spark configuration folder.
> > I wonder, if I put the properties in my application jar, is there any
> > way of telling the executor to load them?
> >
> > Thanks,
> >



Re: Configuring logging properties for executor

2015-04-20 Thread Michael Ryabtsev
Hi Lan,

Thanks for the fast response. It could be a solution, if it works. I have
more than one log4j properties file, for different run modes like
debug/production, for the executor and for the application core. I think I
would like to keep them separate. Then, I suppose I should give all the
other properties files special names and keep the executor configuration
with the default name? Can I conclude that going this way I will not be
able to run several applications on the same cluster in parallel?
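
To make the question concrete, here is a minimal sketch of what I have in
mind, assuming one properties file per mode chosen at submit time; the
run.mode property and the log4j_<mode>.properties names are purely
illustrative:

// Hypothetical mode switch (run.mode is not an existing Spark property):
String mode = System.getProperty("run.mode", "production");
sparkConf.set("spark.executor.extraJavaOptions",
        "-Dlog4j.configuration=log4j_" + mode + ".properties");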

Regarding spark-submit: I am not using it now (I submit from the code),
but I think I should consider this option.

Thanks.

On Mon, Apr 20, 2015 at 5:59 PM, Lan Jiang  wrote:

> Rename your log4j_special.properties file to log4j.properties and place it
> at the root of your jar file; you should be fine.
>
> If you are using Maven to build your jar, place the log4j.properties in
> the src/main/resources folder.
>
> However, please note that if another dependency jar on the classpath also
> contains a log4j.properties file at its root, this approach might not
> work, since the first log4j.properties file found on the classpath is the
> one that gets used.
>
> You can also run spark-submit --files log4j_special.properties …, which
> should transfer your log4j properties file to the worker nodes
> automatically, without you copying it manually.
>
> Lan
>
>
> > On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev  wrote:
> >
> > Hi all,
> >
> > I need to configure the Spark executor log4j.properties on a standalone
> > cluster.
> > It looks like placing the relevant properties file in the Spark
> > configuration folder and setting spark.executor.extraJavaOptions from
> > my application code:
> > sparkConf.set("spark.executor.extraJavaOptions",
> > "-Dlog4j.configuration=log4j_special.properties");
> > does the job, and the executor logs are written at the required place
> > and level. As far as I understand, this works because the Spark
> > configuration folder is on the class path, so passing the parameter
> > without a path works here.
> > However, I would like to avoid deploying these properties to each
> > worker's Spark configuration folder.
> > I wonder, if I put the properties in my application jar, is there any
> > way of telling the executor to load them?
> >
> > Thanks,


Re: Configuring logging properties for executor

2015-04-20 Thread Lan Jiang
Rename your log4j_special.properties file to log4j.properties and place it
at the root of your jar file; you should be fine.

If you are using Maven to build your jar, place the log4j.properties in
the src/main/resources folder.

However, please note that if another dependency jar on the classpath also
contains a log4j.properties file at its root, this approach might not
work, since the first log4j.properties file found on the classpath is the
one that gets used.

You can also run spark-submit --files log4j_special.properties …, which
should transfer your log4j properties file to the worker nodes
automatically, without you copying it manually.
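
For example, something along these lines; the class and jar names are
placeholders, and the file: prefix should make log4j read the shipped
copy from the executor working directory rather than from the classpath:

spark-submit --files log4j_special.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j_special.properties" \
  --class com.example.MyApp my-app.jar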

Lan


> On Apr 20, 2015, at 9:26 AM, Michael Ryabtsev  wrote:
> 
> Hi all,
> 
> I need to configure the Spark executor log4j.properties on a standalone
> cluster.
> It looks like placing the relevant properties file in the Spark
> configuration folder and setting spark.executor.extraJavaOptions from
> my application code:
> sparkConf.set("spark.executor.extraJavaOptions",
> "-Dlog4j.configuration=log4j_special.properties");
> does the job, and the executor logs are written at the required place
> and level. As far as I understand, this works because the Spark
> configuration folder is on the class path, so passing the parameter
> without a path works here.
> However, I would like to avoid deploying these properties to each
> worker's Spark configuration folder.
> I wonder, if I put the properties in my application jar, is there any
> way of telling the executor to load them?
> 
> Thanks,



Configuring logging properties for executor

2015-04-20 Thread Michael Ryabtsev
Hi all,

I need to configure the Spark executor log4j.properties on a standalone
cluster.
It looks like placing the relevant properties file in the Spark
configuration folder and setting spark.executor.extraJavaOptions from
my application code:
sparkConf.set("spark.executor.extraJavaOptions",
"-Dlog4j.configuration=log4j_special.properties");
does the job, and the executor logs are written at the required place
and level. As far as I understand, this works because the Spark
configuration folder is on the class path, so passing the parameter
without a path works here.
However, I would like to avoid deploying these properties to each
worker's Spark configuration folder.
I wonder, if I put the properties in my application jar, is there any
way of telling the executor to load them?
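
For concreteness, a minimal sketch of the kind of properties file I mean;
the appender, file path, and level below are just an example, not my
actual configuration:

# log4j_special.properties (illustrative)
log4j.rootLogger=INFO, FILE
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.File=/var/log/spark/executor.log
log4j.appender.FILE.MaxFileSize=10MB
log4j.appender.FILE.MaxBackupIndex=5
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n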

Thanks,



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Configuring-logging-properties-for-executor-tp22572.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
