[jira] [Commented] (SPARK-16784) Configurable log4j settings

2019-02-25 Thread Narcis Andrei Moga (JIRA)


[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16776633#comment-16776633
 ] 

Narcis Andrei Moga commented on SPARK-16784:


I have the same problem after migrating from Spark 2.2.1 to 2.4.0, with deploy 
mode cluster and the standalone cluster manager (it does not happen in client deploy mode).

I tested in Docker, and all required files are present in all containers (1 master 
& 2 workers; Spark has no extra configuration for this test, it is just untarred).

*1) Executor command observed in the stderr file*

Spark Executor Command: "/srv/java/jdk/bin/java" "-cp" 
"/usr/lib/spark/conf/:/usr/lib/spark/jars/*" "-Xmx1024M" 
"-Dspark.driver.port=45431" "-Dspark.cassandra.connection.port=9042" 
"-Dspark.rpc.askTimeout=10s" "-Dspark.application.ldap.port=55389" 
_*"-Duser.timezone=UTC"*_ 
_*"-Dlog4j.configuration=file:///log4j.properties.executor"*_ 
"-Dcom.sun.management.jmxremote" 
"-Dcom.sun.management.jmxremote.authenticate=false" 
"-Dcom.sun.management.jmxremote.local.only=false" 
"-Dcom.sun.management.jmxremote.ssl=false" "-Djava.net.preferIPv4Stack=true" 
"-Dcom.sun.management.jmxremote.port=0" 
"-Djava.util.logging.config.file=/jmx-logging.properties" 
"org.apache.spark.executor.CoarseGrainedExecutorBackend" "--driver-url" 
"spark://CoarseGrainedScheduler@c1-spark-executor2:45431" "--executor-id" "1" 
"--hostname" "172.18.0.22" "--cores" "1" "--app-id" "app-20190224171936-0010" 
"--worker-url" 
"spark://Worker@172.18.0.22:36555"

*2) Partial command of the Driver observed in the stderr file*

Launch Command: "/srv/java/jdk/bin/java" "-cp" 
"/usr/lib/spark/conf/:/usr/lib/spark/jars/*" "-Xmx1024M" 
_*"-Dspark.driver.extraJavaOptions=-Duser.timezone=UTC 
-Dlog4j.configuration=file:///log4j.properties.driver"*_
"-Dspark.kafka.ppu.topic.name=..." 


*3) Submit command*

{code}
spark-submit \
  --deploy-mode cluster \
  --master spark://172.18.0.20:7077 \
  --properties-file /application.properties \
  --class com... \
  /logs-correlation-2.4.1-1.noarch.jar
{code}

*4) application.properties contains*

{code}
spark.driver.extraJavaOptions=-Duser.timezone=UTC -Dlog4j.configuration=file:///log4j.properties.driver
spark.executor.extraJavaOptions=-Duser.timezone=UTC -Dlog4j.configuration=file:///log4j.properties.executor
{code}
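For what it's worth, the difference between the two launch commands above can be shown with plain argument splitting: a correct launcher expands the extraJavaOptions value into separate JVM flags, while the cluster-mode driver command in 2) carries the whole value inside a single {{-Dspark.driver.extraJavaOptions=...}} system property, which log4j never inspects. A minimal sketch (Python, illustrative only; the {{opts}} value mirrors the properties file above):

```python
import shlex

# Value of spark.driver.extraJavaOptions from the properties file above
opts = "-Duser.timezone=UTC -Dlog4j.configuration=file:///log4j.properties.driver"

# What a correct launcher should produce: each -D flag as its own JVM argument
jvm_args = shlex.split(opts)
print(jvm_args)
# → ['-Duser.timezone=UTC', '-Dlog4j.configuration=file:///log4j.properties.driver']

# What the cluster-mode driver command in 2) shows instead: the entire value
# embedded in one system property, so log4j never sees log4j.configuration
broken = "-Dspark.driver.extraJavaOptions=" + opts
print(len(shlex.split(broken)))  # → 2 (still not separate -D flags for the JVM)
```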

 

 

> Configurable log4j settings
> ---
>
> Key: SPARK-16784
> URL: https://issues.apache.org/jira/browse/SPARK-16784
> Project: Spark
>  Issue Type: Improvement
>Affects Versions: 2.0.0, 2.1.0
>Reporter: Michael Gummelt
>Priority: Major
>
> I often want to change the logging configuration on a single spark job.  This 
> is easy in client mode.  I just modify log4j.properties.  It's difficult in 
> cluster mode, because I need to modify the log4j.properties in the 
> distribution in which the driver runs.  I'd like a way of setting this 
> dynamically, such as a java system property.  Some brief searching showed 
> that log4j doesn't seem to accept such a property, but I'd like to open up 
> this idea for further comment.  Maybe we can find a solution.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Commented] (SPARK-16784) Configurable log4j settings

2017-07-24 Thread HanCheol Cho (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16099477#comment-16099477
 ] 

HanCheol Cho commented on SPARK-16784:
--

Hi, 

I used the following options, which allow both the driver and the executors to use a custom 
log4j.properties:
{code}
spark-submit \
  --driver-java-options=-Dlog4j.configuration=conf/log4j.properties \
  --files conf/log4j.properties \
  --conf "spark.executor.extraJavaOptions='-Dlog4j.configuration=log4j.properties'" \
  ...
{code}
I used a local log4j.properties file for the driver, and the copy in the 
distributed cache (shipped via the --files option) for the executors.
As shown above, the paths for the driver and the executors are different.

I hope it might be useful to the others.






[jira] [Commented] (SPARK-16784) Configurable log4j settings

2017-06-07 Thread Irina Truong (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16041856#comment-16041856
 ] 

Irina Truong commented on SPARK-16784:
--

In 2.1.0, setting "spark.driver.extraJavaOptions" to 
"-Dlog4j.configuration=file:/home/hadoop/log4j.properties" in SparkConf 
seemed to work.

In 2.1.1, it does not work anymore, but setting it via "--driver-java-options" 
still works.

Is this a bug in 2.1.1?




[jira] [Commented] (SPARK-16784) Configurable log4j settings

2017-04-11 Thread Josh Bacon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15964471#comment-15964471
 ] 

Josh Bacon commented on SPARK-16784:


[~tscholak]  I have not created a follow-up issue for this. For your situation 
I'd suggest trying --driver-java-options='..' instead of 
--conf='spark.driver.extraJavaOptions=...', because the latter is applied after 
the driver JVM actually starts (too late for log4j). For my use case, I abandoned 
attempts to configure log4j for executors, but I was able to customize 
driver/application logs in both cluster and client mode (standalone) by baking 
my log4j.properties files into my app's uber jar resources. I think what this 
issue needs is a new feature for distributing files such as log4j.properties in the 
cluster before the actual Spark driver starts, which I'd imagine is not 
pressing enough to warrant actual development at the moment.




[jira] [Commented] (SPARK-16784) Configurable log4j settings

2017-04-07 Thread Torsten Scholak (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15960922#comment-15960922
 ] 

Torsten Scholak commented on SPARK-16784:
-

I'm having this exact problem. I need to be able to change the log settings 
depending on the job and/or the application. The method illustrated above, i.e. 
specifying

spark.driver.extraJavaOptions=-Dlog4j.configuration=log4j.properties,
spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties

using spark-submit with "--files log4j.properties", does indeed NOT work.
However, I was surprised to find it suggested as a solution on Stack Overflow 
(http://stackoverflow.com/questions/28454080/how-to-log-using-log4j-to-local-file-system-inside-a-spark-application-that-runs),
 since it directly contradicts the issue described in this ticket.

[~jbacon] have you created a follow-up ticket?




[jira] [Commented] (SPARK-16784) Configurable log4j settings

2017-03-13 Thread Josh Bacon (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15907766#comment-15907766
 ] 

Josh Bacon commented on SPARK-16784:


From what I've seen, this limitation also exists with the Spark Standalone 
cluster manager.

There doesn't appear to be a way to supply custom log4j files for the driver and 
executor JVMs on a per-application/submission basis. Spark appears to 
provision configuration files (via the --files option) after the driver/executor 
JVMs have already started. A workaround exists: include log4j files in the 
classpath of your application's uber jar (i.e. /src/main/resources/), and then 
append the following spark-submit options:

--driver-java-options '-Dlog4j.configuration=jar:file:your-application-uber.jar!/your-custom-driver-log4j.properties -Dlog4j.debug'

Unfortunately, this does not appear to work for executor log4j, because the 
executor JVM appears to start before the your-application-uber.jar file is 
provisioned. In the driver's case, provisioning takes place before the driver JVM 
starts, so you are able to reference the uber jar by a path relative to the 
driver's working directory.
 
THIS DOESN'T WORK:
--conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=jar:file:your-application.jar!/your-custom-executor-log4j.properties -Dlog4j.debug'

I'm not familiar with the internals, but if this warrants a new JIRA ticket, 
let me know and I can create one and work out a proper description!
Thanks
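The mechanics of the uber-jar workaround above can be sketched without Spark or log4j at all: a jar is just a zip archive, and the {{jar:file:<path>!/<resource>}} URL resolves as soon as the archive exists on the local filesystem. A minimal sketch (Python, illustrative only; the file names are hypothetical and mirror the options above):

```python
import os
import tempfile
import zipfile

# Build a stand-in "uber jar" containing a log4j config at its root,
# as if it had been placed under /src/main/resources/
workdir = tempfile.mkdtemp()
jar_path = os.path.join(workdir, "your-application-uber.jar")
with zipfile.ZipFile(jar_path, "w") as jar:
    jar.writestr("your-custom-driver-log4j.properties",
                 "log4j.rootLogger=DEBUG, console\n")

# The URL form passed to -Dlog4j.configuration: jar:file:<path>!/<resource>
url = "jar:file:" + jar_path + "!/your-custom-driver-log4j.properties"

# The resource is resolvable as soon as the jar exists locally -- which is
# why this works for the driver (jar provisioned before the JVM starts) but
# not for executors (JVM starts before the jar is shipped)
with zipfile.ZipFile(jar_path) as jar:
    print("your-custom-driver-log4j.properties" in jar.namelist())  # → True
```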






[jira] [Commented] (SPARK-16784) Configurable log4j settings

2016-08-11 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15417889#comment-15417889
 ] 

Sean Owen commented on SPARK-16784:
---

Oh, I really meant {{log4j.configuration}} to specify your own config.




[jira] [Commented] (SPARK-16784) Configurable log4j settings

2016-08-11 Thread Michael Gummelt (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15417856#comment-15417856
 ] 

Michael Gummelt commented on SPARK-16784:
-

{{log4j.debug=true}} only results in log4j printing its own debugging messages.  It 
doesn't turn on debug logging for the application.




[jira] [Commented] (SPARK-16784) Configurable log4j settings

2016-07-29 Thread Sean Owen (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-16784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15399275#comment-15399275
 ] 

Sean Owen commented on SPARK-16784:
---

You can configure log4j on the command line with system properties to some 
extent. Does that cover it? 
https://logging.apache.org/log4j/1.2/faq.html#sysprops
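The key point behind those system properties is that log4j 1.2 reads them once, when its {{LogManager}} class initializes at JVM start; setting them later (e.g. from inside the application) has no effect on logging. A minimal sketch of reading them the way log4j does (illustrative only; no log4j dependency, and the config path is hypothetical):

```java
public class Log4jSyspropsSketch {
    public static void main(String[] args) {
        // Simulate -Dlog4j.configuration=... and -Dlog4j.debug=true being
        // passed on the JVM command line before log4j initializes
        System.setProperty("log4j.configuration", "file:/tmp/custom-log4j.properties");
        System.setProperty("log4j.debug", "true");

        // log4j's LogManager consults these properties during class
        // initialization; this is why they must be JVM arguments, not
        // values set after the driver/executor JVM is already running
        String config = System.getProperty("log4j.configuration");
        boolean debug = Boolean.parseBoolean(System.getProperty("log4j.debug"));

        System.out.println(config + " debug=" + debug);
        // → file:/tmp/custom-log4j.properties debug=true
    }
}
```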
