[
https://issues.apache.org/jira/browse/SPARK-11927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Marcelo Vanzin resolved SPARK-11927.
------------------------------------
Resolution: Invalid
Hi Alex,
We don't use JIRA for questions (in spite of that option being there). Please
use the users mailing list for that (see
http://spark.apache.org/community.html).
> configure log4j properties with spark-submit
> ---------------------------------------------
>
> Key: SPARK-11927
> URL: https://issues.apache.org/jira/browse/SPARK-11927
> Project: Spark
> Issue Type: Question
> Components: Spark Submit
> Affects Versions: 1.5.1
> Reporter: Alex Kazantsev
> Priority: Minor
>
> How does one properly configure log4j properties on a worker, per application,
> using the spark-submit script?
> Currently, setting --conf
> 'spark.executor.extraJavaOptions=-Dlog4j.configuration=file:"log4j.properties"'
> together with --files log4j.properties does not work: according to the worker
> logs, the specified log4j configuration is loaded before any files are
> downloaded from the driver. Is this a bug or a feature? Is it possible to
> reconfigure the log4j properties after the properties file has been downloaded
> from the driver?
> The application was submitted with the following script:
> {noformat}
> exec /opt/spark/current/bin/spark-submit \
> --name App \
> --master spark://master:17079 \
> --executor-memory 4G \
> --total-executor-cores 4 \
> --driver-java-options '-Dspark.ui.port=4056 -Dconfig.file=application.conf
> -Dlog4j.configuration=file:"./log4j.properties"' \
> --conf 'spark.executor.extraJavaOptions=-XX:+UseParallelGC
> -Duser.timezone=GMT -Dconfig.file=application.conf
> -Dlog4j.configuration=file:"log4j.properties"' \
> --files application.conf,log4j.properties \
> --class default.Main \
> App.jar $*
> {noformat}
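> One workaround I could try (a sketch, not what was actually submitted) is to
> sidestep the startup race by pointing -Dlog4j.configuration at an absolute
> path that already exists on every worker node before the executor JVM starts,
> e.g. a file placed there by configuration management; /etc/spark/log4j.properties
> below is a hypothetical path:
> {noformat}
> exec /opt/spark/current/bin/spark-submit \
>   --name App \
>   --master spark://master:17079 \
>   --conf 'spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/etc/spark/log4j.properties' \
>   --class default.Main \
>   App.jar "$@"
> {noformat}
> Since log4j reads that file at JVM startup, no driver download is needed for
> logging to be configured.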
> Worker logs:
> {noformat}
> log4j:ERROR Could not read configuration file from URL
> [file:log4j.properties].
> java.io.FileNotFoundException: log4j.properties (No such file or directory)
> at java.io.FileInputStream.open(Native Method)
> at java.io.FileInputStream.<init>(FileInputStream.java:146)
> at java.io.FileInputStream.<init>(FileInputStream.java:101)
> at
> sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90)
> at
> sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188)
> at
> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
> at
> org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
> at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
> at org.apache.spark.Logging$class.initializeLogging(Logging.scala:122)
> at
> org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:107)
> at org.apache.spark.Logging$class.log(Logging.scala:51)
> at
> org.apache.spark.executor.CoarseGrainedExecutorBackend$.log(CoarseGrainedExecutorBackend.scala:136)
> at
> org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:147)
> at
> org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:250)
> at
> org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
> log4j:ERROR Ignoring configuration file [file:log4j.properties].
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 15/11/23 11:47:30 INFO CoarseGrainedExecutorBackend: Registered signal
> handlers for [TERM, HUP, INT]
> 15/11/23 11:47:30 WARN NativeCodeLoader: Unable to load native-hadoop library
> for your platform... using builtin-java classes where applicable
> 15/11/23 11:47:30 INFO SecurityManager: Changing view acls to: root
> 15/11/23 11:47:30 INFO SecurityManager: Changing modify acls to: root
> 15/11/23 11:47:30 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
> 15/11/23 11:47:31 INFO Slf4jLogger: Slf4jLogger started
> 15/11/23 11:47:31 INFO Remoting: Starting remoting
> 15/11/23 11:47:31 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://[email protected]:33953]
> 15/11/23 11:47:31 INFO Utils: Successfully started service
> 'driverPropsFetcher' on port 33953.
> 15/11/23 11:47:31 INFO SecurityManager: Changing view acls to: root
> 15/11/23 11:47:31 INFO SecurityManager: Changing modify acls to: root
> 15/11/23 11:47:31 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
> 15/11/23 11:47:31 INFO RemoteActorRefProvider$RemotingTerminator: Shutting
> down remote daemon.
> 15/11/23 11:47:32 INFO RemoteActorRefProvider$RemotingTerminator: Remote
> daemon shut down; proceeding with flushing remote transports.
> 15/11/23 11:47:32 INFO RemoteActorRefProvider$RemotingTerminator: Remoting
> shut down.
> 15/11/23 11:47:32 INFO Slf4jLogger: Slf4jLogger started
> 15/11/23 11:47:32 INFO Remoting: Starting remoting
> 15/11/23 11:47:32 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://[email protected]:39111]
> 15/11/23 11:47:32 INFO Utils: Successfully started service 'sparkExecutor' on
> port 39111.
> 15/11/23 11:47:32 INFO DiskBlockManager: Created local directory at
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/blockmgr-2ca7e526-1fc8-445d-a7bc-bcb4068ec6c5
> 15/11/23 11:47:32 INFO MemoryStore: MemoryStore started with capacity 4.1 GB
> 15/11/23 11:47:32 INFO CoarseGrainedExecutorBackend: Connecting to driver:
> akka.tcp://[email protected]:60791/user/CoarseGrainedScheduler
> 15/11/23 11:47:32 INFO WorkerWatcher: Connecting to worker
> akka.tcp://[email protected]:17080/user/Worker
> 15/11/23 11:47:32 INFO WorkerWatcher: Successfully connected to
> akka.tcp://[email protected]:17080/user/Worker
> 15/11/23 11:47:32 INFO CoarseGrainedExecutorBackend: Successfully registered
> with driver
> 15/11/23 11:47:32 INFO Executor: Starting executor ID 0 on host 10.1.1.102
> 15/11/23 11:47:32 INFO Utils: Successfully started service
> 'org.apache.spark.network.netty.NettyBlockTransferService' on port 50793.
> 15/11/23 11:47:32 INFO NettyBlockTransferService: Server created on 50793
> 15/11/23 11:47:32 INFO BlockManagerMaster: Trying to register BlockManager
> 15/11/23 11:47:32 INFO BlockManagerMaster: Registered BlockManager
> 15/11/23 11:47:33 INFO CoarseGrainedExecutorBackend: Got assigned task 2
> 15/11/23 11:47:33 INFO Executor: Running task 2.0 in stage 0.0 (TID 2)
> 15/11/23 11:47:33 INFO Executor: Fetching
> http://10.1.1.100:36308/files/application.conf with timestamp 1448279248917
> 15/11/23 11:47:33 INFO Utils: Fetching
> http://10.1.1.100:36308/files/application.conf to
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/fetchFileTemp1187637846804218655.tmp
> 15/11/23 11:47:33 INFO Utils: Copying
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/-474154711448279248917_cache
> to /tmp/spark/work/app-20151123154729-4851/0/./application.conf
> 15/11/23 11:47:33 INFO Executor: Fetching
> http://10.1.1.100:36308/files/log4j.properties with timestamp 1448279248928
> 15/11/23 11:47:33 INFO Utils: Fetching
> http://10.1.1.100:36308/files/log4j.properties to
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/fetchFileTemp1241687594284329302.tmp
> 15/11/23 11:47:33 INFO Utils: Copying
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/7353827741448279248928_cache
> to /tmp/spark/work/app-20151123154729-4851/0/./log4j.properties
> 15/11/23 11:47:33 INFO Executor: Fetching
> http://10.1.1.100:36308/jars/App.jar with timestamp 1448279248776
> 15/11/23 11:47:33 INFO Utils: Fetching http://10.1.1.100:36308/jars/App.jar
> to
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/fetchFileTemp5193813948784260413.tmp
> 15/11/23 11:47:34 INFO Utils: Copying
> /tmp/spark/tmp/spark-7842cb37-6048-4b42-a926-034b42366dd5/executor-55bea81c-3bc2-4e96-9090-33d968a91a10/spark-b655bb7f-3ca9-41f9-90e1-110ef258e5ac/-11786589191448279248776_cache
> to /tmp/spark/work/app-20151123154729-4851/0/./App.jar
> 15/11/23 11:47:35 INFO Executor: Adding
> file:/tmp/spark/work/app-20151123154729-4851/0/./App.jar to class loader
> {noformat}
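> As the log shows, the downloaded log4j.properties does end up in the executor
> working directory, just too late for log4j's static initialization. One sketch
> of re-applying it from application code (assuming log4j 1.x is on the
> classpath and the executor's working directory is the app work dir) would be:
> {noformat}
> // Illustrative sketch only; run early in executor-side code, e.g. at the
> // top of a mapPartitions closure, after --files have been fetched.
> import java.io.File
> import org.apache.log4j.PropertyConfigurator
>
> val props = new File("log4j.properties") // fetched into the executor work dir
> if (props.isFile) {
>   // Re-reads the properties file and reconfigures the live log4j hierarchy.
>   PropertyConfigurator.configure(props.getAbsolutePath)
> }
> {noformat}
> This does not affect log lines emitted before the closure first runs, but all
> subsequent executor logging follows the downloaded configuration.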
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)