vinodkc commented on code in PR #41746:
URL: https://github.com/apache/spark/pull/41746#discussion_r1247986697
##########
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala:
##########
@@ -311,13 +311,24 @@ class CoarseGrainedSchedulerBackend(scheduler: TaskSchedulerImpl, val rpcEnv: Rp
       decommissionExecutors(Array((executorId, v._1)), v._2, v._3)
       unknownExecutorsPendingDecommission.invalidate(executorId)
     })
+    // propagate current log level to new executor only if flag is true
+    if (conf.get(EXECUTOR_ALLOW_SYNC_LOG_LEVEL)) {
+      data.executorEndpoint.send(RefreshExecutor(Map("logLevel" -> Utils.getLogLevel)))
+    }
Review Comment:
The actual issue is that with `spark.log.level` in `sc.conf`, `--conf
spark.log.level`, or `sc.setLogLevel`, the executor log level cannot be changed;
it changes only the driver-side log level.
To debug issues on the executor side, the only option currently available is to
change the log level by modifying the log4j/log4j2 properties file, which forces
the user to maintain custom log4j/log4j2 files for each application.
With this PR, the current log level applied through `spark.log.level` in
`sc.conf`, `--conf spark.log.level`, or `sc.setLogLevel` is passed on to the
executors.
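The gated send in the diff above can be sketched in isolation. This is a hypothetical, self-contained illustration (not Spark's actual RPC classes): `ExecutorEndpoint`, `RecordingEndpoint`, and `maybeSyncLogLevel` are stand-ins invented here to show the shape of the logic, i.e. the driver pushes its current log level to a newly registered executor endpoint only when the sync flag is enabled.

```scala
// Hypothetical message carrying executor-side settings to refresh,
// mirroring the RefreshExecutor(Map("logLevel" -> ...)) send in the PR.
case class RefreshExecutor(updates: Map[String, String])

// Stand-in for an executor's RPC endpoint reference.
trait ExecutorEndpoint {
  def send(msg: Any): Unit
}

// Test double that records every message it receives.
class RecordingEndpoint extends ExecutorEndpoint {
  var received: List[Any] = Nil
  override def send(msg: Any): Unit = received = msg :: received
}

object LogLevelSync {
  // Only propagate the driver's current log level when the flag is on,
  // matching the EXECUTOR_ALLOW_SYNC_LOG_LEVEL guard in the diff.
  def maybeSyncLogLevel(allowSync: Boolean, currentLevel: String,
                        endpoint: ExecutorEndpoint): Unit = {
    if (allowSync) {
      endpoint.send(RefreshExecutor(Map("logLevel" -> currentLevel)))
    }
  }
}
```

With the flag off, no message reaches the endpoint, so executors keep whatever level their log4j/log4j2 configuration set; with it on, each new executor receives the driver's current level at registration time.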
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]