This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.4 by this push:
new f219c113f95 [SPARK-42357][CORE] Log `exitCode` when `SparkContext.stop` starts
f219c113f95 is described below
commit f219c113f955a9c7f21eab22becc007e43586f16
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Mon Feb 6 11:23:53 2023 -0800
[SPARK-42357][CORE] Log `exitCode` when `SparkContext.stop` starts
### What changes were proposed in this pull request?
This PR aims to log `exitCode` when `SparkContext.stop` starts, providing a clear boundary after which log messages from user jobs can be safely ignored.
### Why are the changes needed?
This PR adds the following log.
```
23/02/06 02:12:55 INFO SparkContext: SparkContext is stopping with exitCode 0.
```
In the simplest case, it stops like the following.
```
$ bin/spark-submit examples/src/main/python/pi.py
...
Pi is roughly 3.147080
23/02/06 02:12:55 INFO SparkContext: SparkContext is stopping with exitCode 0.
23/02/06 02:12:55 INFO AbstractConnector: Stopped Spark@1cb72b8{HTTP/1.1, (http/1.1)}{localhost:4040}
23/02/06 02:12:55 INFO SparkUI: Stopped Spark web UI at http://localhost:4040
23/02/06 02:12:55 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
23/02/06 02:12:55 INFO MemoryStore: MemoryStore cleared
23/02/06 02:12:55 INFO BlockManager: BlockManager stopped
23/02/06 02:12:55 INFO BlockManagerMaster: BlockManagerMaster stopped
23/02/06 02:12:55 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
23/02/06 02:12:55 INFO SparkContext: Successfully stopped SparkContext
23/02/06 02:12:56 INFO ShutdownHookManager: Shutdown hook called
```
However, in complex cases, many log lines appear after `SparkContext.stop(0)` is invoked, which sometimes confuses users. The new log line provides a clear boundary; the messages after it can be ignored.
```
23/02/06 02:59:27 INFO TaskSetManager: Starting task 283.0 in stage 34.0 (TID 426) (172.31.218.234, executor 5, partition 283, PROCESS_LOCAL, 8001 bytes)
...
23/02/06 02:59:27 INFO BlockManagerInfo: Removed broadcast_35_piece0 on 172.31.218.244:41741 in memory (size: 5.7 KiB, free: 50.8 GiB)
...
23/02/06 02:59:27 INFO SparkUI: Stopped Spark web UI at http://r6i-16xlarge-3402-0203-apple-3-bf3f7e8624a90a37-driver-svc.default.svc:4040
...
23/02/06 02:59:27 INFO DAGScheduler: ShuffleMapStage 34 (q24a) failed in 0.103 s due to Stage cancelled because SparkContext was shut down
...
23/02/06 02:59:27 INFO KubernetesClusterSchedulerBackend: Shutting down all executors
...
23/02/06 02:59:27 INFO KubernetesClusterSchedulerBackend$KubernetesDriverEndpoint: Asking each executor to shut down
...
23/02/06 02:59:27 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
org.apache.spark.SparkException: Could not find CoarseGrainedScheduler.
```
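The boundary line also lends itself to mechanical post-processing of captured driver logs. A minimal sketch (the `LogTrimmer` helper is illustrative and not part of Spark; only the marker string comes from this commit):

```scala
// Hypothetical helper, not part of Spark: keep a captured driver log up to and
// including the boundary line, dropping the shutdown chatter that follows it.
object LogTrimmer {
  private val Boundary = "SparkContext is stopping with exitCode"

  def trimAfterBoundary(lines: Seq[String]): Seq[String] = {
    val idx = lines.indexWhere(_.contains(Boundary))
    if (idx < 0) lines          // boundary never seen: keep everything
    else lines.take(idx + 1)    // keep the marker itself, drop what follows
  }
}
```

Feeding the complex-case log above through `trimAfterBoundary` would keep the job output and cut every line after the `exitCode` marker.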
### Does this PR introduce _any_ user-facing change?
No, this is a log-only change.
### How was this patch tested?
Manually.
Closes #39900 from dongjoon-hyun/SPARK-42357.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
(cherry picked from commit 3e40b38bcc9d4529dcd868d67e079330b93c464e)
Signed-off-by: Dongjoon Hyun <[email protected]>
---
core/src/main/scala/org/apache/spark/SparkContext.scala | 1 +
1 file changed, 1 insertion(+)
diff --git a/core/src/main/scala/org/apache/spark/SparkContext.scala b/core/src/main/scala/org/apache/spark/SparkContext.scala
index 62e652ff9bb..bb1d0a1c98d 100644
--- a/core/src/main/scala/org/apache/spark/SparkContext.scala
+++ b/core/src/main/scala/org/apache/spark/SparkContext.scala
@@ -2092,6 +2092,7 @@ class SparkContext(config: SparkConf) extends Logging {
 * @param exitCode Specified exit code that will passed to scheduler backend in client mode.
*/
def stop(exitCode: Int): Unit = {
+ logInfo(s"SparkContext is stopping with exitCode $exitCode.")
if (LiveListenerBus.withinListenerThread.value) {
throw new SparkException(s"Cannot stop SparkContext within listener bus thread.")
}
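The one-line change follows a simple pattern: emit the marker before any teardown code has a chance to log. Sketched in isolation (the `StoppableService` class and its messages are illustrative, not Spark's actual classes):

```scala
import scala.collection.mutable.ArrayBuffer

// Standalone sketch of the boundary-first pattern: the stopping service logs
// the exit code before any other teardown message, so every subsequent line
// can be attributed to shutdown rather than to the user job.
class StoppableService {
  val log = ArrayBuffer.empty[String]
  private def logInfo(msg: String): Unit = log += s"INFO $msg"

  def stop(exitCode: Int): Unit = {
    // Boundary marker first, mirroring the one-line change in this commit.
    logInfo(s"Service is stopping with exitCode $exitCode.")
    // Teardown logging follows the marker.
    logInfo("Web UI stopped")
    logInfo("Successfully stopped service")
  }
}
```

Putting the marker at the very top of `stop` (rather than, say, after the listener-bus check) guarantees it precedes every shutdown message, including ones from components that log errors while racing the teardown.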
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]