[ https://issues.apache.org/jira/browse/SPARK-48334?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17908210#comment-17908210 ]

Soumasish Goswami commented on SPARK-48334:
-------------------------------------------

I managed to reproduce this locally with:
* JDK 11
* Spark 3.4.0
* Scala 2.13.15

This is the code I ran:
{code:java}
import org.apache.spark.{SparkConf, SparkContext}

object SparkTutorial {
  def main(args: Array[String]): Unit = {

    val conf = new SparkConf()
      .setAppName("RpcServerRepro")
      .setMaster("local[*]")
      .set("spark.executor.memory", "1") // no unit: triggers the init failure

    try {
      val sc = SparkContext.getOrCreate(conf)
      sc.stop()
    } catch {
      case e: Exception =>
        println(s"Initialization error occurred: ${e.getMessage}")
    }
    Thread.sleep(100000) // keep the JVM alive so the leaked port can be inspected
  }
}{code}
From the logs:
{noformat}
24/12/25 11:47:20 INFO Utils: Successfully started service 'sparkDriver' on port 49294.
{noformat}



{noformat}
~ % jps -l
50880 jdk.jcmd/sun.tools.jps.Jps
38736 com.intellij.idea.Main
49798 org.jetbrains.plugins.scala.nailgun.NailgunRunner
11846 com.intellij.idea.Main
50875 org.jetbrains.jps.cmdline.Launcher
50876 cloud.datamate.SparkTutorial

~ % lsof -nP -p 50876 | grep LISTEN
java    50876 soumasish  155u  IPv6 0x455bde48090eb9d5  0t0  TCP *:49292 (LISTEN)
java    50876 soumasish  207u  IPv6 0x2245a0346a48f3b6  0t0  TCP 192.168.1.174:49294 (LISTEN)
{noformat}
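The leak pattern can be shown without Spark at all. Below is a minimal sketch (hypothetical names; a plain java.net.ServerSocket stands in for the Netty RpcServer, and a trivial unit check stands in for Spark's memory-string parsing): the server is bound before the env field is assigned, so an exception in between leaves the port open and the null-guarded cleanup skips it.
{code:java}
import java.net.ServerSocket

// Minimal stand-in for the failing SparkContext init path: the server socket
// (playing the role of the Netty RpcServer) is bound before the `env` field
// is assigned, so an exception in between leaves the port open.
object LeakRepro {
  var env: ServerSocket = _      // analogous to SparkContext._env
  var started: ServerSocket = _  // the socket actually bound during init

  def init(memory: String): Unit = {
    started = new ServerSocket(0)  // "RpcServer" starts listening here
    require(memory.matches("""\d+[kmg]"""), s"memory '$memory' has no unit")
    env = started                  // never reached when the check above fails
  }

  def stop(): Unit = if (env != null) env.close()  // the null guard skips cleanup

  def main(args: Array[String]): Unit = {
    try init("1") catch {
      case e: Exception => println(s"Initialization error occurred: ${e.getMessage}")
    }
    stop()  // env is still null, so the bound socket is never closed
    println(s"env assigned: ${env != null}, port still bound: ${!started.isClosed}")
    // prints: env assigned: false, port still bound: true
  }
}{code}
This mirrors what the jps/lsof output above shows: the process keeps a LISTEN socket even though SparkContext creation failed.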

> NettyServer doesn't shutdown if SparkContext initialize failed
> --------------------------------------------------------------
>
>                 Key: SPARK-48334
>                 URL: https://issues.apache.org/jira/browse/SPARK-48334
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.1.3
>            Reporter: IsisPolei
>            Priority: Critical
>
> When obtaining a SparkContext instance using SparkContext.getOrCreate(), if 
> an exception occurs during initialization (such as using incorrect Spark 
> parameters, e.g., spark.executor.memory=1 without units), the RpcServer 
> started during this period will not be shut down, resulting in the port being 
> occupied indefinitely.
> The action to close the RpcServer happens in _env.stop(), where 
> rpcEnv.shutdown() is executed, but this action only occurs when _env != null 
> (SparkContext.scala:2106, version 3.1.3). However, the error occurs during 
> initialization, and _env is not instantiated, so _env.stop() will not be 
> executed, leading to the RpcServer not being closed.
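One possible fix direction, sketched against a toy model rather than the actual Spark code (hypothetical names; a ServerSocket stands in for the RpcServer), is for the init path to release the server it started before propagating the error, instead of relying on the later null-guarded _env.stop():
{code:java}
import java.net.ServerSocket

// Sketch of the fix direction: clean up the partially started server inside
// the init path itself, rather than in a cleanup that only runs when the
// env field was assigned.
object SafeInit {
  var env: ServerSocket = _
  var started: ServerSocket = _  // kept only so the cleanup can be observed

  def init(memory: String): Unit = {
    val server = new ServerSocket(0)  // server bound first, as in Spark
    started = server
    try {
      require(memory.matches("""\d+[kmg]"""), s"memory '$memory' has no unit")
      env = server
    } catch {
      case e: Throwable =>
        server.close()  // release the port before propagating the error
        throw e
    }
  }

  def main(args: Array[String]): Unit = {
    try init("1") catch {
      case e: Exception => println(s"Initialization error occurred: ${e.getMessage}")
    }
    println(s"port released: ${started.isClosed}")
    // prints: port released: true -- no leaked listener
  }
}{code}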



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
