bharath kumar created SPARK-23497:

             Summary: Sparklyr applications do not disconnect the Spark driver in 
client mode
                 Key: SPARK-23497
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core, YARN
    Affects Versions: 2.1.0
            Reporter: bharath kumar


When we use sparklyr to connect to the YARN cluster manager in client mode or 
cluster mode, the Spark driver does not exit unless spark_disconnect(sc) is 
called explicitly in the code.

Would it make sense to add a timeout feature so that the driver exits after a 
certain amount of time in client mode or cluster mode? I think this only 
happens with connections from sparklyr to YARN. Sometimes the driver stays 
there for weeks and holds a minimum amount of resources.
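On the application side, one way to avoid leaking the driver is to tie the disconnect to the exit of the enclosing R function or script rather than relying on a manual call at the end. A minimal sketch, assuming a YARN client-mode connection (the master string and version are placeholders, not taken from this report):

```r
library(sparklyr)

# Connect to YARN in client mode (connection parameters are placeholders)
sc <- spark_connect(master = "yarn-client", version = "2.1.0")

# Register the disconnect so it runs when the calling frame exits,
# even if an error occurs mid-script; this releases the driver
# instead of leaving the YARN application in RUNNING state.
on.exit(spark_disconnect(sc), add = TRUE)

# ... sparklyr work here ...
```

This only helps scripts that terminate; an interactive RStudio session that is simply abandoned would still hold the driver, which is why a driver-side timeout is being proposed.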

*More Details:*

YARN 2.7.0

Spark 2.1.0

Microsoft R Open 3.4.2

RStudio version:


yarn application -status application_id

18/01/22 09:08:45 INFO client.MapRZKBasedRMFailoverProxyProvider: Updated RM 
address to


Application Report : 

        Application-Id : application_id

        Application-Name : sparklyr

        Application-Type : SPARK

        User : userid

        Queue : root.queuename

        Start-Time : 1516245523965

        Finish-Time : 0

        Progress : 0%

        State : RUNNING

        Final-State : UNDEFINED

        Tracking-URL : N/A

        RPC Port : -1

        AM Host : N/A

        Aggregate Resource Allocation : 266468 MB-seconds, 59 vcore-seconds

        Diagnostics : N/A
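Until a timeout feature exists, an application stuck in this state can be cleaned up from the YARN CLI; `yarn application -kill` is the standard command for this (the application id below is a placeholder, as above):

```
# Inspect the long-running sparklyr application (as shown above)
yarn application -status application_id

# Terminate it so the AM and driver resources are released
yarn application -kill application_id
```

This frees the held resources but loses any state in the driver, so it is a manual workaround rather than a fix.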




I can provide more details if required.




This message was sent by Atlassian JIRA
