Vira Vitanska created DLAB-1074:
-----------------------------------

             Summary: [RStudio]: Remote kernel list is still available after 
Spark cluster termination
                 Key: DLAB-1074
                 URL: https://issues.apache.org/jira/browse/DLAB-1074
             Project: Apache DLab
          Issue Type: Bug
          Components: DLab Main
            Reporter: Vira Vitanska
            Assignee: Andrii Dumych
             Fix For: v.2.2
         Attachments: Renviron.png, Rprofile.png

The bug was found on GCP. Please check whether it is cloud-specific.

*Preconditions:*

1. Spark cluster is created on RStudio

*Steps to reproduce:*

1. Terminate Spark cluster

2. Go to RStudio UI

3. Open <.Renviron> file

4. Open <.Rprofile> file

*Actual result:*
 # SPARK_HOME="/opt/vit-3008-pr-project1-de-rs-03-spark1/spark/" is present in 
<.Renviron> file
 # #SPARK_HOME="/opt/spark/" is commented out in <.Renviron> file
 # master="spark://172.31.16.44:7077" # Cluster - 
"vit-3008-pr-project1-de-rs-03-spark1" is present in <.Rprofile> file

*Expected result:*

1. SPARK_HOME for deleted Spark cluster is absent in <.Renviron> file

2. Master for deleted Spark cluster is absent in <.Rprofile> file

 

How should it work:

After a Spark cluster is removed, the most recently created remaining Spark 
cluster should be activated; if no Spark cluster remains, Dataproc should be 
activated; and as the last fallback, local.
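A minimal sketch of the expected cleanup, assuming the file contents shown in the actual result above (the cluster name, IP, and paths are taken from this report; the `sed` approach is an illustration, not DLab's actual deprovisioning code):

```shell
#!/bin/sh
CLUSTER="vit-3008-pr-project1-de-rs-03-spark1"

# Sample files in the broken post-termination state described above
cat > .Renviron <<EOF
SPARK_HOME="/opt/${CLUSTER}/spark/"
#SPARK_HOME="/opt/spark/"
EOF
cat > .Rprofile <<EOF
master="spark://172.31.16.44:7077" # Cluster - "${CLUSTER}"
EOF

# Drop the terminated cluster's SPARK_HOME entry from .Renviron
sed -i "/${CLUSTER}/d" .Renviron
# Re-activate the local default by uncommenting it
sed -i 's|^#SPARK_HOME=|SPARK_HOME=|' .Renviron
# Drop the terminated cluster's master line from .Rprofile
sed -i "/Cluster - \"${CLUSTER}\"/d" .Rprofile

cat .Renviron .Rprofile
```

After this cleanup, `.Renviron` contains only the local `SPARK_HOME="/opt/spark/"` and `.Rprofile` no longer references the terminated cluster's master.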



--
This message was sent by Atlassian Jira
(v8.3.2#803003)
